Author has written 47 stories for Inuyasha.
Never more than forthright.
I came to read. I stayed to write.
I ship everyone... or no one. Take your pick!
Fandom is where I learned the ropes of storytelling. These fics are my beginnings, and I treasure all the love and effort that went into their telling. Updates have dwindled, but not because my muses abandoned me. I still write as much as I ever did. More, even, since I have my own stories to tell. But as time allows, I'll add to the unfinished works here, patiently bringing them to their foregone conclusions. May they be worth the wait. ~forth
CURRENTLY UPDATING:
Savvy will mostly update on Thursdays (once I get past a looming publishing deadline)
NEXT UP:
Mood Stripes is currently posting on my blog (and here, though with a several-chapter lag).
ORIGINAL SERIES:
I've been writing a playful (paranormal romance) series with you in mind. Here's hoping you'll be smitten! :twinkle:
Amaranthine Saga, #1 - Tsumiko and the Enslaved Fox - available everywhere
Amaranthine Saga #2 - Kimiko and the Accidental Proposal - available everywhere
Amaranthine Saga #3 - Tamiko and the Two Janitors - releases July 2019
NEWS! Audio books are coming! Tsumiko will release in February 2019, Kimiko in March.
SHORT STORIES - Songs of the Amaranthine is a collection of short stories. The first, Marked by Stars, is now available.
Request signed books on my website - ForthWrites dot com
OTHER FANDOMS: Every once in a while I get the urge to write for a fandom other than Inuyasha, and so I've opened a secondary account here on FanFiction.net to stash those stories. If you're interested in following any of my other ventures, you'll find them under the penname forthrightly. http://www.fanfiction.net/~forthrightly
FANART: My fanart links have been moved to my website. You're invited to explore them at ForthWrites dot com.
WEBSITE: I now have a blog. Follow along for FAQs, story news, art posts, random quotes, and the chance to chat about everything from fandom to forthcoming original stories. ForthWrites dot com. I'm also on tumblr (forthrightly), deviantART (forthrightly), twitter (ForthWrites), and GoodReads (Forthright.). Links to my assorted profiles are on the ABOUT page of my website - ForthWrites dot com. |
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Fri Feb 12 14:16:53 2021
@author: alef
4. Implement a function that receives a dictionary and returns the sum,
the mean, and the variance of the values.
"""
import statistics

def analise_dados(saldo_filiais):
    total = 0
    count = 0
    data = []
    for filial, valores in saldo_filiais.items():
        print(filial, ' ==> ', valores)
        count += len(valores)
        data.extend(valores)
        for valor in valores:
            total += valor
    media = float(total) / float(count) if count > 0 else 0
    variance = statistics.pvariance(data, mu=media)
    return total, media, variance

filiais = {'filial1': [100.0, 200.0, 500.0],
           'filial2': [250.0, 0.50, 300.0],
           'filial3': [700.0, 900.0, 100.0]}
result = analise_dados(filiais)
print(' Sum: ', result[0])
print(' Mean: ', result[1])
print(' Variance: ', result[2]) |
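As a quick cross-check of the hand-rolled totals above, the same figures can be computed directly with the statistics module. This is a minimal sketch assuming the filiais dictionary from the script; it is illustrative, not part of the original exercise.

import statistics

filiais = {'filial1': [100.0, 200.0, 500.0],
           'filial2': [250.0, 0.50, 300.0],
           'filial3': [700.0, 900.0, 100.0]}

# Flatten all branch balances into a single list of values.
valores = [v for lista in filiais.values() for v in lista]

print('Sum:     ', sum(valores))                   # 3050.5
print('Mean:    ', statistics.mean(valores))       # equals total / count
print('Variance:', statistics.pvariance(valores))  # population variance, as in the script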
/**
* clean just the cached file (storage of groups and remote repos)
*/
public void deleteCache( final String path )
throws IndyClientException
{
delete( path + "?" + CHECK_CACHE_ONLY + "=true" );
} |
// Disjoint-set (union-find) utility for Kruskal's algorithm
class DisjointSets {
private:
    vertex_index_t *parent, *rnk;
    vertex_index_t n;
public:
    DisjointSets(vertex_index_t n);
    vertex_index_t find(vertex_index_t u);
    void merge(vertex_index_t x, vertex_index_t y);
}; |
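The class above only declares its operations. For illustration, here is a minimal Python sketch of the union-find behaviour such a class typically implements (path compression in find, union by rank in merge); this is an assumption about the intended implementation, not the original C++ code.

class DisjointSets:
    def __init__(self, n):
        self.parent = list(range(n))  # every vertex starts as its own root
        self.rank = [0] * n

    def find(self, u):
        # Path compression: point u directly at the root of its set.
        if self.parent[u] != u:
            self.parent[u] = self.find(self.parent[u])
        return self.parent[u]

    def merge(self, x, y):
        # Union by rank: attach the shallower tree under the deeper one.
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1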
import * as React from "react"
import renderer from 'react-test-renderer'
import {PureFlowTable as FlowTable} from '../../components/FlowTable'
import TestUtils from 'react-dom/test-utils'
import { TFlow, TStore } from '../ducks/tutils'
import { Provider } from 'react-redux'
window.addEventListener = jest.fn()
describe('FlowTable Component', () => {
let selectFn = jest.fn(),
tflow = TFlow(),
store = TStore()
it('should render correctly', () => {
let provider = renderer.create(
<Provider store={store}>
<FlowTable selectFlow={selectFn} flows={[tflow]}/>
</Provider>),
tree = provider.toJSON()
expect(tree).toMatchSnapshot()
})
let provider = renderer.create(
<Provider store={store} >
<FlowTable selectFlow={selectFn} flows={[tflow]}/>
</Provider>),
flowTable = provider.root.findByType(FlowTable)
it('should handle componentWillUnmount', () => {
flowTable.instance.UNSAFE_componentWillUnmount()
expect(window.addEventListener).toBeCalledWith('resize', flowTable.instance.onViewportUpdate)
})
it('should handle componentDidUpdate', () => {
// flowTable.shouldScrollIntoView == false
expect(flowTable.instance.componentDidUpdate()).toEqual(undefined)
// rowTop - headHeight < viewportTop
flowTable.instance.shouldScrollIntoView = true
flowTable.instance.componentDidUpdate()
// rowBottom > viewportTop + viewportHeight
flowTable.instance.shouldScrollIntoView = true
flowTable.instance.componentDidUpdate()
})
it('should handle componentWillReceiveProps', () => {
flowTable.instance.UNSAFE_componentWillReceiveProps({selected: tflow})
expect(flowTable.instance.shouldScrollIntoView).toBeTruthy()
})
})
|
package httpsig
import (
"crypto"
"crypto/rsa"
)
type hs2019_pss struct {
saltLength int
hash crypto.Hash
}
func NewHS2019_PSS(saltLength int) *hs2019_pss {
return &hs2019_pss{
saltLength: saltLength,
hash: crypto.SHA512,
}
}
func (hs2019_pss) Name() string {
return "hs2019"
}
func (a hs2019_pss) Sign(key interface{}, data []byte) ([]byte, error) {
k := toRSAPrivateKey(key)
if k == nil {
return nil, unsupportedAlgorithm(a)
}
h := a.hash.New()
if _, err := h.Write(data); err != nil {
return nil, err
}
opt := &rsa.PSSOptions{SaltLength: a.saltLength}
return rsa.SignPSS(Rand, k, a.hash, h.Sum(nil), opt)
}
func (a hs2019_pss) Verify(key interface{}, data, sig []byte) error {
k := toRSAPublicKey(key)
if k == nil {
return unsupportedAlgorithm(a)
}
h := a.hash.New()
if _, err := h.Write(data); err != nil {
return err
}
opt := &rsa.PSSOptions{SaltLength: a.saltLength}
return rsa.VerifyPSS(k, a.hash, h.Sum(nil), sig, opt)
}
|
package secrethub
import (
"fmt"
"strings"
"text/tabwriter"
"github.com/secrethub/secrethub-cli/internals/cli/ui"
"github.com/secrethub/secrethub-cli/internals/secrethub/command"
"github.com/secrethub/secrethub-go/internals/api"
)
// ServiceLsCommand lists all service accounts in a given repository.
type ServiceLsCommand struct {
repoPath api.RepoPath
quiet bool
io ui.IO
useTimestamps bool
newClient newClientFunc
newServiceTable func(t TimeFormatter) serviceTable
filters []func(service *api.Service) bool
help string
}
// NewServiceLsCommand creates a new ServiceLsCommand.
func NewServiceLsCommand(io ui.IO, newClient newClientFunc) *ServiceLsCommand {
return &ServiceLsCommand{
io: io,
newClient: newClient,
newServiceTable: newKeyServiceTable,
help: "List all service accounts in a given repository.",
}
}
func NewServiceAWSLsCommand(io ui.IO, newClient newClientFunc) *ServiceLsCommand {
return &ServiceLsCommand{
io: io,
newClient: newClient,
newServiceTable: newAWSServiceTable,
filters: []func(service *api.Service) bool{
isAWSService,
},
help: "List all AWS service accounts in a given repository.",
}
}
func NewServiceGCPLsCommand(io ui.IO, newClient newClientFunc) *ServiceLsCommand {
return &ServiceLsCommand{
io: io,
newClient: newClient,
newServiceTable: newGCPServiceTable,
filters: []func(service *api.Service) bool{
isGCPService,
},
help: "List all GCP service accounts in a given repository.",
}
}
// Register registers the command, arguments and flags on the provided Registerer.
func (cmd *ServiceLsCommand) Register(r command.Registerer) {
clause := r.Command("ls", cmd.help)
clause.Alias("list")
clause.Arg("repo-path", "The path to the repository to list services for").Required().PlaceHolder(repoPathPlaceHolder).SetValue(&cmd.repoPath)
clause.Flag("quiet", "Only print service IDs.").Short('q').BoolVar(&cmd.quiet)
registerTimestampFlag(clause).BoolVar(&cmd.useTimestamps)
command.BindAction(clause, cmd.Run)
}
// Run lists all service accounts in a given repository.
func (cmd *ServiceLsCommand) Run() error {
client, err := cmd.newClient()
if err != nil {
return err
}
services, err := client.Services().List(cmd.repoPath.Value())
if err != nil {
return err
}
included := []*api.Service{}
outer:
for _, service := range services {
for _, filter := range cmd.filters {
if !filter(service) {
continue outer
}
}
included = append(included, service)
}
if cmd.quiet {
for _, service := range included {
fmt.Fprintf(cmd.io.Output(), "%s\n", service.ServiceID)
}
} else {
w := tabwriter.NewWriter(cmd.io.Output(), 0, 2, 2, ' ', 0)
serviceTable := cmd.newServiceTable(NewTimeFormatter(cmd.useTimestamps))
fmt.Fprintln(w, strings.Join(serviceTable.header(), "\t"))
for _, service := range included {
fmt.Fprintln(w, strings.Join(serviceTable.row(service), "\t"))
}
err = w.Flush()
if err != nil {
return err
}
}
return nil
}
type serviceTable interface {
header() []string
row(service *api.Service) []string
}
type baseServiceTable struct {
timeFormatter TimeFormatter
}
func (sw baseServiceTable) header(content ...string) []string {
res := append([]string{"ID", "DESCRIPTION"}, content...)
return append(res, "CREATED")
}
func (sw baseServiceTable) row(service *api.Service, content ...string) []string {
res := append([]string{service.ServiceID, service.Description}, content...)
return append(res, sw.timeFormatter.Format(service.CreatedAt.Local()))
}
func newKeyServiceTable(timeFormatter TimeFormatter) serviceTable {
return keyServiceTable{baseServiceTable{timeFormatter: timeFormatter}}
}
type keyServiceTable struct {
baseServiceTable
}
func (sw keyServiceTable) header() []string {
return sw.baseServiceTable.header("TYPE")
}
func (sw keyServiceTable) row(service *api.Service) []string {
return sw.baseServiceTable.row(service, string(service.Credential.Type))
}
func newAWSServiceTable(timeFormatter TimeFormatter) serviceTable {
return awsServiceTable{baseServiceTable{timeFormatter: timeFormatter}}
}
type awsServiceTable struct {
baseServiceTable
}
func (sw awsServiceTable) header() []string {
return sw.baseServiceTable.header("ROLE", "KMS-KEY")
}
func (sw awsServiceTable) row(service *api.Service) []string {
return sw.baseServiceTable.row(service, service.Credential.Metadata[api.CredentialMetadataAWSRole], service.Credential.Metadata[api.CredentialMetadataAWSKMSKey])
}
func isAWSService(service *api.Service) bool {
if service == nil {
return false
}
return service.Credential.Type == api.CredentialTypeAWS
}
type gcpServiceTable struct {
baseServiceTable
}
func newGCPServiceTable(timeFormatter TimeFormatter) serviceTable {
return gcpServiceTable{baseServiceTable{timeFormatter: timeFormatter}}
}
func (sw gcpServiceTable) header() []string {
return sw.baseServiceTable.header("SERVICE-ACCOUNT-EMAIL", "KMS-KEY")
}
func (sw gcpServiceTable) row(service *api.Service) []string {
return sw.baseServiceTable.row(service, service.Credential.Metadata[api.CredentialMetadataGCPServiceAccountEmail], service.Credential.Metadata[api.CredentialMetadataGCPKMSKeyResourceID])
}
func isGCPService(service *api.Service) bool {
if service == nil {
return false
}
return service.Credential.Type == api.CredentialTypeGCPServiceAccount
}
|
// Invoked when a message is sent to the "/AdvanceTime" destination; an instance of the message is passed to it as a parameter.
@MessageMapping( "/AdvanceTime" )
public void AdvanceTime( AdvanceTimeMsg message ) throws Exception {
int hours, minutes;
try {
hours = Integer.parseInt( message.getHours() );
minutes = Integer.parseInt( message.getMinutes() );
TestControl.Singleton().AdvanceTime( hours, minutes );
}
catch ( Exception e ) {
System.out.printf( "Exception, %s, in AdvanceTime()\n", e );
}
} |
Movie budgets have been skyrocketing to absurd rates ever since Kevin Costner spent $175 million in 1995 dollars to make Waterworld. But you know which movie’s budget wasn’t absurd? Star Wars.
As we all know, George Lucas spent $11 million to make his epic science fantasy film, known to geeks like me as Star Wars Episode IV: A New Hope. Adjusted for 2012 dollars, that comes to a mere $40 million.
On the other hand, it cost James Cameron $280 million to lens Avatar in 2009. Had he been filming his blue-skinned Na’vi circa 1977, it would have cost him $77 million to Lucas’ puny $11 million.
According to an article on Geek.com,
[W]hen adjusting the $775,398,007 A New Hope made in both domestic and international releases for inflation, the film has raked in $3,082,922,515 in revenue. Subtracting the $40 million in adjusted production costs, this leaves a tidy profit of $3,042,922,515.
Of course, this number doesn’t come close to Lucas’ total wealth. He earned 107th place on the Forbes 400 mostly from the sales of Star Wars-based merchandise.
Via Geek.com. |
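A quick arithmetic check of the figures quoted above, using only the numbers from the article (no new data):

# Figures quoted in the article, already adjusted to 2012 dollars.
adjusted_gross = 3_082_922_515   # A New Hope, inflation-adjusted worldwide revenue
adjusted_budget = 40_000_000     # inflation-adjusted production cost

profit = adjusted_gross - adjusted_budget
print(profit)  # 3042922515, matching the article's quoted profit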
// OptFrame - Optimization Framework
// Copyright (C) 2009-2015
// http://optframe.sourceforge.net/
//
// This file is part of the OptFrame optimization framework. This framework
// is free software; you can redistribute it and/or modify it under the
// terms of the GNU Lesser General Public License v3 as published by the
// Free Software Foundation.
// This framework is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU Lesser General Public License v3 for more details.
// You should have received a copy of the GNU Lesser General Public License v3
// along with this library; see the file COPYING. If not, write to the Free
// Software Foundation, 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301,
// USA.
#ifndef OPTFRAME_MOS_EXTENDED_INDIVIDUAL_HPP_
#define OPTFRAME_MOS_EXTENDED_INDIVIDUAL_HPP_
#include <algorithm>
#include "../../MultiObjSearch.hpp"
#include "../../Solution.hpp"
#include "../../Evaluator.hpp"
#include "../../Evaluation.hpp"
#include "../../Population.hpp"
#include "../../NSSeq.hpp"
#include "../../ParetoDominance.hpp"
#include "MOSIndividual.hpp"
namespace optframe
{
// MultiObjSearch Extended Individual
template<class R, class X, class ADS = OPTFRAME_DEFAULT_ADS, class DS = OPTFRAME_DEFAULT_DS>
class MOSExtIndividual: public MOSIndividual<X, ADS, DS>
{
public:
MOSIndividual<R>& parent;
MOSExtIndividual(Solution<X, ADS>* s, MultiEvaluation<DS>* mev, MOSIndividual<R>* _parent) :
MOSIndividual<X, ADS, DS>(s, mev), parent(*_parent)
{
}
MOSExtIndividual(Solution<R, ADS>& s, MultiEvaluation<DS>& mev, MOSIndividual<R>* _parent) :
MOSIndividual<X, ADS, DS>(s, mev), parent(*_parent)
{
}
MOSExtIndividual(const MOSExtIndividual<R, X, ADS, DS>& ind) :
MOSIndividual<X, ADS, DS>(&ind.s->clone(), &ind.mev->clone()), parent(ind.parent)
{
this->fitness = ind.fitness;
this->diversity = ind.diversity;
this->id = ind.id;
}
virtual ~MOSExtIndividual()
{
}
virtual void print() const
{
cout << "MOSExtIndividual: parent=" << &parent << " fitness=" << this->fitness << "\t diversity=" << this->diversity;
cout << "\t[ ";
for(unsigned e = 0; e < this->mev->size(); e++)
cout << this->mev->at(e).evaluation() << (e == this->mev->size() - 1 ? " " : " ; ");
cout << " ]";
cout << endl;
}
virtual MOSIndividual<X, ADS, DS>& clone() const
{
return *new MOSExtIndividual<R, X, ADS, DS>(*this);
}
};
/*
template<class R, class X, class ADS = OPTFRAME_DEFAULT_ADS, class DS = OPTFRAME_DEFAULT_DS>
class MOSExtPopulation: public MOSPopulation<R>
{
public:
vector<MOSExtIndividual<R, X>*> PX;
MOSExtPopulation() :
MOSPopulation<R>()
{
}
MOSExtPopulation(const vector<MOSIndividual<R>*>& _PS, const vector<MOSExtIndividual<R, ADS, DS>*>& _PX) :
MOSPopulation<R>(_PS), PX(_PX)
{
}
virtual ~MOSExtPopulation()
{
}
inline void setVector(vector<MOSIndividual<R>*>& v)
{
this->P = v;
}
inline void setVector(vector<MOSExtIndividual<R, X>*>& v)
{
PX = v;
}
// reconfigure GET methods for 'PX'
inline vector<MOSIndividual<R>*>& getVector()
{
return getXVector();
}
inline vector<MOSIndividual<R>*>& getSVector()
{
return this->P;
}
inline vector<MOSExtIndividual<R, X>*>& getXVector()
{
return PX;
}
// reconfigure GET methods for 'PX'
inline vector<MOSIndividual<R>*> getVector() const
{
return getXVector();
}
inline vector<MOSIndividual<R>*> getSVector() const
{
return this->P;
}
inline vector<MOSExtIndividual<R, X, ADS, DS>*> getXVector() const
{
return PX;
}
// reconfigure GET methods for 'PX'
inline MOSIndividual<R>* at(unsigned id) const
{
return PX[id];
}
inline MOSIndividual<R>* atS(unsigned id) const
{
return this->P[id];
}
inline MOSExtIndividual<R, X, ADS, DS>* atX(unsigned id) const
{
return PX[id];
}
// reconfigure GET methods for 'PX'
inline unsigned size() const
{
return PX.size();
}
inline unsigned sizeS() const
{
return this->P.size();
}
inline unsigned sizeX() const
{
return PX.size();
}
inline void add(MOSIndividual<R>* ind)
{
this->P.push_back(ind);
}
inline void add(MOSExtIndividual<R, X, ADS, DS>* ind)
{
PX.push_back(ind);
}
virtual void add(MOSPopulation<R>& Pop)
{
this->P.insert(this->P.end(), Pop.P.begin(), Pop.P.end());
}
virtual void add(MOSExtPopulation<R, X, ADS, DS>& Pop)
{
this->P.insert(this->P.end(), Pop.P.begin(), Pop.P.end());
PX.insert(PX.end(), Pop.PX.begin(), Pop.PX.end());
}
inline void add(vector<MOSIndividual<R>*>& v)
{
this->P.insert(this->P.end(), v.begin(), v.end());
}
inline void add(vector<MOSExtIndividual<R, X, ADS, DS>*>& v)
{
PX.insert(PX.end(), v.begin(), v.end());
}
// reconfigure for both
inline void clear()
{
PX.clear();
this->P.clear();
}
// reconfigure for both
virtual void free()
{
for(unsigned i = 0; i < this->P.size(); i++)
if(this->P[i])
delete this->P[i];
this->P.clear();
for(unsigned i = 0; i < PX.size(); i++)
if(PX[i])
delete PX[i];
PX.clear();
}
};
*/
}
#endif /*OPTFRAME_MOS_EXTENDED_INDIVIDUAL_HPP_*/
|
Philip K. Dick
Philip K. Dick was a central figure of science fiction literature from the 1950s to the 1970s. His novels and short stories were greatly admired by fellow authors such as Brian Aldiss, Ursula Le Guin, and Stanisław Lem, as well as by theorists of postmodernism such as Jean Baudrillard, Fredric Jameson, and Slavoj Žižek. Dick was highly prolific, publishing over 120 short stories and authoring forty-four novels over the course of his career. In addition to his science fiction, Dick wrote several mainstream works of fiction, of which only one was published during his lifetime. Dick’s fiction has been widely adapted to cinema, both in and outside of Hollywood. Dick’s characteristic themes include Cold War paranoia, dystopia, artificial intelligence, psychopathology, drugs and the 1960s counterculture, illusion and simulation, empathy, entropy and determinism, spiritual revelation, and religious salvation. |
Investigation of Internal Fault Modeling of Powerformer
This paper describes a model of a 75 MVA, 150 kV synchronous machine (powerformer), which can be used to simulate internal fault waveforms for power system protection studies. The method employs a direct phase representation considering the cable capacitance. A method to calculate the inductance and magnetic axis location of the faulty path is outlined. The machine equations are then solved using a suitable numerical technique. Comparisons are made between the simulated waveforms and recorded waveforms to verify the accuracy of the model. |
/**
* Represents a grantee identified by their canonical Amazon ID.
*/
@NoArgsConstructor
@AllArgsConstructor
@XStreamAlias("s3-object-acl-grantee-canonical-user")
public class S3ObjectAclGranteeCanonicalUser implements S3ObjectAclGrantee {
/**
* The canonical Amazon ID of the grantee.
*/
@Getter
@Setter
private String id;
@Override
public boolean grant() {
return StringUtils.isNotEmpty(getId());
}
@Override
public Grantee create() {
return new CanonicalGrantee(getId());
}
} |
// fetchBatch fetches all metric metadata for the given job and instance combination.
// We constrain it by instance to reduce the total payload size.
// In a well-configured setup it is unlikely that instances for the same job have any notable
// difference in their exposed metrics.
func (c *Cache) fetchBatch(ctx context.Context, job, instance string) (map[string]*cacheEntry, error) {
job, instance = escapeLval(job), escapeLval(instance)
apiResp, err := c.fetch(ctx, "batch", url.Values{
"match_target": []string{fmt.Sprintf("{job=\"%s\",instance=\"%s\"}", job, instance)},
})
if err != nil {
return nil, err
}
now := time.Now()
if apiResp.ErrorType == apiErrorNotFound {
return nil, nil
}
if apiResp.ErrorType != "" {
return nil, errors.Wrap(errors.New(apiResp.Error), "lookup failed")
}
result := make(map[string]*cacheEntry, len(apiResp.Data)+len(internalMetrics))
for _, md := range apiResp.Data {
if md.Type == MetricTypeUntyped {
md.Type = textparse.MetricTypeUnknown
}
result[md.Metric] = &cacheEntry{
Entry: &Entry{Metric: md.Metric, MetricType: md.Type, Help: md.Help},
lastFetch: now,
found: true,
}
}
for _, md := range internalMetrics {
result[md.Metric] = &cacheEntry{Entry: md, lastFetch: now, found: true}
}
return result, nil
} |
/** This method actually handles the SNBT-command. **/
private static void handleSNBTCommand(EntityPlayerMP player, World world, StringNBTCommand commandPacket) {
if(world.isRemote){
TaleCraft.logger.error("FATAL ERROR: ServerHandler method was called on client-side!");
return;
}
if(commandPacket.command.equals("server.client.connection.state.change:join_acknowledged")) {
TaleCraft.logger.info("join acknowledged : " + commandPacket.data);
getServerMirror(null).playerList().getPlayer(player).construct(commandPacket.data);
return;
}
if(commandPacket.command.equals("server.client.settings.update")) {
TaleCraft.logger.info("updating settings " + commandPacket.data);
getServerMirror(null).playerList().getPlayer(player).updateSettings(commandPacket.data);
return;
}
if(commandPacket.command.equals("server.data.entity.merge")) {
if(!PlayerHelper.isOp(player)) {
player.addChatMessage(new ChatComponentText("Error: 'server.data.entity.merge' is a operator only command."));
return;
}
String uuidStr = commandPacket.data.getString("entityUUID");
UUID uuid = UUID.fromString(uuidStr);
Entity theEntity = null;
for(Object entityObject : world.loadedEntityList) {
Entity entity = (Entity) entityObject;
if(entity.getUniqueID().equals(uuid)) {
theEntity = entity;
break;
}
}
if(theEntity == null) {
player.addChatMessage(new ChatComponentText("Error: Entity not found. (Possibly dead)"));
return;
}
NBTTagCompound entityData = new NBTTagCompound();
NBTTagCompound mergeData = commandPacket.data.getCompoundTag("entityData");
mergeData.removeTag("UUIDMost");
mergeData.removeTag("UUIDLeast");
mergeData.removeTag("Dimension");
mergeData.removeTag("Pos");
theEntity.writeToNBT(entityData);
entityData.merge(mergeData);
theEntity.readFromNBT(entityData);
if(entityData.hasKey("TC_Width")) theEntity.width = entityData.getFloat("TC_Width");
if(entityData.hasKey("TC_Height")) theEntity.height = entityData.getFloat("TC_Height");
if(entityData.hasKey("TC_StepHeight")) theEntity.stepHeight = entityData.getFloat("TC_StepHeight");
if(entityData.hasKey("TC_NoClip")) theEntity.noClip = entityData.getBoolean("TC_NoClip");
return;
}
if(commandPacket.command.startsWith("server.data.block.merge:")) {
if(!PlayerHelper.isOp(player)) {
player.addChatMessage(new ChatComponentText("Error: 'blockdatamerge' is a operator only command."));
return;
}
String positionString = commandPacket.command.substring(24);
String[] posStrings = positionString.split(" ");
BlockPos position = new BlockPos(Integer.valueOf(posStrings[0]), Integer.valueOf(posStrings[1]), Integer.valueOf(posStrings[2]));
TileEntity entity = world.getTileEntity(position);
if(entity != null) {
TaleCraft.logger.info("(datamerge) " + position + " -> " + commandPacket.data);
mergeTileEntityData(entity, commandPacket.data);
return;
} else {
player.addChatMessage(new ChatComponentText("Error: Failed to merge block data: TileEntity does not exist."));
return;
}
}
if(commandPacket.command.startsWith("server.data.block.command:")) {
if(!PlayerHelper.isOp(player)) {
player.addChatMessage(new ChatComponentText("Error: 'blockcommand' is a operator only command."));
return;
}
String positionString = commandPacket.command.substring(26);
String[] posStrings = positionString.split(" ");
BlockPos position = new BlockPos(Integer.valueOf(posStrings[0]), Integer.valueOf(posStrings[1]), Integer.valueOf(posStrings[2]));
TileEntity entity = world.getTileEntity(position);
if(entity != null) {
if(entity instanceof TCIBlockCommandReceiver) {
((TCIBlockCommandReceiver) entity).commandReceived(commandPacket.data.getString("command"), commandPacket.data);
return;
}
} else {
player.addChatMessage(new ChatComponentText("Error: Failed to run block-command: TileEntity does not exist."));
return;
}
}
TaleCraft.logger.error("Received unknown StringNBTCommand from client: "+commandPacket.command+" : "+commandPacket.data);
} |
package org.jrebirth.af.core.ui.model.basic;
import org.jrebirth.af.core.ui.model.AbstractModelTest;
import org.junit.Test;
/**
* The class <strong>FxmlTest</strong>.
*
* @author <NAME>
*/
public class ModelTest extends AbstractModelTest {
@Test
public void basicModel() {
basicModel(MyModel.class);
}
@Test
public void basicObjectModel() {
objectModel(MyObjectModel.class, MyObjectModel2.class);
}
}
|
/** Remove ActivationObjectItem from the activation system tree (root + related group)
* and propagate a change.
* @param aoi object to remove.
*/
public void removeActivationObjectItem(ActivationObjectItem aoi) {
ActivationID aID = aoi.getActivationID();
ActivationGroupID agID = aoi.getDesc().getGroupID();
if (aObjectItems != null) aObjectItems.remove(aID);
if (aGroupItems != null) {
ActivationGroupItem agi = (ActivationGroupItem) aGroupItems.get(agID);
if (agi != null) agi.removeActivatable(aoi);
}
firePropertyChange(PROP_ACTIVATION_ITEMS, null, null);
} |
Estimating Housing Rent Depreciation for Inflation Adjustments
Abstract U.S. inflation measures, such as the Consumer Price Index, are adjusted for an aging-bias based on estimates of the average rent depreciation. This study analyzes the characteristics of rent depreciation using novel, market-based data on rental contracts in Las Vegas, NV. We find that the estimated annual depreciation rate for new properties is 0.9% for single-family residences and 1.5% for condominiums. The higher depreciation rate for condominiums is due to higher functional obsolescence instead of physical deterioration. Rent depreciation rates are lower for older and smaller structures and vary significantly across neighborhoods. Our results suggest that local inflation rates are biased downward where new and large units increased since the last update to the official rent depreciation estimates but upward where the housing stock became older. From an asset pricing perspective, failing to account for initially high depreciation results in an overvaluation of new properties and an undervaluation of old properties. |
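To make the reported rates concrete, here is a small sketch applying geometric (compounding) depreciation to a starting rent. This is purely illustrative arithmetic using the abstract's headline rates; the starting rent and the geometric form are assumptions, not the authors' estimation method.

def depreciated_rent(initial_rent, annual_rate, years):
    # Rent level after `years` of compounding depreciation at `annual_rate`.
    return initial_rent * (1 - annual_rate) ** years

# Headline rates from the abstract: 0.9% for single-family homes, 1.5% for condominiums.
for label, rate in [('single-family', 0.009), ('condominium', 0.015)]:
    rent_after_20 = depreciated_rent(1000.0, rate, 20)  # hypothetical $1,000 starting rent
    print(label, round(rent_after_20, 2))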
import pygame.display
from math import sin, cos, radians, copysign
class Boat:
def __init__(self, pos_x, pos_y):
self.pos_x = pos_x
self.pos_y = pos_y
self.vel = 10
self.heading = 90
self.rudder = 0
self.turn = 0
self.length = 100
self.width = 38
self.boat_img = pygame.image.load('assets/boat_sprite.png')
self.boat_img = pygame.transform.smoothscale(self.boat_img, (self.length, self.width))
self.water_img_width = 450
self.water_img_height = 450
self.water_img = pygame.image.load('assets/water_sprite.jpg')
self.water_img = pygame.transform.scale(self.water_img, (self.water_img_width, self.water_img_height))
def update(self):
self.heading = self.heading % 360
max_rudder = 50
max_vel = 40
if self.rudder > max_rudder:
self.rudder = max_rudder
elif self.rudder < -max_rudder:
self.rudder = -max_rudder
if self.vel > max_vel:
self.vel = max_vel
elif self.vel < 0:
self.vel = 0
self.move()
max_width = pygame.display.Info().current_w
max_height = pygame.display.Info().current_h
if self.pos_x > max_width:
self.pos_x = 0
if self.pos_x < 0:
self.pos_x = max_width
if self.pos_y > max_height:
self.pos_y = 0
if self.pos_y < 0:
self.pos_y = max_height
def move(self):
self.turn += copysign(self.rudder ** 2, self.rudder) / 200
if self.turn > 50:
self.turn = 50
elif self.turn < -50:
self.turn = -50
self.turn = self.turn * min((self.vel / 200 + 0.90), 0.95)
if self.vel == 0:
self.turn = 0
self.heading += self.turn / ((self.vel + 10) * 4)
self.turn = self.turn * 0.95
self.pos_x = self.pos_x + self.vel * cos(radians(self.heading)) / 10
self.pos_y = self.pos_y - self.vel * sin(radians(self.heading)) / 10
|
import java.util.*;
public class Main {
static int m, n;
public static void main (String[] args){
Scanner sc = new Scanner(System.in);
m = sc.nextInt();
n = sc.nextInt();
boolean[] lie = new boolean[n];
for(int i = 0; i < n; ++i){
System.out.println(1);
int in = sc.nextInt();
if(in == 0){
System.out.flush();
return;
}
if(in == -1) lie[i] = true;
}
int l = 1, h = m, i = 0;
while(l + 1 < h){
int g = (l + h) / 2;
System.out.println(g);
int in = sc.nextInt();
if(in == 0){
System.out.flush();
return;
}
boolean greater = in == 1 && !lie[i] || in == -1 && lie[i];
if(greater) l = g + 1;
else h = g - 1;
++i;
i %= n;
}
for(i = l; i <= h; ++i){
System.out.println(i);
int in = sc.nextInt();
if(in == 0){
System.out.flush();
return;
}
}
}
} |
// Validate checks for config errors
func (el *Elem) Validate(it *Item, rl *Rule, rls *Rules) []error {
switch el.El {
case RuleEl:
_, err := rls.RuleTry(el.Value)
if err != nil {
return []error{err}
}
return nil
case TokenEl:
if el.Value == "" {
err := fmt.Errorf("Rule: %v Item: %v has empty Token element", rl.Name, it.String())
return []error{err}
}
}
return nil
} |
def _build_endpoint_url(self, endpoint, method, data=None, named_params=None, field_selectors=None):
    if named_params:
        # Render each named parameter as key=value; multiple parameters are comma-separated
        # (separator assumed -- the original call joined tuples with str.join, which raises a TypeError).
        named_param_str = ','.join('%s=%s' % (k, v) for k, v in named_params.iteritems())
    else:
        named_param_str = '~'
    field_selector_str = self._get_field_selector_str(field_selectors)
    endpoint = endpoint % ({
        'named_params': named_param_str,
        'field_selectors': field_selector_str,
    })
    if method == 'get':
        if data:
            endpoint = '%s?%s' % (endpoint, urllib.urlencode(data))
    return endpoint |
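For illustration, a sketch of how such a template might be filled in. The endpoint template, parameter names, and values below are hypothetical (not taken from the original client), and Python 3's urllib.parse is used in place of the snippet's Python 2 urllib.urlencode.

from urllib.parse import urlencode

endpoint_template = '/v1/people::(%(named_params)s):(%(field_selectors)s)'  # hypothetical template
named_param_str = ','.join('%s=%s' % (k, v) for k, v in {'id': '12345'}.items())
endpoint = endpoint_template % {'named_params': named_param_str,
                                'field_selectors': 'first-name,last-name'}
endpoint = '%s?%s' % (endpoint, urlencode({'format': 'json'}))
print(endpoint)  # /v1/people::(id=12345):(first-name,last-name)?format=json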
#[macro_use]
extern crate log;
mod meteo_data;
mod web_server;
#[derive(Clone)]
pub struct SharedAppState {
running: std::sync::Arc<std::sync::atomic::AtomicBool>,
index_html: std::sync::Arc<std::sync::RwLock<String>>,
}
impl SharedAppState {
fn new() -> SharedAppState {
SharedAppState {
running: std::sync::Arc::new(std::sync::atomic::AtomicBool::new(true)),
index_html: std::sync::Arc::new(std::sync::RwLock::new(String::from("Initial state"))),
}
}
}
fn main() -> std::result::Result<(), Box<dyn std::error::Error>> {
env_logger::from_env(env_logger::Env::default().default_filter_or("info")).init();
let shared_app_state = SharedAppState::new();
let shared_app_state_clone = shared_app_state.clone();
let t1 = std::thread::Builder::new()
.name("web_server".to_string())
.spawn(move || {
web_server::main(shared_app_state_clone);
})?;
let shared_app_state_clone = shared_app_state.clone();
let t2 = std::thread::Builder::new()
.name("radar_data".to_string())
.spawn(move || {
meteo_data::main(shared_app_state_clone);
})?;
t1.join().expect("web_server thread has panicked");
t2.join().expect("radar_data thread has panicked");
Ok(())
}
|
// Package plugin provides support for RPC plugins with registration server.
// It also implements middleware calling all the registered and alive plugins
package plugin
import (
"context"
"encoding/json"
"fmt"
"net/http"
"strings"
"sync"
"time"
log "github.com/go-pkgz/lgr"
"github.com/umputun/reproxy/app/discovery"
"github.com/umputun/reproxy/lib"
)
//go:generate moq -out dialer_mock.go -fmt goimports . RPCDialer
//go:generate moq -out client_mock.go -fmt goimports . RPCClient
// Conductor accepts registrations from rpc plugins, keeps list of active/current plugins and provides middleware calling all of them.
type Conductor struct {
Address string
RPCDialer RPCDialer
plugins []Handler
lock sync.RWMutex
}
// Handler contains information about a plugin's handler
type Handler struct {
Address string
Method string // full method name for rpc call, i.e. Plugin.Thing
Alive bool
client RPCClient
}
// conductorCtxtKey used to retrieve conductor from context
type conductorCtxtKey string
// CtxMatch key used to retrieve matching request info from the request context
const CtxMatch = conductorCtxtKey("match")
// RPCDialer is a maker interface dialing to rpc server and returning new RPCClient
type RPCDialer interface {
Dial(network, address string) (RPCClient, error)
}
// RPCDialerFunc is an adapter to allow the use of an ordinary functions as the RPCDialer.
type RPCDialerFunc func(network, address string) (RPCClient, error)
// Dial rpc server
func (f RPCDialerFunc) Dial(network, address string) (RPCClient, error) {
return f(network, address)
}
// RPCClient defines interface for remote calls
type RPCClient interface {
Call(serviceMethod string, args interface{}, reply interface{}) error
}
// Run creates and activates http registration server
// TODO: add some basic auth in case if exposed by accident
func (c *Conductor) Run(ctx context.Context) error {
log.Printf("[INFO] start plugin conductor on %s", c.Address)
httpServer := &http.Server{
Addr: c.Address,
Handler: c.registrationHandler(),
ReadHeaderTimeout: 50 * time.Millisecond,
WriteTimeout: 50 * time.Millisecond,
IdleTimeout: 50 * time.Millisecond,
}
go func() {
<-ctx.Done()
if err := httpServer.Close(); err != nil {
log.Printf("[ERROR] failed to close plugin registration server, %v", err)
}
}()
return httpServer.ListenAndServe()
}
// Middleware hits all registered, alive-only plugins and modifies the original request accordingly.
// A failed plugin call terminates the chain with a 500 response, and a status code of 400 or above from any
// plugin stops the chain of calls. This is needed to allow plugins like auth which have to terminate the request in some cases.
func (c *Conductor) Middleware(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
c.lock.RLock()
for _, p := range c.plugins {
if !p.Alive {
continue
}
var reply lib.Response
if err := p.client.Call(p.Method, c.makeRequest(r), &reply); err != nil {
log.Printf("[WARN] failed to invoke plugin handler %s: %v", p.Method, err)
c.lock.RUnlock() // release the read lock before returning, matching the error path below
http.Error(w, http.StatusText(http.StatusInternalServerError), http.StatusInternalServerError)
return
}
for k, vv := range reply.HeadersIn {
for _, v := range vv {
r.Header.Add(k, v)
}
}
for k, vv := range reply.HeadersOut {
for _, v := range vv {
w.Header().Add(k, v)
}
}
if reply.StatusCode >= 400 {
c.lock.RUnlock()
http.Error(w, http.StatusText(reply.StatusCode), reply.StatusCode)
return
}
}
c.lock.RUnlock()
next.ServeHTTP(w, r)
})
}
// makeRequest creates plugin request from http.Request
// uses context set by downstream (by proxyHandler)
func (c *Conductor) makeRequest(r *http.Request) lib.Request {
ctx := r.Context()
res := lib.Request{
URL: r.URL.String(),
RemoteAddr: r.RemoteAddr,
Host: r.URL.Hostname(),
Header: r.Header,
}
if v, ok := ctx.Value(CtxMatch).(discovery.MatchedRoute); ok {
res.Route = v.Destination
res.Match.MatchType = v.Mapper.MatchType.String()
res.Match.ProviderID = string(v.Mapper.ProviderID)
res.Match.Server = v.Mapper.Server
res.Match.Src = v.Mapper.SrcMatch.String()
res.Match.Dst = v.Mapper.Dst
res.Match.PingURL = v.Mapper.PingURL
res.Match.AssetsLocation = v.Mapper.AssetsLocation
res.Match.AssetsWebRoot = v.Mapper.AssetsWebRoot
}
return res
}
// registrationHandler accept POST or DELETE with lib.Plugin body and register/unregister plugin provider
func (c *Conductor) registrationHandler() http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
switch r.Method {
case "POST":
var plugin lib.Plugin
if err := json.NewDecoder(r.Body).Decode(&plugin); err != nil {
http.Error(w, "plugin registration failed", http.StatusBadRequest)
return
}
c.locked(func() {
if err := c.register(plugin); err != nil {
log.Printf("[WARN] rpc registration failed, %v", err)
http.Error(w, "rpc registration failed", http.StatusInternalServerError)
return
}
})
case "DELETE":
var plugin lib.Plugin
if err := json.NewDecoder(r.Body).Decode(&plugin); err != nil {
http.Error(w, "failed to unregister plugin", http.StatusBadRequest)
return
}
c.locked(func() { c.unregister(plugin) })
default:
http.Error(w, "invalid request type", http.StatusBadRequest)
}
})
}
// register plugin, not thread safe! call should be enclosed with lock
// creates tcp client, retrieves list of handlers (methods) and adds each one with the full method name
func (c *Conductor) register(p lib.Plugin) error {
// collect all handlers after registration
var pp []Handler //nolint
for _, h := range c.plugins {
if strings.HasPrefix(h.Method, p.Name+".") && h.Address == p.Address { // already registered
log.Printf("[WARN] plugin %+v already registered", p)
return nil
}
if strings.HasPrefix(h.Method, p.Name+".") && h.Address != p.Address { // registered, but address changed
log.Printf("[WARN] plugin %+v already registered, but address changed to %s", h, p.Address)
continue // remove from the collected pp
}
pp = append(pp, h)
}
client, err := c.RPCDialer.Dial("tcp", p.Address)
if err != nil {
return fmt.Errorf("can't reach plugin %+v: %v", p, err)
}
for _, l := range p.Methods {
handler := Handler{client: client, Alive: true, Address: p.Address, Method: p.Name + "." + l}
pp = append(pp, handler)
log.Printf("[INFO] register plugin %s, ip: %s, method: %s", p.Name, p.Address, handler.Method)
}
c.plugins = pp
return nil
}
// unregister plugin, not thread safe! call should be enclosed with lock
func (c *Conductor) unregister(p lib.Plugin) {
log.Printf("[INFO] unregister plugin %s, ip: %s", p.Name, p.Address)
var res []Handler //nolint
for _, h := range c.plugins {
if strings.HasPrefix(h.Method, p.Name+".") {
continue
}
res = append(res, h)
}
c.plugins = res
}
func (c *Conductor) locked(fn func()) {
c.lock.Lock()
fn()
c.lock.Unlock()
}
|
import java.util.Scanner;
public class CodeForces {
private static Scanner scn;
public static void main(String... args) {
scn = new Scanner(System.in);
int[] heights;
int maxIndex = 0, minIndex = 0;
int max, min;
int steps;
int n = scn.nextInt();
heights = fill(n);
max = findMax(heights);
min = findMin(heights);
if(max == heights[0]) {
maxIndex = 0;
} else {
maxIndex = nearestMaxIndex(heights);
}
if(min == heights[heights.length - 1]) {
minIndex = heights.length - 1;
} else {
minIndex = furthestMinIndex(heights);
}
steps = maxIndex + (heights.length - minIndex - 1);
if(maxIndex > minIndex) steps--;
System.out.println(steps);
}
private static int[] fill(int size) {
int[] array = new int[size];
for(int i = 0; i < size; i++) {
array[i] = scn.nextInt();
}
return array;
}
private static int findMax(int[] array) {
int max = array[0];
for(int i = 0; i < array.length; i++) {
if(array[i] >= max) {
max = array[i];
}
}
return max;
}
private static int findMin(int[] array) {
int min = array[0];
for(int i = 0; i < array.length; i++) {
if(array[i] <= min) {
min = array[i];
}
}
return min;
}
private static int nearestMaxIndex(int[] array) {
int len = array.length;
int max = array[len - 1], index = 0;
for(int i = len - 1; i >= 0; i--) {
if(array[i] >= max) {
max = array[i];
index = i;
}
}
return index;
}
private static int furthestMinIndex(int[] array) {
int len = array.length;
int min = array[0], index = 0;
for(int i = 0; i < len; i++) {
if(array[i] <= min) {
min = array[i];
index = i;
}
}
return index;
}
} |
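The same logic in a compact Python sketch, as an illustrative restatement of the algorithm above (not part of the original submission): the answer is the number of adjacent swaps needed to bring the first maximum to the front, plus those needed to bring the last minimum to the back, minus one if the two moves cross.

def min_swaps(heights):
    n = len(heights)
    max_i = heights.index(max(heights))                 # first occurrence of the maximum
    min_i = n - 1 - heights[::-1].index(min(heights))   # last occurrence of the minimum
    steps = max_i + (n - 1 - min_i)
    if max_i > min_i:   # the two moves cross, saving one swap
        steps -= 1
    return steps

print(min_swaps([33, 44, 11, 22]))  # 2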
Family bonding and adolescent alcohol use: moderating effect of living with excessive drinking parents.
AIMS
Excessive parental drinking has been shown to be positively related to adolescent alcohol use and family bonding negatively related. The aim of the present study was to determine if the perception of parental drinking moderates the relationship between family bonding and adolescent alcohol use.
METHODS
Linear structural equation models for multiple group comparisons were estimated based on a nationally representative sample of 3,448 eighth and ninth graders in Switzerland (mean age 14.77; SD 0.89).
RESULTS
Adjusted for gender and age, the results confirm that strong family bonds were negatively related to both frequency of alcohol intake and lifetime frequency of drunkenness. Furthermore, a positive link was found with regard to the perception of parental drinking. However, the multiple group comparison revealed that the negative relationship between bonding and adolescent alcohol use was even stronger among adolescents whose parents drink excessively than among those whose parents did not.
CONCLUSIONS
These results indicate that it may be particularly important for parents in the former category to establish strong family bonds (e.g. by spending free time with their children, listening to their worries) so as to limit adolescent excessive drinking. |
/**
* A factory for static file-serving routes.
*
* @author Luc Everse
*/
public class StaticFileRouteFactory extends FileRouteFactory {
private static final Logger logger = LoggerFactory.getLogger(StaticFileRouteFactory.class);
/**
* The path compiler to use.
*/
private final PathCompiler compiler;
/**
* The header value parser to use.
*/
private final HeaderValueParser valueParser;
/**
* The range parser to use.
*/
private final RangeParser rangeParser;
/**
* The MIME guesser to use.
*/
private final MimeGuesser mimeGuesser;
/**
* Creates a new factory for routes serving static files.
*
* @param compiler the path compiler to use
* @param valueParser the header value parser to use
* @param rangeParser the range parser to use
* @param mimeGuesser the MIME guesser to use
*/
public StaticFileRouteFactory(final PathCompiler compiler,
final HeaderValueParser valueParser,
final RangeParser rangeParser,
final MimeGuesser mimeGuesser) {
super(valueParser);
this.compiler = compiler;
this.valueParser = valueParser;
this.rangeParser = rangeParser;
this.mimeGuesser = mimeGuesser;
}
/**
* Builds a routing entry serving static files.
*
* @param prefix the URL prefix the static files should be accessible for
* @param dir the local directory the files should be in
*
* @return a routing entry serving files
*/
public RoutingEntry build(final String prefix, final Path dir) {
final Path absDir;
try {
absDir = dir.toRealPath();
} catch (final IOException ex) {
throw new RuntimeException("Unable to get the canonical path for " + dir + ": ", ex);
}
final String name = "static_file_route_" + prefix;
final String parameterizedPath = prefix + "/{file}";
final CompiledPath path = this.compiler.compile(parameterizedPath, Ob.map("file", ".+"));
final Controller controller = new FileController(absDir);
final java.lang.reflect.Method method;
try {
method = FileController.class.getMethod("handle", Request.class);
} catch (final NoSuchMethodException ex) {
throw new RuntimeException(ex);
}
final List<Method> methods = Arrays.asList(Method.GET, Method.HEAD);
final List<MimeType> mimeTypes = Collections.singletonList(MimeType.any());
return new RoutingEntry(name, path, controller, method, methods, mimeTypes);
}
/**
* The virtual controller serving these files.
*/
private class FileController extends Controller {
/**
* The local directory the files should be in.
*/
private final Path localDir;
/**
* The header value parser.
*/
private final HeaderValueParser valueParser;
/**
* The Range header parser.
*/
private final RangeParser rangeParser;
/**
* The MIME guesser.
*/
private final MimeGuesser mimeGuesser;
/**
* Creates a new virtual file controller.
*
* @param localDir the local directory the files should be in
*/
FileController(final Path localDir) {
this.localDir = localDir;
this.valueParser = StaticFileRouteFactory.this.valueParser;
this.rangeParser = StaticFileRouteFactory.this.rangeParser;
this.mimeGuesser = StaticFileRouteFactory.this.mimeGuesser;
}
/**
* Handles the request.
*
* @param request the request
*
* @return the response
*/
public Response handle(final Request request) {
try {
final Path file = this.localDir.resolve(request.getPathParameter("file"));
// If the request attempts to traverse the directory tree or if the file just
// doesn't exist reply Not Found.
if (!file.startsWith(this.localDir) || !Files.exists(file)) {
throw new NotFoundException();
}
final FileResponse response = new FileResponse(file.toFile());
// Set the request/response ranges, if any.
final List<String> ranges = this.valueParser.parseCommaSeparated(request, "Range");
response.setRanges(this.rangeParser.parse(ranges, response.getContentLength()));
// Guess and set the content type too.
response.setContentType(this.mimeGuesser.guessLocal(file));
// If the client indicates it may have cached the file and it actually did,
// reply Not Modified.
if (StaticFileRouteFactory.this.can304(request, response)) {
response.setStatus(ResponseCode.NOT_MODIFIED);
}
return response;
} catch (final FileNotFoundException ex) {
// If anything happens to the file during this process reply a 404 too.
throw new NotFoundException(ex);
}
}
}
} |
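The traversal check in handle() above (resolving the requested name under the base directory and rejecting anything that escapes it) is a common pattern. Here is a minimal Python sketch of the same idea, independent of the framework classes used in this file; the paths are hypothetical.

from pathlib import Path

def resolve_safe(base_dir, requested):
    # Resolve `requested` under `base_dir`, rejecting directory traversal.
    base = Path(base_dir).resolve()
    candidate = (base / requested).resolve()
    # The candidate must stay inside the base directory (compare resolved, absolute paths).
    if candidate != base and base not in candidate.parents:
        return None
    return candidate

print(resolve_safe('/var/www/static', 'css/site.css'))      # /var/www/static/css/site.css
print(resolve_safe('/var/www/static', '../../etc/passwd'))  # None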
package net.openid.conformance.fapirwid2;
import net.openid.conformance.condition.Condition;
import net.openid.conformance.condition.client.AddBadRequestUriToAuthorizationRequest;
import net.openid.conformance.condition.client.CallPAREndpoint;
import net.openid.conformance.condition.client.EnsurePARInvalidRequestObjectError;
import net.openid.conformance.sequence.ConditionSequence;
import net.openid.conformance.testmodule.PublishTestModule;
import net.openid.conformance.variant.FAPIAuthRequestMethod;
import net.openid.conformance.variant.VariantNotApplicable;
//PAR-2.1 : The request_uri authorization request parameter MUST NOT be provided in this case
@PublishTestModule(
testName = "fapi-rw-id2-par-authorization-request-containing-request_uri",
displayName = "PAR : authorization request must not contain request_uri parameter",
summary = "This test sends a random request_uri parameter in authorization request object and expects authorization server to return an error",
profile = "FAPI-RW-ID2",
configurationFields = {
"server.discoveryUrl",
"client.client_id",
"client.scope",
"client.jwks",
"mtls.key",
"mtls.cert",
"mtls.ca",
"client2.client_id",
"client2.scope",
"client2.jwks",
"mtls2.key",
"mtls2.cert",
"mtls2.ca",
"resource.resourceUrl"
}
)
@VariantNotApplicable(parameter = FAPIAuthRequestMethod.class, values = {
"by_value"
})
public class FAPIRWID2PARRejectRequestUriInParAuthorizationRequest extends AbstractFAPIRWID2ServerTestModule {
@Override
protected ConditionSequence makeCreateAuthorizationRequestObjectSteps() {
return super.makeCreateAuthorizationRequestObjectSteps().
butFirst(condition(AddBadRequestUriToAuthorizationRequest.class).requirement("PAR-2"));
}
@Override
protected void performParAuthorizationRequestFlow() {
callAndStopOnFailure(CallPAREndpoint.class);
// this might be too strict, the spec mentions this error but doesn't require servers to use it
// the only firm requirement is for the http status code to indicate failure
callAndContinueOnFailure(EnsurePARInvalidRequestObjectError.class, Condition.ConditionResult.FAILURE, "JAR-6.2", "PAR-2.1");
fireTestFinished();
}
}
|
Inefficient postural responses to unexpected slips during walking in older adults.
BACKGROUND
Slips account for a high percentage of falls and subsequent injuries in community-dwelling older adults but not in young adults. This phenomenon suggests that although active and healthy older adults preserve a mobility level comparable to that of young adults, these older adults may have difficulty generating efficient reactive postural responses when they slip. This study tested the hypothesis that active and healthy older adults use a less effective reactive balance strategy than young adults when experiencing an unexpected forward slip occurring at heel strike during walking. This less effective balance strategy would be manifested by slower and smaller postural responses, altered temporal and spatial organization of the postural responses, and greater upper trunk instability after the slip.
METHODS
Thirty-three young adults (age range=19-34 yrs, mean=25+/-4 yrs) and 32 community-dwelling older adults (age range=70-87 yrs, mean=74+/-14 yrs) participated. Subjects walked across a movable forceplate which simulated a forward slip at heel strike. Surface electromyography was recorded from bilateral leg, thigh, hip, and trunk muscles. Kinematic data were collected from the right (perturbed) side of the body.
RESULTS
Although the predominant postural muscles and the activation sequence of these muscles were similar between the two age groups, the postural responses of older adults were of longer onset latencies, smaller magnitudes, and longer burst durations compared to young adults. Older adults also showed a longer coactivation duration for the ankle, knee, and trunk agonist/antagonist pairs on the perturbed side and for the knee agonist/antagonist pair on the nonperturbed side. Behaviorally, older adults became less stable after the slips. This was manifested by a higher incidence of being tripped (21 trials in older vs 5 trials in young adults) and a greater trunk hyperextension with respect to young adults. Large arm elevation was frequently used by older adults to assist in maintaining trunk stability. In an attempt to quickly reestablish the base of support after the slips, older adults had an earlier contralateral foot strike and shortened stride length.
CONCLUSION
The combination of slower onset and smaller magnitude of postural responses to slips in older adults resulted in an inefficient balance strategy. Older adults needed secondary compensatory adjustments, including a lengthened response duration and the use of the arms, to fully regain balance and prevent a fall. The shorter stride length and earlier contralateral foot strike following the slip indicate use of a more conservative balance strategy in older adults. |
/**
* Write ClientVmInfo for delegate and another (nonDelegate) host to BB for future use
* with kill methods above.
*/
protected void postClientVmInfoToBB() {
PoolImpl pool = (PoolImpl)PoolManager.find("brloader");
ServerLocation delegate = pool.getServerAffinityLocation();
int delegatePort = delegate.getPort();
String delegateHost = delegate.getHostName();
int dot = delegateHost.indexOf(".");
if (dot != -1) {
delegateHost = delegateHost.substring(0, dot);
}
Log.getLogWriter().info("Delegate is " + delegateHost + ":" + delegatePort);
ClientVmInfo delegateVmInfo = null;
ClientVmInfo nonDelegateVmInfo = null;
List endpoints = BridgeHelper.getEndpoints();
for (Iterator i = endpoints.iterator(); i.hasNext();) {
BridgeHelper.Endpoint endpoint = (BridgeHelper.Endpoint)i.next();
if ((endpoint.getHost().equalsIgnoreCase(delegateHost)) && (endpoint.getPort() == delegatePort)) {
delegateVmInfo = new ClientVmInfo(endpoint);
} else {
nonDelegateVmInfo = new ClientVmInfo(endpoint);
}
}
TxBB.getBB().getSharedMap().put(TxBB.delegate, delegateVmInfo);
TxBB.getBB().getSharedMap().put(TxBB.nonDelegateServer, nonDelegateVmInfo);
Log.getLogWriter().info("Writing delegateVmInfo to BB: " + delegateVmInfo);
Log.getLogWriter().info("Writing nonDelegateVmInfo to BB: " + nonDelegateVmInfo);
} |
package srvcln
import (
"bufio"
"log"
"net"
"strings"
"sync"
)
// Srv test server
type Srv struct {
Addr string
wg sync.WaitGroup
connCnt int
}
func (s *Srv) Run() {
ln, err := net.Listen("tcp", s.Addr)
if err != nil {
log.Printf("listen err: %v", err)
return
}
for {
conn, err := ln.Accept()
if err != nil {
log.Printf("accept err: %v", err)
continue
}
s.connCnt++
s.wg.Add(1)
go connHandler(conn, &s.wg)
}
}
func connHandler(conn net.Conn, wg *sync.WaitGroup) {
defer wg.Done()
defer conn.Close()
// Create the buffered reader once; rebuilding it on every iteration can drop already-buffered bytes.
rd := bufio.NewReader(conn)
for {
line, err := rd.ReadString('\n')
if err != nil {
// log.Printf("srv: conn: read line err: %v", err)
return
}
if strings.TrimSpace(line) == "stop" {
return
}
}
}
|
Essendon Football Club is delighted to announce the incredible career of veteran, Dustin Fletcher, is set to continue after the evergreen defender committed to the Club for the 2014 season.
The 38-year-old announced his contract extension in front of 1500 guests at tonight’s Crichton Medal count at Crown Palladium.
Entering his 22nd season, Dustin currently sits equal with Simon Madden on 378 games and is now destined to become the club’s games record holder next year.
“To be honest missing the last couple of games this year made me really hungry to play again in 2014,” said Fletcher.
“I’ve said it before, but to be able to come to the club and work with 40 of your mates and play footy for a living is a great job.”
“Breaking the club’s games record wasn’t a motivating factor for me. The body is feeling good, I’ve still got my speed and we’ve got a great group of players here.”
“While I can still contribute and play my role for the team I want to keep playing.”
Assistant coach Simon Goodwin said the experience of a dual premiership player and All-Australian in the backline was invaluable.
“As has been the case the past couple of seasons, the decision to play on was entirely up to Fletch,” Goodwin said.
“It’s great to have Dustin sign on for next year, he’s a highly respected member of our playing group and he brings so much experience to the group, and is a terrific teacher and mentor for our defenders.”
List manager Adrian Dodoro said it was great to have a legend of the club continue for another year.
“To be able to line up for 22 seasons is an incredible effort and it says so much about his durability but also his ability to adapt to the game which has changed so much over the course of his career,” Dodoro said.
“He is a great reader of the play, still has his speed and that inner determination to beat his opponent at every contest." |
/**
* Returns the single instance of this class.
*
* @return The single instance of this class.
*/
public static IconToolkit instance() {
if (fInstance == null) {
fInstance = new IconToolkit();
}
return fInstance;
} |
/********************************************************************
* Base class for interactive components that display a list of selectable
* values.
*
* @author eso
*/
public abstract class UiListControl<T, C extends UiListControl<T, C>>
extends UiControl<T, C> implements UiHasUpdateEvents<T, C>
{
//~ Constructors -----------------------------------------------------------
/***************************************
* Creates a new instance for an existing parameter type.
*
* @param rParent The parent container
* @param rParamType The parameter relation type
* @param eListStyle The list style
*/
public UiListControl(UiContainer<?> rParent,
RelationType<T> rParamType,
ListStyle eListStyle)
{
super(rParent, rParamType);
set(LIST_STYLE, eListStyle);
}
/***************************************
* Creates a new instance. If the datatype is an enum all enum values will
* be pre-set as the list values.
*
* @param rParent The parent container
* @param rDatatype The datatype of the list values
* @param eListStyle The list style
*/
public UiListControl(UiContainer<?> rParent,
Class<? super T> rDatatype,
ListStyle eListStyle)
{
super(rParent, rDatatype);
set(LIST_STYLE, eListStyle);
}
//~ Methods ----------------------------------------------------------------
/***************************************
* Returns the currently selected list value.
*
* @return The selected value (NULL for none)
*/
public T getSelection()
{
return getValueImpl();
}
/***************************************
* Sets the event handler for selection events of this table.
*
* @param rEventHandler The event handler
*
* @return This instance for concatenation
*/
public final C onSelection(Consumer<T> rEventHandler)
{
return setParameterEventHandler(InteractionEventType.UPDATE,
v -> rEventHandler.accept(v));
}
/***************************************
* {@inheritDoc}
*/
@Override
public C onUpdate(Consumer<T> rEventHandler)
{
return onSelection(rEventHandler);
}
/***************************************
* Sets the selected value.
*
* @param rValue The new selection or NULL for none
*
* @return This instance
*/
public C select(T rValue)
{
return setValueImpl(rValue);
}
/***************************************
* Sets the selected value.
*
* @param rValue The new selection or NULL for none
*/
public void setSelection(T rValue)
{
select(rValue);
}
} |
/**
* @author Gail Badner
*/
public class SessionFactoryBuilderImpl implements SessionFactoryBuilder {
SessionFactoryOptionsImpl options;
private final MetadataImplementor metadata;
/* package-protected */
SessionFactoryBuilderImpl(MetadataImplementor metadata) {
this.metadata = metadata;
options = new SessionFactoryOptionsImpl();
}
@Override
public SessionFactoryBuilder with(Interceptor interceptor) {
this.options.interceptor = interceptor;
return this;
}
@Override
public SessionFactoryBuilder with(EntityNotFoundDelegate entityNotFoundDelegate) {
this.options.entityNotFoundDelegate = entityNotFoundDelegate;
return this;
}
@Override
public SessionFactory buildSessionFactory() {
return new SessionFactoryImpl(metadata, options, null );
}
private static class SessionFactoryOptionsImpl implements SessionFactory.SessionFactoryOptions {
private Interceptor interceptor = EmptyInterceptor.INSTANCE;
// TODO: should there be a DefaultEntityNotFoundDelegate.INSTANCE?
private EntityNotFoundDelegate entityNotFoundDelegate = new EntityNotFoundDelegate() {
public void handleEntityNotFound(String entityName, Serializable id) {
throw new ObjectNotFoundException( id, entityName );
}
};
@Override
public Interceptor getInterceptor() {
return interceptor;
}
@Override
public EntityNotFoundDelegate getEntityNotFoundDelegate() {
return entityNotFoundDelegate;
}
}
} |
import { Component, Input, Output, EventEmitter } from "@angular/core";
import { Role } from "../../../security/shared/role.model";
import { RolePermissionMap } from "../../../security/shared/role-permission-map.model";
import { Permission } from "../../../security/shared/permission.model";
import { Application } from "../../../security/shared/application.model";
import { SettingsBLService } from '../../shared/settings.bl.service';
import { SecurityService } from '../../../security/shared/security.service';
import { MessageboxService } from '../../../shared/messagebox/messagebox.service';
import * as moment from 'moment/moment';
import { DanpheHTTPResponse } from "../../../shared/common-models";
@Component({
selector: 'permission-manage',
templateUrl: "./role-permission-manage.html"
})
export class RolePermissionManageComponent {
@Input("application-perm-list")
public applicationList: Array<Application> = new Array<Application>();
@Input("selectedRole")
public selectedRole: Role;
@Output("callback-manageRole")
callbackManageRole: EventEmitter<Object> = new EventEmitter<Object>();
public selectedRolePermissionList: Array<RolePermissionMap> = new Array<RolePermissionMap>();
public selectedItem: Permission;
public roleId: number;
public allPermissionList: Array<Permission> = [];
constructor(public settingsBLService: SettingsBLService,
public securityService: SecurityService,
public msgBoxServ: MessageboxService) {
}
ngOnInit() {
this.roleId = this.selectedRole.RoleId;
this.GetRolePermissionList(this.roleId);
this.LoadAllPermissionList();
}
LoadAllPermissionList() {
if (this.applicationList) {
this.applicationList.forEach(app => {
this.allPermissionList = this.allPermissionList.concat(app.Permissions);
});
}
}
GetRolePermissionList(roleId: number) {
this.settingsBLService.GetRolePermissionList(roleId)
.subscribe(res => {
if (res.Status == 'OK') {
this.selectedRolePermissionList = res.Results;
if (this.selectedRolePermissionList) {
this.selectedRolePermissionList.forEach(p => {
p.IsSelected = true;
});
}
this.SetIsSelectedToApplicationPermissions();
} else {
this.msgBoxServ.showMessage("failed", [res.ErrorMessage]);
}
},
err => {
this.msgBoxServ.showMessage("error", ['Failed to get application list.. please check log for details.'], err.ErrorMessage);
});
}
  /// for the search box: validates the selection and calls the RolePermissionEventHandler function
SelectPermissionSearchBox(selectedItem: Permission) {
if (typeof selectedItem === "object" && !Array.isArray(selectedItem) && selectedItem !== null) {
      // check if the item already exists on the selected list.
      let check = false;
      for (let sel of this.selectedRolePermissionList) {
        if (sel.PermissionId == selectedItem.PermissionId) {
          check = true;
          break;
        }
      }
      if (!check) {
selectedItem.IsSelected = true;
this.RolePermissionEventHandler(selectedItem);
//this.ChangeMainListSelectStatus(selectedItem.PermissionId, true);
}
else {
this.msgBoxServ.showMessage("error", ["This item is already added"]);
}
}
}
public RolePermissionEventHandler(currItem) {
if (currItem.IsSelected) {
//this is needed for application select-all feature.
let isPermAlreadyExist = this.selectedRolePermissionList.filter(p => p.PermissionId == currItem.PermissionId).length > 0;
if (!isPermAlreadyExist) {
        // add item to selectedItemList or existingModifiedList depending on condition
var rolePermission: RolePermissionMap = new RolePermissionMap();
rolePermission.RoleId = this.roleId;
rolePermission.PermissionId = currItem.PermissionId;
rolePermission.PermissionName = currItem.PermissionName;
rolePermission.ApplicationId = currItem.ApplicationId;
rolePermission.IsSelected = true;
rolePermission.IsActive = true;
rolePermission.CreatedBy = this.securityService.GetLoggedInUser().EmployeeId;
rolePermission.CreatedOn = moment().format('YYYY-MM-DD HH:mm');
this.selectedRolePermissionList.push(rolePermission);
}
}
    // remove item from selectedList or existingModifiedList
else {
var index = this.selectedRolePermissionList.findIndex(x => x.PermissionId == currItem.PermissionId);
this.selectedRolePermissionList.splice(index, 1);
}
let selApplication = this.applicationList.find(a => a.ApplicationId == currItem.ApplicationId);
//when coming from right panel, we have to check/un-check that item from Application's Permission List as well.
let selAppnPermission = selApplication.Permissions.find(p => p.PermissionId == currItem.PermissionId);
selAppnPermission.IsSelected = currItem.IsSelected;
if (selApplication.Permissions.every(p => p.IsSelected)) {
selApplication.IsApplicationNameSelected = true;
}
else {
selApplication.IsApplicationNameSelected = false;
}
}
  // initially select the items in the main list that already exist in the existingItemList
SetIsSelectedToApplicationPermissions() {
this.applicationList.forEach(app => {
if (app.Permissions) {
app.Permissions.forEach(perm => {
let currPerm = this.selectedRolePermissionList.find(p => p.PermissionId == perm.PermissionId);
if (currPerm) {
perm.IsSelected = true;
}
else {
perm.IsSelected = false;
}
});
if (app.Permissions.every(p => p.IsSelected)) {
app.IsApplicationNameSelected = true;
}
else {
app.IsApplicationNameSelected = false;
}
}
});
}
Submit() {
//var addList: Array<RolePermissionMap>;
this.settingsBLService.AddRolePermissions(this.selectedRolePermissionList, this.roleId)
.subscribe((res: DanpheHTTPResponse) => {
if (res.Status == "OK") {
this.msgBoxServ.showMessage("success", ["Added and Updated RolePermissions"]);
this.callbackManageRole.emit();
}
else {
this.msgBoxServ.showMessage("error", ["Failed to Add/Update RolePermissions. Please Check log details."], res.ErrorMessage);
}
});
}
//used to format display item in ng-autocomplete
myListFormatter(data: any): string {
let html = data["PermissionName"];
return html;
}
logError(err: any) {
console.log(err);
}
public OnApplicationNameSelected(changedApp: Application) {
let currApp = this.applicationList.find(app => app.ApplicationId == changedApp.ApplicationId);
if (currApp.IsApplicationNameSelected) {
currApp.Permissions.forEach(perm => {
perm.IsSelected = true;
this.RolePermissionEventHandler(perm);
});
}
else {
currApp.Permissions.forEach(perm => {
perm.IsSelected = false;
this.RolePermissionEventHandler(perm);
});
}
}
public selectPermission($data){
if($data.IsSelected == true){
$data.IsSelected = false;
this.RolePermissionEventHandler($data);
}else{
$data.IsSelected = true;
this.RolePermissionEventHandler($data);
}
}
}
|
/**
* <p>An implementation of an OJB TransactionManagerFactory which provides access to Workflow's
* JTA UserTransaction.</p>
*
* <p>If the TransactionManager singleton has been set via {@link #setTransactionManager(TransactionManager)}
* then that reference is returned, otherwise the TransactionManager is pulled from Workflow's Spring core
* via the SpringServiceLocator.</p>
*
* <p>When accessed from outside the workflow core (i.e. embedded mode), the transaction manager
* singleton MUST explicitly be set - it cannot be resolved through the SpringServiceLocator.</p>
*
* <p>Note: if OJB is caused to initialize DURING Spring initialization (for example, by programmatically
* obtaining the OJB PersistenceBrokerFactory to set the platform attribute of connection descriptors
* from within a bean initialized by Spring), the TransactionManager singleton MUST be set beforehand,
* otherwise NPE will result from attempting to traverse SpringServiceLocator as the GlobalResourceLoader
* will not have been initialized yet).</p>
*
 * <p>This TransactionManagerFactory implementation is registered with OJB by the following
 * setting in the OJB properties:</p>
* <blockquote>
* <code>
* JTATransactionManagerClass=org.kuali.rice.core.database.WorkflowTransactionManagerFactory
* </code>
* </blockquote>
*
* @author Kuali Rice Team ([email protected])
*/
public class TransactionManagerFactory implements org.apache.ojb.broker.transaction.tm.TransactionManagerFactory {
private static TransactionManager transactionManager;
public TransactionManager getTransactionManager() throws TransactionManagerFactoryException {
// return SpringServiceLocator.getJtaTransactionManager().getTransactionManager();
if (transactionManager == null) {
throw new RiceRuntimeException("The JTA Transaction Manager for OJB was not configured properly.");
}
return transactionManager;
// core and plugins
// TODO what to do here
//return KSBServiceLocator.getTransactionManager();
}
public static void setTransactionManager(TransactionManager transactionManager) {
TransactionManagerFactory.transactionManager = transactionManager;
}
} |
// src/main/java/me/itzg/mccy/config/MccySecuritySettings.java
package me.itzg.mccy.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
import java.util.List;
/**
* @author <NAME>
* @since 0.2
*/
@ConfigurationProperties("mccy.security")
@Component
public class MccySecuritySettings {
private AllowAnonymous allowAnonymous = new AllowAnonymous();
public AllowAnonymous getAllowAnonymous() {
return allowAnonymous;
}
public void setAllowAnonymous(AllowAnonymous allowAnonymous) {
this.allowAnonymous = allowAnonymous;
}
public static class AllowAnonymous {
/**
* Comma-separated list of paths that can be accessed anonymously via a GET
*/
private List<String> get;
public List<String> getGet() {
return get;
}
public void setGet(List<String> get) {
this.get = get;
}
}
}
|
Toward Unifying the Mechanistic Concepts in Electrochemical CO2 Reduction from an Integrated Material Design and Catalytic Perspective
Electrocatalytic CO2 reduction (eCO2RR) is one of the avenues with most potential toward achieving sustainable energy economy and global climate change targets by harvesting renewable energy into value‐added fuels and chemicals. From an industrial standpoint, eCO2RR provides specific advantages over thermochemical and photochemical pathways in terms of much broader product scope, high product specificity, and easy adaptability to the renewable electricity infrastructure. However, unlike water electrolyzers, the lack of suitable cathode materials for eCO2RR impedes its commercialization due to material design challenges. The current state‐of‐the‐art catalysts in eCO2RR suffer largely from low reaction rates, insufficient C2+ product selectivity, high overpotentials, and industrial‐scale stability. Overcoming the scientific and applied technical hurdles for commercial realization demands a holistic integration of catalytic designs, deep mechanistic understanding, and efficient process engineering. Special emphasis on mechanistic understanding and performance outcome is sought to guide the future design of eCO2RR catalysts that can play a significant role in closing the anthropogenic carbon loop. This article provides an integrative approach to understand principles of robust eCO2RR catalyst design superimposed with underlying mechanistic projections which strongly depend on experimental conditions viz. choice of electrolyte, reactor and membrane design, pH of the solvent, and partial pressure of the CO2. |
import re
def strip_comments(bytes):
lines = bytes.split(b"\n")
comment = rb"#.*"
return b"\n".join([re.sub(comment, b"", line) for line in lines])
def strip_extra_newlines(bytes):
two_plus_newlines = rb"\n\n+"
return re.sub(two_plus_newlines, b"\n", bytes)
def strip_trailing_whitespace(bytes):
trailing_whitespace = rb"\s+\n"
return re.sub(trailing_whitespace, b"\n", bytes)
def strip_operator_whitespace(bytes):
ops = [
r"\+|-|\*|%|/|,|=|\(|\)|\[|\]|\{|\}|:|>|<|&|\||\^|~",
r"\+=|-=|\*=|%=|/=|==|\*\*|//|!=|<>|>=|<=|<<|>>",
r"\*\*=|//=",
]
space = "[ \t]*"
pre_space = "|".join([rf"({space}(?={op}))" for op in ops])
post_space = "|".join([rf"((?<={op}){space})" for op in ops])
pre_or_post_space = rf"({pre_space})|({post_space})".encode("utf-8")
return re.sub(pre_or_post_space, b"", bytes)
def replace_spaces_with_tabs(bytes):
return re.sub(b" {4}", b"\t", bytes)
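# --- Illustrative addition, not part of the original module: a minimal sketch chaining the
# --- helpers above. The function name `minify`, the pass ordering, and the sample byte string
# --- are my assumptions.
def minify(source):
    out = strip_comments(source)
    out = strip_trailing_whitespace(out)
    out = strip_extra_newlines(out)
    out = strip_operator_whitespace(out)
    return replace_spaces_with_tabs(out)

if __name__ == "__main__":
    sample = b"x = 1  # set x\n\n\ny = x + 2    \n"
    print(minify(sample))  # b'x=1\ny=x+2\n'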
|
Propagation of Bamboo (Dendrocalamus giganteus, Munro) Through Culm-Branch Cuttings in Egypt
Bamboo plants are an essential element in Egyptian agriculture development. Nothing was found on propagation of Dendrocalamus giganteus under the Egyptian conditions. Thus, two trials were done to evaluate the effect of position of cuttings on the stem, time of propagation, indole-3-butyric acid (IBA) treatments, and planting method on rooting of culm and culm-branch cuttings of bamboo. Three cutting types (basal, middle and tip) of single and double nodes were collected from culm (main stem) and culm-branch (lateral branch) cuttings of bamboo during September 2004 and March 2005. Culm-branch cuttings were treated for 24 h with IBA at 0, 50, 100 and 200 ppm. Cuttings were planted horizontally and/or vertically in clay soil. The experiments were arranged in a Randomized Complete Block Design as a split plot with three replicates. The obtained data were statistically analyzed and revealed the following: cuttings Meanwhile, ratio than that in tip ones. The best results were obtained from treating basal culm-branch cuttings with 100 ppm IBA. Vertical planting of double-node cuttings reduced their rootability.
import { HttpClient } from '@angular/common/http';
import { Configuration } from '../configuration/configuration';
export class BaseResource {
protected baseUrl: string;
constructor(
protected http: HttpClient,
protected configuration: Configuration,
private suffix?: string
) {
this.baseUrl = `${this.configuration.restUrl}`;
this.setSuffix(suffix);
}
public getBaseUrl(): string {
return this.baseUrl;
}
public setSuffix(suffix?: string): void {
if (suffix) {
this.baseUrl = `${this.configuration.restUrl}/${suffix}`;
}
}
public composeQueryString(object: any): string {
let result = '';
let isFirst = true;
if (object) {
Object.keys(object)
.filter(key => object[key] !== null && object[key] !== undefined)
.forEach(key => {
let value = object[key];
if (value instanceof Date) {
value = value.toISOString();
}
if (isFirst) {
result = '?' + key + '=' + value;
isFirst = false;
} else {
result += '&' + key + '=' + value;
}
});
}
return result;
}
}
|
/**
 * Given an integer array, return its maximum element.
 */
public class MaxElement {
public static int getMaxElement(int[] arr) {
        int max = Integer.MIN_VALUE; // start below any int value so arrays containing only negative numbers are handled correctly
for (int num : arr) {
if (num > max)
max = num;
}
return max;
}
public static void main(String[] args) {
int[] input = {1, 2, 3, 4, 5};
System.out.println(getMaxElement(input));
}
} |
/**
* The groups are shown in "Project Type" selection list.
* @author Dmitry Avdeev
*/
public class TemplatesGroup implements Comparable<TemplatesGroup> {
private final String myName;
private final String myDescription;
private final Icon myIcon;
private final int myWeight;
private final String myParentGroup;
private final String myId;
private final ModuleBuilder myModuleBuilder;
private ProjectCategory myProjectCategory;
private boolean mySafeToReport = false;
public TemplatesGroup(String name, String description, Icon icon, int weight, String parentGroup, String id, ModuleBuilder moduleBuilder) {
myName = name;
myDescription = description;
myIcon = icon;
myWeight = weight;
myParentGroup = parentGroup;
myId = id;
myModuleBuilder = moduleBuilder;
}
/**
* Category-based group
* @param category
*/
public TemplatesGroup(ProjectCategory category) {
this(category.getDisplayName(), category.getDescription(), category.getIcon(), category.getWeight(), category.getGroupName(), category.getId(), category.createModuleBuilder());
myProjectCategory = category;
}
public TemplatesGroup(ModuleBuilder builder) {
this(builder.getPresentableName(), builder.getDescription(), builder.getNodeIcon(), builder.getWeight(), builder.getParentGroup(), builder.getBuilderId(), builder);
}
@Nullable
public ModuleBuilder getModuleBuilder() {
return myModuleBuilder;
}
public ProjectCategory getProjectCategory() { return myProjectCategory; }
public String getName() {
return myName;
}
public String getDescription() {
return myDescription;
}
public Icon getIcon() {
return myIcon;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
TemplatesGroup group = (TemplatesGroup)o;
if (!myName.equals(group.myName)) return false;
return true;
}
public int getWeight() {
return myWeight;
}
@Override
public int hashCode() {
return myName.hashCode();
}
@Override
public String toString() {
return myName;
}
@Override
public int compareTo(@NotNull TemplatesGroup o) {
int i = o.myWeight - myWeight;
if (i != 0) return i;
int i1 = Comparing.compare(o.getParentGroup(), getParentGroup());
if (i1 != 0) return i1;
return o.getName().compareTo(getName());
}
public String getParentGroup() {
return myParentGroup;
}
public String getId() {
return myId;
}
public boolean isSafeToReport() {
if (myModuleBuilder != null) {
return PluginInfoDetectorKt.getPluginInfo(myModuleBuilder.getClass()).isSafeToReport();
}
return mySafeToReport;
}
public void setSafeToReport(boolean report) {
mySafeToReport = report;
}
} |
A miniaturized dual-fibre laser Doppler sensor
The integration of laser Doppler velocimeters into small components such as wind tunnel models requires highly miniaturized sensors. This paper presents a novel miniaturization concept, suitable for reducing the dimensions of a fibre-coupled laser Doppler sensor to a millimetre scale. Only a few low-cost, passive elements such as two fibres, a diffraction grating and refractive optics are employed. Preliminary results of a realized sensor head, of about 7 mm diameter and about 15 mm length, are presented. The generated measurement volume of about 0.02 mm diameter and about 0.1 mm length allows highly spatially-resolved velocity measurements. The outstanding features of the sensor are its simplicity, ruggedness and the high optical power of about 0.5 W in the measurement volume. |
The Commonwealth Ombudsman's Power to Compel Testimonial Activity for the Purpose of an Investigation
For the purpose of reaching a decision about potentially defective administrative action into which he is conducting an investigation, the Ombudsman may wish to compel people to engage in various sorts of testimonial activity—the furnishing of information in writing, the production of documents, the answering of questions orally. This article examines the Ombudsman's powers in that regard, pointing to matters which may give rise to difficulties in the exercise of such powers and suggesting a number of changes to the relevant provisions. Some of the matters discussed are relevant to the information-gathering powers of other Commonwealth agencies, for example, the Taxation Commissioner and the Trade Practices Commission. Not discussed in the article is the question of excuses which can be made to avoid complying with a valid request once made, a subject which deserves its own treatment separately. |
def keyvault_secret_value_get(self, keyvault_name=None, secret_name=None, client_id=None):
secret = self.keyvault_secret_get(keyvault_name=keyvault_name, secret_name=secret_name, client_id=client_id)
if secret:
return secret.value
return None |
// In case of the Allegro Hand, this callback is processed
// every 0.003 seconds
void AllegroNodeVelSat::computeDesiredTorque() {
if (controlPD) {
for (int i = 0; i < DOF_JOINTS; i++) {
desired_velocity[i] = (k_p[i] / k_d[i]) * (desired_position[i] -
current_position_filtered[i]);
v[i] = std::min(1.0, v_max[i] /
fabs(desired_velocity[i]));
desired_torque[i] = -k_d[i] * (current_velocity_filtered[i] -
v[i] * desired_velocity[i]);
desired_torque[i] = desired_torque[i] / canDevice->torqueConversion();
}
}
else {
for (int i = 0; i < DOF_JOINTS; i++) desired_torque[i] = 0.0;
}
} |
// packages/web/src/app/contexts/general/modals/modals.tsx
import * as React from 'react';
import { Suspense, useCallback, useContext, useEffect, useMemo, useState } from 'react';
import { styled } from '@linaria/react';
import type { LoadableComponent } from '@loadable/component';
import loadable from '@loadable/component';
import { zIndexes } from '@p2p-wallet-web/ui';
import classNames from 'classnames';
import type { ModalPropsType } from 'app/contexts/general/modals/types';
import { ModalType } from 'app/contexts/general/modals/types';
const Wrapper = styled.div`
position: fixed;
top: 0;
right: 0;
bottom: 0;
left: 0;
z-index: ${zIndexes.modal};
display: flex;
flex-direction: column;
background-color: rgba(0, 0, 0, 0.6);
user-select: none;
&.nav {
bottom: 57px;
}
`;
const ModalWrapper = styled.div`
display: flex;
align-items: center;
justify-content: center;
height: 100%;
`;
type ModalState = { modalType: ModalType; modalId: number; props: any };
type GetPresetFn = (modal?: ModalType) => Preset;
type Preset = 'nav' | 'regular';
const modalsMap = new Map<ModalType, LoadableComponent<ModalPropsType & any>>([
// [SHOW_MODAL_ADD_COIN, loadable(() => import('components/modals/__AddCoinModal'))],
[
ModalType.SHOW_MODAL_ACTIONS_MOBILE,
loadable(() => import('components/modals/ActionsMobileModal')),
],
[
ModalType.SHOW_MODAL_RECEIVE_BITCOIN,
loadable(() => import('components/modals/ReceiveBitcoinModal')),
],
[
ModalType.SHOW_MODAL_TRANSACTION_CONFIRM,
loadable(() => import('components/modals/TransactionConfirmModal')),
],
[
ModalType.SHOW_MODAL_TRANSACTION_DETAILS,
loadable(() => import('components/modals/TransactionInfoModals/TransactionDetailsModal')),
],
[
ModalType.SHOW_MODAL_TRANSACTION_STATUS_SEND,
loadable(() => import('components/modals/TransactionInfoModals/TransactionStatusSendModal')),
],
[
ModalType.SHOW_MODAL_TRANSACTION_STATUS_SWAP,
loadable(() => import('components/modals/TransactionInfoModals/TransactionStatusSwapModal')),
],
[
ModalType.SHOW_MODAL_CLOSE_TOKEN_ACCOUNT,
loadable(() => import('components/modals/CloseTokenAccountModal')),
],
[
ModalType.SHOW_MODAL_PROCEED_USERNAME,
loadable(() => import('components/modals/ProceedUsernameModal')),
],
[
ModalType.SHOW_MODAL_CHOOSE_BUY_TOKEN_MOBILE,
loadable(() => import('components/modals/ChooseBuyTokenMobileModal')),
],
[
ModalType.SHOW_MODAL_SELECT_LIST_MOBILE,
loadable(() => import('components/modals/SelectListMobileModal')),
],
[ModalType.SHOW_MODAL_ERROR, loadable(() => import('components/modals/ErrorModal'))],
]);
const promises = new Map();
let modalIdCounter = 0;
const getPreset: GetPresetFn = (modal) => {
switch (modal) {
case ModalType.SHOW_MODAL_ACTIONS_MOBILE:
return 'nav';
default:
return 'regular';
}
};
const ModalsContext = React.createContext<{
openModal: <T, S extends {}>(modalType: ModalType, props?: S) => Promise<T | void>;
closeModal: (modalId: number) => void;
closeTopModal: () => void;
}>({
openModal: () => Promise.resolve(),
closeModal: () => {},
closeTopModal: () => {},
});
export function ModalsProvider({ children = null as any }) {
const [modals, setModals] = useState<ModalState[]>([]);
const setPageScroll = (overflow: 'hidden' | 'scroll') =>
(document.documentElement.style.overflow = overflow);
useEffect(() => {
setPageScroll(modals.length ? 'hidden' : 'scroll');
}, [modals.length]);
const openModal = useCallback((modalType: ModalType, props?: any) => {
++modalIdCounter;
setModals((state) => [
...state,
{
modalType,
modalId: modalIdCounter,
props,
},
]);
const promise = new Promise((resolve) => {
promises.set(modalIdCounter, {
modalId: modalIdCounter,
resolve,
});
});
    (promise as any).modalId = modalIdCounter; // attach the id for callers; Promise's type has no such property
return promise;
}, []);
const closeModal = useCallback((modalId: number, result?: any) => {
setModals((state) => state.filter((modal) => modal.modalId !== modalId));
const dialogInfo = promises.get(modalId);
if (dialogInfo) {
dialogInfo.resolve(result);
promises.delete(modalId);
}
return result;
}, []);
const closeTopModal = useCallback(() => {
if (!modals.length) {
return;
}
closeModal(modals[modals.length - 1].modalId);
}, [modals]);
const handleWrapperClick = useCallback(
(e: React.MouseEvent<HTMLDivElement>) => {
// handle click only on element
if (e.target !== e.currentTarget) {
return;
}
closeTopModal();
},
[closeTopModal],
);
const preparedModals = useMemo(() => {
return modals.map((modal) => {
const ModalComponent = modalsMap.get(modal.modalType);
if (!ModalComponent) {
return null;
}
return (
<Suspense fallback={null} key={modal.modalId}>
<ModalWrapper onMouseDown={handleWrapperClick}>
<ModalComponent
{...modal.props}
key={modal.modalId}
close={(result?: any) => closeModal(modal.modalId, result)}
/>
</ModalWrapper>
</Suspense>
);
});
}, [modals, handleWrapperClick, closeModal]);
const preset = getPreset(modals.at(-1)?.modalType);
return (
<ModalsContext.Provider
value={{
openModal,
closeModal,
closeTopModal,
}}
>
{children}
{preparedModals.length > 0 ? (
<Wrapper className={classNames(preset)}>{preparedModals}</Wrapper>
) : undefined}
</ModalsContext.Provider>
);
}
export function useModals() {
const { openModal, closeModal, closeTopModal } = useContext(ModalsContext);
return { openModal, closeModal, closeTopModal };
}
|
import argparse
from pprint import pformat
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from scripts.utils.utils import init_logger, load_npz, save_data_to_json
logger = init_logger()
# returns the sklearn cluster model matching the clustering method cluster_method, producing num_clusters clusters
# cluster_method == 'kmeans' -> returns sklearn K-Means
# cluster_method == aggl-avg[-cos] -> returns sklearn AgglomerativeClustering with Euclidean distance [with cosine dissimilarity]
# cluster_method == aggl-ward -> returns sklearn AgglomerativeClustering with Ward clustering
def get_cluster_model(cluster_method, num_clusters):
if cluster_method == 'kmeans':
return KMeans(n_clusters=num_clusters, n_init=10, init='k-means++', max_iter=1000000, verbose=False, n_jobs=-1)
if cluster_method.startswith('aggl'):
linkages = {
'aggl-ward': 'ward',
'aggl-avg': 'average',
'aggl-avg-cos': 'average',
}
affinites = {
'aggl-ward': 'euclidean',
'aggl-avg': 'euclidean',
'aggl-avg-cos': 'cosine',
}
        # AgglomerativeClustering stores a cache of the dendrogram in the ".cache" directory -> reduces computation time considerably when
        # different numbers of clusters are requested for the same dendrogram!
return AgglomerativeClustering(n_clusters=num_clusters, linkage=linkages[cluster_method], affinity=affinites[cluster_method], memory='.cache')
def main():
parser = argparse.ArgumentParser(description='clusters documents of a given document-topics-file by their topics')
parser.add_argument('--document-topics', type=argparse.FileType('r'), help='path to input document-topic-file (.npz)', required=True)
parser.add_argument('--cluster-labels', type=argparse.FileType('w'), help='path to output JSON cluster labels file', required=True)
cluster_methods = {
'kmeans': 'kmeans algorithm with kmeans++',
'aggl-ward': 'hierarchical agglomerative ward clustering',
'aggl-avg': 'hierarchical agglomerative average clustering',
'aggl-avg-cos': 'hierarchical agglomerative average clustering with cosine distance',
}
cm = parser.add_argument('--cluster-method', choices=cluster_methods, help='clustering algorithm: ' + str(cluster_methods), required=True)
parser.add_argument('--num-clusters', type=int, help='number of clusters to create', required=True)
args = parser.parse_args()
input_document_topics_path = args.document_topics.name
output_cluster_labels_path = args.cluster_labels.name
cluster_method = args.cluster_method
num_clusters = args.num_clusters
logger.info('running with:\n{}'.format(pformat({'input_document_topics_path':input_document_topics_path, 'output_cluster_labels_path':output_cluster_labels_path, 'cluster_method':cluster_method, 'num_clusters':num_clusters})))
    # load the document-topics matrix
logger.info('loading dense document-topics from {}'.format(input_document_topics_path))
document_topics = load_npz(input_document_topics_path)
logger.info('loaded document-topics-matrix of shape {}'.format(document_topics.shape))
logger.debug('document-topics-matrix \n{}'.format(document_topics))
    # get the model for cluster_method, num_clusters
num_docs, num_topics = document_topics.shape
logger.info('clustering on {} documents, {} topics'.format(num_docs, num_topics))
cluster_model = get_cluster_model(cluster_method, num_clusters)
logger.info('clustering model:\n{}'.format(cluster_model))
    # run the cluster analysis
cluster_labels = cluster_model.fit_predict(document_topics)
logger.info('{} labels'.format(len(cluster_labels)))
logger.debug(cluster_labels)
logger.info('{} different labels'.format(len(np.unique(cluster_labels))))
logger.info('{} noise labels'.format((cluster_labels < 0).sum()))
    # save the labels
logger.info('saving cluster labels')
save_data_to_json(cluster_labels.tolist(), output_cluster_labels_path)
if __name__ == '__main__':
main()
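# --- Illustrative addition, not part of the original script: get_cluster_model used directly on
# --- a toy document-topics matrix. The random data and cluster count are placeholders and only
# --- show the expected shapes; the call assumes the scikit-learn version this script targets.
def _toy_clustering_example():
    doc_topics = np.random.rand(20, 5)  # 20 documents, 5 topics
    model = get_cluster_model('kmeans', num_clusters=3)
    labels = model.fit_predict(doc_topics)
    print(labels.shape)  # (20,)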
|
def _parse_styling_control_sequences(self):
self._parse_capabilities(BLINK="blink", BOLD="bold", DIM="dim",
NORMAL="sgr0", REVERSE="rev",
UNDERLINE="smul") |
// work-common/src/main/java/com/suizhu/common/core/FdfsClient.java
package com.suizhu.common.core;
import java.io.IOException;
import java.io.InputStream;
import java.net.URLEncoder;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.io.FilenameUtils;
import org.apache.commons.io.IOUtils;
import org.springframework.http.HttpHeaders;
import org.springframework.stereotype.Component;
import org.springframework.web.multipart.MultipartFile;
import com.github.tobato.fastdfs.domain.fdfs.MetaData;
import com.github.tobato.fastdfs.domain.fdfs.StorePath;
import com.github.tobato.fastdfs.domain.proto.storage.DownloadByteArray;
import com.github.tobato.fastdfs.domain.upload.FastFile;
import com.github.tobato.fastdfs.service.FastFileStorageClient;
import com.suizhu.common.exception.MyException;
import lombok.AllArgsConstructor;
/**
 * FastDFS client
*
* @author gaochao
* @date Feb 18, 2019
*/
@Component
@AllArgsConstructor
public class FdfsClient {
private final FastFileStorageClient storageClient;
private static final String[] IEBrowserSignals = { "MSIE", "Trident", "Edge" };
private static final Map<String, String> EXT_MAPS = new HashMap<>(19);
private static void initExt() {
// image
EXT_MAPS.put("png", "image/png");
EXT_MAPS.put("gif", "image/gif");
EXT_MAPS.put("bmp", "image/bmp");
EXT_MAPS.put("ico", "image/x-ico");
EXT_MAPS.put("jpeg", "image/jpeg");
EXT_MAPS.put("jpg", "image/jpeg");
        // archive files
EXT_MAPS.put("zip", "application/zip");
EXT_MAPS.put("rar", "application/x-rar");
// doc
EXT_MAPS.put("pdf", "application/pdf");
EXT_MAPS.put("ppt", "application/vnd.ms-powerpoint");
EXT_MAPS.put("xls", "application/vnd.ms-excel");
EXT_MAPS.put("xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
EXT_MAPS.put("pptx", "application/vnd.openxmlformats-officedocument.presentationml.presentation");
        EXT_MAPS.put("doc", "application/msword");
EXT_MAPS.put("docx", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
EXT_MAPS.put("txt", "text/plain");
        // video
EXT_MAPS.put("mp4", "video/mp4");
EXT_MAPS.put("flv", "video/x-flv");
}
/**
     * @dec Upload a file
* @date Feb 18, 2019
* @author gaochao
* @param inputStream
* @param filename
* @param fileSize
* @return
*/
public String upload(InputStream inputStream, String filename, long fileSize) {
String suffix = FilenameUtils.getExtension(filename);
Set<MetaData> metaDataSet = new HashSet<>(0);
metaDataSet.add(new MetaData("filename", filename));
metaDataSet.add(new MetaData("suffix", suffix));
FastFile fastFile = new FastFile(inputStream, fileSize, suffix, metaDataSet);
return storageClient.uploadFile(fastFile).getFullPath();
}
/**
     * @dec Upload a file
* @date Feb 21, 2019
* @author gaochao
* @param file
* @return
*/
public String upload(MultipartFile file) {
String filename = file.getOriginalFilename();
String suffix = FilenameUtils.getExtension(filename);
InputStream inputStream = null;
try {
inputStream = file.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
Set<MetaData> metaDataSet = new HashSet<>(0);
metaDataSet.add(new MetaData("filename", filename));
metaDataSet.add(new MetaData("suffix", suffix));
FastFile fastFile = new FastFile(inputStream, file.getSize(), suffix, metaDataSet);
return storageClient.uploadFile(fastFile).getFullPath();
}
/**
     * @dec Update (replace) an existing file
* @date Feb 21, 2019
* @author gaochao
* @param file
* @param fileId
* @return
*/
public String updateFile(MultipartFile file, String fileId) {
storageClient.deleteFile(fileId);
String filename = file.getOriginalFilename();
String suffix = FilenameUtils.getExtension(filename);
InputStream inputStream = null;
try {
inputStream = file.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
Set<MetaData> metaDataSet = new HashSet<>(0);
metaDataSet.add(new MetaData("filename", filename));
metaDataSet.add(new MetaData("suffix", suffix));
FastFile fastFile = new FastFile(inputStream, file.getSize(), suffix, metaDataSet);
return storageClient.uploadFile(fastFile).getFullPath();
}
/**
     * @dec Download a file
* @date Feb 18, 2019
* @author gaochao
* @param fileId
* @return
*/
public byte[] download(String fileId) {
StorePath storePath = StorePath.parseFromUrl(fileId);
DownloadByteArray downloadByteArray = new DownloadByteArray();
return storageClient.downloadFile(storePath.getGroup(), storePath.getPath(), downloadByteArray);
}
/**
     * @dec Download a file
* @date Feb 18, 2019
* @author gaochao
* @param fileId
* @param response
* @throws Exception
*/
public void download(String fileId, HttpServletResponse response) {
StorePath storePath = StorePath.parseFromUrl(fileId);
Set<MetaData> metadataSet = storageClient.getMetadata(storePath.getGroup(), storePath.getPath());
Map<String, String> data = new HashMap<>(0);
metadataSet.forEach(meta -> {
data.put(meta.getName(), meta.getValue());
});
DownloadByteArray downloadByteArray = new DownloadByteArray();
byte[] bs = storageClient.downloadFile(storePath.getGroup(), storePath.getPath(), downloadByteArray);
initExt();
String filename = data.get("filename");
String suffix = data.get("suffix");
String contentType = EXT_MAPS.get(suffix);
try {
String encoderName = URLEncoder.encode(filename, "UTF-8").replace("+", "%20").replace("%2B", "+");
response.setHeader("Content-Disposition", "attachment;filename=\"" + encoderName + "\"");
response.setContentType(contentType + ";charset=UTF-8");
response.setHeader("Accept-Ranges", "bytes");
IOUtils.write(bs, response.getOutputStream());
} catch (IOException e) {
e.printStackTrace();
}
}
/**
     * @dec Download a file
* @date Feb 19, 2019
* @author gaochao
* @param fileId
* @param request
* @param response
* @throws Exception
*/
public void download(String fileId, HttpServletRequest request, HttpServletResponse response) {
StorePath storePath = StorePath.parseFromUrl(fileId);
Set<MetaData> metadataSet = storageClient.getMetadata(storePath.getGroup(), storePath.getPath());
Map<String, String> data = new HashMap<>(0);
metadataSet.forEach(meta -> {
data.put(meta.getName(), meta.getValue());
});
DownloadByteArray downloadByteArray = new DownloadByteArray();
byte[] bs = storageClient.downloadFile(storePath.getGroup(), storePath.getPath(), downloadByteArray);
initExt();
String filename = data.get("filename");
String suffix = data.get("suffix");
String contentType = EXT_MAPS.get(suffix);
String encoderName;
try {
if (isMSBrowser(request)) {
encoderName = URLEncoder.encode(filename, "UTF-8").replace("+", "%20").replace("%2B", "+");
} else {
encoderName = new String(filename.getBytes("UTF-8"), "ISO-8859-1");
}
response.setHeader("Content-Disposition", "attachment;filename=\"" + encoderName + "\"");
response.setContentType(contentType + ";charset=UTF-8");
response.setHeader("Accept-Ranges", "bytes");
IOUtils.write(bs, response.getOutputStream());
} catch (IOException e) {
e.printStackTrace();
}
}
private static boolean isMSBrowser(HttpServletRequest request) {
String userAgent = request.getHeader(HttpHeaders.USER_AGENT);
for (String signal : IEBrowserSignals) {
if (userAgent.contains(signal))
return true;
}
return false;
}
/**
     * @dec Get file information
* @date Feb 19, 2019
* @author gaochao
* @param fileId
* @return
*/
public Map<String, String> getFileInfo(String fileId) {
StorePath storePath = StorePath.parseFromUrl(fileId);
Set<MetaData> metadataSet = storageClient.getMetadata(storePath.getGroup(), storePath.getPath());
Map<String, String> data = new HashMap<>(0);
metadataSet.forEach(meta -> {
data.put(meta.getName(), meta.getValue());
});
return data;
}
/**
     * @dec Delete a file
* @date Feb 19, 2019
* @author gaochao
* @param fileId
* @return
* @throws Exception
*/
public boolean delete(String fileId) {
try {
storageClient.deleteFile(fileId);
return true;
} catch (Exception e) {
e.printStackTrace();
return false;
}
}
/**
     * @dec Delete files
* @date Feb 19, 2019
* @author gaochao
* @param fileIds
* @return
* @throws Exception
*/
public boolean delete(String[] fileIds) {
boolean b = false;
for (String fileId : fileIds) {
b = delete(fileId);
if (!b) {
                throw new MyException("File deletion failed!");
}
}
return b;
}
/**
     * @dec Delete files
* @date Feb 19, 2019
* @author gaochao
* @param fileIds
* @return
* @throws Exception
*/
public boolean delete(List<String> fileIds) {
boolean b = false;
for (String fileId : fileIds) {
b = delete(fileId);
if (!b) {
                throw new MyException("File deletion failed!");
}
}
return b;
}
}
|
// unmarshalHeadersFromFrame reads serialized headers from the byte slice into
// a map.
func (v *v0ProtocolMarshaler) unmarshalHeadersFromFrame(frame []byte) (map[string]string, error) {
if len(frame) < 4 {
return nil, thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA,
fmt.Errorf("frugal: invalid v0 frame size %d", len(frame)))
}
size := int32(binary.BigEndian.Uint32(frame))
if size > int32(len(frame[4:])) {
return nil, thrift.NewTProtocolExceptionWithType(thrift.INVALID_DATA,
fmt.Errorf("frugal: v0 frame size %d does not match actual size %d", size, len(frame[4:])))
}
return v.readPairs(frame, 4, size+4)
} |
/**
* The method to handle ForbiddenException and return the appropriate
* response to UI.
*
* @param ex
* the exception thrown by the service methods.
* @param request
* the Web request.
 * @return ResponseEntity<Project> returns Project with ServiceStatus indicating the error
*/
@ExceptionHandler(ForbiddenException.class)
public final ResponseEntity<?> handleForbiddenException(ForbiddenException ex, WebRequest request) {
Project project = getProject(ex);
return new ResponseEntity<Project>(project, HttpStatus.FORBIDDEN);
} |
#include "stdafx.h"
#include "MorphTargetBlender3D.h"
#ifndef __MORPH_TARGET_BLENDER2D_H
#define __MORPH_TARGET_BLENDER2D_H
#include "IMorphTargetBlender.h"
#include "my_application.h"
class MorphTargetBlender2D : public IMorphTargetBlender
{
public:
void copyAppSvg(my_application *app_svg_original);
virtual void setMorphTargetValue(char *morphTargetName, float v)const;
private:
my_application *app_svg;
};
#endif
|
package com.trining.design.decorator.train;
import com.google.common.collect.Lists;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.List;
/**
* @author ganjx
* Copyright (c) 2012-2020 All Rights Reserved.
*/
public class NavigationBarDecorator {
Logger logger = LoggerFactory.getLogger(NavigationBarDecorator.class);
private int level;
NavigationBar navigationBar;
public NavigationBarDecorator(int level, NavigationBar navigationBar) {
this.level = level;
this.navigationBar = navigationBar;
}
public void show() {
navigationBar.show();
List<String> menuItemByLevel = getMenuItemByLevel();
logger.info("{}",menuItemByLevel);
}
public List<String> getMenuItemByLevel() {
List<String> menuitem = Lists.newArrayList();
switch (level) {
case 2:
menuitem = Lists.newArrayList("newItem1", "newItem2", "newItem3", "newItem4");
break;
case 3:
menuitem = Lists.newArrayList("newItem1", "newItem2", "newItem3", "newItem4", "newItem5", "newItem6");
break;
}
return menuitem;
}
}
|
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
public class Rebus {
public static void main(String[] args) throws Exception {
Scanner in = new Scanner(new BufferedReader(new InputStreamReader(
System.in)));
String str = in.nextLine();
List<Character> operations = new ArrayList<Character>();
int n = 0;
int plusCnt = 1;
int minusCnt = 0;
for (int i=0; i<str.length(); i+=2) {
char ch = str.charAt(i);
if (ch == '+') {
operations.add(ch);
plusCnt++;
} else if (ch == '-') {
operations.add(ch);
minusCnt++;
} else if (ch == '=') {
n = Integer.parseInt(str.substring(i+2));
break;
}
}
boolean isPossible = false;
String result = "";
int max = plusCnt * n - minusCnt;
int min = plusCnt - minusCnt * n;
if (n > max || n < min) {
// impossible
} else if (plusCnt - minusCnt == n) {
isPossible = true;
result = str.replace("?", "1");
} else if (plusCnt - minusCnt < n) {
int[] plusNum = new int[plusCnt];
int numToAdd = n - plusCnt + minusCnt;
for (int i=0; i<plusCnt; i++) {
plusNum[i] = 1;
if (numToAdd >= n-1) {
plusNum[i] += n-1;
numToAdd -= n-1;
} else if (numToAdd > 0) {
plusNum[i] += numToAdd;
numToAdd = 0;
}
}
if (numToAdd == 0) {
isPossible = true;
result += plusNum[0];
int cur = 1;
for (int i=0; i<operations.size(); i++) {
if (operations.get(i) == '-') {
result += " - 1";
} else {
result += " + " + plusNum[cur];
cur++;
}
}
result += " = " + n;
}
} else {
int[] minusNum = new int[minusCnt];
int numToSub = plusCnt - minusCnt - n;
for (int i=0; i<minusCnt; i++) {
minusNum[i] = 1;
if (numToSub >= n-1) {
minusNum[i] += n-1;
numToSub -= n-1;
} else if (numToSub > 0) {
minusNum[i] += numToSub;
numToSub = 0;
}
}
if (numToSub == 0) {
isPossible = true;
result += 1;
int cur = 0;
for (int i=0; i<operations.size(); i++) {
if (operations.get(i) == '+') {
result += " + 1";
} else {
result += " - " + minusNum[cur];
cur++;
}
}
result += " = " + n;
}
}
if (isPossible) {
System.out.println("Possible");
System.out.println(result);
} else {
System.out.println("Impossible");
}
}
} |
# check whether t can be obtained from s by a consistent one-to-one character substitution
s=input()
t=input()
d=dict()
flag=True
ds=set()
for i in range(len(s)):
if not(t[i] in d):
if s[i] in ds:
flag=False
break
d[t[i]]=s[i]
ds.add(s[i])
else:
if d[t[i]]!=s[i]:
flag=False
break
if flag:
print("Yes")
else:
print("No") |
// Calling APIs from a callback simulates concurrent access from multiple
// threads
ssize_t pwrite_hang_cb(void *ctx, struct filemgr_ops *normal_ops,
fdb_fileops_handle fops_handle, void *buf, size_t count,
cs_off_t offset)
{
struct shared_data *data = (struct shared_data *)ctx;
if (data->test_handle_busy) {
bad_thread(ctx);
}
return normal_ops->pwrite(fops_handle, buf, count, offset);
} |
// B2G/gecko/content/svg/content/src/nsSVGDataParser.h
/* -*- Mode: C++; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/. */
#ifndef __NS_SVGDATAPARSER_H__
#define __NS_SVGDATAPARSER_H__
#include "nsError.h"
#include "nsStringGlue.h"
//----------------------------------------------------------------------
// helper macros
#define ENSURE_MATCHED(exp) { nsresult rv = exp; if (NS_FAILED(rv)) return rv; }
////////////////////////////////////////////////////////////////////////
// nsSVGDataParser: a simple abstract class for parsing values
// for path and transform values.
//
class nsSVGDataParser
{
public:
nsresult Parse(const nsAString &aValue);
protected:
const char* mInputPos;
const char* mTokenPos;
enum { DIGIT, WSP, COMMA, POINT, SIGN, LEFT_PAREN, RIGHT_PAREN, OTHER, END } mTokenType;
char mTokenVal;
// helpers
void GetNextToken();
void RewindTo(const char* aPos);
virtual nsresult Match()=0;
nsresult MatchNumber(float* x);
bool IsTokenNumberStarter();
nsresult MatchCommaWsp();
bool IsTokenCommaWspStarter();
nsresult MatchIntegerConst();
nsresult MatchFloatingPointConst();
nsresult MatchFractConst();
nsresult MatchExponent();
bool IsTokenExponentStarter();
nsresult MatchDigitSeq();
bool IsTokenDigitSeqStarter();
nsresult MatchWsp();
bool IsTokenWspStarter();
nsresult MatchLeftParen();
nsresult MatchRightParen();
};
#endif // __NS_SVGDATAPARSER_H__
|
/**
* Encodes the supplied string for inclusion as a (relative) URI in a Turtle document.
*
* @param s
*/
@Deprecated
public static String encodeURIString(String s) {
s = StringUtil.gsub("\\", "\\u005C", s);
s = StringUtil.gsub("\t", "\\u0009", s);
s = StringUtil.gsub("\n", "\\u000A", s);
s = StringUtil.gsub("\r", "\\u000D", s);
s = StringUtil.gsub("\"", "\\u0022", s);
s = StringUtil.gsub("`", "\\u0060", s);
s = StringUtil.gsub("^", "\\u005E", s);
s = StringUtil.gsub("|", "\\u007C", s);
s = StringUtil.gsub("<", "\\u003C", s);
s = StringUtil.gsub(">", "\\u003E", s);
s = StringUtil.gsub(" ", "\\u0020", s);
return s;
} |
#!/usr/bin/env python3.8
import argparse
import json
import logging
import ssl
from . import HisenseTv
def main():
parser = argparse.ArgumentParser(description="Hisense TV control.")
parser.add_argument("hostname", type=str, help="Hostname or IP for the TV.")
parser.add_argument(
"--authorize",
action="store_true",
help="Authorize this API to access the TV.",
)
parser.add_argument(
"--ifname",
type=str,
help="Name of the network interface to use",
default=""
)
parser.add_argument(
"--get",
action="append",
default=[],
choices=["sources", "volume", "state"],
help="Gets a value from the TV.",
)
parser.add_argument(
"--key",
action="append",
default=[],
choices=[
"power",
"up",
"down",
"left",
"right",
"menu",
"back",
"exit",
"ok",
"volume_up",
"volume_down",
"channel_up",
"channel_down",
"fast_forward",
"rewind",
"stop",
"play",
"pause",
"mute",
"home",
"subtitle",
"netflix",
"youtube",
"amazon",
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"source_0",
"source_1",
"source_2",
"source_3",
"source_4",
"source_5",
"source_6",
"source_7",
],
help="Sends a keypress to the TV.",
)
parser.add_argument(
"--no-ssl",
action="store_true",
help="Do not connect with SSL (required for some models).",
)
parser.add_argument("--certfile", help="Absolute path to the .cer file (required for some models). "
"Works only when --keyfile is also specified. "
"Will be ignored if --no-ssl is specified.")
parser.add_argument("--keyfile", help="Absolute path to the .pkcs8 file (required for some models). "
"Works only when --certfile is also specified. "
"Will be ignored if --no-ssl is specified.")
parser.add_argument(
"-v", "--verbose", action="count", default=0, help="Logging verbosity."
)
args = parser.parse_args()
if args.verbose:
level = logging.DEBUG
else:
level = logging.INFO
root_logger = logging.getLogger()
stream_handler = logging.StreamHandler()
formatter = logging.Formatter(
fmt="[{asctime}] [{levelname:<8}] {message}", style="{"
)
stream_handler.setFormatter(formatter)
root_logger.addHandler(stream_handler)
root_logger.setLevel(level)
logger = logging.getLogger(__name__)
if args.no_ssl:
logger.info("No SSL context specified.")
ssl_context = None
elif args.certfile is not None and args.keyfile is not None:
ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ssl_context.load_cert_chain(certfile=args.certfile, keyfile=args.keyfile)
logger.info("SSL context created with cert file (" + args.certfile + ") and private key (" + args.keyfile + ")")
else:
ssl_context = ssl._create_unverified_context()
logger.info("Unverified SSL context created.")
tv = HisenseTv(
args.hostname, enable_client_logger=args.verbose >= 2, ssl_context=ssl_context, network_interface=args.ifname
)
with tv:
if args.authorize:
tv.start_authorization()
code = input("Please enter the 4-digit code: ")
tv.send_authorization_code(code)
for key in args.key:
func = getattr(tv, f"send_key_{key}")
logger.info(f"sending keypress: {key}")
func()
for getter in args.get:
func = getattr(tv, f"get_{getter}")
output = func()
if isinstance(output, dict) or isinstance(output, list):
output = json.dumps(output, indent=4)
print(output)
if __name__ == "__main__":
main()
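# --- Illustrative addition, not part of the original module: driving HisenseTv programmatically
# --- instead of via the CLI. It uses only the send_key_*/get_* calls that main() builds via
# --- getattr; omitting the optional constructor keywords assumes they have defaults, and the
# --- hostname is a placeholder.
def _example_programmatic_use():
    ssl_context = ssl._create_unverified_context()
    tv = HisenseTv("192.168.1.50", ssl_context=ssl_context)
    with tv:
        print(tv.get_volume())  # same call main() makes for --get volume
        tv.send_key_mute()      # same call main() makes for --key mute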
|
// src/lib/covalent.ts
import { NetworkId } from "src/constants";
import { EnvHelper } from "src/helpers/Environment";
import { CovalentResponse, CovalentTokenBalance, CovalentTransaction } from "./covalent.types";
export class Covalent {
public SUPPORTED_NETWORKS = {
[NetworkId.FANTOM]: true,
[NetworkId.MAINNET]: true,
[NetworkId.POLYGON]: true,
[NetworkId.ARBITRUM]: true,
[NetworkId.AVALANCHE]: true,
};
private _url = "https://api.covalenthq.com/v1";
private _key = Buffer.from(EnvHelper.getCovalentKey() + "::").toString("base64");
private async _fetch<Data = unknown>(path: string) {
const url = this._url + path;
const options = {
headers: {
Authorization: `Basic ${this._key}`,
},
};
const response = await fetch(url, options);
return this._validateResponse<Data>(response);
}
private async _validateResponse<Data = unknown>(response: Response) {
const json: CovalentResponse<Data> = await response.json();
if (!response.ok) throw new Error("Failed to fetch Covalent API.");
else if (json.error) throw new Error(json.error_message);
return json.data.items;
}
public isSupportedNetwork(networkId: NetworkId) {
return this.SUPPORTED_NETWORKS.hasOwnProperty(networkId);
}
public balances = {
/**
* Returns the balance of every token owned by an address.
*/
getAllTokens: async (address: string, networkId: keyof typeof this.SUPPORTED_NETWORKS) => {
const url = `/${networkId}/address/${address}/balances_v2/`;
return this._fetch<CovalentTokenBalance[]>(url);
},
};
public transactions = {
/**
* Returns all successful, failed, and pending transactions for an address
*/
getAllForAddress: async (address: string, networkId: keyof typeof this.SUPPORTED_NETWORKS) => {
const url = `/${networkId}/address/${address}/transactions_v2/?no-logs=true`;
return this._fetch<CovalentTransaction[]>(url);
},
};
}
export const covalent = new Covalent();
|
# utils/planing_utils.py
import numpy as np
from scipy import stats
from utils.prob_utils import sample_simplex
#------------------------------------------------------------------------------------------------------------~
def generalized_argmax_indicator(x):
argmax_size = np.sum(x == x.max())
indc = np.zeros_like(x)
indc[x == x.max()] = 1 / argmax_size
return indc
#------------------------------------------------------------------------------------------------------------~
def generalized_greedy(Q):
"""
Calculates a greedy policy w.r.t Q
if all Q values are distinct then we derive a deterministic policy.
If several Q values are equal, the probability is divided among them
Parameters:
Q [S x A] Q-function
Returns:
pi: [S x A] matrix representing pi(a|s)
"""
if Q.ndim != 2:
raise AssertionError('Invalid input')
S = Q.shape[0]
A = Q.shape[1]
pi = np.zeros((S,A))
for s in range(S):
pi[s] = generalized_argmax_indicator(Q[s])
return pi
#------------------------------------------------------------------------------------------------------------~
def GetUniformPolicy(nS, nA):
"""
Create a Markov stochastic policy which chooses actions randomly uniform from each state
Parameters:
nS: number of states
nA: number of actions
Returns:
pi: [S x A] matrix representing pi(a|s)
"""
pi = np.ones((nS, nA))
for i in range(nS):
pi[i] /= pi[i].sum()
return pi
#-------------------------------------------------------------------------------------------
def draw_policy_at_random(nS, nA):
pi = np.zeros((nS, nA))
for i in range(nS):
pi[i] = sample_simplex(nA)
return pi
#------------------------------------------------------------------------------------------------------------~
def GetPolicyDynamics(P, R, pi):
"""
Calculate the dynamics when following the policy pi
Parameters:
P: [nS x nA x nS] transitions probabilities matrix P_{s,a,s'}=P(s'|s,a)
R: [nS x nA] mean rewards matrix R
pi: [nS x nA] matrix representing pi(a|s)
Returns:
P_pi: [nS x nS] transitions matrix when following the policy pi (P_pi)_{s,s'} P^pi(s'|s)
R_pi: [nS] mean rewards at each state when following the policy pi (R_pi)_{s} = R^pi(s)
"""
if P.ndim != 3 or R.ndim != 2 or pi.ndim != 2:
raise AssertionError('Invalid input')
nS = P.shape[0]
nA = P.shape[1]
P_pi = np.zeros((nS, nS))
R_pi = np.zeros((nS))
for i in range(nS): # current state
for a in range(nA):
for j in range(nS): # next state
# Explanation: P(s'|s) = sum_a pi(a|s)P(s'|s,a)
P_pi[i, j] += pi[i, a] * P[i,a,j]
R_pi[i] += pi[i, a] * R[i,a]
if np.any(np.abs(P_pi.sum(axis=1) - 1) > 1e-5):
raise RuntimeError('Probabilty matrix not normalized!!')
return P_pi, R_pi
#------------------------------------------------------------------------------------------------------------~
def PolicyEvaluation(M, pi, gamma, P_pi=None, R_pi=None):
"""
Calculates the value-function for a given policy pi and a known model
Parameters:
P: [nS x nA x nS] transitions probabilities matrix P_{s,a,s'}=P(s'|s,a)
R: [nS x nA] mean rewards matrix R
pi: [nS x nA] matrix representing pi(a|s)
gamma: Discount factor
Returns:
        V_pi: [nS] The value-function for a fixed policy pi, i.e. the expected discounted return when following pi starting from some state
        Q_pi [nS x nA] The Q-function for a fixed policy pi, i.e. the expected discounted return when following pi starting from some state and action
"""
# (1) Use PolicyDynamics to get P and R, (2) V = (I-gamma*P)^-1 * R
P = M.P
R = M.R
if P.ndim != 3 or R.ndim != 2 or pi.ndim != 2:
raise AssertionError('Invalid input')
nS = P.shape[0]
nA = P.shape[1]
if P_pi is None or R_pi is None:
P_pi, R_pi = GetPolicyDynamics(P, R, pi)
V_pi = np.linalg.solve((np.eye(nS) - gamma * P_pi), R_pi)
# Verify that R_pi + gamma * np.matmul(P_pi, V_pi) == V_pi
Q_pi = np.zeros((nS, nA))
for a in range(nA):
for i in range(nS):
Q_pi[i, a] = R[i, a] + gamma * np.matmul(P[i,a,:], V_pi)
# Verify that V_pi(s) = sum_a pi(a|s) * Q_pi(s,a)
return V_pi, Q_pi
#------------------------------------------------------------------------------------------------------------~
def PolicyIteration(M, gamma):
"""
Finds the optimal policy given a known model using policy-iteration algorithm
Parameters:
P: [nS x nA x nS] transitions probabilities matrix P_{s,a,s'}=P(s'|s,a)
R: [nS x nA] mean rewards matrix R
gamma: Discount factor
Returns
pi_opt [nS x nA]: An optimal policy (assuming given model and gamma)
        V_opt: [nS] The optimal value-function, i.e. the expected discounted return when following the optimal policy starting from some state
        Q_opt [nS x nA] The optimal Q-function, i.e. the expected discounted return when following the optimal policy starting from some state and action
"""
    # The algorithm: repeat until the policy stops changing: (1) run policy-evaluation to get Q_pi (2) new_policy = argmax Q
nS = M.nS
nA = M.nA
Q_pi = np.zeros((nS, nA))
# initial point of the algorithm: uniform policy
pi = np.ones((nS, nA)) / nA
pi_prev = nA - pi # arbitrary different policy than pi
max_iter = nS*nA
iter = 0
while np.any(pi != pi_prev):
pi_prev = pi
_, Q_pi = PolicyEvaluation(M, pi, gamma)
# Policy improvement:
# pi = np.zeros((nS, nA))
# pi[np.arange(nS), np.argmax(Q_pi, axis=1)] = 1 # set 1 for the optimal action w.r.t Q, and 0 for the other actions
pi = generalized_greedy(Q_pi)
if iter > max_iter:
raise RuntimeError('Policy Iteration should have stopped by now!')
iter += 1
pi_opt = pi
Q_opt = Q_pi
V_opt = np.max(Q_opt, axis=1)
return pi_opt, V_opt, Q_opt
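# --- Illustrative addition, not part of the original module: a toy check of PolicyIteration on a
# --- hand-built two-state, two-action MDP. PolicyIteration only reads nS, nA, P and R from its
# --- first argument, so a SimpleNamespace stands in for the project's MDP class.
def _toy_policy_iteration_check():
    from types import SimpleNamespace
    P = np.zeros((2, 2, 2))
    P[:, 0, 0] = 1.0  # action 0 always leads to state 0
    P[:, 1, 1] = 1.0  # action 1 always leads to state 1
    R = np.array([[0.0, 1.0],
                  [0.0, 1.0]])  # action 1 pays reward 1 in either state
    M_toy = SimpleNamespace(nS=2, nA=2, P=P, R=R)
    pi_opt, V_opt, Q_opt = PolicyIteration(M_toy, gamma=0.9)
    print(pi_opt)  # expect all probability on action 1 in both states
    print(V_opt)   # expect roughly 1 / (1 - 0.9) = 10 for both states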
# ------------------------------------------------------------------------------------------------------------~
def PolicyIteration_GivenRP(R, P, gamma, args):
from utils.mdp_utils import MDP
M = MDP(args)
M.R = R
M.P = P
return PolicyIteration(M, gamma)
# ------------------------------------------------------------------------------------------------------------~
def ValueOfGreedyPolicyForV(V, M, gammaEval):
P = M.P
R = M.R
nS = M.nS
nA = M.nA
Q_pi = np.zeros((nS, nA))
for a in range(nA):
for i in range(nS):
Q_pi[i, a] = R[i, a] + gammaEval * np.matmul(P[i, a, :], V)
# pi = np.zeros((nS, nA))
# pi[np.arange(nS), np.argmax(Q_pi, axis=1)] = 1 # set 1 for the optimal action w.r.t Q, and 0 for the other actions
pi = generalized_greedy(Q_pi)
Vgreedy, _ = PolicyEvaluation(M, pi, gammaEval)
return Vgreedy
# ------------------------------------------------------------------------------------------------------------~
def BellmanErr(V, M, pi, gammaEval):
P = M.P
R = M.R
P_pi, R_pi = GetPolicyDynamics(P, R, pi)
errVec = V - (R_pi + gammaEval * np.matmul(P_pi, V))
return errVec
# ------------------------------------------------------------------------------------------------------------~
def evaluate_value_estimation(loss_type, V_pi, V_est, M, pi, gammaEval, gamma_guidance):
if loss_type == 'correction_scaling':
# Correction factor:
V_est = V_est * (1 / (1 - gammaEval)) / (1 / (1 - gamma_guidance))
eval_loss = np.abs(V_pi - V_est).mean()
elif loss_type =='L1_normalized':
V_est_norm = np.sum(np.abs(V_est))
V_pi_norm = np.sum(np.abs(V_pi))
eval_loss = np.abs(V_pi / V_pi_norm - V_est / V_est_norm).mean()
# elif loss_type =='Lmax_normalized':
# V_est_norm = np.max(np.abs(V_est))
# V_pi_norm = np.max(np.abs(V_pi))
# eval_loss = np.abs(V_pi / V_pi_norm - V_est / V_est_norm).mean()
elif loss_type =='Lmax':
eval_loss = np.abs(V_pi - V_est).max()
elif loss_type =='L2_normalized':
V_est_norm = np.linalg.norm(V_est)
V_pi_norm = np.linalg.norm(V_pi)
eval_loss = np.sqrt(np.square(V_pi / V_pi_norm - V_est / V_est_norm).mean())
elif loss_type =='L2':
eval_loss = np.sqrt(np.square(V_pi - V_est).sum())
elif loss_type =='L2_uni_weight':
eval_loss = np.sqrt(np.mean(np.square(V_pi - V_est)))
elif loss_type == 'one_pol_iter_l2_loss':
# Optimal policy for the MDP:
pi_opt, V_opt, Q_opt = PolicyIteration(M, gammaEval)
V_est_g = ValueOfGreedyPolicyForV(V_est, M, gammaEval)
eval_loss = (np.square(V_opt - V_est_g)).mean()
elif loss_type == 'greedy_V_L1':
V_pi_g = ValueOfGreedyPolicyForV(V_pi, M, gammaEval)
V_est_g = ValueOfGreedyPolicyForV(V_est, M, gammaEval)
eval_loss = np.abs(V_pi_g - V_est_g).mean()
elif loss_type == 'greedy_V_L_infty':
V_pi_g = ValueOfGreedyPolicyForV(V_pi, M, gammaEval)
V_est_g = ValueOfGreedyPolicyForV(V_est, M, gammaEval)
eval_loss = np.abs(V_pi_g - V_est_g).max()
elif loss_type =='Bellman_Err':
        # V_pi_err = BellmanErr(V_pi, M, pi, gammaEval) # should be very close to 0
V_est_err = BellmanErr(V_est, M, pi, gammaEval)
eval_loss = np.abs(V_est_err).mean()
elif loss_type == 'Values_SameGamma':
V_pi_gamma, _ = PolicyEvaluation(M, pi, gamma_guidance)
eval_loss = np.abs(V_est - V_pi_gamma).mean()
elif loss_type == 'rankings_kendalltau':
tau, p_value = stats.kendalltau(V_est, V_pi)
eval_loss = -tau
elif loss_type == 'rankings_spearmanr':
rho, pval = stats.spearmanr(V_est, V_pi)
eval_loss = -rho
# elif loss_type == 'greedy_V_L_infty_MRP':
# V_pi_g = ValueOfGreedyPolicyForV(V_pi, M, gammaEval)
# V_est_g = ValueOfGreedyPolicyForV(V_est, M, gammaEval)
# eval_loss = np.abs(V_pi_g - V_est_g).max()
else:
raise AssertionError('unrecognized evaluation_loss_type')
return eval_loss
# ------------------------------------------------------------------------------------------------------------~
def get_stationary_distrb(M, pi):
P_pi, R_pi = GetPolicyDynamics(M.P, M.R, pi)
    evals, evecs = np.linalg.eig(P_pi.T)  # get left eigenvectors of P_pi
evals_abs = np.abs(evals)
sort_inds = np.argsort(evals_abs)
ind_of_largest = sort_inds[-1]
dVec = evecs[:, ind_of_largest].copy()
dVec = np.real_if_close(dVec)
dVec /= dVec.sum()
### DEBUG : verify that p -> dVec ######
# p = np.ones(M.nS) / M.nS
# for i in range(1000):
# p = p @ P_pi
# print(p)
######
return dVec
|
#include <iostream>
#include <string>
#include <vector>
using namespace std;
string l, letter;
int number, i, counter = 1;
bool ident = false;
int main() {
cin >> l;
number = l.size() - 1;
    vector<string> words(number + 2);
words[1] = l;
while(number) {
letter = l[l.size() - 1];
l = l.substr(0, l.size() - 1);
l = letter + l;
for(i = 1; i <= counter; i++) {
if(l == words[i]) {
ident = true;
break;
}
}
if(!ident) {
counter++;
words[counter] = l;
}
ident = false;
number--;
}
cout << counter;
return 0;
}
|
def perm_to_tree_structure(p):
perm = list(p)
n = len(perm)+1
root = Node(perm.pop(0))
root.left = Node(None, parent = root)
root.right = Node(None, parent = root)
for val in perm:
node = root
while not node.is_leaf():
if node.value > val: node = node.left
else: node = node.right
node.value = val
node.left = Node(None, parent = node)
node.right = Node(None, parent = node)
return root |
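
# Editor's note (assumption): the snippet above relies on a Node class that is not shown here. A minimal
# sketch that makes it runnable could look like the class below; the real implementation may differ.
class Node:
    def __init__(self, value, parent=None):
        self.value = value
        self.parent = parent
        self.left = None
        self.right = None

    def is_leaf(self):
        # Placeholder nodes created with value None have no children yet.
        return self.left is None and self.right is None


# Example: perm_to_tree_structure([2, 1, 3]) puts 2 at the root, 1 in the left subtree and 3 in the right
# subtree, with each filled node padded by two empty placeholder leaves.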
// GetValue returns the string stored in the entry
func (e *Entry) GetValue() string {
if e != nil {
return e.Value
}
return ""
} |
package chess.model.pieces;
import chess.model.board.*;
public class Knight implements Piece {
String color;
public Knight(String color) {
this.color = color;
}
@Override
public String getColor() {
return color;
}
@Override
public char getLetter() {
        return 'N'; // 'N' is the conventional algebraic-notation letter for a knight
}
@Override
public boolean validMove(Square currentSquare, Square toSquare, Chessboard theBoard) {
boolean valid = false;
int currentRank = currentSquare.getRank();
char currentFile = currentSquare.getFile();
int toRank = toSquare.getRank();
char toFile = toSquare.getFile();
int rankDifference = currentRank - toRank;
int fileDifference = currentFile - toFile;
/*
* Knight Movement:Knights can have a maximum of 8 possible moves, and move in an 'L' formation,
* with the 'L' consisting of 4 squares (including the square with the knight on it).
* They can either move 2 files left or right and one rank up or down, OR
* 2 ranks up or down and 1 file left or right. Probably the easiest piece to program, if you think about it.
*/
Piece thePiece = toSquare.hasPiece();
if(thePiece == null || thePiece.getColor().equals(color) == false) {
if((fileDifference == 2 || fileDifference == -2) && (rankDifference == 1 || rankDifference == -1))
valid = true;
if((fileDifference == 1 || fileDifference == -1) && (rankDifference == 2 || rankDifference == -2))
valid = true;
}
return valid;
}
}
|
// Validate runs validations on the value and returns an error if the value is
// invalid for any reason.
func (s *Server) Validate() error {
if err := s.ServerAddress.Validate(); err != nil {
return err
}
s.Name = strings.TrimSpace(s.Name)
if nameLength := len(s.Name); nameLength < nameLengthMin || nameLength > nameLengthMax {
return fmt.Errorf("name length must be within range of %d-%d", nameLengthMin, nameLengthMax)
}
return nil
} |
// PublishMessage pushes the provided messeage to an SNS Topic and returns the
// messages' ID.
func (notif *SNS) PublishMessage(topicArn *string, message *string,
isJSON bool) (*string, error) {
var publishInput *sns.PublishInput
if isJSON {
messageStructure := "json"
publishInput = &sns.PublishInput{
TopicArn: topicArn,
Message: message,
MessageStructure: &messageStructure,
}
} else {
publishInput = &sns.PublishInput{
TopicArn: topicArn,
Message: message,
}
}
publishOutput, err := notif.Client.Publish(publishInput)
if err != nil {
return nil, err
}
return publishOutput.MessageId, err
} |
def __get_username(self):
return self.__auth_session.user if self.__auth_session else "Anonymous" |
import zmq
import time
import sys
port = "5557"
context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:%s" % port)
while True:
print("top of server recv loop");
message = socket.recv_string()
print("Received request: %s" % message)
time.sleep (1)
socket.send_string(message.upper())
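
# Editor's sketch (assumption): a minimal REQ client matching the REP server above. It belongs in a separate
# client script/process and connects to the same port on localhost.
def run_client_once():
    client_context = zmq.Context()
    client = client_context.socket(zmq.REQ)
    client.connect("tcp://localhost:5557")
    client.send_string("hello")
    print(client.recv_string())  # the server echoes the message in upper case: "HELLO"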
|
// Copyright 2020 <NAME>
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
// ==================
//
// parsers Module
//
// ==================
use getopts::{Matches, Options};
use std::io;
pub fn parse_options() -> Options {
let mut opts = Options::new();
opts.optflag("l", "light", "Light highlight mode");
opts.optflag("d", "dark", "Dark highlight mode");
opts.optopt("s", "syntax", "Source syntax format", "SYNTAX");
// todo: implement custom theme support
// opts.optopt("t", "theme", "Theme name", "THEME");
opts.optflag("h", "help", "Print this help menu");
opts.optflag("v", "version", "Print version number");
opts
}
pub fn parse_matches(args: &[String], opts: Options) -> Result<Matches, io::Error> {
// parse command line arguments
match opts.parse(&args[1..]) {
Ok(m) => Ok(m),
Err(f) => Err(io::Error::new(io::ErrorKind::Other, f.to_string())),
}
}
#[cfg(test)]
mod tests {
#[test]
fn test_parse_matches_valid_empty() {
let opts = crate::parsers::parse_options();
let argv = ["hl".to_string()];
let matches = crate::parsers::parse_matches(&argv, opts);
assert!(matches.is_ok());
}
#[test]
fn test_parse_matches_valid_one_valid_option() {
let opts = crate::parsers::parse_options();
let argv = ["hl".to_string(), "-l".to_string()];
let matches = crate::parsers::parse_matches(&argv, opts);
assert!(matches.is_ok());
}
#[test]
fn test_parse_matches_valid_two_valid_options() {
let opts = crate::parsers::parse_options();
let argv = [
"hl".to_string(),
"-l".to_string(),
"-s".to_string(),
"txt".to_string(),
];
let matches = crate::parsers::parse_matches(&argv, opts);
assert!(matches.is_ok());
}
#[test]
fn test_parse_matches_invalid_unsupported_option() {
let opts = crate::parsers::parse_options();
let argv = ["hl".to_string(), "--bogus".to_string()];
let matches = crate::parsers::parse_matches(&argv, opts);
assert!(matches.is_err());
}
#[test]
fn test_parse_matches_invalid_missing_argument() {
let opts = crate::parsers::parse_options();
let argv = ["hl".to_string(), "--syntax".to_string()];
let matches = crate::parsers::parse_matches(&argv, opts);
assert!(matches.is_err());
}
}
|
/**
* @author Shu Tadaka
*/
@Slf4j
public final class Cli {
private Cli() {
// do nothing
}
/**
* Entry point of the program.
*
* @param args command line arguments
*/
@SneakyThrows
public static void main(String... args) {
try (ConfigurableApplicationContext applicationContext = newApplicationContext(args)) {
runCommand(applicationContext, args);
}
}
private static ConfigurableApplicationContext newApplicationContext(@NotNull String[] args) {
//
System.setProperty("SERVER_PORT", "0");
//
SpringApplication application = SpringApplicationUtils.newApplication();
application.setWebEnvironment(isWebEnvironmentRequested());
return application.run(args);
}
private static boolean isWebEnvironmentRequested() {
String value = System.getProperty("SPRING_MAIN_WEB_ENVIRONMENT");
        return !Strings.isNullOrEmpty(value) && Boolean.valueOf(value);
}
@SneakyThrows
@SuppressWarnings("rawtypes")
private static void runCommand(@NotNull ApplicationContext applicationContext, @NotNull String[] args) {
//
ArgumentParser parser = ArgumentParsers.newArgumentParser("gprdb-cli");
Subparsers subparsers = parser.addSubparsers().dest("task");
Map<String, CliCommand> taskIdMap = Maps.newHashMap();
for (CliCommand task : applicationContext.getBeansOfType(CliCommand.class).values()) {
Optional<String> taskId = task.register(subparsers);
if (taskId.isPresent()) {
taskIdMap.put(taskId.get(), task);
}
}
//
Namespace arguments;
try {
arguments = parser.parseArgs(args);
} catch (ArgumentParserException ex) {
parser.handleError(ex);
return;
}
//
CliCommand task = taskIdMap.get(arguments.getString("task"));
try {
task.execute(arguments, applicationContext);
} catch (Exception ex) {
log.error("an error happened during execution of a task", ex);
}
}
} |
module PeerTrader.Account.Handler where
import Control.Monad (liftM)
import Data.Maybe (listToMaybe)
import Data.Text (Text)
import Database.Groundhog
import Prosper
import Application
import PeerTrader.Account.Account
import PeerTrader.Types
import PeerTrader.Account.Web
newPeerTraderAccount :: UserLogin -> Text -> AppHandler ()
newPeerTraderAccount n s = runGH $
insert_ $ PeerTraderAccount n False Nothing s False
setEnabledProsperAccount :: UserLogin -> Bool -> AppHandler ()
setEnabledProsperAccount n b = runGH $
update [ProsperEnabledField =. b] $ LoginField ==. n
getCheckTerms :: UserLogin -> AppHandler (Maybe Bool)
getCheckTerms n = runGH $ liftM listToMaybe $
project CheckTermsField (LoginField ==. n)
setCheckTerms :: UserLogin -> Bool -> AppHandler ()
setCheckTerms n b = runGH $
update [CheckTermsField =. b] (LoginField ==. n)
updateAccount :: UserLogin -> AccountData -> AppHandler ()
updateAccount n (AccountData { _prosperUser = ui }) = runGH $ do
uiKey <- insert ui
update [ProsperUserKeyField =. Just uiKey] $ LoginField ==. n
deleteAccount :: UserLogin -> AppHandler ()
deleteAccount n = runGH $
update [ProsperUserKeyField =. emptyUi] $ LoginField ==. n
where
emptyUi :: Maybe (DefaultKey User)
emptyUi = Nothing
accountStateProsperEnabled :: UserLogin -> AppHandler (Maybe (Text, Bool))
accountStateProsperEnabled n = runGH $ liftM listToMaybe $
project (StateField, ProsperEnabledField) (LoginField ==. n)
|
package br.com.orange.jpa.test;
import br.com.orange.jpa.models.Aluno;
import br.com.orange.jpa.models.Avaliacao;
import br.com.orange.jpa.models.Correcao;
import br.com.orange.jpa.models.Resposta;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
public class CreateResposta {
public static void main(String[] args) {
EntityManagerFactory emf = Persistence.createEntityManagerFactory("talents");
EntityManager em = emf.createEntityManager();
Aluno aluno = new Aluno();
aluno.setId(1L);
Avaliacao ava = new Avaliacao();
ava.setId(7L);
Correcao corr = new Correcao();
corr.setId(9L);
Resposta resp1 = new Resposta();
resp1.setSolucao("Mussum Ipsum, cacilds vidis litro abertis. Viva Forevis aptent taciti sociosqu ad litora torquent. Detraxit consequat et quo num tendi nada. Nec orci ornare consequat. Praesent lacinia ultrices consectetur. Sed non ipsum felis. Quem num gosta di mim que vai caçá sua turmis! ");
resp1.setAluno(aluno);
resp1.setAvaliacao(ava);
resp1.setCorrecao(corr);
em.getTransaction().begin();
em.persist(resp1);
em.getTransaction().commit();
em.close();
}
}
|
#include "AZHDistribution.h"
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include "AZHPreAnalysis.h"
#include "TFile.h"
#include "TH2F.h"
#include "TTree.h"
using namespace std;
Distribution_t::Distribution_t(int nbin) : NBINS(nbin), data_cs(nbin, 0), data_mc_count(nbin, 0), cs(0), n_mc(0) {}
Distribution_t::Distribution_t(const char *root_file) {
TFile *f = new TFile(root_file);
TTree *t = (TTree *)f->Get("AZHPreAnalysis");
AZHPreAnalysis *del = new AZHPreAnalysis(t);
int NBINX = 20; // * for mtt;
int NBINY = 20; // * for mztt;
double X_LOW = 350;
double X_HIGH = 1000;
double Y_LOW = 400;
double Y_HIGH = 1000;
TH2F *hist_cs = new TH2F("hist_cs", "", NBINX, X_LOW, X_HIGH, NBINY, Y_LOW, Y_HIGH);
TH2F *hist_mc = new TH2F("hist_mc", "", NBINX, X_LOW, X_HIGH, NBINY, Y_LOW, Y_HIGH);
cs = 0;
n_mc = 0;
NBINS = NBINX * NBINY;
for (int i = 0; i < t->GetEntries(); i++) {
del->GetEntry(i);
bool good = cut(del);
if (!good) continue;
hist_cs->Fill(del->mtt, del->mztt, del->weight * 1000);
hist_mc->Fill(del->mtt, del->mztt);
cs += del->weight * 1000;
n_mc += 1;
}
int nbinall;
for (int i = 0; i < NBINX; i++) {
for (int j = 0; j < NBINY; j++) {
nbinall = hist_cs->GetBin(i + 1, j + 1);
data_cs.push_back(hist_cs->GetBinContent(nbinall));
nbinall = hist_mc->GetBin(i + 1, j + 1);
data_mc_count.push_back(hist_mc->GetBinContent(nbinall));
}
}
delete hist_cs;
delete hist_mc;
delete del;
}
void Distribution_t::set_histogram(char *hist_name, TH2F *h2) {
int NBINX = 20; // * for mtt;
int NBINY = 20; // * for mztt;
double X_LOW = 350;
double X_HIGH = 1000;
double Y_LOW = 400;
double Y_HIGH = 1000;
h2 = new TH2F(hist_name, "", NBINX, X_LOW, X_HIGH, NBINY, Y_LOW, Y_HIGH);
for (int i = 0; i < NBINS; i++) {
cout << "data @" << i << " = " << data_cs[i] << endl;
h2->SetBinContent(i + 1, data_cs[i]);
}
}
Distribution_t::Distribution_t(const char *distribution_file, int null_data) {
ifstream infile(distribution_file);
string line;
getline(infile, line); // first line is the title
getline(infile, line);
stringstream sstr(line);
string tok;
getline(sstr, tok, '\t'); // skip MHA
getline(sstr, tok, '\t'); // skip MHH
getline(sstr, tok, '\t'); // skip WHA
getline(sstr, tok, '\t'); // skip WHH
getline(sstr, tok, '\t'); // skip tb
getline(sstr, tok, '\t'); // skip cba
getline(sstr, tok, '\t');
cs = atof(tok.c_str());
while (getline(sstr, tok, '\t')) {
data_cs.push_back(atof(tok.c_str()));
}
NBINS = data_cs.size();
getline(infile, line);
stringstream sstr_mc(line);
getline(sstr_mc, tok, '\t'); // skip MHA
getline(sstr_mc, tok, '\t'); // skip MHH
getline(sstr_mc, tok, '\t'); // skip WHA
getline(sstr_mc, tok, '\t'); // skip WHH
getline(sstr_mc, tok, '\t'); // skip tb
getline(sstr_mc, tok, '\t'); // skip cba
getline(sstr_mc, tok, '\t');
n_mc = atoi(tok.c_str());
while (getline(sstr_mc, tok, '\t')) {
data_mc_count.push_back(atof(tok.c_str()));
}
}
bool Distribution_t::cut(AZHPreAnalysis *del) {
static const double mz = 91.1776;
if (abs(del->MOSSF_BEST - mz) > 15.0) return false;
if (del->MET < 30.0) return false;
return true;
}
Distribution_t Distribution_t::operator+(const Distribution_t &dist) {
Distribution_t dist_rest(this->NBINS);
for (int i = 0; i < this->NBINS; i++) {
dist_rest.data_cs[i] = this->data_cs[i] + dist.data_cs[i];
dist_rest.data_mc_count[i] = this->data_mc_count[i] + dist.data_mc_count[i];
}
dist_rest.cs = this->cs + dist.cs;
dist_rest.n_mc = this->n_mc + dist.n_mc;
return dist_rest;
}
Distribution_t Distribution_t::operator*(const double scale) {
Distribution_t dist_rest(this->NBINS);
for (int i = 0; i < this->NBINS; i++) {
dist_rest.data_cs[i] = this->data_cs[i] * scale;
dist_rest.data_mc_count[i] = this->data_mc_count[i];
}
dist_rest.cs = this->cs * scale;
dist_rest.n_mc = this->n_mc;
return dist_rest;
}
Distribution_t &Distribution_t::operator+=(const Distribution_t &dist) {
// cout << "current size: " << this->NBINS << " Addon size: " << dist.NBINS << endl;
for (int i = 0; i < this->NBINS; i++) {
data_cs[i] += dist.data_cs[i];
data_mc_count[i] += dist.data_mc_count[i];
}
cs += dist.cs;
n_mc += dist.n_mc;
return *this;
}
|
import React, { useEffect, useMemo, useRef } from "react";
import { useLocalStorage, useEffectOnce } from "react-use";
import { IDisposable } from "monaco-editor";
import { useMonaco, Monaco } from "@monaco-editor/react";
import { CustomTokensProvider } from "../TokenizeLogic/CustomLanguageTokensProvider";
import { FHEditorProps, FullHeightEditor } from "./FullHeightEditor";
import { buildParsedTokensByLine } from "../RequestPostprocessing";
import { CustomLanguageCompletionItemProvider } from "../TokenizeLogic/completionProvider";
import { getTokenizeHoverProvider } from "../TokenizeLogic/TokenizeHoverProvider";
import {
CustomThemeProvider,
SequentialThemeProvider,
ThemeMode,
} from "../TokenizeTheme";
import { GrammarRequestResult } from "../Types/GrammarTypes";
import { ParsedCustomLanguage, TokenInfo } from "../Types/TokenizeTypes";
import "./Menu.css";
/*
Component containing a monaco editor whose custom content is presented
with color highlighting according to the user's grammar.
*/
interface IProps extends FHEditorProps {
value: string;
grammarResponse: GrammarRequestResult | undefined;
customThemeProvider: CustomThemeProvider;
sequentialThemeProvider: SequentialThemeProvider | undefined;
parsedCustomLanguage: ParsedCustomLanguage | undefined;
themeMode: ThemeMode | undefined;
defaultLanguage?: never;
language?: never;
}
export const TokenizeEditor = (props: IProps) => {
const [tokensByLineStringified, setTokensByLineStringified] = useLocalStorage<
string
>("tokensByLineStringified", "");
const hoverDisposable = useRef<IDisposable | undefined>();
const tokenProviderDisposable = useRef<IDisposable | undefined>();
const completionDisposable = useRef<IDisposable | undefined>();
const tokenProvider = useRef<CustomTokensProvider>();
const monaco = useMonaco();
const syntaxHighlightAndUpdateHover = useMemo(
() => (monaco: Monaco, tokensByLine: Map<number, TokenInfo[]>) => {
if (tokensByLine === undefined || monaco === null) {
return;
}
// Update Tokens
if (tokenProvider.current) {
tokenProvider.current.tokensByLine = tokensByLine;
}
// Update Hover
const hoverProvider = getTokenizeHoverProvider(tokensByLine);
const hoverDisposable_ = monaco.languages.registerHoverProvider(
"customLanguage",
hoverProvider
);
if (hoverDisposable.current !== undefined) {
hoverDisposable.current.dispose();
}
hoverDisposable.current = hoverDisposable_;
},
[hoverDisposable]
);
const { parsedCustomLanguage } = props;
useEffect(() => {
if (monaco !== null && parsedCustomLanguage !== undefined) {
const currentTokensByLine = buildParsedTokensByLine(parsedCustomLanguage);
if (currentTokensByLine !== undefined) {
setTokensByLineStringified(
JSON.stringify(Array.from(currentTokensByLine.entries()))
);
} else {
setTokensByLineStringified(undefined);
}
}
}, [
monaco,
parsedCustomLanguage,
tokensByLineStringified,
setTokensByLineStringified,
]);
// Update syntax highlighting and hover with new theme or tokens
const { sequentialThemeProvider, customThemeProvider, themeMode } = props;
useEffect(() => {
if (monaco === null) {
return () => {};
}
const selectedThemeProvider =
themeMode === ThemeMode.Sequential
? sequentialThemeProvider
: customThemeProvider;
const rules =
selectedThemeProvider !== undefined
? selectedThemeProvider.buildRules()
: [];
monaco.editor.defineTheme("customTheme", {
base: "vs",
inherit: false,
rules: rules,
colors: { "editorLineNumber.foreground": "ff0000" },
});
monaco.editor.setTheme("customTheme");
const tokensByLine = deserializeTokensByLine(tokensByLineStringified);
if (tokensByLine.size > 0) {
syntaxHighlightAndUpdateHover(monaco, tokensByLine);
}
}, [
sequentialThemeProvider,
customThemeProvider,
themeMode,
monaco,
tokensByLineStringified,
syntaxHighlightAndUpdateHover,
]);
useEffect(() => {
if (monaco !== null && props.grammarResponse !== undefined) {
completionDisposable.current = monaco.languages.registerCompletionItemProvider(
"customLanguage",
new CustomLanguageCompletionItemProvider(props.grammarResponse.tokens)
);
return () => {
if (completionDisposable.current !== undefined) {
completionDisposable.current.dispose();
}
};
}
}, [monaco, props.grammarResponse]);
useEffectOnce(() => {
if (monaco !== null) {
monaco.languages.register({
id: "customLanguage",
});
const tokensByLine = deserializeTokensByLine(tokensByLineStringified);
tokenProvider.current = new CustomTokensProvider(tokensByLine);
tokenProviderDisposable.current = monaco.languages.setTokensProvider(
"customLanguage",
tokenProvider.current
);
}
return () => {
if (tokenProviderDisposable.current) {
tokenProviderDisposable.current.dispose();
}
if (hoverDisposable.current) {
hoverDisposable.current.dispose();
}
if (completionDisposable.current) {
completionDisposable.current.dispose();
}
};
});
return (
<FullHeightEditor
{...props}
language={"customLanguage"}
defaultLanguage={"customLanguage"}
theme="customTheme"
loading={props.loading ? props.loading : ""}
/>
);
};
const deserializeTokensByLine = (
tokensByLineStringified: string | undefined
) => {
const tokensByLine: Map<number, TokenInfo[]> = tokensByLineStringified
? new Map(JSON.parse(tokensByLineStringified))
: new Map();
return tokensByLine;
};
|
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.
#pragma once
#ifndef HERODATA_H
#define HERODATA_H
#include <memory>
#include <variant>
namespace graphql::learn {
class Droid;
class Human;
using SharedHero = std::variant<std::shared_ptr<Human>, std::shared_ptr<Droid>>;
using WeakHero = std::variant<std::weak_ptr<Human>, std::weak_ptr<Droid>>;
namespace object {
class Character;
} // namespace object
std::shared_ptr<object::Character> make_hero(const SharedHero& hero) noexcept;
std::shared_ptr<object::Character> make_hero(const WeakHero& hero) noexcept;
} // namespace graphql::learn
#endif // HERODATA_H
|
Soil Classification Using GATree
This paper details the application of a genetic programming framework for building a classification decision tree from soil data to classify soil texture. The database contains measurements of soil profile data. We applied GATree to generate the classification decision tree. GATree is a decision tree builder based on Genetic Algorithms (GAs). The idea behind it is rather simple but powerful: instead of using statistical metrics that are biased towards specific trees, it uses a more flexible, global metric of tree quality that tries to optimize both accuracy and size. GATree offers some unique features not found in any other tree inducer, while at the same time it can produce better results for many difficult problems. Experimental results are presented which illustrate its performance in generating the best decision tree for classifying soil texture on the soil data set.
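As a hedged illustration only (the paper does not spell out GATree's exact scoring formula), a GA-based tree inducer of this kind typically scores each candidate tree by trading classification accuracy against tree size. The short Python sketch below captures that idea; the function name and penalty weight are arbitrary assumptions, not values taken from the paper.

def tree_fitness(n_correct, n_total, n_nodes, size_penalty=0.005):
    # Hypothetical GA fitness: reward predictive accuracy, penalize overly large trees.
    accuracy = n_correct / n_total
    return accuracy - size_penalty * n_nodes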
/**
* \file KeyModule.h
*/
#pragma once
#include "KeyMap.h"
#include "RcmLoader.h"
#include "logging/RhoLog.h"
#define INVALID_KEY -2
#define KEY_EMPTYSTRING 0
#define ALL_KEYS -1
#define DISPLAYCLASS L"DISPLAYCLASS"
class CKeyModule;
struct INSTANCE_DATA
{
CKeyMap *pKeyMap;
BOOL bAllKeysDispatch;
rho::apiGenerator::CMethodResult psAllKeysNavigate;
rho::apiGenerator::CMethodResult psTriggerNavigate;
bool bSuppressChar;
int nHomeKey;
int nAccelerateMode;
// int nControlKeys, *pControlKeys; ///< Enabled control keys
bool bAppHasFocus;
};
struct TRIGGER_DATA
{
CKeyModule *pModule;
LPCWSTR psNavigate;
};
/******************************************************************************/
/**
* Main Key plugin module class.
*/
/******************************************************************************/
class CKeyModule
{
public:
CKeyModule ();
~CKeyModule ();
BOOL Initialize();
/**
* Reads the Configuration file to determine which Function and Application
* Keys should not be blocked and blocks all unspecified keys from
* functioning. These are Windows Hot Keys, which send WM_HOTKEY.
* \param HWND of the new application to receive the hot keys.
* \return TRUE if the keys were successfully registered for by Browser
* or FALSE if the registration failed
* \todo ES400 device has a dedicated calendar button which is not blocked
* by this function. Reported key down code is 0x3D
*/
static BOOL BlockHotKeys(HWND hwndToRegisterFor);
bool onPrimaryMessage (MSG *pmsg);
void setHomeKey(int nHomeKey) {pInstanceData->nHomeKey = nHomeKey;}
void setHomeKey(LPCWSTR pkey);
int getHomeKey() {return pInstanceData->nHomeKey;}
int parseKeyValue(rho::String szKeyValue);
void setRemap(int iKeyFrom, int iKeyTo);
void setTriggerCallback(rho::apiGenerator::CMethodResult pCallback);
void setKeyCallback(bool bDispatch, int iKeyValue, rho::apiGenerator::CMethodResult pCallback);
BOOL onRhoAppFocus(bool bActivate);
private:
BOOL onBeforeNavigate (int iInstID);
BOOL onWindowChanged(int iEvent);
INSTANCE_DATA *pInstanceData;
HANDLE hTriggerEvent;
HANDLE hTriggerQuit;
HANDLE hTriggerThread;
HANDLE hTriggerNotification;
static DWORD StaticTriggerProc (LPVOID pparam);
void TriggerProc (void);
BOOL StartTriggerWatch (void);
void StopTriggerWatch (void);
BOOL IsAccelerator (int key);
void ClearAll (INSTANCE_DATA *pdata, BOOL bIsNavigation);
int GetKeyCode (LPCWSTR pkey);
bool ProcessKeyDown (WPARAM wparam, LPARAM lparam, INSTANCE_DATA *pdata);
bool ProcessChar (WPARAM wparam, LPARAM lparam, INSTANCE_DATA *pdata);
bool ProcessKeyUp (WPARAM wparam, LPARAM lparam, INSTANCE_DATA *pdata);
void ClearRegistrationsFromOOPEngine();
void RegisterFromOOPEngine();
CRcmLoader* m_pRcm; ///< EMDK Rcm DLL loaded dynamically
BOOL m_bRcmLoaded; ///< Whether or not the Rcm DLL has been loaded
rho::apiGenerator::CMethodResult blankCallback; // Default callback
bool m_bOOPMessagesRegistered;
};
|
//---------------------------------------------------------------------
// Utility function to synchronize the command queue.
// On error, an exception will be thrown describing the failure.
//---------------------------------------------------------------------
void ZePingPong::synchronize_command_queue(L0Context &context) {
ze_result_t result = ZE_RESULT_SUCCESS;
result = zeCommandQueueSynchronize(context.command_queue, UINT64_MAX);
if (result) {
throw std::runtime_error("zeCommandQueueSynchronize failed: " +
std::to_string(result));
}
} |
/**
* Updates the global recommendations. This method will load content data if necessary, then
* runs the global recommendation recipes.
*
* @param context The context.
*/
public void updateGlobalRecommendations(Context context) {
if (mContentLoader.getNavigatorModel().getRecommendationRecipes() == null ||
mContentLoader.getNavigatorModel().getRecommendationRecipes().size() == 0) {
Log.d(TAG, "No global recommendation recipes to run");
return;
}
Log.d(TAG, "Updating global recommendations.");
if (mContentLoader.isContentLoaded()) {
mSender.setRootContentContainer(mContentLoader.getRootContentContainer());
runGlobalRecommendationRecipes(context);
}
else {
loadDataForRecommendations(context);
}
AnalyticsHelper.trackUpdateGlobalRecommendations();
} |
package aleGrammar;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import org.antlr.runtime.ANTLRFileStream;
import org.antlr.runtime.CommonTokenStream;
import org.antlr.runtime.RecognitionException;
//import aleGrammar.Generators.HHelpers;
import AGEval.AGEvaluator;
import AGEval.AleSimulatorNode;
//import AGEval.Generator;
import AGEval.GrammarEvaluator;
import AGEval.IFace;
import AGEval.InvalidGrammarException;
//import AGEvalSwipl.CppParserGenerator;
public class AleFrontend {
public final ALEParser ast;
public AGEvaluator alegEval;
//stripFloats: if true, 0.0f => 0.0
public AleFrontend (String aleGrammar, boolean runStaticChecks, boolean stripFloats) throws IOException, RecognitionException, InvalidGrammarException {
if (aleGrammar.contains("USCORE"))
throw new InvalidGrammarException("USCORE token is reserved for advanced underscore handling!");
if (runStaticChecks) {
FrontendBeta.checkGrammar(aleGrammar);
System.out.println(" (checks pass)");
}
//ALELexer lexer = new ALELexer(new ANTLRFileStream(aleGrammar.replace("_", "USCORE")));
ALELexer lexer = new ALELexer(new ANTLRFileStream(aleGrammar));
CommonTokenStream tokens = new CommonTokenStream(lexer);
ast = new ALEParser(tokens);
ast.stripFloats = stripFloats;
ast.root(); //force parse...
//treat aliases as an input field on every class; it is the union from all classes
//skip root nodes
for (AGEval.IFace face : ast.interfaces) {
face.addField("refname");
ast.extendedClasses.get(face).extendedVertices.put("refname", new ALEParser.ExtendedVertex(true, "refnameType"));
ArrayList<String> aliases = getAliases(ast);
ast.types.put("refnameType", aliases);
ast.types.get("refnameType").add("undefined");
ast.typeVals.addAll(aliases);
face.addField("display");
ast.extendedClasses.get(face).extendedVertices.put("display", new ALEParser.ExtendedVertex(true, "displaytype"));
ArrayList<String> displays = getClassNames(ast);
ast.types.put("displayType", displays);
ast.types.get("displayType").add("ignore");
ast.types.get("displayType").add("textbox");
ast.typeVals.addAll(displays);
}
}
//partial initialization of ALE for FTL (FIXME: move into attrib-gram-evaluator-swipl or fix ALE)
public void initFtl(boolean runAle) throws RecognitionException, InvalidGrammarException {
AGEval.IFace.isFTL = true;
alegEval = new AGEvaluator(ast.interfaces, ast.classes);
if (runAle) {
GrammarEvaluator gramEval = new GrammarEvaluator(alegEval.interfaces, alegEval.classes, alegEval.stringMappings, true);
gramEval.createGraph(); //init symbols
}
}
public static ArrayList<String> getClassNames (ALEParser g) {
ArrayList<String> res = new ArrayList<String>();
for (AGEval.Class cls : g.classes) res.add(cls.getName().toLowerCase());
return res;
}
public static ArrayList<String> getAliases (ALEParser g) {
ArrayList<String> res = new ArrayList<String>();
for (AGEval.Class c : g.classes) {
for (String a : c.getChildMappings().keySet()) {
if (!res.contains(a)) res.add(a);
}
}
return res;
}
//1: input grammar file url
public static void main(String[] args) throws Exception {
int numArgs = 1;
if (args.length < numArgs) {
System.err.println("Arg 1: ALE grammar file");
System.err.println("(rest: ALE args)");
return;
}
AleFrontend grammar = new AleFrontend(args[0], true, false);
grammar.alegEval.analyzeGrammar();
//test
//String[] newArgs = new String[args.length - numArgs]; //chop off file name..
//for (int i = numArgs; i < args.length; i++) newArgs[i - numArgs] = args[i];
//runSample(grammar.alegEval, grammar.ast, newArgs);
}
public static void runSample (AGEvaluator agEval, ALEParser g, String[] args) throws InvalidGrammarException {
//////
AGEval.Class TopBox = g.classTable.get("TopBox");
AGEval.Class VBox = g.classTable.get("VBox");
AGEval.Class LeafBox = g.classTable.get("LeafBox");
IFace Node = g.interfaceTable.get("Node");
//////
// External Input to Grammar ---------------------------------------------------------------------------
AleSimulatorNode AleTop = new AleSimulatorNode(TopBox);
AleSimulatorNode VBoxOne = new AleSimulatorNode(VBox);
AleSimulatorNode VBoxTwo = new AleSimulatorNode(VBox);
AleSimulatorNode LeafOne = new AleSimulatorNode(LeafBox);
AleSimulatorNode LeafTwo = new AleSimulatorNode(LeafBox);
AleSimulatorNode LeafThree = new AleSimulatorNode (LeafBox);
AleTop.addChild("root", Node, VBoxOne);
VBoxOne.addChild("child1", Node, VBoxTwo);
VBoxOne.addChild("child2", Node, LeafOne);
VBoxTwo.addChild("child1", Node, LeafTwo);
VBoxTwo.addChild("child2", Node, LeafThree);
// Parse Command Args
int pos = 0;
while (pos < args.length){
String currentArg = args[pos];
if (currentArg.startsWith("-")){
String command = currentArg.substring(1).toLowerCase();
if (command.equals("graph")){
int nextPos = pos + 1;
while (nextPos < args.length){
String nextArg = args[nextPos];
if (nextArg.startsWith("-")){
break;
}
else if (nextArg.equals("c")){
agEval.createClassGraph();
}
else if (nextArg.equals("o")){
agEval.createOAGGraph();
}
else {
AGEvaluator.parseError(nextArg);
}
nextPos ++;
}
pos = nextPos;
continue;
} else if (command.equals("ale")){
String baseclass = "AleNode";
String baseTreeClass = "AleTree";
String basedir = null;
boolean usePar = false;
int nextPos = pos + 1;
while (nextPos < args.length){
String nextCmd = args[nextPos];
if (nextCmd.equals("par")){
usePar = true;
}
else if (nextCmd.startsWith("baseclass")){
baseclass = nextCmd.substring(10);
}
else if (nextCmd.startsWith("basetreeclass")){
baseTreeClass = nextCmd.substring(10);
}
else if (nextCmd.startsWith("dir")){
String testDirStr = nextCmd.substring(4);
File testDir = new File(testDirStr);
if (testDir.exists()){
basedir = testDirStr;
}
else {
System.err.println("Warning: Ignorning non-existent directory " + testDirStr);
}
}
else if (nextCmd.startsWith("-")){
break;
}
else {
AGEvaluator.parseError(nextCmd);
}
nextPos ++;
}
agEval.generateAle(baseclass, baseTreeClass, basedir, usePar);
pos = nextPos;
}
else {
AGEvaluator.parseError(currentArg);
}
}
else {
AGEvaluator.parseError(currentArg);
}
}
//////
}
}
|
#include <cstdio>
#include <algorithm>
#include <set>
using namespace std;
int reach;
set< int > nums;
int f( int A ) {
A++;
while( A % 10 == 0 ) A /= 10;
return A;
}
int main( void ) {
int A;
scanf("%d", &A );
int newA = A;
nums.insert( newA );
for( int i = 0; i < 100000; i++ ) {
newA = f( newA );
nums.insert( newA );
}
printf("%d\n", nums.size() );
return 0;
}
|
declare module 'unified';
declare module 'remark-parse';
declare module 'remark-rehype';
declare module 'remark-html';
declare module 'remark-stringify';
declare module 'rehype-stringify';
|
WASHINGTON — Senate Majority Leader Mitch McConnell strongly condemned any foreign involvement in the election Monday and supported calls for a bipartisan investigation, but wouldn't say whether he believed Russia tried to swing the election for Donald Trump.
"The Russians are not our friends," said McConnell. “I think we ought to approach all of these issues on the assumption that the Russians do not wish us well.”
McConnell said he has confidence in the Senate Select Committee on Intelligence to thoroughly investigate the Russian hacking of DNC and RNC emails. This would be on top of investigations by the Obama administration and the Senate Armed Services Committee.
"This simply cannot be a partisan issue," said McConnell.
McConnell’s statements are in direct contradiction to the views of President-elect Trump, who has brushed off the reports of Russian hacking as partisan sour grapes.
“Really clearly what this is is an attempt to try to delegitimize President-elect Trump's win,” said Trump communications director Jason Miller on a call with reporters Monday.
“First after the election it was the recount nonsense, then it was the discussion of the popular vote, now it's anonymous off-the-record sources with conflicting information trying to raise other issues.”
Trump also took a direct shot at the CIA Friday, saying “these are the same people that said Saddam Hussein had weapons of mass destruction.”
McConnell, however said he has the highest confidence in the intelligence community and the CIA in particular. McConnell highlighted that Minority Leader Chuck Schumer and other Democrats would be part of the Senate investigations into Russian hacking.
Schumer put out his own statement praising McConnell's support of bipartisan investigation.
"This issue should not and must not turn into a political football," said Schumer.
House Speaker Paul Ryan released a statement that denounced any possible Russian cyberattacks as "entirely unacceptable" but notably did not call for any investigation by the House.
Instead, Ryan warned that "exploiting the work of our intelligence community for partisan purposes" would be a threat to national security.
The Obama administration is urging Congress to investigate the cyberattacks. White House press secretary Josh Earnest said Monday that members need to "spare us the hand-wringing" and move forward quickly with their investigations.
McConnell said he has not had discussions with Ryan about the matter. When asked if he personally believes that Russian actors tried to influence the election for Trump, McConnell would not say.
“Prior to the election the director of national intelligence released a statement saying that the Russian government directed the recent compromises of emails," he said.
"That is what the intelligence community believes can be said in unclassified remarks without risking sources and methods. Anything else is irresponsible, likely illegal and potentially for partisan political gain.” |
import numpy as np


class TimeBasedMovingWindowFilter:
"""A moving-window filter for smoothing the signals within certain time interval."""
def __init__(
self,
filter_window: float = 0.1,
):
"""Initializes the class.
Args:
filter_window: The filtering window (in time) used to smooth the input
signal.
"""
self._filter_window = filter_window
self.reset()
def reset(self):
self._timestamp_buffer = []
self._value_buffer = []
def calculate_average(self, new_value, timestamp):
"""Compute the filtered signals based on the time-based moving window."""
self._timestamp_buffer.append(timestamp)
self._value_buffer.append(new_value)
while len(self._value_buffer) > 1:
            if self._timestamp_buffer[0] < timestamp - self._filter_window:
self._timestamp_buffer.pop(0)
self._value_buffer.pop(0)
else:
break
return np.mean(self._value_buffer, axis=0) |
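
# Editor's sketch (assumption): example usage with hand-made timestamps in seconds. Only samples newer than
# filter_window are averaged, so older values drop out of the smoothed result as time advances.
if __name__ == "__main__":
    f = TimeBasedMovingWindowFilter(filter_window=0.1)
    for value, t in [(1.0, 0.00), (2.0, 0.05), (100.0, 0.20), (2.0, 0.25)]:
        smoothed = f.calculate_average(value, t)
        print(t, smoothed)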
/**
 * Project name: quickstart-code
 * File name: ReversedMyLinkedList.java
 * Version info:
 * Date: January 22, 2018
 * Copyright yangzl Corporation 2018
 * All rights reserved *
 */
package org.quickstart.code.example;
import org.quickstart.code.example.model.MyLinkedList;
import org.quickstart.code.example.model.Node;
/**
* ReversedMyLinkedList
*
* @author:<EMAIL>
 * @ January 22, 2018, 12:50:13 PM
* @since 1.0
*/
public class ReversedMyLinkedList {
    /* The logic is as follows: suppose the original list has three nodes l, m, n, linked as l->m->n.
       When the traversal reaches node m, we have current=m and newHead=l. First the next pointer saves node n,
       then current.next is pointed at newHead, so m's next pointer becomes newHead, i.e. l. Next, newHead is
       set to current, so newHead becomes m. Finally, current is set to the previously saved next pointer, i.e. n.
       On the next loop iteration, the statement current.next = newHead sets node n's next pointer to m.
       Through this whole process, the three nodes end up linked as n->m->l. */
public static Node reversed_linkedlist() {
MyLinkedList list = new MyLinkedList();
Node head = list.init();
if (head == null || head.next == null) {
return head;
}
        // Use three node pointers
Node current = head;
Node newHead = null;
Node next = null;
while (current != null) {
            // first save the current node's next node
            next = current.next;
            current.next = newHead; // break the original link: point current's next at the head of the new list
            newHead = current; // make current the new head
            current = next; // advance current to the previously saved next node
}
return newHead;
}
public static void main(String[] args) {
MyLinkedList list = new MyLinkedList();
Node head = reversed_linkedlist();
System.out.println("After reversed, the list is: ");
list.print(head);
}
}
|
THE EFFECT OF PILE GROUP LOCATION IN SLOPE ON ITS SEISMIC BEHAVIOR
In this research, shaking table tests were conducted to investigate the effects of the location of a pile group in a slope on its seismic behavior. A cap supported by a pile group was set in a dry sand slope and subjected to sinusoidal base motion with constant frequency. The soil used in this study was Firouzkouh sand 161 with 60% relative density. The tests were scaled at 1/10th, and the piles were made from aluminum with 95 cm length. Discussions are focused on the behavior of the pile group in different situations.
//WE only have two frames so work with that.....
char FFMPEG::ReleaseFrame(){
if (mStop){
return -1;
}
return 1;
} |
// ErrWriteConcernFailed -- code 64, write concern failed
func ErrWriteConcernFailed() Error {
return Error{
Message: "write concern failed",
Code: 64,
}
} |
import matplotlib.pyplot as plt
import seaborn as sns


def plot_two_conditions(merged_df, condition_1, condition_2, xlabel, ylabel):
fig, axes = plt.subplots(ncols=2, nrows=1, figsize=(10, 4))
cmap = sns.cubehelix_palette(start=2.8, rot=0.1, as_cmap=True)
fig_abs = sns.scatterplot(
data=merged_df,
x=f"Test statistic (Real)_grp_{condition_1}",
y=f"Test statistic (Real)_grp_{condition_2}",
hue="max Z score",
size="max Z score",
linewidth=0,
alpha=0.7,
ax=axes[0],
palette=cmap,
)
fig_abs.plot([0, 4], [0, 4], "--k")
fig_raw = sns.scatterplot(
data=merged_df,
x=f"Test statistic (Real)_grp_{condition_1}_raw",
y=f"Test statistic (Real)_grp_{condition_2}_raw",
hue="max Z score",
size="max Z score",
linewidth=0,
alpha=0.7,
ax=axes[1],
palette=cmap,
)
fig_raw.plot([-4, 4], [-4, 4], "--k")
fig.suptitle(f"({xlabel}) vs ({ylabel})", fontsize=16)
fig.text(0.5, 0.04, xlabel, ha="center", va="center")
fig.text(0.06, 0.5, ylabel, ha="center", va="center", rotation="vertical")
axes[0].set_title("using abs(log$_2$ Fold Change)")
axes[1].set_title("using log$_2$ Fold Change")
axes[0].set_xlabel("")
axes[1].set_xlabel("")
axes[0].set_ylabel("")
axes[1].set_ylabel("")
print(fig) |
package app.packed.state.sandbox;
// I think we have 2 possible approaches...
// We always know all the states... It is a given that we are finite
// But sometimes we know all the possible transitions...
// Sometimes we do not
// Sidechannels/reasons/...
public interface StateModel {
}
/// How do we model restarting...
// It definitely has something to do with state
// But we cannot just layer a Restarting state on top
// Condition ... [Restarting, Resuming, Pausing, Upgrading].. Failing???, Completed, idk
// They are typically boolean...
//// If they were strings... It would be possible to use them from annotations
//// We want some kind of type safety???? Or we do not want random conditions...
// But e.g. Restartable as a condition
// @OnStart + Restarting
// @OnStop + Restarting
// @OnStart + Restarting + Upgrading
// @OnStop + Restarting + Upgrading
// Upgrading and Restarting
// We would like to support
def _batch_mv(bmat, bvec):
    # Batched matrix-vector product: (..., n, m) @ (..., m) -> (..., n).
    return torch.matmul(bmat, bvec.unsqueeze(-1)).squeeze(-1)
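
# Editor's sketch (assumption): a quick shape check for the helper above. bmat has shape (batch, n, m) and
# bvec has shape (batch, m); the result has shape (batch, n), matching torch.einsum("bnm,bm->bn", bmat, bvec).
if __name__ == "__main__":
    import torch
    bmat = torch.randn(4, 3, 2)
    bvec = torch.randn(4, 2)
    out = _batch_mv(bmat, bvec)
    assert out.shape == (4, 3)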
package com.pnuema.android.foursite.main_stuff;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.os.Bundle;
import android.widget.Button;
import com.pnuema.android.foursite.R;
public class youtube2 extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_youtube2);
Button button4 = findViewById(R.id.button40);
button4.setOnClickListener(v -> openyoutube());
Button button5 = findViewById(R.id.button50);
button5.setOnClickListener(v -> openyoutube3());
}
public void openyoutube() {
Intent intent = new Intent(this, youtube.class);
startActivity(intent);
}
public void openyoutube3() {
Intent intent = new Intent(this, youtube3.class);
startActivity(intent);
}
} |
Strength performance of outer layer oil palm trunk wood treated with microwave energy
This study evaluated the flexural properties of microwave-treated oil palm trunk (OPT) wood. The outer portion at the basal position of the tree was chosen because it is the most stable part of the oil palm trunk. Six treatments combining different power inputs and exposure times were applied to the OPT wood blocks, and one control sample was left untreated. The flexural properties of the microwave-treated OPT wood, namely the flexural modulus (MOE) and flexural strength (MOR), were analyzed. Microwave treatment number six gave the highest flexural modulus and surpassed that of the untreated OPT.
def GenericTypeParameter(self, parameterIndex):
pass |
    def invert_selection(self, selection):
        # Return the indices of all atoms in the parent molecule that are not in the given selection.
        all_atoms = numpy.arange(0,len(self.__parent_molecule.get_atom_information()), 1, dtype=int)
remaining_indicies = numpy.delete(all_atoms, selection)
return remaining_indicies |
BERKELEY — Hoping to avoid violent clashes at a planned white supremacist rally this weekend, Berkeley’s mayor on Tuesday issued a plea for counter demonstrations to be held elsewhere in the city.
His message was backed by several prominent East Bay politicians at a press conference on the steps of Berkeley City Hall, but some demonstrators responded that they still plan to turn out for a united message against hate.
“Today and always, we stand together as a community against bigotry, racism, and intolerance – and we are stronger for it,” said Berkeley Mayor Jesse Arreguin. “As mayor, I am working closely with officials at every level of government—including various law enforcement agencies—to keep peace on Sunday.”
The mayor said the city would print up and distribute “Berkeley Stands United Against Hate” posters for residents and businesses to post.
Arreguin was joined by Congresswoman Barbara Lee, D-Oakland, California Assembly members Rob Bonta and Tony Thurmond, state Sen. Nancy Skinner, Alameda County Supervisor Keith Carson and Alameda County District Attorney Nancy O’Malley.
“President Trump has emboldened white nationalists but we must hold steadfast to our progressive values as a community, regardless of the challenges,” Lee said. “We cannot allow anyone, certainly not the President, to roll back the clock on progress. We must stand united against hate.”
Officials have continued to urge residents and visitors to stay away from a 1 p.m., Sunday, “No to Marxism” rally at the city’s Martin Luther King, Jr. Civic Center Park, noting that “no permit has been sought, nor has one been granted.” The mayor called the event a white supremacist rally.
But organizer Amber Cummings said Sunday’s demonstration is an anti-Marxist rally, and doesn’t want white nationalists to attend. She said she organized the event before the events in Charlottesville, Virginia on Aug. 12. She called Arreguin’s characterization of the rally as a white supremacy event as “an outright lie,” in an effort to incite violence against the people who will participate.
During a question and answer session after the leaders spoke Tuesday, some residents asked Berkeley leaders for an alternative response and to prepare for large numbers of peaceful demonstrators.
Although the city said there is no explicit connection between Sunday’s gathering and this month’s “Unite The Right” white-nationalist rally in Charlottesville that led to the deaths of a counter protester and two state troopers, it reminded others of previous rallies that have devolved into violence.
Arreguin encouraged residents to exercise their right to speak out against hate through the counter demonstration, away from where the Sunday rally is expected to take place. Thousands are expected in the city this weekend.
“We don’t want nonviolent protesters to be in a situation where they can be in a middle of a fight,” he said.
Counter protests are planned, including one led by Unite for Freedom from Right Wing Violence in the Bay Area, who have called for a “Bay Area Rally Against Hate” from 10:30 a.m. to 12:30 p.m. Aug. 27 on UC Berkeley’s Crescent Lawn at Oxford and Center streets.
“We’re residents of the Bay Area — people of color, working class people, immigrants, queer, lesbian, gay, bi, and trans people, liberals, leftists, and others,” the group wrote in a news release promoting the event. “We think it’s time to get together, to celebrate our differences in solidarity, and peacefully speak out against the hateful currents in American society.”
Meanwhile, the Berkeley-based group “Network of Spiritual Progressives,” along with Tikkun magazine and Beyt Tikkun Synagogue-Without-Walls, put out a call on Facebook to “Stand for Love and Justice while saying ‘No’ to the Nazis” in Civic Center Park from 3 to 4:30 p.m. Saturday, the day before the announced right-wing rally.
Arreguin said the city does believe, “It’s important for people to express their outrage, their concern for what happened in Charlottesville… our opposition for white supremacy, bigotry, hatred and racism.”
Skinner said the purpose of such rallies is to incite violence. Staying a safe distance away at a counter-protest means that rally organizers will be isolated.
“They only get attention when we give it them to them. ‘When they go low, we go high'” she said, quoting former First Lady Michelle Obama.
Despite encouragement to stay away from Sunday’s demonstration, some such as Reiko Redmonde, of the “Refuse Fascism” group, said the message needs to be united. Others from her group held a poster of a cartoon President Trump wearing a KKK-type white hood with the words “This is what Fascism looks like.”
“I’m concerned (Trump has) given the go-ahead and the thumbs up to these white supremacists and Nazis through his various statements and Tweets,” Redmonde said.
She said the remedy is to stand together.
“Yes, maybe people are risking their safety, but shouldn’t people have risked their safety early on in the Nazi regime when Hitler came to power? Shouldn’t they have stood out and not let their neighbors be taken away?” Redmonde said.
Another group, the Coalition to Defend Affirmative Action, Integration, and Immigrant Rights and Fight for Equality By Any Means Necessary — better known as By Any Means Necessary, or by its initials, BAMN — made it clear, in a Facebook post titled “Charlottesville: Never Again!” that it does not intend to heed officials’ requests to stay away from downtown Berkeley on Aug. 27.
In the post, the group announced an organizing meeting for Wednesday on “Shutting Down and Defeating the Fascists.” For Sunday, the group called on its adherents to gather at 10 a.m. at Martin Luther King Jr. Civic Center Park, adding, “Fascists officially gather 1pm – 4pm.”
Bay City News contributed to this report. |