/*
 * Restore the previously saved image URI (stored under the "file_uri" key).
 */
@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
imageUri = savedInstanceState.getParcelable("file_uri");
}
Gulliver and the Houyhnhnm Good Life
This essay analyzes Gulliver's relationship to the Houyhnhnms in Book 4 of Gulliver's Travels in light of Lauren Berlant's challenge “to reimagine state/society relations” so that “consumer forms of collectivity [are] not the main way” we “secure or fantasize securing everyday happiness.” Vital to that process, Berlant writes, is the imperative for readers to critically examine their own investments in the fantasies attached to the socio-political and economic conditions they protest.
/**
 * A pull-to-refresh header that plays a circle animation.
 * Created by Gordn on 2017/6/22.
 */
public class CircleHeader extends FrameLayout implements KRefreshHeader {
AnimationView mHeader;
private ValueAnimator mUpTopAnimator;
public CircleHeader(@NonNull Context context) {
super(context);
mHeader = new AnimationView(context);
LayoutParams params = new LayoutParams(LayoutParams.MATCH_PARENT, 0);
params.gravity = Gravity.TOP;
mHeader.setLayoutParams(params);
addView(mHeader);
mHeader.setAniBackColor(0xff8b90af);
mHeader.setAniForeColor(0xffffffff);
mHeader.setRadius(7);
mHeader.setOnViewAniDone(new AnimationView.OnViewAniDone() {
@Override
public void viewAniDone() {
mUpTopAnimator.start();
}
});
mUpTopAnimator = ValueAnimator.ofFloat(refreshHeight(), 0);
mUpTopAnimator.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
@Override
public void onAnimationUpdate(ValueAnimator animation) {
float val = (float) animation.getAnimatedValue();
mHeader.getLayoutParams().height = (int) val;
mHeader.requestLayout();
}
});
mUpTopAnimator.setDuration(200);
}
@Override
public long succeedRetention() {
return 1000;
}
@Override
public long failingRetention() {
return 0;
}
@Override
public int refreshHeight() {
return DensityUtil.dip2px(100);
}
@Override
public int maxOffsetHeight() {
return DensityUtil.dip2px(150);
}
@Override
public void onReset(@NotNull KRefreshLayout refreshLayout) {
}
@Override
public void onPrepare(@NotNull KRefreshLayout refreshLayout) {
}
@Override
public void onRefresh(@NotNull KRefreshLayout refreshLayout) {
mHeader.releaseDrag();
}
@Override
public void onComplete(@NotNull KRefreshLayout refreshLayout, boolean isSuccess) {
mHeader.setRefreshing(false);
}
@Override
public void onScroll(@NotNull KRefreshLayout refreshLayout, int distance, float percent, boolean refreshing) {
if (!refreshing){
mHeader.getLayoutParams().height = distance;
mHeader.requestLayout();
}
}
}
/*****************************************************************************
* Copyright (C) NanoContainer Organization. All rights reserved. *
* ------------------------------------------------------------------------- *
* The software in this package is published under the terms of the BSD *
* style license a copy of which has been included with this distribution in *
* the LICENSE.txt file. *
* *
* Original code by <NAME> *
*****************************************************************************/
package org.nanocontainer.remoting.jmx;
import javax.management.MBeanInfo;
import javax.management.NotCompliantMBeanException;
import javax.management.StandardMBean;
/**
* StandardMBean with a provided MBeanInfo.
* @author Jö<NAME>
* @since 1.0
*/
public final class StandardNanoMBean extends StandardMBean {
private final MBeanInfo mBeanInfo;
/**
* Construct a StandardNanoMBean. The only difference to a {@link StandardMBean} of the JSR 3 is the user provided
* {@link MBeanInfo}.
* @param implementation
* @param management
* @param mBeanInfo
* @throws NotCompliantMBeanException
*/
public StandardNanoMBean(final Object implementation, final Class management, final MBeanInfo mBeanInfo)
throws NotCompliantMBeanException {
super(implementation, management);
this.mBeanInfo = mBeanInfo;
}
/**
* Return the provided {@link MBeanInfo}.
* @see javax.management.StandardMBean#getMBeanInfo()
*/
public MBeanInfo getMBeanInfo() {
return mBeanInfo;
}
}
// DeleteFrom3scale removes the service backing this InternalAPI from 3scale.
func (api InternalAPI) DeleteFrom3scale(c *portaClient.ThreeScaleClient) error {
services, err := c.ListServices()
if err != nil {
return err
}
for _, service := range services.Services {
if service.SystemName == api.Name {
return c.DeleteService(service.ID)
}
}
return nil
}
Effect of doping control on weightlifting performance.
The results of participants in the Junior World Championships of weightlifting were compared between the years 1978 and 1981, and also between 1981 and 1984. In contrast to the significant improvement in athletic performance that occurred between 1978 and 1981, no significant change was seen between 1981 and 1984. It is suggested that this lack of improvement may be attributed to the more effective doping control imposed in the last few years.
When L.A. Weekly recently compiled our list of the 20 best songs ever written about Los Angeles, we discovered that the songs our city has inspired are almost as misunderstood as the city itself. Take, for example, the fact that "I Love L.A.," which is blasted over the speakers after every win at Dodger Stadium, is actually an ironic dig about how much L.A. sucks. (Blonde bimbos! Homeless people!)
But perhaps the most hotly debated song on our list is "Under the Bridge," which, ever since its release in 1991, has prompted countless investigations as to the location of the infamous bridge in question.
That mystery was purportedly solved in 2012, when Vulture writer Mark Haskell Smith (who, by the way, did not get the irony in Randy Newman's "I Love L.A.") claimed he'd found the bridge where RHCP singer Anthony Kiedis nearly gave his life away shooting heroin: in MacArthur Park.
But after doing our own research and consulting with countless drug and gang experts in Los Angeles, we found enough evidence not only to prove Smith wrong — but to definitively state where that bridge is.
Yes, we said it. We know where "the bridge downtown" is — and it's not where you think.
Perhaps we were a little inspired by Sarah Koenig's exhaustive reporting leading up to the finale of Serial, which has been called the most popular podcast of all time. We decided to take that Serial structure and approach this story from every possible angle, taking into consideration any and all evidence, research, anecdotes and sources that we could find.
Along the way, we learned a lot not just about gang and drug culture, but about how much the city we live in (our only friend) has changed over the last three decades.
Part One: MacArthur Park
We were immediately skeptical of Smith’s theory that the bridge from "Under the Bridge" was a pedestrian tunnel in MacArthur Park for several reasons, not least of which is geography: MacArthur Park isn't exactly downtown, as the song’s lyrics specify, but in neighboring Westlake.
But what truly makes us question Smith's claim that "the bridge downtown" is actually a pedestrian tunnel is the fact that Kiedis himself refutes it. In both his 2005 autobiography Scar Tissue and the 1991 documentary Funky Monks, Kiedis describes the scene as a "freeway bridge." MacArthur Park is not immediately near a freeway and there's no way a foot tunnel underneath Wilshire Boulevard can be misconstrued as a freeway bridge, no matter how high Kiedis was at the time.
So if that bridge isn't in MacArthur Park after all, then where the hell is it?
L.A. drug and gang experts doubt the bridge’s very existence. "Sounds a little mythical to me," says Alex Alonso, gang historian and creator of the website Streetgangs.com.
But how could we argue with Kiedis’ vivid description of the bridge in Scar Tissue, which detailed its narrow passageways and dirty mattresses?
"When we weren’t shooting up in his drug-infested apartment, Mario knew this safety zone beneath a freeway bridge, some weird hideaway that the LAPD never patrolled. He explained to me that no non-Mexican gang members were allowed there, so in order for me to get in, we had to lie and tell them that I was engaged to his sister. We walked up to the big guys guarding the gate, told them Mario was my future brother in law, and they let us in. Sheltered beneath that overpass right in the middle of the city, I spent countless days lying on a bunch of dirty mattresses and shooting up with a bunch of killers."
So there you have it: The bridge downtown is a freeway overpass, exactly the kind of structure you won't find in MacArthur Park. Also, an L.A. Times article from 1989 titled "MacArthur Park: Police Try to Retake it From Drug Dealers" suggests that the park wasn't some weird hideaway that the LAPD never patrolled — quite the contrary.
We know that Kiedis visited the bridge in 1986, because it was right after the Red Hot Chili Peppers' Freaky Styley tour, around the time Kiedis' drug addiction got so bad that he was briefly kicked out of the band. That same year, the Red Hot Chili Peppers were named Band of the Year at the L.A. Weekly Music Awards.
"For our circle, that was similar to getting nominated for an Oscar," Kiedis wrote in Scar Tissue. "But the awards show happened to be at the Variety Arts Theatre, a classic old venue right smack downtown. Coincidentally, I was in the same neighborhood that night, trying to hustle more drugs for my money than anyone wanted to give me."
This timeline presents several problems if the bridge were to be in MacArthur Park. For one, "The whole MacArthur Park scene wasn't that strong in '86 in terms of gangs and drugs," Alonso says, citing the fact that neighborhood gangs like the Mexican Mafia-affiliated MS-13 had only recently formed in the early '80s and were still gaining strength.
Secondly, heroin wasn't prevalent in the park at that time, according to Alonso. Instead, local dealers were peddling huge quantities of crack and meth. The 1989 Times article supports that claim, citing rampant "roca," or crack rock sales, with no mention of heroin.
So if there was no heroin in 1986 and no freeway bridge ever, then why did Vulture writer Smith think the downtown bridge was actually in MacArthur Park?
As evidence, he cited this passage from Scar Tissue, in which Kiedis and friend Kim Jones, a former L.A. Weekly writer, "owed too much money to the drug dealers around Hollywood, so [they] started walking from her house, which was not far from downtown L.A. [it was in Echo Park], to known drug neighborhoods, mainly Sixth and Union."
But the passage about 6th and Union never directly correlates to the scene where Kiedis kicks it with a bunch of killers under a bridge. And while Smith uses this logic to map routes between downtown and that intersection, it should be noted that MacArthur Park — about a half mile west of 6th and Union — is actually in the opposite direction.
Clearly, MacArthur Park was not the home of Kiedis’ bridge. But what other clues might point to its real location?
/* Copyright (C) 2019 SCARV project <<EMAIL>>
*
* Use of this source code is restricted per the MIT license, a copy of which
* can be found at https://opensource.org/licenses/MIT (or should be included
* as LICENSE.txt within the associated archive or repository).
*/
#include "test_mpn.h"
// ============================================================================
void test_mpn_dump( char* id, limb_t* x, int l_x ) {
printf( "%s = int( '", id ); test_dump_seq( ( uint8_t* )( x ), l_x * sizeof( limb_t ), DUMP_MSB ); printf( "', 16 )\n" );
}
// ============================================================================
void test_mpn_add( int trials, int l_min, int l_max ) {
limb_t* x = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_x;
limb_t* y = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_y;
limb_t* r = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_r;
for( int i = 1; i <= trials; i++ ) {
test_id( "test_mpn", "add", i, trials );
l_x = test_rand_seq( ( uint8_t* )( x ), l_min, l_max, sizeof( limb_t ) );
l_y = test_rand_seq( ( uint8_t* )( y ), l_min, l_max, sizeof( limb_t ) );
test_mpn_dump( "x", x, l_x );
test_mpn_dump( "y", y, l_y );
l_r = MAX( l_x, l_y ) + 1;
MEASURE( r[ l_r - 1 ] = mpn_add( r, x, l_x, y, l_y ) );
l_r = mpn_lop( r, l_r );
test_mpn_dump( "r", r, l_r );
printf( "l_x = %d\n", l_x);
printf( "l_y = %d\n", l_y);
printf( "l_r = %d\n", l_r);
printf( "t = x + y " "\n" );
printf( "if ( r != t ) : " "\n" );
printf( " print( 'fail %%s' %% ( id ) )" "\n" );
printf( " print( 'x == %%s' %% ( hex( x ) ) )" "\n" );
printf( " print( 'y == %%s' %% ( hex( y ) ) )" "\n" );
printf( " print( 'r == %%s' %% ( hex( r ) ) )" "\n" );
printf( " print( ' != %%s' %% ( hex( t ) ) )" "\n" );
printf( " sys.exit( 1 ) " "\n\n" );
}
free( x );
free( y );
free( r );
}
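The harness above drives mpn_add but the implementation itself lives elsewhere in the library. As a point of reference, a minimal limb-wise ripple-carry addition can be sketched as follows; this is an illustrative sketch under assumptions, not the library's code — the `limb_t` typedef here (32-bit limbs) and the `sketch_mpn_add` name are stand-ins.

```c
#include <stdint.h>

/* Assumed 32-bit limb; the real limb_t is defined by the library. */
typedef uint32_t limb_t;

/* Add two little-endian limb vectors of lengths l_x and l_y.
 * Writes MAX(l_x,l_y) limbs of the sum into r and returns the final
 * carry, which the harness stores via r[ l_r - 1 ] = mpn_add( ... ). */
static limb_t sketch_mpn_add( limb_t* r, const limb_t* x, int l_x,
                                         const limb_t* y, int l_y ) {
  int n = ( l_x > l_y ) ? l_x : l_y;
  uint64_t c = 0;
  for( int i = 0; i < n; i++ ) {
    uint64_t xi = ( i < l_x ) ? x[ i ] : 0;   /* zero-extend shorter operand */
    uint64_t yi = ( i < l_y ) ? y[ i ] : 0;
    uint64_t t  = xi + yi + c;
    r[ i ] = ( limb_t )( t );                 /* low 32 bits of the column   */
    c      = t >> 32;                         /* carry into the next column  */
  }
  return ( limb_t )( c );
}
```

This is why the harness allocates `MAX( l_x, l_y ) + 1` limbs for the result: the extra limb receives the returned carry, and mpn_lop then trims it whenever the carry is zero.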
void test_mpn_sub( int trials, int l_min, int l_max ) {
limb_t* x = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_x;
limb_t* y = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_y;
limb_t* r = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_r;
for( int i = 1; i <= trials; i++ ) {
test_id( "test_mpn", "sub", i, trials );
l_x = test_rand_seq( ( uint8_t* )( x ), l_min, l_max, sizeof( limb_t ) );
l_y = test_rand_seq( ( uint8_t* )( y ), l_min, l_max, sizeof( limb_t ) );
test_mpn_dump( "x", x, l_x );
test_mpn_dump( "y", y, l_y );
if( mpn_cmp( x, l_x, y, l_y ) >= 0 ) {
l_r = MAX( l_x, l_y ) + 1;
MEASURE( r[ l_r - 1 ] = mpn_sub( r, x, l_x, y, l_y ) );
l_r = mpn_lop( r, l_r );
}
else {
l_r = MAX( l_y, l_x ) + 1;
MEASURE( r[ l_r - 1 ] = mpn_sub( r, y, l_y, x, l_x ) );
l_r = mpn_lop( r, l_r );
}
test_mpn_dump( "r", r, l_r );
if( mpn_cmp( x, l_x, y, l_y ) >= 0 ) {
printf( "t = x - y " "\n" );
}
else {
printf( "t = y - x " "\n" );
}
printf( "if ( r != t ) : " "\n" );
printf( " print( 'fail %%s' %% ( id ) )" "\n" );
printf( " print( 'x == %%s' %% ( hex( x ) ) )" "\n" );
printf( " print( 'y == %%s' %% ( hex( y ) ) )" "\n" );
printf( " print( 'r == %%s' %% ( hex( r ) ) )" "\n" );
printf( " print( ' != %%s' %% ( hex( t ) ) )" "\n" );
printf( " sys.exit( 1 ) " "\n\n" );
}
free( x );
free( y );
free( r );
}
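test_mpn_sub always subtracts the smaller magnitude from the larger, using mpn_cmp to decide which is which. A comparison of this kind can be sketched as below, under the assumption that limb vectors are little-endian with no leading zero limbs (which is what mpn_lop guarantees); the typedef and the `sketch_mpn_cmp` name are stand-ins, not the library's own code.

```c
#include <stdint.h>

/* Assumed 32-bit limb; the real limb_t is defined by the library. */
typedef uint32_t limb_t;

/* Compare magnitudes: returns +1 if x > y, 0 if equal, -1 if x < y.
 * With no leading zero limbs, a longer vector is strictly larger, so
 * lengths are checked first, then limbs from most-significant down. */
static int sketch_mpn_cmp( const limb_t* x, int l_x,
                           const limb_t* y, int l_y ) {
  if( l_x != l_y ) return ( l_x > l_y ) ? +1 : -1;
  for( int i = l_x - 1; i >= 0; i-- ) {
    if( x[ i ] != y[ i ] ) return ( x[ i ] > y[ i ] ) ? +1 : -1;
  }
  return 0;
}
```

Branching on this result keeps the subtraction non-negative, so the emitted Python check can compare against `x - y` or `y - x` accordingly.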
void test_mpn_mul( int trials, int l_min, int l_max ) {
limb_t* x = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_x;
limb_t* y = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_y;
limb_t* r = ( limb_t* )( malloc( ( 2 * l_max + 2 ) * sizeof( limb_t ) ) ); int l_r;
for( int i = 1; i <= trials; i++ ) {
test_id( "test_mpn", "mul", i, trials );
l_x = test_rand_seq( ( uint8_t* )( x ), l_min, l_max, sizeof( limb_t ) );
l_y = test_rand_seq( ( uint8_t* )( y ), l_min, l_max, sizeof( limb_t ) );
test_mpn_dump( "x", x, l_x );
test_mpn_dump( "y", y, l_y );
l_r = l_x + l_y;
MEASURE( mpn_mul( r, x, l_x, y, l_y ) );
l_r = mpn_lop( r, l_r );
test_mpn_dump( "r", r, l_r );
printf( "t = x * y " "\n" );
printf( "if ( r != t ) : " "\n" );
printf( " print( 'fail %%s' %% ( id ) )" "\n" );
printf( " print( 'x == %%s' %% ( hex( x ) ) )" "\n" );
printf( " print( 'y == %%s' %% ( hex( y ) ) )" "\n" );
printf( " print( 'r == %%s' %% ( hex( r ) ) )" "\n" );
printf( " print( ' != %%s' %% ( hex( t ) ) )" "\n" );
printf( " sys.exit( 1 ) " "\n\n" );
}
free( x );
free( y );
free( r );
}
// ============================================================================
int main( int argc, char* argv[] ) {
test_init( argc, argv, "sys" );
test_mpn_add( opt_trials, opt_limb_min, opt_limb_max );
test_mpn_sub( opt_trials, opt_limb_min, opt_limb_max );
test_mpn_mul( opt_trials, opt_limb_min, opt_limb_max );
test_fini();
return 0;
}
// ============================================================================
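Each test normalizes its result with mpn_lop before dumping it. The intended behavior — stripping leading (most-significant) zero limbs while keeping at least one limb — can be sketched as below; again a stand-in under assumptions, not the library's implementation.

```c
#include <stdint.h>

typedef uint32_t limb_t;  /* assumed 32-bit limb; stand-in for the real limb_t */

/* Return the length of x with leading (most-significant) zero limbs
 * stripped, never going below one limb so that zero keeps a limb. */
static int sketch_mpn_lop( const limb_t* x, int l_x ) {
  while( l_x > 1 && x[ l_x - 1 ] == 0 ) {
    l_x--;
  }
  return l_x;
}
```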
Confucian Concept of Self-Cultivation and Social Harmony
This study highlights the notion of self-cultivation, one of the most important heritages of Confucianism with regard to social harmony. Self-cultivation is the first step in developing society, namely how people should develop the self as a basis for developing family, society and the world. The basis of self-cultivation is ren (humanity). If we can practice it correctly, then this will lead to social harmony and peace in practice. Social harmony presupposes diversity. Diversity is an empirical reality. Hence, tolerance in diversity is needed. It is the basis of family ethics and of an ideal society. So, in the process of harmonization, there is always a dialogical understanding between the self and others based on their roles and positions. This is related to another concept of Confucius, that of rectifying names. This is the epistemological foundation. It is not only a theoretical matter, but also a practical one. The objective of such a notion is for people to be aware of their position, role and obligations as part of society. Therefore, self-cultivation allows for the correctness of names, and it is the basis for positive relations in the family and society. Briefly, self-cultivation is the correct way to realize the great harmony in reality and one of the solutions for socio-political problems. There is no harmony without diversity.
"Hence the sovereign may not neglect the cultivation of his own character. Wishing to cultivate his character, he may not neglect to serve his parents. In order to serve his parents, he may not neglect to acquire knowledge of men. In order to know men, he may not dispense with a knowledge of Heaven", and in The Doctrine of the Mean 4 (chapter XIV, 2), it is said in the Book of Poetry, "Happy union with wife and children is like the music of lutes and harps. When there is concord among brethren, the harmony is delightful and enduring. Thus may you regulate your family, and enjoy the pleasure of your wife and children." Furthermore, self-cultivation will lead a person to engage in good interactions with others. Self-cultivation is the basis for governing the state and bringing peace to the world 5 . Self-cultivation leads people to have the virtues of honesty, sincerity, empathy, and related qualities. The point is to arrive at humanity, with an ethical attitude in relation and interaction with others 6 . This interaction aiming at self-cultivation should be carried out consistently. Cyrille J.-D. Javary said that self-cultivation is "rather the tenacious pursuit of a progressive development of that aptitude which pushes humans toward the good, as much for themselves as for others" 7 . The main aim of Confucius' notion is not only to fix the meaning of the notion of self-cultivation, but to put this notion to work, so that every person becomes a chun tzu (pinyin: junzi), translated variously as an exemplary person or a morally noble human being. The result of this aim is the attainment of good family relationships and good social relationships or, in the Confucian context, the great harmony.
Self-cultivation is the first step toward becoming a morally noble human (chun tzu), before regulating family, society and state. It is an obligation for everyone in the state, as we can find in The Great Learning (verse 6): "From the Son of Heaven down to the mass of the people, all must consider the cultivation of the person the root of everything besides." 8 . Self-cultivation is a personal ethics that contributes to social harmony 9 . Self-cultivation is not something we practice for our private satisfaction or because it leads us to some sort of enlightenment. It is meant to allow us to act, and act properly, in our family, our neighborhood, and our country 10 .
A productive self-cultivation will lead people to improved self-esteem and better behavior. This in turn flows on to family, society, and government in beneficial outlooks and actions. And finally, it leads to a peaceful world. Peace in the world cannot be achieved if there is no harmony in what the Confucian concept takes to be the minimal unit of social cohesion, namely the family. A peaceful country is only possible when the people are at peace. A peaceful society is only possible when families are at peace 11 . Peaceful families are only possible if each member of the family understands their rights and obligations, based on the goods of self-cultivation. So, we believe that our society would be better if we talked about self-cultivation. Self-cultivation allows the presence of genuine friendship and interaction among all members of society.
The procedure of self-cultivation as described in The Great Learning is as follows: wishing to cultivate the self, one should first rectify one's heart. Wishing to rectify one's heart, one should first seek to be sincere in one's thoughts. Wishing to be sincere in one's thoughts, one should first extend one's knowledge to the utmost. Such extension of knowledge lies in the investigation of things 12 .
We can say that self-cultivation is a personal ethics which reflects family ethics and social ethics. A human being is truly called a human being when he or she is in a considerate relationship with other human beings. Li (propriety) is the basis of social ethics. That thought leads us to understand that peace in the world will not be achieved if there is no harmony in the family, and that it has to start from the self, as the smallest unit of society.
In Confucianism, self-cultivation is the essential basis for bringing about justice and peace, because the foundation of self-cultivation is ren (humanity). Confucian ethics is based on human relationships within society. This distinguishes Confucian ethics from Western ethics, which is based on human existence taken personally and individually. In this context, Thomas Hosuck Kang explained the Confucian concept as a relational system, or a concept of belonging: "Human being to human being; human being to family; human being to community; human being to society; human being to the state, and human being to the world, human being to the universe. Confucius was, in a sense, the father of Oriental ethics, Oriental philosophy, Oriental sociology, Oriental anthropology, and many other fields. In contrast with Western ethics is based on personal and individual human beings who are only related to God, Confucian ethics is based on human beings in the community and society. Human beings are thinkable only in the relation of humans being. Hence, without community and society, human beings cannot exist" 13 . The family system is the micro-structure of the world. The family is a castle of human beings. Without the family there is no society. The family is the root of the human race, while the world is a macro-structure of the family 14 .
C. The Rectification of Names
The rectification of names is central to understanding the great harmony in Confucianism, as the rational coordination of social interaction. The question that must be answered is: how can the Confucian social concept of the rectification of names be developed or implemented to build some means of achieving unity in the presence of cultural diversity?
The rectification of names means designating names appropriately and matching these names to actions, such that every single name has a set of responsibilities attached to it to ensure harmony. To put it another way, rectifying names aims at establishing correct names and their correct use, for the reason that they are the basis of an ideal society, while incorrect names and the incorrect use of names are the sources out of which grow problems of linguistic, moral, social and political disorder. Names are also a medium for interaction without which the division of labour would be impossible. A name makes possible not only the survival of the human species, but also the progress of civilizational development 15 . So, the correctness of names involved in rectifying names is not only a theoretical matter; it is above all a practical matter 16 . Indeed, the objective of such a notion is that people be aware of their position, function and obligations as part of society.
The use of a name is correct when it is used in a way corresponding to the meaning of the name. And the meaning of a name is established by public norms of social action. So, if there is no conformity between the norms that establish the meaning of the name and the attitude or behavior of the people, we say it is not correct. The important point, from the perspective of Confucianism, is that the norms that establish meaning, because of their social character, are a crucial constituent of social institutions. For instance, we can judge or evaluate whether a leader behaves as a leader only if we evaluate his or her behavior within the context of the norms that establish the meaning of being a leader. In a similar way, a minister should reflect the attitude and behavior of a minister, a father should reflect the attitude and behavior of a father, and a son should reflect the attitude and behavior of a son 17 . Briefly, the correctness of names according to Confucius is the foundation of the great universal harmony. This leads us to ethical components based on a pragmatist theory of meaning.
So, if the use of names is correct and the interrelation between the names themselves is also correct, then an authentic harmony will emerge. Furthermore, the correctness of names is not about truth and falsehood in the truth-functional sense of formal semantics, but about taking responsibility for what we do and giving reasons for doing so. In fact, Confucius' concept of the use of names was triggered by the social and political disorder in China during his period of history. According to Confucius, this condition occurred because the rulers, the ministers, and the people acted in ways that were not in accordance with the positions they occupied. What they said was not the same as what they did, such as a minister not behaving as one.
The interesting point in Confucius' notion of the correctness of names is that there is interdependence between a single name and other names. So, in some sense, this form of holism also extends the notion of harmony. The correctness of a single name has to be integrated with the correctness of other names. 13. Thomas Hosuck Kang, 1997, p 111: There are basically five human relationships: parents and children; the ruler and subject; husband and wife or man and woman; elder and younger or elder brother and younger brother; friend and friend.
That is why Confucius mentioned names in the plural form, not the singular form, when he answered Zilu's question 18 . In other words, a single name cannot be correct independently, separated from the correctness of other names. There is a dialogical understanding between the names themselves; that is, if someone uses a name it has to respond to the intended use of that name as well as relate this use to the use of other names. Hence, there is a harmony here, a harmony between the names themselves. Harmony always contains diversity. Harmony cannot exist without diversity. So, in this context, harmony is not only the objective of the correctness of names; it is also a paradigm, an episteme.
Confucius' notion of the correctness of names gives rise to a notion of social order and peace. Both social order and peace are prerequisites for achieving harmony. Human beings cannot exist on their own. They have to be able to form social relationships with other human beings. How to form a social relationship depends on a person's behavior towards others in daily life. And how to behave towards others is how to know and have humanity. In the view of this article, this is the meaning of the great harmony of Confucius. Good relationships between people are the key to reaching the great harmony and peace. But the first thing to be done to achieve all of this is self-cultivation. It is instructive to look at the experiences of Indonesia on the theme of harmony in social life, especially from the perspective of Confucian adherents.
D. Confucianism In Indonesia
Confucianism came to Indonesia along with the arrival of Chinese merchants and immigrants around the 3rd Century BC 19 . But, from 1967 until 2000, based on the Presidential Instruction (Inpres/Instruksi Presiden) No. 14/1967, Confucianism in Indonesia was derecognized and suppressed. This regulation practically banned the practice of Chinese culture, the expression of Chinese traditional beliefs, and Chinese celebrations and festivities. In short, all things Chinese-affiliated were targeted. This certainly took its toll on Confucianism in Indonesia, as it was considered a Chinese faith 20 . But there were no protests or demonstrations by Confucian adherents to oppose the government. They obeyed the government's decision. This indicated their expression of loyalty to the government, because that was precisely the teaching of Confucianism. They attempted to realize harmony in social life with their high tolerance. The revocation of the instruction by President Abdurrahman Wahid (Gus Dur) in 2000 was based on Pancasila. This is the official and foundational philosophical theory of Indonesia. Pancasila comprises two words originally derived from Sanskrit: 'panca' (five) and 'sīla' (principles). Pancasila is composed of five principles and contends that they are inseparable and interrelated. The five principles are universal principles based on Bhinneka Tunggal Ika, or Unity in Diversity. Among the religions recognized in Indonesia, alongside Islam, Catholicism/Christianity, Hinduism and Buddhism, Confucianism emphasizes moral teaching rather than religious conversion. Hence, anyone can become a Confucian as long as their behavior is in accordance with the moral teachings of Confucius. This is particularly favored by some boards of MATAKIN.
E. The Great Harmony
What is the meaning of harmony according to Confucianism? Problems in social life are very complex. One of these social problems is the identity crisis. It occurs because of incomplete self-cultivation. A person does not know their role and position. Their behavior is not in accordance with the name they bear. Human beings lose their grip on living. In addition, the establishment of religious beliefs is crushed by a materialistic ideology. This refers to the belief that the material is the only measure and standard of truth, while religious beliefs or spirituality are regarded as nothing more than a secondary need to meet the demands of social life.
A return to religious spirituality may occur under the pressure of anxiety. Religious spirituality has been tested and shown capable of providing comfort and understanding for people in need. This means that religion actually has the ability to provide a sense of calm and security, giving a real identity, and thus providing humans with a meaning for their existence.
We can acquire the right knowledge about human nature only when we consider it as a system of social life and study it from an ever-evolving and comprehensive point of view. We must analyze it from its social position and relations, as social life is the essential quality of human existence. This does not mean that Confucius did not consider the role of individual free will. As Thomas Hosuck Kang said, human beings are capable of being thought about only in relationship to other human beings 21 . According to Confucius, names are related to all social dimensions, and Confucius had a serious concern with individuality and sociality for advancing social order and harmony through human relationships. Thomas Hosuck Kang has claimed that Confucius was historically the first to discover this concept 22 .
The nature of a human being comprises three elements: (1) the position of nature, namely the human being is as a personal being and creation of God; (2) the composition of nature, namely the human being consists of soul and body; and (3) nature of nature, namely the human being lives as an individual being and a social being. Each of them is a mono-dualist, the two being united. The three mono-dualisms are united in what is called a monopluralist, unified pluralism 23 . This means that all aspects within a human being must operate in a harmonious, fair and balanced way because each person has rights to be fulfilled. The goal of human life is to achieve godly character. In the context of Confucianism, godly character relates to becoming a chun tzu (an exemplary person).
The balance of the relationship between body and soul is complementary, like melodies in music: they affirm and complement one another. This balance leads to harmony in the individual-social relationship, which will be followed by the chun tzu experience. Hence the concepts of harmony and balance, which are central in Eastern thought. This points to the appropriateness of working with ideals and self-potential. Philosophically, harmony presupposes the existence of different things and implies a certain favorable relationship among them 24.
In Confucianism, a human can only be called human if he or she is able to establish good relations with other humans. This means that human existence depends on the extent to which a person humanizes others, respects others, and upholds the dignity of humanity. This is explained in the Analects of Confucius. A person who wants to advance cannot do so alone; they must also advance others. When you go out your front gate, continue to treat each person as though receiving an honored guest. When directing the actions of subordinates, do so as though officiating at a great ritual sacrifice. Do not do to others what you would not wish done to you. Then, there can be no complaint against you, in your state or in your household 25. What you would not wish done to you, do not do to others, because in the world within the four seas, all men are his brothers 26. This is the meaning of humanity (ren) in Confucianism.
The concept of ren as the core of Confucian moral teaching is the true humanity which every human being possesses. It is the fundamental characteristic which is reflected in human actions. Ren also refers to benevolence. Benevolence means to love humankind. Human beings should follow the humanitarian spirit and start from the self to cultivate harmonious relations with others and with nature 27. Ren is the way of human conduct, the way in which human social life goes on, humanity. Man is supposed to love his fellow men and treat others as he does himself 28. It is the Confucian morality which is the highest principle of humanity. Xinzhong Yao said that morality has been characteristic of Confucian theory and practice. It was on the foundation of Confucianism that various codes of moral life, rules of propriety, patterns of behavior and guidelines for social and daily life were produced and enhanced. Confucianism underlined, and perhaps to a lesser extent continues to underline, the basic structure of society and community, to orient the life of the people and to define their moral standards and ethical ideals in most parts of East Asia 29.
21. Thomas Hosuck Kang, 1997, p 109, Confucius and Confucianism, Questions and Answers, Confucian Publications, Washington, D.C. 22. Thomas Hosuck Kang, 1997, p 111, Confucius and Confucianism, Questions and Answers, Confucian Publications, Washington, D.C. 23. Notonagoro, 1997, p 13, Pancasila Secara Ilmiah Populer, Bumi Aksara, Jakarta. 24. Chenyang Li, 2006, p
This is the teaching in the main chapter (verses 4-6) of the Confucian classic, the Great Learning. "(4) The ancients who wished to illustrate illustrious virtue throughout the kingdom, first ordered well their own states. Wishing to order well their states, they first regulated their families. Wishing to regulate their families, they first cultivated their persons. Wishing to cultivate their persons, they first rectified their hearts.
Wishing to rectify their hearts, they first sought to be sincere in their thoughts. Wishing to be sincere in their thoughts, they first extended to the utmost their knowledge. Such extension of knowledge lay in the investigation of things. (5) Things being investigated, knowledge became complete. Their knowledge being complete, their thoughts were sincere. Their thoughts being sincere, their hearts were then rectified. Their hearts being rectified, their persons were cultivated. Their persons being cultivated, their families were regulated. Their families being regulated, their states were rightly governed. Their states being rightly governed, the whole kingdom was made tranquil and happy. (6) From the Son of Heaven down to the mass of the people, all must consider the cultivation of the person the root of everything besides. 30" The prerequisite for achieving the fruits of self-cultivation is freedom, because freedom is the center of dynamics for human existence. Freedom is a characteristic of self-identity and an expression of humanity. The basis of any moral principle is the dignity of the human person endowed with reason and freedom of self-determination, but the human person is also vulnerable by nature. Individual freedom and self-determination must be related to the common good 31.
The concept of freedom in Confucianism cannot be separated from what Sastrapratedja calls the common good. Freedom in Confucianism is freedom within the boundaries and framework of social relations. Individuality is one element of happiness, but individual freedom must be limited, and the limit of individual freedom is the freedom of others. Thus individual freedom should not be allowed to detract from the freedom of others. When a person's individual freedom meets the freedom of others, we call it sociality. Individuality dissolves in sociality. The identity of "I", "you" and "they" dissolves to become "we", but their identity as individuals remains inherent.
Freedom contains the meaning of responsibility, order and involvement. Hence, it means recognition of plurality and diversity. The acknowledgment of plurality means an attitude of plurality. Freedom must be interpreted in a relational framework with others, where social responsibility must be possessed by every member of society. This is then formulated by Confucius as the great harmony. What Confucius wants with the great harmony can only occur if humans individually base their life orientation on becoming morally noble beings (chun tzu). The exemplary person is one who is able to uphold the dignity of his or her humanity (ren), and this means upholding human rights values. The self-realized human is one who develops civilized society. In other words, the chun tzu is an exemplar for human relationships.
The orientation of a chun tzu's life is not on what can be obtained from others, but on what can be done (given) and accommodated for others. Only a chun tzu has the ability to transform society towards peace. If there is righteousness in the heart, there will be beauty in the character. If there is beauty in the character, there will be harmony in the home. If there is harmony in the home, there will be order in the nation. If there is order in the nation, there will be peace in the world 32. In fact, anyone can become a good Confucian as long as their behavior and attitude is in accordance with ren and his/her name. So, rectifying names is the correct way to realize the great harmony in reality, and one of the solutions for socio-political problems.
F. Conclusion
Many scholars have observed that Confucian teachings can be considered this-worldly. This-worldliness in Confucianism means that each person has to attempt to become a good person, able to establish good relationships with others in order to reach social harmony in daily life.
Harmony in social life is a mutual agreement between all members of society. In harmony, there is diversity and tolerance. Both diversity and tolerance are built up in harmony. So, when we say keep diversity and keep tolerance, it means keep harmony too, because harmony automatically consists of diversity and tolerance. Therefore, it is correct when there is enforcement of the rule of law and ethics as an effort to ensure the attainment of harmony in social life. This means that all members of society should play their role in accordance with the name which they possess.
The concept of Confucian social harmony derives from the basic concept of humanity (ren), which is realized first through self-cultivation, then the family system and the social system. Harmony is a logical consequence of applied Confucian ethics in social life. This harmony is based on ren as a fundamental principle of the whole thought of Confucianism. Humanity is placed within the framework of relationships with other human beings, so that ren is the ideal framework in human relationships. With such an understanding, righteousness (yi), virtue (zhi) and propriety (li) should be applied in the context of relationships between human beings.
A good relationship will develop if there is a division of roles and functions among members of the society. The meaning of this division of roles and functions is to increase harmony and solidarity among all members of society. It will improve effectiveness and efficiency so that the progress of social and cultural work accelerates. For the individual, working means contributing to and participating in the processes of cultural and social development. The division of roles and functions should be conducted in accordance with the names carried.
Order, harmony and solidarity are prerequisites for the emergence of social cohesion in the community, and requirements for the realization of a society that is strong and independent, which in the context of Confucianism is referred to as the great harmony. The order of the social life of the community will only be realized if each individual aspires to the moral qualities of ren (humanity), yi (righteousness), li (propriety), xiao (filial piety), and zhong (loyalty). But before people act, self-cultivation and the rectification of names are the first tasks to be undertaken.
<filename>src/main/java/nl/mvdr/adventofcode/adventofcode2016/day03/SquaresPart2.java
package nl.mvdr.adventofcode.adventofcode2016.day03;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import nl.mvdr.adventofcode.IntSolver;
/**
* Solution to the day 3 puzzle of 2016's Advent of Code:
* <a href="https://adventofcode.com/2016/day/3">Squares With Three Sides</a>.
*
* @author <NAME>
*/
public class SquaresPart2 implements IntSolver {
private static final Logger LOGGER = LoggerFactory.getLogger(SquaresPart2.class);
/**
* {@inheritDoc}
*
* @return number of possible triangles
*/
@Override
public int solve(Stream<String> input) {
List<String> lines = input.collect(Collectors.toList());
int result = 0;
for (int i = 0; i < lines.size(); i = i + 3) {
String[] parts0 = lines.get(i).trim().split(" +");
String[] parts1 = lines.get(i + 1).trim().split(" +");
String[] parts2 = lines.get(i + 2).trim().split(" +");
for (int j = 0; j != 3; j++) {
int a = Integer.parseInt(parts0[j]);
int b = Integer.parseInt(parts1[j]);
int c = Integer.parseInt(parts2[j]);
Triangle triangle = new Triangle(a, b, c);
if (triangle.isPossible()) {
result++;
}
}
}
return result;
}
/**
* Main method.
*
* @param args commandline arguments; these are ignored
*/
public static void main(String[] args) {
SquaresPart2 instance = new SquaresPart2();
String result = instance.solve("input-day03-2016.txt");
LOGGER.info(result);
}
}
|
Evaluation of serum and seminal levels of prostate specific antigen in men with spinal cord injury.
PURPOSE
Recent studies have used prostate specific antigen (PSA) as an indicator of prostate gland activity in patients with spinal cord injury (SCI). Thus, the present study was performed to determine whether SCI can induce alterations in total serum and seminal PSA, and to compare the findings obtained to those of normal men (controls).
MATERIALS AND METHODS
A total of 44 men with SCI (cases, mean age +/- SD 33.98 +/- 9.12 years) and 44 controls (mean age +/- SD 34.09 +/- 9.16 years) were studied. Blood and semen samples were collected after 3 days of abstinence from ejaculation and stored at a controlled temperature between -70C and -79C. Seminal fluid was kept at room temperature for 15 minutes before storage. The tests for determination of total serum and seminal PSA were performed using AxSYM equipment and reagents.
RESULTS
The mean total seminal PSA obtained from patients (0.609 mg/ml) was lower than the 0.773 mg/ml value obtained from controls (p = 0.0012), but the mean total serum PSA of patients (0.918 ng/ml) did not differ significantly from that obtained from controls (0.976 ng/ml, p = 0.9967).
CONCLUSIONS
SCI patients have a significant decrease in total seminal PSA but total serum PSA is not affected by this lesion. |
Reducing Your Risk for Arthritis: The Power of Food
Arthritis is the swelling or tenderness of the joints, and one in four adults within the United States have been diagnosed with some type of it. Arthritis can happen because of genetics and aging, but other factors, such as diet and lifestyle, may contribute to it. This new 5-page publication of the UF/IFAS Food Science and Human Nutrition Department describes the modifiable factors contributing to arthritis and tips to reduce risk for arthritis. It also includes some relevant recipe ideas. Written by Sarah Curl, Jodi Fitzgerald, Danielle Nelson, and Jeanette Andrade. https://edis.ifas.ufl.edu/fs398
Indocyanine Green–Coated Gold Nanoclusters for Photoacoustic Imaging and Photothermal Therapy
Traditional oncology treatment modalities are often associated with a poor therapeutic index. This has driven the development of new targeted treatment modalities, including several based on the conversion of optical light into heat energy (photothermal therapy, PTT) and sound waves (photoacoustic imaging, PA) that can be applied locally. These approaches are especially effective when combined with photoactive nanoparticles that preferentially accumulate in tissues of interest and thereby further increase spatiotemporal resolution. In this study, two clinically used materials that have proven effective in both PTT and PA—indocyanine green (ICG) and gold nanoparticles (AuNPs)—are combined into a single nanoformulation. These particles, “ICG–AuNP clusters,” incorporate high concentrations of both moieties without the need for additional stabilizing or solubilizing reagents. The clusters demonstrate high theranostic efficacy both in vitro and in vivo, compared with ICG alone. Specifically, in an orthotopic mouse model of triple‐negative breast cancer, ICG–AuNP clusters can be injected intravenously, imaged in the tumor by PA, and then combined with near‐infrared laser irradiation to successfully thermally ablate tumors and prolong animal survival. Altogether, this novel nanomaterial demonstrates excellent therapeutic potential for integrated treatment and imaging. |
Read the English version of this article here
Is there a flood of fake news on the internet? Yes. Should something be done about it? Of course. But before we weigh our options, we should get a few things clear.
Fake news did not elect Donald Trump, unless you put many of his tweets in that category. The media are as much to blame for the outcome of this election as anyone else, because journalism failed at its job: informing the public.
Fake news has not taken over the social networks. Most of what we do and read there is still palatable and sometimes valuable. (Otherwise only fools would keep using these platforms.)
Fake news can appear anywhere. I would argue that the New York Times story about Hillary Clinton's emails was a fake, or at least was wildly overblown by other media. But the New York Times is not a fake news outlet.
Trying to wipe out false reports, errors, lies and idiocy online, a right to be forgotten for falsehood, would be as hopeless as trying to correct every conversation among drunks in every bar in the country. Ignorance and stupidity are related pests that will not die out.
The media can no longer expect users to come to them
If the internet age has taught us one thing, it is this: the only sensible treatment (if not cure) for bad information is more good information, and it should take many different forms.
Jeff Jarvis, journalist, author and media expert
First, news media should learn from the fake news factories and use their social tools to spread real news: memes (photos with text on Facebook, Instagram and other platforms), videos and tweets carrying reporting, fact-checking, background and analysis. I am not saying journalists should abandon the article as a form. ZEIT ONLINE should not turn into a collection of cat photos illustrating fun facts, and the Tagesschau should not become a viral 30-second YouTube video.
But we can no longer expect the public to always come to us, to our publications, our broadcasts, our pages, to consume news in the form we dictate. No, instead we must go to the public we have sworn to inform, in the context of their conversations. We should equip citizens with ammunition of truth that they can fire off to enrich their conversations with facts and to correct their friends.
Working with Silicon Valley
It was careless and unhelpful of Mark Zuckerberg to claim at first that fake news had not elected Donald Trump because 99 percent of the content on Facebook is true. (Where is the data to back that up?) But his stance changed quickly, and he has listed what his platform intends to do about the problem. Much of it matches the 15 recommendations that the New York entrepreneur John Borthwick and I addressed to the operators of social platforms. At its core, the point is to get Facebook, Twitter, Google, Instagram, YouTube and the others to work with media and users to supply the public with more good information.
Imagine a photo turning up in your feed of Angela Merkel dancing in a clown costume at a debauched party. You have to laugh. You consider sharing it. But before you do, you notice that the photo comes from a supposed news site you have never heard of, one set up just a day earlier. You also see that an established news outlet has already debunked the image. You see that the person who shared it has repeatedly shared content that turned out to be hoaxes. Will you share it now? And if you do, will you at least add that it is a fake, so you do not end up looking like a fool?
package com.github.loafer.spring.mvc;
import static org.springframework.test.web.servlet.setup.MockMvcBuilders.standaloneSetup;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import com.github.loafer.spring.mvc.mobile.preference.PreferenceController;
import org.junit.Before;
import org.junit.Test;
import org.springframework.mobile.device.site.SitePreferenceHandlerInterceptor;
import org.springframework.mobile.device.site.SitePreferenceHandlerMethodArgumentResolver;
import org.springframework.test.web.servlet.MockMvc;
/**
* @author zhaojh.
*/
public class PreferenceControllerTests {
private MockMvc mockMvc;
@Before
public void setup(){
mockMvc = standaloneSetup(new PreferenceController())
.addInterceptors(new SitePreferenceHandlerInterceptor())
.setCustomArgumentResolvers(new SitePreferenceHandlerMethodArgumentResolver())
.build();
}
@Test
public void preferenceNormal() throws Exception {
mockMvc.perform(get("/preference?site_preference=normal"))
.andExpect(content().string("Site preference is normal"));
}
@Test
public void preferenceMobile() throws Exception {
mockMvc.perform(get("/preference?site_preference=mobile"))
.andExpect(content().string("Site preference is mobile"));
}
@Test
public void preferenceTablet() throws Exception {
mockMvc.perform(get("/preference?site_preference=tablet"))
.andExpect(content().string("Site preference is tablet"));
}
}
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Merlin - Almost Native Python Machine Learning Library: Perceptron Classifier
#
# Copyright (C) 2014-2015 alvations
# URL:
# For license information, see LICENSE.md
import numpy as np
import linear_classifier as lc
class Perceptron(lc.LinearClassifier):
def __init__(self,nr_epochs = 10,learning_rate = 1, averaged = True):
lc.LinearClassifier.__init__(self)
self.trained = False
self.nr_epochs = nr_epochs
self.learning_rate = learning_rate
self.averaged = averaged
self.params_per_round = []
def train(self,x,y):
self.params_per_round = []
x_orig = x[:,:]
x = self.add_intercept_term(x)
nr_x,nr_f = x.shape
nr_c = np.unique(y).shape[0]
w = np.zeros((nr_f,nr_c))
## Randomize the examples
perm = np.random.permutation(nr_x)
for epoch_nr in range(self.nr_epochs):
for nr in range(nr_x):
#print "iter %i" %( epoch_nr*nr_x + nr)
inst = perm[nr]
y_hat = self.get_label(x[inst:inst+1,:],w)
if(y[inst:inst+1,0] != y_hat):
#Increase features of the truth
w[:,y[inst:inst+1,0]] += self.learning_rate*x[inst:inst+1,:].transpose()
#Decrease features of the prediction
w[:,y_hat] += -1*self.learning_rate*x[inst:inst+1,:].transpose()
self.params_per_round.append(w.copy())
self.trained = True
y_pred = self.test(x_orig,w)
acc = self.evaluate(y,y_pred)
self.trained = False
print "Rounds: %i Accuracy: %f" %( epoch_nr,acc)
self.trained = True
if(self.averaged == True):
new_w = 0
for old_w in self.params_per_round:
new_w += old_w
new_w = new_w / len(self.params_per_round)
return new_w
return w
|
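The multiclass update rule used inside `train` can be exercised on its own as a sanity check. The sketch below is a self-contained illustration of that rule (the helper name `perceptron_update` is ours, not part of this library), assuming the same convention of one weight column per class:

```python
import numpy as np

def perceptron_update(w, x_row, y_true, y_pred, learning_rate=1.0):
    """Mirror the update in Perceptron.train: on a mistake,
    boost the true class column and penalize the predicted one."""
    if y_true != y_pred:
        w[:, y_true] += learning_rate * x_row
        w[:, y_pred] -= learning_rate * x_row
    return w

# Two features, three classes, all-zero initial weights.
w = np.zeros((2, 3))
x = np.array([1.0, 2.0])
w = perceptron_update(w, x, y_true=0, y_pred=2)
# Column 0 gains x, column 2 loses x, column 1 is untouched.
```

Averaging the per-epoch snapshots, as the `averaged=True` branch does, simply smooths these per-mistake jumps over the whole run.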
/*
* Parse Fixed ACPI Description Table (FADT)
*/
static int
_parse_fadt(acpi_t *acpi, struct acpi_sdt_hdr *sdt)
{
uint64_t addr;
struct acpi_sdt_fadt *fadt;
uint32_t len;
uint64_t dsdt;
len = 0;
addr = (uint64_t)sdt;
len += sizeof(struct acpi_sdt_hdr);
fadt = (struct acpi_sdt_fadt *)(addr + len);
if ( sdt->revision >= 3 ) {
if ( fadt->x_pm_timer_block.addr_space == 1 ) {
acpi->pm_tmr_port = fadt->x_pm_timer_block.addr;
if ( !acpi->pm_tmr_port ) {
acpi->pm_tmr_port = fadt->pm_timer_block;
}
}
if ( fadt->x_pm1a_ctrl_block.addr_space == 1 ) {
acpi->pm1a_ctrl_block = fadt->x_pm1a_ctrl_block.addr;
if ( !acpi->pm1a_ctrl_block ) {
acpi->pm1a_ctrl_block = fadt->pm1a_ctrl_block;
}
}
if ( fadt->x_pm1b_ctrl_block.addr_space == 1 ) {
acpi->pm1b_ctrl_block = fadt->x_pm1b_ctrl_block.addr;
if ( !acpi->pm1b_ctrl_block ) {
acpi->pm1b_ctrl_block = fadt->pm1b_ctrl_block;
}
}
dsdt = fadt->x_dsdt;
if ( !dsdt ) {
dsdt = fadt->dsdt;
}
} else {
acpi->pm_tmr_port = fadt->pm_timer_block;
acpi->pm1a_ctrl_block = fadt->pm1a_ctrl_block;
acpi->pm1b_ctrl_block = fadt->pm1b_ctrl_block;
dsdt = fadt->dsdt;
}
acpi->pm_tmr_ext = (fadt->flags >> 8) & 0x1;
acpi->smi_cmd_port = fadt->smi_cmd_port;
acpi->acpi_enable = fadt->acpi_enable;
acpi->cmos_century = fadt->century;
return 0;
} |
from django.core.urlresolvers import reverse
from django.test import TestCase
from ..factories import TemplateFactory, RequestFactory
from ...organizations.factories import OrganizationFactory
class RequestCreateViewTestCase(TestCase):
def setUp(self):
self.template = TemplateFactory()
self.organization = OrganizationFactory()
self.url = reverse('organizations_requests:send',
kwargs={'organization': self.organization.slug,
'template': self.template.slug})
def test_status_code_for_home(self):
resp = self.client.get(self.url)
self.assertEqual(resp.status_code, 200)
def test_contains_link_to_organization(self):
resp = self.client.get(self.url)
self.assertContains(resp, self.organization.get_absolute_url())
def test_contains_organization_name(self):
resp = self.client.get(self.url)
self.assertContains(resp, self.organization)
class RequestDetailViewTestCase(TestCase):
def setUp(self):
self.request = RequestFactory()
self.url = reverse('organizations_requests:details',
kwargs={'pk': str(self.request.pk)})
def test_status_code_for_home(self):
resp = self.client.get(self.url)
self.assertEqual(resp.status_code, 200)
def test_contains_link_to_organization(self):
resp = self.client.get(self.url)
self.assertContains(resp, self.request.organization.get_absolute_url())
def test_contains_organization_name(self):
resp = self.client.get(self.url)
self.assertContains(resp, self.request.organization)
|
#ifndef THSHADER_H
#define THSHADER_H
#include <Common/THCommon.h>
#include <Math/THMath.h>
namespace THEngine
{
class AssetManager;
class Texture;
class CubeMap;
class Shader : public Object
{
protected:
ID3DXEffect* effect;
UINT passNum;
String path;
int currentPass = -1;
public:
Shader();
virtual ~Shader();
void SetTechnique(char* technique);
void Use();
void End();
inline void CommitChanges()
{
this->effect->CommitChanges();
}
void UsePass(unsigned int pass);
void EndPass();
inline UINT GetPassNum() { return passNum; }
void SetTexture(char* textureName, Ptr<Texture> texture);
void SetCubeMap(char* textureName, Ptr<CubeMap> cubeMap);
inline void SetInt(char* name, int value)
{
effect->SetInt(name, value);
}
inline void SetFloat(char* name, float value)
{
effect->SetFloat(name, value);
}
inline void SetBoolean(char* name, bool value)
{
effect->SetBool(name, value);
}
inline void SetFloatArray(char* name, float* value, int count)
{
effect->SetFloatArray(name, value, count);
}
inline void SetFloat4(char* name, const Vector4f& vector)
{
effect->SetFloatArray(name, vector._data, 4);
}
inline void SetMatrix(char* name, const Matrix& value)
{
effect->SetMatrix(name, &value.matrix);
}
inline void SetValue(char* name, void* value, int size)
{
effect->SetValue(name, value, size);
}
void OnLostDevice();
void OnResetDevice();
friend class AssetManager;
};
}
#endif |
// SetDefaults sets the default values
func (s *ServerFTP) SetDefaults() {
s.Port = 21
s.Sources = []string{}
s.Timeout = NewDuration(5 * time.Second)
s.DisableEPSV = NewFalse()
s.TLS = NewFalse()
s.InsecureSkipVerify = NewFalse()
s.LogTrace = NewFalse()
} |
//MSD 3 class
public class DBManager {
//Variables for db
private static final int DATABASE_VERSION = 1;
private static String DATABASE_NAME = "";
private static String TABLE_EVENTS = "";
private static String KEY_ID = "";
private static String KEY_EVENT_DATE = "";
private static String KEY_EVENT_START_TIME = "";
private static String KEY_EVENT_END_TIME = "";
private static String KEY_EVENT_COMMENTS = "";
private static String KEY_EVENTS_HOURS_WORKED = "";
private static String KEY_EVENTS_MINUTES_WORKED = "";
private static String CREATE_EVENTS_TABLE = "";
private final Context context; //context received from activity
private MyDatabaseHelper DBHelper; //db helper class
private SQLiteDatabase db; //readable / writable db
// we must pass the context from our class that we called from
public DBManager(Context ctx) {
this.context = ctx;
//Assign strings from String file
DATABASE_NAME = ctx.getString(R.string.event_db_name);
TABLE_EVENTS = ctx.getString(R.string.events_db_table_name);
KEY_ID = ctx.getString(R.string.events_db_id_column_name);
KEY_EVENT_DATE = ctx.getString(R.string.events_db_date_column_name);
KEY_EVENT_START_TIME = ctx.getString(R.string.events_db_start_time_column_name);
KEY_EVENT_END_TIME = ctx.getString(R.string.events_db_end_time_column_name);
KEY_EVENT_COMMENTS = ctx.getString(R.string.events_db_comments_column_name);
KEY_EVENTS_HOURS_WORKED = ctx.getString(R.string.events_db_total_hours_column_name);
KEY_EVENTS_MINUTES_WORKED = ctx.getString(R.string.events_db_total_mins_column_name);
//Assign create statement to string
CREATE_EVENTS_TABLE = "CREATE TABLE " + TABLE_EVENTS + " " + "(" +
KEY_ID + " INTEGER PRIMARY KEY autoincrement, " +
KEY_EVENT_DATE + " TEXT not null, " +
KEY_EVENT_START_TIME + " TEXT not null, " +
KEY_EVENT_END_TIME + " TEXT not null, " +
KEY_EVENT_COMMENTS + " TEXT not null, " +
KEY_EVENTS_HOURS_WORKED + " integer not null, " +
KEY_EVENTS_MINUTES_WORKED + " integer not null);";
//Declare new helper class
DBHelper = new MyDatabaseHelper(context);
}//end constructor
//data base handler class
private static class MyDatabaseHelper extends SQLiteOpenHelper {
public MyDatabaseHelper(Context context) {
super(context, DATABASE_NAME, null, DATABASE_VERSION);
}//end constructor
@Override
public void onCreate(SQLiteDatabase db) {
db.execSQL(CREATE_EVENTS_TABLE);
}//end on create
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
db.execSQL("DROP TABLE IF EXISTS " + TABLE_EVENTS);
onCreate(db);
}//end on upgrade
}//end MyDataBaseHelper
public DBManager open() //throws SQLException {
{
db = DBHelper.getWritableDatabase();
return this;
}//end open
public void close() {
DBHelper.close();
}//end close
public long insertEvent(String date, String startTime, String endTime, String comments, int hours, int minutes) {
ContentValues initialValues = new ContentValues();
initialValues.put(KEY_EVENT_DATE, date);
initialValues.put(KEY_EVENT_START_TIME, startTime);
initialValues.put(KEY_EVENT_END_TIME, endTime);
initialValues.put(KEY_EVENT_COMMENTS, comments);
initialValues.put(KEY_EVENTS_HOURS_WORKED, hours);
initialValues.put(KEY_EVENTS_MINUTES_WORKED, minutes);
return db.insert(TABLE_EVENTS, null, initialValues);
}//end insert event
public long updateEvent(String where, String date, String startTime, String endTime, String comments, int hours, int minutes) {
ContentValues initialValues = new ContentValues();
initialValues.put(KEY_EVENT_DATE, date);
initialValues.put(KEY_EVENT_START_TIME, startTime);
initialValues.put(KEY_EVENT_END_TIME, endTime);
initialValues.put(KEY_EVENT_COMMENTS, comments);
initialValues.put(KEY_EVENTS_HOURS_WORKED, hours);
initialValues.put(KEY_EVENTS_MINUTES_WORKED, minutes);
return db.update(TABLE_EVENTS, initialValues, where, null);
}//end update event
public Cursor getEvent(int id) {
Cursor mCursor = db.rawQuery(
"SELECT * FROM " + TABLE_EVENTS +" WHERE " + KEY_ID + " = " + id + ";", null);
if (mCursor != null) {
mCursor.moveToFirst();
}//end if
return mCursor;
}//end get task
public Cursor getCertainMonths(String condition){
Cursor mCursor = db.rawQuery(
"SELECT * FROM " + TABLE_EVENTS + " WHERE " +
KEY_EVENT_DATE + " LIKE " + "'%/"+ condition +"' ORDER BY " + KEY_EVENT_DATE +" ASC", null);
if (mCursor != null) {
mCursor.moveToFirst();
}//end if
return mCursor;
}//end get certain months
public boolean deleteEvent(int rowId) {
// delete statement. If any rows deleted (i.e. >0), returns true
return db.delete(TABLE_EVENTS, KEY_ID +
"=" + rowId, null) > 0;
}//end deleteEvent
//for testing
/*
public Cursor getAll() {
Cursor mCursor = db.rawQuery(
"SELECT * FROM " + TABLE_EVENTS, null);
if (mCursor != null) {
mCursor.moveToFirst();
}//end if
return mCursor;
}//end get all
*/
} |
<filename>src/main/java/ch/iserver/ace/algorithm/text/DeleteOperation.java
/*
* $Id: DeleteOperation.java 2434 2005-12-12 07:49:51Z sim $
*
* ace - a collaborative editor
* Copyright (C) 2005 <NAME>, <NAME>, <NAME>
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
*/
package ch.iserver.ace.algorithm.text;
import ch.iserver.ace.algorithm.Operation;
/**
* The DeleteOperation is used to hold a text together with its position that is
* to be deleted in the document model.
*/
public class DeleteOperation implements Operation {
/**
* the text to be deleted.
*/
private String text;
/**
* the position in the document where the text is to be deleted.
*/
private int position;
/**
* Class constructor.
*/
public DeleteOperation() {}
/**
* Class constructor.
*
* @param position
* the position into the document
* @param text
* the text to be deleted
*/
public DeleteOperation(int position, String text) {
setPosition(position);
setText(text);
}
/**
* Class constructor.
*
* @param position
* the position into the document
* @param text
* the text to be deleted
* @param isUndo
* flag to indicate whether this operation is an undo
*/
public DeleteOperation(int position, String text, boolean isUndo) {
setPosition(position);
setText(text);
// note: the isUndo flag is currently accepted but not stored
}
/**
* Returns the position.
*
* @return the position
*/
public int getPosition() {
return position;
}
/**
* Sets the position of this operation.
*
* @param position
* the position to set
*/
public void setPosition(int position) {
if (position < 0) {
throw new IllegalArgumentException("position index must be >= 0");
}
this.position = position;
}
/**
* Returns the text length.
*
* @return the length of the text
*/
public int getTextLength() {
return text.length();
}
/**
* Returns the text to be deleted.
*
* @return the text to be deleted
*/
public String getText() {
return text;
}
/**
* Sets the text to be deleted.
*
* @param text
* the text to be deleted
*/
public void setText(String text) {
if (text == null) {
throw new IllegalArgumentException("text may not be null");
}
this.text = text;
}
/**
* {@inheritDoc}
*/
public String toString() {
return "Delete(" + position + ",'" + text + "')";
}
/**
* {@inheritDoc}
*/
public boolean equals(Object obj) {
if (obj == this) {
return true;
} else if (obj == null) {
return false;
} else if (obj.getClass().equals(getClass())) {
DeleteOperation op = (DeleteOperation) obj;
return op.position == position && op.text.equals(text);
} else {
return false;
}
}
/**
* {@inheritDoc}
*/
public int hashCode() {
int hashcode = position;
hashcode += 13 * text.hashCode();
return hashcode;
}
}
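As a rough illustration of the semantics encoded by `DeleteOperation` (this sketch is not part of the ace codebase; the function name is ours), applying a delete operation to a document string removes the given text at the given position:

```python
def apply_delete(document: str, position: int, text: str) -> str:
    """Apply a delete operation to a document string.

    Mirrors DeleteOperation's fields: the characters starting at
    `position` must equal `text`, and they are removed.
    """
    if position < 0:
        raise ValueError("position index must be >= 0")
    if document[position:position + len(text)] != text:
        raise ValueError("document does not contain the expected text")
    return document[:position] + document[position + len(text):]
```

Storing the deleted text (rather than just a length) is what lets the operation be inverted for undo and validated during transformation.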
|
import React from "react";
const Speed = ({ ...others }) => {
return (
<svg
viewBox="0 0 50 50"
fill="none"
xmlns="http://www.w3.org/2000/svg"
{...others}
>
<path
d="M20 15V35L35 25L20 15ZM10.75 7.5L9.25 5.5C13 2.5 17.5 0.5 22.5 0L22.75 2.5C18.25 3 14.25 4.75 10.75 7.5ZM7.5 10.75L5.5 9.25C2.5 13 0.5 17.5 0 22.5L2.5 22.75C3 18.25 4.75 14.25 7.5 10.75ZM7.5 39.25C4.75 35.75 3 31.5 2.5 27.25L0 27.5C0.5 32.5 2.5 37 5.5 41L7.5 39.25ZM22.75 47.5C18.25 47 14.25 45.25 10.75 42.5L9.25 44.5C13 47.5 17.5 49.5 22.5 50L22.75 47.5ZM50 25C50 12 40.25 1.5 27.5 0L27.25 2.5C38.75 3.75 47.5 13.25 47.5 25C47.5 36.75 38.75 46.25 27.25 47.5L27.5 50C40.5 48.75 50 38 50 25Z"
fill="white"
/>
</svg>
);
};
export default Speed;
|
package pl.polsl.service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.multipart.MultipartFile;
import pl.polsl.exception.FFMPEGException;
import pl.polsl.helper.Resolution;
import pl.polsl.model.VideoFiles;
import java.io.IOException;
import java.sql.SQLException;
/**
* Created by Mateusz on 27.11.2016.
*/
public interface StorageService {
VideoFiles store(MultipartFile file, String quality);
@Transactional(propagation = Propagation.REQUIRES_NEW)
void storeFile(VideoFiles videoFile);
VideoFiles downloadVideoFile(Long id);
VideoFiles downloadVideoFile(Long id, String username);
String getExtension(MultipartFile file);
VideoFiles transcode(Long id, String username, Resolution resolution) throws IOException, SQLException, FFMPEGException, InterruptedException;
void transcode(VideoFiles videoFile, Resolution resolution) throws IOException, SQLException, FFMPEGException, InterruptedException;
}
|
package main
type validate func(token *Token) bool
func UserValidator(userId int64) validate {
return func(token *Token) bool {
return userId == token.User
}
}
func UserScenarioValidator(userId int64, scenario string) validate {
return func(token *Token) bool {
return userId == token.User && scenario == token.Scenario
}
}
|
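The Go snippet above builds validators as closures that capture the expected user and scenario. The same pattern, sketched in Python (the `Token` shape and names here are illustrative, inferred from the Go code rather than taken from it):

```python
from typing import Callable


class Token:
    def __init__(self, user: int, scenario: str = ""):
        self.user = user
        self.scenario = scenario


def user_validator(user_id: int) -> Callable[[Token], bool]:
    # The returned closure captures user_id, like the Go version.
    return lambda token: token.user == user_id


def user_scenario_validator(user_id: int, scenario: str) -> Callable[[Token], bool]:
    return lambda token: token.user == user_id and token.scenario == scenario
```

Returning a function rather than a boolean lets callers compose and pass checks around without re-stating the expected values at each call site.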
/*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package app.metatron.discovery.domain.workbook.configurations.chart;
import com.fasterxml.jackson.core.JsonProcessingException;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import app.metatron.discovery.common.GlobalObjectMapper;
/**
* Pie chart spec. Test
*/
public class PieChartTest extends ChartTest {
@Before
public void setUp() {
}
@Test
public void de_serialize() throws JsonProcessingException {
PieChart chart = new PieChart(colorByMeasureForSection(), null, new ChartLegend(), null, fontLargerSize(), null, null,
500,
PieChart.MarkType.SECTOR.name(),
PieChart.SplitLayout.HORIZONTAL.name(), 10);
String chartStr = GlobalObjectMapper.getDefaultMapper().writeValueAsString(chart);
System.out.println(chartStr);
PieChart deSerialized = (PieChart) GlobalObjectMapper.readValue(chartStr, Chart.class);
Assert.assertEquals(chart.getMarkType(), deSerialized.getMarkType());
Assert.assertEquals(chart.getSplitLayout(), deSerialized.getSplitLayout());
Assert.assertEquals(chart.getMaxCategory(), deSerialized.getMaxCategory());
System.out.println("Result : " + deSerialized.toString());
}
}
|
// End of variables declaration//GEN-END:variables
private void gravaDados(int action) {
try {
if (action > INCLUIR && jTxtId.getText().equals("")) {
throw new SQLException("Informe um cadastro para continuar.");
}
CartaoCreditoBandeiras ccB = new CartaoCreditoBandeiras();
CartaoCredito_BandeirasDAO ccbDAO = new CartaoCredito_BandeirasDAO();
try {
ccB.setId(Integer.parseInt(jTxtId.getText()));
} catch (Exception e) {
ccB.setId(0);
}
ccB.setTitle(jTxtNome.getText());
switch (action) {
case INCLUIR:
ccbDAO.create(ccB);
jTxtId.setText(ccB.getId() + "");
break;
case ALTERAR:
ccbDAO.update(ccB);
break;
case EXCLUIR:
ccbDAO.delete(ccB);
break;
}
tools.DefaultMsg.saveDataSuccessfull();
tools.ClearFields.ClearFields(jPanel1);
conn.ConexaoMySQL.finalizarTransacao(true);
} catch (Exception e) {
tools.DefaultMsg.errorMsg(e.getMessage());
conn.ConexaoMySQL.finalizarTransacao(false);
}
} |
package it.isislab.dmason.sim.app.DWoims3D;
import java.awt.Color;
import java.util.List;
import javax.swing.JFrame;
import it.isislab.dmason.experimentals.tools.batch.data.EntryParam;
import it.isislab.dmason.experimentals.tools.batch.data.GeneralParam;
import sim.display.Controller;
import sim.display.GUIState;
import sim.display3d.Display3D;
import sim.engine.SimState;
import sim.portrayal3d.continuous.ContinuousPortrayal3D;
import sim.portrayal3d.simple.SpherePortrayal3D;
public class DWoims3DWithUI extends GUIState {
public Display3D display;
public JFrame displayFrame;
public static String name;
public static String title;
ContinuousPortrayal3D woimsPortrayal = new ContinuousPortrayal3D();
public DWoims3DWithUI(SimState state) {
super(state);
// TODO Auto-generated constructor stub
}
public void init(Controller c)
{
super.init(c);
// make the displayer
display = new Display3D(600, 600, this);
display.setBackdrop(Color.black);
//WireFrameBoxPortrayal3D wireFramePortrayal = new WireFrameBoxPortrayal3D(-0.5,-0.5,-0.5,DParticles3D.gridWidth, DParticles3D.gridHeight, DParticles3D.gridLenght, Color.blue);
display.translate(-100,-100,-100);
display.scale(1.0/200);
display.attach( woimsPortrayal, "Woims!" );
displayFrame = display.createFrame();
displayFrame.setTitle(title);
c.registerFrame(displayFrame); // register the frame so it appears in the "Display" list
displayFrame.setVisible(true);
// uncomment this to try out trails (also need to uncomment out some others in this file, look around)
/* display.attach( trailsPortrayal, "Trails" ); */
}
@Override
public void quit()
{
super.quit();
if (displayFrame!=null) displayFrame.dispose();
displayFrame = null;
display = null;
}
public DWoims3DWithUI(GeneralParam args,String prefix)
{
super(new DWoims3D(args, prefix));
name=String.valueOf(args.getI())+""+(String.valueOf(args.getJ()))+""+(String.valueOf(args.getZ()));
title=String.valueOf("Woims"+args.getI())+""+(String.valueOf(args.getJ()))+""+(String.valueOf(args.getZ()));
}
public DWoims3DWithUI(GeneralParam args,List<EntryParam<String, Object>> simParams,String topicPrefix)
{
super(new DWoims3D(args, simParams, topicPrefix));
name=String.valueOf(args.getI())+""+(String.valueOf(args.getJ()))+""+(String.valueOf(args.getZ()));
title=String.valueOf("Woims"+args.getI())+""+(String.valueOf(args.getJ()))+""+(String.valueOf(args.getZ()));
}
public static String getName() { return "Peer: <"+name+">"; }
@Override
public void start()
{
super.start();
setupPortrayals();
}
@Override
public void load(SimState state)
{
super.load(state);
setupPortrayals();
}
public void setupPortrayals()
{
DWoims3D woims = (DWoims3D)state;
woimsPortrayal.setField(woims.environment);
display.createSceneGraph();
display.reset();
//display.repaint();
}
}
|
It is no coincidence that the deadline Qatar was given to comply with Saudi Arabia’s 13 demands fell on 3 July, the fourth anniversary of the military coup in Egypt that ousted the country’s first democratically elected president.
The link between the two days was made explicit by propagandists for the Saudi and Emirati regimes. On 2 July, Dhahi Khalfan Tamim, the former police chief of Dubai, tweeted: “On 3 July Morsi was ousted. On 3 July Qatar will be ousted. Is it a coincidence?”
The week before, Abdulrahman al-Rasheed, the former general manager of the Saudi-owned Al Arabiya TV, wrote of Qatar: "It is threatening and warning that the confrontation will be similar to what happened at the 'Safwan tent' but we fear for Doha as it may be like the 'Rabaa Square!'"
When an ally commits acts like the massacre of Rabaa square in August 2013 that "likely amounted to crimes against humanity” - these are Human Rights Watch’s words not mine - the normal reaction is to distance yourself from it.
But these are not normal times. The sponsors of the coup in Egypt not only boast about what happened, but also threaten to use the same tactics on their disobedient Gulf neighbour.
They have become drunk with power. If they wield a big stick, they expect everyone to cower. Bahrain did. Qatar, so far, has not.
The final chapter
3 July 2013 was a pivotal event for all sides. For the youth, and the forces which toppled two dictators in Tunisia and Egypt, it was a crushing blow.
For the Gulf monarchies who financed Abdel Fattah al-Sisi, it was the start of the counter-revolution that would shore up their absolute power, kick free elections or any form of parliamentary accountability into the next decade, and leave them with their wealth.
The attempted coup in Turkey last year and the campaign against Qatar today mark nothing less than the final chapter of an operation started four years ago.
Qatar supported the political opposition in Egypt and elsewhere in the region. It gave the Arab Spring a voice, through the reporting of Al Jazeera. Silencing Qatar is thus central to the success of the whole four-year operation. This is the driving force behind the blockade and sanctions today.
Saudi security forces work at a scene after a suicide bomber blew himself up in Mecca, Saudi Arabia 23 June 2017 (Reuters)
The more Saudi, the UAE and Egypt insist that their campaign is about ending the funding of terrorists, the more examples come to light of the collusion of their states with al-Qaeda and the Islamic State (IS) group, evidence which they are now keen to sweep under the table.
I have already written about the release of 1,239 inmates on death row by Prince Bandar bin Sultan, on condition they “go to jihad in Syria”, according to a Saudi Interior Ministry document dated 17 April 2012.
On Wednesday, Middle East Eye published UN documents, dated 3 February this year, in which Egypt placed a hold on a US proposal to add IS entities in Saudi Arabia, Yemen, Libya and Afghanistan-Pakistan to the UN list of sanctioned groups and individuals. They stopped it again in May.
As Madawi al-Rasheed, visiting professor at the Middle East Centre at LSE said, this was "a classic case" of Saudi Arabia not wanting to draw attention to its own terrorism problem.
Bring back Mubarak
Four years ago, the Egyptians who poured onto the streets on 30 June 2013 to demand that Morsi step down looked to the army and to Sisi as a source of stability. Today, however, Egypt is less stable, weaker and poorer by every parameter.
Between 30 and 40 percent of the country is living on $2 a day or less. In May, inflation rose to 30 percent, the highest in three decades. Fuel prices have increased by 200 percent in three years. On 3 July 2013, the US dollar was worth less than six Egyptian pounds. Today, it is worth more than 18. Even the official rate of unemployment - 12.4 percent - is spiralling and the real rate is much higher.
This for a country that has been given at least $50bn from three Gulf States, Saudis, UAE and Kuwait, and a further $12bn bailout from the IMF.
Four years on, the human cost of Sisi’s iron hand is high. The following is a snapshot of his repression, from figures drawn from the Arab Organisation for Human Rights: 2,934 extrajudicial killings, 58,966 arbitrary detentions of whom over 1,000 are under age; 30,177 court sentences; 6,863 military trials; eight politically motivated executions; 11 more on death row. In Sinai, 3,446 civilians have been killed and 5,766 detained, and more than 2,500 houses demolished to establish a buffer zone on the border with Gaza.
Supporters of the Muslim Brotherhood chant slogans and raise four fingers, the symbol known as "Rabaa", which means four in Arabic, remembering those killed in the crackdown on the Rabaa al-Adawiya protest camp in Cairo in 2013 (AFP)
Many who supported Sisi in his coup against Morsi have fled in exile, or been imprisoned. The cleavage between secular and Islamist forces which filled Tahrir Square and loomed so large in Morsi’s day has been rendered irrelevant today as both have joined the ranks of the politically oppressed. When Egypt blocked access to 21 websites, the leftist independent Mada Masr was notably one of them. It was no supporter of the Brotherhood.
Today’s enemy of the state is a prominent human rights lawyer, Khalid Ali, who shot to prominence over his defence of a case in January against a government plan to transfer two uninhabited Red Sea islands to Saudi Arabia. He has been detained for “offending public decency” as have eight members of his Bread and Freedom Party for “misusing social media to incite against the state" and "insulting the president," according to the party's legal advisor.
When the cry now is to bring back Mubarak, or rather his son Gamal, it is not an ironic one. Mubarak is remembered as a competent oligarch in comparison to the venal, stupid, and blood-stained Sisi.
The other side of the story
Egypt today is on its knees, so weakened by misrule it may never again recover. But this is only one side of the story.
The major fault line in the Arab world, which the uprisings of 2011 could not surmount, is created by the distribution of wealth. With the exception of oil-rich Iraq and Algeria, both crippled by clientelism and corruption, the wealth is on one side of the Arab world and the masses are on the other. Without the rich part of the Arab world investing its wealth in its people, the Arab Spring was doomed. This is felt as keenly today as it was in 2011.
To look at the wealthiest Arab countries is to be aghast at how much wealth there is and on whom it is spent. The rankings of sovereign wealth funds tell an interesting story. Firstly, there is immense wealth - the sovereign wealth funds of the GCC amount to $2.8 trillion. At $320bn, Qatar is a modest player, although its population is tiny. Saudi, UAE, Kuwait and Bahrain have assets valued at $2.53 trillion.
Look closer and there is something that does not make sense about the relative size of these funds. The funds held by six Emirati sovereign funds amount to just under $1.3 trillion while the two top Saudi funds are valued at $679bn, only half that. You would expect it to be the other way round. Five extended families in the Middle East own about 60 percent of the world’s oil and the Saud family controls more than one-third of that.
Follow the Saudi money
This is a puzzle and the answer may lie in the black hole of Saudi’s state accounting, something that lawyers on the New York and London stock exchanges will be interested in investigating now that up to five percent of Aramco, the Saudi state oil company, will be up for sale.
In 2003, Robert Baer, a former CIA man who wrote a book on the subject, estimated the size of the family to be 30,000, of whom, he wrote then, between 10,000 and 12,000 were on royal stipends ranging from $800 to $270,000 a month. These figures are 14 years old and would have gone up considerably since.
The cost of funding the Saud family today can be glimpsed in the magically changing numbers for government revenue provided by the General Authority for Statistics (GAS) yearbook. In its scrutiny of the changing figures, the Arab Digest, published in May, claims that huge sums of money are disappearing from the state coffers - an average of $133bn annually.
The transparency that is required on the New York and London stock exchanges about the forthcoming sale of shares in Aramco is shedding an unwelcome spotlight on the central question that the US government has been asking itself about its Saudi ally - just how much is the House of Saud creaming off?
Taxing foreign workers
They are certainly not spending this money on their people, and are scrambling to find other sources of revenue, such as foreign workers. Some 11 million foreign workers are going to be forced to pay in advance for their dependents to live in the country, as a condition of getting their entry visa. Each foreigner will pay $319 for each dependent this year, which will rise to $1,070 by 2020.
Contrary to the image, most of these are not rich expat Brits, but low-paid workers from the Arab world and the Indian subcontinent. Rather than pay these sums, they will send their families back, as they will their salaries. The Saudi state will lose twice over.
The net external assets fell by $36bn in the first quarter of this year, and have dropped from $737bn in August 2014 to $529bn in December 2016.
This is evidence of corruption on a vast scale and suggests that the state coffers are haemorrhaging funds to keep the royal family in the life style to which they have grown accustomed.
The revolution cometh
Now just imagine if in 2011, Saudi Arabia and the Emirates, the richest countries of the Arab world, had taken a different decision. Imagine that instead of investing in a counter-revolution and another decade of repression, they had chosen to invest in democracy and in people.
Imagine that when governments were elected after the first free elections the region had known, they didn't need donor conferences. Or a Marshall plan. The money was already there. All it needed was for one part of the Arab world to have faith in and invest in the other part. For a culture that uses the word brother a lot, fraternity is in short supply.
The Saudis have committed themselves to spend up to $500bn on US arms sales. Donald Trump is very grateful, so thankful in fact that he is cutting his aid to Tunisia, the only Arab state where there are real elections, a real parliament and a functioning although faltering democracy. It is desperately short of foreign investment. Instead of getting an insignificant $177m, it will now get a paltry $54.4m. US aid to the autocratic regimes of Egypt and Jordan decreases only marginally, while Israel continues to get its $3.1bn. As an expression of American values under Trump, these figures are hard to beat.
The rich and powerful chose instead to invest in repression. Four years on, millions of Sunnis are homeless. Mosul, Iraq’s second city, is in ruins. Cholera has broken out in Yemen on Saudi’s doorstep. Devastated by a 27-month war conducted by the Saudi-led coalition, at least 10,000 people have been killed, 3.1 million people are internally displaced and 14.1 million are food insecure.
Does this carnage make the kingdom’s southern border any more secure? Do Yemenis feel beholden to the Saudis after what they have experienced?
As with Egypt, so with the region as a whole. At the very point in which the Saudis and Emiratis seem victorious, they are sowing the seeds for a huge new revolutionary wave to come. This time it will not be based on democracy, the rule of law and non-violence. Nor will it be self-restrained or controllable. But it is coming.
- David Hearst is editor-in-chief of Middle East Eye. He was chief foreign leader writer of The Guardian, former Associate Foreign Editor, European Editor, Moscow Bureau Chief, European Correspondent, and Ireland Correspondent. He joined The Guardian from The Scotsman, where he was education correspondent.
The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Eye.
Photo: (L-R) Saudi Foreign Minister Adel al-Jubeir, UAE Foreign Minister Abdullah Zayed bin al-Nayhan, Egyptian Foreign Minister Sameh al-Shoukry and Bahraini Foreign Minister Khalid bin Ahmed al-Khalifa in Cairo on 5 July (Reuters) |
// NewCancelBidInstructionBuilder creates a new `CancelBid` instruction builder.
func NewCancelBidInstructionBuilder() *CancelBid {
nd := &CancelBid{
AccountMetaSlice: make(ag_solanago.AccountMetaSlice, 11),
}
return nd
} |
Hyperthyroidism in Patients with Graves' Ophthalmopathy, and Thyroidal, Skeletal and Eye Muscle Specific Type 2 Deiodinase Enzyme Activities.
Graves' ophthalmopathy is characterized by hyperthyroidism, which is associated with higher serum T3 levels than T4 due to deiodinase enzymes. The effect of Graves' patients' sera (n=52) with elevated thyroid hormone and TSH receptor or thyroid peroxidase antibody (anti-TPO) levels was investigated on thyroidal, skeletal and eye muscle type 2 deiodinase enzyme (DII) activities. DII activities were measured with 125I-T4 substrate, while thyroid hormone and antibody levels were measured with immunoassays. In Graves' ophthalmopathy, sera with elevated FT4 or FT3 levels reduced DII activities remarkably in all tissue fractions. Thyroidal DII activities were lower than those using the eye muscle fraction (0.6±0.22 vs 1.14±0.43 pmol/mg/min, P<0.006). Sera with increased FT3 levels also demonstrated reduced DII activities in patients with Graves' ophthalmopathy after methimazole therapy compared to those who had no ophthalmopathy (2.88±2 vs 20.42±11.82 pmol/mg/min, P<0.006 for the thyroidal fraction; 4.07±2.72 vs 29.22±15.46 pmol/mg/min, P<0.004 for skeletal muscle; 5.3±3.47 vs 37.87±18.82 pmol/mg/min, P<0.003 for eye muscle). Hyperthyroid sera with TSH receptor antibodies resulted in increased DII activities, while sera with anti-TPO antibodies were connected to lower DII activities in Graves' ophthalmopathy. In summary, the actions of hyperthyroid sera derived from patients with Graves' disease were tested on tissue-specific DII activities. Elevated FT4 level-induced DII inactivation is present in Graves' ophthalmopathy, and seems also to be present at the beginning of methimazole therapy. Stimulating TSH receptor antibodies increased DII activities via their nongenomic effects in sera of hyperthyroid Graves' ophthalmopathy, but anti-TPO antibodies could influence DII activities via altering FT4 levels.
import * as React from "react";
import {
Code,
ControlGroup,
FormGroup,
H3,
InputGroup,
Intent,
} from "@blueprintjs/core";
import {
observer,
} from "mobx-react-lite";
import {
ServiceAppFormI,
} from "@app/Types";
import ErrorIcon from "@app/Components/FormErrorIcon";
type Props = {
/** Used as left-side details */
children?: React.ReactNode,
form: ServiceAppFormI,
};
const AppDetails = observer((props: Props) => {
const onChangeName = (e: React.SyntheticEvent<HTMLInputElement>) => {
props.form.name.onChange(e.currentTarget.value.replace(/\W/g, ""));
};
const onChangeAppRoot = (e: React.SyntheticEvent<HTMLInputElement>) =>
props.form.appRoot.onChange(e.currentTarget.value);
return (
<div className="helper-form">
<div className="left">
<H3>Service Details</H3>
{props.children}
</div>
<div className="right">
<Name form={props.form} onChange={onChangeName} />
<AppRoot form={props.form} onChange={onChangeAppRoot} />
</div>
</div>
);
});
type FieldProps = {
form: ServiceAppFormI,
onChange: (e: React.SyntheticEvent<HTMLInputElement>) => void,
}
const Name = observer((props: FieldProps) => {
const error = props.form.name.error || props.form.nameInUse.error;
return (
<FormGroup
inline
label="Service Name"
labelFor="name"
labelInfo="*"
intent={error ? Intent.DANGER : undefined}
>
<ControlGroup fill>
<InputGroup
id="name"
placeholder="Service Name"
intent={error ? Intent.DANGER : undefined}
value={props.form.name.value}
onChange={props.onChange}
/>
<ErrorIcon hasError={!!error} />
</ControlGroup>
<div className="helper-text">{error}</div>
<p className="mt-2">
Must be unique to each Project. You can leave it as default.
</p>
</FormGroup>
);
});
const AppRoot = observer((props: FieldProps) =>
<FormGroup
inline
label="Path to App Root"
labelFor="appRoot"
labelInfo="*"
intent={props.form.appRoot.error ? Intent.DANGER : undefined}
className="mb-0"
>
<ControlGroup fill>
<InputGroup
id="appRoot"
placeholder="Path to App Root"
intent={props.form.appRoot.error ? Intent.DANGER : undefined}
value={props.form.appRoot.value}
onChange={props.onChange}
/>
<ErrorIcon hasError={!!props.form.appRoot.error} />
</ControlGroup>
<div className="helper-text">{props.form.appRoot.error}</div>
<p className="mt-2">
Location of your project files on host machine.
The contents will be made available inside the container
at <Code>/var/www</Code>.
</p>
<p className="mt-2">
Windows users: You must use forward-slash
<Code>c:/dev/my-project</Code> or
double back-slash <Code>c:\\dev\\my-project</Code>.
</p>
</FormGroup>,
);
export default AppDetails;
|
package client
const (
DefaultAPIURL = "https://wd5-impl-services1.workday.com/ccx/service"
DefaultAPIVersion = "v27.1"
DefaultPageSize = 999
DefaultTimeout = 60
)
type Config struct {
APIURL string
APIVersion string
Username string
Password string
Tenant string
PageSize int
Timeout int
}
|
def read_memory(self, addr: int, transfer_size: int = 32, now: bool = True) -> Union[int, Callable[[], int]]:
if transfer_size != 32:
raise exceptions.DebugError("unsupported transfer size")
return self._dp.read_ap(self._offset + addr, now) |
n_hombres = input()
skills_h = [int(x) for x in input().split()]
n_mujeres = input()
skills_m = [int(x) for x in input().split()]


def greedy(list1, list2):
    # Sort both skill lists so the smallest values are tried first.
    list1 = sorted(list1)
    list2 = sorted(list2)
    # Iterate over the shorter list, matching against the longer one.
    if len(list1) >= len(list2):
        big, small = list1, list2
    else:
        big, small = list2, list1
    count = 0
    used = []
    for element in small:
        for j, candidate in enumerate(big):
            # Two skills can be paired if they differ by at most 1
            # and the candidate has not been matched yet.
            if abs(candidate - element) <= 1 and j not in used:
                used.append(j)
                count += 1
                break
    print(count)


greedy(skills_h, skills_m)
|
import sys
inputs = sys.stdin.readlines()
n, m = map(int, inputs[0].split())
connections = []
for i in range(1, m+1):
connection = [0] * n
_, *S = map(int, inputs[i].split())
for s in S:
connection[s-1] = 1
connections.append(connection)
*P, = map(int, inputs[i+1].split())
patterns = []
for i in range(2 ** n):
pattern = [(i >> j) & 1 for j in range(n)]
on_switches = [sum(c and p for c, p in zip(connection, pattern)) for connection in connections]
if all(p == s%2 for p, s in zip(P, on_switches)):
patterns.append(pattern)
print(len(patterns)) |
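The `(i >> j) & 1` expression above is the standard bitmask trick for enumerating every on/off combination of `n` switches. Isolated as a helper (the function name is ours, not from the snippet):

```python
def all_patterns(n: int):
    """Enumerate all 2**n on/off patterns of n switches as 0/1 lists.

    Bit j of integer i encodes the state of switch j in pattern i.
    """
    return [[(i >> j) & 1 for j in range(n)] for i in range(2 ** n)]
```

For small `n` this brute force is cheap: with `n` switches there are only `2**n` candidate patterns to check against the bulb parities.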
//
// Copyright © 2017 Arm Ltd. All rights reserved.
// SPDX-License-Identifier: MIT
//
#pragma once
#ifndef LOG_TAG
#define LOG_TAG "ArmnnDriverTests"
#endif // LOG_TAG
#include "../ArmnnDriver.hpp"
#include <iosfwd>
#include <boost/test/unit_test.hpp>
namespace android
{
namespace hardware
{
namespace neuralnetworks
{
namespace V1_0
{
std::ostream& operator<<(std::ostream& os, ErrorStatus stat);
} // namespace android::hardware::neuralnetworks::V1_0
} // namespace android::hardware::neuralnetworks
} // namespace android::hardware
} // namespace android
namespace driverTestHelpers
{
std::ostream& operator<<(std::ostream& os, V1_0::ErrorStatus stat);
struct ExecutionCallback : public V1_0::IExecutionCallback
{
ExecutionCallback() : mNotified(false) {}
Return<void> notify(ErrorStatus status) override;
/// wait until the callback has notified us that it is done
Return<void> wait();
private:
// use a mutex and a condition variable to wait for asynchronous callbacks
std::mutex mMutex;
std::condition_variable mCondition;
// and a flag, in case we are notified before the wait call
bool mNotified;
};
class PreparedModelCallback : public V1_0::IPreparedModelCallback
{
public:
PreparedModelCallback()
: m_ErrorStatus(ErrorStatus::NONE)
, m_PreparedModel()
{ }
~PreparedModelCallback() override { }
Return<void> notify(ErrorStatus status,
const android::sp<V1_0::IPreparedModel>& preparedModel) override;
ErrorStatus GetErrorStatus() { return m_ErrorStatus; }
android::sp<V1_0::IPreparedModel> GetPreparedModel() { return m_PreparedModel; }
private:
ErrorStatus m_ErrorStatus;
android::sp<V1_0::IPreparedModel> m_PreparedModel;
};
#ifdef ARMNN_ANDROID_NN_V1_2
class PreparedModelCallback_1_2 : public V1_2::IPreparedModelCallback
{
public:
PreparedModelCallback_1_2()
: m_ErrorStatus(ErrorStatus::NONE)
, m_PreparedModel()
, m_PreparedModel_1_2()
{ }
~PreparedModelCallback_1_2() override { }
Return<void> notify(ErrorStatus status, const android::sp<V1_0::IPreparedModel>& preparedModel) override;
Return<void> notify_1_2(ErrorStatus status, const android::sp<V1_2::IPreparedModel>& preparedModel) override;
ErrorStatus GetErrorStatus() { return m_ErrorStatus; }
android::sp<V1_0::IPreparedModel> GetPreparedModel() { return m_PreparedModel; }
android::sp<V1_2::IPreparedModel> GetPreparedModel_1_2() { return m_PreparedModel_1_2; }
private:
ErrorStatus m_ErrorStatus;
android::sp<V1_0::IPreparedModel> m_PreparedModel;
android::sp<V1_2::IPreparedModel> m_PreparedModel_1_2;
};
#endif
hidl_memory allocateSharedMemory(int64_t size);
android::sp<IMemory> AddPoolAndGetData(uint32_t size, Request& request);
void AddPoolAndSetData(uint32_t size, Request& request, const float* data);
template<typename HalPolicy,
typename HalModel = typename HalPolicy::Model,
typename HalOperand = typename HalPolicy::Operand>
void AddOperand(HalModel& model, const HalOperand& op)
{
model.operands.resize(model.operands.size() + 1);
model.operands[model.operands.size() - 1] = op;
}
template<typename HalPolicy, typename HalModel = typename HalPolicy::Model>
void AddIntOperand(HalModel& model, int32_t value)
{
using HalOperand = typename HalPolicy::Operand;
using HalOperandType = typename HalPolicy::OperandType;
using HalOperandLifeTime = typename HalPolicy::OperandLifeTime;
DataLocation location = {};
location.offset = model.operandValues.size();
location.length = sizeof(int32_t);
HalOperand op = {};
op.type = HalOperandType::INT32;
op.dimensions = hidl_vec<uint32_t>{};
op.lifetime = HalOperandLifeTime::CONSTANT_COPY;
op.location = location;
model.operandValues.resize(model.operandValues.size() + location.length);
*reinterpret_cast<int32_t*>(&model.operandValues[location.offset]) = value;
AddOperand<HalPolicy>(model, op);
}
template<typename HalPolicy, typename HalModel = typename HalPolicy::Model>
void AddBoolOperand(HalModel& model, bool value)
{
using HalOperand = typename HalPolicy::Operand;
using HalOperandType = typename HalPolicy::OperandType;
using HalOperandLifeTime = typename HalPolicy::OperandLifeTime;
DataLocation location = {};
location.offset = model.operandValues.size();
location.length = sizeof(uint8_t);
HalOperand op = {};
op.type = HalOperandType::BOOL;
op.dimensions = hidl_vec<uint32_t>{};
op.lifetime = HalOperandLifeTime::CONSTANT_COPY;
op.location = location;
model.operandValues.resize(model.operandValues.size() + location.length);
*reinterpret_cast<uint8_t*>(&model.operandValues[location.offset]) = static_cast<uint8_t>(value);
    AddOperand<HalPolicy>(model, op);
}
template<typename T>
OperandType TypeToOperandType();
template<>
OperandType TypeToOperandType<float>();
template<>
OperandType TypeToOperandType<int32_t>();
template<typename HalPolicy,
typename T,
typename HalModel = typename HalPolicy::Model,
typename HalOperandType = typename HalPolicy::OperandType,
typename HalOperandLifeTime = typename HalPolicy::OperandLifeTime>
void AddTensorOperand(HalModel& model,
const hidl_vec<uint32_t>& dimensions,
const T* values,
HalOperandType operandType = HalOperandType::TENSOR_FLOAT32,
HalOperandLifeTime operandLifeTime = HalOperandLifeTime::CONSTANT_COPY)
{
using HalOperand = typename HalPolicy::Operand;
uint32_t totalElements = 1;
for (uint32_t dim : dimensions)
{
totalElements *= dim;
}
DataLocation location = {};
location.length = totalElements * sizeof(T);
if(operandLifeTime == HalOperandLifeTime::CONSTANT_COPY)
{
location.offset = model.operandValues.size();
}
HalOperand op = {};
op.type = operandType;
op.dimensions = dimensions;
    op.lifetime = operandLifeTime;
op.location = location;
model.operandValues.resize(model.operandValues.size() + location.length);
for (uint32_t i = 0; i < totalElements; i++)
{
*(reinterpret_cast<T*>(&model.operandValues[location.offset]) + i) = values[i];
}
AddOperand<HalPolicy>(model, op);
}
template<typename HalPolicy,
typename T,
typename HalModel = typename HalPolicy::Model,
typename HalOperandType = typename HalPolicy::OperandType,
typename HalOperandLifeTime = typename HalPolicy::OperandLifeTime>
void AddTensorOperand(HalModel& model,
const hidl_vec<uint32_t>& dimensions,
const std::vector<T>& values,
                      HalOperandType operandType = HalOperandType::TENSOR_FLOAT32,
HalOperandLifeTime operandLifeTime = HalOperandLifeTime::CONSTANT_COPY)
{
AddTensorOperand<HalPolicy, T>(model, dimensions, values.data(), operandType, operandLifeTime);
}
template<typename HalPolicy,
typename HalModel = typename HalPolicy::Model,
typename HalOperandType = typename HalPolicy::OperandType>
void AddInputOperand(HalModel& model,
const hidl_vec<uint32_t>& dimensions,
HalOperandType operandType = HalOperandType::TENSOR_FLOAT32)
{
using HalOperand = typename HalPolicy::Operand;
using HalOperandLifeTime = typename HalPolicy::OperandLifeTime;
HalOperand op = {};
op.type = operandType;
op.scale = operandType == HalOperandType::TENSOR_QUANT8_ASYMM ? 1.f / 255.f : 0.f;
op.dimensions = dimensions;
op.lifetime = HalOperandLifeTime::MODEL_INPUT;
AddOperand<HalPolicy>(model, op);
model.inputIndexes.resize(model.inputIndexes.size() + 1);
model.inputIndexes[model.inputIndexes.size() - 1] = model.operands.size() - 1;
}
template<typename HalPolicy,
typename HalModel = typename HalPolicy::Model,
typename HalOperandType = typename HalPolicy::OperandType>
void AddOutputOperand(HalModel& model,
const hidl_vec<uint32_t>& dimensions,
HalOperandType operandType = HalOperandType::TENSOR_FLOAT32)
{
using HalOperand = typename HalPolicy::Operand;
using HalOperandLifeTime = typename HalPolicy::OperandLifeTime;
HalOperand op = {};
op.type = operandType;
op.scale = operandType == HalOperandType::TENSOR_QUANT8_ASYMM ? 1.f / 255.f : 0.f;
op.dimensions = dimensions;
op.lifetime = HalOperandLifeTime::MODEL_OUTPUT;
AddOperand<HalPolicy>(model, op);
model.outputIndexes.resize(model.outputIndexes.size() + 1);
model.outputIndexes[model.outputIndexes.size() - 1] = model.operands.size() - 1;
}
android::sp<V1_0::IPreparedModel> PrepareModelWithStatus(const V1_0::Model& model,
armnn_driver::ArmnnDriver& driver,
ErrorStatus& prepareStatus,
ErrorStatus expectedStatus = ErrorStatus::NONE);
#if defined(ARMNN_ANDROID_NN_V1_1) || defined(ARMNN_ANDROID_NN_V1_2)
android::sp<V1_0::IPreparedModel> PrepareModelWithStatus(const V1_1::Model& model,
armnn_driver::ArmnnDriver& driver,
ErrorStatus& prepareStatus,
ErrorStatus expectedStatus = ErrorStatus::NONE);
#endif
template<typename HalModel>
android::sp<V1_0::IPreparedModel> PrepareModel(const HalModel& model,
armnn_driver::ArmnnDriver& driver)
{
ErrorStatus prepareStatus = ErrorStatus::NONE;
return PrepareModelWithStatus(model, driver, prepareStatus);
}
#ifdef ARMNN_ANDROID_NN_V1_2
android::sp<V1_2::IPreparedModel> PrepareModelWithStatus_1_2(const armnn_driver::hal_1_2::HalPolicy::Model& model,
armnn_driver::ArmnnDriver& driver,
ErrorStatus& prepareStatus,
ErrorStatus expectedStatus = ErrorStatus::NONE);
template<typename HalModel>
android::sp<V1_2::IPreparedModel> PrepareModel_1_2(const HalModel& model,
armnn_driver::ArmnnDriver& driver)
{
ErrorStatus prepareStatus = ErrorStatus::NONE;
return PrepareModelWithStatus_1_2(model, driver, prepareStatus);
}
#endif
ErrorStatus Execute(android::sp<V1_0::IPreparedModel> preparedModel,
const Request& request,
ErrorStatus expectedStatus = ErrorStatus::NONE);
android::sp<ExecutionCallback> ExecuteNoWait(android::sp<V1_0::IPreparedModel> preparedModel,
const Request& request);
} // namespace driverTestHelpers
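The AddIntOperand/AddBoolOperand/AddTensorOperand helpers above all follow the same scheme: append the constant's bytes to the model's flat operandValues buffer and record a DataLocation (offset, length) on the operand before adding it. A minimal Python sketch of that packing scheme (the function name and dict layout are hypothetical, purely for illustration; they are not the NNAPI types):

```python
import struct

def add_int_operand(operand_values: bytearray, value: int) -> dict:
    # Capture the offset BEFORE the buffer grows, as AddIntOperand does
    # with model.operandValues.size(), then append the little-endian int32.
    location = {"offset": len(operand_values), "length": 4}
    operand_values += struct.pack("<i", value)
    return {"type": "INT32", "lifetime": "CONSTANT_COPY", "location": location}

buf = bytearray()
op = add_int_operand(buf, 7)
print(op["location"], struct.unpack("<i", bytes(buf[0:4]))[0])
# {'offset': 0, 'length': 4} 7
```

Successive constants simply land at increasing offsets in the shared buffer, which is why each helper reads the current buffer size before resizing it.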
|
// called when the global settings checkbox is changed
void LLFloaterAutoReplaceSettings::onAutoReplaceToggled()
{
mEnabled = childGetValue("autoreplace_enable").asBoolean();
LL_DEBUGS("AutoReplace")<< "autoreplace_enable " << ( mEnabled ? "on" : "off" ) << LL_ENDL;
} |
import numpy as np
import pandas as pd
import statsmodels.api as sm


def grangers_causation_table(data, xnames, ynames, maxlag,
                             test='ssr_chi2test', alpha=None):
res = pd.DataFrame(np.zeros((len(xnames), len(ynames))),
columns=ynames, index=xnames)
for c in res.columns:
for r in res.index:
test_result = sm.tsa.stattools.grangercausalitytests(data[[r, c]],
maxlag=maxlag, verbose=False)
p_values = [ round(test_result[i+1][0][test][1],4)
for i in range(maxlag) ]
min_p_value = np.min(p_values)
res.loc[r, c] = min_p_value
res.columns = res.columns + '_y'
res.index = res.index + '_x'
if alpha is None:
res.index.name = 'Granger Causation Table'
return res
res.index.name = 'Granger Causation Table alpha={}'.format(alpha)
return res < alpha |
/**
 * Cycles this player's PlayerType through the available options
 * (NOT_PLAYING -> HUMAN -> AI -> NOT_PLAYING).
 */
private void changePlayerType() {
switch( this.type ) {
case NOT_PLAYING: this.type = PlayerType.HUMAN; break;
case HUMAN: this.type = PlayerType.AI; break;
case AI: this.type = PlayerType.NOT_PLAYING; break;
}
this.playerTypeBtn.setText(this.type.name());
} |
/**
* Get a random unsigned long integer with a uniform distribution between 0 and
* a number.
*
* @param max
* Maximum value of the number, exclusive.
* @return A random unsigned long integer with a uniform distribution between 0
* and a number.
* @throws IllegalArgumentException
* If max is 0.
*/
public default long generateUniformUnsignedLong(final long max) {
if (0 == max) {
throw new IllegalArgumentException();
}
long result;
final long moduloBias = Long.remainderUnsigned(-max, max);
if (moduloBias == 0) {
result = this.generateLong();
} else {
final long unbiasedMaximum = 0 - moduloBias;
do {
result = this.generateLong();
} while (Long.compareUnsigned(result, unbiasedMaximum) >= 0);
}
return Long.remainderUnsigned(result, max);
} |
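generateUniformUnsignedLong above avoids modulo bias by rejection sampling: any raw word at or above the largest multiple of max that fits in the word range is discarded before the final reduction. A small Python sketch of the same idea (hypothetical names; bits is made a parameter so the rejection path can be exercised with a tiny word size):

```python
def uniform_below(max_value: int, next_word, bits: int = 64) -> int:
    # Reject raw draws in the "biased tail" [unbiased_maximum, 2**bits),
    # then reduce the accepted draw modulo max_value.
    if max_value == 0:
        raise ValueError("max_value must be nonzero")
    span = 1 << bits
    unbiased_maximum = span - span % max_value  # largest multiple of max_value <= span
    while True:
        r = next_word()
        if r < unbiased_maximum:
            return r % max_value

# With a 3-bit word (values 0..7) and max_value=6, raw draws 6 and 7 are rejected.
draws = iter([6, 7, 3])
print(uniform_below(6, lambda: next(draws), bits=3))  # 3
```

Note that `span % max_value` equals the Java `Long.remainderUnsigned(-max, max)` term, since 2^64 - max is congruent to 2^64 modulo max.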
package storage
import (
"crypto/sha256"
"encoding/base32"
)
type Hash string
const encodeStd = "abcdefghijklmnopqrstuvwxyz234567"
func CalculateHash(b []byte) Hash {
h := sha256.New()
// hash.Hash#Write never returns an error
_, _ = h.Write(b)
return Hash(base32.NewEncoding(encodeStd).EncodeToString(h.Sum(nil))[:12])
}
func ExampleHash() Hash {
return "aaaaaaaaaaaa"
}
func ExampleHash2() Hash {
return "aaaaaaaaaaab"
}
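CalculateHash above is SHA-256, base32-encoded with a lowercase alphabet and truncated to 12 characters. A Python re-implementation sketch for cross-checking outputs (the function name is ours; Python's standard RFC 4648 base32 alphabet matches encodeStd after lower-casing):

```python
import base64
import hashlib

def calculate_hash(data: bytes) -> str:
    # SHA-256 digest, base32 with alphabet "abcdefghijklmnopqrstuvwxyz234567",
    # truncated to the first 12 characters (as in the Go CalculateHash).
    digest = hashlib.sha256(data).digest()
    return base64.b32encode(digest).decode("ascii").lower()[:12]

print(calculate_hash(b"hello"))  # 12 lowercase base32 characters
```

Truncating to 12 base32 characters keeps 60 bits of the digest, which is plenty for a short content-addressed key while staying filename-friendly.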
|
Archaeologists have discovered evidence of what they believe is the earliest known journey in British history: a 250-mile trip from York to Wiltshire made 7,000 years ago by a Mesolithic hunter-gatherer and his dog.
Scientific investigation of a dog’s tooth found at a site one mile from Stonehenge has thrown up a number of exciting discoveries, said archaeologist David Jacques, who leads the team digging at an encampment site called Blick Mead.
For one thing, it shows that Mesolithic man was using a domesticated dog, probably for hunting. But more fascinating is that the animal originally came from an area that is now the Vale of York, suggesting the pair made a long, gruelling journey.
Jacques said the findings showed that the dog, and the people travelling with it, came to what is now Wiltshire. This is the earliest evidence of a journey that has been unearthed in Britain.
The clinching evidence was found by researchers at Durham University, who carried out an isotope analysis of the tooth enamel, which showed the dog drank water that came from the Vale of York area. They also speculate that the dog would have been roughly the same size, shape and colour of an alsatian, albeit more wolf-like.
David Jacques in Blick Mead. The site has yielded evidence of the earliest settlement near Stonehenge. Photograph: University of Buckingham/PA
Jacques said the Durham analysis was “a world first, it’s a big deal”, one that substantially increased what archaeologists know about Blick Mead. “It is very hazy and this evidence just makes the glass slightly less dark; it is a significant movement forwards,” he said.
Previously, artefacts had been found which implied Mesolithic man had travelled long distances to get to the site. Jacques and his team believe people were coming to the spot over a near 4,000-year period, from 7900BC to 4000BC.
“It is an amazing sequence,” he added. “There is nothing like it in Europe, and now we’ve got this evidence from the dog you start to piece it together. You can see that this place seems to have been special to not just local people, it seems to have been drawing in people from long distances away and the sort of distances you would not expect for mobile hunter-gatherers.”
Jacques has led University of Buckingham digs at Blick Mead for nearly a decade, believing the area is key to the beginnings of people living in Britain because evidence of occupancy covers such a long period of time.
Archaeologists have criticised plans to construct a 1.8-mile tunnel past Stonehenge in Wiltshire. Photograph: Steve Parsons/PA
Other discoveries include evidence of Mesolithic people feasting on huge oxen, known as aurochs, salmon, trout, hazelnuts and even frog’s legs, about eight millennia before they became a French staple.
Jacques believes Blick Mead is crucial to our understanding of the stone circle at Stonehenge, erected in the late Neolithic period, at about 2500BC.
“Discoveries like this give us a completely new understanding of the establishment of the ritual landscape and make Stonehenge even more special than we thought we knew it was,” said Jacques.
It raises the question of why people would travel such long distances to get to the site. “It makes us wonder if this place is a hub point, a really important place for the spread of ideas, new technologies and probably genes,” said Jacques.
“You have small dispersed populations of people in Britain and people are probably getting it together and families are coming out of that. In a sense it is probably quite a multicultural environment.”
The Blick Mead site is close to the busy A303 and only 100 metres from the site of a proposed 1.8-mile tunnel, which would remove the road from the Stonehenge site.
Jacques is firmly against a tunnel, fearing it could alter the water table and make it impossible to continue digging at Blick Mead. “It would be devastating if the tunnel obliterated our chance of piecing together the jigsaw to explain why Stonehenge was built,” he said.
Examination of a trench in Blick Mead has led to the discovery of a charred toad’s leg, bones of trout or salmon as well as the remains of cooked aurochs. Photograph: University of Buckingham/PA
The tunnel is supported by the National Trust and Historic England, which believe the road is a blight.
Andy Rhind-Tutt, chairman of Amesbury Museum and Heritage Trust, said the discoveries were rewriting the history books of Mesolithic Britain. “Blick Mead is without doubt one of the greatest national discoveries ever made in the Stonehenge landscape.
“As we edge towards a road improvement plan that could see a disastrous, ineffective tunnel, I desperately hope Historic England and the National Trust recognise what a key site this is and ensure it is protected and preserved so that we can carry on unlocking the history of Stonehenge.”
Historic England said any tunnel was likely to be well away from the Blick Mead site.
Phil McMahon, inspector of ancient monuments for Historic England in the south west said: “There’s no scheme available yet to understand the exact impacts of any road improvement proposal.
“But from our work on possible tunnel locations advising Department for Transport and Highways England over the past two and a half years, we understand that any tunnel scheme is likely to be well away from the Blick Mead site.” |
Recombination between a temperature-sensitive mutant and a deletion mutant of Rous sarcoma virus
Cells doubly infected with two mutants of the Schmidt-Ruppin strain of Rous sarcoma virus (RSV), ts68, which is temperature sensitive for cell transformation (srcts), and a deletion mutant, N8, which is deficient in the envelope glycoprotein (env-), produced a recombinant which carried the defects of both parents. The frequency of formation of such a recombinant was exceptionally high and made up 45 to 55% of the progeny carrying the srcts marker. By contrast, the reciprocal recombinant, which is wild type in transformation (src+) and contains the subgroup A envelope glycoprotein (envA), was almost undetectable. This remarkable difference in the frequency of the formation of the two possible recombinants suggests that a unique mechanism may be involved in the genetic interaction of the two virus genomes, one of which has a large deletion. When an RNA-dependent DNA polymerase-negative variant of N8 (N8alpha) was crossed with ts68, the recombinants also became deficient in the polymerase. Cells infected by the srcts env- recombinant were morphologically normal at the nonpermissive temperature (41 degrees C) and susceptible to all subgroups of RSV. The rate at which the wild-type RSV transformed the recombinant-preinfected cells was indistinguishable from that of transformation of uninfected chicken cells by the same wild-type virus. This indicates that no detectable interference exists at postpenetration stages between the preinfected and superinfecting virus genomes and confirms that the expression of the transformed state is dominant over the suppressed state.
package sampleLib;
import javax.servlet.jsp.*;
import javax.servlet.jsp.tagext.*;
public class SwitchTag extends TagSupport
{
public void setPageContext(PageContext pageContext)
{
super.setPageContext(pageContext);
//Sets the internal flag that tells whether or not a matching
//case tag has been found to be false.
setValue("caseFound", Boolean.FALSE);
}
//stores the value of the match attribute
public void setConditionValue(String value)
{
setValue("conditionValue", value);
}
public int doStartTag() throws JspException
{
return EVAL_BODY_INCLUDE;
}
}
|
/**
 * Add an inventory area to the array if it isn't already included in the array
 *
 * @param area pointer to the area to be added
 * @return true if the area was added, false if it was already present
 **/
bool NewSimulatorInventory::AddInventoryArea( NewSimulatorInventoryArea *area ) {
if ( FindInventoryArea( area ) ) {
return false;
}
if (area->Num() > m_area_id)
m_area_id = area->Num();
m_areas.Add( area );
m_inv_info.NumAreas = m_areas.Num();
return true;
} |
#![doc = "generated by AutoRust"]
#![allow(non_camel_case_types)]
#![allow(unused_imports)]
use serde::{Deserialize, Serialize};
#[doc = "Input values."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct CheckNameAvailabilityParameters {
#[doc = "The name of the service instance to check."]
pub name: String,
#[doc = "The fully qualified resource type which includes provider namespace."]
#[serde(rename = "type")]
pub type_: String,
}
impl CheckNameAvailabilityParameters {
pub fn new(name: String, type_: String) -> Self {
Self { name, type_ }
}
}
#[doc = "Error details."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ErrorDetails {
#[doc = "Error details."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub error: Option<ErrorDetailsInternal>,
}
impl ErrorDetails {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "Error details."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ErrorDetailsInternal {
#[doc = "The error code."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub code: Option<String>,
#[doc = "The error message."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub message: Option<String>,
#[doc = "The target of the particular error."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub target: Option<String>,
}
impl ErrorDetailsInternal {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "Service REST API operation."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct Operation {
#[doc = "Operation name: {provider}/{resource}/{read | write | action | delete}"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub name: Option<String>,
#[doc = "Gets or sets a value indicating whether the operation is a data action or not"]
#[serde(rename = "isDataAction", default, skip_serializing_if = "Option::is_none")]
pub is_data_action: Option<bool>,
#[doc = "Default value is 'user,system'."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub origin: Option<String>,
#[doc = "The object that represents the operation."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub display: Option<OperationDisplay>,
#[doc = "Operation properties."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub properties: Option<OperationProperties>,
}
impl Operation {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "The object that represents the operation."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct OperationDisplay {
#[doc = "Service provider: Microsoft.HealthcareApis"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub provider: Option<String>,
#[doc = "Resource Type: Services"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub resource: Option<String>,
#[doc = "Name of the operation"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub operation: Option<String>,
#[doc = "Friendly description for the operation,"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
}
impl OperationDisplay {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "A list of service operations. It contains a list of operations and a URL link to get the next set of results."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct OperationListResult {
#[doc = "The link used to get the next page of service description objects."]
#[serde(rename = "nextLink", default, skip_serializing_if = "Option::is_none")]
pub next_link: Option<String>,
#[doc = "A list of service operations supported by the Microsoft.HealthcareApis resource provider."]
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub value: Vec<Operation>,
}
impl OperationListResult {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "Operation properties."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct OperationProperties {}
impl OperationProperties {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "The properties indicating the operation result of an operation on a service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct OperationResultsDescription {
#[doc = "The ID of the operation returned."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub id: Option<String>,
#[doc = "The name of the operation result."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub name: Option<String>,
#[doc = "The status of the operation being performed."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub status: Option<operation_results_description::Status>,
#[doc = "The time that the operation was started."]
#[serde(rename = "startTime", default, skip_serializing_if = "Option::is_none")]
pub start_time: Option<String>,
#[doc = "Additional properties of the operation result."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub properties: Option<serde_json::Value>,
}
impl OperationResultsDescription {
pub fn new() -> Self {
Self::default()
}
}
pub mod operation_results_description {
use super::*;
#[doc = "The status of the operation being performed."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum Status {
Canceled,
Succeeded,
Failed,
Requested,
Running,
}
}
#[doc = "The common properties of a service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct Resource {
#[doc = "The resource identifier."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub id: Option<String>,
#[doc = "The resource name."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub name: Option<String>,
#[doc = "The resource type."]
#[serde(rename = "type", default, skip_serializing_if = "Option::is_none")]
pub type_: Option<String>,
#[doc = "The kind of the service."]
pub kind: resource::Kind,
#[doc = "The resource location."]
pub location: String,
#[doc = "The resource tags."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub tags: Option<serde_json::Value>,
#[doc = "An etag associated with the resource, used for optimistic concurrency when editing it."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub etag: Option<String>,
#[doc = "Setting indicating whether the service has a managed identity associated with it."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub identity: Option<resource::Identity>,
}
impl Resource {
pub fn new(kind: resource::Kind, location: String) -> Self {
Self {
id: None,
name: None,
type_: None,
kind,
location,
tags: None,
etag: None,
identity: None,
}
}
}
pub mod resource {
use super::*;
#[doc = "The kind of the service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum Kind {
#[serde(rename = "fhir")]
Fhir,
#[serde(rename = "fhir-Stu3")]
FhirStu3,
#[serde(rename = "fhir-R4")]
FhirR4,
}
#[doc = "Setting indicating whether the service has a managed identity associated with it."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct Identity {
#[doc = "The principal ID of the resource identity."]
#[serde(rename = "principalId", default, skip_serializing_if = "Option::is_none")]
pub principal_id: Option<String>,
#[doc = "The tenant ID of the resource."]
#[serde(rename = "tenantId", default, skip_serializing_if = "Option::is_none")]
pub tenant_id: Option<String>,
#[doc = "Type of identity being specified, currently SystemAssigned and None are allowed."]
#[serde(rename = "type", default, skip_serializing_if = "Option::is_none")]
pub type_: Option<identity::Type>,
}
impl Identity {
pub fn new() -> Self {
Self::default()
}
}
pub mod identity {
use super::*;
#[doc = "Type of identity being specified, currently SystemAssigned and None are allowed."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum Type {
SystemAssigned,
None,
}
}
}
pub type ServiceAccessPoliciesInfo = Vec<ServiceAccessPolicyEntry>;
#[doc = "An access policy entry."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct ServiceAccessPolicyEntry {
#[doc = "An Azure AD object ID (User or Apps) that is allowed access to the FHIR service."]
#[serde(rename = "objectId")]
pub object_id: String,
}
impl ServiceAccessPolicyEntry {
pub fn new(object_id: String) -> Self {
Self { object_id }
}
}
#[doc = "Authentication configuration information"]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServiceAuthenticationConfigurationInfo {
#[doc = "The authority url for the service"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub authority: Option<String>,
#[doc = "The audience url for the service"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub audience: Option<String>,
#[doc = "If the SMART on FHIR proxy is enabled"]
#[serde(rename = "smartProxyEnabled", default, skip_serializing_if = "Option::is_none")]
pub smart_proxy_enabled: Option<bool>,
}
impl ServiceAuthenticationConfigurationInfo {
pub fn new() -> Self {
Self::default()
}
}
pub type ServiceCorsConfigurationHeaderEntry = String;
#[doc = "The settings for the CORS configuration of the service instance."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServiceCorsConfigurationInfo {
#[doc = "The origins to be allowed via CORS."]
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub origins: Vec<ServiceCorsConfigurationOriginEntry>,
#[doc = "The headers to be allowed via CORS."]
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub headers: Vec<ServiceCorsConfigurationHeaderEntry>,
#[doc = "The methods to be allowed via CORS."]
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub methods: Vec<ServiceCorsConfigurationMethodEntry>,
#[doc = "The max age to be allowed via CORS."]
#[serde(rename = "maxAge", default, skip_serializing_if = "Option::is_none")]
pub max_age: Option<i64>,
#[doc = "If credentials are allowed via CORS."]
#[serde(rename = "allowCredentials", default, skip_serializing_if = "Option::is_none")]
pub allow_credentials: Option<bool>,
}
impl ServiceCorsConfigurationInfo {
pub fn new() -> Self {
Self::default()
}
}
pub type ServiceCorsConfigurationMethodEntry = String;
pub type ServiceCorsConfigurationOriginEntry = String;
#[doc = "The settings for the Cosmos DB database backing the service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServiceCosmosDbConfigurationInfo {
#[doc = "The provisioned throughput for the backing database."]
#[serde(rename = "offerThroughput", default, skip_serializing_if = "Option::is_none")]
pub offer_throughput: Option<i64>,
}
impl ServiceCosmosDbConfigurationInfo {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "Export operation configuration information"]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServiceExportConfigurationInfo {
#[doc = "The name of the default export storage account."]
#[serde(rename = "storageAccountName", default, skip_serializing_if = "Option::is_none")]
pub storage_account_name: Option<String>,
}
impl ServiceExportConfigurationInfo {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "The description of the service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct ServicesDescription {
#[serde(flatten)]
pub resource: Resource,
#[doc = "The properties of a service instance."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub properties: Option<ServicesProperties>,
}
impl ServicesDescription {
pub fn new(resource: Resource) -> Self {
Self {
resource,
properties: None,
}
}
}
#[doc = "A list of service description objects with a next link."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServicesDescriptionListResult {
#[doc = "The link used to get the next page of service description objects."]
#[serde(rename = "nextLink", default, skip_serializing_if = "Option::is_none")]
pub next_link: Option<String>,
#[doc = "A list of service description objects."]
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub value: Vec<ServicesDescription>,
}
impl ServicesDescriptionListResult {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "The properties indicating whether a given service name is available."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServicesNameAvailabilityInfo {
#[doc = "The value which indicates whether the provided name is available."]
#[serde(rename = "nameAvailable", default, skip_serializing_if = "Option::is_none")]
pub name_available: Option<bool>,
#[doc = "The reason for unavailability."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub reason: Option<services_name_availability_info::Reason>,
#[doc = "The detailed reason message."]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub message: Option<String>,
}
impl ServicesNameAvailabilityInfo {
pub fn new() -> Self {
Self::default()
}
}
pub mod services_name_availability_info {
use super::*;
#[doc = "The reason for unavailability."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum Reason {
Invalid,
AlreadyExists,
}
}
#[doc = "The description of the service."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServicesPatchDescription {
#[doc = "Instance tags"]
#[serde(default, skip_serializing_if = "Option::is_none")]
pub tags: Option<serde_json::Value>,
}
impl ServicesPatchDescription {
pub fn new() -> Self {
Self::default()
}
}
#[doc = "The properties of a service instance."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, Default)]
pub struct ServicesProperties {
#[doc = "The provisioning state."]
#[serde(rename = "provisioningState", default, skip_serializing_if = "Option::is_none")]
pub provisioning_state: Option<services_properties::ProvisioningState>,
#[doc = "The access policies of the service instance."]
#[serde(rename = "accessPolicies", default, skip_serializing_if = "Option::is_none")]
pub access_policies: Option<ServiceAccessPoliciesInfo>,
#[doc = "The settings for the Cosmos DB database backing the service."]
#[serde(rename = "cosmosDbConfiguration", default, skip_serializing_if = "Option::is_none")]
pub cosmos_db_configuration: Option<ServiceCosmosDbConfigurationInfo>,
#[doc = "Authentication configuration information"]
#[serde(rename = "authenticationConfiguration", default, skip_serializing_if = "Option::is_none")]
pub authentication_configuration: Option<ServiceAuthenticationConfigurationInfo>,
#[doc = "The settings for the CORS configuration of the service instance."]
#[serde(rename = "corsConfiguration", default, skip_serializing_if = "Option::is_none")]
pub cors_configuration: Option<ServiceCorsConfigurationInfo>,
#[doc = "Export operation configuration information"]
#[serde(rename = "exportConfiguration", default, skip_serializing_if = "Option::is_none")]
pub export_configuration: Option<ServiceExportConfigurationInfo>,
}
impl ServicesProperties {
pub fn new() -> Self {
Self::default()
}
}
pub mod services_properties {
use super::*;
#[doc = "The provisioning state."]
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum ProvisioningState {
Deleting,
Succeeded,
Creating,
Accepted,
Verifying,
Updating,
Failed,
Canceled,
Deprovisioned,
}
}
|
The application of outcrop-based research boreholes for reservoir modelling: potential, challenges and pitfalls
Abstract Research boreholes drilled in outcrop studies of reservoir analogues can have two distinctly different objectives. One is to assess how the sequences observed on outcrops are represented in cores and wireline logs, an exercise that can form an important link between data retrieved from actual reservoirs and observations made on outcrops. The second objective is to obtain information on the lateral and vertical extensions of the exposed rock types and possible changes in lithofacies beyond the limits of exposure. This can provide improved insights into the three-dimensional (3D) architecture of the system and the 3D distribution of reservoir rock types. Additional information gained from research boreholes concerns lithofacies types prone to weathering such as shales; cores are generally less weathered than outcrops and therefore provide a fresher and often more representative appearance. Electrical borehole images are found to be particularly useful as they have a large dynamic range that makes subtle sedimentary structures more visible, and their 3D nature allows accurate determination of directional structures such as cross-bedding. The costs and logistics of drilling research boreholes are important issues to consider before embarking on such projects. |
/**
* Throws an error if `condition` is falsy, and acts as a type guard.
*
 * @param condition The value to test for truthiness
 * @param message The message to set in the error that is thrown
*
* ```ts
* class ProblemLogger {
* problem?: string;
*
* log() {
* assert(this.problem, "You must set problem before logging it");
*
* // now typescript knows `this.problem` is truthy, so we can safely call methods on it
 *     console.log(this.problem.toUpperCase());
* }
* }
* ```
*/
export function assert(condition: any, message?: string): asserts condition {
if (!condition) {
throw new Error(message);
}
}
|
Plukenetia conophora seed oil ameliorates streptozotocin-induced hyperglycaemia and oxidative stress in rats
Abstract Purpose Plukenetia conophora (African walnut) is an edible seed, widely cultivated for its ethnomedicinal and nutritional purposes. Consumption of African walnuts has been linked with blood sugar lowering effect. Objective The effects of P. conophora seed oil treatment on hyperglycaemia and oxidative stress were investigated in plasma, liver and kidney of streptozotocin (STZ)-induced diabetic rats. Materials and methods Plukenetia conophora seed oil (PCO) was obtained by extraction of pulverized dried seed in n-hexane. Diabetes was induced by STZ injection (65 mg/kg, i.p). Rats were assigned into non-diabetic control (NC) and diabetic control (DC; treated with vehicle), PCO (200 mg/kg) and pioglitazone (10 mg/kg). Fasting blood sugar (FBS) was taken from overnight fasted animals on day 7 and 14, respectively. Plasma, liver and kidney samples were obtained on day 14 for the determination of oxidative stress parameters malondialdehyde (MDA), reduced glutathione (GSH), catalase and superoxide dismutase (SOD). Results PCO treatment significantly (p < 0.05) reduced STZ-induced hyperglycaemia by lowering the elevated FBS. PCO significantly reduced MDA level and attenuated STZ-induced depletion of GSH, catalase and SOD in the diabetic rats’ plasma, liver and kidneys. Conclusions These results suggest that consumption of Plukenetia conophora seed might offer protection against diabetes-induced hepatic and renal damage. |
/**
* Method used to show the response from the Open Tech Calender API
*/
public void populateAPIResponse(String response){
ArrayList<TechMeetup> meetupArray = new ArrayList<TechMeetup>();
try{
JSONObject jsonObject = new JSONObject(response);
JSONArray jsonArray = jsonObject.getJSONArray("data");
int size = jsonArray.length();
for (int i = 0; i < size; i++) {
JSONObject meetupObject = jsonArray.getJSONObject(i);
Boolean meetupHasURL = meetupObject.has("url");
Boolean meetupHasAreas = meetupObject.has("areas");
if (meetupHasURL && meetupHasAreas) {
String summary = meetupObject.getString("summary");
String description = meetupObject.getString("description");
String url = meetupObject.getString("url");
String areasJSONString = meetupObject.getString("areas");
JSONArray areasJSONArray = new JSONArray(areasJSONString);
String city = "";
JSONObject areasJSONObject = areasJSONArray.getJSONObject(0);
city = areasJSONObject.getString("title");
String date = meetupObject.getString("start");
JSONObject dateJSONObject = new JSONObject(date);
String startDate = dateJSONObject.getString("displaylocal");
TechMeetup techMeetup = new TechMeetup(summary, city, description, startDate, url);
meetupArray.add(techMeetup);
}
}
}
catch(JSONException e){
e.printStackTrace();
}
for(TechMeetup tm : meetupArray){
String summary = tm.getSummary();
String city = tm.getCity();
String description = tm.getDescription();
String date = tm.getDate();
String url = tm.getUrl();
AddTechMeetupView(summary, city, description, date, url);
}
} |
/**
* For testing purposes.
*
* @author Lorenzo Bettini
*
*/
static class ServerSocketFactory {
public ServerSocket create(int port) throws IOException {
return new ServerSocket(port);
}
} |
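A factory like this exists so that tests can substitute an implementation that never binds a real port. A minimal sketch of that pattern (the `FactoryDemo` and `UnboundServerSocketFactory` names are illustrative, not from the source):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FactoryDemo {
    // Production factory, mirroring the class above.
    static class ServerSocketFactory {
        public ServerSocket create(int port) throws IOException {
            return new ServerSocket(port);
        }
    }

    // Test double: returns an unbound socket, so no port is ever opened.
    static class UnboundServerSocketFactory extends ServerSocketFactory {
        @Override
        public ServerSocket create(int port) throws IOException {
            return new ServerSocket(); // unbound constructor
        }
    }

    public static void main(String[] args) throws IOException {
        ServerSocket socket = new UnboundServerSocketFactory().create(12345);
        System.out.println(socket.isBound()); // false: nothing was opened
        socket.close();
    }
}
```

Code under test that accepts the factory can then run without touching the network.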
/**
* Sets the information for the polygon into the Node
* @param node Node to update
* @param polygon to add to Node
*/
public void linkPolyToNode(Node node, Polygon polygon) {
var color = node.getColor();
node.setPolygon(polygon);
node.setColor(color);
node.getPolygon().setSmooth(true);
node.getPolygon().setScaleX(polygonScale);
node.getPolygon().setScaleY(polygonScale);
} |
/**
* A class for computing Fibonacci numbers using the provided cache to reuse previously computed
* values.
*/
class FibonacciTable {
private final Map<Integer, Integer> cache;
/** Constructs a new object with a default cache implementation. */
FibonacciTable() {
this(new HashMap<>());
}
/**
* Constructs a new object using the provided cache implementation.
*
* @param cache the cache to use for storing computed values
*/
FibonacciTable(Map<Integer, Integer> cache) {
this.cache = cache;
}
/**
* Compute a Fibonacci number.
*
* @param i the index in the Fibonacci sequence
* @return the Fibonacci number for this index
*/
int fib(int i) {
// use the provided cache to reuse computed values
// cache.containsKey(4) will return true if there is a value stored for the index 4
// cache.get(4) will return the stored value for 4
// cache.put(4,3) will store the value 3 for the index 4 in the cache
if (cache.containsKey(i)) {
return cache.get(i);
} else if (cache.containsKey(i-1) && cache.containsKey(i-2)) {
cache.put(i, (cache.get(i-1))+cache.get(i-2));
return cache.get(i);
} else if (i<=2) {
cache.put(i,1);
return 1;
} else {
cache.put(i,fib((i-1)) + fib((i-2)));
return cache.get(i);
}
}
} |
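The three branches in `fib` above all reduce to the same memoization pattern: return on a cache hit, otherwise compute, store, and return. A self-contained sketch of that collapsed form (class and method names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class FibDemo {
    // Cache-on-miss memoization: each index is computed at most once.
    static int fib(Map<Integer, Integer> cache, int i) {
        if (cache.containsKey(i)) {
            return cache.get(i);
        }
        int value = (i <= 2) ? 1 : fib(cache, i - 1) + fib(cache, i - 2);
        cache.put(i, value);
        return value;
    }

    public static void main(String[] args) {
        Map<Integer, Integer> cache = new HashMap<>();
        System.out.println(fib(cache, 10)); // 55
        // Every intermediate value is now cached, so a second call
        // for any index <= 10 is a single map lookup.
        System.out.println(cache.containsKey(7)); // true
    }
}
```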
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Scanner;
import java.util.Set;
public class Main {
public static void main(String[] args) {
//knapSack();
longestPath();
}
private static void longestPath() {
Scanner sc = new Scanner(System.in);
int nodes = sc.nextInt();
int M = sc.nextInt();
Map<Integer,ArrayList<Integer>> adj = new HashMap<>();
for(int i =0;i<M;i++) {
int x = sc.nextInt();
int y = sc.nextInt();
ArrayList<Integer> cur= adj.getOrDefault(x, new ArrayList<Integer>());
cur.add(y);
adj.put(x,cur);
}
Set<Integer> visited = new HashSet<>();
int dp[] = new int[nodes+1];
Arrays.fill(dp, -1);
int ans = 0;
for(Integer k : adj.keySet()) {
if(dp[k]==-1) {
ans = Math.max(ans, dfs(k, dp,adj));
}
}
//System.out.println(Arrays.toString(dp));
System.out.println(ans);
}
private static int dfs(Integer k, int[] dp, Map<Integer, ArrayList<Integer>> adj) {
//System.out.println("k "+ k);
if(dp[k]!=-1) {
//System.out.println("quick return k "+ k + " dp[k] "+ dp[k]);
return dp[k];
}
int ans = 0;
if(adj.get(k)==null) {
dp[k]=0;
return 0;
}
for(int i: adj.get(k)) {
//System.out.println(" k "+k+" i "+ i + " dp[i] "+ dp[i]);
if(dp[i] != -1) {
ans = Math.max(ans, dp[i]+1);
}else {
ans = Math.max(ans, dfs(i, dp,adj)+1);
}
}
dp[k]= ans;
//System.out.println("result updated dp["+ k+"] "+ dp[k]);
return ans;
}
public static void knapSack() {
Scanner sc = new Scanner(System.in);
int items = sc.nextInt();
int weight = sc.nextInt();
int[] values = new int[items];
int[] weights = new int[items];
for(int i =0;i<items;i++) {
weights[i] = sc.nextInt();
values[i] = sc.nextInt();
}
long[][]dp = new long[items+1][weight+1];
knapSackHelper(dp, items,weight,weights, values);
System.out.println(dp[items][weight]);
}
private static long knapSackHelper(long[][] dp, int items, int WT, int[] weights, int[] values) {
		//System.out.println("items "+ items +" WT "+ WT);
if(items <=0) {
return 0;
}
if(dp[items][WT]!=0) {
return dp[items][WT];
}
if(WT>=weights[items-1]) {
dp[items][WT] = knapSackHelper(dp, items-1,WT-weights[items-1],weights,values)+values[items-1];
}
dp[items][WT] = Math.max(dp[items][WT], knapSackHelper(dp, items-1, WT, weights, values));
//System.out.println("dp ["+ items+"]["+WT+"] "+dp[items][WT]);
return dp[items][WT];
}
}
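The memoized DFS in `longestPath` can be exercised without reading from stdin. A compact, self-contained variant using a `Map` cache instead of the `dp` array (all names illustrative):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LongestPathDemo {
    // dp.get(k) caches the longest path (counted in edges) starting at node k.
    static int dfs(int k, Map<Integer, Integer> dp, Map<Integer, List<Integer>> adj) {
        if (dp.containsKey(k)) {
            return dp.get(k);
        }
        int best = 0;
        for (int next : adj.getOrDefault(k, List.of())) {
            best = Math.max(best, dfs(next, dp, adj) + 1);
        }
        dp.put(k, best);
        return best;
    }

    public static void main(String[] args) {
        // DAG: 1 -> 2 -> 3 -> 4, plus a shortcut edge 1 -> 3.
        Map<Integer, List<Integer>> adj = new HashMap<>();
        adj.put(1, List.of(2, 3));
        adj.put(2, List.of(3));
        adj.put(3, List.of(4));
        int longest = 0;
        Map<Integer, Integer> dp = new HashMap<>();
        for (int node : adj.keySet()) {
            longest = Math.max(longest, dfs(node, dp, adj));
        }
        System.out.println(longest); // 3 (the path 1 -> 2 -> 3 -> 4)
    }
}
```

Because results are cached per node, each node is expanded once, giving O(V + E) overall — the same property the `dp[]` array provides above.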
|
What Happens to the Fish’s Achievement in a Little Pond? A Simultaneous Analysis of Class-Average Achievement Effects on Achievement and Academic Self-Concept
Empirical studies have demonstrated that students who are taught in a group of students with higher average achievement benefit in terms of their achievement. However, there is also evidence showing that being surrounded by high-achieving students has a negative effect on students’ academic self-concept, also known as the big-fish–little-pond effect. In view of the reciprocal relationship between achievement and academic self-concept, the present study aims to scrutinize how the average achievement of a class affects students’ achievement and academic self-concept, and how that, in turn, affects subsequent achievement and academic self-concept. Using a sample of 6,463 seventh-graders from 285 classes in Germany, multilevel path models showed that the class-average achievement at the beginning of the school year positively affected individual achievement in the middle and at the end of the school year, and negative effects on academic self-concept occurred only at the beginning of Grade 7, but not later in the school year. In addition, mediation analyses revealed that the effects of class-average achievement on students’ achievement and academic self-concept at the end of the school year were mediated by midterm achievement, but not by midterm academic self-concept. This pattern was found for mathematics, biology, physics, and English as a foreign language. The results of our study indicate that the consequences for students of belonging to a group of high-achieving students should be analyzed with respect to both academic self-concept and achievement. |
import java.io.BufferedReader;
import java.io.InputStreamReader;
class Main {
public static void main(String[] args) throws Exception{
BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
String line = br.readLine();
int n = Integer.parseInt(line);
int money = 100000;
for(int i=0;i<n;i++){
money *= 1.05;
            // Round up to the next multiple of 1000.
            money = money % 1000 == 0 ? money : ((money / 1000) + 1) * 1000;
}
System.out.println(money);
}
} |
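The round-up step is plain integer arithmetic: adding 999 before the integer division rounds any non-multiple up to the next 1000. A self-contained sketch of the same computation (names illustrative; the interest is applied with exact integer math instead of a `double` multiply):

```java
public class InterestDemo {
    // Round up to the next multiple of 1000 using integer arithmetic only.
    static int roundUpToThousand(int money) {
        return ((money + 999) / 1000) * 1000;
    }

    public static void main(String[] args) {
        int money = 100000;
        for (int year = 0; year < 5; year++) {
            money = roundUpToThousand(money * 105 / 100); // 5% interest per year
        }
        System.out.println(money); // 130000 after 5 years
    }
}
```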
Enhancing directed content sharing on the web
To find interesting, personally relevant web content, people rely on friends and colleagues to pass links along as they encounter them. In this paper, we study and augment link-sharing via e-mail, the most popular means of sharing web content today. Armed with survey data indicating that active sharers of novel web content are often those that actively seek it out, we developed FeedMe, a plug-in for Google Reader that makes directed sharing of content a more salient part of the user experience. FeedMe recommends friends who may be interested in seeing content that the user is viewing, provides information on what the recipient has seen and how many emails they have received recently, and gives recipients the opportunity to provide lightweight feedback when they appreciate shared content. FeedMe introduces a novel design space within mixed-initiative social recommenders: friends who know the user voluntarily vet the material on the user's behalf. We performed a two-week field experiment (N=60) and found that FeedMe made it easier and more enjoyable to share content that recipients appreciated and would not have found otherwise. |
def add_global_vars_metric(self):
    total_metric_count = 0
    for function in self.functions:
        if len(self.global_vars_dict) > 0:
            self.functions[function].global_vars_metric = float(self.functions[function].global_vars_access) / len(self.global_vars_dict)
            total_metric_count += self.functions[function].global_vars_metric
    return total_metric_count
/*
* SPDX-License-Identifier: Apache-2.0
*/
//===-----------LowerKrnlRegion.cpp ---------------------------------===//
//
// Copyright 2019-2022 The IBM Research Authors.
//
// =============================================================================
//
// This pass enables the lowering of the krnl.region operation
//
//===----------------------------------------------------------------------===//
#include "mlir/Dialect/Affine/IR/AffineOps.h"
#include "mlir/Dialect/Func/IR/FuncOps.h"
#include "mlir/Pass/Pass.h"
#include "mlir/Transforms/GreedyPatternRewriteDriver.h"
#include "src/Dialect/Krnl/KrnlOps.hpp"
#include "src/Pass/Passes.hpp"
#include "src/Support/KrnlSupport.hpp"
using namespace mlir;
using namespace onnx_mlir::krnl;
namespace {
/*!
Move the ops in KrnlRegionOp out of its region and then erase KrnlRegionOp
*/
class LowerKrnlRegion : public OpRewritePattern<KrnlRegionOp> {
public:
using OpRewritePattern<KrnlRegionOp>::OpRewritePattern;
LogicalResult matchAndRewrite(
KrnlRegionOp krnlRegionOp, PatternRewriter &rewriter) const override {
// Special traversal is used because the op being traversed is moved.
    Block &regionBlock = krnlRegionOp.getBodyRegion().front();
for (Operation &op : llvm::make_early_inc_range(regionBlock)) {
op.moveBefore(krnlRegionOp);
}
rewriter.eraseOp(krnlRegionOp);
return success();
}
};
/*!
* Function pass that lowers KrnlRegionOp
*/
class LowerKrnlRegionPass
: public PassWrapper<LowerKrnlRegionPass, OperationPass<func::FuncOp>> {
public:
MLIR_DEFINE_EXPLICIT_INTERNAL_INLINE_TYPE_ID(LowerKrnlRegionPass)
StringRef getArgument() const override { return "lower-krnl-region"; }
StringRef getDescription() const override {
return "Move ops in krnl.region operation out and erase this op";
}
void runOnOperation() override {
auto function = getOperation();
ConversionTarget target(getContext());
RewritePatternSet patterns(&getContext());
patterns.insert<LowerKrnlRegion>(&getContext());
if (failed(applyPatternsAndFoldGreedily(function, std::move(patterns))))
signalPassFailure();
}
};
} // namespace
namespace onnx_mlir {
namespace krnl {
std::unique_ptr<Pass> createLowerKrnlRegionPass() {
return std::make_unique<LowerKrnlRegionPass>();
}
} // namespace krnl
} // namespace onnx_mlir
|
/**
* Created by usarfraz on 2/12/17.
*/
@Database(name = TodoDatabase.NAME, version = TodoDatabase.VERSION)
public class TodoDatabase {
public static final String NAME = "TodoDatabase";
public static final int VERSION = 1;
} |
The Islamic State of Iraq and the Levant (ISIL) has lost about 14 percent of its territory in 2015, while Syria's Kurds have almost tripled theirs, think-tank IHS Jane's has said.
The development will be seen as a blow to the group given its stated aim to capture and hold territory to expand its so-called "caliphate", where it imposes its version of Islamic law.
The losses of ISIL include the strategically important town of Tal Abyad on Syria's border with Turkey, the Iraqi city of Tikrit, and Iraq's Baiji refinery.
Other big losses for the group include a stretch of highway between its Syrian stronghold Raqqa and Mosul in northern Iraq, complicating supply lines.
IHS Jane's said the group's territory had shrunk 12,800 square kilometres to 78,000 square kilometres between the start of the year and December 14.
"We had already seen a negative financial impact on the Islamic State [ISIL] due to the loss of control of the Tal Abyad border crossing prior to the recent intensification of air strikes against the group's oil production capacity," said Columb Strack, senior Middle East analyst with the US think-tank.
However, ISIL has made some high-profile gains during the year, including the historic Syrian town of Palmyra and the city centre of Ramadi, the provincial capital of Anbar, Iraq's largest province.
IHS Jane's said those victories came at the expense of the group's northern territories, which have been fiercely contested by Kurdish fighters.
Land under Syrian Kurdish control jumped 186 percent over the year, the intelligence review said.
"This indicates that the Islamic State [ISIL] was overstretched, and also that holding Kurdish territory is considered to be of lesser importance than expelling the Syrian and Iraqi governments from traditionally Sunni lands," Strack said.
"The Kurds appear to be primarily an obstruction to the Islamic State, rather than an objective in themselves."
Syrian Kurdish fighters dominate a group called the Syrian Democratic Forces, a coalition of Kurdish and Arab fighters battling ISIL in northeastern Syria, that has grown in prominence in recent months.
ISIL has also been targeted by US-led coalition air strikes, Iraqi forces and Syrian rebels.
Iraq's government managed to claw back some six percent of its territory from ISIL in the past year, IHS Jane's said, while Iraqi Kurds regained two percent of their lands.
The biggest territorial loser among the main factions in the Syrian conflict was the Syrian government, which lost 16 percent and is now left with around 30,000 square kilometres, according to the think-tank, less than half the area controlled by ISIL and a fraction of Syria's total area of about 185,000 square kilometres. |
package com.example.demo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Random;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import twitter4j.Query;
import twitter4j.QueryResult;
import twitter4j.Trend;
import twitter4j.Trends;
import twitter4j.Twitter;
import twitter4j.TwitterException;
import twitter4j.TwitterFactory;
import twitter4j.User;
@Service
public class MonologueService {
private static final Twitter twitter = TwitterFactory.getSingleton();
// @Autowired
// Twitter twitter;
public String monologue () {
String tweetStr = "";
try {
tweetStr = tweet(searchTweet(getTrendInJapan()));
// searchTweet(getTrendInJapan());
} catch (TwitterException e) {
e.printStackTrace();
}
return tweetStr;
}
private String searchTweet(String searchStr) throws TwitterException {
        System.out.println("[Trend word] " + searchStr);
        // Build the query (links and retweets are excluded as an anti-spam measure)
Query query = new Query(searchStr + " -filter:links exclude:retweets");
        // Search (fetches 15 tweets)
QueryResult result = twitter.search(query);
        // Exclude replies, hashtags, and tweets of 21 characters or more
String n = System.getProperty("line.separator");
Pattern replyPattern = Pattern.compile("(@|#)");
List<String> filterdTweetList =
result.getTweets().stream()
.filter(status -> !replyPattern.matcher(status.getText()).find() && status.getText().length() <= 20)
.map(status -> status.getText().replaceAll("([。…!!??ww]+)", "わん$1").replaceAll("([^。…!!??ww]$)", "$1わん"))
.collect(Collectors.toList());
// .forEach(text -> System.out.println(text + n));
        // If no tweets survived the filter, fall back to a generic tweet
if (filterdTweetList.size() == 0) {
filterdTweetList.add(searchStr + "が気になるわん");
}
return filterdTweetList.get(new Random().nextInt(filterdTweetList.size()));
}
private String getTrendInJapan() throws TwitterException {
List<Trend> filterdTrendList = new ArrayList<>();
        // Fetch 50 trends using the WOEID for Japan
Trends japanTrends = twitter.getPlaceTrends(Integer.parseInt("23424856"));
// ハッシュタグを除外
Pattern hashTagPattern = Pattern.compile("^#");
filterdTrendList = Arrays.stream(japanTrends.getTrends())
.filter(trend -> !hashTagPattern.matcher(trend.getName()).find())
.collect(Collectors.toList());
return filterdTrendList.get(new Random().nextInt(filterdTrendList.size())).getName();
}
public String tweet(String tweetStr) throws TwitterException {
twitter.updateStatus(tweetStr);
return tweetStr;
}
private void logUserInfo(Twitter twitter) throws TwitterException {
User user = twitter.verifyCredentials();
System.out.println(user.getName());
System.out.println(user.getScreenName());
System.out.println(user.getFriendsCount());
System.out.println(user.getFollowersCount());
}
}
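The stream pipeline in `searchTweet` — a regex-based `filter` followed by a transform — works the same on plain strings, independently of twitter4j. A sketch of just that filtering step (class name and sample data are made up):

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class FilterDemo {
    // Drop entries containing @ or #, and keep only those of at most
    // 20 characters, mirroring the reply/hashtag/length filter above.
    static List<String> filter(List<String> tweets) {
        Pattern replyPattern = Pattern.compile("(@|#)");
        return tweets.stream()
                .filter(t -> !replyPattern.matcher(t).find() && t.length() <= 20)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> in = List.of("short and clean", "@reply skipped",
                "#tag skipped", "this one is definitely far too long to pass");
        System.out.println(filter(in)); // [short and clean]
    }
}
```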
|
/*
Auvitek AU8522 QAM/8VSB demodulator driver
Copyright (C) 2008 <NAME> <<EMAIL>>
Copyright (C) 2008 <NAME> <<EMAIL>>
Copyright (C) 2005-2008 Auvitek International, Ltd.
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
*/
#include <linux/kernel.h>
#include <linux/init.h>
#include <linux/module.h>
#include <linux/string.h>
#include <linux/slab.h>
#include <linux/delay.h>
#include <linux/videodev2.h>
#include <media/v4l2-device.h>
#include <media/v4l2-ctrls.h>
#include <media/v4l2-mc.h>
#include <linux/i2c.h>
#include <media/dvb_frontend.h>
#include "au8522.h"
#include "tuner-i2c.h"
#define AU8522_ANALOG_MODE 0
#define AU8522_DIGITAL_MODE 1
#define AU8522_SUSPEND_MODE 2
enum au8522_pads {
AU8522_PAD_IF_INPUT,
AU8522_PAD_VID_OUT,
AU8522_PAD_AUDIO_OUT,
AU8522_NUM_PADS
};
struct au8522_state {
struct i2c_client *c;
struct i2c_adapter *i2c;
u8 operational_mode;
/* Used for sharing of the state between analog and digital mode */
struct tuner_i2c_props i2c_props;
struct list_head hybrid_tuner_instance_list;
/* configuration settings */
struct au8522_config config;
struct dvb_frontend frontend;
u32 current_frequency;
enum fe_modulation current_modulation;
u32 fe_status;
unsigned int led_state;
/* Analog settings */
struct v4l2_subdev sd;
v4l2_std_id std;
int vid_input;
int aud_input;
u32 id;
u32 rev;
struct v4l2_ctrl_handler hdl;
#ifdef CONFIG_MEDIA_CONTROLLER
struct media_pad pads[AU8522_NUM_PADS];
#endif
};
/* These are routines shared by both the VSB/QAM demodulator and the analog
decoder */
int au8522_writereg(struct au8522_state *state, u16 reg, u8 data);
u8 au8522_readreg(struct au8522_state *state, u16 reg);
int au8522_init(struct dvb_frontend *fe);
int au8522_sleep(struct dvb_frontend *fe);
int au8522_get_state(struct au8522_state **state, struct i2c_adapter *i2c,
u8 client_address);
void au8522_release_state(struct au8522_state *state);
int au8522_i2c_gate_ctrl(struct dvb_frontend *fe, int enable);
int au8522_analog_i2c_gate_ctrl(struct dvb_frontend *fe, int enable);
int au8522_led_ctrl(struct au8522_state *state, int led);
/* REGISTERS */
#define AU8522_INPUT_CONTROL_REG081H 0x081
#define AU8522_PGA_CONTROL_REG082H 0x082
#define AU8522_CLAMPING_CONTROL_REG083H 0x083
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H 0x0A3
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H 0x0A4
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H 0x0A5
#define AU8522_AGC_CONTROL_RANGE_REG0A6H 0x0A6
#define AU8522_SYSTEM_GAIN_CONTROL_REG0A7H 0x0A7
#define AU8522_TUNER_AGC_RF_STOP_REG0A8H 0x0A8
#define AU8522_TUNER_AGC_RF_START_REG0A9H 0x0A9
#define AU8522_TUNER_RF_AGC_DEFAULT_REG0AAH 0x0AA
#define AU8522_TUNER_AGC_IF_STOP_REG0ABH 0x0AB
#define AU8522_TUNER_AGC_IF_START_REG0ACH 0x0AC
#define AU8522_TUNER_AGC_IF_DEFAULT_REG0ADH 0x0AD
#define AU8522_TUNER_AGC_STEP_REG0AEH 0x0AE
#define AU8522_TUNER_GAIN_STEP_REG0AFH 0x0AF
/* Receiver registers */
#define AU8522_FRMREGTHRD1_REG0B0H 0x0B0
#define AU8522_FRMREGAGC1H_REG0B1H 0x0B1
#define AU8522_FRMREGSHIFT1_REG0B2H 0x0B2
#define AU8522_TOREGAGC1_REG0B3H 0x0B3
#define AU8522_TOREGASHIFT1_REG0B4H 0x0B4
#define AU8522_FRMREGBBH_REG0B5H 0x0B5
#define AU8522_FRMREGBBM_REG0B6H 0x0B6
#define AU8522_FRMREGBBL_REG0B7H 0x0B7
/* 0xB8 TO 0xD7 are the filter coefficients */
#define AU8522_FRMREGTHRD2_REG0D8H 0x0D8
#define AU8522_FRMREGAGC2H_REG0D9H 0x0D9
#define AU8522_TOREGAGC2_REG0DAH 0x0DA
#define AU8522_TOREGSHIFT2_REG0DBH 0x0DB
#define AU8522_FRMREGPILOTH_REG0DCH 0x0DC
#define AU8522_FRMREGPILOTM_REG0DDH 0x0DD
#define AU8522_FRMREGPILOTL_REG0DEH 0x0DE
#define AU8522_TOREGFREQ_REG0DFH 0x0DF
#define AU8522_RX_PGA_RFOUT_REG0EBH 0x0EB
#define AU8522_RX_PGA_IFOUT_REG0ECH 0x0EC
#define AU8522_RX_PGA_PGAOUT_REG0EDH 0x0ED
#define AU8522_CHIP_MODE_REG0FEH 0x0FE
/* I2C bus control registers */
#define AU8522_I2C_CONTROL_REG0_REG090H 0x090
#define AU8522_I2C_CONTROL_REG1_REG091H 0x091
#define AU8522_I2C_STATUS_REG092H 0x092
#define AU8522_I2C_WR_DATA0_REG093H 0x093
#define AU8522_I2C_WR_DATA1_REG094H 0x094
#define AU8522_I2C_WR_DATA2_REG095H 0x095
#define AU8522_I2C_WR_DATA3_REG096H 0x096
#define AU8522_I2C_WR_DATA4_REG097H 0x097
#define AU8522_I2C_WR_DATA5_REG098H 0x098
#define AU8522_I2C_WR_DATA6_REG099H 0x099
#define AU8522_I2C_WR_DATA7_REG09AH 0x09A
#define AU8522_I2C_RD_DATA0_REG09BH 0x09B
#define AU8522_I2C_RD_DATA1_REG09CH 0x09C
#define AU8522_I2C_RD_DATA2_REG09DH 0x09D
#define AU8522_I2C_RD_DATA3_REG09EH 0x09E
#define AU8522_I2C_RD_DATA4_REG09FH 0x09F
#define AU8522_I2C_RD_DATA5_REG0A0H 0x0A0
#define AU8522_I2C_RD_DATA6_REG0A1H 0x0A1
#define AU8522_I2C_RD_DATA7_REG0A2H 0x0A2
#define AU8522_ENA_USB_REG101H 0x101
#define AU8522_I2S_CTRL_0_REG110H 0x110
#define AU8522_I2S_CTRL_1_REG111H 0x111
#define AU8522_I2S_CTRL_2_REG112H 0x112
#define AU8522_FRMREGFFECONTROL_REG121H 0x121
#define AU8522_FRMREGDFECONTROL_REG122H 0x122
#define AU8522_CARRFREQOFFSET0_REG201H 0x201
#define AU8522_CARRFREQOFFSET1_REG202H 0x202
#define AU8522_DECIMATION_GAIN_REG21AH 0x21A
#define AU8522_FRMREGIFSLP_REG21BH 0x21B
#define AU8522_FRMREGTHRDL2_REG21CH 0x21C
#define AU8522_FRMREGSTEP3DB_REG21DH 0x21D
#define AU8522_DAGC_GAIN_ADJUSTMENT_REG21EH 0x21E
#define AU8522_FRMREGPLLMODE_REG21FH 0x21F
#define AU8522_FRMREGCSTHRD_REG220H 0x220
#define AU8522_FRMREGCRLOCKDMAX_REG221H 0x221
#define AU8522_FRMREGCRPERIODMASK_REG222H 0x222
#define AU8522_FRMREGCRLOCK0THH_REG223H 0x223
#define AU8522_FRMREGCRLOCK1THH_REG224H 0x224
#define AU8522_FRMREGCRLOCK0THL_REG225H 0x225
#define AU8522_FRMREGCRLOCK1THL_REG226H 0x226
#define AU_FRMREGPLLACQPHASESCL_REG227H 0x227
#define AU8522_FRMREGFREQFBCTRL_REG228H 0x228
/* Analog TV Decoder */
#define AU8522_TVDEC_STATUS_REG000H 0x000
#define AU8522_TVDEC_INT_STATUS_REG001H 0x001
#define AU8522_TVDEC_MACROVISION_STATUS_REG002H 0x002
#define AU8522_TVDEC_SHARPNESSREG009H 0x009
#define AU8522_TVDEC_BRIGHTNESS_REG00AH 0x00A
#define AU8522_TVDEC_CONTRAST_REG00BH 0x00B
#define AU8522_TVDEC_SATURATION_CB_REG00CH 0x00C
#define AU8522_TVDEC_SATURATION_CR_REG00DH 0x00D
#define AU8522_TVDEC_HUE_H_REG00EH 0x00E
#define AU8522_TVDEC_HUE_L_REG00FH 0x00F
#define AU8522_TVDEC_INT_MASK_REG010H 0x010
#define AU8522_VIDEO_MODE_REG011H 0x011
#define AU8522_TVDEC_PGA_REG012H 0x012
#define AU8522_TVDEC_COMB_MODE_REG015H 0x015
#define AU8522_REG016H 0x016
#define AU8522_TVDED_DBG_MODE_REG060H 0x060
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H 0x061
#define AU8522_TVDEC_FORMAT_CTRL2_REG062H 0x062
#define AU8522_TVDEC_VCR_DET_LLIM_REG063H 0x063
#define AU8522_TVDEC_VCR_DET_HLIM_REG064H 0x064
#define AU8522_TVDEC_COMB_VDIF_THR1_REG065H 0x065
#define AU8522_TVDEC_COMB_VDIF_THR2_REG066H 0x066
#define AU8522_TVDEC_COMB_VDIF_THR3_REG067H 0x067
#define AU8522_TVDEC_COMB_NOTCH_THR_REG068H 0x068
#define AU8522_TVDEC_COMB_HDIF_THR1_REG069H 0x069
#define AU8522_TVDEC_COMB_HDIF_THR2_REG06AH 0x06A
#define AU8522_TVDEC_COMB_HDIF_THR3_REG06BH 0x06B
#define AU8522_TVDEC_COMB_DCDIF_THR1_REG06CH 0x06C
#define AU8522_TVDEC_COMB_DCDIF_THR2_REG06DH 0x06D
#define AU8522_TVDEC_COMB_DCDIF_THR3_REG06EH 0x06E
#define AU8522_TVDEC_UV_SEP_THR_REG06FH 0x06F
#define AU8522_TVDEC_COMB_DC_THR1_NTSC_REG070H 0x070
#define AU8522_TVDEC_COMB_DC_THR2_NTSC_REG073H 0x073
#define AU8522_TVDEC_DCAGC_CTRL_REG077H 0x077
#define AU8522_TVDEC_PIC_START_ADJ_REG078H 0x078
#define AU8522_TVDEC_AGC_HIGH_LIMIT_REG079H 0x079
#define AU8522_TVDEC_MACROVISION_SYNC_THR_REG07AH 0x07A
#define AU8522_TVDEC_INTRP_CTRL_REG07BH 0x07B
#define AU8522_TVDEC_PLL_STATUS_REG07EH 0x07E
#define AU8522_TVDEC_FSC_FREQ_REG07FH 0x07F
#define AU8522_TVDEC_AGC_LOW_LIMIT_REG0E4H 0x0E4
#define AU8522_TOREGAAGC_REG0E5H 0x0E5
#define AU8522_TVDEC_CHROMA_AGC_REG401H 0x401
#define AU8522_TVDEC_CHROMA_SFT_REG402H 0x402
#define AU8522_FILTER_COEF_R410 0x410
#define AU8522_FILTER_COEF_R411 0x411
#define AU8522_FILTER_COEF_R412 0x412
#define AU8522_FILTER_COEF_R413 0x413
#define AU8522_FILTER_COEF_R414 0x414
#define AU8522_FILTER_COEF_R415 0x415
#define AU8522_FILTER_COEF_R416 0x416
#define AU8522_FILTER_COEF_R417 0x417
#define AU8522_FILTER_COEF_R418 0x418
#define AU8522_FILTER_COEF_R419 0x419
#define AU8522_FILTER_COEF_R41A 0x41A
#define AU8522_FILTER_COEF_R41B 0x41B
#define AU8522_FILTER_COEF_R41C 0x41C
#define AU8522_FILTER_COEF_R41D 0x41D
#define AU8522_FILTER_COEF_R41E 0x41E
#define AU8522_FILTER_COEF_R41F 0x41F
#define AU8522_FILTER_COEF_R420 0x420
#define AU8522_FILTER_COEF_R421 0x421
#define AU8522_FILTER_COEF_R422 0x422
#define AU8522_FILTER_COEF_R423 0x423
#define AU8522_FILTER_COEF_R424 0x424
#define AU8522_FILTER_COEF_R425 0x425
#define AU8522_FILTER_COEF_R426 0x426
#define AU8522_FILTER_COEF_R427 0x427
#define AU8522_FILTER_COEF_R428 0x428
#define AU8522_FILTER_COEF_R429 0x429
#define AU8522_FILTER_COEF_R42A 0x42A
#define AU8522_FILTER_COEF_R42B 0x42B
#define AU8522_FILTER_COEF_R42C 0x42C
#define AU8522_FILTER_COEF_R42D 0x42D
/* VBI Control Registers */
#define AU8522_TVDEC_VBI_RX_FIFO_CONTAIN_REG004H 0x004
#define AU8522_TVDEC_VBI_TX_FIFO_CONTAIN_REG005H 0x005
#define AU8522_TVDEC_VBI_RX_FIFO_READ_REG006H 0x006
#define AU8522_TVDEC_VBI_FIFO_STATUS_REG007H 0x007
#define AU8522_TVDEC_VBI_CTRL_H_REG017H 0x017
#define AU8522_TVDEC_VBI_CTRL_L_REG018H 0x018
#define AU8522_TVDEC_VBI_USER_TOTAL_BITS_REG019H 0x019
#define AU8522_TVDEC_VBI_USER_TUNIT_H_REG01AH 0x01A
#define AU8522_TVDEC_VBI_USER_TUNIT_L_REG01BH 0x01B
#define AU8522_TVDEC_VBI_USER_THRESH1_REG01CH 0x01C
#define AU8522_TVDEC_VBI_USER_FRAME_PAT2_REG01EH 0x01E
#define AU8522_TVDEC_VBI_USER_FRAME_PAT1_REG01FH 0x01F
#define AU8522_TVDEC_VBI_USER_FRAME_PAT0_REG020H 0x020
#define AU8522_TVDEC_VBI_USER_FRAME_MASK2_REG021H 0x021
#define AU8522_TVDEC_VBI_USER_FRAME_MASK1_REG022H 0x022
#define AU8522_TVDEC_VBI_USER_FRAME_MASK0_REG023H 0x023
#define AU8522_REG071H 0x071
#define AU8522_REG072H 0x072
#define AU8522_REG074H 0x074
#define AU8522_REG075H 0x075
/* Digital Demodulator Registers */
#define AU8522_FRAME_COUNT0_REG084H 0x084
#define AU8522_RS_STATUS_G0_REG085H 0x085
#define AU8522_RS_STATUS_B0_REG086H 0x086
#define AU8522_RS_STATUS_E_REG087H 0x087
#define AU8522_DEMODULATION_STATUS_REG088H 0x088
#define AU8522_TOREGTRESTATUS_REG0E6H 0x0E6
#define AU8522_TSPORT_CONTROL_REG10BH 0x10B
#define AU8522_TSTHES_REG10CH 0x10C
#define AU8522_FRMREGDFEKEEP_REG301H 0x301
#define AU8522_DFE_AVERAGE_REG302H 0x302
#define AU8522_FRMREGEQLERRWIN_REG303H 0x303
#define AU8522_FRMREGFFEKEEP_REG304H 0x304
#define AU8522_FRMREGDFECONTROL1_REG305H 0x305
#define AU8522_FRMREGEQLERRLOW_REG306H 0x306
#define AU8522_REG42EH 0x42E
#define AU8522_REG42FH 0x42F
#define AU8522_REG430H 0x430
#define AU8522_REG431H 0x431
#define AU8522_REG432H 0x432
#define AU8522_REG433H 0x433
#define AU8522_REG434H 0x434
#define AU8522_REG435H 0x435
#define AU8522_REG436H 0x436
/* GPIO Registers */
#define AU8522_GPIO_CONTROL_REG0E0H 0x0E0
#define AU8522_GPIO_STATUS_REG0E1H 0x0E1
#define AU8522_GPIO_DATA_REG0E2H 0x0E2
/* Audio Control Registers */
#define AU8522_AUDIOAGC_REG0EEH 0x0EE
#define AU8522_AUDIO_STATUS_REG0F0H 0x0F0
#define AU8522_AUDIO_MODE_REG0F1H 0x0F1
#define AU8522_AUDIO_VOLUME_L_REG0F2H 0x0F2
#define AU8522_AUDIO_VOLUME_R_REG0F3H 0x0F3
#define AU8522_AUDIO_VOLUME_REG0F4H 0x0F4
#define AU8522_FRMREGAUPHASE_REG0F7H 0x0F7
#define AU8522_REG0F9H 0x0F9
#define AU8522_AUDIOAGC2_REG605H 0x605
#define AU8522_AUDIOFREQ_REG606H 0x606
/**************************************************************/
/* Format control 1 */
/* VCR Mode 7-6 */
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_VCR_MODE_YES 0x80
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_VCR_MODE_NO 0x40
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_VCR_MODE_AUTO 0x00
/* Field len 5-4 */
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_FIELD_LEN_625 0x20
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_FIELD_LEN_525 0x10
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_FIELD_LEN_AUTO 0x00
/* Line len (us) 3-2 */
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_LINE_LEN_64_000 0x0b
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_LINE_LEN_63_492 0x08
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_LINE_LEN_63_556 0x04
/* Subcarrier freq 1-0 */
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_SUBCARRIER_NTSC_AUTO 0x03
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_SUBCARRIER_NTSC_443 0x02
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_SUBCARRIER_NTSC_MN 0x01
#define AU8522_TVDEC_FORMAT_CTRL1_REG061H_SUBCARRIER_NTSC_50 0x00
/* Format control 2 */
#define AU8522_TVDEC_FORMAT_CTRL2_REG062H_STD_AUTODETECT 0x00
#define AU8522_TVDEC_FORMAT_CTRL2_REG062H_STD_NTSC 0x01
#define AU8522_TVDEC_FORMAT_CTRL2_REG062H_STD_PAL_M 0x02
#define AU8522_INPUT_CONTROL_REG081H_ATSC 0xC4
#define AU8522_INPUT_CONTROL_REG081H_ATVRF 0xC4
#define AU8522_INPUT_CONTROL_REG081H_ATVRF13 0xC4
#define AU8522_INPUT_CONTROL_REG081H_J83B64 0xC4
#define AU8522_INPUT_CONTROL_REG081H_J83B256 0xC4
#define AU8522_INPUT_CONTROL_REG081H_CVBS 0x20
#define AU8522_INPUT_CONTROL_REG081H_CVBS_CH1 0xA2
#define AU8522_INPUT_CONTROL_REG081H_CVBS_CH2 0xA0
#define AU8522_INPUT_CONTROL_REG081H_CVBS_CH3 0x69
#define AU8522_INPUT_CONTROL_REG081H_CVBS_CH4 0x68
#define AU8522_INPUT_CONTROL_REG081H_CVBS_CH4_SIF 0x28
/* CH1 AS Y,CH3 AS C */
#define AU8522_INPUT_CONTROL_REG081H_SVIDEO_CH13 0x23
/* CH2 AS Y,CH4 AS C */
#define AU8522_INPUT_CONTROL_REG081H_SVIDEO_CH24 0x20
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_ATSC 0x0C
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_J83B64 0x09
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_J83B256 0x09
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_CVBS 0x12
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_ATVRF 0x1A
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_ATVRF13 0x1A
#define AU8522_MODULE_CLOCK_CONTROL_REG0A3H_SVIDEO 0x02
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_CLEAR 0x00
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_SVIDEO 0x9C
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_CVBS 0x9D
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_ATSC 0xE8
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_J83B256 0xCA
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_J83B64 0xCA
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_ATVRF 0xDD
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_ATVRF13 0xDD
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_PAL 0xDD
#define AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_FM 0xDD
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_ATSC 0x80
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_J83B256 0x80
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_J83B64 0x80
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_DONGLE_ATSC 0x40
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_DONGLE_J83B256 0x40
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_DONGLE_J83B64 0x40
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_DONGLE_CLEAR 0x00
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_ATVRF 0x01
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_ATVRF13 0x01
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_SVIDEO 0x04
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_CVBS 0x01
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_PWM 0x03
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_IIS 0x09
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_PAL 0x01
#define AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_FM 0x01
/* STILL NEED TO BE REFACTORED @@@@@@@@@@@@@@ */
#define AU8522_TVDEC_CONTRAST_REG00BH_CVBS 0x79
#define AU8522_TVDEC_SATURATION_CB_REG00CH_CVBS 0x80
#define AU8522_TVDEC_SATURATION_CR_REG00DH_CVBS 0x80
#define AU8522_TVDEC_HUE_H_REG00EH_CVBS 0x00
#define AU8522_TVDEC_HUE_L_REG00FH_CVBS 0x00
#define AU8522_TVDEC_PGA_REG012H_CVBS 0x0F
#define AU8522_TVDEC_COMB_MODE_REG015H_CVBS 0x00
#define AU8522_REG016H_CVBS 0x00
#define AU8522_TVDED_DBG_MODE_REG060H_CVBS 0x00
#define AU8522_TVDEC_VCR_DET_LLIM_REG063H_CVBS 0x19
#define AU8522_REG0F9H_AUDIO 0x20
#define AU8522_TVDEC_VCR_DET_HLIM_REG064H_CVBS 0xA7
#define AU8522_TVDEC_COMB_VDIF_THR1_REG065H_CVBS 0x0A
#define AU8522_TVDEC_COMB_VDIF_THR2_REG066H_CVBS 0x32
#define AU8522_TVDEC_COMB_VDIF_THR3_REG067H_CVBS 0x19
#define AU8522_TVDEC_COMB_NOTCH_THR_REG068H_CVBS 0x23
#define AU8522_TVDEC_COMB_HDIF_THR1_REG069H_CVBS 0x41
#define AU8522_TVDEC_COMB_HDIF_THR2_REG06AH_CVBS 0x0A
#define AU8522_TVDEC_COMB_HDIF_THR3_REG06BH_CVBS 0x32
#define AU8522_TVDEC_COMB_DCDIF_THR1_REG06CH_CVBS 0x34
#define AU8522_TVDEC_COMB_DCDIF_THR1_REG06CH_SVIDEO 0x2a
#define AU8522_TVDEC_COMB_DCDIF_THR2_REG06DH_CVBS 0x05
#define AU8522_TVDEC_COMB_DCDIF_THR2_REG06DH_SVIDEO 0x15
#define AU8522_TVDEC_COMB_DCDIF_THR3_REG06EH_CVBS 0x6E
#define AU8522_TVDEC_UV_SEP_THR_REG06FH_CVBS 0x0F
#define AU8522_TVDEC_COMB_DC_THR1_NTSC_REG070H_CVBS 0x80
#define AU8522_REG071H_CVBS 0x18
#define AU8522_REG072H_CVBS 0x30
#define AU8522_TVDEC_COMB_DC_THR2_NTSC_REG073H_CVBS 0xF0
#define AU8522_REG074H_CVBS 0x80
#define AU8522_REG075H_CVBS 0xF0
#define AU8522_TVDEC_DCAGC_CTRL_REG077H_CVBS 0xFB
#define AU8522_TVDEC_PIC_START_ADJ_REG078H_CVBS 0x04
#define AU8522_TVDEC_AGC_HIGH_LIMIT_REG079H_CVBS 0x00
#define AU8522_TVDEC_MACROVISION_SYNC_THR_REG07AH_CVBS 0x00
#define AU8522_TVDEC_INTRP_CTRL_REG07BH_CVBS 0xEE
#define AU8522_TVDEC_AGC_LOW_LIMIT_REG0E4H_CVBS 0xFE
#define AU8522_TOREGAAGC_REG0E5H_CVBS 0x00
#define AU8522_TVDEC_VBI6A_REG035H_CVBS 0x40
/* Enables Closed captioning */
#define AU8522_TVDEC_VBI_CTRL_H_REG017H_CCON 0x21
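The per-mode values above (the `*_CVBS` defines) pair naturally with their register addresses. As a purely hypothetical sketch — the actual driver's write order and register set are assumptions, not taken from the au8522 source — a CVBS-on-CH1 init sequence could be tabulated and replayed through a caller-supplied register writer:

```python
# Hypothetical sketch: pair AU8522 register addresses with the CVBS default
# values from the header above. The selection and ordering are assumptions.
AU8522_CVBS_CH1_INIT = [
    (0x081, 0xA2),  # AU8522_INPUT_CONTROL_REG081H_CVBS_CH1
    (0x0A3, 0x12),  # AU8522_MODULE_CLOCK_CONTROL_REG0A3H_CVBS
    (0x0A4, 0x9D),  # AU8522_SYSTEM_MODULE_CONTROL_0_REG0A4H_CVBS
    (0x0A5, 0x01),  # AU8522_SYSTEM_MODULE_CONTROL_1_REG0A5H_CVBS
    (0x00B, 0x79),  # AU8522_TVDEC_CONTRAST_REG00BH_CVBS
]

def program_cvbs(write_reg):
    """Replay the table through write_reg(addr, val), a caller-supplied
    (hypothetical) I2C register writer."""
    for addr, val in AU8522_CVBS_CH1_INIT:
        write_reg(addr, val)
```

Keeping the table as data rather than inline writes makes the mode switch easy to diff against the datasheet.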
/**
* The enhanced {@link voldemort.client.StoreClient StoreClient} implementation
* you get back from a {@link voldemort.client.StoreClientFactory
* StoreClientFactory}
*
*
* @param <K> The key type
* @param <V> The value type
*/
@Threadsafe
@JmxManaged(description = "A voldemort client")
public class ZenStoreClient<K, V> extends DefaultStoreClient<K, V> {
private final Logger logger = Logger.getLogger(ZenStoreClient.class);
private final AbstractStoreClientFactory abstractStoreFactory;
private final ClientConfig config;
private final SystemStoreRepository sysRepository;
private final String clientId;
private final SchedulerService scheduler;
private ClientInfo clientInfo;
private String clusterXml;
private AsyncMetadataVersionManager asyncMetadataManager = null;
private ClientRegistryRefresher clientRegistryRefresher = null;
public ZenStoreClient(String storeName,
InconsistencyResolver<Versioned<V>> resolver,
AbstractStoreClientFactory storeFactory,
int maxMetadataRefreshAttempts) {
this(storeName, resolver, storeFactory, maxMetadataRefreshAttempts, null, 0, null, null);
}
public ZenStoreClient(String storeName,
InconsistencyResolver<Versioned<V>> resolver,
AbstractStoreClientFactory storeFactory,
int maxMetadataRefreshAttempts,
String clientContext,
int clientSequence,
ClientConfig config,
SchedulerService scheduler) {
super();
this.storeName = Utils.notNull(storeName);
this.resolver = resolver;
this.abstractStoreFactory = Utils.notNull(storeFactory);
this.storeFactory = this.abstractStoreFactory;
this.metadataRefreshAttempts = maxMetadataRefreshAttempts;
this.clientInfo = new ClientInfo(storeName,
clientContext,
clientSequence,
System.currentTimeMillis(),
ManifestFileReader.getReleaseVersion(),
config);
this.clientId = generateClientId(clientInfo);
this.config = config;
this.sysRepository = new SystemStoreRepository();
this.scheduler = scheduler;
// Registering self to be able to bootstrap client dynamically via JMX
JmxUtils.registerMbean(this,
JmxUtils.createObjectName(JmxUtils.getPackageName(this.getClass()),
JmxUtils.getClassName(this.getClass())
+ "." + storeName));
// Bootstrap this client
bootStrap();
// Initialize the background thread for checking metadata version
        // Guard everything that dereferences config, which may be null when
        // the legacy constructor is used
        if(config != null) {
            asyncMetadataManager = scheduleAsyncMetadataVersionManager(clientId,
                                                                       config.getAsyncMetadataRefreshInMs());
            clientRegistryRefresher = registerClient(clientId,
                                                     config.getClientRegistryUpdateIntervalInSecs());
        }
logger.info("Voldemort client created: " + clientId + "\n" + clientInfo);
}
private ClientRegistryRefresher registerClient(String jobId, int intervalInSecs) {
ClientRegistryRefresher refresher = null;
if(this.sysRepository.getClientRegistryStore() != null) {
try {
Version version = this.sysRepository.getClientRegistryStore()
.putSysStore(clientId, clientInfo.toString());
refresher = new ClientRegistryRefresher(this.sysRepository,
clientId,
clientInfo,
version);
GregorianCalendar cal = new GregorianCalendar();
cal.add(Calendar.SECOND, intervalInSecs);
if(scheduler != null) {
scheduler.schedule(jobId + refresher.getClass().getName(),
refresher,
cal.getTime(),
TimeUnit.MILLISECONDS.convert(intervalInSecs,
TimeUnit.SECONDS));
logger.info("Client registry refresher thread started, refresh interval: "
+ intervalInSecs + " seconds");
} else {
logger.warn("Client registry won't run because scheduler service is not configured");
}
} catch(Exception e) {
logger.warn("Unable to register with the cluster due to the following error:", e);
}
} else {
            logger.warn(SystemStoreConstants.SystemStoreName.voldsys$_client_registry.name()
                        + " not found. Unable to register with the Voldemort cluster.");
}
return refresher;
}
private AsyncMetadataVersionManager scheduleAsyncMetadataVersionManager(String jobId,
long interval) {
AsyncMetadataVersionManager asyncMetadataManager = null;
SystemStore<String, String> versionStore = this.sysRepository.getMetadataVersionStore();
if(versionStore == null) {
logger.warn("Metadata version system store not found. Cannot run Metadata version check thread.");
} else {
// Create a callback for re-bootstrapping the client
Callable<Void> rebootstrapCallback = new Callable<Void>() {
public Void call() throws Exception {
bootStrap();
return null;
}
};
asyncMetadataManager = new AsyncMetadataVersionManager(this.sysRepository,
rebootstrapCallback,
this.storeName);
// schedule the job to run every 'checkInterval' period, starting
// now
if(scheduler != null) {
scheduler.schedule(jobId + asyncMetadataManager.getClass().getName(),
asyncMetadataManager,
new Date(),
interval);
logger.info("Metadata version check thread started. Frequency = Every " + interval
+ " ms");
} else {
logger.warn("Metadata version check thread won't start because the scheduler service is not configured.");
}
}
return asyncMetadataManager;
}
@Override
@JmxOperation(description = "bootstrap metadata from the cluster.")
public void bootStrap() {
logger.info("Bootstrapping metadata for store " + this.storeName);
/*
* Since we need cluster.xml for bootstrapping this client as well as
* all the System stores, just fetch it once and pass it around.
*/
clusterXml = abstractStoreFactory.bootstrapMetadataWithRetries(MetadataStore.CLUSTER_KEY);
// Get client store
this.store = abstractStoreFactory.getRawStore(storeName, resolver, null, clusterXml, null);
// Create system stores
logger.info("Creating system stores for store " + this.storeName);
this.sysRepository.createSystemStores(this.config,
this.clusterXml,
abstractStoreFactory.getFailureDetector());
/*
* Update to the new metadata versions (in case we got here from Invalid
* Metadata exception). This will prevent another bootstrap via the
* Async metadata checker
*/
if(asyncMetadataManager != null) {
asyncMetadataManager.updateMetadataVersions();
}
/*
* Every time we bootstrap, update the bootstrap time
*/
if(this.clientInfo != null) {
if(this.asyncMetadataManager != null) {
this.clientInfo.setClusterMetadataVersion(this.asyncMetadataManager.getClusterMetadataVersion());
}
this.clientInfo.setBootstrapTime(System.currentTimeMillis());
}
if(this.clientRegistryRefresher == null) {
logger.error("Unable to publish the client registry after bootstrap. Client Registry Refresher is NULL.");
} else {
logger.info("Publishing client registry after Bootstrap.");
this.clientRegistryRefresher.publishRegistry();
}
}
public String getClientId() {
return clientId;
}
@JmxGetter(name = "getClusterMetadataVersion")
public String getClusterMetadataVersion() {
String result = "Current Cluster Metadata Version : "
+ this.asyncMetadataManager.getClusterMetadataVersion();
return result;
}
    /**
     * Generate a unique client ID based on: 0. clientContext, if specified; 1.
     * storeName; 2. deployment path; 3. client sequence
     *
     * @param clientInfo the {@link ClientInfo} carrying the store name, client
     *        context, client sequence number and deployment path
     * @return unique client ID
     */
public String generateClientId(ClientInfo clientInfo) {
String contextName = clientInfo.getContext();
int clientSequence = clientInfo.getClientSequence();
String newLine = System.getProperty("line.separator");
StringBuilder context = new StringBuilder(contextName == null ? "" : contextName);
context.append(0 == clientSequence ? "" : ("." + clientSequence));
context.append(".").append(clientInfo.getStoreName());
context.append("@").append(clientInfo.getLocalHostName()).append(":");
context.append(clientInfo.getDeploymentPath()).append(newLine);
if(logger.isDebugEnabled()) {
logger.debug(context.toString());
}
return context.toString();
}
}
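The ID layout that `generateClientId` assembles can be summarized in a short sketch. This is an illustrative reimplementation, not the Java code itself; note the original also appends a platform line separator after the deployment path, which the sketch omits:

```python
# Sketch of the client-ID layout: [context][.sequence].storeName@host:path
def client_id(store, host, path, context=None, seq=0):
    s = "" if context is None else context
    s += "" if seq == 0 else "." + str(seq)   # sequence only when non-zero
    s += "." + store                          # store name is always present
    s += "@" + host + ":" + path
    return s
```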
/*
* "[...] Sincerity (comprising truth-to-experience, honesty towards the self,
* and the capacity for human empathy and compassion) is a quality which
 * resides within the language of literature. It isn't a fact or an intention
* behind the work [...]"
*
* - An introduction to Literary and Cultural Theory, <NAME>
*
*
* o8o
* `"'
* oooo ooo .oooo. .ooooo. .oooo.o oooo .ooooo.
* `88. .8' `P )88b d88' `88b d88( "8 `888 d88' `88b
* `88..8' .oP"888 888 888 `"Y88b. 888 888 888
* `888' d8( 888 888 888 o. )88b .o. 888 888 888
* .8' `Y888""8o `Y8bod8P' 8""888P' Y8P o888o `Y8bod8P'
* .o..P'
* `Y8P' <NAME> <<EMAIL>>
*
* Welcome aboard!
*/
#include "bsp.h"
#include "flash.h"
#include <stdbool.h>
#include <stdlib.h>
#include <errno.h>
static inline void clear_flags()
{
FLASH_SR |= FLASH_STATUS_MASK;
}
static inline void clear_errflags()
{
FLASH_SR |= FLASH_STATUS_ERROR_MASK;
}
static inline int get_errflags()
{
return FLASH_SR & FLASH_STATUS_ERROR_MASK;
}
static inline void flash_wait()
{
while (FLASH_SR & (1U << BIT_FLASH_BUSY));
}
static inline void flash_unlock()
{
flash_wait();
if (FLASH_CR & (1U << BIT_FLASH_LOCK)) {
FLASH_KEYR = FLASH_UNLOCK_KEY1;
FLASH_KEYR = FLASH_UNLOCK_KEY2;
}
while (FLASH_CR & (1U << BIT_FLASH_LOCK));
}
static inline void flash_lock()
{
FLASH_CR |= 1U << BIT_FLASH_LOCK;
}
static inline void flash_unlock_opt()
{
FLASH_OPTKEYR = FLASH_OPT_UNLOCK_KEY1;
FLASH_OPTKEYR = FLASH_OPT_UNLOCK_KEY2;
}
static inline void flash_prepare()
{
clear_flags();
flash_unlock();
flash_writesize_set(32);
FLASH_CR |= 1U << BIT_FLASH_PROGRAM;
flash_wait();
}
static inline void flash_finish()
{
FLASH_CR &= ~(1U << BIT_FLASH_PROGRAM);
flash_lock();
}
#if defined(stm32f4)
static inline void flash_erase_sector(int nr)
{
unsigned int tmp;
flash_wait();
if (nr >= 12)
nr = (nr - 12) | 0x10;
tmp = FLASH_CR;
tmp &= ~(0x1f << BIT_FLASH_SECTOR_NR);
tmp |= (1U << BIT_FLASH_SECTOR_ERASE) | (nr << BIT_FLASH_SECTOR_NR);
tmp |= 1U << BIT_FLASH_START;
FLASH_CR = tmp;
flash_wait();
debug("erase sector %d", nr);
}
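The sector-number remapping above reflects the dual-bank layout of STM32F4 parts: sectors 12 and up live in bank 2 and are encoded as the bank-relative index with bit 4 set. A sketch of just that encoding:

```python
def encode_sector(nr: int) -> int:
    """Encode a logical flash sector number for the FLASH_CR SNB field,
    mirroring the C code above (STM32F4 dual-bank layout assumed)."""
    if nr >= 12:
        # bank 2: bank-relative index with bit 4 set
        return (nr - 12) | 0x10
    return nr
```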
static inline void flash_erase_all()
{
unsigned int tmp;
flash_wait();
tmp = FLASH_CR;
tmp |= (1U << BIT_FLASH_MASS_ERASE) | (1U << BIT_FLASH_MASS_ERASE2);
tmp |= 1U << BIT_FLASH_START;
FLASH_CR = tmp;
flash_wait();
debug("erase all banks and sectors");
}
static inline bool flash_write_word(unsigned int *dst, const unsigned int *src)
{
*dst = *(volatile unsigned int *)src;
flash_wait();
if (get_errflags() || *(volatile unsigned int *)dst != *src)
return false;
return true;
}
#elif defined(stm32f1) || defined(stm32f3)
static inline void flash_erase_sector(int addr)
{
FLASH_CR &= ~(1U << BIT_FLASH_PROGRAM);
flash_wait();
FLASH_CR |= 1U << BIT_FLASH_SECTOR_ERASE;
FLASH_AR = (unsigned int)addr;
FLASH_CR |= 1U << BIT_FLASH_START;
flash_wait();
FLASH_CR &= ~(1U << BIT_FLASH_SECTOR_ERASE);
FLASH_CR |= 1U << BIT_FLASH_PROGRAM;
}
static inline void flash_erase_all()
{
FLASH_CR &= ~(1U << BIT_FLASH_PROGRAM);
flash_wait();
FLASH_CR |= 1U << BIT_FLASH_MASS_ERASE;
FLASH_CR |= 1U << BIT_FLASH_START;
flash_wait();
FLASH_CR &= ~(1U << BIT_FLASH_MASS_ERASE);
FLASH_CR |= 1U << BIT_FLASH_PROGRAM;
}
static inline bool flash_write_word(unsigned int *dst, const unsigned int *src)
{
unsigned int addr = (unsigned int)dst;
unsigned short int t;
t = (unsigned short int)*src;
*(volatile unsigned short int *)(addr) = t;
flash_wait();
	t = (unsigned short int)(*src >> 16);
	*(volatile unsigned short int *)(addr+2) = t;
flash_wait();
if (get_errflags() || *(volatile unsigned int *)dst != *src)
return false;
return true;
}
#else
#error undefined machine
#endif
static inline int flash_erase(int nr)
{
/* FIXME: make sure that no data in the sector is cached */
if ((unsigned int)nr == FLASH_MASS_ERASE) {
flash_erase_all();
return get_errflags();
}
if (nr >= NSECTORS) {
debug("no existing sector %d", nr);
return -ERANGE;
}
flash_erase_sector(nr);
return get_errflags();
}
static size_t __attribute__((section(".iap")))
flash_write_core(void * const addr, const void * const buf, size_t len,
bool overwrite)
{
const unsigned int *src, *new, *restore;
unsigned int *dst;
unsigned int base, tmp;
int s, ss, diff, left, t;
unsigned int new_start, new_end;
len = (len / 4) + !!(len % 4); /* bytes to word */
left = len;
dst = addr;
src = buf;
ss = diff = 0;
new = NULL;
new_start = new_end = 0;
restore = NULL;
flash_prepare();
retry:
while (left) {
if ((unsigned int)dst >= new_start &&
(unsigned int)dst < new_end) {
if (!flash_write_word(dst, new))
break;
new++;
diff = 0;
} else if (diff < 0) { /* Restore the fore data */
if (!flash_write_word(dst, restore))
break;
restore++;
} else {
if (!flash_write_word(dst, src))
break;
src++;
}
dst++;
left--;
}
if (left) {
s = addr2sector(dst);
ss = get_sector_size_kb(s) << 10; /* error if 0 */
base = BASE_ALIGN((unsigned int)dst, ss);
diff = (int)((unsigned int)dst - base) / 4;
if ((unsigned int)addr > base) {
new_start = (unsigned int)addr;
new_end = min(base + ss, new_start + len * 4);
new = &((unsigned int *)buf)
[((unsigned int)dst - (unsigned int)addr) / 4];
diff = (int)(base - (unsigned int)addr) / 4;
left = len;
} else {
new_start = base;
new_end = min(base + ss, new_start + (left + diff) * 4);
new = src - diff;
}
dst = (unsigned int *)base;
src = new + (new_end - new_start) / 4;
left += abs(diff);
if (!overwrite) { /* Save the sector in a temporal sector */
tmp = get_temporal_sector_addr(ss);
if (flash_write_core((void *)tmp, (void *)base, ss, true) != (size_t)ss)
goto out;
restore = (unsigned int *)tmp;
t = (int)((base + left * 4) - (base + ss));
if (t < 0) { /* Restore the rear data */
src = (unsigned int *)(tmp + (ss - abs(t)));
left += abs(t) / 4;
}
} else
clear_flags();
flash_prepare();
if (flash_erase(s))
goto cleanout;
goto retry;
}
cleanout:
flash_finish();
out:
dsb();
isb();
return (len - left) * 4;
}
size_t flash_program(void * const addr, const void * const buf, size_t len)
{
size_t written;
written = flash_write_core(addr, buf, len, 1);
return written;
}
#if 0
void flash_protect()
{
#if defined(stm32f1) || defined(stm32f3)
if (FLASH_OPT_RDP != 0x5aa5)
return;
#elif defined(stm32f4)
if (((FLASH_OPTCR >> 8) & 0xff) != 0xaa)
return;
#else
#error undefined machine
#endif
	warn("Protect flash memory from external accesses");
flash_unlock();
flash_unlock_opt();
#if defined(stm32f1) || defined(stm32f3)
FLASH_CR |= 1U << BIT_FLASH_OPT_BYTE_ERASE;
FLASH_CR |= 1U << BIT_FLASH_START;
#elif defined(stm32f4)
FLASH_OPTCR &= ~(0xffU << 8);
FLASH_OPTCR |= 2U; /* set start bit */
#else
#error undefined machine
#endif
while (FLASH_SR & (1U << BIT_FLASH_BUSY));
#if defined(stm32f1) || defined(stm32f3)
FLASH_CR &= ~(1U << BIT_FLASH_OPT_BYTE_ERASE);
#elif defined(stm32f4)
FLASH_OPTCR &= ~2U;
#else
#error undefined machine
#endif
flash_lock_opt();
flash_lock();
reboot();
}
#endif
def from_pint(cls, arr, unit_registry=None):
p_units = []
for base, exponent in arr._units.items():
bs = convert_pint_units(base)
p_units.append("%s**(%s)" % (bs, Rational(exponent)))
p_units = "*".join(p_units)
if isinstance(arr.magnitude, np.ndarray):
return YTArray(arr.magnitude, p_units, registry=unit_registry)
else:
        return YTQuantity(arr.magnitude, p_units, registry=unit_registry)
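The unit-string construction in `from_pint` can be illustrated in isolation. This sketch uses `fractions.Fraction` in place of sympy's `Rational` and skips the `convert_pint_units` name translation, so it is an approximation of the loop above rather than the real code:

```python
from fractions import Fraction

def units_string(unit_dict):
    # Mirrors the loop above: each (base, exponent) pair becomes
    # "base**(exp)", and the pieces are joined with "*".
    parts = ["%s**(%s)" % (base, Fraction(exp))
             for base, exp in unit_dict.items()]
    return "*".join(parts)
```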
import React from 'react';
interface IExternalLinkProps {
    href: string;
    children?: React.ReactNode;
}

// React.SFC is deprecated; React.FC is the current name. noreferrer is added
// alongside noopener for older browsers that do not honor noopener.
const ExternalLink: React.FC<IExternalLinkProps> = props => {
    return (
        <a href={props.href} rel="noopener noreferrer" target="_blank">
            {props.children}
        </a>
    );
};
export default ExternalLink;
// Copyright 2020 The Project Oak Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package main
import (
"context"
"flag"
"fmt"
"github.com/golang/glog"
"github.com/golang/protobuf/proto"
"google.golang.org/grpc"
"google.golang.org/grpc/codes"
"google.golang.org/grpc/credentials"
"google.golang.org/grpc/status"
translator_pb "github.com/project-oak/oak/examples/translator/proto"
label_pb "github.com/project-oak/oak/oak_abi/proto/label"
)
var (
address = flag.String("address", "localhost:8080", "Address of the Oak application to connect to")
caCert = flag.String("ca_cert_path", "", "Path to the PEM-encoded CA root certificate")
)
// Keep in sync with /oak_runtime/src/node/grpc/server/mod.rs.
const oakLabelGrpcMetadataKey = "x-oak-label-bin"
func translate(ctx context.Context, client translator_pb.TranslatorClient, text, fromLang, toLang string) {
glog.Infof("Translate %q from %q to %q", text, fromLang, toLang)
req := translator_pb.TranslateRequest{Text: text, FromLang: fromLang, ToLang: toLang}
rsp, err := client.Translate(ctx, &req)
if err != nil {
rpcStatus, ok := status.FromError(err)
if !ok {
glog.Fatalf("Could not perform Translate(%q, %q=>%q): internal error %v", text, fromLang, toLang, err)
}
if rpcStatus.Code() != codes.NotFound {
glog.Fatalf("Could not perform Translate(%q, %q=>%q): %v", text, fromLang, toLang, err)
}
glog.Errorf("Failed to Translate(%q, %q=>%q): not found", text, fromLang, toLang)
return
}
glog.Infof("Response: %q", rsp.TranslatedText)
}
// TODO(#1097): move this into an SDK package to allow re-use.
type LabelMetadata struct {
metadata map[string]string
}
func NewLabelMetadata(label label_pb.Label) (*LabelMetadata, error) {
label_data, err := proto.Marshal(&label)
if err != nil {
return nil, fmt.Errorf("Failed to serialize label %v: %v", label, err)
}
return &LabelMetadata{
metadata: map[string]string{
oakLabelGrpcMetadataKey: string(label_data),
},
}, nil
}
// Implement the grpc.PerRPCCredentials interface.
func (lm *LabelMetadata) GetRequestMetadata(ctx context.Context, uri ...string) (map[string]string, error) {
return lm.metadata, nil
}
func (lm *LabelMetadata) RequireTransportSecurity() bool {
return true
}
func main() {
flag.Parse()
ctx := context.Background()
// Connect to the Oak Application.
creds, err := credentials.NewClientTLSFromFile(*caCert, "")
if err != nil {
glog.Exitf("Failed to set up TLS client credentials from %q: %v", *caCert, err)
}
// TODO(#1066): Use a more restrictive Label.
label := label_pb.Label{}
metadata, err := NewLabelMetadata(label)
if err != nil {
glog.Exitf("Failed to create label metadata for %v: %v", label, err)
}
conn, err := grpc.Dial(*address, grpc.WithTransportCredentials(creds), grpc.WithPerRPCCredentials(metadata))
if err != nil {
glog.Exitf("Failed to dial Oak Application at %v: %v", *address, err)
}
defer conn.Close()
client := translator_pb.NewTranslatorClient(conn)
// Perform multiple invocations of the same Oak Node, with different parameters.
translate(ctx, client, "WORLDS", "en", "fr")
translate(ctx, client, "WORLDS", "en", "it")
translate(ctx, client, "WORLDS", "en", "cn")
translate(ctx, client, "OSSIFRAGE", "en", "fr")
}
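The `x-oak-label-bin` key used above follows the general gRPC convention that metadata keys ending in `-bin` carry binary (protobuf-serializable) values. A minimal Python-side sketch of building that metadata entry — the helper name is ours, only the key and the `-bin` convention come from the code above:

```python
# Keep in sync with the Go constant above.
OAK_LABEL_KEY = "x-oak-label-bin"

def label_metadata(label_bytes: bytes):
    """Build a gRPC metadata tuple carrying a serialized label.
    Keys with the "-bin" suffix signal binary-valued headers, so the raw
    protobuf bytes can be attached directly."""
    return ((OAK_LABEL_KEY, label_bytes),)
```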
def make_image(path: str, cols: int, rows: int) -> np.ndarray:
    img = np.loadtxt(path, delimiter=' ', dtype=int)
img = img.reshape(-1, rows, cols)
    return img
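The `reshape(-1, rows, cols)` call above infers the number of frames from the data length. A tiny worked example with synthetic data standing in for the loaded file:

```python
import numpy as np

# Stand-in for what np.loadtxt would return: 12 ints laid out as 2 lines of 6.
flat = np.arange(12).reshape(2, 6)
# (-1, rows, cols): the frame count (here 2) is inferred automatically.
img = flat.reshape(-1, 2, 3)
```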
Some generation methods would be more realistic than others. Wind power would only be truly effective in the upper atmosphere (albeit 10 times more effective than on Earth) and would require tethered turbines. Solar energy, meanwhile, would be the most daunting. Titan barely gets any sunlight, so you'd need a tremendous number of solar panels. A population roughly that of the US, about 300 million, would require enough panels to cover an area the size of the US. Solar would clearly be more of an energy supplement than a primary source, then. Spaceship crews would love Titan, though, as the surplus of methane could turn the moon into a giant fuel depot.
A separate study notes that Titan's lakes are calm enough that you could land probes without too much trouble, clearing a path for human visitors.
There's no question that any landing on Titan is decades away and would still be fraught with challenges, such as the extreme cold (-291F), low gravity (0.14 g) and inhospitable atmosphere. Manned trips to Mars aren't expected to happen until the 2030s, and that planet is both closer and much, much warmer. Nevertheless, the findings could be helpful in the long run. Humanity has very few choices for visiting moons and planets that are even vaguely survivable. If the species is going to maintain any kind of significant footprint beyond Earth, it needs to know what its options are well before it starts building ships and habitats.
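The scale of the solar problem is easy to gut-check with rough numbers. Every constant below is an assumption for illustration (Saturn-distance sunlight of about 1% of Earth's, an assumed haze transmission, panel efficiency, and per-capita demand), not a figure from the article:

```python
# Back-of-envelope estimate of panel area for 300 million people on Titan.
# All constants are rough assumptions, chosen only to show the arithmetic.
SOLAR_AT_SATURN = 15.0     # W/m^2 above the atmosphere (~1% of Earth's)
HAZE_TRANSMISSION = 0.1    # assumed fraction reaching the surface
PANEL_EFFICIENCY = 0.2     # assumed
PER_CAPITA_POWER = 1500.0  # W, assumed average demand

people = 300e6
area_m2 = people * PER_CAPITA_POWER / (
    SOLAR_AT_SATURN * HAZE_TRANSMISSION * PANEL_EFFICIENCY)
area_km2 = area_m2 / 1e6   # on the order of a million km^2
```

Even with generous assumptions, the area comes out continent-sized, which is why solar reads as a supplement rather than a primary source.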
import pickle
import argparse
import multiprocessing as mp
import numpy as np
import scipy.interpolate
import scipy.stats
import sys
import time
import logging
from rvspecfit import spec_fit
from rvspecfit import make_interpol
import rvspecfit
git_rev = rvspecfit.__version__
def get_continuum_prefix(continuum):
if not continuum:
pref = 'nocont_'
else:
pref = ''
return pref
def get_ccf_pkl_name(setup, continuum=True):
return 'ccf_' + get_continuum_prefix(continuum) + '%s.pkl' % setup
def get_ccf_dat_name(setup, continuum=True):
return 'ccfdat_' + get_continuum_prefix(continuum) + '%s.npz' % setup
def get_ccf_mod_name(setup, continuum=True):
return 'ccfmod_' + get_continuum_prefix(continuum) + '%s.npy' % setup
class CCFConfig:
""" Configuration class for cross-correlation functions """
def __init__(self,
logl0=None,
logl1=None,
npoints=None,
splinestep=1000,
maxcontpts=20):
"""
Configure the cross-correlation
Parameters
----------
logl0: float
The natural logarithm of the wavelength of the beginning of
the CCF
logl1: float
The natural logarithm of the wavelength of the end of the CCF
npoints: integer
The number of points in the cross correlation functions
splinestep: float, optional
The stepsize in km/s that determines the smoothness of the
continuum fit
maxcontpts: integer, optional
The maximum number of points used for the continuum fit
"""
self.logl0 = logl0
self.logl1 = logl1
self.npoints = npoints
self.continuum = True
self.maxcontpts = maxcontpts
if splinestep is None:
self.continuum = False
else:
self.splinestep = max(
splinestep, 3e5 * (np.exp(
(logl1 - logl0) / self.maxcontpts) - 1))
def get_continuum(lam0, spec0, espec0, ccfconf=None):
"""Determine the continuum of the spectrum by fitting a spline
Parameters
----------
lam0: numpy array
The wavelength vector
spec0: numpy array
The spectral vector
espec0: numpy array
The vector of spectral uncertainties
ccfconf: CCFConfig object
The CCF configuration object
Returns
-------
cont: numpy array
The continuum vector
"""
lammin = lam0.min()
N = np.log(lam0.max() / lammin) / np.log(1 + ccfconf.splinestep / 3e5)
N = int(np.ceil(N))
# Determine the nodes of the spline used for the continuum fit
nodes = lammin * np.exp(
np.arange(N) * np.log(1 + ccfconf.splinestep / 3e5))
nodesedges = lammin * np.exp(
(-0.5 + np.arange(N + 1)) * np.log(1 + ccfconf.splinestep / 3e5))
medspec = np.median(spec0)
if medspec <= 0:
medspec = np.abs(medspec)
if medspec == 0:
medspec = 1
logging.warning('The spectrum has a median that is non-positive...')
BS = scipy.stats.binned_statistic(lam0, spec0, 'median', bins=nodesedges)
p0 = np.log(np.maximum(BS.statistic, 1e-3 * medspec))
p0[~np.isfinite(p0)] = np.log(medspec)
ret = scipy.optimize.least_squares(fit_resid,
p0,
loss='soft_l1',
args=((nodes, lam0, spec0, espec0),
False))
cont = fit_resid(ret['x'], (nodes, lam0, spec0, espec0), True)
return cont
def fit_resid(p, args=None, getModel=False):
# residual of the fit for the fitting
nodes, lam, spec, espec = args
mod = np.exp(
np.clip(
scipy.interpolate.UnivariateSpline(nodes, p, s=0, k=2)(lam), -100,
100))
if getModel:
return mod
return (mod - spec) / espec
def preprocess_model(logl,
lammodel,
model0,
vsini=None,
ccfconf=None,
modid=None):
"""
Take the input template model and return prepared for FFT vectors.
That includes padding, apodizing and normalizing by continuum
Parameters
-----------
    logl: numpy array
        The array of log wavelengths on which we want the output spectra
lammodel: numpy array
The wavelength array of the model
model0: numpy array
The initial model spectrum
vsini: float, optional
The stellar rotation, vsini
    ccfconf: CCFConfig object, required
        The CCF configuration object
    modid: object, optional
        An identifier for the model (currently unused in the computation)
    Returns
    --------
    c_model: numpy array
        The continuum-normalized (and, if vsini is given, rotationally
        convolved) model resampled onto the logl grid
    """
if vsini is not None and vsini != 0:
m = spec_fit.convolve_vsini(lammodel, model0, vsini)
else:
m = model0
if ccfconf.continuum:
cont = get_continuum(lammodel,
m,
np.maximum(m * 1e-5, 1e-2 * np.median(m)),
ccfconf=ccfconf)
cont = np.maximum(cont, 1e-2 * np.median(cont))
else:
cont = 1
c_model = scipy.interpolate.interp1d(np.log(lammodel), m / cont)(logl)
return c_model
def preprocess_model_list(lammodels, models, params, ccfconf, vsinis=None):
"""Apply preprocessing to the array of models
Parameters
----------
lammodels: numpy array
The array of wavelengths of the models
(assumed to be the same for all models)
models: numpy array
    The 2D array of models with the shape [number_of_models, len_of_model]
params: numpy array
The 2D array of template parameters (i.e. stellar atmospheric
parameters) with the shape
[number_of_models,length_of_parameter_vector]
ccfconf: CCFConfig object
CCF configuration
vsinis: list of floats
The list of possible Vsini values to convolve model spectra with
Could be None
Returns
-------
ret: tuple
    A tuple with:
    1) processed spectra
    2) spectral parameters
    3) list of vsini values
"""
nthreads = 16
logl = np.linspace(ccfconf.logl0, ccfconf.logl1, ccfconf.npoints)
res = []
retparams = []
if vsinis is None:
vsinis = [None]
vsinisList = []
pool = mp.Pool(nthreads)
q = []
for imodel, m0 in enumerate(models):
for vsini in vsinis:
retparams.append(params[imodel])
q.append(
pool.apply_async(
preprocess_model,
(logl, lammodels, m0, vsini, ccfconf, params[imodel])))
vsinisList.append(vsini)
for ii, curx in enumerate(q):
print('Processing : %d / %d' % (ii, len(q)))
c_model = curx.get()
res.append(c_model)
pool.close()
pool.join()
retparams = np.array(retparams)
vsinisList = np.array(vsinisList)
res = np.array(res)
return res, retparams, vsinisList
def interp_masker(lam, spec, badmask):
"""
Fill gaps in the spectrum by interpolating across a bad-pixel mask.
Interior gaps are filled by linear interpolation; pixels at the edges
take the value of the closest valid pixel.
Parameters
-----------
lam: numpy array
The array of wavelengths of pixels
spec: numpy array
The spectrum array
badmask: boolean array
The array identifying bad pixels
Returns
--------
spec: numpy array
The array with bad pixels interpolated away
"""
spec1 = spec * 1
xbad = np.nonzero(badmask)[0]
xgood = np.nonzero(~badmask)[0]
if len(xgood) == 0:
logging.warning('All the pixels are masked for the ccf determination')
ret = spec1
ret[~np.isfinite(ret)] = 1
return ret
xpos = np.searchsorted(xgood, xbad)
leftedge = xpos == 0
rightedge = xpos == len(xgood)
mid = (~leftedge) & (~rightedge)
l1, l2 = lam[xgood[xpos[mid] - 1]], lam[xgood[xpos[mid]]]
s1, s2 = spec[xgood[xpos[mid] - 1]], spec[xgood[xpos[mid]]]
l0 = lam[xbad[mid]]
spec1[xbad[leftedge]] = spec[xgood[0]]
spec1[xbad[rightedge]] = spec[xgood[-1]]
spec1[xbad[mid]] = (-(l1 - l0) * s2 + (l2 - l0) * s1) / (l2 - l1)
return spec1
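As a self-contained illustration of the same gap-filling strategy (a sketch, not part of the pipeline; the helper name is invented), `np.interp` reproduces both behaviours described in the docstring: linear interpolation inside interior gaps, and nearest-valid-pixel values at the edges, since `np.interp` clamps to the endpoint values.

```python
import numpy as np


def fill_bad_pixels(lam, spec, badmask):
    # Interpolate the spectrum onto the masked wavelengths using only
    # the good pixels; edge extrapolation clamps to the nearest value.
    out = spec.copy()
    good = ~badmask
    out[badmask] = np.interp(lam[badmask], lam[good], spec[good])
    return out


lam = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
spec = np.array([10.0, 0.0, 30.0, 0.0, 0.0])
bad = np.array([False, True, False, True, True])
filled = fill_bad_pixels(lam, spec, bad)
# interior gap -> linear interpolation; right edge -> nearest valid pixel
print(filled)  # [10. 20. 30. 30. 30.]
```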
def preprocess_data(lam, spec0, espec, ccfconf=None, badmask=None, maxerr=10):
"""
Preprocess data in the same manner as the template spectra, normalize by
the continuum, apodize and pad
Parameters
-----------
lam: numpy array
The wavelength vector
spec0: numpy array
The input spectrum vector
espec: numpy array
The error-vector of the spectrum
ccfconf: CCFConfig object
The CCF configuration
badmask: Numpy array(boolean), optional
The optional mask for the CCF
maxerr: integer
The maximum value of error to be masked in units of median(error)
Returns
--------
res1: numpy array
    The processed (continuum-normalized) spectrum on the CCF grid
res2: numpy array
    The corresponding inverse-variance vector
"""
t1 = time.time()
ccf_logl = np.linspace(ccfconf.logl0, ccfconf.logl1, ccfconf.npoints)
ccf_lam = np.exp(ccf_logl)
# to modify them
curespec = espec.copy()
curspec = spec0.copy()
if badmask is None:
badmask = np.zeros(len(curespec), dtype=bool)
    # now filter the spectrum to find parts where it is essentially
    # negative, and mask those areas out
filt_size = 11
filtspec = scipy.signal.medfilt(curspec, filt_size)
mederr = np.nanmedian(curespec)
if ccfconf.continuum:
badmask = badmask | (curespec > maxerr * mederr) | (filtspec <= 0)
curespec[badmask] = 1e9 * mederr
curspec = interp_masker(lam, curspec, badmask)
    # not strictly needed, but may help the continuum determination
t2 = time.time()
if ccfconf.continuum:
cont = get_continuum(lam, curspec, curespec, ccfconf=ccfconf)
else:
cont = 1
t3 = time.time()
curivar = 1. / curespec**2
curivar[badmask] = 0
medv = np.median(curspec)
if medv > 0:
cont = np.maximum(1e-2 * medv, cont)
else:
cont = np.maximum(cont, 1)
# normalize the spectrum by continuum and update ivar
c_spec = spec0 / cont
curivar = cont**2 * curivar
c_spec[badmask] = 0
xind = np.searchsorted(lam, ccf_lam) - 1
indsub = (xind >= 0) & (xind <= (len(lam) - 2))
# these are the pixels we can fill
res1 = np.zeros(len(ccf_logl))
res2 = np.zeros(len(ccf_logl))
left_i = xind[indsub]
right_i = left_i + 1
right_w = (ccf_lam[indsub] - lam[left_i]) / (lam[right_i] - lam[left_i])
left_w = 1 - right_w
res1[indsub] = left_w * c_spec[left_i] + right_w * c_spec[right_i]
left_ivar = curivar[left_i]
right_ivar = curivar[right_i]
# prevent division by zero
res2[indsub] = left_ivar * right_ivar / (
left_w**2 * right_ivar + right_w**2 * left_ivar +
((left_ivar * right_ivar) == 0).astype(int))
t4 = time.time()
logging.debug('CCF preprocessing time %f %f %f' %
(t2 - t1, t3 - t2, t4 - t3))
return res1, res2
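The inverse-variance combination used for `res2` can be checked in isolation. This standalone sketch (the helper name is invented for illustration) evaluates the same division-safe expression: a linear interpolation y = wl*yl + wr*yr of independent measurements has variance wl**2/ivar_l + wr**2/ivar_r, and a masked neighbour (ivar = 0) drives the combined ivar to zero instead of causing a division by zero.

```python
import numpy as np


def interp_ivar(left_w, right_w, left_ivar, right_ivar):
    # Inverse variance of y = left_w*y_l + right_w*y_r with independent
    # Gaussian errors, written in the division-safe form used above:
    # the +1 guard only activates when either ivar is zero, in which
    # case the numerator is zero as well.
    return left_ivar * right_ivar / (
        left_w**2 * right_ivar + right_w**2 * left_ivar +
        ((left_ivar * right_ivar) == 0).astype(int))


# equal weights, both ivar=4 -> var = 0.25/4 + 0.25/4 = 0.125 -> ivar = 8
print(interp_ivar(np.array([0.5]), np.array([0.5]),
                  np.array([4.0]), np.array([4.0])))  # [8.]
# a masked (ivar=0) neighbour propagates to zero combined ivar
print(interp_ivar(np.array([0.5]), np.array([0.5]),
                  np.array([0.0]), np.array([4.0])))  # [0.]
```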
def ccf_executor(spec_setup,
ccfconf,
prefix=None,
oprefix=None,
every=10,
vsinis=None,
revision=''):
"""
Prepare the FFT transformations for the CCF
Parameters
-----------
spec_setup: string
The name of the spectroscopic spec_setup
ccfconf: CCFConfig
The CCF configuration object
prefix: string
The input directory where the templates are located
oprefix: string
The output directory
every: integer (optional)
Produce FFTs of every N-th spectrum
vsinis: list (optional)
Produce FFTS of the templates with Vsini from the list.
Could be None (it means no rotation will be added)
revision: str (optional)
The revision of the files/run that will be tagged in the pickle file
Returns
-------
Nothing
"""
with open(('%s/' + make_interpol.SPEC_PKL_NAME) % (prefix, spec_setup),
'rb') as fp:
D = pickle.load(fp)
vec, specs, lam, parnames = D['vec'], D['specs'], D['lam'], D[
'parnames']
del D
nspec = specs.shape[0]
rng = np.random.Generator(np.random.PCG64(44))
inds = rng.permutation(np.arange(nspec))[:(nspec // every)]
specs = specs[inds, :]
vec = vec.T[inds, :]
nspec, lenspec = specs.shape
models, params, vsinis = preprocess_model_list(lam,
np.exp(specs),
vec,
ccfconf,
vsinis=vsinis)
ffts = np.array([np.fft.rfft(x) for x in models])
fft2s = np.array([np.fft.rfft(x**2) for x in models])
savefile = (oprefix + '/' +
get_ccf_pkl_name(spec_setup, ccfconf.continuum))
datsavefile = (oprefix + '/' +
get_ccf_dat_name(spec_setup, ccfconf.continuum))
modsavefile = (oprefix + '/' +
get_ccf_mod_name(spec_setup, ccfconf.continuum))
dHash = {}
dHash['params'] = params
dHash['ccfconf'] = ccfconf
dHash['vsinis'] = vsinis
dHash['parnames'] = parnames
dHash['revision'] = revision
with open(savefile, 'wb') as fp:
pickle.dump(dHash, fp)
np.savez(datsavefile, fft=np.array(ffts), fft2=np.array(fft2s))
np.save(modsavefile, np.array(models))
def to_power_two(i):
return 2**(int(np.ceil(np.log(i) / np.log(2))))
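A quick sanity check of the rounding helper (the definition is duplicated here only so the snippet is self-contained): it rounds a pixel count up to the next power of two, giving FFT-friendly template lengths.

```python
import numpy as np


def to_power_two(i):
    # round i up to the next power of two
    return 2**(int(np.ceil(np.log(i) / np.log(2))))


print(to_power_two(1000))  # 1024
print(to_power_two(1500))  # 2048
```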
def main(args):
parser = argparse.ArgumentParser(
description='Create the Fourier transformed templates')
parser.add_argument('--prefix',
type=str,
help='Location of the input spectra')
parser.add_argument(
'--oprefix',
type=str,
default='templ_data/',
        help='Location where the output products will be located')
parser.add_argument('--setup',
type=str,
help='Name of spectral configuration')
parser.add_argument('--lambda0',
type=float,
help='Starting wavelength in Angstroms',
required=True)
parser.add_argument('--lambda1',
type=float,
                        help='Ending wavelength in Angstroms',
required=True)
parser.add_argument('--nocontinuum',
dest='nocontinuum',
action='store_true')
parser.add_argument('--step',
type=float,
help='Pixel size in angstroms',
required=True)
parser.add_argument('--revision',
type=str,
help='Revision of the data files/run',
required=False,
default='')
parser.add_argument(
'--vsinis',
type=str,
default=None,
help='Comma separated list of vsini values to include in the ccf set')
parser.add_argument('--every',
type=int,
default=30,
help='Subsample the input grid by this amount')
parser.set_defaults(nocontinuum=False)
args = parser.parse_args(args)
npoints = to_power_two(int((args.lambda1 - args.lambda0) / args.step))
if args.nocontinuum:
ccfconf = CCFConfig(logl0=np.log(args.lambda0),
logl1=np.log(args.lambda1),
npoints=npoints,
splinestep=None)
else:
ccfconf = CCFConfig(logl0=np.log(args.lambda0),
logl1=np.log(args.lambda1),
npoints=npoints)
if args.vsinis is not None:
vsinis = [float(_) for _ in args.vsinis.split(',')]
else:
vsinis = None
ccf_executor(args.setup,
ccfconf,
args.prefix,
args.oprefix,
args.every,
vsinis,
revision=args.revision)
if __name__ == '__main__':
main(sys.argv[1:])
|
package com.dsatab.xml;
import org.jdom2.Element;
import java.util.LinkedList;
import java.util.List;
public class DomUtil {
public static Element getChildByTagName(Element parent, String subParentTagName, String tagName) {
if (subParentTagName != null) {
Element subParent = parent.getChild(subParentTagName);
if (subParent != null) {
parent = subParent;
}
}
return parent.getChild(tagName);
}
public static String getChildValue(Element node, String childTagName, String childParamName) {
Element child = node.getChild(childTagName);
if (child != null) {
return child.getAttributeValue(childParamName);
} else {
return null;
}
}
public static List<Element> getChildrenByTagName(Element parent, String subParentTagName, String tagName) {
List<Element> children = new LinkedList<Element>();
List<Element> parentList = null;
if (subParentTagName != null) {
List<Element> subParents = parent.getChildren(subParentTagName);
if (!subParents.isEmpty())
parentList = subParents;
}
if (parentList != null && !parentList.isEmpty()) {
for (Element subParent : parentList) {
children.addAll(subParent.getChildren(tagName));
}
} else {
children = parent.getChildren(tagName);
}
return children;
}
}
|
import pandas as pd


def gen_rematch_val():
    # Recover gold labels for the preliminary public test set by matching
    # its contents against the new training data, then drop the matched
    # ids from the new test set.
train_df = pd.read_csv(from_project_root("data/train_2.csv"))
test_df = pd.read_csv(from_project_root("data/preliminary/test_public.csv"))
val_df = test_df.merge(train_df, on='content') \
.drop(columns=['content_id_y']) \
.rename(columns={'content_id_x': 'content_id'})
val_df.to_csv(from_project_root('data/preliminary/test_gold.csv'), index=False)
test_df = pd.read_csv(from_project_root("data/test_public_2.csv"))
test_df = test_df[~test_df['content_id'].isin(val_df['content_id'])]
test_df.to_csv('data/test_2.csv', index=False) |
/*
* Cancel the timer.
*
* @since 1.2.0.dp6
*
 * This operation makes sense for periodic timers, or when one needs to
 * cancel a regular timer before it is triggered.
*
* @example Cancel periodic timer
* n = 1
* c.run do
* tm = c.create_periodic_timer(500000) do
* c.incr("foo") do
* if n == 5
* tm.cancel
* else
* n += 1
* end
* end
* end
* end
*
 * @return [Timer] self
*/
VALUE
cb_timer_cancel(VALUE self)
{
struct cb_timer_st *tm = DATA_PTR(self);
lcb_timer_destroy(tm->bucket->handle, tm->timer);
return self;
} |
Prostaglandins and cannabis XIV. Tolerance to the stimulatory actions of cannabinoids on arachidonate metabolism.
The stimulation of prostaglandin E2 synthesis by delta 1-tetrahydrocannabinol in cultured cells is rapidly diminished by successive exposures to the drug at 24-hr intervals. Cannabidiol and cannabicyclol, two other constituents of cannabis, also displayed this in vitro tolerance effect. The phenomenon could, in addition, be observed by measuring the release of arachidonic acid from these cells, suggesting that the site of action of the cannabinoids is at one or more of the lipases that are believed to control prostaglandin synthesis under most conditions. Tolerance to cannabinoid action has been reported for a variety of in vivo parameters; thus, this in vitro system exhibits similar behavior and may, therefore, be a good model for studies on the molecular mechanisms involved in tetrahydrocannabinol action. |
Elbow dislocation. Review of current concepts
Simple elbow dislocations represent between 51% and 74% of all elbow dislocations. The elbow is the second most frequently dislocated joint after the shoulder. Elbow dislocations are classified as simple (without a fracture) or complex (with a fracture); the direction of displacement of the cubitus and radius in relation to the humerus is also important. Usually, the injury mechanism indicates the dislocation type. The radiographic evaluation (gold standard) is done with anteroposterior and lateral views of the elbow joint. Treatment is related to the dislocation classification. In a simple dislocation, the objectives are a concentric, closed, and stable reduction of the joint, to allow an early range of movement. Related complications are neurological or vascular injuries, compartment syndrome, heterotopic ossification, chronic instability, and osteoarthritis.
Miniaturized Bandpass Filter with Mixed Electric and Magnetic Coupling Using Hexagonal Stepped Impedance Resonators
Abstract A miniaturized fourth-order direct-coupled bandpass filter with good stopband responses using hexagonal stepped-impedance resonators is presented. Based on the odd- and even-mode equivalent circuits, the resonance characteristics of the hexagonal stepped-impedance resonators with mixed electric/magnetic coupling are investigated. Multiple finite-frequency transmission zeros are realized in the stopband but without introducing either cross-coupling between nonadjacent resonators or source-load coupling between input/output ports. The frequency-dependent coupling matrix of the proposed filter is presented. A new bandpass filter centered at 2.45 GHz with 6.5% fractional bandwidth has been designed and fabricated to verify the validity of the proposed method. The measurement result shows four finite transmission zeros in the stopband, located at 0.98 GHz with 73.14-dB rejection, 2.10 GHz with 48.18-dB rejection, 2.75 GHz with 53.80-dB rejection, 3.12 GHz with 57.08-dB rejection, respectively. The circuit only occupies 15.9 × 9.0 mm2. |
Application of monitoring guidelines to induced seismicity in Italy
Public concern about anthropogenic seismicity in Italy first arose in the aftermath of the deadly M ≈ 6 earthquakes that hit the Emilia-Romagna region (northern Italy) in May 2012. As these events occurred in a (tectonically active) region of oil and gas production and storage, the question was raised whether stress perturbations due to underground industrial activities could have induced or triggered the shocks. Following expert recommendations, in 2014, the Italian Oil & Gas Safety Authority (DGS-UNMIG, Ministry of Economic Development) published guidelines (ILG - Indirizzi e linee guida per il monitoraggio della sismicità, delle deformazioni del suolo e delle pressioni di poro nell’ambito delle attività antropiche), describing regulations regarding hydrocarbon extraction, waste-water injection and gas storage that could also be adapted to other technologies, such as dams, geothermal systems, CO2 storage, and mining. The ILG describe the framework for the different actors involved in monitoring activities, their relationship and responsibilities, the procedure to be followed in case of variations of monitored parameters, the need for in-depth scientific analyses, the definition of different alert levels, their meaning and the parameters to be used to activate such alerts. Four alert levels are defined, the transition among which follows a decision to be taken jointly by the relevant authorities and the industrial operator on the basis of evaluation of several monitored parameters (micro-seismicity, ground deformation, pore pressure) carried out by a scientific-technical agency. Only in the case of liquid reinjection, the alert levels are automatically activated on the basis of exceedance of thresholds for earthquake magnitude and ground shaking, in what is generally known as a Traffic Light System (TLS).
Istituto Nazionale di Geofisica e Vulcanologia has been charged by the Italian oil and gas safety authority (DGS-UNMIG) to apply the ILG in three test cases (two oil extraction and one gas storage plants). The ILG indeed represent a very important and positive innovation, as they constitute official guidelines to coherently regulate monitoring activity on a national scale. While pilot studies are still mostly under way, we may point out merits of the whole framework, and a few possible critical issues requiring special care in the implementation. Attention areas of adjacent reservoirs, possibly licenced to different operators, may overlap, hence making the case for joint monitoring, also in view of the possible interaction between stress changes related to the different reservoirs. The prescribed initial blank-level monitoring stage, aimed at assessing background seismicity, may lose significance in case of nearby active production. Magnitude, a critical parameter used to define a possible step-up in activation levels, has inherent uncertainty and can be evaluated using different scales. A final comment considers the fact that the relevance of a TLS, most frequently used in hydraulic fracturing operations, may not be high in case of triggered tectonic events.
Introduction
Since the inception of the use of hydraulic fracturing for shale gas production, human-induced seismicity has become a subject of increasing interest, especially in the USA and Canada (e.g. Davis and Frohlich 1993; McGarr et al. 2002; Ellsworth 2013). Many studies have since been published on anthropogenic seismicity. A review of human-induced earthquakes on a global scale was given by Foulger et al. (2017); Grigoli et al. (2017) published a European perspective about challenges in monitoring, discrimination, and management of induced seismicity related to underground industrial activities, while Braun et al. (2018b) gave an overview about the state of the art of anthropogenic seismicity in Italy. Doglioni (2018) proposed a classification of induced seismicity, distinguishing four different mechanisms causing earthquakes with anthropogenic origin. Dahm et al. (2013) and Cesca et al. (2013b) gave recommendations for the discrimination of human-related and natural seismicity and proposed a probabilistic approach to discriminate between induced, triggered, and natural earthquakes based on the modelling of depletion-induced stress changes and seismological source parameters (Dahm et al. 2015). Italian geology is not suitable for shale gas exploitation; however, concerns about anthropogenic seismicity in Italy came up after the deadly M_W = 6.2 Emilia-Romagna (northern Italy) earthquake in May 2012 (Scognamiglio et al. 2012; Cesca et al. 2013a). Since this seismic sequence occurred in the vicinity of gas and oil production sites, the question surfaced whether variations in crustal stress accompanying hydrocarbon extraction might have influenced the generation of these earthquakes.
The Italian Department of Civil Protection appointed an international committee (ICHESE, International Commission on Hydrocarbon Exploration and Seismicity in the Emilia region) to analyse all available geological, geophysical, and industrial information and to investigate whether the 2012 earthquake sequence could have been induced or triggered by industrial activities in the area. In their conclusions, the committee argued that only the Cavone oilfield, the Casaglia geothermal field, and the Minerbio gas storage plant were located in the surroundings of the main shocks, and concluded that "it is highly unlikely that the activities of hydrocarbon exploitation and the geothermal activity have produced sufficient stress change to generate an 'induced' seismic event", but that they could not rule out the possibility that operations at the Cavone oilfield "may have contributed to trigger" "the Emilia seismic activity" (ICHESE 2014). The report originated public concern, as well as a debate about the implications of underground technologies, and it hit the news as a suggestion that human activities might indeed have caused deadly earthquakes (Cartlidge, 2014), with a mechanism never before seriously considered in Italy. In 2014, the Italian institute for environmental protection and research (Istituto Superiore per la Protezione e la Ricerca Ambientale, ISPRA) published a report about documented and presumed cases of triggered or induced seismicity in Italy (Fig. 1; ISPRA 2014).
The Ministry for Economic Development (MISE), the Emilia-Romagna Regional Government, the Italian Petroleum and Mining Industry Association (Assomineraria) and the company that owns the oil plant in 2014 promoted monitoring and research on the Cavone site (LabCavone, 2019), in an effort that led to the production of a fluid-geo-mechanical model that allowed to conclude that "the combined effects of fluid production and reinjection from the Cavone field were not a driver for the observed seismicity" (Astiz et al. 2014). Further studies considered it unlikely that the combined effect of oil production and water injection from the main potential culprit, the Cavone oil field, could have influenced the occurrence of the earthquake sequence (e.g. Dahm et al. 2015; Juanes et al. 2016).
In the following years, the Italian government adopted disciplinary resolutions concerning gas and oil prospecting, research and exploitation (i.e. Legislative Decree D.L. 133/2014, Stability Law 2015, Stability Law 2016). The ICHESE (2014) report recommended that all existing and future activities of hydrocarbon exploitation (oil and gas production, waste-water re-injection, gas storage, geothermal energy production) would have to be subject to monitoring for seismicity, ground deformation and pore pressure by high-quality networks. To follow such recommendation, the Italian Oil & Gas Safety Authority (DGS-UNMIG, Ministry of Economic Development) published guidelines (Dialuce et al. 2014) describing regulations for geophysical monitoring of hydrocarbon extraction, waste-water injection and gas storage. (For a more in-depth account of these events see, e.g., Macini et al. 2015; Antoncecchi et al. 2017; Ciccone et al. 2017; Macini et al. 2017.) In this short note, we briefly outline the essence of the monitoring guidelines, describe the experience of their first implementation (up to now, in experimental mode), provide a general picture of the current state of monitoring practices in Italy, mention some possibly sensitive issues, and comment on future perspectives.
Italian guidelines for monitoring effects of industrial activity on the subsurface
The Italian Oil & Gas Safety Authority (DGS-UNMIG, a Directorate General of the Ministry of Economic Development) charged a group of experts to define guidelines following the recommendations of the ICHESE (2014) report, for monitoring seismicity, ground deformation and pore pressure. The resulting document (Indirizzi e linee guida per il monitoraggio della sismicità, delle deformazioni del suolo e delle pressioni di poro nell'ambito delle attività antropiche, ILG, Dialuce et al. 2014) represents the first effort towards systematic, well-structured, public regulations regarding independent geophysical monitoring of underground anthropic activities in oil/gas operations (extraction, waste-water re-injection, storage) that could also be adapted to other technologies, such as dams, geothermal systems, CO2 storage, and mining. A more recent edition of the ILG concerning geothermal energy production has been published in 2016 (Terlizzese 2016).
The ILG describe standards for monitoring relevant geophysical observables; outline roles and responsibilities of the different actors involved in monitoring activities; define procedures to be followed in case of significant changes of the monitored parameters; pinpoint the need for in-depth scientific analyses; and establish four different activation levels, along with their meaning and the criteria to be used to activate such alerts (Dialuce et al. 2014; Macini et al. 2015; Macini et al. 2017). In the case of reinjection of incompressible fluids (i.e. production waste waters), alert levels are automatically attributed following a threshold system controlled by a few seismic parameters: magnitude, peak ground velocity (PGV) and peak ground acceleration (PGA); this is generally known as a Traffic Light System (e.g. Bommer et al. 2006; Baisch et al. 2019).
The monitoring scheme is focussed on a limited three-dimensional volume around the production reservoir. Special attention is requested to possible induced or triggered seismicity and/or ground deformation occurring within the so-called Inner Domain (Dominio Interno, DI), defined by widening the footprint of the oil-water contact in the reservoir by a distance depending on the developed activity (2-3 km for gas storage, 5 km in case of fluid injection within the oilfield; the depth extent is also obtained by adding the same distance to the reservoir depth). An Extended Domain (Dominio Esteso, DE) is also defined as an additional crustal volume of 5-10 km width around the DI, depending on the type of activity and oilfield dimension, where some looser conditions apply. The definition of finite volumes addressed for observation and monitoring assumes that any geo-mechanical or fluid propagation effect outside the Extended Domain should not be directly ascribed to the reservoir exploitation. In analogy with well-known Traffic Light Systems (e.g. Bommer et al. 2006; Baisch et al. 2019), the ILG introduce the so-called activation levels, which correspond to increasing size of seismic phenomena, and to increasing impact of the actions required.
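Purely as an illustration, the Inner/Extended Domain logic can be sketched as distance buffers around the reservoir footprint. The function name, the buffer defaults, and the reduction of the 3-D geometry to a single horizontal distance are assumptions of this toy example, not part of the ILG.

```python
def classify_domain(dist_km, inner_buffer_km=5.0, extended_width_km=10.0):
    # dist_km: horizontal distance of an event from the reservoir footprint.
    if dist_km <= inner_buffer_km:
        return 'DI'       # Inner Domain: strictest monitoring conditions
    if dist_km <= inner_buffer_km + extended_width_km:
        return 'DE'       # Extended Domain: looser conditions apply
    return 'outside'      # not directly ascribed to reservoir exploitation


for d in (2.0, 8.0, 20.0):
    print(d, classify_domain(d))
```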
The ILG define four activation levels (in the case of water re-injection these are set automatically upon exceedance of specific thresholds) and the corresponding actions. Within 10 days from the reduction or stop of activities, conditions to step back to a lower level, or restore background conditions, must be verified and decisions be taken accordingly. All transitions among these levels are regulated by decisions taken jointly by MISE, the relevant local Regional government, and the industrial operator, on the basis of scientific data and interpretation given by the monitoring agency (Struttura Preposta al Monitoraggio, SPM). The ILG guidelines therefore define characteristics and roles of a technical-scientific body with proven skills, entrusted with tasks of acquisition and analysis of data, and technical support to the competent regulatory authorities (Dialuce et al. 2014).
For each hydrocarbon field, the SPM can be designated by the oil and gas safety authority (DGS-UNMIG), chosen among universities or public research centres with proven skills. The SPM will act as an independent technical body supervising the monitoring projects; collecting, processing, and interpreting data; and reporting to the ministerial authority, the local administrative authority, and the industrial operator. Moreover, the SPM will contribute to the definition of specific boundaries of the survey volumes, reference thresholds and parameter values that should be adopted in specific decisional models. With this set-up, the ILG guidelines thus provide a guarantee for the impartiality and independence of the technical-scientific analysis, carried out by a neutral, unbiased SPM, with respect to the owner of the production licence.
In case of reinjection of incompressible fluids (e.g. waste water; the ILG do not include gas storage in this case), the ILG prescribe a strict four-stage TLS, tied to fixed thresholds, and actions directly following exceedance of limits. Focussing on the Inner Domain, the ILG recommend to:
1. Proceed with ordinary activities and regularly report all events with magnitude less than the 'green' magnitude threshold M_GREEN (or corresponding PGV and PGA)
2. Re-analyse earthquake parameters (and all other data) after an event with magnitude exceeding M_GREEN (or corresponding PGV and PGA)
3. Reduce production after an event with magnitude between M_YELLOW and M_ORANGE (or corresponding PGV and PGA)
4. Immediately halt industrial operations in case of events with magnitude exceeding the orange level M_ORANGE (or corresponding PGV and PGA)
The ILG suggest indicative threshold values (M_GREEN = 1.5, M_YELLOW = 2.2, and M_ORANGE = 3.0; see Table 1) and recommend that actual values be explicitly estimated at each individual site in consideration of the specific site characteristics, including the tectonic environment. Preliminary monitoring of background seismicity for at least 1 year before the new activity is started and a period of calibration of the monitoring procedures are also recommended. Observations are supposed to start 1 year before the new industrial activity to allow definition of a blank-level baseline. They must then continue for the entire duration, and last for at least 1 year after the end of industrial operations. The task of the monitoring system is to control seismic parameters, pore pressure, and ground deformation, as well as quantities derived from them by further analyses, e.g. PGA, PGV, number and/or frequency of seismic events, magnitude, or time-space evolution.
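As an illustration only (the actual thresholds are site-specific, and the ILG combine magnitude with PGV/PGA), the four-level scheme with the indicative magnitudes cited in the text can be sketched as a simple classifier:

```python
# Illustrative sketch of the ILG traffic-light logic for waste-water
# re-injection, using only the indicative magnitude thresholds
# (M_GREEN = 1.5, M_YELLOW = 2.2, M_ORANGE = 3.0); real deployments also
# evaluate PGV/PGA and use site-specific values.
M_GREEN, M_YELLOW, M_ORANGE = 1.5, 2.2, 3.0


def activation_level(magnitude):
    if magnitude <= M_GREEN:
        return 'green'    # ordinary activity, routine reporting
    if magnitude <= M_YELLOW:
        return 'yellow'   # re-analyse event and monitoring data
    if magnitude <= M_ORANGE:
        return 'orange'   # reduce production
    return 'red'          # immediately halt operations


for m in (1.0, 2.0, 2.5, 3.4):
    print(m, activation_level(m))
```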
Experimental application of the ILG
According to ISPRA (2014), human activities that can potentially induce earthquakes in Italy are mining, reservoir impoundment, geothermal energy production, gas storage, and hydrocarbon exploitation (extraction of oil and gas and re-injection of waste water). Hydraulic fracturing is not practiced in Italy, because suitable shale gas formations are lacking. Figure 1 shows cases, either postulated or documented, of induced or triggered seismicity compiled by ISPRA (2014). Note, however, that according to Caciagli et al. (2015), the Caviaga earthquakes of 1951 can today hardly be considered a consequence of the underground activities. The white symbols represent additional sites of gas storage (squares) and low-enthalpy geothermal energy production (triangle) from which no seismicity has been reported so far (Braun et al. 2018b). In a three-year experimental phase, the ILG are planned to be tested in at least four different pilot areas (Fig. 1):

(i) Casaglia (Emilia Romagna, northern Italy) for low-enthalpy geothermal energy production
(ii) Minerbio (Emilia Romagna, northern Italy) for gas storage
(iii) Cavone (Emilia Romagna, northern Italy) for hydrocarbon extraction/waste water re-injection
(iv) Val d'Agri (Basilicata, southern Italy) for hydrocarbon extraction/waste water re-injection

While the three cases of Casaglia, Minerbio, and Cavone directly reflect the interest of the ICHESE Commission in specific areas of Emilia-Romagna, implementation of an independent monitoring system for Val d'Agri is regulated by a specific agreement signed by MISE, INGV, and Regione Basilicata with the acceptance of the industrial operator.
For the abovementioned areas (ii) to (iv), DGS-UNMIG nominated INGV as the agency responsible for applying the ILG (Struttura Preposta al Monitoraggio, SPM). First experiences from the recently concluded test phase of the Minerbio concession are described by Carannante et al. (2019, this volume), who emphasize the improvement in earthquake detection capability due to the upgrade of the plant operator's seismographic network, required by the ILG, and to the integration of available stations of the INGV national network. Other relevant cases in Italy, where geophysical monitoring has been running extensively for several years under systematic protocols, include the gas storage at Collalto (Priolo et al. 2015; Moratto et al. 2019; Romano et al. 2019) and the planned geothermal site of Torre Alfina (Braun et al. 2018a).
Discussion on the application of the ILG

Experimental application of the ILG is still ongoing, and results are not yet final (information is available on a dedicated site maintained by the Italian Oil & Gas Safety Authority, DGS-UNMIG, MISE: https://unmig.mise.gov.it/index.php/it/sicurezza/geomonitoraggi; last accessed: August 20, 2019).
In the following, we point out some potentially critical issues in the application of the ILG that require special care in the implementation, concerning (i) the adequacy of the TLS magnitude thresholds and their relevance to Italian cases, (ii) interference between multiple anthropic activities in the same area, (iii) interactions between neighbouring exploitation licences, and (iv) the significance of the one-year period of pre-production background monitoring.
Earthquake magnitudes in a traffic light system
In the traffic light system, alert levels are defined on the basis of exceedance of threshold values for a few seismic parameters (magnitude, PGV, PGA; Dialuce et al. 2014). The ILG provide no constraint on the specific magnitude scale to use, so the local magnitude (M_L) may seem the most appropriate and practical choice. In fact, to estimate the size of a seismic event, the INGV national-scale monitoring system for tectonic earthquakes uses the local magnitude M_L, following the classical definition given by Hutton and Boore (1987). On the other hand, M_W may seem an alternative, more significant estimator of source size. Malagnini and Munafò (2018) show that for the Italian Apennines, M_W = (2/3) M_L + 1.14, meaning that if the proposed ILG threshold values (1.5/2.2/3.0) are understood as relating to M_L, they translate into M_W = 2.1/2.6/3.1. Moment magnitude M_W is only provided a few hours after the event, as additional information, when the seismic moment tensor is available for events of particular interest, but not for smaller events (the size of weak seismic events in Italian volcanic areas is sometimes quantified using yet another magnitude scale, based on duration, M_D). In a rather promising alternative approach, Atkinson et al. (2014) proposed the computation of M_W for weak local events based on the use of response spectra, a formulation later applied to the weak seismicity of North-East Italy by Moratto et al. (2017). All these magnitude types (i.e. M_L, M_D and M_W) are fundamentally different from one another, and obviously not interchangeable. It should be noted that the thresholds given in the ILG are meant to be indicative and have to be fixed in each specific case by the competent actors (monitoring agency, licence operator, ministry, local regional government) in a technical operational document (Documento di Gestione Operativa del Monitoraggio, DGOM), considering the seismotectonic setting of the area.
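The implication of the Malagnini and Munafò (2018) relation for the indicative thresholds can be checked directly. This is a sketch for illustration; the relation is calibrated for the Italian Apennines only.

```python
# Sketch: applying the Malagnini and Munafò (2018) Apennines relation
#   M_W = (2/3) * M_L + 1.14
# to the indicative ILG thresholds, to show how M_L-based thresholds
# translate into moment magnitude.

def ml_to_mw(ml: float) -> float:
    """Convert local magnitude M_L to moment magnitude M_W (Italian Apennines)."""
    return (2.0 / 3.0) * ml + 1.14


for name, ml in (("M_GREEN", 1.5), ("M_YELLOW", 2.2), ("M_ORANGE", 3.0)):
    print(f"{name}: M_L = {ml:.1f} -> M_W = {ml_to_mw(ml):.1f}")
# -> M_W = 2.1, 2.6, 3.1 respectively
```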
However, the risk of potential ambiguities in magnitude, or of subjective choices not supported by a standard regulatory protocol, could be a source of contrasting judgements at the time of a possible alert step-up, and should be carefully addressed. As noted above, INGV uses the Hutton and Boore (1987) relation for M_L at the national scale. However, this distance-dependent attenuation correction, although it carries the advantage of being a long-standing, reliable reference, is not very well suited for the whole Italian region (Di Bona 2016). In particular, seismographic stations closer to the epicentre (~tens of km) systematically provide larger magnitude values than farther stations. On the one hand, then, more reliable specific attenuation terms, particularly calibrated on the short distances that are relevant for the very local scale of oil/gas reservoirs, should be devised for the sites under monitoring. On the other hand, it may not be desirable to publish different magnitudes, depending on the seismographic network used for analysis, for some critical earthquake whose classification may involve stopping industrial operations. As an example, we may mention recent experiences in the geothermal area of Torre Alfina, where seismicity recorded by a local seismic network produced M_L estimates that differ significantly from magnitudes determined by the National Seismic Network (Braun et al. 2018a). Depending on the geometrical distribution of the monitoring seismic network, the application of accurate attenuation laws and correction factors strongly affects the estimation of M_L. This adds to the uncertainty inherent in every magnitude estimate, which may pose issues when using a fixed-threshold activation system such as a TLS.
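To illustrate the role of the distance-dependent attenuation correction, the Hutton and Boore (1987) relation can be written out explicitly. This is a sketch for illustration only, not the INGV operational implementation (which, among other things, averages over stations).

```python
import math

# Sketch of the Hutton and Boore (1987) local-magnitude relation:
#   M_L = log10(A) + 1.110 * log10(r/100) + 0.00189 * (r - 100) + 3.0
# with A the synthetic Wood-Anderson amplitude in mm and r the hypocentral
# distance in km. The two middle terms are the distance-dependent attenuation
# correction discussed in the text.

def ml_hutton_boore(amplitude_mm: float, r_km: float) -> float:
    return (math.log10(amplitude_mm)
            + 1.110 * math.log10(r_km / 100.0)
            + 0.00189 * (r_km - 100.0)
            + 3.0)


# Scale anchor: a 1 mm amplitude at 100 km distance gives M_L = 3.0.
print(ml_hutton_boore(1.0, 100.0))  # 3.0
```

If a regional calibration like this over- or under-corrects at the short distances typical of a local reservoir network, nearby and distant stations will yield systematically different M_L for the same event, which is precisely the bias reported for Italy.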
The very significance of a TLS based on magnitude or peak ground motion (PGV, PGA) thresholds is mainly connected to hydraulic fracturing, or to fluid extraction causing differential sediment compaction (e.g., Bommer et al. 2006; De Waal et al. 2015; Baisch et al. 2019). In fact, the principle of a TLS assumes a gradual increase in earthquake magnitude with the duration of anthropic activities (i.e. the volume of fluid transferred), so that major earthquakes (which may cause damage or concern because of the ground shaking they produce) are preceded by weaker precursors, and hence precautionary operational measures may mitigate the seismic hazard. It has been shown that the actual mechanism may be more complex even in hydraulic stimulations (Baisch et al. 2019), and the application of such a TLS to cases of possibly triggered events (a major source of concern in the tectonically active Italian territory) is not proven conceptually (Bommer et al. 2015). A triggered event is one in which the stress build-up is due to steady, long-term, ungovernable tectonic loading, while human endeavours may only provide a (possibly minute) contribution with an activation stress, triggering slip on a fault. Here, a threshold system on a sequence of earthquakes provides no forecast model. More sophisticated TLS systems, based on statistical forecast models, have also been proposed (Adaptive TLS, e.g. Mignan et al. 2017), but they require an event population numerous enough to allow statistical analyses, available only in hydraulic fracturing cases, where microearthquakes are often plentiful. When triggered events are a concern (quite common in Italy), a physics-based fluid-geomechanical reservoir model, updated with time and coupled with an active-fault model, would perhaps represent a necessary requirement (although modelling of earthquake triggering is complex and highly dependent on the largely unknown local geology and stress state).
Multiple anthropic activities
Different anthropic activities may coexist in the same territory, in close geographical locations. One such example is the Val d'Agri region (Basilicata, southern Italy), one of the largest European onshore oil reservoirs, with reinjection of production water, where the local Basin Authority manages the artificial Pertusillo Lake reservoir, with seasonal water level variations of ± 40 m (Valoroso et al. 2009; Stabile et al. 2014b).
The Pertusillo impoundment (PI in Fig. 2) generates seismicity at the border between the Inner and Extended domains of the Val d'Agri licence, with maximum recorded magnitude M_L = 3.3 in the period from 2001 to 2017, compared to a maximum magnitude M_L = 1.8 recorded in association with waste water injection at the Costa Molina 2 well (CM2 in Fig. 2; Stabile et al. 2014a; Improta et al. 2015). The event with maximum magnitude (red star, northeast of PI in Fig. 2) has likely not been generated by the activities of the hydrocarbon production, but is rather attributable to reservoir-induced seismicity; failure to discriminate between the two sources in the application of a TLS (and the possible resulting limitations to operational industrial activities) may have expensive consequences for the oil and gas production. On the other hand, the ILG have no regulatory power over water reservoirs. Concerning the seismicity observed in the NW part of the monitoring domains, Valoroso et al. (2009) report seismotectonic origins, indicating prevalent normal faults with anti-Apenninic strike.
Neighbouring exploitation licences
Identification of the process responsible for generating a seismic event becomes of particular interest in the case of adjacent production areas operated by different companies. This is, e.g., the case in the Basilicata region (southern Italy), where the hydrocarbon reservoirs of the Val d'Agri and Gorgoglione licences are exploited by different companies (Fig. 3).
While oil and gas extraction in the Val d'Agri licence has been ongoing since 1993 (https://unmig.mise.gov.it/), hydrocarbon production at Gorgoglione is scheduled to start imminently. (Note, in Fig. 2, that most of the seismicity reported below CM2 belongs to the projection of the seismic cluster located SW of PI.) For new licences, the ILG prescribe determination of the blank-level baseline of natural seismicity and ground deformation for at least 1 year prior to the start of production. However, it cannot be excluded that the existing productive activity in Val d'Agri may induce stress variations in adjacent areas, perhaps on the site of the Gorgoglione licence, leading to a biased blank-level baseline. Besides, once hydrocarbon production at Gorgoglione has started, it might become difficult to discriminate whether seismic events or ground deformation are induced by one company or the other, especially for phenomena occurring near the border of the adjacent permits. Although the two reservoirs are geologically independent, identification of possible responsibilities will be a major challenge. Should the inner or external domains of neighbouring concessions overlap, any observed anomaly exceeding the TLS threshold inside the monitoring domains leads, according to the ILG, to the prescribed consequences for all involved permits. It may hence be advisable for neighbouring licences to be dealt with jointly, by the same monitoring agency. Besides, physics-based modelling might become necessary to discriminate between the two reservoirs and to take appropriate decisions.
A similar situation exists in Central Italy (Tuscany, Umbria and Latium) for geothermal energy production licences, whose regulatory authorities are generally the regional governments. After publication of the ILG in 2014, new research permits have been issued, some of which are designated "pilot concessions" and held by MISE under national supervision. The new permits lie inside and around presently productive permits (orange and violet areas in Fig. 4), operated by one single company, in the areas of (1) Larderello-Travale, (2) Mount Amiata, and (3) Latera (red areas in Fig. 4). The ILG (Terlizzese 2016) suggest determining the blank-level baseline for seismicity and ground deformation even for small-footprint permits inside or adjacent to geothermal reservoirs that have been productive for decades.
In the case of the M_W = 3.7 seismic event that occurred at Castelnuovo Val di Cecina on May 01, 2018 (yellow star 1 in Fig. 4), the epicentre is located inside the Inner Domain of the violet as well as the red permits (red area in Fig. 4; http://terremoti.ingv.it/event/1910349). The question whether the event is part of the natural seismicity related to the seismotectonic activity of the area, or may be connected to the geothermal exploitation of adjacent permits, cannot be answered clearly, because both permits exploit the same geothermal reservoir. Seismicity had already been observed in the area long ago, reaching a maximum magnitude of M_W = 4.3 on March 21, 1925; this event, as well as the M_W = 3.6 event of June 24, 1990 (both Rovida et al. 2016), occurred during production, had magnitude and location similar to the 2018 event, but gave no reason to stop operations (as in 2018 the ILG were not mandatory for the geothermal operator).
In the case of adjacent production permits, some critical questions may arise, such as:

- How to regulate the joint use of data owned by different companies?
- How to assess the possible interference between such activities?
- How to make both data streams and information transparent/public?
Hence, it might be advantageous if a single monitoring agency (SPM) were nominated for neighbouring permits, such that one single integrated monitoring system could manage nearby sites, even when they are exploited by different companies.
One-year pre-production monitoring
The ILG prescribe that preliminary seismic monitoring has to be carried out at least 1 year before extraction or underground storage of fluids, as a measure to evaluate natural background seismicity in unperturbed conditions. Thus, only exploitation licences that have recently been granted are subject to this prescription, while all running production that started before implementation of the ILG is obviously exempt. This may be a critical point where new operations are planned in the vicinity of productive areas, and preliminary seismic monitoring may be biased by nearby ongoing anthropic activities. In particular, licences for geothermal energy production often have a small geographical footprint in Tuscany, where the liberalization of the geothermal energy market opened the field to smaller private subjects. Figure 4 shows the orange and violet areas of new concessions located inside or adjacent to geothermal production areas that have been operated for decades. In this case, a truly unbiased preliminary assessment of background seismicity seems impossible. On the other hand, it is important to establish whether new activities generate a significant variation in the (background) seismicity. Given the general seismic hazard of most of the Italian territory, notable tectonic earthquakes may occasionally occur during the one-year pre-production monitoring period, as in the case of the May 30, 2016, M_L = 4.1 event at the proposed Torre Alfina geothermal area (area 3 in Fig. 4; Braun et al. 2018a), where production has never started. While any effort to better understand the seismotectonics of a region where significant stress perturbation may result from planned or ongoing anthropic activities is important and should be pursued as much as possible, we question the regulatory significance of a pre-production seismic survey of just one year's duration.
On the one hand, 1 year is not enough to fully characterize background seismicity; on the other, such a survey may be biased by nearby anthropic activities, or by the occasional occurrence of an earthquake that may hinder future activities. Application of the ILG should explicitly address the consequences of the pre-production survey for the future exploitation, especially where significant seismic events have occurred (e.g. M_L = 4.1 at Torre Alfina).
However, the 1-year recommendation for the pre-activity acquisition represents a compromise between the goal of achieving a fair picture of the background seismicity recorded by the local monitoring network and the need to avoid an overly long and demanding requirement on top of many other technical and administrative duties.
Conclusions
Globally, induced seismicity is receiving increasing interest, both from the scientific community and from the general public, which is more and more concerned about the impact of natural and industrial risks on modern society. A thorough and continued examination of what the "best monitoring practices" could be is therefore timely and important. The ILG represent the first, very significant, attempt in Italy to regulate the monitoring of human activities in the subsoil. The test implementation phase at the pilot sites is therefore an important step that may help improve the protocol, point out critical questions, and clarify many potential site-specific issues.
In Italy, extraction of oil and gas, as well as the production of geothermal energy, often implies that different operators exploit the same reservoirs, or adjacent ones. However, each single industrial company has to fulfil the prescriptions defined by the ILG and set up an autonomous system for continuous monitoring of seismicity, ground deformation, and pore pressure, each controlled by an independent monitoring agency (SPM). We point out that it could be advantageous to manage geographically adjacent sites in a single monitoring system, to enable a better global view and avoid artificial hard borders between processes. In the case of contiguous productive areas (Fig. 4), the ILG's obligation for new exploitation licences to determine the blank-level baseline 1 year before starting production may not be sufficient. For companies with upcoming exploitation licences, it is impossible to determine the natural background seismicity or the ground deformation while excluding any possible bias from ongoing productive activities in neighbouring concessions. Data sharing among operators of contiguous productive concessions is needed for a meaningful assessment.
The definition of the domains and the level transitions in the reaction scheme represent key points for the interpretation of possible phenomena recognized by the monitoring. As a first approximation, such volumes are defined in purely geometrical terms. The complexity of the geophysical context of natural and induced effects would require that the domain volumes be defined individually, on the basis of the analysis of long-term seismicity time series and the geo-mechanical characteristics of the reservoir.
Notably, the ILG define a formal framework for drawing up and implementing efficient monitoring schemes at a very general level, providing broad autonomy (and related responsibility) to the institutions concerned. In particular and special cases, such as the existence of adjacent permits with neighbouring domains, it may be advisable to promote a cooperative, supervised, and coherent planning of activities. In particular, concerning adjacent industrial activities and their potential mutual interference on background seismicity, since the guidelines require one-year-long preliminary seismic observations as a measure for evaluating background seismicity in unperturbed conditions, it would probably be appropriate to widen the meaning of the term "unperturbed" to "natural or already perturbed" seismicity.
Should domains of neighbouring concessions overlap, it is our opinion that the automatic activation of an alert level for both the involved industrial subjects should be handled with great care. It would be desirable that every effort be made to carefully evaluate how to ascribe possible observable variations to one or the other industrial subject, especially in the case of exploitation of independent reservoirs.
The recommendation to determine the blank-level baseline of natural background seismicity and ground deformation during a one-year period prior to exploitation may not be sufficient. The possible absence of any seismicity during this period is not significant; therefore, an evaluation of the previous instrumental, and even historical, seismicity should be included where possible. On the other hand, in case seismicity or ground deformation shows significant variations during the pre-production phase, as, e.g., in May 2016 at Torre Alfina, the ILG do not describe any consequences or constraints concerning the future exploitation.
Further critical points in the implementation of the ILG concern the use of magnitude thresholds for a TLS. Different formulas can be used for the calculation of the magnitude, with no obvious best choice, and uncertainty is inherent in any estimate. The local magnitude M_L, for instance, depends strongly on local attenuation and network configuration (Di Bona 2016; Braun et al. 2018b). Best-practice guidelines are hence challenged to be actionable and precise, but at the same time they should not assign disproportionate meaning and consequence to, say, the decimal digit of a numerical value. For this and the other issues, possible suggestions for improvement may stem from a discussion among all the actors involved in the system.
We may also question the meaning of a traffic light system with thresholds set on magnitude or ground motion parameters (PGV, PGA) for triggered seismicity (a common concern in tectonically active Italy), as it is based on a forecast model that primarily applies to earthquakes induced by hydraulic fracturing (e.g. Baisch et al. 2019).
We conclude with a final comment on the merits of the ILG. As already pointed out, they represent an excellent framework for the coherent monitoring of hydrocarbon and geothermal sites at a national scale, and they have started a wide-scale, authoritative, and cogent debate on the possible impacts of underground energy technologies in Italy. A planned new version of the ILG, revised after the test implementation, will address issues and questions that have arisen during the test implementation period. As a long-term goal, on the basis of the recorded data (seismicity, ground deformation, pore pressure), refined geological and fluid-geomechanical models of the reservoir, combined with ground motion prediction equations, may allow the hazard analysis and evaluation to be updated in quasi-real time, finally enabling the transition from a static to a dynamic risk treatment.
import java.util.Date;

import com.google.gson.annotations.Expose;

/**
 * Simple value object representing a blood sugar measurement.
 *
 * Fields are intentionally public: this class is a plain data holder used for
 * serialization (the {@code @Expose} annotations mark fields for Gson); add
 * accessors if encapsulation is needed.
 */
public class BloodSugarMeasurement {
@Expose
public Date timeOfMeasurement;
@Expose
public Double result;
@Expose
public Boolean hasTemperatureWarning;
@Expose
public Boolean isOutOfBounds;
@Expose
public Boolean otherInformation;
@Expose
public Boolean isBeforeMeal;
@Expose
public Boolean isAfterMeal;
@Expose
public Boolean isControlMeasurement;
} |
package main
// symbolsMap maps asset ticker symbols to their asset-UUID identifiers.
var symbolsMap = map[string]string{
"USDT": "815b0b1a-2764-3736-8faa-42d694fa620a",
"BTC": "c6d0c728-2624-429b-8e0d-d9d19b6592fa",
"BCH": "fd11b6e3-0b87-41f1-a41f-f0e9b49e5bf0",
"EOS": "6cfe566e-4aad-470b-8c9a-2fd35b49c68d",
"ETH": "43d61dcd-e413-450d-80b8-101d5e903357",
"ETC": "2204c1ee-0ea2-4add-bb9a-b3719cfff93a",
"LTC": "76c802a2-7c88-447f-a93e-c29c9e5dd9c8",
"XRP": "23dfb5a5-5d7b-48b6-905f-3970e3176e27",
"SC": "990c4c29-57e9-48f6-9819-7d986ea44985",
"XIN": "c94ac88f-4671-3976-b60a-09064f1811e8",
"CNB": "965e5c6e-434c-3fa9-b780-c50f43cd955c",
}
var (
oceanOneId = "aaff5bef-42fb-4c9f-90e0-29f69176b7d4"
author = "8017d200-78<PASSWORD>"
)
|
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';
import { ArtistsComponent } from './artists/artists/artists.component';
import { CardComponent } from './card/card.component';
import { CardResolver } from './card/services/card.resolver';
import { ChangeComponent } from './changes/change/change.component';
import { ChangesComponent } from './changes/changes/changes.component';
import { CollectionsComponent } from './collections/collections/collections.component';
import { HomeComponent } from './home/home.component';
import { InstitutionsComponent } from './institutions/institutions/institutions.component';
import { ListComponent } from './list/list.component';
import { LoginComponent } from './login/login.component';
import { QuizzComponent } from './quizz/quizz.component';
import { CollectionVisibility, UserRole } from './shared/generated-types';
import { AuthAdminGuard } from './shared/services/auth.admin.guard';
import { AuthGuard } from './shared/services/auth.guard';
import { UserResolver } from './users/services/user.resolver';
import { UserComponent } from './users/user/user.component';
import { UsersComponent } from './users/users/users.component';
export const routes: Routes = [
{
path: 'login',
component: LoginComponent,
},
// Auth required routes
{
path: '',
component: HomeComponent,
canActivate: [AuthGuard],
resolve: {user: UserResolver},
children: [
{
path: '',
component: ListComponent,
data: {
showLogo: true,
forceSearch: true,
},
},
{
path: 'card/new',
component: CardComponent,
data: {showLogo: true},
runGuardsAndResolvers: 'always',
},
{
path: 'card/:cardId',
component: CardComponent,
resolve: {card: CardResolver},
data: {showLogo: true},
},
{
path: 'profile',
component: UserComponent,
},
{
path: 'user',
component: UsersComponent,
canActivate: [AuthAdminGuard],
},
{
path: 'institution',
component: InstitutionsComponent,
canActivate: [AuthGuard],
},
{
path: 'artist',
component: ArtistsComponent,
canActivate: [AuthGuard],
},
{
path: 'notification',
component: ChangesComponent,
},
{
path: 'notification/new/:cardId',
component: ChangeComponent,
},
{
path: 'notification/:changeId',
component: ChangeComponent,
},
{
path: 'quizz',
component: QuizzComponent,
},
{
path: 'collection',
component: CollectionsComponent,
data: {
creationButtonForRoles: false,
editionButtonsForRoles: [
UserRole.administrator,
UserRole.senior,
],
filters: {
isSource: false,
visibilities: [
CollectionVisibility.administrator,
CollectionVisibility.member,
],
},
},
children: [
{
path: ':collectionId',
component: ListComponent,
data: {
showLogo: false,
showDownloadCollectionForRoles: [UserRole.administrator],
},
},
],
},
{
path: 'my-collection',
component: CollectionsComponent,
resolve: {creator: UserResolver},
data: {
creationButtonForRoles: true,
showLogo: false,
showUnclassified: true,
showMyCards: true,
filters: {
isSource: false,
},
},
children: [
{
path: '',
component: ListComponent,
data: {
filter: {
groups: [
{
conditions: [
{
collections: {empty: {not: false}},
},
],
},
],
},
},
},
{
path: 'my-cards',
component: ListComponent,
resolve: {creator: UserResolver},
},
{
path: ':collectionId',
component: ListComponent,
},
],
},
{
path: 'source',
component: CollectionsComponent,
data: {
creationButtonForRoles: [UserRole.administrator],
editionButtonsForRoles: [UserRole.administrator],
filter: {
isSource: true,
},
},
children: [
{
path: ':collectionId',
component: ListComponent,
data: {
showDownloadCollectionForRoles: [UserRole.administrator],
showLogo: false,
filter: {},
},
},
],
},
],
},
];
@NgModule({
imports: [
RouterModule.forRoot(routes, {
paramsInheritanceStrategy: 'emptyOnly',
}),
],
exports: [RouterModule],
})
export class AppRoutingModule {
}
|
package io.anuke.mindustry.io;
import io.anuke.mindustry.core.ThreadHandler.ThreadProvider;
import io.anuke.ucore.entities.Entity;
import io.anuke.ucore.entities.EntityGroup;
import io.anuke.ucore.scene.ui.TextField;
import java.util.Date;
import java.util.Locale;
/**
 * Abstraction over platform-specific functionality (string formatting, native
 * dialogs, rich presence, threading). Default implementations are no-ops or
 * safe defaults, so each platform backend only overrides what it supports.
 */
public abstract class Platform {
public static Platform instance = new Platform() {};
public String format(Date date){return "invalid";}
public String format(int number){return "invalid";}
public void showError(String text){}
public void addDialog(TextField field){
addDialog(field, 16);
}
public void addDialog(TextField field, int maxLength){}
public void updateRPC(){}
public void onGameExit(){}
public void openDonations(){}
public boolean hasDiscord(){return true;}
public void requestWritePerms(){}
public String getLocaleName(Locale locale){
return locale.toString();
}
public boolean canJoinGame(){
return true;
}
public boolean isDebug(){return false;}
/**Must be 8 bytes in length.*/
public byte[] getUUID(){return null;}
public ThreadProvider getThreadProvider(){
return new ThreadProvider() {
@Override public boolean isOnThread() {return true;}
@Override public void sleep(long ms) {}
@Override public void start(Runnable run) {}
@Override public void stop() {}
@Override public void notify(Object object) {}
@Override public void wait(Object object) {}
@Override public <T extends Entity> void switchContainer(EntityGroup<T> group) {}
};
}
}
|
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.flink.table.planner.plan.nodes.exec.batch;
import org.apache.flink.api.dag.Transformation;
import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.planner.codegen.CodeGeneratorContext;
import org.apache.flink.table.planner.codegen.agg.batch.AggWithoutKeysCodeGenerator;
import org.apache.flink.table.planner.codegen.agg.batch.SortAggCodeGenerator;
import org.apache.flink.table.planner.delegation.PlannerBase;
import org.apache.flink.table.planner.plan.nodes.exec.ExecEdge;
import org.apache.flink.table.planner.plan.nodes.exec.ExecNode;
import org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase;
import org.apache.flink.table.planner.plan.nodes.exec.ExecNodeContext;
import org.apache.flink.table.planner.plan.nodes.exec.InputProperty;
import org.apache.flink.table.planner.plan.nodes.exec.SingleTransformationTranslator;
import org.apache.flink.table.planner.plan.nodes.exec.utils.ExecNodeUtil;
import org.apache.flink.table.planner.plan.utils.AggregateInfoList;
import org.apache.flink.table.planner.plan.utils.AggregateUtil;
import org.apache.flink.table.planner.utils.JavaScalaConversionUtil;
import org.apache.flink.table.runtime.generated.GeneratedOperator;
import org.apache.flink.table.runtime.operators.CodeGenOperatorFactory;
import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
import org.apache.flink.table.types.logical.RowType;
import org.apache.calcite.rel.core.AggregateCall;
import java.util.Arrays;
import java.util.Collections;
/** Batch {@link ExecNode} for (global) sort-based aggregate operator. */
public class BatchExecSortAggregate extends ExecNodeBase<RowData>
implements BatchExecNode<RowData>, SingleTransformationTranslator<RowData> {
private final int[] grouping;
private final int[] auxGrouping;
private final AggregateCall[] aggCalls;
private final RowType aggInputRowType;
private final boolean isMerge;
private final boolean isFinal;
public BatchExecSortAggregate(
int[] grouping,
int[] auxGrouping,
AggregateCall[] aggCalls,
RowType aggInputRowType,
boolean isMerge,
boolean isFinal,
InputProperty inputProperty,
RowType outputType,
String description) {
super(
ExecNodeContext.newNodeId(),
ExecNodeContext.newContext(BatchExecSortAggregate.class),
Collections.singletonList(inputProperty),
outputType,
description);
this.grouping = grouping;
this.auxGrouping = auxGrouping;
this.aggCalls = aggCalls;
this.aggInputRowType = aggInputRowType;
this.isMerge = isMerge;
this.isFinal = isFinal;
}
@SuppressWarnings("unchecked")
@Override
protected Transformation<RowData> translateToPlanInternal(PlannerBase planner) {
final ExecEdge inputEdge = getInputEdges().get(0);
final Transformation<RowData> inputTransform =
(Transformation<RowData>) inputEdge.translateToPlan(planner);
final RowType inputRowType = (RowType) inputEdge.getOutputType();
final RowType outputRowType = (RowType) getOutputType();
final CodeGeneratorContext ctx = new CodeGeneratorContext(planner.getTableConfig());
final AggregateInfoList aggInfos =
AggregateUtil.transformToBatchAggregateInfoList(
aggInputRowType,
JavaScalaConversionUtil.toScala(Arrays.asList(aggCalls)),
null,
null);
final GeneratedOperator<OneInputStreamOperator<RowData, RowData>> generatedOperator;
if (grouping.length == 0) {
generatedOperator =
AggWithoutKeysCodeGenerator.genWithoutKeys(
ctx,
planner.getRelBuilder(),
aggInfos,
inputRowType,
outputRowType,
isMerge,
isFinal,
"NoGrouping");
} else {
generatedOperator =
SortAggCodeGenerator.genWithKeys(
ctx,
planner.getRelBuilder(),
aggInfos,
inputRowType,
outputRowType,
grouping,
auxGrouping,
isMerge,
isFinal);
}
return ExecNodeUtil.createOneInputTransformation(
inputTransform,
getOperatorName(planner.getTableConfig()),
getOperatorDescription(planner.getTableConfig()),
new CodeGenOperatorFactory<>(generatedOperator),
InternalTypeInfo.of(outputRowType),
inputTransform.getParallelism());
}
}
|
#include <Rcpp.h>
// [[Rcpp::depends(Rcpp, compoisson)]]
using namespace Rcpp;
// Below is a simple example of exporting a C++ function to R. You can
// source this function into an R session using the Rcpp::sourceCpp
// function (or via the Source button on the editor toolbar)
// For more on using Rcpp click the Help button on the editor toolbar
//
// to compile and test in R.
// Rcpp::sourceCpp('c++/test.cpp', verbose = TRUE, rebuild = TRUE, dryRun = TRUE)
//
// [[Rcpp::export]]
int timesTwo(int x) {
return x * 2;
}
// [[Rcpp::export]]
double indicator(double x){
double res = 0.0;
if(x == 0) res = 1;
return res;
}
double factorial(int x) {
double res = 1.0;
for (double d = 1.0; d <= x; ++d) {
res *= d;
}
return res;
}
double W(double lam, double nu, int sumTo) {
double sum = 0.0;
double factorial = 1.0;
double lamPower = lam;
for (int i = 1; i <= sumTo; ++i) {
factorial *= i;
sum += lamPower * log(factorial) / pow(factorial, nu);
lamPower *= lam;
}
return sum;
}
double Y(double lam, double nu, int sumTo) {
double sum = 0.0;
double factorial = 1.0;
double lamPower = lam;
for (int i = 1; i <= sumTo; ++i) {
factorial *= i;
sum += lamPower * i / pow(factorial, nu);
lamPower *= lam;
}
return sum;
}
double YY(double lam, double nu, int sumTo) {
double sum = 0.0;
double factorial = 1.0;
double lamPower = lam;
for (int i = 1; i <= sumTo; ++i) {
factorial *= i;
sum += lamPower * i * i / pow(factorial, nu);
lamPower *= lam;
}
return sum;
}
double Z(double lam, double nu, int sumTo) {
double sum = 1.0;
double factorial = 1.0;
double lamPower = lam;
for (int i = 1; i <= sumTo; ++i) {
factorial *= i;
sum += lamPower / pow(factorial, nu);
lamPower *= lam;
}
return sum;
}
// [[Rcpp::export]]
NumericVector Indicator(NumericVector xx) {
int size = xx.size();
NumericVector out(size);
for (int i = 0; i < size; ++i) {
out[i] = indicator(xx[i]);
}
return out;
}
// [[Rcpp::export]]
NumericVector W(NumericVector lam, NumericVector nu, int sumTo) {
int size = lam.size();
NumericVector out(size);
for (int i = 0; i < size; ++i) {
out[i] = W(lam[i], nu[i], sumTo);
}
return out;
}
// [[Rcpp::export]]
NumericVector Y(NumericVector lam, NumericVector nu, int sumTo) {
int size = lam.size();
NumericVector out(size);
for (int i = 0; i < size; ++i) {
out[i] = Y(lam[i], nu[i], sumTo);
}
return out;
}
// [[Rcpp::export]]
NumericVector YY(NumericVector lam, NumericVector nu, int sumTo) {
int size = lam.size();
NumericVector out(size);
for (int i = 0; i < size; ++i) {
out[i] = YY(lam[i], nu[i], sumTo);
}
return out;
}
// [[Rcpp::export]]
NumericVector Z(NumericVector lam, NumericVector nu, int sumTo) {
int size = lam.size();
NumericVector out(size);
for (int i = 0; i < size; ++i) {
out[i] = Z(lam[i], nu[i], sumTo);
}
return out;
}
double dcomp0(int y, double lam, double nu, int sumTo) {
if (y < 0) {
return 0.0;
} else {
return pow(lam, y) / (pow(factorial(y), nu) * Z(lam, nu, sumTo));
}
}
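
// dcomp0 above is the COM-Poisson pmf lam^y / ((y!)^nu * Z). A hedged Python
// sketch (hypothetical names, not part of the package) checking that the
// truncated pmf sums to ≈ 1 and collapses to the ordinary Poisson at nu = 1:

```python
import math

def Z(lam, nu, sum_to=100):
    # truncated normalizer, matching the C++ Z(): 1 + sum_{i=1..sum_to} lam^i / (i!)^nu
    return sum(lam**i / math.factorial(i)**nu for i in range(sum_to + 1))

def dcomp(y, lam, nu, sum_to=100):
    # COM-Poisson pmf: lam^y / ((y!)^nu * Z(lam, nu))
    if y < 0:
        return 0.0
    return lam**y / (math.factorial(y)**nu * Z(lam, nu, sum_to))
```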
double dzip0(int y, double lam, double pi) {
if (y < 0) {
return 0.0;
} else {
return R::dpois(y,lam, FALSE) * (1 - pi) + indicator(y) * pi;
}
}
double dtpois0(int y, double lam) {
if (y <= 0) {
return 0.0;
} else {
return R::dpois(y,lam, FALSE) / (1-R::dpois(0,lam, FALSE));
}
}
double dhp0(int y, double lam, double pi) {
if (y < 0) {
return 0.0;
} else {
return dtpois0(y,lam) * (1 - pi) + indicator(y) * pi;
}
}
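
// dzip0 and dhp0 differ in how zeros arise: the zero-inflated model mixes a point
// mass at zero with a full Poisson (which can itself produce zeros), while the
// hurdle model puts all zero mass in pi and models positives with a zero-truncated
// Poisson. A hedged Python sketch of the same formulas (hypothetical names):

```python
import math

def pois(y, lam):
    return math.exp(-lam) * lam**y / math.factorial(y)

def dzip(y, lam, pi):
    # zero-inflated: (1 - pi) * Poisson(y) + pi * 1{y == 0}
    if y < 0:
        return 0.0
    return (1 - pi) * pois(y, lam) + (pi if y == 0 else 0.0)

def dhp(y, lam, pi):
    # hurdle: P(0) = pi exactly; positives follow a zero-truncated Poisson
    if y < 0:
        return 0.0
    if y == 0:
        return pi
    return (1 - pi) * pois(y, lam) / (1 - pois(0, lam))
```

At y = 0 the hurdle mass is exactly pi, whereas the ZIP zero mass is pi + (1 - pi)·e^(-lam).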
NumericVector dcomp1(NumericVector y, double lam, double nu, int sumTo) {
int n = y.size();
NumericVector yvec(n);
for(int i=0; i<y.size();i++){
yvec[i] = pow(lam, y[i]) / (pow(factorial(y[i]), nu) * Z(lam, nu, sumTo));
}
return yvec;
}
NumericVector dhp1(NumericVector y, double lam, double pi,bool logP = false) {
int size = y.size();
NumericVector ans = NumericVector(size);
for (int i = 0; i < y.size(); ++i) {
ans[i] = dhp0(y[i], lam, pi);
}
if (logP) {
return log(ans);
} else {
return ans;
}
}
double dcompoissongauss(double y, double mu, double size, double mug, double sigma) {
double sum = 0.0;
for (int i = 0; i <= 100; ++i) {
sum += R::dnorm(y-i,mug,sigma,0) * dcomp0(i,mu,size,100);
}
return sum;
}
double dkcompoissongauss(double y, double mu, double size, double mug, double sigma, int k) {
double sum = 0.0;
sum = R::dnorm(y-k,mug,sigma,0) * dcomp0(k,mu,size,100);
return sum;
}
void checkInputs(NumericVector lam, NumericVector nu) {
int lamSize = lam.size();
int nuSize = nu.size();
for (int i = 0; i < lamSize; ++i) {
if (lam[i] < 0) {
throw exception("input 'lam' should be >= 0");
}
}
for (int i = 0; i < nuSize; ++i) {
if (nu[i] < 0) {
throw exception("input 'nu' should be >= 0");
}
}
}
//' Zero-inflated Poisson distribution
//' @param y observations
//' @param lam Mean Poisson parameter
//' @param pi Inflation probability
//' @param sumTo integer value for calculating the distribution. Defaults to 100.
//' @param logP Logical. Should the value be on log scale.
//' @keywords zero inflated Poisson
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dZIP(NumericVector y, NumericVector lam, NumericVector pi,
int sumTo = 100, bool logP = false) {
checkInputs(lam, pi);
int size = max(NumericVector::create(y.size(), lam.size(), pi.size()));
y = rep_len(y, size);
lam = rep_len(lam, size);
pi = rep_len(pi, size);
NumericVector ans = NumericVector(size);
for (int i = 0; i < size; ++i) {
ans[i] = dzip0(y[i], lam[i], pi[i]);
}
if (logP) {
return log(ans);
} else {
return ans;
}
}
//' Hurdle Poisson distribution
//' @param y observations
//' @param lam Mean Poisson parameter
//' @param pi Hurdle probability
//' @param logP Logical. Should the value be on log scale.
//' @keywords hurdle Poisson
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dHP(NumericVector y, NumericVector lam, NumericVector pi,
bool logP = false) {
int size = max(NumericVector::create(y.size(), lam.size(), pi.size()));
y = rep_len(y, size);
lam = rep_len(lam, size);
pi = rep_len(pi, size);
NumericVector ans = NumericVector(size);
for (int i = 0; i < size; ++i) {
ans[i] = dhp0(y[i], lam[i], pi[i]);
}
if (logP) {
return log(ans);
} else {
return ans;
}
}
//' Conway Maxwell Poisson distribution
//' @param y observations
//' @param lam Mean Poisson parameter
//' @param nu Variance parameter
//' @param sumTo integer value for calculating the distribution. Defaults to 50.
//' @param logP Logical. Should the value be on log scale.
//' @keywords Conway Maxwell Poisson
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dcomp(NumericVector y, NumericVector lam, NumericVector nu,
int sumTo = 50, bool logP = false) {
checkInputs(lam, nu);
int size = max(NumericVector::create(y.size(), lam.size(), nu.size()));
y = rep_len(y, size);
lam = rep_len(lam, size);
nu = rep_len(nu, size);
NumericVector ans = NumericVector(size);
for (int i = 0; i < size; ++i) {
ans[i] = dcomp0(y[i], lam[i], nu[i], sumTo);
}
if (logP) {
return log(ans);
} else {
return ans;
}
}
//' Zero-truncated Poisson distribution
//' @param y observations
//' @param lam Mean Poisson parameter
//' @param logP Logical. Should the value be on log scale.
//' @keywords zero truncated Poisson
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dtpois(NumericVector y, NumericVector lam,
bool logP = false) {
int size = max(NumericVector::create(y.size(), lam.size()));
y = rep_len(y, size);
lam = rep_len(lam, size);
NumericVector ans = NumericVector(size);
for (int i = 0; i < size; ++i) {
ans[i] = dtpois0(y[i], lam[i]);
}
if (logP) {
return log(ans);
} else {
return ans;
}
}
//' Binomial Gaussian convolution
//' @param x observations
//' @param prob Binomial probability
//' @param size Binomial size
//' @param mu Gaussian mean
//' @param sigma Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Binomial Gaussian convolution
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dBinomGauss(Rcpp::NumericVector x,
Rcpp::NumericVector prob,
Rcpp::NumericVector size,
Rcpp::NumericVector mu,
Rcpp::NumericVector sigma,
bool log = false)
{
int n = x.size();
NumericVector res(n);
  int s = max(size)+1; // cover the full binomial support 0..size
int yy = mu.size();
NumericVector rdp(s);
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy == 1)
res[i] = sum ( dnorm(x[i]-rdp,mu[0],sigma[0]) * Rcpp::dbinom(rdp, size[0], prob[0] ) ); else
res[i] = sum ( dnorm(x[i]-rdp,mu[i],sigma[i]) * Rcpp::dbinom(rdp, size[i], prob[i] ) );
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
}
// doing log transformation for the last
if( log ) return(wrap(Rcpp::log(res)));
return(wrap(res));
}
//' Binomial Lognormal convolution
//' @param x observations
//' @param prob Binomial probability
//' @param size Binomial size
//' @param mu Log Gaussian mean
//' @param sigma Log Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Binomial Log Gaussian convolution
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dBinomLnorm(Rcpp::NumericVector x,
Rcpp::NumericVector prob,
Rcpp::NumericVector size,
Rcpp::NumericVector mu,
Rcpp::NumericVector sigma,
bool log = false)
{
int n = x.size();
NumericVector res(n);
  int s = max(size)+1; // cover the full binomial support 0..size
int yy = mu.size();
NumericVector rdp(s);
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy == 1)
res[i] = sum ( dlnorm(x[i]-rdp,mu[0],sigma[0]) * Rcpp::dbinom(rdp, size[0], prob[0] ) ); else
res[i] = sum ( dlnorm(x[i]-rdp,mu[i],sigma[i]) * Rcpp::dbinom(rdp, size[i], prob[i] ) );
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
}
// doing log transformation for the last
if( log ) return(wrap(Rcpp::log(res)));
return(wrap(res));
}
//' Negative Binomial Gaussian convolution
//' @param x observations
//' @param mu Negative Binomial mean
//' @param size Negative Binomial size
//' @param mug Gaussian mean
//' @param sigmag Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Negative Binomial Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dNbinomGauss(NumericVector x,
NumericVector mu,
NumericVector size,
NumericVector mug,
NumericVector sigmag,
bool log = false
)
{
int n = x.size();
int yy = mu.size();
NumericVector res(n);
int s = 200;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1)
res[i] = sum ( dnorm(x[i]-rdp,mug[0],sigmag[0]) * Rcpp::dnbinom_mu(rdp, size[0], mu[0]) ) ; else
res[i] = sum ( dnorm(x[i]-rdp,mug[i],sigmag[i]) * Rcpp::dnbinom_mu(rdp, size[i], mu[i]) ) ;
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
//' Zero inflated Poisson Gaussian convolution
//' @param x observations
//' @param lam ZIP mean
//' @param pi ZIP probability
//' @param mug Gaussian mean
//' @param sigmag Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords ZIP Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dZIPGauss(NumericVector x,
NumericVector lam,
NumericVector pi,
NumericVector mug,
NumericVector sigmag,
bool log= false)
{
int n = x.size();
int yy = lam.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1){
res[i] = sum ( dnorm(x[i]-rdp,mug[0],sigmag[0]) * (dpois(rdp, lam[0])*(1-pi[0]) + pi[0] * Indicator(rdp)));
} else {
res[i] = sum ( dnorm(x[i]-rdp,mug[i],sigmag[i]) * (dpois(rdp, lam[i])*(1-pi[i]) + pi[i] * Indicator(rdp))) ;
}
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
//' Hurdle Poisson Gaussian convolution
//' @param x observations
//' @param lam Hurdle mean
//' @param pi Hurdle probability
//' @param mug Gaussian mean
//' @param sigmag Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Hurdle Poisson Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dHPGauss(NumericVector x,
NumericVector lam,
NumericVector pi,
NumericVector mug,
NumericVector sigmag,
bool log= false)
{
int n = x.size();
int yy = lam.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1){
res[i] = sum ( dnorm(x[i]-rdp,mug[0],sigmag[0]) * (dhp1(rdp, lam[0],pi[0])));
} else {
res[i] = sum ( dnorm(x[i]-rdp,mug[i],sigmag[i]) * (dhp1(rdp, lam[i],pi[i])));
}
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
//' Negative Binomial Log Gaussian convolution
//' @param x observations
//' @param mu Negative Binomial mean
//' @param size Negative Binomial size
//' @param mug Log Gaussian mean
//' @param sigmag Log Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Negative Binomial Log Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dNbinomLnorm(NumericVector x,
NumericVector mu,
NumericVector size,
NumericVector mug,
NumericVector sigmag,
bool log = false)
{
int n = x.size();
int yy = mu.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1)
res[i] = sum ( dlnorm(x[i]-rdp,mug[0],sigmag[0]) * Rcpp::dnbinom_mu(rdp, size[0], mu[0]) ) ; else
res[i] = sum ( dlnorm(x[i]-rdp,mug[i],sigmag[i]) * Rcpp::dnbinom_mu(rdp, size[i], mu[i]) ) ;
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
//' Conway Maxwell Poisson Gaussian convolution
//' @param x observations
//' @param mu CoMP mean
//' @param size CoMP size
//' @param mug Gaussian mean
//' @param sigmag Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords CoMP Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dCoMPoissonGauss2(NumericVector x,
NumericVector mu,
NumericVector size,
NumericVector mug,
NumericVector sigmag,
bool log = false
)
{
int n = x.size();
int yy = mu.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1) {
res[i] = sum ( dnorm(x[i]-rdp,mug[0],sigmag[0]) * dcomp1(rdp, mu[0], size[0], 100) ) ;
}
else{
res[i] = sum ( dnorm(x[i]-rdp,mug[i],sigmag[i]) * dcomp1(rdp, mu[i], size[i],100) ) ;
}
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
//' Poisson Gaussian convolution
//' @param x observations
//' @param lam Poisson mean
//' @param mug Gaussian mean
//' @param sigmag Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Poisson Gaussian convolution
//' @export
// [[Rcpp::export]]
NumericVector dPoisGauss(NumericVector x,
NumericVector lam,
NumericVector mug,
NumericVector sigmag,
bool log= false)
{
int n = x.size();
int yy = lam.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1)
res[i] = sum ( dnorm(x[i]-rdp,mug[0],sigmag[0]) * dpois(rdp, lam[0]) ) ; else
res[i] = sum ( dnorm(x[i]-rdp,mug[i],sigmag[i]) * dpois(rdp, lam[i]) ) ;
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100 ) )
res[i] = pow(10,100);
} // for loop
if (log){
res = Rcpp::log(res);
}
return(wrap(res));
}
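
// All the *Gauss functions above follow one pattern: a discrete–continuous
// convolution f(x) = sum_k N(x - k; mu, sigma) * P(k), truncated at k < 100.
// A hedged plain-Python sketch of the Poisson–Gaussian case (hypothetical names;
// sigma is the Gaussian standard deviation, as in R's dnorm):

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def pois_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def dpois_gauss(x, lam, mu, sigma, trunc=100):
    # sum over the latent count k of Normal(x - k) weighted by Poisson(k)
    return sum(norm_pdf(x - k, mu, sigma) * pois_pmf(k, lam) for k in range(trunc))
```

Integrating the result over a wide grid should give ≈ 1, since it is a proper density.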
//' Poisson Log Gaussian convolution
//' @param x observations
//' @param lam Poisson mean
//' @param mu Log Gaussian mean
//' @param sigma Log Gaussian standard deviation
//' @param log Logical. Should the value be on log scale.
//' @keywords Poisson Log Gaussian convolution
//' @export
// [[Rcpp::export]]
Rcpp::NumericVector dPoisLnorm(Rcpp::NumericVector x,
Rcpp::NumericVector lam,
Rcpp::NumericVector mu,
Rcpp::NumericVector sigma,
bool log= false)
{
int n = x.size();
NumericVector res(n);
int s = 100;
int yy = lam.size();
NumericVector rdp(s);
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy == 1)
res[i] = sum ( dlnorm(x[i]-rdp,mu[0],sigma[0]) * Rcpp::dpois(rdp, lam[0]) ); else
res[i] = sum ( dlnorm(x[i]-rdp,mu[i],sigma[i]) * Rcpp::dpois(rdp, lam[i]) );
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
}
// doing log transformation for the last
if( log ) return(wrap(Rcpp::log(res)));
return(wrap(res));
}
// [[Rcpp::export]]
NumericVector my_dnorm( NumericVector x, NumericVector means, NumericVector sds){
int n = x.size() ;
NumericVector res(n) ;
for( int i=0; i<n; i++) res[i] = R::dnorm( x[i], means[i], sds[i],0) ;
return res ;
}
// [[Rcpp::export]]
NumericVector dkNbinomGauss(NumericVector x,
NumericVector mu,
NumericVector size,
NumericVector mug,
NumericVector sigmag,
int k)
{
int n = x.size();
int yy = mu.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1)
res[i] = (dnorm(x[i]-rdp,mug[0],sigmag[0]) * Rcpp::dnbinom_mu(rdp, size[0], mu[0] ))[k]; else
res[i] = (dnorm(x[i]-rdp,mug[i],sigmag[i]) * Rcpp::dnbinom_mu(rdp, size[i], mu[i] ))[k];
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100) )
res[i] = pow(10, 100);
} // for loop
return(wrap(res));
}
// [[Rcpp::export]]
NumericVector dkNbinomLnorm(NumericVector x,
NumericVector mu,
NumericVector size,
NumericVector mug,
NumericVector sigmag,
int k)
{
int n = x.size();
int yy = mu.size();
NumericVector res(n);
int s = 100;
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
if(yy==1)
res[i] = (dlnorm(x[i]-rdp,mug[0],sigmag[0]) * Rcpp::dnbinom_mu(rdp, size[0], mu[0] ))[k]; else
res[i] = (dlnorm(x[i]-rdp,mug[i],sigmag[i]) * Rcpp::dnbinom_mu(rdp, size[i], mu[i] ))[k];
if( res[i] < pow(10, -300) )
res[i] = pow(10, -300);
if( res[i] > pow(10, 100) )
res[i] = pow(10, 100);
} // for loop
return(wrap(res));
}
// [[Rcpp::export]]
NumericMatrix dkBinomGauss(NumericVector x,
NumericVector size,
NumericVector prob,
NumericVector mug,
NumericVector sigmag
)
{
int n = x.size();
int yy = size.size();
int s = max(size)+1;
NumericMatrix res(n,s);
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
for(int j=0; j < s; j++){
if(yy==1)
res(i,j) = R::dnorm(x[i]-rdp[j],mug[0],sigmag[0],0) * R::dbinom(rdp[j], size[0], prob[0],0); else
res(i,j) = R::dnorm(x[i]-rdp[j],mug[i],sigmag[i],0) * R::dbinom(rdp[j], size[i], prob[i],0);
if( res(i,j) < pow(10, -300) )
res(i,j) = pow(10, -300);
if( res(i,j) > pow(10, 100) )
res(i,j) = pow(10, 100);
}
} // for loop
return(wrap(res));
}
// [[Rcpp::export]]
NumericMatrix dkPoisGauss(NumericVector x,
NumericVector lam,
NumericVector mug,
NumericVector sigmag
)
{
int n = x.size();
int yy = lam.size();
int s = 100;
NumericMatrix res(n,s);
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
for(int j=0; j < s; j++){
if(yy==1)
res(i,j) = R::dnorm(x[i]-rdp[j],mug[0],sigmag[0],0) * R::dpois(rdp[j], lam[0],0); else
res(i,j) = R::dnorm(x[i]-rdp[j],mug[i],sigmag[i],0) * R::dpois(rdp[j], lam[i],0);
if( res(i,j) < pow(10, -300) )
res(i,j) = pow(10, -300);
if( res(i,j) > pow(10, 100) )
res(i,j) = pow(10, 100);
}
} // for loop
return(wrap(res));
}
// [[Rcpp::export]]
NumericMatrix dkBinomLnorm(NumericVector x,
NumericVector size,
NumericVector prob,
NumericVector mug,
NumericVector sigmag
)
{
int n = x.size();
int yy = size.size();
int s = max(size)+1;
NumericMatrix res(n,s);
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
for(int j=0; j < s; j++){
if(yy==1)
res(i,j) = R::dlnorm(x[i]-rdp[j],mug[0],sigmag[0],0) * R::dbinom(rdp[j], size[0], prob[0],0); else
res(i,j) = R::dlnorm(x[i]-rdp[j],mug[i],sigmag[i],0) * R::dbinom(rdp[j], size[i], prob[i],0);
if( res(i,j) < pow(10, -300) )
res(i,j) = pow(10, -300);
if( res(i,j) > pow(10, 100) )
res(i,j) = pow(10, 100);
}
} // for loop
return(wrap(res));
}
// [[Rcpp::export]]
NumericMatrix dkPoisLnorm(NumericVector x,
NumericVector lam,
NumericVector mug,
NumericVector sigmag
)
{
int n = x.size();
int yy = lam.size();
int s = 100;
NumericMatrix res(n,s);
NumericVector rdp(s);
// rep
for( int i=0; i < s; i++ )
{ rdp[i] = i; }
// we use Rcpp sugar expression doing wrapping
for(int i=0; i < x.size(); i++)
{
for(int j=0; j < s; j++){
if(yy==1)
res(i,j) = R::dlnorm(x[i]-rdp[j],mug[0],sigmag[0],0) * R::dpois(rdp[j], lam[0], 0); else
res(i,j) = R::dlnorm(x[i]-rdp[j],mug[i],sigmag[i],0) * R::dpois(rdp[j], lam[i], 0);
if( res(i,j) < pow(10, -300) )
res(i,j) = pow(10, -300);
if( res(i,j) > pow(10, 100) )
res(i,j) = pow(10, 100);
}
} // for loop
return(wrap(res));
}
|
//
// FileSystemObserver.h
// XCAssetGenerator
//
// Created by Bader on 7/1/15.
// Copyright (c) 2015 <NAME>. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <CoreServices/CoreServices.h> // for FSEventStreamRef
@protocol FileSystemObserverDelegate <NSObject>
@required
- (void)FileSystemDirectoryDeleted:(NSString *)path;
- (void)FileSystemDirectory:(NSString *)oldPath renamedTo:(NSString *)newPath;
- (void)FileSystemDirectoryError:(NSError *)error;
- (void)FileSystemDirectoryContentChanged:(NSString *)root;
@end
@interface FileSystemObserver : NSObject
@property FSEventStreamRef rootStream;
@property FSEventStreamRef contentStream;
@property BOOL ignoreHiddenItems;
- (void)addObserver:(id<FileSystemObserverDelegate>)observer forFileSystemPath:(NSString *)path ignoreContents:(BOOL)ignore;
- (void)addObserver:(id<FileSystemObserverDelegate>)observer forFileSystemPaths:(NSArray *)paths ignoreContents:(BOOL)ignore;
- (void)replacePathForObserversFrom:(NSString *)originalPath To:(NSString *)newPath;
- (void)removeObserverForPath:(NSString *)path;
- (void)removeObserverForPath:(NSString *)path restartStream:(BOOL)restart;
- (void)removeAllObservers;
- (void)stopStream;
@end
|
// python/jittor/src/numpy_func.h (from Exusial/jittor)
// ***************************************************************
// Copyright (c) 2021 Jittor. All Rights Reserved.
// Maintainers:
// <NAME> <<EMAIL>>
// <NAME> <<EMAIL>>.
//
// This file is subject to the terms and conditions defined in
// file 'LICENSE.txt', which is part of this source code package.
// ***************************************************************
#pragma once
#include <functional>
#include "common.h"
#include "var_holder.h"
#include "ops/array_op.h"
namespace jittor {
struct NumpyResult;
struct NumpyFunc {
typedef NumpyResult R;
std::function<void(R*)> callback;
std::function<void()> deleter;
std::function<void()> inc_ref;
NumpyFunc() = default;
NumpyFunc(NumpyFunc&& other) : callback(other.callback), deleter(other.deleter), inc_ref(other.inc_ref) {
other.callback = nullptr;
other.deleter = nullptr;
other.inc_ref = nullptr;
};
NumpyFunc(const NumpyFunc& other) : callback(other.callback), deleter(other.deleter), inc_ref(other.inc_ref) {
inc_ref();
};
NumpyFunc(std::function<void(R*)>&& callback) : callback(move(callback)) {}
NumpyFunc(std::function<void(R*)>&& callback, std::function<void()>&& deleter)
: callback(move(callback)), deleter(move(deleter)) {};
NumpyFunc(std::function<void(R*)>&& callback, std::function<void()>&& deleter, std::function<void()>&& inc_ref)
: callback(move(callback)), deleter(move(deleter)), inc_ref(move(inc_ref)) {};
~NumpyFunc() {
if (deleter) {
deleter();
}
}
void operator =(NumpyFunc&& other) { this->~NumpyFunc(); new (this) NumpyFunc(move(other)); }
};
struct NumpyResult {
map<string, vector<DataView>> varrays;
map<string, int> ints;
map<string, DataView> arrays;
};
} // jittor |
/**
* Adds the specified vector's coordinates to this vector's coordinates, but
* stores the resulting values in the specified <i>cache</i> vector.
*
* @param vector
* The vector whose x, y, and z values will be added to this
* vector's coordinates.
* @param cache
* The vector in which to store the computed values.
* @return The <i>cache</i> vector.
*/
public Vector3f add(Vector3f vector, Vector3f cache) {
if (cache != null) {
cache.x = x + vector.x;
cache.y = y + vector.y;
cache.z = z + vector.z;
} else {
cache = new Vector3f(x + vector.x, y + vector.y, z + vector.z);
}
return cache;
} |
// include/fileIO/TextureAtlas.hpp
#ifndef TNT_ASSETS_TEXTURE_ATLAS_HPP
#define TNT_ASSETS_TEXTURE_ATLAS_HPP
#include <string_view>

#include "math/Rectangle.hpp"
#include "core/Window.hpp"
namespace tnt
{
class TextureAtlas final
{
public:
TextureAtlas(Window const &win, std::string_view file_, Rectangle const &area) noexcept;
void Draw(Window const &win, Vector const &pos, Vector const& scale = {1.f, 1.f}, float angle = 0.f) noexcept;
private:
std::string_view filename;
Rectangle clip;
};
} // namespace tnt
#endif //!TNT_ASSETS_TEXTURE_ATLAS_HPP |
//= Reduce FIND/BIND/CHK requirements to just those in selected operator.
// triggers the switchover from FIND/BIND to CHK mode when satisfied, or early CHK termination
// returns positive if new cond is different from original (minimal generation for FIND/BIND)
int jhcAliaDir::reduce_cond (const jhcAliaOp *op, const jhcBindings& match)
{
const jhcAliaCore *core = step->Core();
const jhcActionTree *wmem = &(core->atree);
const jhcNetNode *item, *a;
int i, j, na, ni = full.NumItems();
cond.CopyBind(*(op->Pattern()), match);
for (i = 0; i < ni; i++)
{
item = full.Item(i);
na = item->NumArgs();
for (j = 0; j < na; j++)
{
a = item->Arg(j);
if (!full.InDesc(a))
cond.RemItem(a);
}
}
for (i = 0; i < ni; i++)
{
item = full.Item(i);
if (!cond.InDesc(item))
return(wmem->Version() + 1);
}
return 0;
} |
from django.contrib import admin
from posts.models import (
Profile, Tag, Post
)
@admin.register(Profile)
class ProfileAdmin(admin.ModelAdmin):
list_display = [
"handle",
"display_name",
"bio",
]
@admin.register(Tag)
class TagAdmin(admin.ModelAdmin):
list_display = [
"name",
"slug",
]
@admin.register(Post)
class PostAdmin(admin.ModelAdmin):
list_display = [
"profile",
"uuid",
"timestamp",
"title",
"content",
]
|
AN ANALYSIS OF WORD STRESS IN THE NEWS READING VIDEOS OF PUBLIC RELATIONS STUDENTS
This study aims to analyze word stress and the factors that cause differences in pronunciation in the news reading videos of students majoring in Public Relations. Using a descriptive method, the study focuses on how students pronounce word stress, comparing their pronunciation with Standard American English (SAE). Through the transcription of the news reading, 30 words containing word stress were identified. Working with 10 selected informants, the results reveal that the students pronounce 16 words with correct stress, 11 words with stress on a different syllable, and 1 word without any stress. The rest of the data shows diverse answers across informants but still highlights mispronunciation. The factors that cause difficulties in pronouncing English word stress are interference from the mother tongue, the inconsistency of some sounds in English, the differences in phonological rules between English and Indonesian, and, lastly, the lack of exposure to English as a medium of communication.

Keywords: pronunciation; word stress; syllables.
<reponame>lorenzogm/react-nextjs-crystallize-portfolio-starter
import React, { useState } from 'react'
import Img from '@crystallize/react-image'
import ContentTransformer from 'themes/crystallize/ui/content-transformer'
import { screen } from 'themes/crystallize/ui'
import Layout from 'themes/crystallize/components/layout'
import ShapeComponents from 'themes/crystallize/components/shape/components'
import Topics from 'themes/crystallize/components/topics'
import VariantSelector from './VariantSelector'
import {
Outer,
Sections,
Media,
MediaInner,
Name,
Info,
Summary,
Content,
Specs,
Description,
} from './ProductTemplate.styles'
export default function ProductTemplate({ product, preview }) {
// Set the selected variant to the default
const [selectedVariant, setSelectedVariant] = useState(
product.variants.find((v) => v.isDefault),
)
function onVariantChange(variant) {
setSelectedVariant(variant)
}
const summaryComponent = product.components.find((c) => c.name === 'Summary')
const descriptionComponent = product.components.find(
(c) => c.name === 'Description',
)
const specs = product.components.find((c) => c.name === 'Specs')
const componentsRest = product.components?.filter(
(c) => !['Summary', 'Description', 'Specs'].includes(c.name),
)
return (
<Layout title={product.name} preview={preview}>
<Outer>
<Sections>
<Media>
<MediaInner>
<Img
{...selectedVariant.image}
sizes={`(max-width: ${screen.sm}px) 400px, 60vw`}
alt={product.name}
/>
</MediaInner>
</Media>
<Info>
<Name>{product.name}</Name>
{summaryComponent && (
<Summary>
<ContentTransformer {...summaryComponent?.content?.json} />
</Summary>
)}
{product.variants?.length > 1 && (
<VariantSelector
variants={product.variants}
selectedVariant={selectedVariant}
onVariantChange={onVariantChange}
/>
)}
</Info>
</Sections>
<Content>
{descriptionComponent && (
<Description>
<ShapeComponents
className="description"
components={[descriptionComponent]}
/>
</Description>
)}
{specs && (
<Specs>
<ShapeComponents components={[specs]} />
</Specs>
)}
</Content>
      {product?.topics?.length > 0 && <Topics topicMaps={product.topics} />}
<ShapeComponents components={componentsRest} />
</Outer>
</Layout>
)
}
|
/**
* Calculate the square of a 3-vector's magnitude.
* This is cheaper than calculating the magnitude, when you
* only need to make comparisons between two vectors.
*/
float magnitude2( v3 vec ) {
return ( vec.v[ 0 ] * vec.v[ 0 ] +
vec.v[ 1 ] * vec.v[ 1 ] +
vec.v[ 2 ] * vec.v[ 2 ] );
} |
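The comparison trick described in the comment above can be made concrete with a small sketch. Note that the `v3` struct definition below is an assumption (a bare array of three floats), since the actual type declaration is not shown here.

```c
#include <assert.h>

/* Assumed layout of the v3 type used by magnitude2(). */
typedef struct { float v[3]; } v3;

/* Squared magnitude: yields the same ordering as the true
   magnitude, but without a sqrt() call. */
float magnitude2( v3 vec ) {
    return ( vec.v[ 0 ] * vec.v[ 0 ] +
             vec.v[ 1 ] * vec.v[ 1 ] +
             vec.v[ 2 ] * vec.v[ 2 ] );
}

/* Pick the shorter of two vectors without taking a square root. */
int a_is_shorter( v3 a, v3 b ) {
    return magnitude2( a ) < magnitude2( b );
}
```

Because the square root is monotonic on non-negative inputs, comparing squared magnitudes is equivalent to comparing magnitudes, saving one library call per comparison.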
from dataclasses import dataclass
import logging
from typing import Union, Tuple, Sequence, Any, Optional # noqa: F401
import numpy as np
import wiggin_mito.forces
import wiggin
from wiggin.core import SimAction
import looplib
import looplib.looptools
import polychrom
import polychrom.forces
logging.basicConfig(level=logging.INFO)
@dataclass
class BackboneTethering(SimAction):
k: float=15
_reads_shared = ['backbone']
def run_init(self, sim):
sim.add_force(
polychrom.forces.tether_particles(
sim_object=sim,
particles=self._shared["backbone"],
k=self.k,
positions="current",
name="wm_tether_backbone",
)
)
@dataclass
class BackboneAngularTethering(SimAction):
angle_wiggle: float = np.pi / 16
_reads_shared = ['backbone']
def run_init(self, sim):
sim.add_force(
wiggin_mito.forces.angular_tether_particles(
sim_object=sim,
particles=self._shared["backbone"],
angle_wiggle=self.angle_wiggle,
angles="current",
name="wm_tether_backbone_angle",
)
)
@dataclass
class RootLoopBaseAngularTethering(SimAction):
angle_wiggle: float = np.pi / 16
_reads_shared = ['loops']
def run_init(self, sim):
loops = self._shared["loops"]
root_loops = loops[looplib.looptools.get_roots(loops)]
root_loop_particles = sorted(np.unique(root_loops))
sim.add_force(
wiggin_mito.forces.angular_tether_particles(
sim_object=sim,
particles=root_loop_particles,
angle_wiggle=self.angle_wiggle,
angles="current",
name="wm_tether_root_loops_angle",
)
)
@dataclass
class TetherTips(SimAction):
k: Union[float, Tuple[float, float, float]] = (0, 0, 5)
particles: Sequence[int] = (0, -1)
positions: Any = "current"
def run_init(self, sim):
sim.add_force(
polychrom.forces.tether_particles(
sim_object=sim,
particles=self.particles,
k=self.k,
positions=self.positions,
name='wm_tether_tips'
)
)
@dataclass
class LoopBrushCylinderCompression(SimAction):
k: Optional[float] = 1.0
z_min: Optional[Union[float, str]] = None
z_max: Optional[Union[float, str]] = None
r: Optional[float] = None
per_particle_volume: Optional[float] = 1.25 * 1.25 * 1.25
_reads_shared = ['N', 'backbone', 'initial_conformation']
def configure(self):
if (self.z_min is None) != (self.z_max is None):
raise ValueError(
"Both z_min and z_max have to be either specified or left as None."
)
coords = self._shared["initial_conformation"]
        if self.z_min is None:
            self.z_min = coords[:, 2].min()
        elif self.z_min == "bb":
            self.z_min = coords[self._shared["backbone"]][:, 2].min()
        if self.z_max is None:
            self.z_max = coords[:, 2].max()
        elif self.z_max == "bb":
            self.z_max = coords[self._shared["backbone"]][:, 2].max()
if (self.r is not None) and (
self.per_particle_volume is not None
):
raise ValueError("Please specify either r or per_particle_volume.")
elif (self.r is None) and (
self.per_particle_volume is None
):
coords = self._shared["initial_conformation"]
self.r = ((coords[:, :2] ** 2).sum(axis=1) ** 0.5).max()
# elif (self.r is None) and (
# self.per_particle_volume is not None
# ):
# self.r = np.sqrt(
# self._shared["N"]
# * self.per_particle_volume
# / (self.z_max - self.z_min)
# / np.pi
# )
return {}
def run_init(self, sim):
sim.add_force(
wiggin_mito.forces.cylindrical_confinement(
sim_object=sim,
r=self.r,
per_particle_volume=self.per_particle_volume,
bottom=self.z_min,
top=self.z_max,
k=self.k,
name='wm_cylindrical_confinement'
)
)
@dataclass
class DynamicLoopBrushCylinderCompression(SimAction):
ts_axial_compression: Optional[Tuple[int, int]] = (100, 200)
ts_volume_compression: Optional[Tuple[int, int]] = (100, 200)
per_particle_volume: Optional[float] = 1.25 * 1.25 * 1.25
k_confinement: Optional[float] = 1.0
axial_length_final: Optional[float] = None
_reads_shared = ['N', 'initial_conformation']
def spawn_actions(self):
new_actions = []
N = self._shared["N"]
coords = self._shared["initial_conformation"]
bottom_init = coords[:, 2].min()
top_init = coords[:, 2].max()
r_init = ((coords[:, :2] ** 2).sum(axis=1) ** 0.5).max()
ppv_init = np.pi * r_init * r_init * (top_init - bottom_init) / N
axial_length_final = self.axial_length_final
new_actions.append(
LoopBrushCylinderCompression(
k=self.k_confinement,
z_min=bottom_init,
z_max=top_init,
per_particle_volume=ppv_init
)
)
new_actions.append(TetherTips())
new_actions.append(
wiggin.actions.sim.UpdateGlobalParameter(
force='wm_cylindrical_confinement',
param='top',
ts=self.ts_axial_compression,
vals=[top_init, bottom_init + axial_length_final],
#vals=[bottom_init, (bottom_init + top_init) / 2 + axial_length_final / 2]
).rename('UpdateConfinementTop')
)
new_actions.append(
wiggin.actions.sim.UpdatePerParticleParameter(
force='wm_tether_tips',
parameter_name='z0',
term_index=1,
ts=self.ts_axial_compression,
vals=[top_init, bottom_init+axial_length_final],
).rename('UpdateTopTipTethering')
)
new_actions.append(
wiggin.actions.sim.UpdateGlobalParameter(
force='wm_cylindrical_confinement',
param='ppv',
ts=self.ts_volume_compression,
vals=[ppv_init, self.per_particle_volume],
power=1/4,
).rename('UpdateConfinementVolume')
)
return new_actions
|
from LUIObject import LUIObject
from LUILayouts import LUIHorizontalStretchedLayout
from LUILabel import LUILabel
from LUIInitialState import LUIInitialState
__all__ = ["LUIButton"]
class LUIButton(LUIObject):
""" Simple button, containing three sprites and a label. """
def __init__(self, text=u"Button", template="ButtonDefault", **kwargs):
""" Constructs a new button. The template controls which sprites to use.
If the template is "ButtonDefault" for example, the sprites
"ButtonDefault_Left", "ButtonDefault" and "ButtonDefault_Right" will
be used. The sprites used when the button is pressed should be named
"ButtonDefaultFocus_Left" and so on then.
If an explicit width is set on the button, the button will stick to
that width, otherwise it will automatically resize to fit the label """
LUIObject.__init__(self, x=0, y=0, solid=True)
self._template = template
self._layout = LUIHorizontalStretchedLayout(
parent=self, prefix=self._template, width="100%")
self._label = LUILabel(parent=self, text=text)
self._label.z_offset = 1
self._label.center_vertical = True
self._label.margin = 0, 20, 0, 20
self.margin.left = -1
LUIInitialState.init(self, kwargs)
@property
def text(self):
""" Returns the current label text of the button """
return self._label.text
@text.setter
def text(self, text):
""" Sets the label text of the button """
self._label.text = text
def on_mousedown(self, event):
""" Internal on_mousedown handler. Do not override """
self._layout.prefix = self._template + "Focus"
self._label.margin.top = 1
def on_mouseup(self, event):
""" Internal on_mouseup handler. Do not override """
self._layout.prefix = self._template
self._label.margin.top = 0
|
/**
* Perform an ask query against the global data set
*
* @param query SPARQL query string
* @return true if a solution exists
*/
public boolean doQueryAsk(String query)
{
boolean ret;
dataset.begin(ReadWrite.READ);
try (QueryExecution qExec = QueryExecutionFactory.create(query, dataset.getNamedModel("urn:x-arq:UnionGraph")))
{
    ret = qExec.execAsk();
}
finally
{
    // Always end the read transaction, even if query execution throws
    dataset.end();
}
return ret;
} |
/*
* Copyright 2016 The Bazel Authors. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.idea.blaze.base.model.blaze.deepequalstester;
import com.google.common.collect.Lists;
import com.google.common.collect.Maps;
import com.google.common.collect.Sets;
import java.io.File;
import java.io.Serializable;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import org.junit.Assert;
final class ReachabilityAnalysis {
  /** Wrapper around a map from class to the set of paths that lead to that class from the root object */
public static final class ReachableClasses {
Map<Class<? extends Serializable>, Set<List<String>>> map;
public ReachableClasses() {
map = Maps.newHashMap();
}
boolean alreadyFound(Class<? extends Serializable> clazz) {
return map.containsKey(clazz);
}
void addPath(Class<? extends Serializable> clazz, List<String> path) {
Set<List<String>> paths;
if (map.containsKey(clazz)) {
paths = map.get(clazz);
} else {
paths = Sets.newHashSet();
map.put(clazz, paths);
}
paths.add(path);
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
Set<? extends Entry<Class<? extends Serializable>, ? extends Set<? extends List<String>>>>
entries = map.entrySet();
for (Entry<Class<? extends Serializable>, ? extends Set<? extends List<String>>> entry :
entries) {
sb.append(entry.getKey().toString());
sb.append("\n");
}
return sb.toString();
}
public Set<Class<? extends Serializable>> getClasses() {
return map.keySet();
}
public List<String> getExamplePathTo(Class<? extends Serializable> aClass) {
if (map.containsKey(aClass)) {
return map.get(aClass).iterator().next();
}
return Lists.newArrayList();
}
}
/**
* Find all of the classes reachable from a root object
*
* @param root object to start reachability calculation from
* @param declaredRootClass declared class of the root object
* @param currentPath field access path to get to root
* @param reachableClasses output: add classes reachable from root to this object
* @throws IllegalAccessException
* @throws ClassNotFoundException
*/
public static void computeReachableFromObject(
Object root,
Class<?> declaredRootClass,
List<String> currentPath,
ReachableClasses reachableClasses)
throws IllegalAccessException, ClassNotFoundException {
final Class<?> concreteRootClass = DeepEqualsTesterUtil.getClass(declaredRootClass, root);
List<Field> allFields = DeepEqualsTesterUtil.getAllFields(concreteRootClass);
for (Field field : allFields) {
if (!Modifier.isStatic(field.getModifiers())) {
field.setAccessible(true);
final Object fieldObject;
if (root == null) {
fieldObject = null;
} else {
fieldObject = field.get(root);
}
List<String> childPath = Lists.newArrayList();
childPath.addAll(currentPath);
childPath.add(field.toString());
addToReachableAndRecurse(
fieldObject, field.getType(), field.getGenericType(), childPath, reachableClasses);
}
}
}
/**
* Determine the action we should take based on the type of Object and then take it. In the normal
* object case, this results in a recursive call to {@link #computeReachableFromObject(Object,
* Class, List, ReachableClasses)}. In the case of Collections, we skip the Collection type and
* continue on with the type contained in the collection.
*/
private static void addToReachableAndRecurse(
Object object,
Class<?> declaredObjectClass,
Type genericType,
List<String> currentPath,
ReachableClasses reachableClasses)
throws ClassNotFoundException, IllegalAccessException {
Class<? extends Serializable> objectType =
DeepEqualsTesterUtil.getClass(declaredObjectClass, object);
// TODO(salguarnieri) modify if so all ignored classes are taken care of together
if (objectType.isPrimitive()) {
// ignore
} else if (objectType.isEnum()) {
// assume enums do the right thing, ignore
} else if (DeepEqualsTesterUtil.isSubclassOf(objectType, String.class)) {
// ignore
} else if (DeepEqualsTesterUtil.isSubclassOf(objectType, File.class)) {
// ignore
} else if (DeepEqualsTesterUtil.isSubclassOf(objectType, Collection.class)
|| DeepEqualsTesterUtil.isSubclassOf(objectType, Map.class)) {
if (genericType instanceof ParameterizedType) {
ParameterizedType parameterType = (ParameterizedType) genericType;
Type[] actualTypeArguments = parameterType.getActualTypeArguments();
for (Type typeArgument : actualTypeArguments) {
if (typeArgument instanceof Class) {
List<String> childPath = Lists.newArrayList();
childPath.addAll(currentPath);
childPath.add("[[IN COLLECTION]]");
// this does not properly handle subtyping
addToReachableAndRecurse(null, (Class) typeArgument, null, childPath, reachableClasses);
} else {
Assert.fail("This case is not handled yet");
}
}
} else {
Assert.fail("This case is not handled yet");
}
} else if (objectType.isArray()) {
Class<?> typeInArray = objectType.getComponentType();
// This does not properly handle subtyping
List<String> childPath = Lists.newArrayList();
childPath.addAll(currentPath);
childPath.add("[[IN ARRAY]]");
addToReachableAndRecurse(null, typeInArray, null, childPath, reachableClasses);
} else {
boolean doRecursion = !reachableClasses.alreadyFound(objectType);
reachableClasses.addPath(objectType, currentPath);
if (doRecursion) {
computeReachableFromObject(object, declaredObjectClass, currentPath, reachableClasses);
}
}
}
}
|
"The Force is strong in my family." ―Luke Skywalker [src]
The Skywalker family was a Force-sensitive Human bloodline whose first known member was Shmi Skywalker.[1] The clan contributed members to both the Old[1] and New Jedi Orders,[2] as well as the ranks of Sith Lords.[3][4] Through several generations, the Skywalker family remained extraordinarily prominent in galactic affairs, having a significant impact on major historical events for over 150 years.
History
"Be a Skywalker?! That's turned out real well for our family, hasn't it? Killed my father – and look what it did to yours!" ―Cade Skywalker to Luke Skywalker's Force ghost [src]
Shmi Skywalker's son, Anakin Skywalker, was conceived by the will of the Force[5] and raised on Tatooine.[1]
As a Jedi Padawan, Anakin married Padmé Amidala in 22 BBY.[6] Three years later, Amidala gave birth to twins, Luke and Leia, passing away soon after. Due to Anakin's fall to the dark side of the Force and transformation to Darth Vader, Luke was raised on Tatooine by his uncle and aunt, Owen and Beru Lars (Anakin's step-brother and step-sister-in-law), while Leia was raised by Bail and Breha Organa on Alderaan.[3]
Leia married Han Solo in 8 ABY[7] and had three children: twins Jaina and Jacen in 9 ABY,[8] and a younger son Anakin, who was named after his grandfather, in 10.5 ABY.[9]
In 19 ABY, Luke married Mara Jade.[10] Their son, Ben, who was named after Luke's first mentor Obi-Wan "Ben" Kenobi, was born in 26.5 ABY.[11] Ben would later become the ancestor of brothers Nat and Kol Skywalker, as well as Kol's son Cade. Cade lived during 130 ABY and was the last surviving Skywalker of his time,[12] his uncle having abandoned the famous name and living as "Bantha" Rawk. Nat married the Kiffar Jedi Droo Rawk and adopted her biological daughter, Ahnah, and two other children, Skeeto and Micah.
The droids C-3PO and R2-D2 were the creation and the property of Anakin and Padmé, respectively.[1][6] The droids would eventually fall into the hands of their children: R2-D2 came into the possession of Luke Skywalker,[13] whereas C-3PO mostly remained with Leia Organa Solo and her family.[14]
The family was unaware that they were related to the Naberrie family of Naboo until 36 ABY, when Luke discovered a group of holograms within R2-D2 dealing with the final days of his mother's life.[15] Soon after, Leia traveled to Naboo to meet with Pooja Naberrie and revealed their relationship.[16]
Family tree
The dark side
"She has the Skywalker anger…like her brother…like her father." ―Palpatine, in reference to Leia Organa Solo [src]
Since the miraculous conception of Anakin Skywalker, the majority of the Skywalker family fell to or struggled with the dark side. Anakin, whose very existence might have been influenced by the Sith, became Darth Vader, one of the most feared Sith Lords in history.[3] His son Luke briefly took his father's place as the Emperor's apprentice as part of a ruse.[19] Luke's sister Leia had three children. Her youngest son Anakin died at an early age,[20] sending his sister Jaina temporarily over to the dark side.[21] Leia's oldest son, Jacen, became Darth Caedus; unlike the other Skywalkers, Caedus did not want to be redeemed, having been led to believe that he could be a Sith and not be evil, only to end up doing "wrong things for the right reasons." Luke married the former Emperor's Hand Mara Jade, and their son Ben was influenced by his cousin Jacen to commit an assassination and other questionable acts.[4] Even 130 years after the time of Luke Skywalker, his descendant, Cade Skywalker, was still dealing with his family's attraction to the dark side. Cade became an apprentice to Darth Talon in service to Darth Krayt, the new Dark Lord of the Sith. However, he abandoned this position and escaped Krayt's grasp with the help of his friends.[22]
Perhaps the most fascinating thing about the Skywalker family was not how easily they tended to fall to the dark side, but how they eventually returned to the light. All known members of the Skywalker clan who fell to the dark side returned to the light and redeemed themselves. Darth Caedus returned to the light a moment before his death to save his lover and daughter. In the long history of the Jedi, this was known as a very uncommon feat. Usually their redemption relied on the support and determination of their family members and close friends. The belief in redemption, originating with Padmé Amidala, and continued on by Luke Skywalker, was a powerful weapon the Skywalkers used to battle the dark side from within and without.
Behind the scenes
Several roads and locations in the real world were influenced by Star Wars culture and named after the famous Skywalkers, most notably the Skywalker Drives.
In the canceled novel Alien Exodus, the main character Cosmo Hender gains the ability to tap into the Force through a giant crystal, giving him telekinetic powers and the ability to levitate, and causing the other slaves to nickname him Skywalker. He would thus have been the founder of the Skywalker family.
Appearances
Sources
Notes and references |
package de.tu_dresden.inf.lat.evee.protege.letheProofExtractorBasedProofService;
import de.tu_dresden.inf.lat.evee.protege.abstractProofService.preferences.AbstractEveeProofPreferencesManager;
public class EveeLetheBasedExtractorProofPreferencesManager extends AbstractEveeProofPreferencesManager {
private static final String SET_ID = "EVEE_PROOF_LETHE_DETAILED";
private static final String PREFERENCE_ID = "EVEE_PREFERENCES_MANAGER_LETHE_DETAILED";
protected static final String DETAILED = "Detailed Proof (LETHE)";
public EveeLetheBasedExtractorProofPreferencesManager(String identifier) {
super(SET_ID, PREFERENCE_ID, identifier);
}
public EveeLetheBasedExtractorProofPreferencesManager() {
super(SET_ID, PREFERENCE_ID, AbstractEveeProofPreferencesManager.PREFERENCE_UI);
}
@Override
protected void createIdentifierSet() {
this.proofServiceNameSet.add(DETAILED);
}
}
|