TRFS - trivial remote file system; a quick and dirty product of [email protected] :)
The port is 2052 - near NFS, but otherwise unused.
It is the simplest thing to page across a network. It runs on UDP or TCP. Any packet
is retransmitted if no reply is received. A newer request replaces an older
one even if no reply has been received.
SessionId - 64 bit, net order; generated by the server on startup and
verified on each rq/reply. It is used to detect that the server was restarted
and that disk ids are possibly invalid - the client then has to renegotiate by
doing a FindRQ.
FileId - 32 bit, net order; works just like a unix file descriptor.
IoId - 32 bit, net order; an io request id to distinguish two subsequent
io operations on the same block(s).
NSectors - 32 bit, net order; the number of 512 byte sectors to transfer.
StartSector - 64 bit, net order; the number of the 512 byte sector to start io from.
NOP
------
SessionId
Error
------
SessionId
ErrorId - int32
ErrorTextLen - int32, number of bytes in ErrorText
ErrorText - the error message text, ErrorTextLen bytes
FindRQ
------
SessionId - ignored!
fileNameLen - int32
file name
FindReply
------
SessionId
FileId
ReadRQ
------
SessionId
NRequests - the following 4 fields are repeated this number of times
FileId
IoId
NSectors
StartSector
....
ReadReply
------
SessionId
FileId
IoId
NSectors
StartSector
data (512*NSectors bytes)
WriteRQ
------
SessionId
FileId
IoId
NSectors
StartSector
data (512*NSectors bytes)
WriteAck
------
SessionId
FileId
IoId - same as in the RQ being replied to
NSectors
StartSector
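As a rough illustration (not part of the spec), the fixed-size header of a single-request ReadRQ can be packed as follows. The field layout is taken from the definitions above; any message-type tag or other framing is omitted because the spec text does not define it.

```javascript
// Illustrative only: pack a ReadRQ carrying one request, all fields
// big-endian ("net order") per the field definitions above.
function packReadRq(sessionId, fileId, ioId, nSectors, startSector) {
  const buf = Buffer.alloc(32);
  buf.writeBigUInt64BE(sessionId, 0);    // SessionId - 64 bit
  buf.writeUInt32BE(1, 8);               // NRequests - one request follows
  buf.writeUInt32BE(fileId, 12);         // FileId
  buf.writeUInt32BE(ioId, 16);           // IoId
  buf.writeUInt32BE(nSectors, 20);       // NSectors
  buf.writeBigUInt64BE(startSector, 24); // StartSector
  return buf;
}

const pkt = packReadRq(0xdeadbeefn, 7, 42, 8, 1024n);
console.log(pkt.length);           // 32
console.log(pkt.readUInt32BE(20)); // 8
```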
Amanda Forums are located at: http://forums.zmanda.com/
Amanda Documentation is available at: http://wiki.zmanda.com/
info:
x-apisguru-categories:
- entertainment
contact:
x-twitter: NPR
x-logo:
url: 'https://twitter.com/NPR/profile_image?size=original'
== Introduction ==
Hardware modules that control pin multiplexing or configuration parameters
such as pull-up/down, tri-state, drive-strength etc. are designated as pin
controllers. Each pin controller must be represented as a node in device tree,
just like any other hardware module.
Hardware modules whose signals are affected by pin configuration are
designated client devices. Again, each client device must be represented as a
node in device tree, just like any other hardware module.
For a client device to operate correctly, certain pin controllers must
set up certain specific pin configurations. Some client devices need a
single static pin configuration, e.g. set up during initialization. Others
need to reconfigure pins at run-time, for example to tri-state pins when the
device is inactive. Hence, each client device can define a set of named
states. The number and names of those states is defined by the client device's
own binding.
The common pinctrl bindings defined in this file provide an infrastructure
for the device tree nodes of client devices to map those state names to the
pin configuration used by those states.
Note that a pin controller may also be a client device of itself.
For example, a pin controller may set up its own "active" state when the
driver loads. This would allow representing a board's static pin configuration
in a single place, rather than splitting it across multiple client device
nodes. The decision to do this or not somewhat rests with the author of
individual board device tree files, and any requirements imposed by the
bindings for the individual client devices in use by that board, i.e. whether
they require certain specific named states for dynamic pin configuration.
== Pinctrl client devices ==
For each client device individually, every pin state is assigned an integer
ID. These numbers start at 0, and are contiguous. For each state ID, a unique
property exists to define the pin configuration. Each state may also be
assigned a name. When names are used, another property exists to map from
those names to the integer IDs.
Each client device's own binding determines the set of states that must be
defined in its device tree node, and whether to define the set of state
IDs that must be provided, or whether to define the set of state names that
must be provided.
Required properties:
pinctrl-0: List of phandles, each pointing at a pin configuration
node. These referenced pin configuration nodes must be child
nodes of the pin controller that they configure. Multiple
entries may exist in this list so that multiple pin
controllers may be configured, or so that a state may be built
from multiple nodes for a single pin controller, each
contributing part of the overall configuration. See the next
section of this document for details of the format of these
pin configuration nodes.
In some cases, it may be useful to define a state, but for it
to be empty. This may be required when a common IP block is
used in an SoC either without a pin controller, or where the
pin controller does not affect the HW module in question. If
the binding for that IP block requires certain pin states to
exist, they must still be defined, but may be left empty.
Optional properties:
pinctrl-1: List of phandles, each pointing at a pin configuration
node within a pin controller.
...
pinctrl-n: List of phandles, each pointing at a pin configuration
node within a pin controller.
pinctrl-names: The list of names to assign states. List entry 0 defines the
name for integer state ID 0, list entry 1 for state ID 1, and
so on.
For example:
/* For a client device requiring named states */
device {
pinctrl-names = "active", "idle";
pinctrl-0 = <&state_0_node_a>;
pinctrl-1 = <&state_1_node_a &state_1_node_b>;
};
/* For the same device if using state IDs */
device {
pinctrl-0 = <&state_0_node_a>;
pinctrl-1 = <&state_1_node_a &state_1_node_b>;
};
/*
* For an IP block whose binding supports pin configuration,
* but in use on an SoC that doesn't have any pin control hardware
*/
device {
pinctrl-names = "active", "idle";
pinctrl-0 = <>;
pinctrl-1 = <>;
};
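As an aside (illustrative only, not kernel code), the name-to-ID rule described above can be sketched like this: entry i of `pinctrl-names` names integer state ID i, which in turn selects the property `pinctrl-<i>`.

```javascript
// Illustrative only: resolve a named pinctrl state to its phandle list,
// given a device node's properties as a plain object.
function lookupState(props, name) {
  const id = props['pinctrl-names'].indexOf(name); // entry i names state ID i
  if (id < 0) throw new Error(`no state named ${name}`);
  return props[`pinctrl-${id}`];                   // state ID i -> pinctrl-<i>
}

const node = {
  'pinctrl-names': ['active', 'idle'],
  'pinctrl-0': ['&state_0_node_a'],
  'pinctrl-1': ['&state_1_node_a', '&state_1_node_b'],
};
console.log(lookupState(node, 'idle')); // ['&state_1_node_a', '&state_1_node_b']
```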
== Pin controller devices ==
Pin controller devices should contain the pin configuration nodes that client
devices reference.
For example:
pincontroller {
... /* Standard DT properties for the device itself elided */
state_0_node_a {
...
};
state_1_node_a {
...
};
state_1_node_b {
...
};
};
The contents of each of those pin configuration child nodes is defined
entirely by the binding for the individual pin controller device. There
exists no common standard for this content.
The pin configuration nodes need not be direct children of the pin controller
device; they may be grandchildren, for example. Whether this is legal, and
whether there is any interaction between the child and intermediate parent
nodes, is again defined entirely by the binding for the individual pin
controller device.
== Generic pin multiplexing node content ==
pin multiplexing nodes:
function - the mux function to select
groups - the list of groups to select with this function
(either this or "pins" must be specified)
pins - the list of pins to select with this function (either
this or "groups" must be specified)
Example:
state_0_node_a {
uart0 {
function = "uart0";
groups = "u0rxtx", "u0rtscts";
};
};
state_1_node_a {
spi0 {
function = "spi0";
groups = "spi0pins";
};
};
state_2_node_a {
function = "i2c0";
pins = "mfio29", "mfio30";
};
== Generic pin configuration node content ==
Many data items that are represented in a pin configuration node are common
and generic. Pin control bindings should use the properties defined below
where they are applicable; not all of these properties are relevant or useful
for all hardware or binding structures. Each individual binding document
should state which of these generic properties, if any, are used, and the
structure of the DT nodes that contain these properties.
Supported generic properties are:
pins - the list of pins that properties in the node
apply to (either this or "group" has to be
specified)
group - the group to apply the properties to, if the driver
supports configuration of whole groups rather than
individual pins (either this or "pins" has to be
specified)
bias-disable - disable any pin bias
bias-high-impedance - high impedance mode ("tri-state", "floating")
bias-bus-hold - latch weakly
bias-pull-up - pull up the pin
bias-pull-down - pull down the pin
bias-pull-pin-default - use pin-default pull state
drive-push-pull - drive actively high and low
drive-open-drain - drive with open drain
drive-open-source - drive with open source
drive-strength - sink or source at most X mA
input-enable - enable input on pin (no effect on output)
input-disable - disable input on pin (no effect on output)
input-schmitt-enable - enable schmitt-trigger mode
input-schmitt-disable - disable schmitt-trigger mode
input-debounce - debounce mode with debounce time X
power-source - select between different power supplies
low-power-enable - enable low power mode
low-power-disable - disable low power mode
output-low - set the pin to output mode with low level
output-high - set the pin to output mode with high level
slew-rate - set the slew rate
For example:
state_0_node_a {
cts_rxd {
pins = "GPIO0_AJ5", "GPIO2_AH4"; /* CTS+RXD */
bias-pull-up;
};
};
state_1_node_a {
rts_txd {
pins = "GPIO1_AJ3", "GPIO3_AH3"; /* RTS+TXD */
output-high;
};
};
state_2_node_a {
foo {
group = "foo-group";
bias-pull-up;
};
};
Some of the generic properties take arguments. For those that do, the
arguments are described below.
- pins takes a list of pin names or IDs as a required argument. The specific
binding for the hardware defines:
- Whether the entries are integers or strings, and their meaning.
- bias-pull-up, -down and -pin-default take as optional argument on hardware
supporting it the pull strength in Ohm. bias-disable will disable the pull.
- drive-strength takes as argument the target strength in mA.
- input-debounce takes the debounce time in usec as argument
or 0 to disable debouncing
More in-depth documentation on these parameters can be found in
<include/linux/pinctrl/pinconf-generic.h>
#ifndef NESEMULATORDOCKWIDGET_H
#define NESEMULATORDOCKWIDGET_H
#include "nes_emulator_core.h"
#include "nesemulatorrenderer.h"
#include <QDockWidget>
#include <QKeyEvent>
namespace Ui {
class NESEmulatorDockWidget;
}
class NESEmulatorDockWidget : public QDockWidget
{
Q_OBJECT
public:
explicit NESEmulatorDockWidget(QWidget *parent = 0);
virtual ~NESEmulatorDockWidget();
void setLinearInterpolation(bool enabled) { renderer->setLinearInterpolation(enabled); }
void set43Aspect(bool enabled) { renderer->set43Aspect(enabled); }
void fixTitleBar();
protected:
void changeEvent(QEvent* e);
void mousePressEvent(QMouseEvent* event);
void mouseReleaseEvent(QMouseEvent* event);
void keyPressEvent(QKeyEvent* event);
void keyReleaseEvent(QKeyEvent* event);
signals:
void controllerInput(uint32_t* joy);
private:
Ui::NESEmulatorDockWidget *ui;
CNESEmulatorRenderer* renderer;
QWidget* fakeTitleBar;
QWidget* savedTitleBar;
char* imgData;
uint32_t m_joy [ NUM_CONTROLLERS ];
private slots:
void renderData();
};
#endif // NESEMULATORDOCKWIDGET_H
/* Copyright 2000 Kjetil S. Matheussen
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA. */
// TODO: Delete. Doesn't do anything anymore.
#include "nsmtracker.h"
#include "visual_proc.h"
#include "time_proc.h"
#include "playerclass.h"
#include <string.h>
#include "blocklist_proc.h"
#include "gfx_statusbar_proc.h"
extern struct Root *root;
extern PlayerClass *pc;
char firstringinstatusbar[512]={0};
void GFX_DrawStatusBar(
struct Tracker_Windows *window,
struct WBlocks *wblock
){
int blocklength=getBlockSTimeLength(wblock->block)/pc->pfreq;
int minutes=blocklength/60;
int seconds=blocklength%60;
sprintf(
wblock->title,
"%s%d*(%d~%d) - %d:%02d - %d/%d/%d - %d/%d:%.64s",
firstringinstatusbar,
wblock->block->num_tracks,
wblock->block->num_lines,
wblock->num_reallines,
minutes,
seconds,
BL_GetBlockFromPos(root->curr_playlist-1)==NULL?-1:BL_GetBlockFromPos(root->curr_playlist-1)->l.num,
BL_GetBlockFromPos(root->curr_playlist)==NULL?-1:BL_GetBlockFromPos(root->curr_playlist)->l.num,
BL_GetBlockFromPos(root->curr_playlist+1)==NULL?-1:BL_GetBlockFromPos(root->curr_playlist+1)->l.num,
wblock->l.num,
root->song->num_blocks-1,
wblock->block->name
);
//GFX_SetStatusBar(window,wblock->title);
}
char *GFX_GetChangeString(
struct Tracker_Windows *window,
struct WBlocks *wblock
){
return firstringinstatusbar;
}
void GFX_SetChangeInt(
struct Tracker_Windows *window,
struct WBlocks *wblock,
const char *title,
int Int
){
// Guard against a NULL title: passing NULL to %s is undefined behavior.
sprintf(firstringinstatusbar,"%s%s%d - ",title==NULL?"":title,title==NULL?"":": ",Int);
}
void GFX_SetChangeFloat(
struct Tracker_Windows *window,
struct WBlocks *wblock,
const char *title,
float Float
){
sprintf(
firstringinstatusbar,
"%s%s%.2f - ",
title==NULL?"":title,title==NULL?"":": ",
Float
);
}
/*
* Copyright 2012 Netflix, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.netflix.exhibitor.core.backup;
public class BackupMetaData
{
private final String name;
private final long modifiedDate;
public BackupMetaData(String name, long modifiedDate)
{
this.name = name;
this.modifiedDate = modifiedDate;
}
public String getName()
{
return name;
}
public long getModifiedDate()
{
return modifiedDate;
}
@SuppressWarnings("RedundantIfStatement")
@Override
public boolean equals(Object o)
{
if ( this == o )
{
return true;
}
if ( o == null || getClass() != o.getClass() )
{
return false;
}
BackupMetaData that = (BackupMetaData)o;
if ( modifiedDate != that.modifiedDate )
{
return false;
}
if ( !name.equals(that.name) )
{
return false;
}
return true;
}
@Override
public int hashCode()
{
int result = name.hashCode();
result = 31 * result + (int)(modifiedDate ^ (modifiedDate >>> 32));
return result;
}
@Override
public String toString()
{
return "BackupMetaData{" +
"name='" + name + '\'' +
", modifiedDate=" + modifiedDate +
'}';
}
}
// This code is distributed under MIT license.
// Copyright (c) 2015 George Mamaladze
// See license.txt or http://opensource.org/licenses/mit-license.php
using System.Windows.Forms;
namespace Gma.System.MouseKeyHook.Implementation
{
internal class GlobalEventFacade : IDisposableMouseEvents
{
private GlobalMouseListener m_MouseListenerCache;
public event MouseEventHandler MouseMove
{
add { GetMouseListener().MouseMove += value; }
remove { GetMouseListener().MouseMove -= value; }
}
public void Dispose()
{
m_MouseListenerCache?.Dispose();
}
private GlobalMouseListener GetMouseListener()
{
var target = m_MouseListenerCache;
if (target != null) return target;
target = new GlobalMouseListener();
m_MouseListenerCache = target;
return target;
}
}
}
module DbExportDataService
def run_db_export(path, format)
raise 'DbExportDataService#run_db_export is not implemented'
end
end
Visual Studio 14 2015
---------------------
Generates Visual Studio 14 (VS 2015) project files.
Project Types
^^^^^^^^^^^^^
Only Visual C++ and C# projects may be generated. Other types of
projects (JavaScript, Powershell, Python, etc.) are not supported.
Platform Selection
^^^^^^^^^^^^^^^^^^
The default target platform name (architecture) is ``Win32``.
The :variable:`CMAKE_GENERATOR_PLATFORM` variable may be set, perhaps
via the :manual:`cmake(1)` ``-A`` option, to specify a target platform
name (architecture). For example:
* ``cmake -G "Visual Studio 14 2015" -A Win32``
* ``cmake -G "Visual Studio 14 2015" -A x64``
* ``cmake -G "Visual Studio 14 2015" -A ARM``
For compatibility with CMake versions prior to 3.1, one may specify
a target platform name optionally at the end of the generator name.
This is supported only for:
``Visual Studio 14 2015 Win64``
Specify target platform ``x64``.
``Visual Studio 14 2015 ARM``
Specify target platform ``ARM``.
Toolset Selection
^^^^^^^^^^^^^^^^^
The ``v140`` toolset that comes with Visual Studio 14 2015 is selected by
default. The :variable:`CMAKE_GENERATOR_TOOLSET` option may be set, perhaps
via the :manual:`cmake(1)` ``-T`` option, to specify another toolset.
.. |VS_TOOLSET_HOST_ARCH_DEFAULT| replace::
By default this generator uses the 32-bit variant even on a 64-bit host.
.. include:: VS_TOOLSET_HOST_ARCH.txt
<?xml version="1.0"?>
<ZopeData>
<record id="1" aka="AAAAAAAAAAE=">
<pickle>
<global name="ProxyField" module="Products.ERP5Form.ProxyField"/>
</pickle>
<pickle>
<dictionary>
<item>
<key> <string>delegated_list</string> </key>
<value>
<list/>
</value>
</item>
<item>
<key> <string>id</string> </key>
<value> <string>my_source_payment</string> </value>
</item>
<item>
<key> <string>message_values</string> </key>
<value>
<dictionary>
<item>
<key> <string>external_validator_failed</string> </key>
<value> <string>The input failed the external validator.</string> </value>
</item>
</dictionary>
</value>
</item>
<item>
<key> <string>overrides</string> </key>
<value>
<dictionary>
<item>
<key> <string>field_id</string> </key>
<value> <string></string> </value>
</item>
<item>
<key> <string>form_id</string> </key>
<value> <string></string> </value>
</item>
<item>
<key> <string>target</string> </key>
<value> <string></string> </value>
</item>
</dictionary>
</value>
</item>
<item>
<key> <string>tales</string> </key>
<value>
<dictionary>
<item>
<key> <string>field_id</string> </key>
<value> <string></string> </value>
</item>
<item>
<key> <string>form_id</string> </key>
<value> <string></string> </value>
</item>
<item>
<key> <string>target</string> </key>
<value> <string></string> </value>
</item>
</dictionary>
</value>
</item>
<item>
<key> <string>values</string> </key>
<value>
<dictionary>
<item>
<key> <string>field_id</string> </key>
<value> <string>my_view_mode_source_payment</string> </value>
</item>
<item>
<key> <string>form_id</string> </key>
<value> <string>Base_viewTradeFieldLibrary</string> </value>
</item>
<item>
<key> <string>target</string> </key>
<value> <string>Click to edit the target</string> </value>
</item>
</dictionary>
</value>
</item>
</dictionary>
</pickle>
</record>
</ZopeData>
contract C {
function f() public {
assembly {
let d := 0x10
function asmfun(a, b, c) -> x, y, z {
x := a
y := b
z := 7
}
let a1, b1, c1 := asmfun(1, 2, 3)
mstore(0x00, a1)
mstore(0x20, b1)
mstore(0x40, c1)
mstore(0x60, d)
return (0, 0x80)
}
}
}
// ====
// compileViaYul: also
// ----
// f() -> 0x1, 0x2, 0x7, 0x10
import { ActionCreatorWithoutPayload, PayloadActionCreator } from '@reduxjs/toolkit';
export const mockToolkitActionCreator = (creator: PayloadActionCreator<any>) => {
return Object.assign(jest.fn(), creator);
};
export type ToolkitActionCreatorWithoutPayloadMockType = typeof mockToolkitActionCreatorWithoutPayload &
ActionCreatorWithoutPayload<any>;
export const mockToolkitActionCreatorWithoutPayload = (creator: ActionCreatorWithoutPayload<any>) => {
return Object.assign(jest.fn(), creator);
};
// Copyright 2017 the V8 project authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef V8_OBJECTS_SCRIPT_INL_H_
#define V8_OBJECTS_SCRIPT_INL_H_
#include "src/objects/script.h"
#include "src/objects/shared-function-info.h"
#include "src/objects/smi-inl.h"
#include "src/objects/string-inl.h"
// Has to be the last include (doesn't have include guards):
#include "src/objects/object-macros.h"
namespace v8 {
namespace internal {
OBJECT_CONSTRUCTORS_IMPL(Script, Struct)
NEVER_READ_ONLY_SPACE_IMPL(Script)
CAST_ACCESSOR(Script)
ACCESSORS(Script, source, Object, kSourceOffset)
ACCESSORS(Script, name, Object, kNameOffset)
SMI_ACCESSORS(Script, id, kIdOffset)
SMI_ACCESSORS(Script, line_offset, kLineOffsetOffset)
SMI_ACCESSORS(Script, column_offset, kColumnOffsetOffset)
ACCESSORS(Script, context_data, Object, kContextOffset)
SMI_ACCESSORS(Script, type, kScriptTypeOffset)
ACCESSORS(Script, line_ends, Object, kLineEndsOffset)
ACCESSORS_CHECKED(Script, eval_from_shared_or_wrapped_arguments, Object,
kEvalFromSharedOrWrappedArgumentsOffset,
this->type() != TYPE_WASM)
SMI_ACCESSORS_CHECKED(Script, eval_from_position, kEvalFromPositionOffset,
this->type() != TYPE_WASM)
SMI_ACCESSORS(Script, flags, kFlagsOffset)
ACCESSORS(Script, source_url, Object, kSourceUrlOffset)
ACCESSORS(Script, source_mapping_url, Object, kSourceMappingUrlOffset)
ACCESSORS(Script, host_defined_options, FixedArray, kHostDefinedOptionsOffset)
ACCESSORS_CHECKED(Script, wasm_breakpoint_infos, FixedArray,
kEvalFromSharedOrWrappedArgumentsOffset,
this->type() == TYPE_WASM)
ACCESSORS_CHECKED(Script, wasm_managed_native_module, Object,
kEvalFromPositionOffset, this->type() == TYPE_WASM)
ACCESSORS_CHECKED(Script, wasm_weak_instance_list, WeakArrayList,
kSharedFunctionInfosOffset, this->type() == TYPE_WASM)
bool Script::is_wrapped() const {
return eval_from_shared_or_wrapped_arguments().IsFixedArray();
}
bool Script::has_eval_from_shared() const {
return eval_from_shared_or_wrapped_arguments().IsSharedFunctionInfo();
}
void Script::set_eval_from_shared(SharedFunctionInfo shared,
WriteBarrierMode mode) {
DCHECK(!is_wrapped());
set_eval_from_shared_or_wrapped_arguments(shared, mode);
}
SharedFunctionInfo Script::eval_from_shared() const {
DCHECK(has_eval_from_shared());
return SharedFunctionInfo::cast(eval_from_shared_or_wrapped_arguments());
}
void Script::set_wrapped_arguments(FixedArray value, WriteBarrierMode mode) {
DCHECK(!has_eval_from_shared());
set_eval_from_shared_or_wrapped_arguments(value, mode);
}
FixedArray Script::wrapped_arguments() const {
DCHECK(is_wrapped());
return FixedArray::cast(eval_from_shared_or_wrapped_arguments());
}
DEF_GETTER(Script, shared_function_infos, WeakFixedArray) {
return type() == TYPE_WASM
? ReadOnlyRoots(GetHeap()).empty_weak_fixed_array()
: TaggedField<WeakFixedArray, kSharedFunctionInfosOffset>::load(
*this);
}
void Script::set_shared_function_infos(WeakFixedArray value,
WriteBarrierMode mode) {
DCHECK_NE(TYPE_WASM, type());
TaggedField<WeakFixedArray, kSharedFunctionInfosOffset>::store(*this, value);
CONDITIONAL_WRITE_BARRIER(*this, kSharedFunctionInfosOffset, value, mode);
}
bool Script::has_wasm_breakpoint_infos() const {
return type() == TYPE_WASM && wasm_breakpoint_infos().length() > 0;
}
wasm::NativeModule* Script::wasm_native_module() const {
return Managed<wasm::NativeModule>::cast(wasm_managed_native_module()).raw();
}
Script::CompilationType Script::compilation_type() {
return BooleanBit::get(flags(), kCompilationTypeBit) ? COMPILATION_TYPE_EVAL
: COMPILATION_TYPE_HOST;
}
void Script::set_compilation_type(CompilationType type) {
set_flags(BooleanBit::set(flags(), kCompilationTypeBit,
type == COMPILATION_TYPE_EVAL));
}
Script::CompilationState Script::compilation_state() {
return BooleanBit::get(flags(), kCompilationStateBit)
? COMPILATION_STATE_COMPILED
: COMPILATION_STATE_INITIAL;
}
void Script::set_compilation_state(CompilationState state) {
set_flags(BooleanBit::set(flags(), kCompilationStateBit,
state == COMPILATION_STATE_COMPILED));
}
bool Script::is_repl_mode() const {
return BooleanBit::get(flags(), kREPLModeBit);
}
void Script::set_is_repl_mode(bool value) {
set_flags(BooleanBit::set(flags(), kREPLModeBit, value));
}
ScriptOriginOptions Script::origin_options() {
return ScriptOriginOptions((flags() & kOriginOptionsMask) >>
kOriginOptionsShift);
}
void Script::set_origin_options(ScriptOriginOptions origin_options) {
DCHECK(!(origin_options.Flags() & ~((1 << kOriginOptionsSize) - 1)));
set_flags((flags() & ~kOriginOptionsMask) |
(origin_options.Flags() << kOriginOptionsShift));
}
bool Script::HasValidSource() {
Object src = this->source();
if (!src.IsString()) return true;
String src_str = String::cast(src);
if (!StringShape(src_str).IsExternal()) return true;
if (src_str.IsOneByteRepresentation()) {
return ExternalOneByteString::cast(src).resource() != nullptr;
} else if (src_str.IsTwoByteRepresentation()) {
return ExternalTwoByteString::cast(src).resource() != nullptr;
}
return true;
}
} // namespace internal
} // namespace v8
#include "src/objects/object-macros-undef.h"
#endif // V8_OBJECTS_SCRIPT_INL_H_
/*
* Copyright (C) 2015 The Team Win Recovery Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#ifndef _TWATOMIC_HPP_HEADER
#define _TWATOMIC_HPP_HEADER
#include <pthread.h>
class TWAtomicInt
{
public:
TWAtomicInt(int initial_value = 0);
~TWAtomicInt();
void set_value(int new_value);
int get_value();
private:
int value;
bool use_mutex;
pthread_mutex_t mutex_lock;
};
#endif //_TWATOMIC_HPP_HEADER
/// Copyright (c) 2012 Ecma International. All rights reserved.
/// Ecma International makes this code available under the terms and conditions set
/// forth on http://hg.ecmascript.org/tests/test262/raw-file/tip/LICENSE (the
/// "Use Terms"). Any redistribution of this code must retain the above
/// copyright and this notice and otherwise comply with the Use Terms.
/**
* @path ch15/15.2/15.2.3/15.2.3.6/15.2.3.6-4-514.js
* @description ES5 Attributes - property ([[Get]] is a Function, [[Set]] is undefined, [[Enumerable]] is false, [[Configurable]] is true) is non-enumerable
*/
function testcase() {
var obj = {};
var getFunc = function () {
return 1001;
};
Object.defineProperty(obj, "prop", {
get: getFunc,
set: undefined,
enumerable: false,
configurable: true
});
var propertyDefineCorrect = obj.hasOwnProperty("prop");
var desc = Object.getOwnPropertyDescriptor(obj, "prop");
for (var p in obj) {
if (p === "prop") {
return false;
}
}
return propertyDefineCorrect && desc.enumerable === false;
}
runTestCase(testcase);
# AWS Beanstalk
> This tutorial shows how to deploy an application with an SQL database to [AWS Beanstalk](https://console.aws.amazon.com/elasticbeanstalk/home). It assumes that you already have an AWS account and have access to your console.
## Prerequisites
### Do not use SQLite
SQLite databases are not supported by AWS Beanstalk. You have to use a different one such as Postgres, MySQL, MariaDB, Oracle or MSSQL.
Make sure that the SQLite driver is also uninstalled.
```
npm uninstall sqlite3 connect-sqlite3
```
### Configure the Database Credentials
Replace your `ormconfig.js` (or `ormconfig.yml` or `ormconfig.json`) file with this one:
*ormconfig.js*
```js
const { Config } = require('@foal/core');
module.exports = {
type: Config.get('database.type'),
url: Config.get('database.url'),
database: process.env.RDS_DB_NAME || Config.get('database.name'),
port: process.env.RDS_PORT || Config.get('database.port'),
host: process.env.RDS_HOSTNAME || Config.get('database.host'),
username: process.env.RDS_USERNAME || Config.get('database.username'),
password: process.env.RDS_PASSWORD || Config.get('database.password'),
entities: ["build/app/**/*.entity.js"],
migrations: ["build/migrations/*.js"],
cli: {
"migrationsDir": "src/migrations"
},
synchronize: Config.get('database.synchronize')
};
```
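The point of this file is a precedence rule: the `RDS_*` environment variables that Beanstalk injects win over the values in your local config file. A minimal sketch of that rule (illustrative, not FoalTS code):

```javascript
// Illustrative fallback rule: prefer the Beanstalk-provided RDS_*
// variable; fall back to the local config value when it is unset.
function resolve(envValue, configValue) {
  return envValue || configValue;
}

// On AWS, RDS_DB_NAME is set and wins:
console.log(resolve('ebdb', 'my-db'));    // 'ebdb'
// Locally, the env variable is undefined, so the config value wins:
console.log(resolve(undefined, 'my-db')); // 'my-db'
```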
And complete your configuration file `config/default.json` (or `config/default.yml`) with your local database credentials:
> The below credentials are an example. If you want to use them, you need to install PostgreSQL on your local host, create a database named `my-db` and install the postgres driver in your project (`npm install pg`). But you are free to use another database with other credentials if you want to.
```json
{
"settings": {
// ...
},
"database": {
"type": "postgres",
"name": "my-db",
"port": 5432,
"username": "postgres",
"synchronize": true
}
}
```
### Sessions
#### Case 1: The application does not use sessions
If you do not use sessions, then remove the store import and the store option from the `createApp` function in the `src/index.ts` file.
```typescript
import 'source-map-support/register';
// std
import * as http from 'http';
// 3p
import { Config, createApp } from '@foal/core';
// The store import is removed.
import { createConnection } from 'typeorm';
// App
import { AppController } from './app/app.controller';
async function main() {
await createConnection();
// The store option is removed.
const app = createApp(AppController);
const httpServer = http.createServer(app);
const port = Config.get('port', 3001);
httpServer.listen(port, () => {
console.log(`Listening on port ${port}...`);
});
}
main();
```
#### Case 2: The application uses sessions
If your application uses sessions, you need to provide a [session store](https://github.com/expressjs/session#compatible-session-stores).
Here is an example with [connect-redis](https://www.npmjs.com/package/connect-redis):
```typescript
import 'source-map-support/register';
// std
import * as http from 'http';
// 3p
import { Config, createApp } from '@foal/core';
import * as redisStoreFactory from 'connect-redis';
import { createConnection } from 'typeorm';
// App
import { AppController } from './app/app.controller';
async function main() {
await createConnection();
const app = createApp(AppController, {
store: session => new (redisStoreFactory(session))(/* options */)
});
const httpServer = http.createServer(app);
const port = Config.get('port', 3001);
httpServer.listen(port, () => {
console.log(`Listening on port ${port}...`);
});
}
main();
```
> This guide does not explain how to set up a redis database on AWS Beanstalk.
## Create the AWS Application and Add a Database
Go to https://console.aws.amazon.com/elasticbeanstalk/home and click on *Get Started*.

Enter the name of your application, choose the *Node.js* platform and select the *Sample Application*.

AWS creates and loads the new application. **This takes a few minutes**. Then check that the application *health* is ok and open the application.
> If the health is incorrect, click on the *Causes* button to see what happened.

The home page should look like this:

Now it is time to configure your environment and add a database. Click on the *Configuration* button and set the environment variables `NODE_ENV` and `DATABASE_SYNCHRONIZE`.
> The `NODE_ENV` variable tells FoalTS to look at the production configuration (for example `config/production.json`).
>
> The `DATABASE_SYNCHRONIZE` variable tells TypeORM not to update the database schema on every application launch (see section [Generate & Run the Database Migrations](#Generate-&-Run-the-Database-Migrations) below).


Then create a new database from the configuration page.

Choose the database engine (postgres in this example) and enter the production database credentials.

## Deploy the Foal Application
Build the app.
```
npm run build:app
```
Create an archive from the directories and files `build/`, `config/`, `public/`, `ormconfig.json`, `package-lock.json` and `package.json`.

Upload the archive to AWS.


The application restarts. This may take a few minutes.
## Generate & Run the Database Migrations
**Warning: this section is only compatible with projects created with FoalTS v0.8. If you need a tutorial for v1 and above, feel free to open a GitHub issue.**
Migrations are SQL queries that modify the database schemas (definition of the tables, relations, etc). By default, every new Foal project is created with the option `synchronize: true` in its `ormconfig`. This setting updates the database schema on every launch of the application.
But using this in production is considered unsafe (data could be lost for example if a model is changed by mistake). That's why we will generate and run migrations manually. To do this, we will need access to the database.
> **Warning** This section assumes that you have previously set the environment variable `DATABASE_SYNCHRONIZE` to `false`. This overrides the `synchronize` setting on AWS.
Go to [AWS database page](https://console.aws.amazon.com/rds/home#databases:) and click on your database.

Save the endpoint URL and click on the VPC security group. We will tell AWS to allow access to the database from our local machine.

Add a new *inbound* rule. **Make sure you don't delete the one that already exists.**


You are now able to communicate with the production database from your local machine (as long as you provide the correct credentials).
> The next part of the tutorial assumes that you did not change the default option `synchronize: true` in the `ormconfig` file. This is probably the case if you have never had to deal with migrations before.
Open **a new terminal/console**.
Enter the database credentials.
*On Mac and Linux*
```sh
export DATABASE_HOST=<the previous saved endpoint>
export DATABASE_USERNAME=<the database username> # in the tutorial, it is myusername
export DATABASE_PASSWORD=<the database password>
export DATABASE_NAME=ebdb
```
*On Windows*
```sh
set DATABASE_HOST=<the previous saved endpoint>
set DATABASE_USERNAME=<the database username> # in the tutorial, it is myusername
set DATABASE_PASSWORD=<the database password>
set DATABASE_NAME=ebdb
```
Generate the migration.
```sh
npm run migration:generate -- --name first-migration
```
A new migration file appears in `src/migrations/`. Check that it is correct and then build it.
```sh
npm run build:migrations
```
Then run the migration.
```sh
npm run migration:run
```
The database schema is updated. Your remote application should now run properly.
**Close your terminal / console**. Do not start your local application in the same terminal, otherwise it will run on your production database.
> **Caution:** running migrations is always a sensitive part of a deployment. Always back up your data beforehand.
// Returns a control point offset from `start`, used to bend the straight
// line between `start` and `end` by `deltaAngle` radians.
function getOffsetPoint(start, end, deltaAngle) {
let distance = getDistance(start, end) / 4;
let angle, dX, dY;
let mp = [start[0], start[1]];
deltaAngle = deltaAngle == null ? -0.2 : deltaAngle;
if (start[0] != end[0] && start[1] != end[1]) {
let k = (end[1] - start[1]) / (end[0] - start[0]);
angle = Math.atan(k);
} else if (start[0] == end[0]) {
angle = (start[1] <= end[1] ? 1 : -1) * Math.PI / 2;
} else {
angle = 0;
}
if (start[0] <= end[0]) {
angle -= deltaAngle;
dX = Math.round(Math.cos(angle) * distance);
dY = Math.round(Math.sin(angle) * distance);
mp[0] += dX;
mp[1] += dY;
} else {
angle += deltaAngle;
dX = Math.round(Math.cos(angle) * distance);
dY = Math.round(Math.sin(angle) * distance);
mp[0] -= dX;
mp[1] -= dY;
}
return mp;
}
// Resamples `points` into a denser polyline smoothed with Catmull-Rom
// interpolation. `isLoop` treats the polyline as closed.
function smoothSpline(points, isLoop) {
let len = points.length;
let ret = [];
let distance = 0;
for (let i = 1; i < len; i++) {
distance += getDistance(points[i - 1], points[i]);
}
let segs = distance / 2;
segs = segs < len ? len : segs;
for (let i = 0; i < segs; i++) {
let pos = i / (segs - 1) * (isLoop ? len : len - 1);
let idx = Math.floor(pos);
let w = pos - idx;
let p0;
let p1 = points[idx % len];
let p2;
let p3;
if (!isLoop) {
p0 = points[idx === 0 ? idx : idx - 1];
p2 = points[idx > len - 2 ? len - 1 : idx + 1];
p3 = points[idx > len - 3 ? len - 1 : idx + 2];
} else {
p0 = points[(idx - 1 + len) % len];
p2 = points[(idx + 1) % len];
p3 = points[(idx + 2) % len];
}
let w2 = w * w;
let w3 = w * w2;
ret.push([
interpolate(p0[0], p1[0], p2[0], p3[0], w, w2, w3),
interpolate(p0[1], p1[1], p2[1], p3[1], w, w2, w3)
]);
}
return ret;
}
// Catmull-Rom basis: interpolates between p1 and p2, using p0 and p3 as
// neighbors, with t in [0, 1] (t2 = t^2, t3 = t^3).
function interpolate(p0, p1, p2, p3, t, t2, t3) {
let v0 = (p2 - p0) * 0.5;
let v1 = (p3 - p1) * 0.5;
return (2 * (p1 - p2) + v0 + v1) * t3 + (-3 * (p1 - p2) - 2 * v0 - v1) * t2 + v0 * t + p1;
}
// Euclidean distance between two points.
function getDistance(p1, p2) {
return Math.sqrt(
(p1[0] - p2[0]) * (p1[0] - p2[0]) +
(p1[1] - p2[1]) * (p1[1] - p2[1])
);
}
// Returns `n` evenly spaced points along the straight segment from
// `fromPoint` to `endPoint` (the end point itself is excluded).
export function lineCurve(fromPoint, endPoint, n) {
let delLng = (endPoint[0] - fromPoint[0]) / n;
let delLat = (endPoint[1] - fromPoint[1]) / n;
let path = [];
for (let i = 0; i < n; i++) {
let pointNLng = fromPoint[0] + delLng * i;
let pointNLat = fromPoint[1] + delLat * i;
path.push([pointNLng, pointNLat]);
}
return path;
}
// Builds a smooth curved path from `start` to `end` by inserting two
// offset control points and running them through smoothSpline.
export function getPointList(start, end, deltaAngle) {
let points = [
[start[0], start[1]],
[end[0], end[1]]
];
let ex = points[1][0];
let ey = points[1][1];
points[3] = [ex, ey];
points[1] = getOffsetPoint(points[0], points[3], deltaAngle);
points[2] = getOffsetPoint(points[3], points[0], deltaAngle);
points = smoothSpline(points, false);
points[points.length - 1] = [ex, ey];
return points;
}
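A quick sanity check of the `interpolate` helper above: it is a Catmull-Rom basis, so at `t = 0` the curve must pass through `p1` and at `t = 1` through `p2`. The snippet below is a self-contained copy of the same formula (named `catmullRom` here for illustration), runnable on its own:

```javascript
// Self-contained copy of the Catmull-Rom basis used by smoothSpline.
function catmullRom(p0, p1, p2, p3, t) {
  const t2 = t * t;
  const t3 = t * t2;
  const v0 = (p2 - p0) * 0.5; // tangent at p1
  const v1 = (p3 - p1) * 0.5; // tangent at p2
  return (2 * (p1 - p2) + v0 + v1) * t3 +
         (-3 * (p1 - p2) - 2 * v0 - v1) * t2 +
         v0 * t + p1;
}

// The curve passes through p1 at t=0 and p2 at t=1.
console.assert(catmullRom(0, 1, 2, 3, 0) === 1);
console.assert(catmullRom(0, 1, 2, 3, 1) === 2);
// On equally spaced points the segment is linear: t=0.5 lands halfway.
console.assert(catmullRom(0, 1, 2, 3, 0.5) === 1.5);
```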
package com.clover_studio.spikachatmodule;
import android.annotation.TargetApi;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.PorterDuff;
import android.media.ExifInterface;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Build;
import android.os.Bundle;
import android.provider.MediaStore;
import android.view.View;
import android.widget.ImageView;
import android.widget.ProgressBar;
import com.clover_studio.spikachatmodule.api.UploadFileManagement;
import com.clover_studio.spikachatmodule.base.BaseActivity;
import com.clover_studio.spikachatmodule.base.SingletonLikeApp;
import com.clover_studio.spikachatmodule.dialogs.NotifyDialog;
import com.clover_studio.spikachatmodule.dialogs.UploadFileDialog;
import com.clover_studio.spikachatmodule.models.UploadFileResult;
import com.clover_studio.spikachatmodule.utils.Const;
import com.clover_studio.spikachatmodule.utils.LogCS;
import com.clover_studio.spikachatmodule.utils.Tools;
import com.google.gson.Gson;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
public class CameraPhotoPreviewActivity extends BaseActivity {
boolean mIsOverJellyBean;
String mOriginalPath;
String mScaledPath;
Bitmap previewBitmap;
ImageView previewImageView;
/**
* Starts the preview for an image picked from the gallery.
* @param context calling context (must be an Activity)
*/
public static void starCameraFromGalleryPhotoPreviewActivity(Context context){
Intent intent = new Intent(context, CameraPhotoPreviewActivity.class);
intent.putExtra(Const.Extras.TYPE_OF_PHOTO_INTENT, Const.PhotoIntents.GALLERY);
((Activity)context).startActivityForResult(intent, Const.RequestCode.PHOTO_CHOOSE);
}
/**
* Starts the preview for an image taken with the camera.
* @param context calling context (must be an Activity)
*/
public static void starCameraPhotoPreviewActivity(Context context){
Intent intent = new Intent(context, CameraPhotoPreviewActivity.class);
intent.putExtra(Const.Extras.TYPE_OF_PHOTO_INTENT, Const.PhotoIntents.CAMERA);
((Activity)context).startActivityForResult(intent, Const.RequestCode.PHOTO_CHOOSE);
}
@TargetApi(Build.VERSION_CODES.KITKAT)
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera_photo_preview);
setToolbar(R.id.tToolbar, R.layout.custom_camera_preview_toolbar);
setMenuLikeBack();
findViewById(R.id.cancelButton).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
onBackPressed();
}
});
findViewById(R.id.okButton).setOnClickListener(onOkClicked);
previewImageView = (ImageView) findViewById(R.id.ivCameraFullPhoto);
((ProgressBar)findViewById(R.id.progressBarLoading)).getIndeterminateDrawable().setColorFilter(Color.WHITE, PorterDuff.Mode.SRC_IN);
mIsOverJellyBean = Tools.isBuildOver(18);
if(getIntent().getExtras().getInt(Const.Extras.TYPE_OF_PHOTO_INTENT) == Const.PhotoIntents.GALLERY){
if (mIsOverJellyBean) {
Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT);
intent.addCategory(Intent.CATEGORY_OPENABLE);
intent.setType("image/*");
startActivityForResult(intent, Const.RequestCode.GALLERY);
} else {
Intent intent = new Intent();
intent.setType("image/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
this.startActivityForResult(intent, Const.RequestCode.GALLERY);
}
}else{
try {
startCamera();
} catch (Exception ex) {
ex.printStackTrace();
NotifyDialog.startInfo(this, getString(R.string.camera_error_title), getString(R.string.camera_error_camera_init));
}
}
}
private View.OnClickListener onOkClicked = new View.OnClickListener() {
@Override
public void onClick(View v) {
String sizeOfOriginalImage = "";
String sizeOfScaledImage = "";
File originalFile = new File(mOriginalPath);
if(originalFile.exists()){
sizeOfOriginalImage = Tools.readableFileSize(originalFile.length());
}
File scaledFile = new File(mScaledPath);
if(scaledFile.exists()){
sizeOfScaledImage = Tools.readableFileSize(scaledFile.length());
}
if(originalFile.length() < scaledFile.length()){
uploadFile(mOriginalPath);
}else{
NotifyDialog dialog = NotifyDialog.startConfirm(getActivity(), getString(R.string.compress_image_title), getString(R.string.compress_image_text));
dialog.setButtonsText(getString(R.string.NO_CAPITAL) + ", " + sizeOfOriginalImage, getString(R.string.YES_CAPITAL) + ", " + sizeOfScaledImage);
dialog.setTwoButtonListener(new NotifyDialog.TwoButtonDialogListener() {
@Override
public void onOkClicked(NotifyDialog dialog) {
dialog.dismiss();
uploadFile(mScaledPath);
}
@Override
public void onCancelClicked(NotifyDialog dialog) {
dialog.dismiss();
uploadFile(mOriginalPath);
}
});
dialog.setCancelable(true);
}
}
};
private void uploadFile(String path){
final UploadFileDialog dialog = UploadFileDialog.startDialog(getActivity());
UploadFileManagement tt = new UploadFileManagement();
tt.new BackgroundUploader(SingletonLikeApp.getInstance().getConfig(getActivity()).apiBaseUrl + Const.Api.UPLOAD_FILE, new File(path), Const.ContentTypes.IMAGE_JPG, new UploadFileManagement.OnUploadResponse() {
@Override
public void onStart() {
LogCS.d("LOG", "START UPLOADING");
}
@Override
public void onSetMax(final int max) {
getActivity().runOnUiThread(new Runnable() {
@Override
public void run() {
dialog.setMax(max);
}
});
}
@Override
public void onProgress(final int current) {
getActivity().runOnUiThread(new Runnable() {
@Override
public void run() {
dialog.setCurrent(current);
}
});
}
@Override
public void onFinishUpload() {
getActivity().runOnUiThread(new Runnable() {
@Override
public void run() {
dialog.fileUploaded();
}
});
}
@Override
public void onResponse(final boolean isSuccess, final String result) {
getActivity().runOnUiThread(new Runnable() {
@Override
public void run() {
dialog.dismiss();
if(!isSuccess){
onResponseFailed();
}else{
onResponseFinish(result);
}
}
});
}
}).execute();
}
private void onResponseFailed() {
NotifyDialog.startInfo(getActivity(), getString(R.string.error), getString(R.string.file_not_found));
}
protected void onResponseFinish(String result){
Gson gson = new Gson();
UploadFileResult data = null;
try {
data = gson.fromJson(result, UploadFileResult.class);
new File(mScaledPath).delete();
new File(mOriginalPath).delete();
} catch (Exception e) {
e.printStackTrace();
}
if(data != null){
sendMessage(data);
}
}
private void sendMessage(UploadFileResult data) {
Intent intentData = new Intent();
intentData.putExtra(Const.Extras.UPLOAD_MODEL, data);
setResult(RESULT_OK, intentData);
finish();
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(resultCode == RESULT_CANCELED){
setResult(RESULT_CANCELED);
finish();
return;
}
if(requestCode == Const.RequestCode.GALLERY){
if(resultCode == RESULT_OK){
if (mIsOverJellyBean) {
Uri uri = null;
if (data != null) {
uri = data.getData();
String selected_image_path = Tools.getImagePath(this, uri, mIsOverJellyBean);
onPhotoTaken(selected_image_path);
} else {
NotifyDialog.startInfo(this, getString(R.string.gallery_error_title), getString(R.string.gallery_error_text));
}
} else {
try {
Uri selected_image = data.getData();
String selected_image_path = Tools.getImagePath(this, selected_image, mIsOverJellyBean);
onPhotoTaken(selected_image_path);
} catch (Exception e) {
e.printStackTrace();
NotifyDialog.startInfo(this, getString(R.string.gallery_error_title), getString(R.string.gallery_error_text));
}
}
}
}else if(requestCode == Const.RequestCode.CAMERA){
if(resultCode == RESULT_OK){
File file = new File(mOriginalPath);
boolean exists = file.exists();
if (exists) {
onPhotoTaken(mOriginalPath);
} else {
NotifyDialog.startInfo(this, getString(R.string.camera_error_title), getString(R.string.camera_error_camera_image));
}
}
}
}
private void startCamera() {
// Check if camera exists
if (!getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)
&& !getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_FRONT)) {
NotifyDialog.startInfo(this, getString(R.string.camera_error_title), getString(R.string.camera_error_no_camera));
} else {
try {
String filename = Const.FilesName.CAMERA_TEMP_FILE_NAME;
mOriginalPath = Tools.getTempFolderPath() + "/" + filename;
File file = new File(mOriginalPath);
Uri outputFileUri = Uri.fromFile(file);
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
startActivityForResult(intent, Const.RequestCode.CAMERA);
} catch (Exception ex) {
ex.printStackTrace();
NotifyDialog.startInfo(this, getString(R.string.camera_error_title), getString(R.string.camera_error_camera_init));
}
}
}
/**
* Scales and shows the image.
* @param path path of the image
*/
protected void onPhotoTaken(String path) {
String fileName = Uri.parse(path).getLastPathSegment();
mOriginalPath = Tools.getTempFolderPath() + "/" + fileName;
mScaledPath = Tools.getTempFolderPath() + "/" + Const.FilesName.SCALED_PREFIX + fileName;
if (!path.equals(mOriginalPath)) {
try {
Tools.copyStream(new FileInputStream(new File(path)), new FileOutputStream(new File(mOriginalPath)));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
new AsyncTask<String, Void, byte[]>() {
@Override
protected byte[] doInBackground(String... params) {
try {
if (params == null) {
return null;
}
File f = new File(params[0]);
ExifInterface exif = new ExifInterface(f.getPath());
int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
int angle = 0;
if (orientation == ExifInterface.ORIENTATION_ROTATE_90) {
angle = 90;
} else if (orientation == ExifInterface.ORIENTATION_ROTATE_180) {
angle = 180;
} else if (orientation == ExifInterface.ORIENTATION_ROTATE_270) {
angle = 270;
}
BitmapFactory.Options optionsMeta = new BitmapFactory.Options();
optionsMeta.inJustDecodeBounds = true;
BitmapFactory.decodeFile(f.getAbsolutePath(), optionsMeta);
int actualHeight = optionsMeta.outHeight;
int actualWidth = optionsMeta.outWidth;
// these options allow Android to reclaim the bitmap memory if it runs low on memory
optionsMeta.inJustDecodeBounds = false;
optionsMeta.inPurgeable = true;
optionsMeta.inInputShareable = true;
optionsMeta.inTempStorage = new byte[16 * 1024];
float maxHeight = 1600.0f;
float maxWidth = 1600.0f;
optionsMeta.inSampleSize = Tools.calculateInSampleSize(optionsMeta, (int) maxWidth, (int) maxHeight);
// maximum height and width of the compressed image
float imgRatio = (float) actualWidth / (float) actualHeight;
float maxRatio = maxWidth / maxHeight;
if (actualHeight > maxHeight || actualWidth > maxWidth) {
if (imgRatio < maxRatio) {
imgRatio = maxHeight / actualHeight;
actualWidth = (int) (imgRatio * actualWidth);
actualHeight = (int) maxHeight;
} else if (imgRatio > maxRatio) {
imgRatio = maxWidth / actualWidth;
actualHeight = (int) (imgRatio * actualHeight);
actualWidth = (int) maxWidth;
} else {
actualHeight = (int) maxHeight;
actualWidth = (int) maxWidth;
}
}
Bitmap tempBitmap = BitmapFactory.decodeStream(new FileInputStream(f), null, optionsMeta);
previewBitmap = Bitmap.createBitmap(actualWidth, actualHeight, Bitmap.Config.ARGB_8888);
float ratioX = actualWidth / (float) optionsMeta.outWidth;
float ratioY = actualHeight / (float) optionsMeta.outHeight;
float middleX = actualWidth / 2.0f;
float middleY = actualHeight / 2.0f;
Matrix mat = new Matrix();
mat.postRotate(angle);
Matrix scaleMatrix = new Matrix();
scaleMatrix.setScale(ratioX, ratioY, middleX, middleY);
Canvas canvas = new Canvas(previewBitmap);
canvas.setMatrix(scaleMatrix);
canvas.drawBitmap(tempBitmap, middleX - tempBitmap.getWidth() / 2, middleY - tempBitmap.getHeight() / 2, null);
previewBitmap = Bitmap.createBitmap(previewBitmap, 0, 0, previewBitmap.getWidth(), previewBitmap.getHeight(), mat, true);
Tools.saveBitmapToFile(previewBitmap, mScaledPath);
return null;
} catch (Exception ex) {
ex.printStackTrace();
previewBitmap = null;
}
return null;
}
@Override
protected void onPostExecute(byte[] result) {
super.onPostExecute(result);
if (null != previewBitmap) {
previewImageView.setImageBitmap(previewBitmap);
} else {
NotifyDialog.startInfo(getActivity(), getString(R.string.gallery_error_title), getString(R.string.gallery_error_text));
}
findViewById(R.id.progressBarLoading).setVisibility(View.GONE);
}
}.execute(mOriginalPath);
}
}
{
"name": "iron-icons",
"version": "2.1.1",
"description": "A set of icons for use with iron-icon",
"authors": [
"The Polymer Authors"
],
"keywords": [
"web-components",
"polymer",
"icon"
],
"main": "iron-icons.html",
"private": true,
"repository": {
"type": "git",
"url": "git://github.com/PolymerElements/iron-icons"
},
"license": "http://polymer.github.io/LICENSE.txt",
"homepage": "https://github.com/PolymerElements/iron-icons",
"dependencies": {
"iron-icon": "PolymerElements/iron-icon#1 - 2",
"iron-iconset-svg": "PolymerElements/iron-iconset-svg#1 - 2",
"polymer": "Polymer/polymer#1.9 - 2"
},
"devDependencies": {
"paper-styles": "PolymerElements/paper-styles#1 - 2",
"iron-component-page": "polymerelements/iron-component-page#1 - 2",
"iron-meta": "PolymerElements/iron-meta#1 - 2",
"webcomponentsjs": "webcomponents/webcomponentsjs#^1.0.0",
"web-component-tester": "^6.0.0"
},
"resolutions": {
"webcomponentsjs": "^1.0.0"
},
"variants": {
"1.x": {
"dependencies": {
"iron-icon": "polymerelements/iron-icon#^1.0.0",
"iron-iconset-svg": "polymerelements/iron-iconset-svg#^1.0.0",
"polymer": "Polymer/polymer#^1.9"
},
"devDependencies": {
"paper-styles": "polymerelements/paper-styles#^1.0.2",
"iron-component-page": "polymerelements/iron-component-page#1.0.0",
"iron-meta": "polymerelements/iron-meta#^1.0.0",
"webcomponentsjs": "webcomponents/webcomponentsjs#^0.7.0",
"web-component-tester": "^4.0.0"
},
"resolutions": {
"webcomponentsjs": "^0.7"
}
},
"polymer-1.x_iron-meta-2.x": {
"dependencies": {
"iron-icon": "polymerelements/iron-icon#^1.0.0",
"iron-iconset-svg": "polymerelements/iron-iconset-svg#^1.0.0",
"polymer": "Polymer/polymer#^1.9"
},
"devDependencies": {
"paper-styles": "polymerelements/paper-styles#^1.0.2",
"iron-component-page": "polymerelements/iron-component-page#1.0.0",
"iron-meta": "polymerelements/iron-meta#^2.0.0",
"webcomponentsjs": "webcomponents/webcomponentsjs#^0.7.0",
"web-component-tester": "^4.0.0"
},
"resolutions": {
"webcomponentsjs": "^0.7",
"iron-meta": "^2.0.0"
}
}
},
"ignore": [
"util",
"update-icons.sh"
]
}
/*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except
* in compliance with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software distributed under the License
* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
* or implied. See the License for the specific language governing permissions and limitations under
* the License.
*/
/*
* This code was generated by https://github.com/googleapis/google-api-java-client-services/
* Modify at your own risk.
*/
package com.google.api.services.oslogin.v1.model;
/**
* A response message for importing an SSH public key.
*
* <p> This is the Java data model class that specifies how to parse/serialize into the JSON that is
* transmitted over HTTP when working with the Cloud OS Login API. For a detailed explanation see:
* <a href="https://developers.google.com/api-client-library/java/google-http-java-client/json">https://developers.google.com/api-client-library/java/google-http-java-client/json</a>
* </p>
*
* @author Google, Inc.
*/
@SuppressWarnings("javadoc")
public final class ImportSshPublicKeyResponse extends com.google.api.client.json.GenericJson {
/**
* The login profile information for the user.
* The value may be {@code null}.
*/
@com.google.api.client.util.Key
private LoginProfile loginProfile;
/**
* The login profile information for the user.
* @return value or {@code null} for none
*/
public LoginProfile getLoginProfile() {
return loginProfile;
}
/**
* The login profile information for the user.
* @param loginProfile loginProfile or {@code null} for none
*/
public ImportSshPublicKeyResponse setLoginProfile(LoginProfile loginProfile) {
this.loginProfile = loginProfile;
return this;
}
@Override
public ImportSshPublicKeyResponse set(String fieldName, Object value) {
return (ImportSshPublicKeyResponse) super.set(fieldName, value);
}
@Override
public ImportSshPublicKeyResponse clone() {
return (ImportSshPublicKeyResponse) super.clone();
}
}
/*
Copyright 2014 The Kubernetes Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
package runtime
import (
"fmt"
"net/url"
"reflect"
"strings"
"k8s.io/apimachinery/pkg/conversion"
"k8s.io/apimachinery/pkg/runtime/schema"
"k8s.io/apimachinery/pkg/util/naming"
utilruntime "k8s.io/apimachinery/pkg/util/runtime"
"k8s.io/apimachinery/pkg/util/sets"
)
// Scheme defines methods for serializing and deserializing API objects, a type
// registry for converting group, version, and kind information to and from Go
// schemas, and mappings between Go schemas of different versions. A scheme is the
// foundation for a versioned API and versioned configuration over time.
//
// In a Scheme, a Type is a particular Go struct, a Version is a point-in-time
// identifier for a particular representation of that Type (typically backwards
// compatible), a Kind is the unique name for that Type within the Version, and a
// Group identifies a set of Versions, Kinds, and Types that evolve over time. An
// Unversioned Type is one that is not yet formally bound to a type and is promised
// to be backwards compatible (effectively a "v1" of a Type that does not expect
// to break in the future).
//
// Schemes are not expected to change at runtime and are only threadsafe after
// registration is complete.
type Scheme struct {
// versionMap allows one to figure out the go type of an object with
// the given version and name.
gvkToType map[schema.GroupVersionKind]reflect.Type
// typeToGroupVersion allows one to find metadata for a given go object.
// The reflect.Type we index by should *not* be a pointer.
typeToGVK map[reflect.Type][]schema.GroupVersionKind
// unversionedTypes are transformed without conversion in ConvertToVersion.
unversionedTypes map[reflect.Type]schema.GroupVersionKind
// unversionedKinds are the names of kinds that can be created in the context of any group
// or version
// TODO: resolve the status of unversioned types.
unversionedKinds map[string]reflect.Type
// Map from version and resource to the corresponding func to convert
// resource field labels in that version to internal version.
fieldLabelConversionFuncs map[schema.GroupVersionKind]FieldLabelConversionFunc
// defaulterFuncs is an array of interfaces to be called with an object to provide defaulting
// the provided object must be a pointer.
defaulterFuncs map[reflect.Type]func(interface{})
// converter stores all registered conversion functions. It also has
// default converting behavior.
converter *conversion.Converter
// versionPriority is a map of groups to ordered lists of versions for those groups indicating the
// default priorities of these versions as registered in the scheme
versionPriority map[string][]string
// observedVersions keeps track of the order we've seen versions during type registration
observedVersions []schema.GroupVersion
// schemeName is the name of this scheme. If you don't specify a name, the stack of the NewScheme caller will be used.
// This is useful for error reporting to indicate the origin of the scheme.
schemeName string
}
// FieldLabelConversionFunc converts a field selector to internal representation.
type FieldLabelConversionFunc func(label, value string) (internalLabel, internalValue string, err error)
// NewScheme creates a new Scheme. This scheme is pluggable by default.
func NewScheme() *Scheme {
s := &Scheme{
gvkToType: map[schema.GroupVersionKind]reflect.Type{},
typeToGVK: map[reflect.Type][]schema.GroupVersionKind{},
unversionedTypes: map[reflect.Type]schema.GroupVersionKind{},
unversionedKinds: map[string]reflect.Type{},
fieldLabelConversionFuncs: map[schema.GroupVersionKind]FieldLabelConversionFunc{},
defaulterFuncs: map[reflect.Type]func(interface{}){},
versionPriority: map[string][]string{},
schemeName: naming.GetNameFromCallsite(internalPackages...),
}
s.converter = conversion.NewConverter(s.nameFunc)
utilruntime.Must(s.AddConversionFuncs(DefaultEmbeddedConversions()...))
// Enable map[string][]string conversions by default
utilruntime.Must(s.AddConversionFuncs(DefaultStringConversions...))
utilruntime.Must(s.RegisterInputDefaults(&map[string][]string{}, JSONKeyMapper, conversion.AllowDifferentFieldTypeNames|conversion.IgnoreMissingFields))
utilruntime.Must(s.RegisterInputDefaults(&url.Values{}, JSONKeyMapper, conversion.AllowDifferentFieldTypeNames|conversion.IgnoreMissingFields))
return s
}
// nameFunc returns the name of the type that we wish to use to determine when two types attempt
// a conversion. Defaults to the go name of the type if the type is not registered.
func (s *Scheme) nameFunc(t reflect.Type) string {
// find the preferred names for this type
gvks, ok := s.typeToGVK[t]
if !ok {
return t.Name()
}
for _, gvk := range gvks {
internalGV := gvk.GroupVersion()
internalGV.Version = APIVersionInternal // this is hacky and maybe should be passed in
internalGVK := internalGV.WithKind(gvk.Kind)
if internalType, exists := s.gvkToType[internalGVK]; exists {
return s.typeToGVK[internalType][0].Kind
}
}
return gvks[0].Kind
}
// fromScope gets the input version, desired output version, and desired Scheme
// from a conversion.Scope.
func (s *Scheme) fromScope(scope conversion.Scope) *Scheme {
return s
}
// Converter allows access to the converter for the scheme
func (s *Scheme) Converter() *conversion.Converter {
return s.converter
}
// AddUnversionedTypes registers the provided types as "unversioned", which means that they follow special rules.
// Whenever an object of this type is serialized, it is serialized with the provided group version and is not
// converted. Thus unversioned objects are expected to remain backwards compatible forever, as if they were in an
// API group and version that would never be updated.
//
// TODO: there is discussion about removing unversioned and replacing it with objects that are manifest into
// every version with particular schemas. Resolve this method at that point.
func (s *Scheme) AddUnversionedTypes(version schema.GroupVersion, types ...Object) {
s.addObservedVersion(version)
s.AddKnownTypes(version, types...)
for _, obj := range types {
t := reflect.TypeOf(obj).Elem()
gvk := version.WithKind(t.Name())
s.unversionedTypes[t] = gvk
if old, ok := s.unversionedKinds[gvk.Kind]; ok && t != old {
panic(fmt.Sprintf("%v.%v has already been registered as unversioned kind %q - kind name must be unique in scheme %q", old.PkgPath(), old.Name(), gvk, s.schemeName))
}
s.unversionedKinds[gvk.Kind] = t
}
}
// AddKnownTypes registers all types passed in 'types' as being members of version 'version'.
// All objects passed to types should be pointers to structs. The name that go reports for
// the struct becomes the "kind" field when encoding. Version may not be empty - use the
// APIVersionInternal constant if you have a type that does not have a formal version.
func (s *Scheme) AddKnownTypes(gv schema.GroupVersion, types ...Object) {
s.addObservedVersion(gv)
for _, obj := range types {
t := reflect.TypeOf(obj)
if t.Kind() != reflect.Ptr {
panic("All types must be pointers to structs.")
}
t = t.Elem()
s.AddKnownTypeWithName(gv.WithKind(t.Name()), obj)
}
}
// AddKnownTypeWithName is like AddKnownTypes, but it lets you specify what this type should
// be encoded as. Useful for testing when you don't want to make multiple packages to define
// your structs. Version may not be empty - use the APIVersionInternal constant if you have a
// type that does not have a formal version.
func (s *Scheme) AddKnownTypeWithName(gvk schema.GroupVersionKind, obj Object) {
s.addObservedVersion(gvk.GroupVersion())
t := reflect.TypeOf(obj)
if len(gvk.Version) == 0 {
panic(fmt.Sprintf("version is required on all types: %s %v", gvk, t))
}
if t.Kind() != reflect.Ptr {
panic("All types must be pointers to structs.")
}
t = t.Elem()
if t.Kind() != reflect.Struct {
panic("All types must be pointers to structs.")
}
if oldT, found := s.gvkToType[gvk]; found && oldT != t {
panic(fmt.Sprintf("Double registration of different types for %v: old=%v.%v, new=%v.%v in scheme %q", gvk, oldT.PkgPath(), oldT.Name(), t.PkgPath(), t.Name(), s.schemeName))
}
s.gvkToType[gvk] = t
for _, existingGvk := range s.typeToGVK[t] {
if existingGvk == gvk {
return
}
}
s.typeToGVK[t] = append(s.typeToGVK[t], gvk)
}
// KnownTypes returns the types known for the given version.
func (s *Scheme) KnownTypes(gv schema.GroupVersion) map[string]reflect.Type {
types := make(map[string]reflect.Type)
for gvk, t := range s.gvkToType {
if gv != gvk.GroupVersion() {
continue
}
types[gvk.Kind] = t
}
return types
}
// AllKnownTypes returns the all known types.
func (s *Scheme) AllKnownTypes() map[schema.GroupVersionKind]reflect.Type {
return s.gvkToType
}
// ObjectKinds returns all possible group,version,kind of the go object, true if the
// object is considered unversioned, or an error if it's not a pointer or is unregistered.
func (s *Scheme) ObjectKinds(obj Object) ([]schema.GroupVersionKind, bool, error) {
// Unstructured objects are always considered to have their declared GVK
if _, ok := obj.(Unstructured); ok {
// we require that the GVK be populated in order to recognize the object
gvk := obj.GetObjectKind().GroupVersionKind()
if len(gvk.Kind) == 0 {
return nil, false, NewMissingKindErr("unstructured object has no kind")
}
if len(gvk.Version) == 0 {
return nil, false, NewMissingVersionErr("unstructured object has no version")
}
return []schema.GroupVersionKind{gvk}, false, nil
}
v, err := conversion.EnforcePtr(obj)
if err != nil {
return nil, false, err
}
t := v.Type()
gvks, ok := s.typeToGVK[t]
if !ok {
return nil, false, NewNotRegisteredErrForType(s.schemeName, t)
}
_, unversionedType := s.unversionedTypes[t]
return gvks, unversionedType, nil
}
// Recognizes returns true if the scheme is able to handle the provided group,version,kind
// of an object.
func (s *Scheme) Recognizes(gvk schema.GroupVersionKind) bool {
_, exists := s.gvkToType[gvk]
return exists
}
func (s *Scheme) IsUnversioned(obj Object) (bool, bool) {
v, err := conversion.EnforcePtr(obj)
if err != nil {
return false, false
}
t := v.Type()
if _, ok := s.typeToGVK[t]; !ok {
return false, false
}
_, ok := s.unversionedTypes[t]
return ok, true
}
// New returns a new API object of the given version and name, or an error if it hasn't
// been registered. The version and kind fields must be specified.
func (s *Scheme) New(kind schema.GroupVersionKind) (Object, error) {
if t, exists := s.gvkToType[kind]; exists {
return reflect.New(t).Interface().(Object), nil
}
if t, exists := s.unversionedKinds[kind.Kind]; exists {
return reflect.New(t).Interface().(Object), nil
}
return nil, NewNotRegisteredErrForKind(s.schemeName, kind)
}
// Log sets a logger on the scheme. For test purposes only.
func (s *Scheme) Log(l conversion.DebugLogger) {
s.converter.Debug = l
}
// AddIgnoredConversionType identifies a pair of types that should be skipped by
// conversion (because the data inside them is explicitly dropped during
// conversion).
func (s *Scheme) AddIgnoredConversionType(from, to interface{}) error {
return s.converter.RegisterIgnoredConversion(from, to)
}
// AddConversionFuncs adds functions to the list of conversion functions. The given
// functions should know how to convert between two of your API objects, or their
// sub-objects. We deduce how to call these functions from the types of their two
// parameters; see the comment for Converter.Register.
//
// Note that, if you need to copy sub-objects that didn't change, you can use the
// conversion.Scope object that will be passed to your conversion function.
// Additionally, all conversions started by Scheme will set the SrcVersion and
// DestVersion fields on the Meta object. Example:
//
// s.AddConversionFuncs(
// func(in *InternalObject, out *ExternalObject, scope conversion.Scope) error {
// // You can depend on Meta() being non-nil, and this being set to
// // the source version, e.g., ""
// s.Meta().SrcVersion
// // You can depend on this being set to the destination version,
// // e.g., "v1".
// s.Meta().DestVersion
// // Call scope.Convert to copy sub-fields.
// s.Convert(&in.SubFieldThatMoved, &out.NewLocation.NewName, 0)
// return nil
// },
// )
//
// (For more detail about conversion functions, see Converter.Register's comment.)
//
// Also note that the default behavior, if you don't add a conversion function, is to
// sanely copy fields that have the same names and same type names. It's OK if the
// destination type has extra fields, but it must not remove any. So you only need to
// add conversion functions for things with changed/removed fields.
func (s *Scheme) AddConversionFuncs(conversionFuncs ...interface{}) error {
for _, f := range conversionFuncs {
if err := s.converter.RegisterConversionFunc(f); err != nil {
return err
}
}
return nil
}
// AddConversionFunc registers a function that converts between a and b by passing objects of those
// types to the provided function. The function *must* accept objects of a and b - this machinery will not enforce
// any other guarantee.
func (s *Scheme) AddConversionFunc(a, b interface{}, fn conversion.ConversionFunc) error {
return s.converter.RegisterUntypedConversionFunc(a, b, fn)
}
// AddGeneratedConversionFunc registers a function that converts between a and b by passing objects of those
// types to the provided function. The function *must* accept objects of a and b - this machinery will not enforce
// any other guarantee.
func (s *Scheme) AddGeneratedConversionFunc(a, b interface{}, fn conversion.ConversionFunc) error {
return s.converter.RegisterGeneratedUntypedConversionFunc(a, b, fn)
}
// AddFieldLabelConversionFunc adds a conversion function to convert field selectors
// of the given kind from the given version to internal version representation.
func (s *Scheme) AddFieldLabelConversionFunc(gvk schema.GroupVersionKind, conversionFunc FieldLabelConversionFunc) error {
s.fieldLabelConversionFuncs[gvk] = conversionFunc
return nil
}
// RegisterInputDefaults sets the provided field mapping function and field matching
// as the defaults for the provided input type. The fn may be nil, in which case no
// mapping will happen by default. Use this method to register a mechanism for handling
// a specific input type in conversion, such as a map[string]string to structs.
func (s *Scheme) RegisterInputDefaults(in interface{}, fn conversion.FieldMappingFunc, defaultFlags conversion.FieldMatchingFlags) error {
return s.converter.RegisterInputDefaults(in, fn, defaultFlags)
}
// AddTypeDefaultingFunc registers a function that is passed a pointer to an
// object and can default fields on the object. These functions will be invoked
// when Default() is called. The function will never be called unless the
// defaulted object matches srcType. If this function is invoked twice with the
// same srcType, the fn passed to the later call will be used instead.
func (s *Scheme) AddTypeDefaultingFunc(srcType Object, fn func(interface{})) {
s.defaulterFuncs[reflect.TypeOf(srcType)] = fn
}
// Default sets defaults on the provided Object.
func (s *Scheme) Default(src Object) {
if fn, ok := s.defaulterFuncs[reflect.TypeOf(src)]; ok {
fn(src)
}
}
// Convert will attempt to convert in into out. Both must be pointers. For easy
// testing of conversion functions. Returns an error if the conversion isn't
// possible. You can call this with types that haven't been registered (for example,
// to test conversion of types that are nested within registered types). The
// context interface is passed to the converter. Convert also supports Unstructured
// types and will convert them intelligently.
func (s *Scheme) Convert(in, out interface{}, context interface{}) error {
unstructuredIn, okIn := in.(Unstructured)
unstructuredOut, okOut := out.(Unstructured)
switch {
case okIn && okOut:
// converting unstructured input to an unstructured output is a straight copy - unstructured
// is a "smart holder" and the contents are passed by reference between the two objects
unstructuredOut.SetUnstructuredContent(unstructuredIn.UnstructuredContent())
return nil
case okOut:
// if the output is an unstructured object, use the standard Go type to unstructured
// conversion. The object must not be internal.
obj, ok := in.(Object)
if !ok {
return fmt.Errorf("unable to convert object type %T to Unstructured, must be a runtime.Object", in)
}
gvks, unversioned, err := s.ObjectKinds(obj)
if err != nil {
return err
}
gvk := gvks[0]
// if no conversion is necessary, convert immediately
if unversioned || gvk.Version != APIVersionInternal {
content, err := DefaultUnstructuredConverter.ToUnstructured(in)
if err != nil {
return err
}
unstructuredOut.SetUnstructuredContent(content)
unstructuredOut.GetObjectKind().SetGroupVersionKind(gvk)
return nil
}
// attempt to convert the object to an external version first.
target, ok := context.(GroupVersioner)
if !ok {
return fmt.Errorf("unable to convert the internal object type %T to Unstructured without providing a preferred version to convert to", in)
}
// Convert is implicitly unsafe, so we don't need to perform a safe conversion
versioned, err := s.UnsafeConvertToVersion(obj, target)
if err != nil {
return err
}
content, err := DefaultUnstructuredConverter.ToUnstructured(versioned)
if err != nil {
return err
}
unstructuredOut.SetUnstructuredContent(content)
return nil
case okIn:
// converting an unstructured object to any type is modeled by first converting
// the input to a versioned type, then running standard conversions
typed, err := s.unstructuredToTyped(unstructuredIn)
if err != nil {
return err
}
in = typed
}
flags, meta := s.generateConvertMeta(in)
meta.Context = context
if flags == 0 {
flags = conversion.AllowDifferentFieldTypeNames
}
return s.converter.Convert(in, out, flags, meta)
}
// ConvertFieldLabel alters the given field label and value for a kind field selector from
// versioned representation to an unversioned one or returns an error.
func (s *Scheme) ConvertFieldLabel(gvk schema.GroupVersionKind, label, value string) (string, string, error) {
conversionFunc, ok := s.fieldLabelConversionFuncs[gvk]
if !ok {
return DefaultMetaV1FieldSelectorConversion(label, value)
}
return conversionFunc(label, value)
}
// ConvertToVersion attempts to convert an input object to its matching Kind in another
// version within this scheme. Will return an error if the provided version does not
// contain the inKind (or a mapping by name defined with AddKnownTypeWithName). Will also
// return an error if the conversion does not result in a valid Object being
// returned. Passes target down to the conversion methods as the Context on the scope.
func (s *Scheme) ConvertToVersion(in Object, target GroupVersioner) (Object, error) {
return s.convertToVersion(true, in, target)
}
// UnsafeConvertToVersion will convert in to the provided target if such a conversion is possible,
// but does not guarantee the output object does not share fields with the input object. It attempts to be as
// efficient as possible when doing conversion.
func (s *Scheme) UnsafeConvertToVersion(in Object, target GroupVersioner) (Object, error) {
return s.convertToVersion(false, in, target)
}
// convertToVersion handles conversion with an optional copy.
func (s *Scheme) convertToVersion(copy bool, in Object, target GroupVersioner) (Object, error) {
var t reflect.Type
if u, ok := in.(Unstructured); ok {
typed, err := s.unstructuredToTyped(u)
if err != nil {
return nil, err
}
in = typed
// unstructuredToTyped returns an Object, which must be a pointer to a struct.
t = reflect.TypeOf(in).Elem()
} else {
// determine the incoming kinds with as few allocations as possible.
t = reflect.TypeOf(in)
if t.Kind() != reflect.Ptr {
return nil, fmt.Errorf("only pointer types may be converted: %v", t)
}
t = t.Elem()
if t.Kind() != reflect.Struct {
return nil, fmt.Errorf("only pointers to struct types may be converted: %v", t)
}
}
kinds, ok := s.typeToGVK[t]
if !ok || len(kinds) == 0 {
return nil, NewNotRegisteredErrForType(s.schemeName, t)
}
gvk, ok := target.KindForGroupVersionKinds(kinds)
if !ok {
// try to see if this type is listed as unversioned (for legacy support)
// TODO: when we move to server API versions, we should completely remove the unversioned concept
if unversionedKind, ok := s.unversionedTypes[t]; ok {
if gvk, ok := target.KindForGroupVersionKinds([]schema.GroupVersionKind{unversionedKind}); ok {
return copyAndSetTargetKind(copy, in, gvk)
}
return copyAndSetTargetKind(copy, in, unversionedKind)
}
return nil, NewNotRegisteredErrForTarget(s.schemeName, t, target)
}
// target wants to use the existing type, set kind and return (no conversion necessary)
for _, kind := range kinds {
if gvk == kind {
return copyAndSetTargetKind(copy, in, gvk)
}
}
// type is unversioned, no conversion necessary
if unversionedKind, ok := s.unversionedTypes[t]; ok {
if gvk, ok := target.KindForGroupVersionKinds([]schema.GroupVersionKind{unversionedKind}); ok {
return copyAndSetTargetKind(copy, in, gvk)
}
return copyAndSetTargetKind(copy, in, unversionedKind)
}
out, err := s.New(gvk)
if err != nil {
return nil, err
}
if copy {
in = in.DeepCopyObject()
}
flags, meta := s.generateConvertMeta(in)
meta.Context = target
if err := s.converter.Convert(in, out, flags, meta); err != nil {
return nil, err
}
setTargetKind(out, gvk)
return out, nil
}
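A rough model of the decision order inside convertToVersion may help: first look for the target group/version among the type's own registered kinds (no conversion needed, just set the kind), then fall back to the legacy unversioned kind, and only otherwise perform a real conversion. This is a hypothetical simplification — the real Scheme delegates the choice to a GroupVersioner and can fail in more ways — and all names below are illustrative:

```python
def select_target_kind(registered_kinds, unversioned_kind, target_gv):
    """Pick (gvk, needs_conversion) for a type with the given registered kinds.

    Kinds and unversioned_kind are (group, version, kind) tuples; target_gv is
    a (group, version) pair.  Illustrative names, not the real API.
    """
    # 1. the target names one of this type's own kinds: just set the kind
    for gvk in registered_kinds:
        if gvk[:2] == target_gv:
            return gvk, False
    # 2. legacy fallback: unversioned types are returned as-is
    if unversioned_kind is not None:
        return unversioned_kind, False
    # 3. otherwise a real conversion is required (kind name is stable here)
    if not registered_kinds:
        raise LookupError(f"type not registered for target {target_gv}")
    group, version = target_gv
    return (group, version, registered_kinds[0][2]), True
```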
// unstructuredToTyped attempts to transform an unstructured object to a typed
// object if possible. It returns the versioned Go form of the object, or an
// error if conversion is not possible. Note that this conversion will lose fields.
func (s *Scheme) unstructuredToTyped(in Unstructured) (Object, error) {
// the type must be something we recognize
gvks, _, err := s.ObjectKinds(in)
if err != nil {
return nil, err
}
typed, err := s.New(gvks[0])
if err != nil {
return nil, err
}
if err := DefaultUnstructuredConverter.FromUnstructured(in.UnstructuredContent(), typed); err != nil {
return nil, fmt.Errorf("unable to convert unstructured object to %v: %v", gvks[0], err)
}
return typed, nil
}
// generateConvertMeta constructs the meta value we pass to Convert.
func (s *Scheme) generateConvertMeta(in interface{}) (conversion.FieldMatchingFlags, *conversion.Meta) {
return s.converter.DefaultMeta(reflect.TypeOf(in))
}
// copyAndSetTargetKind performs a conditional copy before returning the object, or an error if copy was not successful.
func copyAndSetTargetKind(copy bool, obj Object, kind schema.GroupVersionKind) (Object, error) {
if copy {
obj = obj.DeepCopyObject()
}
setTargetKind(obj, kind)
return obj, nil
}
// setTargetKind sets the kind on an object, taking into account whether the target kind is the internal version.
func setTargetKind(obj Object, kind schema.GroupVersionKind) {
if kind.Version == APIVersionInternal {
// internal is a special case
// TODO: look at removing the need to special case this
obj.GetObjectKind().SetGroupVersionKind(schema.GroupVersionKind{})
return
}
obj.GetObjectKind().SetGroupVersionKind(kind)
}
// SetVersionPriority allows specifying a precise order of priority. All specified versions must be in the same group,
// and the specified order overwrites any previously specified order for this group.
func (s *Scheme) SetVersionPriority(versions ...schema.GroupVersion) error {
groups := sets.String{}
order := []string{}
for _, version := range versions {
if len(version.Version) == 0 || version.Version == APIVersionInternal {
return fmt.Errorf("internal versions cannot be prioritized: %v", version)
}
groups.Insert(version.Group)
order = append(order, version.Version)
}
if len(groups) != 1 {
return fmt.Errorf("must register versions for exactly one group: %v", strings.Join(groups.List(), ", "))
}
s.versionPriority[groups.List()[0]] = order
return nil
}
// PrioritizedVersionsForGroup returns versions for a single group in priority order
func (s *Scheme) PrioritizedVersionsForGroup(group string) []schema.GroupVersion {
ret := []schema.GroupVersion{}
for _, version := range s.versionPriority[group] {
ret = append(ret, schema.GroupVersion{Group: group, Version: version})
}
for _, observedVersion := range s.observedVersions {
if observedVersion.Group != group {
continue
}
found := false
for _, existing := range ret {
if existing == observedVersion {
found = true
break
}
}
if !found {
ret = append(ret, observedVersion)
}
}
return ret
}
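PrioritizedVersionsForGroup and the two *AllGroups variants below share one merge pattern: explicitly prioritized versions come first, then any observed versions not already present are appended. A compact Python rendering of that merge (illustrative only):

```python
def prioritized_versions(priority, observed, group):
    """priority: {group: [version, ...]}; observed: [(group, version), ...]."""
    ret = [(group, v) for v in priority.get(group, [])]
    for gv in observed:
        if gv[0] == group and gv not in ret:
            ret.append(gv)  # observed-but-unprioritized versions go last
    return ret
```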
// PrioritizedVersionsAllGroups returns all known versions in their priority order. Groups are random, but
// versions for a single group are prioritized.
func (s *Scheme) PrioritizedVersionsAllGroups() []schema.GroupVersion {
ret := []schema.GroupVersion{}
for group, versions := range s.versionPriority {
for _, version := range versions {
ret = append(ret, schema.GroupVersion{Group: group, Version: version})
}
}
for _, observedVersion := range s.observedVersions {
found := false
for _, existing := range ret {
if existing == observedVersion {
found = true
break
}
}
if !found {
ret = append(ret, observedVersion)
}
}
return ret
}
// PreferredVersionAllGroups returns the most preferred version for every group.
// Group ordering is random.
func (s *Scheme) PreferredVersionAllGroups() []schema.GroupVersion {
ret := []schema.GroupVersion{}
for group, versions := range s.versionPriority {
for _, version := range versions {
ret = append(ret, schema.GroupVersion{Group: group, Version: version})
break
}
}
for _, observedVersion := range s.observedVersions {
found := false
for _, existing := range ret {
if existing.Group == observedVersion.Group {
found = true
break
}
}
if !found {
ret = append(ret, observedVersion)
}
}
return ret
}
// IsGroupRegistered returns true if types for the group have been registered with the scheme.
func (s *Scheme) IsGroupRegistered(group string) bool {
for _, observedVersion := range s.observedVersions {
if observedVersion.Group == group {
return true
}
}
return false
}
// IsVersionRegistered returns true if types for the version have been registered with the scheme.
func (s *Scheme) IsVersionRegistered(version schema.GroupVersion) bool {
for _, observedVersion := range s.observedVersions {
if observedVersion == version {
return true
}
}
return false
}
func (s *Scheme) addObservedVersion(version schema.GroupVersion) {
if len(version.Version) == 0 || version.Version == APIVersionInternal {
return
}
for _, observedVersion := range s.observedVersions {
if observedVersion == version {
return
}
}
s.observedVersions = append(s.observedVersions, version)
}
func (s *Scheme) Name() string {
return s.schemeName
}
// internalPackages are packages that are ignored when creating a default reflector name. These packages are in the common
// call chains to NewReflector, so they'd be low entropy names for reflectors
var internalPackages = []string{"k8s.io/apimachinery/pkg/runtime/scheme.go"}
| {
"pile_set_name": "Github"
} |
;; Latent Feature Model using Indian Buffet Process
(defpackage :nonparametric.lfm
; (:nicknames :lfm)
(:use :cl :nonpara.stat :hjs.util.meta
:hjs.util.matrix :hjs.util.vector
:nonparametric.dpm)
(:export :ibp
:ibp-row
:ibp-distribution
:lfm
:lfm-row
:row-weight
:lfm-distribution
))
(in-package :nonparametric.lfm)
(defclass ibp (dpm)
((base-distribution :initform (make-instance 'ibp-distribution))
(past-assign :initform (make-hash-table) :accessor ibp-past-assign)))
(defclass ibp-row (cluster) ())
(defclass ibp-distribution (dp-distribution)
((cluster-class :initform 'ibp-row)))
(defclass lfm (ibp)
((base-distribution :initform (make-instance 'lfm-distribution))
(feature-ave :initarg :feat-ave :accessor lfm-feature-average)
(tmp-A :initform nil :accessor lfm-tmp-A)
(tmp-z :initform nil :accessor lfm-tmp-z)
(tmp-data :accessor lfm-tmp-data)))
(defclass lfm-row (ibp-row)
((weight :accessor row-weight)))
(defclass lfm-distribution (ibp-distribution)
((cluster-class :initform 'lfm-row)
(std-of-data :initform 1d0 :accessor data-std)
(std-matrix :accessor std-matrix)
(feature-average :initform 0d0 :accessor feature-average)
(feature-std :initform 1d0 :accessor feature-std)))
;; assume (point-cluster customer) is an adjustable array
(defmethod add-customer ((dpm ibp) customer N &rest args &key &allow-other-keys)
(let ((k (dpm-k dpm))
(clusters (dpm-clusters dpm))
(layers (dpm-cluster-layers dpm))
(data (point-data customer))
(dist (dpm-base dpm))
(flip (point-cluster customer))
(table (ibp-past-assign dpm)))
(clrhash table)
(loop until (zerop (fill-pointer flip))
for c = (vector-pop flip) do
(setf (gethash c table) t))
(loop for i from 0 below k
for c = (aref clusters i) do
(let ((coin (bernoulli (apply #'density-to-cluster dpm c data :past table :n n args))))
(if (zerop coin)
(setf (gethash c table) nil)
(progn
(setf (gethash c table) t)
(vector-push-extend c flip))))
finally
(loop for c across flip
for old = (cluster-size c)
for ref = (position c clusters :start (aref layers old)) do
(apply #'add-to-cluster c data args)
(cluster-rotation ref clusters layers old)))
;; try to add new row
(let ((new (sample-new-row dpm customer table n)))
(dotimes (i new)
(incf (dpm-k dpm))
(let* ((ref (aref layers 0))
(new-cluster (make-new-cluster dpm dist customer (unless (= ref (length clusters))
(aref clusters ref)))))
(when (= ref (the fixnum (length clusters)))
;; extend array related to tables
(vector-push-extend new-cluster clusters))
(apply #'add-to-cluster new-cluster data args)
(vector-push-extend new-cluster flip)
(when (= (length layers) 1)
;;; first addition
(vector-push-extend 0 layers))
(incf (aref layers 0)) ;; this is all to do for rotation
)))
flip))
;; passed customer is <struct point> !!
(defmethod make-new-cluster ((dpm lfm) (dist lfm-distribution) customer &optional discarded-cluster)
(declare (optimize (speed 3) (safety 0) (debug 0))
(ignore discarded-cluster))
(let* ((new (call-next-method))
(pdata (copy-seq (point-data customer)))
(clusters (point-cluster customer))
(ave (feature-average dist))
(std (let ((s (feature-std dist)))
(sqrt (+ 1d0 (* s s)))))
(m (std-matrix dist))
(l (length pdata)))
(declare (type double-float ave std))
(loop for c across clusters
for feat = (row-weight c) do
(loop for i from 0 below l do
(decf (aref pdata i) (aref feat i))))
(loop for i from 0 below l do
(setf (aref m i i) std))
(setf (row-weight new)
(multivariate-normal-random
(map-into pdata #'(lambda (x)
(declare (type double-float x))
(/ (+ x ave) 2d0)) pdata)
m))
new
))
(defmethod sample-cluster-parameters ((cluster ibp-row) (dist ibp-distribution) (dpm ibp))
cluster)
;; sample from conditional distribution
(defmethod sample-cluster-parameters ((cluster lfm-row) (dist lfm-distribution) (dpm lfm))
(declare (optimize (speed 3) (safety 0) (debug 0)))
(let* ((data (dpm-data dpm))
(w (row-weight cluster))
(l (length w))
(size (cluster-size cluster))
(std (std-matrix (dpm-base dpm)))
(tmp (lfm-tmp-data dpm)))
(declare (type fixnum size))
(fill tmp 0d0)
(loop for d across data do
(when (find cluster (point-cluster d))
(v+ tmp (point-data d) tmp)
(loop for c across (point-cluster d) do
(unless (eq c cluster)
(v- tmp (row-weight c) tmp)))))
(map-into tmp #'(lambda (x) (declare (type double-float x)) (/ x size)) tmp)
;; edit std
(loop with s double-float = (/ (data-std dist) (sqrt l))
for i from 0 below l do
(setf (aref std i i) s))
(multivariate-normal-random tmp std w)))
(defun density-to-z (z+ A data std tmp)
(%multivariate-normal-logged-density (m*v A z+ tmp) std data))
(defun make-z (k &optional result)
(declare (type fixnum k)
(type (or null dvec) result))
(if (and result (= (the fixnum (length result)) k))
(fill result 0d0)
(make-dvec k 0d0)))
(defun make-A (clusters d k &optional result)
(declare (optimize (speed 3) (safety 0) (debug 0))
(type fixnum d k)
(type (or null dmat) result))
(unless (and result (= k (array-dimension result 1)))
(setf result nil))
(let ((ans (or result (make-dmat d k))))
(declare (type dmat ans))
(loop for i from 0 below k
for w = (row-weight (aref clusters i)) do
(loop for j from 0 below d do
(setf (aref ans j i) (aref w j))))
ans))
(defmethod density-to-cluster ((dpm lfm) (cluster lfm-row) data &rest args &key past n)
(declare (optimize (speed 3) (safety 0) (debug 0))
(type dvec data))
(let* ((k (dpm-k dpm))
(clusters (dpm-clusters dpm))
(z+ (setf (lfm-tmp-z dpm) (make-z k (lfm-tmp-z dpm))))
(A (setf (lfm-tmp-A dpm) (make-A clusters (length data) k (lfm-tmp-A dpm))))
(std (std-matrix (dpm-base dpm)))
(ref (position cluster clusters :start (aref (dpm-cluster-layers dpm) (cluster-size cluster))))
(buffet (/ (cluster-size cluster) N))
(tmp (lfm-tmp-data dpm)))
(declare (type dvec z+))
;; edit std
(loop with s = (/ (data-std (dpm-base dpm)))
for i from 0 below (length tmp) do
(setf (aref std i i) s))
;; construct z+
(loop for i from 0 below k
for c = (aref clusters i)
when (or (= i ref) (gethash c past)) do
(incf (aref z+ i)))
(let ((positive (+ (log buffet) (density-to-z z+ A data std tmp)))
(negative (+ (log (- 1 buffet))
(progn (decf (aref z+ ref)) (density-to-z z+ A data std tmp)))))
(/ (1+ (the double-float (safe-exp (- negative positive))))))))
(defmethod remove-customer ((dpm ibp) customer &rest args &key &allow-other-keys)
(let ((clusters (dpm-clusters dpm))
(layers (dpm-cluster-layers dpm)))
(loop for c across (point-cluster customer) do
(let* ((old (cluster-size c))
(ref (position c clusters :start (aref layers old)))
(new (apply #'remove-from-cluster c (point-data customer) args))
(new-position (1- (the fixnum (aref layers new)))))
(when (zerop new)
(decf (dpm-k dpm)))
(rotatef (aref clusters ref)
(aref clusters new-position))
(decf (aref layers new))))))
(defparameter *ibp-sample-threshold* 10)
(defun sample-new-row (dpm customer table n)
(let* ((p (dpm-p dpm))
(dist (dpm-base dpm))
(alpha (dpm-hyper dpm))
(lambda (/ alpha n))
(poisson-base (max least-positive-double-float (expt #.(exp 1) (- lambda))))
(max most-negative-double-float)
(pdata (copy-seq (point-data customer)))
(clusters (dpm-clusters dpm))
(l (length pdata))
(k (dpm-k dpm))
(z+ (make-z k (lfm-tmp-z dpm)))
(A (make-A clusters l k (lfm-tmp-A dpm)))
c-ave)
(if (zerop k)
(setf c-ave (make-dvec l 0d0))
(progn
(loop for i from 0 below k
for c = (aref clusters i)
when (gethash c table) do
(incf (aref z+ i)))
(setf c-ave (m*v A z+))))
(setf (fill-pointer p) 0)
(let ((den (+ (log poisson-base) (base-distribution dpm dist customer :n 0 :ave c-ave))))
(vector-push-extend den p)
(setf max (max max den)))
(setf poisson-base (max least-positive-double-float (* poisson-base lambda)))
(loop
while (>= poisson-base least-positive-double-float)
for i from 1 below *ibp-sample-threshold*
for den = (+ (log poisson-base) (base-distribution dpm dist customer :n i :ave c-ave)) do
(vector-push-extend den p)
(setf max (max max den))
(setf poisson-base (* (/ poisson-base (1+ i)) lambda)))
(let ((sum (jackup-logged-prob p max)))
(randomize-choice p sum))))
(defmethod base-distribution ((dpm lfm) (dist lfm-distribution) customer &rest args &key n ave)
(declare (optimize (speed 3) (safety 0) (debug 0))
(type fixnum n)
(type dvec ave))
(let* ((k (dpm-k dpm))
(k+ (+ k n)))
(if (zerop k+)
most-negative-double-float ;;; should make new cluster
(let* ((new (lfm-tmp-data dpm))
(std (std-matrix dist))
(bstd (data-std dist))
(fa (feature-average dist))
(fs (feature-std dist))
(data (point-data customer))
(l (length (dpm-data dpm))))
(declare (type double-float bstd fa fs)
(type fixnum l)
(type dvec new)
(type dmat std))
(map-into new #'(lambda (x) (+ x (* n fa))) ave)
(loop with s double-float = (/ (sqrt (+ (* bstd bstd)
(* n (* fs fs)))))
for i from 0 below l do
(setf (aref std i i) s))
(+ (the double-float (%multivariate-normal-logged-density new std data))
(* n #.(log 0.5d0)))))))
(defmethod initialize ((dpm ibp))
(let ((d (length (point-data (aref (dpm-data dpm) 0))))
(dist (dpm-base dpm)))
(setf (lfm-tmp-data dpm) (make-dvec d))
(setf (std-matrix dist) (diag d (data-std dist))))
(loop for point across (shuffle-vector (dpm-data dpm))
for i from 1 do
(add-customer dpm point i))
(parameters-sampling dpm)
(when (estimate-base? dpm)
(sample-distribution dpm (dpm-base dpm)))
(hypers-sampling dpm))
(defmethod initialize ((dpm lfm))
(setf (feature-average (dpm-base dpm)) (lfm-feature-average dpm))
(call-next-method))
(defmethod seatings-sampling ((dpm ibp))
(loop
with n = (length (dpm-data dpm))
for point across (shuffle-vector (dpm-data dpm)) do
(remove-customer dpm point)
(add-customer dpm point n)))
(defun make-feat-data (size features)
(let ((data (make-array size)))
(loop for i from 0 below size
for s = (make-array (length (aref features 0)) :element-type 'double-float :initial-element 0d0) do
(loop for feat across features
when (zerop (random 2)) do
(loop for f across feat
for j from 0 do
(incf (aref s j) f)))
(setf (aref data i) s))
data))
;; hypers-sampling -- test implementation
#+ignore
(defmethod hypers-sampling ((dpm ibp))
"skip") | {
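The prior this file implements — the Indian Buffet Process, in which customer i takes each existing dish k with probability m_k/i (m_k being the number of earlier customers who took it) and then draws Poisson(alpha/i) brand-new dishes — can be sketched as standalone Python, independent of the Lisp code above. This is an illustrative sketch with made-up function names, not a port of the sampler in this file:

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's Poisson sampler (fine for the small rates used here)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_ibp(n_customers, alpha, rng):
    """Return (per-customer dish sets, per-dish popularity counts)."""
    dish_counts = []   # dish_counts[k] = number of customers who took dish k
    assignments = []
    for i in range(1, n_customers + 1):
        dishes = set()
        for k in range(len(dish_counts)):          # existing dishes only
            if rng.random() < dish_counts[k] / i:  # popular dishes attract
                dishes.add(k)
                dish_counts[k] += 1
        for _ in range(sample_poisson(alpha / i, rng)):  # try new dishes
            dishes.add(len(dish_counts))
            dish_counts.append(1)
        assignments.append(dishes)
    return assignments, dish_counts
```

The resulting binary customer-by-dish matrix is exchangeable in the customers, which is what makes the Gibbs updates in the Lisp code above valid.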
"pile_set_name": "Github"
} |
# The following strings are byte-for-byte equivalent:
key1 = "The quick brown fox jumps over the lazy dog."
key2 = """
The quick brown \
fox jumps over \
the lazy dog."""
key3 = """\
The quick brown \
fox jumps over \
the lazy dog.\
"""
| {
"pile_set_name": "Github"
} |
%{
#include <stdio.h>
#ifdef IRIX
#include <pfmt.h>
#endif
#include <ncarg/c.h>
#include <ncarg/hlu/hluP.h>
#include <ncarg/hlu/NresDB.h>
#include "defs.h"
#include "NclDataDefs.h"
#include "Symbol.h"
#include "SrcTree.h"
#include "Machine.h"
#include <errno.h>
#include <ctype.h>
#include <sys/types.h>
#include <sys/stat.h>
#if defined(HPUX)
#include <dl.h>
#else
#include <dlfcn.h>
#endif
int scopelevel = 0;
extern int yydebug;
extern char *yytext;
extern FILE *thefptr;
extern FILE *theoptr;
extern int cmd_line;
extern int cur_line_length;
extern int cur_line_number;
extern int last_line_length;
extern char *cur_line_text;
extern int ok_to_start_vsblk;
extern void ResetCurLine(void);
extern NhlErrorTypes _NclPreLoadScript(char *path, int status);
extern int _NclTranslate(void *root, FILE *fp);
extern int yylex (void);
#define ERROR(x) NhlPError(NhlFATAL,NhlEUNKNOWN,"%s",(x))
#define PRINTLOCATION(errlev) \
if (loading > 0) \
NhlPError(errlev,NhlEUNKNOWN,"error at line %d in file %s\n",cur_line_number,cur_load_file); \
else\
NhlPError(errlev,NhlEUNKNOWN,"error at line %d\n",cur_line_number);
int is_error = 0;
int block_syntax_error = 0;
int ret_urn = 0;
ExtStack *tmp_sym= NULL;
/*
extern int _NclTranslate(
#ifdef NhlNeedProto
void*,
FILE*
#endif
);
*/
extern void _NclTransTerminate(
#ifdef NhlNeedProto
void
#endif
);
extern int rec;
extern FILE* recfp;
extern FILE* yyin;
int loading = 0;
int preloading = 0;
int top_level_line;
char *cur_load_file = NULL;
char *ncl_cur_func = NULL;
%}
%union {
int integer;
long long int_val;
double real;
char str[NCL_MAX_STRING];
char *sstr;
struct _NclSymbol *sym;
void *src_node;
struct src_node_list *list;
struct ncl_rcl_list *array;
struct ncl_rcl_list *listvar;
}
%token <void> EOLN
%token <void> EOFF
%token <void> RP LP RBC LBC RBK LBK COLON ',' SEMI MARKER LPSLSH SLSHRP DIM_MARKER FSTRING EFSTRING ASTRING CSTRING
%token <void> GSTRING LBKSLSH SLSHRBK
%token <integer> DIMNUM
%token <int_val> INT
%token <real> REAL
%token <str> DIM DIMNAME ATTNAME COORDV FVAR
%token <str> GVAR
%token <sstr> STRING
%token <sym> INTEGER UINT FLOAT LONG ULONG INT64 UINT64 DOUBLE BYTE UBYTE CHARACTER GRAPHIC STRNG
%token <sym> NUMERIC ENUMERIC SNUMERIC FILETYPE SHORT USHORT LOGICAL
%token <sym> GROUP GROUPTYPE COMPOUND UNDEFFILEGROUP
%token <sym> UNDEF VAR WHILE DO QUIT NPROC PIPROC IPROC UNDEFFILEVAR BREAK NOPARENT NCLNULL LIST
%token <sym> BGIN END NFUNC IFUNC FDIM IF THEN VBLKNAME CONTINUE
%token <sym> DFILE KEYFUNC KEYPROC ELSE ELSEIF EXTERNAL NCLEXTERNAL RETURN VSBLKGET NEW
%token <sym> OBJVAR OBJTYPE RECORD VSBLKCREATE VSBLKSET LOCAL STOP NCLTRUE NCLFALSE NCLMISSING DLIB
%token '='
%token REASSIGN
%token OR
%token XOR
%token AND
%token GT
%token GE
%token LT
%token LE
%token EQ
%token NE
%token '<'
%token '>'
%token '+'
%token '-'
%token '*'
%token '#'
%token '/'
%token '%'
%token '^'
%token UNOP
%token NOT
%right '='
%right REASSIGN
%left OR XOR
%left AND
%left GT GE LT LE EQ NE
%left '<' '>'
%left '+' '-'
%left '*' '#' '/' '%'
%left '^'
%left UNOP NOT
%type <array> expr_list
%type <listvar> list_expr_list
%type <src_node> statement assignment reassignment
%type <src_node> procedure function_def procedure_def fp_block block do conditional elseif
%type <src_node> visblk statement_list
%type <src_node> declaration identifier expr v_parent
%type <src_node> subscript0 subscript2 break_cont vcreate list_subscript
%type <src_node> subscript3 subscript1 subexpr primary function array listvar error filevarselector coordvarselector attributeselector
%type <src_node> filegroupselector
%type <list> the_list arg_dec_list subscript_list opt_arg_list named_subscript_list normal_subscript_list
%type <list> block_statement_list cond_block_list resource_list dim_size_list
%type <list> arg_list do_stmnt resource vset vget get_resource get_resource_list
%type <sym> datatype pfname vname
%type <sym> func_identifier proc_identifier anysym
%%
statement_list : statement eoln {
int strt;
/*
fprintf(stdout,"is_error0 %d\n",is_error);
fflush(stdout);
*/
if(($1 != NULL)&&!(is_error||block_syntax_error)) {
_NclPrintTree($1,thefptr);
strt = _NclTranslate($1,thefptr);
_NclTransTerminate();
#ifdef NCLDEBUG
_NclPrintMachine(strt,-1,theoptr);
#endif
#ifndef PRINTTREEONLY
_NclExecute(strt);
#endif
_NclResetNewSymStack();
_NclFreeTree();
} else {
_NclDeleteNewSymStack();
_NclFreeTree();
is_error = 0;
if(block_syntax_error) {
NhlPError(NhlFATAL,NhlEUNKNOWN,"Syntax Error in block, block not executed");
PRINTLOCATION(NhlFATAL);
block_syntax_error = 0;
}
}
}
| statement_list statement eoln {
int strt;
/*
fprintf(stdout,"is_error1 %d\n",is_error);
fflush(stdout);
*/
if(($2 != NULL) && !(is_error||block_syntax_error)) {
_NclPrintTree($2,thefptr);
strt = _NclTranslate($2,thefptr);
_NclTransTerminate();
#ifdef NCLDEBUG
_NclPrintMachine(strt,-1,theoptr);
#endif
#ifndef PRINTTREEONLY
_NclExecute(strt);
#endif
_NclResetNewSymStack();
_NclFreeTree();
} else {
_NclDeleteNewSymStack();
_NclFreeTree();
is_error = 0;
if(block_syntax_error) {
NhlPError(NhlFATAL,NhlEUNKNOWN,"Syntax Error in block, block not executed");
PRINTLOCATION(NhlFATAL);
block_syntax_error = 0;
}
}
}
| statement_list RECORD STRING eoln {
/*
 * These record statements have to occur here so that the record command isn't written out
* by the scanner. The scanner writes each line when an EOLN is scanned.
*/
recfp = fopen(_NGResolvePath($3),"w");
if(recfp != NULL){
rec =1;
ResetCurLine();
} else {
NhlPError(NhlWARNING,errno,"Could not open record file");
PRINTLOCATION(NhlWARNING);
rec = 0;
}
}
| RECORD STRING eoln {
recfp = fopen(_NGResolvePath($2),"w");
if(recfp != NULL){
rec =1;
ResetCurLine();
} else {
NhlPError(NhlWARNING,errno,"Could not open record file");
PRINTLOCATION(NhlWARNING);
rec = 0;
}
}
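/*
 * For reference, a typical interactive session exercising the RECORD rules
 * might look like this (the file name is hypothetical):
 *
 *   record "session.ncl"    ; start echoing each command to session.ncl
 *   x = 1 + 2
 *   stop record             ; handled by the STOP RECORD rule; closes the file
 */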
| NCLEXTERNAL UNDEF STRING {
char buffer[4*NCL_MAX_STRING];
ExtStack *tm;
$2->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
$2->u.package->so_handle = NULL;
sprintf(buffer,"%s",_NGResolvePath($3));
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($2,DLIB);
_NclPreLoadScript(buffer,0);
if(tmp_sym == NULL) {
tmp_sym = (ExtStack*)NclMalloc(sizeof(ExtStack));
tmp_sym->tmp_sym = $2;
tmp_sym->next = NULL;
} else {
tm = (ExtStack*)NclMalloc(sizeof(ExtStack));
tm->tmp_sym = $2;
tm->next = tmp_sym;
tmp_sym = tm;
}
}
| EXTERNAL UNDEF STRING eoln {
void (*init_function)(void);
$2->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
/*
fprintf(stdout,"opening: %s\n",_NGResolvePath($3));
*/
#if defined(HPUX)
$2->u.package->so_handle = shl_load(_NGResolvePath($3),BIND_IMMEDIATE,0L);
#else
$2->u.package->so_handle = dlopen(_NGResolvePath($3),RTLD_NOW);
#endif
if($2->u.package->so_handle == NULL) {
#if defined(HPUX)
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$3,strerror(errno));
#else
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$3,dlerror());
#endif
PRINTLOCATION(NhlWARNING);
} else {
#if defined(HPUX)
init_function = NULL;
(void)shl_findsym(&($2->u.package->so_handle), "Init",TYPE_UNDEFINED,(void*)&init_function);
#else
init_function = dlsym($2->u.package->so_handle, "Init");
#endif
if(init_function != NULL) {
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($2,DLIB);
(*init_function)();
$2->u.package->scope = _NclPopScope();
} else {
NhlPError(NhlWARNING,NhlEUNKNOWN,
"Could not find Init() in external file %s, file not loaded",$3);
PRINTLOCATION(NhlWARNING);
$2->u.package->scope = NULL;
}
}
}
| statement_list NCLEXTERNAL UNDEF STRING {
char buffer[4*NCL_MAX_STRING];
ExtStack *tm;
$3->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
$3->u.package->so_handle = NULL;
sprintf(buffer,"%s",_NGResolvePath($4));
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($3,DLIB);
_NclPreLoadScript(buffer,0);
if(tmp_sym == NULL) {
tmp_sym = (ExtStack*)NclMalloc(sizeof(ExtStack));
tmp_sym->tmp_sym = $3;
tmp_sym->next = NULL;
} else {
tm = (ExtStack*)NclMalloc(sizeof(ExtStack));
tm->tmp_sym = $3;
tm->next = tmp_sym;
tmp_sym = tm;
}
}
| statement_list EXTERNAL UNDEF STRING eoln {
void (*init_function)(void);
$$ = NULL;
$3->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
/*
fprintf(stdout,"opening: %s\n",_NGResolvePath($4));
*/
#if defined(HPUX)
$3->u.package->so_handle = shl_load(_NGResolvePath($4),BIND_IMMEDIATE,0L);
#else
$3->u.package->so_handle = dlopen(_NGResolvePath($4),RTLD_NOW);
#endif
if($3->u.package->so_handle == NULL) {
#if defined(HPUX)
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$4,strerror(errno));
#else
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$4,dlerror());
#endif
PRINTLOCATION(NhlWARNING);
} else {
#if defined(HPUX)
init_function = NULL;
(void)shl_findsym(&($3->u.package->so_handle), "Init",TYPE_UNDEFINED,(void*)&init_function);
#else
init_function = dlsym($3->u.package->so_handle, "Init");
#endif
if(init_function != NULL) {
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($3,DLIB);
(*init_function)();
$3->u.package->scope = _NclPopScope();
} else {
NhlPError(NhlWARNING,NhlEUNKNOWN,
"Could not find Init() in external file %s, file not loaded",$4);
PRINTLOCATION(NhlWARNING);
$3->u.package->scope = NULL;
}
}
}
;
/* This was added 9/2017 to handle single-statement/single-line conditional statements */
cond_block_list : statement {
				if($1 != NULL) {
					$$ = _NclMakeNewListNode();
					$$->next = NULL;
					$$->node = $1;
				} else {
					$$ = NULL;
				}
}
| block_statement_list {$$ = $1;}
;
block_statement_list : statement eoln {
if(is_error) {
if(!cmd_line)
block_syntax_error = 1;
_NclDeleteNewSymStack();
is_error = 0;
$1 = NULL;
} else {
_NclResetNewSymStack();
}
if($1 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
} else {
$$ = NULL;
}
}
| block_statement_list statement eoln {
/*
* This looping is necessary because ordering needs to be maintained for statement_lists
*/
NclSrcListNode *step;
if(is_error) {
if(!cmd_line)
block_syntax_error = 1;
_NclDeleteNewSymStack();
is_error = 0;
$2 = NULL;
} else {
_NclResetNewSymStack();
}
if($1 == NULL) {
if($2 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $2;
} else if($2 == NULL) {
$$ = NULL;
}
} else if($2 != NULL){
step = $1;
while(step->next != NULL) {
step = step->next;
}
step->next = _NclMakeNewListNode();
step= step->next;
step->next = NULL;
step->node = $2;
$$ = $1;
} else {
$$ = $1;
}
}
| block_statement_list RECORD STRING eoln {
recfp = fopen(_NGResolvePath($3),"w");
if(recfp != NULL){
rec =1;
ResetCurLine();
} else {
NhlPError(NhlWARNING,errno,"Could not open record file");
PRINTLOCATION(NhlWARNING);
rec = 0;
}
$$ = $1;
}
| RECORD STRING eoln {
recfp = fopen(_NGResolvePath($2),"w");
if(recfp != NULL){
rec =1;
ResetCurLine();
} else {
NhlPError(NhlWARNING,errno,"Could not open record file");
PRINTLOCATION(NhlWARNING);
rec = 0;
}
$$ = NULL;
}
| NCLEXTERNAL UNDEF STRING {
char buffer[4*NCL_MAX_STRING];
ExtStack *tm;
$2->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
$2->u.package->so_handle = NULL;
sprintf(buffer,"%s",_NGResolvePath($3));
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($2,DLIB);
_NclPreLoadScript(buffer,0);
if(tmp_sym == NULL) {
tmp_sym = (ExtStack*)NclMalloc(sizeof(ExtStack));
tmp_sym->tmp_sym = $2;
tmp_sym->next = NULL;
} else {
tm = (ExtStack*)NclMalloc(sizeof(ExtStack));
tm->tmp_sym = $2;
tm->next = tmp_sym;
tmp_sym = tm;
}
}
| EXTERNAL UNDEF STRING eoln {
void (*init_function)(void);
$$ = NULL;
$2->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
/*
fprintf(stdout,"opening: %s\n",_NGResolvePath($3));
*/
#if defined(HPUX)
$2->u.package->so_handle = shl_load(_NGResolvePath($3),BIND_IMMEDIATE,0L);
#else
$2->u.package->so_handle = dlopen(_NGResolvePath($3),RTLD_NOW);
#endif
if($2->u.package->so_handle == NULL) {
#if defined(HPUX)
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$3,strerror(errno));
#else
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$3,dlerror());
#endif
PRINTLOCATION(NhlWARNING);
} else {
#if defined(HPUX)
init_function = NULL;
(void)shl_findsym(&($2->u.package->so_handle), "Init",TYPE_UNDEFINED,(void*)&init_function);
#else
init_function = dlsym($2->u.package->so_handle, "Init");
#endif
if(init_function != NULL) {
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($2,DLIB);
(*init_function)();
$2->u.package->scope = _NclPopScope();
} else {
NhlPError(NhlWARNING,NhlEUNKNOWN,
"Could not find Init() in external file %s, file not loaded",$3);
PRINTLOCATION(NhlWARNING);
$2->u.package->scope = NULL;
}
}
}
| block_statement_list NCLEXTERNAL UNDEF STRING {
char buffer[4*NCL_MAX_STRING];
ExtStack *tm;
$3->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
$3->u.package->so_handle = NULL;
sprintf(buffer,"%s",_NGResolvePath($4));
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($3,DLIB);
_NclPreLoadScript(buffer,0);
if(tmp_sym == NULL) {
tmp_sym = (ExtStack*)NclMalloc(sizeof(ExtStack));
tmp_sym->tmp_sym = $3;
tmp_sym->next = NULL;
} else {
tm = (ExtStack*)NclMalloc(sizeof(ExtStack));
tm->tmp_sym = $3;
tm->next = tmp_sym;
tmp_sym = tm;
}
}
| block_statement_list EXTERNAL UNDEF STRING eoln {
void (*init_function)(void);
$$ = NULL;
$3->u.package = NclMalloc(sizeof(NclSharedLibraryInfo));
/*
fprintf(stdout,"opening: %s\n",_NGResolvePath($4));
*/
#if defined(HPUX)
$3->u.package->so_handle = shl_load(_NGResolvePath($4),BIND_IMMEDIATE,0L);
#else
$3->u.package->so_handle = dlopen(_NGResolvePath($4),RTLD_NOW);
#endif
if($3->u.package->so_handle == NULL) {
#if defined(HPUX)
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$4,strerror(errno));
#else
NhlPError(NhlWARNING,NhlEUNKNOWN,
"An error occurred loading the external file %s, file not loaded\n%s",
$4,dlerror());
#endif
PRINTLOCATION(NhlWARNING);
} else {
#if defined(HPUX)
init_function = NULL;
(void)shl_findsym(&($3->u.package->so_handle), "Init",TYPE_UNDEFINED,(void*)&init_function);
#else
init_function = dlsym($3->u.package->so_handle, "Init");
#endif
if(init_function != NULL) {
_NclResetNewSymStack();
_NclNewScope();
_NclChangeSymbolType($3,DLIB);
(*init_function)();
$3->u.package->scope = _NclPopScope();
} else {
NhlPError(NhlWARNING,NhlEUNKNOWN,
"Could not find Init() in external file %s, file not loaded",$4);
PRINTLOCATION(NhlWARNING);
$3->u.package->scope = NULL;
}
}
}
;
opt_eoln : { /* do nothing */ }
| opt_eoln eoln {
yyerrok;
}
;
eoln : EOLN { yyerrok; }
| EOFF {
ExtStack *tm;
yyerrok;
if(tmp_sym != NULL) {
tm = tmp_sym;
tm->tmp_sym->u.package->scope = _NclPopScope();
tmp_sym = tmp_sym->next;
NclFree(tm);
}
#ifdef MAKEAPI
ret_urn = 1;
#endif
}
;
statement : { $$ = NULL; }
| assignment {
$$ = $1;
}
| reassignment {
$$ = $1;
}
| procedure {
$$ = $1;
}
| function_def {
$$ = $1;
}
| procedure_def {
$$ = $1;
}
| block {
$$ = $1;
}
| do {
$$ = $1;
}
| conditional {
$$ = $1;
}
| break_cont {
$$ = $1;
}
| visblk {
$$ = $1;
}
| RETURN expr {
$$ = _NclMakeReturn($2);
}
| RETURN {
$$ = _NclMakeReturn(NULL);
}
| QUIT {
#ifndef MAKEAPI
return(-1);
#else
$$ = NULL;
#endif
}
| error {
$$ = NULL ;
ERROR("error in statement");
}
| STOP RECORD {
/*
 * this goes here so that the record flag gets updated before the eoln comes from the scanner.
*/
if(rec ==1 ) {
fclose(recfp);
recfp = NULL;
rec = 0;
}
$$ = NULL;
}
;
break_cont : BREAK {
$$ = _NclMakeBreakCont($1);
}
| CONTINUE {
$$ = _NclMakeBreakCont($1);
}
;
elseif : elseif ELSEIF expr then cond_block_list { _NclAppendElseIfClause($1, $3, $5);
$$ = $1;
}
| ELSEIF expr then cond_block_list { $$ = _NclMakeIfThenElse($2,$4, NULL); }
;
conditional : IF expr then cond_block_list elseif END IF {
NclSrcListNode *tmp = NULL;
if($4 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $5;
}
$$ = _NclMakeIfThenElse($2,$4,tmp);
}
| IF expr then cond_block_list elseif ELSE cond_block_list END IF {
NclSrcListNode* tmp1 = _NclMakeNewListNode();
tmp1->node = $5;
tmp1->next = NULL;
_NclAppendElseIfElseClause($5, $7);
$$ = _NclMakeIfThenElse($2, $4, tmp1);
}
| IF expr then cond_block_list END IF { $$ = _NclMakeIfThen($2,$4); }
| IF expr then cond_block_list ELSE cond_block_list END IF { $$ = _NclMakeIfThenElse($2,$4,$6); }
/* NCL-2655 */
/* When I added the elseif construct, I commented the following rules out because they appeared to be
 * superfluous. Indeed they were for "properly" formatted if-then-elses. But it turns out they were intended
* to handle if-then-elses written on a single line. "block_statement_lists" expect EOLN terminated "statement"s,
* whereas the "statement" rule does not have the EOLN built in. I created a new rule, cond_block_list, which
* now comprises the body of conditional statements; it expects either a "statement" or a block_statement_list.
* The following rules are now indeed superfluous with the addition of the cond_block_list rule.
* They should eventually be removed --RLB 9/2017
*/
/* NCL-95 (original comment, superseded by above) */
/* The following rules appear to be superfluous for the purpose of parsing a conditional. That is, the parser
* always returns a block_statement_list object regardless of whether a single-statement or multiple statements,
* or even zero statements, are encountered. Commenting these out for now; eventually they should be removed.
*/
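/*
 * Illustrative NCL input (hypothetical) that the cond_block_list rule accepts
 * in both forms:
 *
 *   if (n .gt. 0) then print(n) end if    ; single-line body
 *
 *   if (n .gt. 0) then                    ; multi-line body
 *     print(n)
 *   end if
 */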
/**************************************************************************************************************
| IF expr then statement END IF {
NclSrcListNode *tmp = NULL;
if($4 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $4;
}
$$ = _NclMakeIfThen($2,tmp);
}
| IF expr then statement ELSE block_statement_list END IF {
NclSrcListNode *tmp = NULL;
if($4 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $4;
}
$$ = _NclMakeIfThenElse($2,tmp,$6);
}
| IF expr then statement ELSE statement END IF {
NclSrcListNode *tmp = NULL ,*tmp1 = NULL ;
if($4 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $4;
}
if($6 != NULL) {
tmp1 = _NclMakeNewListNode();
tmp1->next = NULL;
tmp1->node = $6;
}
$$ = _NclMakeIfThenElse($2,tmp,tmp1);
}
| IF expr then block_statement_list ELSE statement END IF {
NclSrcListNode *tmp = NULL ;
if($6 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $6;
}
$$ = _NclMakeIfThenElse($2,$4,tmp);
}
**************************************************************************************/
;
then :
| THEN
;
visblk : vset {
$$ = $1;
}
| vget {
$$ = $1;
}
;
v_parent: NOPARENT {
$$ = NULL;
}
| identifier {
$$ = $1;
}
;
vcreate : VSBLKCREATE expr OBJTYPE v_parent resource_list END VSBLKCREATE {
$$ = _NclMakeVis($2,$3,$4,$5,Ncl_VISBLKCREATE);
}
| VSBLKCREATE expr OBJTYPE v_parent resource END VSBLKCREATE {
NclSrcListNode * tmp = NULL;
if($5 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $5;
}
$$ = _NclMakeVis($2,$3,$4,tmp,Ncl_VISBLKCREATE);
}
| VSBLKCREATE error {
$$ = NULL;
}
;
vset : VSBLKSET expr resource END VSBLKSET {
NclSrcListNode * tmp = NULL;
if($3 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $3;
}
$$ = _NclMakeSGVis($2,tmp,Ncl_VISBLKSET);
}
| VSBLKSET expr resource_list END VSBLKSET {
$$ = _NclMakeSGVis($2,$3,Ncl_VISBLKSET);
}
| VSBLKSET error {
$$ = NULL;
}
;
vget : VSBLKGET expr get_resource END VSBLKGET {
NclSrcListNode * tmp = NULL;
if($3 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $3;
}
$$ = _NclMakeSGVis($2,tmp,Ncl_VISBLKGET);
}
| VSBLKGET expr get_resource_list END VSBLKGET {
$$ = _NclMakeSGVis($2,$3,Ncl_VISBLKGET);
}
| VSBLKGET error {
$$ = NULL;
}
;
get_resource_list : get_resource eoln {
if($1 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
} else {
$$ = NULL;
}
}
| get_resource_list get_resource eoln {
if($1 == NULL) {
if($2 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $2;
} else {
$$ = NULL;
}
} else if($2 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = $1;
$$->node = $2;
} else {
$$ = $1;
if((is_error)&&(cmd_line)) {
is_error = 0;
}
}
}
;
get_resource : {
$$ = NULL;
}
| STRING COLON identifier {
((NclGenericRefNode*)$3)->ref_type = Ncl_WRITEIT;
$$ = _NclMakeGetResource(_NclMakeStringExpr($1),$3);
NclFree($1);
}
| identifier COLON identifier {
((NclGenericRefNode*)$3)->ref_type = Ncl_WRITEIT;
$$ = _NclMakeGetResource($1,$3);
}
/*
| STRING COLON UNDEF {
$$ = _NclMakeGetResource($1,$3);
if(cmd_line)
if(!VerifyGetResExpr($3)) {
$$ = NULL;
}
}
| STRING COLON identifier {
$$ = _NclMakeGetResource($1,$3);
}
*/
| error {
$$ = NULL;
}
;
resource_list : resource eoln {
if($1 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
} else {
$$ = NULL;
}
}
| resource_list resource eoln {
if($1 == NULL) {
if($2 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $2;
} else {
$$ = NULL;
}
} else if($2 != NULL) {
$$ = _NclMakeNewListNode();
$$->next = $1;
$$->node = $2;
} else {
$$ = $1;
if((is_error)&&(cmd_line)) {
/*
_NclDeleteNewSymStack();
*/
is_error = 0;
}
}
}
| resource_list error eoln {
$$ = $1;
is_error -= 1;
/*
_NclDeleteNewSymStack();
*/
}
;
resource : {
$$ = NULL;
}
| STRING COLON expr {
$$ = _NclMakeResource(_NclMakeStringExpr($1),$3);
NclFree($1);
}
| identifier COLON expr {
$$ = _NclMakeResource($1,$3);
}
/*
| STRING COLON RKEY FVAR {
$$ = NULL;
}
| STRING COLON RKEY FVAR LP subscript_list RP {
$$ = NULL;
}
| STRING COLON RKEY COORDV {
$$ = NULL;
}
| STRING COLON RKEY COORDV LP subscript_list RP {
$$ = NULL;
}
| STRING COLON RKEY ATTNAME {
$$ = NULL;
}
| STRING COLON RKEY ATTNAME LP subscript_list RP {
$$ = NULL;
}
*/
;
do_stmnt : block_statement_list {
$$ = $1;
}
| statement {
NclSrcListNode * tmp = NULL;
if($1 != NULL ) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $1;
}
$$ = tmp;
}
;
do : DO identifier '=' expr ',' expr do_stmnt END DO {
$$ = _NclMakeDoFromTo($2,$4, $6, $7);
}
| DO identifier '=' expr ',' expr ',' expr do_stmnt END DO {
$$ = _NclMakeDoFromToStride($2,$4,$6,$8,$9);
}
| DO WHILE expr block_statement_list END DO {
$$ = _NclMakeWhile($3,$4);
}
| DO WHILE expr statement END DO {
NclSrcListNode *tmp = NULL ;
if($4 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $4;
}
$$ = _NclMakeWhile($3,tmp);
}
;
block : BGIN block_statement_list END { $$ = _NclMakeBlock($2); }
| BGIN statement END {
NclSrcListNode *tmp = NULL ;
if($2 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $2;
}
$$ = _NclMakeBlock(tmp);
}
;
fp_block : BGIN block_statement_list END { $$ = _NclMakeBlock($2); }
| BGIN statement END {
NclSrcListNode *tmp = NULL ;
if($2 != NULL) {
tmp = _NclMakeNewListNode();
tmp->next = NULL;
tmp->node = $2;
}
$$ = _NclMakeBlock(tmp);
}
;
procedure : IPROC opt_arg_list {
NclSrcListNode *step;
int count = 0;
step = $2;
while(step != NULL) {
count++;
step = step->next;
}
if(count != $1->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,$2,Ncl_INTRINSICPROCCALL);
}
}
| PIPROC opt_arg_list {
NclSrcListNode *step;
int count = 0;
step = $2;
while(step != NULL) {
count++;
step = step->next;
}
if(count != $1->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,$2,Ncl_INTRINSICPROCCALL);
}
}
| NPROC opt_arg_list {
NclSrcListNode *step;
int count = 0;
step = $2;
while(step != NULL) {
count++;
step = step->next;
}
if(count != $1->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,$2,Ncl_PROCCALL);
}
}
| PIPROC {
if($1->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,NULL,Ncl_INTRINSICPROCCALL);
}
}
| IPROC {
if($1->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,NULL,Ncl_INTRINSICPROCCALL);
}
}
| NPROC {
if($1->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall($1,NULL,Ncl_PROCCALL);
}
}
| DLIB COLON COLON anysym {
NclSymbol *s;
if($1->u.package != NULL) {
s = _NclLookUpInScope($1->u.package->scope,$4->name);
if(s == NULL) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s is not defined in package %s\n",
$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else if(s->type == IPROC) {
if(s->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall(s,NULL,Ncl_INTRINSICPROCCALL);
}
} else if(s->type == NPROC) {
if(s->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall(s,NULL,Ncl_PROCCALL);
}
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is not a procedure in package %s\n",$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
}
} else {
$$ = NULL;
}
}
| DLIB COLON COLON anysym opt_arg_list {
NclSrcListNode *step;
int count = 0;
NclSymbol *s;
if($1->u.package != NULL) {
s = _NclLookUpInScope($1->u.package->scope,$4->name);
if(s == NULL) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s is not defined in package %s\n",$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else if(s->type == IPROC){
step = $5;
while(step != NULL) {
count++;
step = step->next;
}
if(count != s->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,count);
PRINTLOCATION(NhlWARNING);
$$ = NULL;
} else {
$$ = _NclMakeProcCall(s,$5,Ncl_INTRINSICPROCCALL);
}
} else if(s->type == NPROC ) {
step = $5;
while(step != NULL) {
count++;
step = step->next;
}
if(count != s->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeProcCall(s,$5,Ncl_PROCCALL);
}
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is not a procedure in package %s\n",
$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
}
} else {
$$ = NULL;
}
}
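/*
 * Illustrative use of the package-qualified procedure call parsed above
 * (package, library, and procedure names are hypothetical):
 *
 *   ncl_external MYLIB "libmylib.so"
 *   MYLIB::myproc(a, b)
 */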
/*---------------------------------------------ERROR HANDLING BELOW THIS LINE-----------------------------------------------------*/
| IFUNC opt_arg_list { $$ = NULL;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is a function not a procedure; return value must be referenced",ncl_cur_func);
PRINTLOCATION(NhlFATAL);
}
| NFUNC opt_arg_list { $$ = NULL;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is a function not a procedure; return value must be referenced",ncl_cur_func);
PRINTLOCATION(NhlFATAL);
}
;
opt_arg_list : LP arg_list RP { $$ = $2; }
| LP RP { $$ = NULL; }
;
arg_list: expr {
			/* Check the type of the expression; if it is an identifier, stamp it with
			   the Ncl_PARAMIT tag so the translator can add extra code */
if(!is_error) {
if(((NclGenericNode*)$1)->kind == Ncl_IDNEXPR) {
((NclGenericRefNode*)((NclIdnExpr*)$1)->idn_ref_node)->ref_type =
Ncl_PARAMIT;
}
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
} else {
$$ = NULL;
}
}
| arg_list ',' expr {
NclSrcListNode * step;
/*
* ordering is important because arguments eventually must be pushed on stack in
* appropriate order
*/
if(!is_error) {
step = $1;
while(step->next != NULL) {
step = step->next;
}
				/* Check the type of the expression; if it is an identifier, stamp it with
				   the Ncl_PARAMIT tag so the translator can add extra code */
if(((NclGenericNode*)$3)->kind == Ncl_IDNEXPR) {
((NclGenericRefNode*)((NclIdnExpr*)$3)->idn_ref_node)->ref_type =
Ncl_PARAMIT;
}
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = $3;
}
$$ = $1;
}
;
func_identifier: KEYFUNC UNDEF { _NclNewScope(); $$ = $2; }
;
local_list: vname {
/* have to make sure that items in the local list are not added twice !! */
int lv = _NclGetCurrentScopeLevel();
if($1->level != lv) {
_NclAddSym($1->name,UNDEF);
}
}
| pfname {
int lv = _NclGetCurrentScopeLevel();
if($1->level != lv) {
_NclAddSym($1->name,UNDEF);
}
}
| local_list opt_eoln ',' opt_eoln vname {
int lv = _NclGetCurrentScopeLevel();
if($5->level != lv) {
_NclAddSym($5->name,UNDEF);
}
}
| local_list opt_eoln ',' opt_eoln pfname {
int lv = _NclGetCurrentScopeLevel();
if($5->level != lv) {
_NclAddSym($5->name,UNDEF);
}
}
;
function_def : func_identifier LP arg_dec_list RP opt_eoln {_NclChangeSymbolType($1,NFUNC);_NclAddProcFuncInfoToSym($1,$3); } fp_block
{
NclScopeRec *tmp;
if(is_error||((!cmd_line)&&block_syntax_error)) {
_NclDeleteNewSymStack();
tmp = _NclPopScope();
$$ = NULL;
is_error += 1;
$1->type = UNDEF;
}else {
tmp = _NclPopScope();
$$ = _NclMakeNFunctionDef($1,$3,$7,tmp);
}
}
| func_identifier LP arg_dec_list RP opt_eoln LOCAL opt_eoln local_list opt_eoln {_NclChangeSymbolType($1,NFUNC); _NclAddProcFuncInfoToSym($1,$3); } fp_block
{
NclScopeRec *tmp;
if(is_error||(!cmd_line&&block_syntax_error)) {
_NclDeleteNewSymStack();
tmp = _NclPopScope();
$$ = NULL;
is_error += 1;
$1->type = UNDEF;
}else {
tmp = _NclPopScope();
$$ = _NclMakeNFunctionDef($1,$3,$11,tmp);
}
}
/*---------------------------------------------ERROR HANDLING BELOW THIS LINE-----------------------------------------------------*/
| func_identifier error {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,"syntax error: Possibly expecting a 'begin' or 'local'");
/*
 * Need to call this before the new scope is popped so symbols can be found and freed
*/
_NclDeleteNewSymStack();
/*
* Need to call function to free scope
*/
(void)_NclPopScope();
$1->type = UNDEF;
}
| KEYFUNC error {
is_error += 1;
			NhlPError(NhlFATAL,NhlEUNKNOWN,"Function identifier is already defined");
$$ = NULL;
}
/*
| EXTERNAL func_identifier LP arg_dec_list RP opt_eoln local_arg_dec_list eoln error {
ERROR("syntax error: EXPECTING A 'begin'");
}
*/
;
arg_dec_list : { $$ = NULL; }
| opt_eoln the_list { $$ = $2; }
;
the_list: declaration opt_eoln {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
}
| declaration opt_eoln ',' opt_eoln the_list {
			/* once again, ordering is not important as long as it is consistent with
			   function and procedure ordering of argument lists */
$$ = _NclMakeNewListNode();
$$->next = $5;
$$->node = $1;
}
;
declaration : vname {
NclSymbol *s;
int lv = _NclGetCurrentScopeLevel();
if(($1->type != UNDEF)||($1->level != lv)) {
s = _NclAddSym($1->name,UNDEF);
} else {
s = $1;
}
$$ = _NclMakeLocalVarDec(s,NULL,NULL);
}
| vname COLON datatype {
NclSymbol *s;
int lv = _NclGetCurrentScopeLevel();
if(($1->type != UNDEF)||($1->level != lv)) {
s = _NclAddSym($1->name,UNDEF);
} else {
s = $1;
}
$$ = _NclMakeLocalVarDec(s,NULL,$3);
}
| vname dim_size_list {
NclSymbol *s;
int lv = _NclGetCurrentScopeLevel();
if(($1->type != UNDEF)||($1->level != lv)) {
s = _NclAddSym($1->name,UNDEF);
} else {
s = $1;
}
$$ = _NclMakeLocalVarDec(s,$2,NULL);
}
| vname dim_size_list COLON datatype {
NclSymbol *s;
int lv = _NclGetCurrentScopeLevel();
if(($1->type != UNDEF)||($1->level != lv)) {
s = _NclAddSym($1->name,UNDEF);
} else {
s = $1;
}
$$ = _NclMakeLocalVarDec(s,$2,$4);
}
| pfname {
/* Need to intercept defined names and add them to current scope */
NclSymbol *s;
s = _NclAddSym($1->name,UNDEF);
$$ = _NclMakeLocalVarDec(s,NULL,NULL);
}
| pfname COLON datatype {
NclSymbol *s;
s= _NclAddSym($1->name,UNDEF);
$$ = _NclMakeLocalVarDec(s,NULL,$3);
}
| pfname dim_size_list {
NclSymbol *s;
s = _NclAddSym($1->name,UNDEF);
$$ = _NclMakeLocalVarDec(s,$2,NULL);
}
| pfname dim_size_list COLON datatype {
NclSymbol *s;
s = _NclAddSym($1->name,UNDEF);
$$ = _NclMakeLocalVarDec(s,$2,$4);
}
;
pfname : IFUNC {
$$ = $1;
}
| PIPROC {
$$ = $1;
}
| NFUNC {
$$ = $1;
}
| NPROC {
$$ = $1;
}
;
datatype : FLOAT { $$ = $1; }
| LONG { $$ = $1; }
| ULONG { $$ = $1; }
| INT64 { $$ = $1; }
| UINT64 { $$ = $1; }
| INTEGER { $$ = $1; }
| UINT { $$ = $1; }
| SHORT { $$ = $1; }
| USHORT { $$ = $1; }
| DOUBLE { $$ = $1; }
| CHARACTER { $$ = $1; }
| BYTE { $$ = $1; }
| UBYTE { $$ = $1; }
| FILETYPE { $$ = $1; }
| GROUP { $$ = $1; }
| GROUPTYPE { $$ = $1; }
| COMPOUND { $$ = $1; }
| NUMERIC { $$ = $1; }
| ENUMERIC { $$ = $1; }
| SNUMERIC { $$ = $1; }
| GRAPHIC { $$ = $1; }
| STRNG { $$ = $1; }
| LOGICAL { $$ = $1; }
| LIST { $$ = $1; }
;
dim_size_list : LBK INT RBK {
/* Dimension size list must be in order */
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = _NclMakeDimSizeNode($2);
}
| LBK '*' RBK {
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = _NclMakeDimSizeNode(-1);
}
| dim_size_list LBK INT RBK {
NclSrcListNode *step;
step = $1;
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = _NclMakeDimSizeNode($3);
		$$ = $1;
	}
| dim_size_list LBK '*' RBK {
NclSrcListNode *step;
step = $1;
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = _NclMakeDimSizeNode(-1);
		$$ = $1;
	}
;
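/*
 * Example argument declarations matched by the declaration and dim_size_list
 * rules above (names are hypothetical):
 *
 *   procedure p(x[*]:float, m[2][3]:integer, s:string)
 *
 * A [*] dimension produces a DimSizeNode of -1, meaning any size is accepted.
 */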
proc_identifier: KEYPROC UNDEF { _NclNewScope(); $$ = $2; }
;
procedure_def : proc_identifier LP arg_dec_list RP opt_eoln LOCAL opt_eoln local_list opt_eoln {_NclChangeSymbolType($1,NPROC);_NclAddProcFuncInfoToSym($1,$3); } fp_block {
NclScopeRec *tmp;
if(is_error||((!cmd_line)&&block_syntax_error)) {
_NclDeleteNewSymStack();
tmp = _NclPopScope();
$$ = NULL;
is_error +=1;
$1->type = UNDEF;
} else {
tmp = _NclPopScope();
$$ = _NclMakeProcDef($1,$3,$11,tmp);
}
}
| proc_identifier LP arg_dec_list RP opt_eoln {_NclChangeSymbolType($1,NPROC);_NclAddProcFuncInfoToSym($1,$3); } fp_block {
NclScopeRec *tmp;
if(is_error||((!cmd_line)&&block_syntax_error)) {
_NclDeleteNewSymStack();
tmp = _NclPopScope();
$$ = NULL;
is_error +=1;
$1->type = UNDEF;
} else {
tmp = _NclPopScope();
$$ = _NclMakeProcDef($1,$3,$7,tmp);
}
}
| proc_identifier error {
is_error += 1;
/*
 * Need to call this before the new scope is popped so symbols can be found and freed
*/
_NclDeleteNewSymStack();
/*
* Need to call function to free scope
*/
(void)_NclPopScope();
$1->type = UNDEF;
$$ = NULL;
}
;
assignment : identifier '=' expr {
if($1 != NULL) {
((NclGenericRefNode*)$1)->ref_type = Ncl_WRITEIT;
$$ = _NclMakeAssignment($1,$3);
} else {
$$ = NULL;
}
}
| identifier error {
NhlPError(NhlFATAL,NhlEUNKNOWN,"syntax error: possibly an undefined procedure");
$$ = NULL;
}
| vname list_subscript LP RP error {
NhlPError(NhlFATAL,NhlEUNKNOWN,"syntax error: possibly an undefined procedure");
$$ = NULL;
}
| identifier '=' error {
$$ = NULL;
	}
	;
reassignment : identifier REASSIGN expr {
if($1 != NULL) {
((NclGenericRefNode*)$1)->ref_type = Ncl_REWRITEIT;
$$ = _NclMakeReassignment($1,$3);
} else {
$$ = NULL;
}
}
| identifier REASSIGN error {
$$ = NULL;
	}
	;
filevarselector : FVAR {
$$ = _NclMakeIdnExpr(_NclMakeStringExpr($1));
}
| FSTRING primary EFSTRING {
_NclValOnly($2);
$$ = $2;
	}
	;
filegroupselector : GVAR {
$$ = _NclMakeIdnExpr(_NclMakeStringExpr($1));
}
| GSTRING primary EFSTRING {
_NclValOnly($2);
$$ = $2;
	}
	;
coordvarselector : COORDV{
$$ = _NclMakeIdnExpr(_NclMakeStringExpr($1));
}
| CSTRING primary EFSTRING {
_NclValOnly($2);
$$ = $2;
	}
	;
attributeselector : ATTNAME{
$$ = _NclMakeIdnExpr(_NclMakeStringExpr($1));
}
| ASTRING primary EFSTRING {
_NclValOnly($2);
$$ = $2;
	}
	;
identifier : vname list_subscript {
NclSymbol *tmp;
NclSymbol *tmp0;
void *node;
if($2 == NULL) {
$$ = _NclMakeVarRef($1,NULL);
} else {
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
/*
node = _NclMakeVarRef(tmp0,NULL);
_NclValOnly(node);
*/
$$ = _NclMakeListRef(_NclMakeVarRef(tmp0,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector {
if($2 == NULL) {
$$ = _NclMakeFileVarRef($1,$3,NULL,Ncl_FILEVAR);
} else {
$$ = _NclMakeFileVarListRef($1,$2,$3,NULL);
}
}
| vname list_subscript filegroupselector {
if($2 == NULL) {
$$ = _NclMakeFileGroupRef($1,$3,Ncl_FILEGROUP);
} else {
$$ = _NclMakeFileGroupListRef($1,$2,$3,NULL);
}
}
| vname list_subscript filevarselector MARKER {
NclSymbol *tmp;
NclSymbol *tmp0;
if($2 == NULL) {
$$ = _NclMakeFileVarRef($1,$3,NULL,Ncl_FILEVAR);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarRef(tmp0,$3,NULL,Ncl_FILEVAR),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector LP subscript_list RP MARKER {
NclSymbol *tmp;
NclSymbol *tmp0;
if($2 == NULL) {
$$ = _NclMakeFileVarRef($1,$3,$5,Ncl_FILEVAR);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarRef(tmp0,$3,$5,Ncl_FILEVAR),tmp,$1,$2,tmp0);
}
}
/*
| vname list_subscript filevarselector LP subscript_list RP {
NclSymbol *tmp;
NclSymbol *tmp0;
if($2 == NULL) {
$$ = _NclMakeFileVarRef($1,$3,$5,Ncl_FILEVAR);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarRef(tmp0,$3,$5,Ncl_FILEVAR),tmp,$1,$2,tmp0);
}
}
*/
| vname list_subscript filevarselector LP subscript_list RP {
if($2 == NULL) {
$$ = _NclMakeFileVarRef($1,$3,$5,Ncl_FILEVAR);
} else {
$$ = _NclMakeFileVarListRef($1,$2,$3,(NclSrcListNode *)$5);
}
}
| vname list_subscript filevarselector DIM_MARKER primary {
NclSymbol *tmp0;
NclSymbol *tmp;
_NclValOnly($5);
if($2 == NULL) {
$$ = _NclMakeFileVarDimRef($1,$3,$5);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarDimRef(tmp0,$3,$5),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector attributeselector {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeFileVarAttRef($1,$3,$4,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarAttRef(tmp0,$3,$4,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector attributeselector LP subscript_list RP {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeFileVarAttRef($1,$3,$4,$6);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarAttRef(tmp0,$3,$4,$6),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector coordvarselector {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeFileVarCoordRef($1,$3,$4,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarCoordRef(tmp0,$3,$4,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector coordvarselector attributeselector{
NclSymbol *tmp;
NclSymbol *tmp0;
if($2 == NULL) {
$$ = _NclMakeFileVarCoordAttRef($1,$3,$4,$5,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarCoordAttRef(tmp0,$3,$4,$5,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector coordvarselector attributeselector LP subscript_list RP {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeFileVarCoordAttRef($1,$3,$4,$5,$7);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarCoordAttRef(tmp0,$3,$4,$5,$7),tmp,$1,$2,tmp0);
}
}
| vname list_subscript filevarselector coordvarselector LP subscript_list RP{
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeFileVarCoordRef($1,$3,$4,$6);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeFileVarCoordRef(tmp0,$3,$4,$6),tmp,$1,$2,tmp0);
}
}
| vname list_subscript DIM_MARKER primary {
NclSymbol *tmp0;
NclSymbol *tmp;
_NclValOnly($4);
if($2 == NULL) {
$$ = _NclMakeVarDimRef($1,$4);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarDimRef(tmp0,$4),tmp,$1,$2,tmp0);
}
}
| vname list_subscript attributeselector {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarAttRef($1,$3,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarAttRef(tmp0,$3,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript attributeselector LP subscript_list RP {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarAttRef($1,$3,$5);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarAttRef(tmp0,$3,$5),tmp,$1,$2,tmp0);
}
}
| vname MARKER {
$$ = _NclMakeVarRef($1,NULL);
}
| vname list_subscript LP subscript_list RP MARKER {
NclSymbol *tmp;
NclSymbol *tmp0;
if($2 == NULL) {
$$ = _NclMakeVarRef($1,$4);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarRef(tmp0,$4),tmp,$1,$2,tmp0);
}
}
| vname list_subscript LP subscript_list RP {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarRef($1,$4);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarRef(tmp0,$4),tmp,$1,$2,tmp0);
}
}
| vname list_subscript coordvarselector{
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarCoordRef($1,$3,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarCoordRef(tmp0,$3,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript coordvarselector LP subscript_list RP{
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarCoordRef($1,$3,$5);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarCoordRef(tmp0,$3,$5),tmp,$1,$2,tmp0);
}
}
| vname list_subscript coordvarselector attributeselector {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarCoordAttRef($1,$3,$4,NULL);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarCoordAttRef(tmp0,$3,$4,NULL),tmp,$1,$2,tmp0);
}
}
| vname list_subscript coordvarselector attributeselector LP subscript_list RP {
NclSymbol *tmp0;
NclSymbol *tmp;
if($2 == NULL) {
$$ = _NclMakeVarCoordAttRef($1,$3,$4,$6);
} else {
tmp = _NclAddUniqueSym("tmp_list_",UNDEF);
tmp0 = _NclAddUniqueSym("tmp_list_var_",UNDEF);
$$ = _NclMakeListRef(_NclMakeVarCoordAttRef(tmp0,$3,$4,$6),tmp,$1,$2,tmp0);
}
}
;
vname : OBJVAR {
$$ = $1;
}
| OBJTYPE {
$$ = $1;
}
| VAR {
$$ = $1;
}
| DFILE {
$$ = $1;
}
/*
| LIST {
$$ = $1;
}
*/
| UNDEF {
$$ = $1;
}
;
list_subscript : {
$$ = NULL;
}
| LBK subexpr RBK {
$$ = _NclMakeIntSubscript($2,NULL);
}
;
subscript_list :
named_subscript_list {
$$ = $1;
}
| normal_subscript_list {
$$ = $1;
	}
	;
named_subscript_list: subscript2 {
/* ordering of subscripts must be preserved */
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
}
| LBC subscript3 RBC {
/* ordering of subscripts must be preserved */
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $2;
}
| named_subscript_list ',' subscript2 {
NclSrcListNode *step;
step = $1;
if(!is_error) {
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = $3;
}
}
| named_subscript_list ',' LBC subscript3 RBC {
NclSrcListNode *step;
step = $1;
if(!is_error) {
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = $4;
}
}
| named_subscript_list ',' error {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"Error in subscript, named subscripting is being used, make sure each subscript has a name");
is_error += 1;
$$ = NULL;
}
;
normal_subscript_list: subscript0 {
/* ordering of subscripts must be preserved */
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $1;
}
| LBC subscript1 RBC {
/* ordering of subscripts must be preserved */
$$ = _NclMakeNewListNode();
$$->next = NULL;
$$->node = $2;
}
| normal_subscript_list ',' subscript0 {
NclSrcListNode *step;
step = $1;
if(!is_error){
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = $3;
}
}
| normal_subscript_list ',' LBC subscript1 RBC {
NclSrcListNode *step;
step = $1;
if(!is_error){
while(step->next != NULL)
step = step->next;
step->next = _NclMakeNewListNode();
step->next->next = NULL;
step->next->node = $4;
}
}
| normal_subscript_list ',' error {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"Error in subscript, normal subscripting is being used, make sure named subscripting has not been used");
is_error += 1;
$$ = NULL;
}
;
subscript0: subexpr {
$$ = _NclMakeIntSubscript($1,NULL);
}
;
subscript2: DIM subexpr {
$$ = _NclMakeIntSubscript($2,_NclMakeStringExpr($1));
}
| EFSTRING primary EFSTRING "|" subexpr {
_NclValOnly($2);
$$ = _NclMakeIntSubscript($5,$2);
}
;
subscript1: subexpr {
$$ = _NclMakeCoordSubscript($1,NULL);
}
;
subscript3: DIM subexpr {
$$ = _NclMakeCoordSubscript($2,_NclMakeStringExpr($1));
}
| EFSTRING primary EFSTRING "|" subexpr {
_NclValOnly($2);
$$ = _NclMakeCoordSubscript($5,$2);
}
;
subexpr: expr {
_NclValOnly($1);
$$ = _NclMakeSingleIndex($1);
}
| COLON {
$$ = _NclMakeRangeIndex(NULL,NULL,NULL);
}
| expr COLON expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeRangeIndex($1,$3,NULL);
}
| COLON expr {
_NclValOnly($2);
$$ = _NclMakeRangeIndex(NULL,$2,NULL);
}
| expr COLON {
_NclValOnly($1);
$$ = _NclMakeRangeIndex($1,NULL,NULL);
}
| expr COLON expr COLON {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeRangeIndex($1,$3,NULL);
}
| COLON expr COLON {
_NclValOnly($2);
$$ = _NclMakeRangeIndex(NULL,$2,NULL);
}
| expr COLON COLON {
_NclValOnly($1);
$$ = _NclMakeRangeIndex($1,NULL,NULL);
}
| expr COLON expr COLON expr {
_NclValOnly($1);
_NclValOnly($3);
_NclValOnly($5);
$$ = _NclMakeRangeIndex($1,$3,$5);
}
| expr COLON COLON expr {
_NclValOnly($1);
_NclValOnly($4);
$$ = _NclMakeRangeIndex($1,NULL,$4);
}
| COLON expr COLON expr {
_NclValOnly($2);
_NclValOnly($4);
$$ = _NclMakeRangeIndex(NULL,$2,$4);
}
| COLON COLON {
$$ = _NclMakeRangeIndex(NULL,NULL,NULL);
}
| COLON COLON expr {
_NclValOnly($3);
$$ = _NclMakeRangeIndex(NULL,NULL,$3);
}
/*
| '*' {
$$ = _NclMakeWildCardIndex();
}
*/
;
expr : primary {
$$ = $1;
}
| '-' expr %prec UNOP {
_NclValOnly($2);
$$ = _NclMakeUnaryExpr($2,Ncl_NEGEXPR);
}
| NOT expr %prec UNOP {
_NclValOnly($2);
$$ = _NclMakeUnaryExpr($2,Ncl_NOTEXPR);
}
| expr '%' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_MODEXPR);
}
| expr OR expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_OREXPR);
}
| expr AND expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_ANDEXPR);
}
| expr XOR expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_XOREXPR);
}
| expr '<' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_LTSELECTEXPR);
}
| expr '>' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_GTSELECTEXPR);
}
| expr '+' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_PLUSEXPR);
}
| expr '-' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_MINUSEXPR);
}
| expr '*' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_MULEXPR);
}
| expr '#' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_MATMULEXPR);
}
| expr '/' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_DIVEXPR);
}
| expr '^' expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_EXPEXPR);
}
| expr LE expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_LEEXPR);
}
| expr GE expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_GEEXPR);
}
| expr GT expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_GTEXPR);
}
| expr LT expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_LTEXPR);
}
| expr EQ expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_EQEXPR);
}
| expr NE expr {
_NclValOnly($1);
_NclValOnly($3);
$$ = _NclMakeExpr($1,$3,Ncl_NEEXPR);
}
;
anysym : FLOAT {
$$ = $1;
}
| INTEGER {
$$ = $1;
}
| UINT {
$$ = $1;
}
| LONG {
$$ = $1;
}
| ULONG {
$$ = $1;
}
| INT64 {
$$ = $1;
}
| UINT64 {
$$ = $1;
}
| DOUBLE {
$$ = $1;
}
| BYTE {
$$ = $1;
}
| UBYTE {
$$ = $1;
}
| CHARACTER {
$$ = $1;
}
| GRAPHIC {
$$ = $1;
}
| STRNG {
$$ = $1;
}
| GROUP {
$$ = $1;
}
| COMPOUND {
$$ = $1;
}
| NUMERIC {
$$ = $1;
}
| ENUMERIC {
$$ = $1;
}
| SNUMERIC {
$$ = $1;
}
| FILETYPE {
$$ = $1;
}
| GROUPTYPE {
$$ = $1;
}
| SHORT {
$$ = $1;
}
| USHORT {
$$ = $1;
}
| LOGICAL {
$$ = $1;
}
| UNDEF {
$$ = $1;
}
| VAR {
$$ = $1;
}
| WHILE {
$$ = $1;
}
| DO {
$$ = $1;
}
| QUIT {
$$ = $1;
}
| NPROC {
$$ = $1;
}
| PIPROC {
$$ = $1;
}
| IPROC {
$$ = $1;
}
| UNDEFFILEVAR {
$$ = $1;
}
| UNDEFFILEGROUP {
$$ = $1;
}
| BREAK {
$$ = $1;
}
| NOPARENT {
$$ = $1;
}
| BGIN {
$$ = $1;
}
| END {
$$ = $1;
}
| NFUNC {
$$ = $1;
}
| IFUNC {
$$ = $1;
}
| FDIM {
$$ = $1;
}
| IF {
$$ = $1;
}
| THEN {
$$ = $1;
}
| VBLKNAME {
$$ = $1;
}
| CONTINUE {
$$ = $1;
}
| DFILE {
$$ = $1;
}
| KEYFUNC {
$$ = $1;
}
| KEYPROC {
$$ = $1;
}
| ELSE {
$$ = $1;
}
| EXTERNAL {
$$ = $1;
}
| NCLEXTERNAL {
$$ = $1;
}
| RETURN {
$$ = $1;
}
| VSBLKGET {
$$ = $1;
}
| NEW {
$$ = $1;
}
| OBJVAR {
$$ = $1;
}
| OBJTYPE {
$$ = $1;
}
| RECORD {
$$ = $1;
}
| VSBLKCREATE {
$$ = $1;
}
| VSBLKSET {
$$ = $1;
}
| LOCAL {
$$ = $1;
}
| STOP {
$$ = $1;
}
| NCLTRUE {
$$ = $1;
}
| NCLFALSE {
$$ = $1;
}
| NCLMISSING {
$$ = $1;
}
| DLIB {
$$ = $1;
}
;
primary : REAL {
/*
* Note all of the structures created below the primary rule are special! They
* contain the ref_type field which is used to determine if the item
* is a parameter to a function or a procedure. The LP expr RP is an
* exception
*/
$$ = _NclMakeIdnExpr(_NclMakeRealExpr($1,yytext));
}
| INT {
$$ = _NclMakeIdnExpr(_NclMakeIntExpr($1,yytext));
}
| NCLTRUE {
$$ = _NclMakeIdnExpr(_NclMakeLogicalExpr(1,yytext));
}
| NCLFALSE {
$$ = _NclMakeIdnExpr(_NclMakeLogicalExpr(0,yytext));
}
| NCLMISSING {
$$ = _NclMakeIdnExpr(_NclMakeLogicalExpr(-1,yytext));
}
| STRING {
$$ = _NclMakeIdnExpr(_NclMakeStringExpr($1));
NclFree($1);
}
| function {
$$ = _NclMakeIdnExpr($1);
}
| identifier {
$$ = _NclMakeIdnExpr($1);
}
| array {
$$ = _NclMakeIdnExpr($1);
}
| listvar {
$$ = _NclMakeIdnExpr($1);
}
| vcreate {
$$ = $1;
}
| LP expr RP {
$$ = $2;
}
| NEW LP expr ',' datatype ',' expr RP {
_NclValOnly($3);
_NclValOnly($7);
$$ = _NclMakeNewOp($3,$5,$7);
}
| NEW LP expr ',' datatype RP {
_NclValOnly($3);
$$ = _NclMakeNewOp($3,$5,NULL);
}
| NEW LP expr ',' expr ',' expr RP {
_NclValOnly($3);
_NclValOnly($5);
_NclValOnly($7);
$$ = _NclMakeExprNewOp($3,$5,$7);
}
| NEW LP expr ',' expr RP {
_NclValOnly($3);
_NclValOnly($5);
$$ = _NclMakeExprNewOp($3,$5,NULL);
}
| LBK expr RBK {
$$ = $2;
}
| NEW LBK expr ',' datatype RBK {
_NclValOnly($3);
$$ = _NclMakeNewOp($3,$5,NULL);
}
| NCLNULL {
$$ = _NclMakeNULLNode();
}
;
function: IFUNC opt_arg_list {
NclSrcListNode *step;
int count = 0;
step = $2;
while(step != NULL) {
count++;
step = step->next;
}
if(count != $1->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall($1,$2,Ncl_INTRINSICFUNCCALL);
}
}
| NFUNC opt_arg_list {
NclSrcListNode *step;
int count = 0;
step = $2;
while(step != NULL) {
count++;
step = step->next;
}
if(count != $1->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall($1,$2,Ncl_FUNCCALL);
}
}
| IFUNC {
if($1->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall($1,NULL,Ncl_INTRINSICFUNCCALL);
}
}
| NFUNC {
if($1->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
$1->name,$1->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall($1,NULL,Ncl_FUNCCALL);
}
}
| DLIB COLON COLON anysym {
NclSymbol *s;
if($1->u.package != NULL) {
s = _NclLookUpInScope($1->u.package->scope,$4->name);
if(s == NULL) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s is not defined in package %s\n",
$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else if(s->type == IFUNC){
if(s->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall(s,NULL,Ncl_INTRINSICFUNCCALL);
}
} else if(s->type == NFUNC) {
if(s->u.procfunc->nargs != 0) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,0);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall(s,NULL,Ncl_FUNCCALL);
}
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is not a function in package %s\n",
$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
}
} else {
$$ = NULL;
}
}
| DLIB COLON COLON anysym opt_arg_list {
NclSrcListNode *step;
int count = 0;
NclSymbol *s;
if($1->u.package != NULL) {
s = _NclLookUpInScope($1->u.package->scope,$4->name);
if(s == NULL) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: procedure %s is not defined in package %s\n",
$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else if(s->type == IFUNC){
step = $5;
while(step != NULL) {
count++;
step = step->next;
}
if(count != s->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall(s,$5,Ncl_INTRINSICFUNCCALL);
}
} else if(s->type == NFUNC) {
step = $5;
while(step != NULL) {
count++;
step = step->next;
}
if(count != s->u.procfunc->nargs) {
is_error += 1;
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: function %s expects %d arguments, got %d",
s->name,s->u.procfunc->nargs,count);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
} else {
$$ = _NclMakeFuncCall(s,$5,Ncl_FUNCCALL);
}
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,
"syntax error: %s is not a function in package %s\n",$4->name,$1->name);
PRINTLOCATION(NhlFATAL);
$$ = NULL;
}
} else {
$$ = NULL;
}
}
;
array : LPSLSH expr_list SLSHRP {
$$ = _NclMakeArrayNode($2);
}
;
listvar : LBKSLSH list_expr_list SLSHRBK {
$$ = _NclMakeListVarNode($2);
}
;
list_expr_list : expr {
$$ = _NclMakeRowList();
$$->list = _NclMakeNewListNode();
$$->list->next = NULL;
$$->list->node = $1;
$$->currentitem= NULL;
$$->nelem = 1;
}
| list_expr_list ',' expr {
/* pushed on backwards so they can be popped of in correct order*/
if($1 == NULL) {
$$ = _NclMakeRowList();
$$->nelem = 1;
$$->list = _NclMakeNewListNode();
$$->list->next = NULL;
$$->list->node = $1;
$$->currentitem= NULL;
$$->nelem = 1;
} else {
NclSrcListNode *tmp;
tmp = _NclMakeNewListNode();
tmp->next = $1->list;
tmp->node = $3;
$1->list = tmp;
$1->nelem++;
$$ = $1;
}
}
;
expr_list : expr {
_NclValOnly($1);
$$ = _NclMakeRowList();
$$->list = _NclMakeNewListNode();
$$->list->next = NULL;
$$->list->node = $1;
$$->currentitem= NULL;
$$->nelem = 1;
}
| expr_list ',' expr {
/* pushed on backwards so they can be popped of in correct order*/
_NclValOnly($3);
if($1 == NULL) {
$$ = _NclMakeRowList();
$$->nelem = 1;
$$->list = _NclMakeNewListNode();
$$->list->next = NULL;
$$->list->node = $1;
$$->currentitem= NULL;
$$->nelem = 1;
} else {
NclSrcListNode *tmp;
tmp = _NclMakeNewListNode();
tmp->next = $1->list;
tmp->node = $3;
$1->list = tmp;
$1->nelem++;
$$ = $1;
}
}
;
%%
int yyerror
#if __STDC__
(char *s)
#else
(s)
char *s;
#endif
{
extern int is_error;
int i,len;
char error_buffer[1024];
is_error += 1;
if(is_error < NCL_MAX_ERROR) {
if(yytext[0] == '\n' || (yytext[0] == '\r' && yytext[1] == '\n')) {
sprintf(error_buffer,"%s\n",cur_line_text);
len = strlen(error_buffer);
for(i=0; i<last_line_length-1;i++) sprintf(&(error_buffer[len+i]),"-");
sprintf(&(error_buffer[len+last_line_length-1]),"^");
if(loading > 0) {
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d in file %s before or near \\n \n%s\n",s,cur_line_number,cur_load_file,error_buffer);
} else if(cmd_line){
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d before or near \\n \n%s\n",s,cur_line_number,error_buffer);
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d before or near \\n \n%s\n",s,cur_line_number,error_buffer);
}
} else {
sprintf((char*)&(error_buffer[0]),"%s\n",cur_line_text);
len = strlen(error_buffer);
for(i=0; i<cur_line_length-1;i++) sprintf(&(error_buffer[len+i]),"-");
sprintf(&(error_buffer[len+cur_line_length-1]),"^");
if(loading > 0) {
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d in file %s before or near %s \n%s\n",s,cur_line_number,cur_load_file,yytext,error_buffer);
} else if(cmd_line){
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d before or near %s \n%s\n",s,cur_line_number,yytext,error_buffer);
} else {
NhlPError(NhlFATAL,NhlEUNKNOWN,"%s: line %d before or near %s \n%s\n",s,cur_line_number,yytext,error_buffer);
}
}
} else if((is_error == NCL_MAX_ERROR)&&(cmd_line != 2)) {
NhlPError(NhlFATAL,NhlEUNKNOWN,"Maximum number of errors exceeded, terminating");
_NclExit(0);
} else {
/*
* GUI STUFF
*/
}
return(0);
}
--TEST--
SPL SplDoublyLinkedList offsetExists displays warning and returns null on no parameters
--CREDITS--
PHPNW TestFest 2009 - Ben Longden
--FILE--
<?php
$list = new SplDoublyLinkedList();
$a = $list->offsetExists();
if(is_null($a)) {
echo 'PASS';
}
?>
--EXPECTF--
Warning: SplDoublyLinkedList::offsetExists() expects exactly 1 parameter, 0 given in %s on line %d
PASS
// Copyright 2017 the V8 project authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// Flags: --validate-asm --no-stress-opt --no-stress-validate-asm --no-suppress-asm-messages
function Module() {
"use asm"
function f() {
g();
}
return { f:f };
}
Module();
/**
* Created by chriscai on 2015/1/15.
*/
var BusinessService = require('../../service/BusinessService'),
_ = require('underscore'),
StatisticsService = require('../../service/StatisticsService');
var log4js = require('log4js'),
logger = log4js.getLogger();
// Matches calendar dates of the form 2019-08-02 (validates month/day ranges and leap years)
const DATE_REG = /^(?:(?!0000)[0-9]{4}-(?:(?:0[1-9]|1[0-2])-(?:0[1-9]|1[0-9]|2[0-8])|(?:0[13-9]|1[0-2])-(?:29|30)|(?:0[13578]|1[02])-31)|(?:[0-9]{2}(?:0[48]|[2468][048]|[13579][26])|(?:0[48]|[2468][048]|[13579][26])00)-02-29)$/;
var StatisticsAction = {
index: function (param, req, res) {
var params = req.query,
user = req.session.user;
var businessService = new BusinessService();
businessService.findBusinessByUser(user.loginName, function (err, item) {
res.render(param.tpl, {
layout: false,
user: user,
index: 'statistics',
statisticsTitle: param.statisticsTitle,
items: item
});
});
},
findBusinessByUser: function (param, req, res) {
var params = req.query,
user = req.session.user;
var businessService = new BusinessService();
businessService.findBusinessByUser(user.loginName, function (err, item) {
if (err) {
res.json({ code: 500, data: err });
} else {
res.json({ code: 200, data: item });
}
});
},
getRate: function (param, req, res) {
var db = global.models.db,
date = param.date.replace(/\D/g, '');
const ids = param.badjsid;
db.driver.execQuery('select * from b_quality where date=' + date + ' and badjsid in (' + ids + ');',
(err, data = []) => {
res.json({
data: data,
retcode: 0
});
});
},
projectTotal: function (param, req, res) {
var params = req.query,
user = req.session.user;
var businessService = new BusinessService();
businessService.findBusiness(function (err, items) {
res.render(param.tpl, {
layout: false,
user: user,
index: 'projectTotal',
statisticsTitle: param.statisticsTitle,
items: items
});
});
},
queryByChart: function (param, req, res) {
var statisticsService = new StatisticsService();
if (!param.projectIds || !param.timeScope) {
res.json({
ret: 0,
msg: 'invalid query'
});
return;
}
statisticsService.queryByChart({
userName: req.session.user.loginName,
projectId: param.projectIds.split(','),
timeScope: param.timeScope - 0
}, function (err, data) {
if (err) {
res.json({
ret: -1,
msg: "error"
});
return;
}
res.json(data);
});
},
queryByChartForAdmin: function (param, req, res) {
var statisticsService = new StatisticsService();
if (!param.projectId || isNaN(param.projectId) || !param.timeScope || req.session.user.role != 1) {
res.json({
ret: 0,
msg: 'success',
data: {}
});
return;
}
statisticsService.queryByChart({
projectId: param.projectId - 0,
timeScope: param.timeScope - 0
}, function (err, data) {
if (err) {
res.json({
ret: -1,
msg: "error"
});
return;
}
res.json(data);
return;
});
},
queryById: function (param, req, res) {
var statisticsService = new StatisticsService();
if (!req.query.projectId || isNaN(req.query.projectId) || !req.query.startDate) {
res.json({
ret: 0,
                msg: 'query invalid!',
data: {}
});
return;
}
var startDate = new Date(param.startDate + ' 00:00:00');
statisticsService.queryById({
userName: req.session.user.loginName,
projectId: req.query.projectId - 0,
startDate
}, function (err, data) {
if (err) {
res.json({
ret: -1,
msg: 'error',
data: {}
});
} else {
res.json({
ret: 0,
msg: 'success',
data: data
});
}
});
},
getTopError: function (param, req, res) {
        logger.info('Query error list: ' + JSON.stringify(param));
const { startDate, userName } = param;
if (!DATE_REG.test(startDate) || !/^[A-Za-z_]{3,20}$/.test(userName)) {
return res.json({
ret: -1,
msg: 'params error'
});
}
const statisticsService = new StatisticsService();
const { user } = req.session;
logger.info(user);
const isAdmin = user.role === 1;
const { loginName } = user;
let searchName = loginName;
if (isAdmin) {
searchName = userName || loginName
}
statisticsService.getTopError({
searchName,
startDate
}, function (data) {
res.json({
ret: 0,
data
});
});
},
getpvbyid: function (param, req, res) {
var statisticsService = new StatisticsService();
statisticsService.getPvById({
badjsid: req.query.badjsid,
date: req.query.date
}, function (err, data) {
if (err) {
res.json({
retcode: 2,
msg: err
});
return;
}
res.json(data);
});
}
};
module.exports = StatisticsAction;
Due to anti-virus software flagging the shell scripts stored inside
this folder as malicious, we needed to circumvent the detection. From a
plain sqlmap user's perspective nothing has to be done prior to their
usage by sqlmap, but if you want access to their original source code,
use the decrypt functionality of the ../extra/cloak/cloak.py utility.
To prepare the original scripts to the cloaked form use this command:
find backdoor.* stager.* -type f -exec python ../extra/cloak/cloak.py -i '{}' \;
To get back them into the original form use this:
find backdoor.*_ stager.*_ -type f -exec python ../extra/cloak/cloak.py -d -i '{}' \;
--- ./bfd/cpu-mips.c.orig 2011-07-24 14:20:05.000000000 +0000
+++ ./bfd/cpu-mips.c 2012-01-21 13:31:35.000000000 +0000
@@ -89,6 +89,7 @@
I_mipsisa64,
I_mipsisa64r2,
I_sb1,
+ I_allegrex,
I_loongson_2e,
I_loongson_2f,
I_loongson_3a,
@@ -130,6 +131,7 @@
N (64, 64, bfd_mach_mipsisa64, "mips:isa64", FALSE, NN(I_mipsisa64)),
N (64, 64, bfd_mach_mipsisa64r2,"mips:isa64r2", FALSE, NN(I_mipsisa64r2)),
N (64, 64, bfd_mach_mips_sb1, "mips:sb1", FALSE, NN(I_sb1)),
+ N (32, 32, bfd_mach_mips_allegrex, "mips:allegrex", FALSE, NN(I_allegrex)),
N (64, 64, bfd_mach_mips_loongson_2e, "mips:loongson_2e", FALSE, NN(I_loongson_2e)),
N (64, 64, bfd_mach_mips_loongson_2f, "mips:loongson_2f", FALSE, NN(I_loongson_2f)),
N (64, 64, bfd_mach_mips_loongson_3a, "mips:loongson_3a", FALSE, NN(I_loongson_3a)),
<?php
class Clean_SqlReports_Block_Adminhtml_Page_Menu extends Mage_Adminhtml_Block_Page_Menu
{
/**
* Retrieve Adminhtml Menu array
*
* @return array
*/
public function getMenuArray()
{
$menuArray = $this->_buildMenuArray();
if (isset($menuArray['report']) && isset($menuArray['report']['children']) && isset($menuArray['report']['children']['cleansql'])) {
$this->_appendCleanSqlReports($menuArray['report']['children']['cleansql']);
}
return $menuArray;
}
/**
* @param array $menuArray
*/
protected function _appendCleanSqlReports(array &$menuArray)
{
if (!isset($menuArray['children'])) {
$menuArray['children'] = array();
}
$maxReports = (int) Mage::getStoreConfig('reports/cleansql/max_reports_in_menu');
$helper = Mage::helper('cleansql');
$reportCollection = Mage::getModel('cleansql/report')->getCollection()->setOrder('title', 'ASC');
$reportCount = $reportCollection->count();
$i = 1;
foreach ($reportCollection as $report) {
if (! $helper->getAllowViewReport($report->getId())) {
continue;
}
/** @var $report Clean_SqlReports_Model_Report */
$titleNodeName = $this->_getXmlTitle($report);
$route = $helper->getPrimaryReportRoute($report);
$menuArray['children'][$titleNodeName] = array(
'label' => $report->getTitle(),
'sort_order' => 0,
'url' => $this->getUrl('adminhtml/adminhtml_customreport/' . $route, array('report_id' => $report->getId())),
'active' => true,
'level' => 2,
'last' => $reportCount === $i || $i === $maxReports,
);
if ($i === $maxReports) {
break;
}
$i++;
}
}
/**
* @param Clean_SqlReports_Model_Report $report
*
* @return string
*/
protected function _getXmlTitle(Clean_SqlReports_Model_Report $report)
{
return 'report_' . $report->getId();
}
}
class Particle {
float x, y; // The x- and y-coordinates
float vx, vy; // The x- and y-velocities
float radius; // Particle radius
float gravity = 0.1;
Particle(int xpos, int ypos, float velx, float vely, float r) {
x = xpos;
y = ypos;
vx = velx;
vy = vely;
radius = r;
}
void update() {
vy += gravity;
y += vy;
x += vx;
}
void display() {
ellipse(x, y, radius*2, radius*2);
}
}
{
"description": "LimitRangeList is a list of LimitRange items.",
"required": [
"items"
],
"properties": {
"apiVersion": {
"description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#resources",
"type": [
"string",
"null"
]
},
"items": {
"description": "Items is a list of LimitRange objects. More info: https://kubernetes.io/docs/concepts/configuration/manage-compute-resources-container/",
"type": [
"array",
"null"
],
"items": {
"description": "LimitRange sets resource usage limits for each kind of resource in a Namespace.",
"properties": {
"apiVersion": {
"description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#resources",
"type": [
"string",
"null"
]
},
"kind": {
"description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#types-kinds",
"type": [
"string",
"null"
],
"enum": [
"LimitRange"
]
},
"metadata": {
"description": "ObjectMeta is metadata that all persisted resources must have, which includes all objects users must create.",
"properties": {
"annotations": {
"description": "Annotations is an unstructured key value map stored with a resource that may be set by external tools to store and retrieve arbitrary metadata. They are not queryable and should be preserved when modifying objects. More info: http://kubernetes.io/docs/user-guide/annotations",
"type": "object",
"additionalProperties": {
"type": [
"string",
"null"
]
}
},
"clusterName": {
"description": "The name of the cluster which the object belongs to. This is used to distinguish resources with same name and namespace in different clusters. This field is not set anywhere right now and apiserver is going to ignore it if set in create or update request.",
"type": [
"string",
"null"
]
},
"creationTimestamp": {
"description": "Time is a wrapper around time.Time which supports correct marshaling to YAML and JSON. Wrappers are provided for many of the factory methods that the time package offers.",
"type": [
"string",
"null"
],
"format": "date-time"
},
"deletionGracePeriodSeconds": {
"description": "Number of seconds allowed for this object to gracefully terminate before it will be removed from the system. Only set when deletionTimestamp is also set. May only be shortened. Read-only.",
"type": "integer",
"format": "int64"
},
"deletionTimestamp": {
"description": "Time is a wrapper around time.Time which supports correct marshaling to YAML and JSON. Wrappers are provided for many of the factory methods that the time package offers.",
"type": [
"string",
"null"
],
"format": "date-time"
},
"finalizers": {
"description": "Must be empty before the object is deleted from the registry. Each entry is an identifier for the responsible component that will remove the entry from the list. If the deletionTimestamp of the object is non-nil, entries in this list can only be removed.",
"type": [
"array",
"null"
],
"items": {
"type": [
"string",
"null"
]
},
"x-kubernetes-patch-strategy": "merge"
},
"generateName": {
"description": "GenerateName is an optional prefix, used by the server, to generate a unique name ONLY IF the Name field has not been provided. If this field is used, the name returned to the client will be different than the name passed. This value will also be combined with a unique suffix. The provided value has the same validation rules as the Name field, and may be truncated by the length of the suffix required to make the value unique on the server.\n\nIf this field is specified and the generated name exists, the server will NOT return a 409 - instead, it will either return 201 Created or 500 with Reason ServerTimeout indicating a unique name could not be found in the time allotted, and the client should retry (optionally after the time indicated in the Retry-After header).\n\nApplied only if Name is not specified. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#idempotency",
"type": [
"string",
"null"
]
},
"generation": {
"description": "A sequence number representing a specific generation of the desired state. Populated by the system. Read-only.",
"type": "integer",
"format": "int64"
},
"initializers": {
"description": "Initializers tracks the progress of initialization.",
"required": [
"pending"
],
"properties": {
"pending": {
"description": "Pending is a list of initializers that must execute in order before this object is visible. When the last pending initializer is removed, and no failing result is set, the initializers struct will be set to nil and the object is considered as initialized and visible to all clients.",
"type": "array",
"items": {
"description": "Initializer is information about an initializer that has not yet completed.",
"required": [
"name"
],
"properties": {
"name": {
"description": "name of the process that is responsible for initializing this object.",
"type": "string"
}
}
},
"x-kubernetes-patch-merge-key": "name",
"x-kubernetes-patch-strategy": "merge"
},
"result": {
"description": "Status is a return value for calls that don't return other objects.",
"properties": {
"apiVersion": {
"description": "APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#resources",
"type": [
"string",
"null"
]
},
"code": {
"description": "Suggested HTTP return code for this status, 0 if not set.",
"type": "integer",
"format": "int32"
},
"details": {
"description": "StatusDetails is a set of additional properties that MAY be set by the server to provide additional information about a response. The Reason field of a Status object defines what attributes will be set. Clients must ignore fields that do not match the defined type of each attribute, and should assume that any attribute may be empty, invalid, or under defined.",
"properties": {
"causes": {
"description": "The Causes array includes more details associated with the StatusReason failure. Not all StatusReasons may provide detailed causes.",
"type": [
"array",
"null"
],
"items": {
"description": "StatusCause provides more information about an api.Status failure, including cases when multiple errors are encountered.",
"properties": {
"field": {
"description": "The field of the resource that has caused this error, as named by its JSON serialization. May include dot and postfix notation for nested attributes. Arrays are zero-indexed. Fields may appear more than once in an array of causes due to fields having multiple errors. Optional.\n\nExamples:\n \"name\" - the field \"name\" on the current resource\n \"items[0].name\" - the field \"name\" on the first array entry in \"items\"",
"type": [
"string",
"null"
]
},
"message": {
"description": "A human-readable description of the cause of the error. This field may be presented as-is to a reader.",
"type": [
"string",
"null"
]
},
"reason": {
"description": "A machine-readable description of the cause of the error. If this value is empty there is no information available.",
"type": [
"string",
"null"
]
}
}
}
},
"group": {
"description": "The group attribute of the resource associated with the status StatusReason.",
"type": [
"string",
"null"
]
},
"kind": {
"description": "The kind attribute of the resource associated with the status StatusReason. On some operations may differ from the requested resource Kind. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#types-kinds",
"type": [
"string",
"null"
]
},
"name": {
"description": "The name attribute of the resource associated with the status StatusReason (when there is a single name which can be described).",
"type": [
"string",
"null"
]
},
"retryAfterSeconds": {
"description": "If specified, the time in seconds before the operation should be retried. Some errors may indicate the client must take an alternate action - for those errors this field may indicate how long to wait before taking the alternate action.",
"type": "integer",
"format": "int32"
},
"uid": {
"description": "UID of the resource. (when there is a single resource which can be described). More info: http://kubernetes.io/docs/user-guide/identifiers#uids",
"type": [
"string",
"null"
]
}
}
},
"kind": {
"description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#types-kinds",
"type": [
"string",
"null"
],
"enum": [
"Status"
]
},
"message": {
"description": "A human-readable description of the status of this operation.",
"type": [
"string",
"null"
]
},
"metadata": {
"description": "ListMeta describes metadata that synthetic resources must have, including lists and various status objects. A resource may have only one of {ObjectMeta, ListMeta}.",
"properties": {
"continue": {
"description": "continue may be set if the user set a limit on the number of items returned, and indicates that the server has more data available. The value is opaque and may be used to issue another request to the endpoint that served this list to retrieve the next set of available objects. Continuing a list may not be possible if the server configuration has changed or more than a few minutes have passed. The resourceVersion field returned when using this continue value will be identical to the value in the first response.",
"type": [
"string",
"null"
]
},
"resourceVersion": {
"description": "String that identifies the server's internal version of this object that can be used by clients to determine when objects have changed. Value must be treated as opaque by clients and passed unmodified back to the server. Populated by the system. Read-only. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#concurrency-control-and-consistency",
"type": [
"string",
"null"
]
},
"selfLink": {
"description": "selfLink is a URL representing this object. Populated by the system. Read-only.",
"type": [
"string",
"null"
]
}
}
},
"reason": {
"description": "A machine-readable description of why this operation is in the \"Failure\" status. If this value is empty there is no information available. A Reason clarifies an HTTP status code but does not override it.",
"type": [
"string",
"null"
]
},
"status": {
"description": "Status of the operation. One of: \"Success\" or \"Failure\". More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#spec-and-status",
"type": [
"string",
"null"
]
}
},
"x-kubernetes-group-version-kind": [
{
"group": "",
"kind": "Status",
"version": "v1"
}
]
}
}
},
"labels": {
"description": "Map of string keys and values that can be used to organize and categorize (scope and select) objects. May match selectors of replication controllers and services. More info: http://kubernetes.io/docs/user-guide/labels",
"type": "object",
"additionalProperties": {
"type": [
"string",
"null"
]
}
},
"name": {
"description": "Name must be unique within a namespace. Is required when creating resources, although some resources may allow a client to request the generation of an appropriate name automatically. Name is primarily intended for creation idempotence and configuration definition. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/identifiers#names",
"type": [
"string",
"null"
]
},
"namespace": {
"description": "Namespace defines the space within each name must be unique. An empty namespace is equivalent to the \"default\" namespace, but \"default\" is the canonical representation. Not all objects are required to be scoped to a namespace - the value of this field for those objects will be empty.\n\nMust be a DNS_LABEL. Cannot be updated. More info: http://kubernetes.io/docs/user-guide/namespaces",
"type": [
"string",
"null"
]
},
"ownerReferences": {
"description": "List of objects depended by this object. If ALL objects in the list have been deleted, this object will be garbage collected. If this object is managed by a controller, then an entry in this list will point to this controller, with the controller field set to true. There cannot be more than one managing controller.",
"type": [
"array",
"null"
],
"items": {
"description": "OwnerReference contains enough information to let you identify an owning object. Currently, an owning object must be in the same namespace, so there is no namespace field.",
"required": [
"apiVersion",
"kind",
"name",
"uid"
],
"properties": {
"apiVersion": {
"description": "API version of the referent.",
"type": "string"
},
"blockOwnerDeletion": {
"description": "If true, AND if the owner has the \"foregroundDeletion\" finalizer, then the owner cannot be deleted from the key-value store until this reference is removed. Defaults to false. To set this field, a user needs \"delete\" permission of the owner, otherwise 422 (Unprocessable Entity) will be returned.",
"type": "boolean"
},
"controller": {
"description": "If true, this reference points to the managing controller.",
"type": "boolean"
},
"kind": {
"description": "Kind of the referent. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#types-kinds",
"type": "string"
},
"name": {
"description": "Name of the referent. More info: http://kubernetes.io/docs/user-guide/identifiers#names",
"type": "string"
},
"uid": {
"description": "UID of the referent. More info: http://kubernetes.io/docs/user-guide/identifiers#uids",
"type": "string"
}
}
},
"x-kubernetes-patch-merge-key": "uid",
"x-kubernetes-patch-strategy": "merge"
},
"resourceVersion": {
"description": "An opaque value that represents the internal version of this object that can be used by clients to determine when objects have changed. May be used for optimistic concurrency, change detection, and the watch operation on a resource or set of resources. Clients must treat these values as opaque and passed unmodified back to the server. They may only be valid for a particular resource or set of resources.\n\nPopulated by the system. Read-only. Value must be treated as opaque by clients and . More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#concurrency-control-and-consistency",
"type": [
"string",
"null"
]
},
"selfLink": {
"description": "SelfLink is a URL representing this object. Populated by the system. Read-only.",
"type": [
"string",
"null"
]
},
"uid": {
"description": "UID is the unique in time and space value for this object. It is typically generated by the server on successful creation of a resource and is not allowed to change on PUT operations.\n\nPopulated by the system. Read-only. More info: http://kubernetes.io/docs/user-guide/identifiers#uids",
"type": [
"string",
"null"
]
}
}
},
"spec": {
"description": "LimitRangeSpec defines a min/max usage limit for resources that match on kind.",
"required": [
"limits"
],
"properties": {
"limits": {
"description": "Limits is the list of LimitRangeItem objects that are enforced.",
"type": "array",
"items": {
"description": "LimitRangeItem defines a min/max usage limit for any resource that matches on kind.",
"properties": {
"default": {
"description": "Default resource requirement limit value by resource name if resource limit is omitted.",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"type": [
"string",
"null"
]
},
{
"type": "integer"
}
]
}
},
"defaultRequest": {
"description": "DefaultRequest is the default resource requirement request value by resource name if resource request is omitted.",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"type": [
"string",
"null"
]
},
{
"type": "integer"
}
]
}
},
"max": {
"description": "Max usage constraints on this kind by resource name.",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"type": [
"string",
"null"
]
},
{
"type": "integer"
}
]
}
},
"maxLimitRequestRatio": {
"description": "MaxLimitRequestRatio if specified, the named resource must have a request and limit that are both non-zero where limit divided by request is less than or equal to the enumerated value; this represents the max burst for the named resource.",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"type": [
"string",
"null"
]
},
{
"type": "integer"
}
]
}
},
"min": {
"description": "Min usage constraints on this kind by resource name.",
"type": "object",
"additionalProperties": {
"oneOf": [
{
"type": [
"string",
"null"
]
},
{
"type": "integer"
}
]
}
},
"type": {
"description": "Type of resource that this limit applies to.",
"type": [
"string",
"null"
]
}
}
}
}
}
}
},
"x-kubernetes-group-version-kind": [
{
"group": "",
"kind": "LimitRange",
"version": "v1"
}
]
}
},
"kind": {
"description": "Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#types-kinds",
"type": [
"string",
"null"
],
"enum": [
"LimitRangeList"
]
},
"metadata": {
"description": "ListMeta describes metadata that synthetic resources must have, including lists and various status objects. A resource may have only one of {ObjectMeta, ListMeta}.",
"properties": {
"continue": {
"description": "continue may be set if the user set a limit on the number of items returned, and indicates that the server has more data available. The value is opaque and may be used to issue another request to the endpoint that served this list to retrieve the next set of available objects. Continuing a list may not be possible if the server configuration has changed or more than a few minutes have passed. The resourceVersion field returned when using this continue value will be identical to the value in the first response.",
"type": [
"string",
"null"
]
},
"resourceVersion": {
"description": "String that identifies the server's internal version of this object that can be used by clients to determine when objects have changed. Value must be treated as opaque by clients and passed unmodified back to the server. Populated by the system. Read-only. More info: https://git.k8s.io/community/contributors/devel/api-conventions.md#concurrency-control-and-consistency",
"type": [
"string",
"null"
]
},
"selfLink": {
"description": "selfLink is a URL representing this object. Populated by the system. Read-only.",
"type": [
"string",
"null"
]
}
}
}
},
"x-kubernetes-group-version-kind": [
{
"group": "",
"kind": "LimitRangeList",
"version": "v1"
}
],
"$schema": "http://json-schema.org/schema#",
"type": "object"
}
| {
"pile_set_name": "Github"
} |
---
title: "Cannot infer a common type for the first and second operands of the binary 'If' operator"
ms.date: 07/20/2015
f1_keywords:
- "vbc33110"
- "bc33110"
helpviewer_keywords:
- "BC33110"
ms.assetid: f46873aa-f6cd-4cc9-9e8e-e668bddf0980
---
# Cannot infer a common type for the first and second operands of the binary 'If' operator
Cannot infer a common type for the first and second operands of the binary 'If' operator. One must have a widening conversion to the other.
The binary `If` operator requires a widening conversion from one of its operands to the other. For example, because there is no widening conversion in either direction between `Integer` and `String`, the following code causes this error.
```vb
Dim first? As Integer
Dim second As String = "First is Nothing"
'' Not valid.
' Console.WriteLine(If(first, second))
```
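For contrast, the operator compiles when one operand widens to the other; `Integer` widens to `Long`, for example. A minimal sketch (variable names are illustrative):

```vb
Dim first? As Integer = 5
Dim fallback As Long = 0
' Integer? widens to Long?, so the compiler can infer a common type.
Console.WriteLine(If(first, fallback))
```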
**Error ID:** BC33110
## To correct this error
- Provide an explicit conversion for one of the operands, if that is possible in your code:
```vb
Console.WriteLine(If(first, CInt(second)))
```
- Rewrite the code by using a different conditional construction.
```vb
If first IsNot Nothing Then
Console.WriteLine(first)
Else
Console.WriteLine(second)
End If
```
## See also
- [If Operator](../language-reference/operators/if-operator.md)
- [Widening and Narrowing Conversions](../programming-guide/language-features/data-types/widening-and-narrowing-conversions.md)
- [If...Then...Else Statement](../language-reference/statements/if-then-else-statement.md)
| {
"pile_set_name": "Github"
} |
// ------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License (MIT). See License.txt in the repo root for license information.
// ------------------------------------------------------------
namespace Microsoft.Azure.IIoT.OpcUa.Registry.Handlers {
using Microsoft.Azure.IIoT.OpcUa.Registry.Models;
using Microsoft.Azure.IIoT.Hub;
using Microsoft.Azure.IIoT.Serializers;
using Serilog;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
/// <summary>
/// Server discovery result handling
/// </summary>
public sealed class DiscoveryEventHandler : IDeviceTelemetryHandler {
/// <inheritdoc/>
public string MessageSchema => Models.MessageSchemaTypes.DiscoveryEvents;
/// <summary>
/// Create handler
/// </summary>
/// <param name="processors"></param>
/// <param name="serializer"></param>
/// <param name="logger"></param>
public DiscoveryEventHandler(IEnumerable<IDiscoveryResultProcessor> processors,
IJsonSerializer serializer, ILogger logger) {
_serializer = serializer ??
throw new ArgumentNullException(nameof(serializer));
_logger = logger ??
throw new ArgumentNullException(nameof(logger));
_processors = processors?.ToList() ??
throw new ArgumentNullException(nameof(processors));
}
/// <inheritdoc/>
public async Task HandleAsync(string deviceId, string moduleId,
byte[] payload, IDictionary<string, string> properties, Func<Task> checkpoint) {
DiscoveryEventModel discovery;
try {
discovery = _serializer.Deserialize<DiscoveryEventModel>(payload);
}
catch (Exception ex) {
_logger.Error(ex, "Failed to convert discovery result {json}",
Encoding.UTF8.GetString(payload));
return;
}
try {
var discovererId = DiscovererModelEx.CreateDiscovererId(
deviceId, moduleId);
await ProcessServerEndpointDiscoveryAsync(discovererId,
discovery, checkpoint);
}
catch (Exception ex) {
_logger.Error(ex, "Handling discovery event failed with exception - skip");
}
}
/// <inheritdoc/>
public async Task OnBatchCompleteAsync() {
try {
await _queueLock.WaitAsync();
var old = DateTime.UtcNow - TimeSpan.FromHours(1);
var removed = new List<KeyValuePair<DateTime, DiscovererDiscoveryResult>>();
foreach (var backlog in _discovererQueues.ToList()) {
foreach (var queue in backlog.Value
.Where(kv => kv.Key < old).ToList()) {
backlog.Value.Remove(queue.Key);
removed.Add(queue);
}
if (backlog.Value.Count == 0) {
_discovererQueues.Remove(backlog.Key);
}
}
//
// Checkpoint the latest of the ones removed.
//
var newest = removed
.Select(x => x.Value)
.OrderByDescending(x => x.Created).FirstOrDefault();
if (newest != null) {
try {
await newest.Checkpoint();
}
catch {
return;
}
}
}
finally {
_queueLock.Release();
}
}
/// <summary>
/// Process discovery model
/// </summary>
/// <param name="discovererId"></param>
/// <param name="model"></param>
/// <param name="checkpoint"></param>
/// <returns></returns>
private async Task ProcessServerEndpointDiscoveryAsync(
string discovererId, DiscoveryEventModel model, Func<Task> checkpoint) {
try {
await _queueLock.WaitAsync();
if (!_discovererQueues.TryGetValue(discovererId, out var backlog)) {
backlog = new Dictionary<DateTime, DiscovererDiscoveryResult>();
_discovererQueues.Add(discovererId, backlog);
}
if (!backlog.TryGetValue(model.TimeStamp, out var queue)) {
queue = new DiscovererDiscoveryResult(checkpoint);
backlog.Add(model.TimeStamp, queue);
}
queue.Enqueue(model);
if (queue.Completed) {
try {
// Process discoveries
await Task.WhenAll(
_processors.Select(p => p.ProcessDiscoveryResultsAsync(
discovererId, queue.Result, queue.Events)));
}
catch (Exception ex) {
_logger.Error(ex,
"Failure during discovery processing in registry. Skip.");
}
backlog.Remove(model.TimeStamp);
//
// Check if there are any older queues still in the list.
// If not then checkpoint this queue.
//
if (!_discovererQueues.Any(d => d.Value
.Any(x => x.Value.Created <= queue.Created))) {
await queue.Checkpoint();
}
}
if (backlog.Count == 0) {
_discovererQueues.Remove(discovererId);
}
}
finally {
_queueLock.Release();
}
}
/// <summary>
/// Keeps queue of discovery messages per device
/// </summary>
private class DiscovererDiscoveryResult {
/// <summary>
/// Checkpointable event data
/// </summary>
public Func<Task> Checkpoint { get; }
/// <summary>
/// When queue was created
/// </summary>
public DateTime Created { get; } = DateTime.UtcNow;
/// <summary>
/// Whether the result is complete
/// </summary>
public bool Completed =>
Result != null && _maxIndex == _endpoints.Count;
/// <summary>
/// Get events
/// </summary>
/// <returns></returns>
public IEnumerable<DiscoveryEventModel> Events {
get {
_endpoints.Sort((x, y) => x.Index.CompareTo(y.Index));
return _endpoints;
}
}
/// <summary>
/// Result of discovery
/// </summary>
public DiscoveryResultModel Result { get; private set; }
/// <summary>
/// Create queue
/// </summary>
public DiscovererDiscoveryResult(Func<Task> checkpoint) {
Checkpoint = checkpoint;
_endpoints = new List<DiscoveryEventModel>();
_maxIndex = 0;
}
/// <summary>
/// Add another discovery from discoverer
/// </summary>
/// <param name="model"></param>
/// <returns></returns>
public void Enqueue(DiscoveryEventModel model) {
_maxIndex = Math.Max(model.Index, _maxIndex);
if (model.Registration != null) {
_endpoints.Add(model);
}
else {
Result = model.Result ?? new DiscoveryResultModel();
}
}
private readonly List<DiscoveryEventModel> _endpoints;
private int _maxIndex;
}
private readonly Dictionary<string,
Dictionary<DateTime, DiscovererDiscoveryResult>> _discovererQueues =
new Dictionary<string,
Dictionary<DateTime, DiscovererDiscoveryResult>>();
private readonly SemaphoreSlim _queueLock = new SemaphoreSlim(1, 1);
private readonly IJsonSerializer _serializer;
private readonly ILogger _logger;
private readonly List<IDiscoveryResultProcessor> _processors;
}
}
| {
"pile_set_name": "Github"
} |
package com.mmnaseri.cs.clrs.ch23.s2;
import com.mmnaseri.cs.clrs.ch21.s3.PathCompressingRankedForestDisjointSet;
import com.mmnaseri.cs.clrs.ch21.s3.RankedTreeElement;
import com.mmnaseri.cs.clrs.ch22.s1.*;
import com.mmnaseri.cs.clrs.ch23.s1.WeightedEdgeDetails;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
/**
* @author Mohammad Milad Naseri ([email protected])
* @since 1.0 (8/5/15)
*/
public class KruskalMinimumSpanningTreeFinder<E extends WeightedEdgeDetails, V extends VertexDetails> implements MinimumSpanningTreeFinder<E, V> {
@Override
public Graph<E, V> find(Graph<E, V> graph) {
if (graph.isEmpty()) {
return graph;
}
final Graph<E, V> result = new AdjacencyListGraph<>();
for (Vertex<V> vertex : graph) {
result.add(vertex.getDetails());
}
final PathCompressingRankedForestDisjointSet<Integer> set = new PathCompressingRankedForestDisjointSet<>();
final List<RankedTreeElement<Integer>> elements = new ArrayList<>();
for (int i = 0; i < graph.size(); i++) {
final RankedTreeElement<Integer> element = set.create(i);
elements.add(element);
}
final List<Edge<E, V>> edges = collectEdges(graph);
Collections.sort(edges, new EdgeWeightComparator<E, V>());
for (Edge<E, V> edge : edges) {
final RankedTreeElement<Integer> from = elements.get(edge.getFrom().getIndex());
final RankedTreeElement<Integer> to = elements.get(edge.getTo().getIndex());
if (!set.find(from).equals(set.find(to))) {
set.union(from, to);
result.connect(from.getValue(), to.getValue());
result.connect(to.getValue(), from.getValue());
}
}
return result;
}
private List<Edge<E, V>> collectEdges(Graph<E, V> graph) {
final List<Edge<E, V>> edges = new ArrayList<>();
for (int i = 0; i < graph.size(); i++) {
for (int j = 0; j < graph.size(); j++) {
//we only want to support non-digraphs
final Edge<E, V> first = graph.edge(i, j);
final Edge<E, V> second = graph.edge(j, i);
if (first == null || second == null) {
continue;
}
E firstDetails = first.getDetails();
E secondDetails = second.getDetails();
if (firstDetails == null) {
firstDetails = secondDetails;
}
if (secondDetails == null) {
secondDetails = firstDetails;
}
final int firstWeight = firstDetails == null ? 0 : firstDetails.getWeight();
final int secondWeight = secondDetails == null ? 0 : secondDetails.getWeight();
//if both directions were specified, but had different weights, it is a digraph edge, and we reject it
if (firstWeight != secondWeight) {
continue;
}
//we have now determined that the edges are the same and they have the same weight, so it won't matter
//which one of them we store
edges.add(first);
}
}
return edges;
}
private static class EdgeWeightComparator<E extends WeightedEdgeDetails, V extends VertexDetails> implements Comparator<Edge<E, V>> {
@Override
public int compare(Edge<E, V> first, Edge<E, V> second) {
final int firstWeight = first.getDetails() == null ? 0 : first.getDetails().getWeight();
final int secondWeight = second.getDetails() == null ? 0 : second.getDetails().getWeight();
return Integer.compare(firstWeight, secondWeight);
}
}
}
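The core steps (sort the edges by weight, then union endpoints that still belong to different sets) can also be seen in a dependency-free sketch. The following standalone class is hypothetical, not part of this repository; it uses a plain array-based union-find with path halving instead of the ranked forest used above:

```java
import java.util.Arrays;
import java.util.Comparator;

public class MiniKruskal {

    /** Returns the total weight of an MST of n vertices; edges are {u, v, weight}. */
    static int mstWeight(int n, int[][] edges) {
        Arrays.sort(edges, Comparator.comparingInt(e -> e[2]));
        int[] parent = new int[n];
        for (int i = 0; i < n; i++) {
            parent[i] = i;
        }
        int total = 0;
        for (int[] e : edges) {
            int ru = find(parent, e[0]);
            int rv = find(parent, e[1]);
            if (ru != rv) {          // endpoints in different trees: take the edge
                parent[ru] = rv;
                total += e[2];
            }
        }
        return total;
    }

    /** Find with path halving. */
    static int find(int[] p, int x) {
        while (p[x] != x) {
            p[x] = p[p[x]];
            x = p[x];
        }
        return x;
    }

    public static void main(String[] args) {
        int[][] edges = {{0, 1, 1}, {1, 2, 2}, {0, 2, 3}, {2, 3, 4}};
        System.out.println(mstWeight(4, edges)); // prints 7 (edges 0-1, 1-2, 2-3)
    }
}
```

With the sample graph in `main`, the cycle edge `{0, 2, 3}` is skipped and the chosen edges sum to 7.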
| {
"pile_set_name": "Github"
} |
<?php
namespace Neos\Cache;
/*
* This file is part of the Neos.Cache package.
*
* (c) Contributors of the Neos Project - www.neos.io
*
* This package is Open Source Software. For the full copyright and license
* information, please view the LICENSE file which was distributed with this
* source code.
*/
/**
* Interface for objects which are cache aware and are collaborative when it comes to storing them in a cache.
*
* @api
*/
interface CacheAwareInterface
{
/**
* Returns a string which distinctly identifies this object and thus can be used as an identifier for cache entries
* related to this object.
*
* @return string
*/
public function getCacheEntryIdentifier(): string;
}
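A trivial implementation, purely for illustration (the class name and identifier scheme are made up):

```php
<?php
use Neos\Cache\CacheAwareInterface;

class ReportSnapshot implements CacheAwareInterface
{
    /**
     * @var string
     */
    protected $reportId;

    public function __construct(string $reportId)
    {
        $this->reportId = $reportId;
    }

    public function getCacheEntryIdentifier(): string
    {
        // One cache entry per report id.
        return 'report-snapshot-' . $this->reportId;
    }
}
```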
| {
"pile_set_name": "Github"
} |
//
// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).
//
// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.
//
#import <Contacts/CNContact.h>
@interface CNContact (IMCoreAdditions)
- (BOOL)_im_isEqualToContact:(id)arg1;
@end
| {
"pile_set_name": "Github"
} |
<?php
// This file is part of Moodle - http://moodle.org/
//
// Moodle is free software: you can redistribute it and/or modify
// it under the terms of the GNU General Public License as published by
// the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
//
// Moodle is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU General Public License for more details.
//
// You should have received a copy of the GNU General Public License
// along with Moodle. If not, see <http://www.gnu.org/licenses/>.
/**
* Privacy Subsystem implementation for mod_quiz.
*
* @package mod_quiz
* @category privacy
* @copyright 2018 Andrew Nicols <[email protected]>
* @license http://www.gnu.org/copyleft/gpl.html GNU GPL v3 or later
*/
namespace mod_quiz\privacy;
use core_privacy\local\request\approved_contextlist;
use core_privacy\local\request\approved_userlist;
use core_privacy\local\request\contextlist;
use core_privacy\local\request\deletion_criteria;
use core_privacy\local\request\transform;
use core_privacy\local\metadata\collection;
use core_privacy\local\request\userlist;
use core_privacy\local\request\writer;
use core_privacy\manager;
defined('MOODLE_INTERNAL') || die();
require_once($CFG->dirroot . '/mod/quiz/lib.php');
require_once($CFG->dirroot . '/mod/quiz/locallib.php');
/**
* Privacy Subsystem implementation for mod_quiz.
*
* @copyright 2018 Andrew Nicols <[email protected]>
* @license http://www.gnu.org/copyleft/gpl.html GNU GPL v3 or later
*/
class provider implements
// This plugin has data.
\core_privacy\local\metadata\provider,
// This plugin currently implements the original plugin_provider interface.
\core_privacy\local\request\plugin\provider,
// This plugin is capable of determining which users have data within it.
\core_privacy\local\request\core_userlist_provider {
/**
 * Describe all the types of user data stored by this plugin.
*
* @param collection $items The collection to add metadata to.
* @return collection The array of metadata
*/
public static function get_metadata(collection $items) : collection {
// The table 'quiz' stores a record for each quiz.
// It does not contain user personal data, but data is returned from it for contextual requirements.
// The table 'quiz_attempts' stores a record of each quiz attempt.
// It contains a userid which links to the user making the attempt and contains information about that attempt.
$items->add_database_table('quiz_attempts', [
'attempt' => 'privacy:metadata:quiz_attempts:attempt',
'currentpage' => 'privacy:metadata:quiz_attempts:currentpage',
'preview' => 'privacy:metadata:quiz_attempts:preview',
'state' => 'privacy:metadata:quiz_attempts:state',
'timestart' => 'privacy:metadata:quiz_attempts:timestart',
'timefinish' => 'privacy:metadata:quiz_attempts:timefinish',
'timemodified' => 'privacy:metadata:quiz_attempts:timemodified',
'timemodifiedoffline' => 'privacy:metadata:quiz_attempts:timemodifiedoffline',
'timecheckstate' => 'privacy:metadata:quiz_attempts:timecheckstate',
'sumgrades' => 'privacy:metadata:quiz_attempts:sumgrades',
], 'privacy:metadata:quiz_attempts');
// The table 'quiz_feedback' contains the feedback responses which will be shown to users depending upon the
// grade they achieve in the quiz.
// It does not identify the user who wrote the feedback item so cannot be returned directly and is not
// described, but relevant feedback items will be included with the quiz export for a user who has a grade.
// The table 'quiz_grades' contains the current grade for each quiz/user combination.
$items->add_database_table('quiz_grades', [
'quiz' => 'privacy:metadata:quiz_grades:quiz',
'userid' => 'privacy:metadata:quiz_grades:userid',
'grade' => 'privacy:metadata:quiz_grades:grade',
'timemodified' => 'privacy:metadata:quiz_grades:timemodified',
], 'privacy:metadata:quiz_grades');
// The table 'quiz_overrides' contains any user or group overrides for users.
// It should be included where data exists for a user.
$items->add_database_table('quiz_overrides', [
'quiz' => 'privacy:metadata:quiz_overrides:quiz',
'userid' => 'privacy:metadata:quiz_overrides:userid',
'timeopen' => 'privacy:metadata:quiz_overrides:timeopen',
'timeclose' => 'privacy:metadata:quiz_overrides:timeclose',
'timelimit' => 'privacy:metadata:quiz_overrides:timelimit',
], 'privacy:metadata:quiz_overrides');
// These define the structure of the quiz.
// The table 'quiz_sections' contains data about the structure of a quiz.
// It does not contain any user identifying data and does not need a mapping.
// The table 'quiz_slots' contains data about the structure of a quiz.
// It does not contain any user identifying data and does not need a mapping.
// The table 'quiz_reports' does not contain any user identifying data and does not need a mapping.
// The table 'quiz_statistics' contains abstract statistics about question usage and cannot be mapped to any
// specific user.
// It does not contain any user identifying data and does not need a mapping.
// The quiz links to the 'core_question' subsystem for all question functionality.
$items->add_subsystem_link('core_question', [], 'privacy:metadata:core_question');
// The quiz has two subplugin types.
$items->add_plugintype_link('quiz', [], 'privacy:metadata:quiz');
$items->add_plugintype_link('quizaccess', [], 'privacy:metadata:quizaccess');
// Although the quiz supports the core_completion API and defines custom completion items, these will be
// noted by the manager as all activity modules are capable of supporting this functionality.
return $items;
}
/**
* Get the list of contexts where the specified user has attempted a quiz, or been involved with manual marking
* and/or grading of a quiz.
*
* @param int $userid The user to search.
* @return contextlist $contextlist The contextlist containing the list of contexts used in this plugin.
*/
public static function get_contexts_for_userid(int $userid) : contextlist {
$resultset = new contextlist();
// Users who attempted the quiz.
$sql = "SELECT c.id
FROM {context} c
JOIN {course_modules} cm ON cm.id = c.instanceid AND c.contextlevel = :contextlevel
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_attempts} qa ON qa.quiz = q.id
WHERE qa.userid = :userid AND qa.preview = 0";
$params = ['contextlevel' => CONTEXT_MODULE, 'modname' => 'quiz', 'userid' => $userid];
$resultset->add_from_sql($sql, $params);
// Users with quiz overrides.
$sql = "SELECT c.id
FROM {context} c
JOIN {course_modules} cm ON cm.id = c.instanceid AND c.contextlevel = :contextlevel
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_overrides} qo ON qo.quiz = q.id
WHERE qo.userid = :userid";
$params = ['contextlevel' => CONTEXT_MODULE, 'modname' => 'quiz', 'userid' => $userid];
$resultset->add_from_sql($sql, $params);
// Get the SQL used to link indirect question usages for the user.
// This includes where a user is the manual marker on a question attempt.
$qubaid = \core_question\privacy\provider::get_related_question_usages_for_user('rel', 'mod_quiz', 'qa.uniqueid', $userid);
// Select the context of any quiz attempt where a user has an attempt, plus the related usages.
$sql = "SELECT c.id
FROM {context} c
JOIN {course_modules} cm ON cm.id = c.instanceid AND c.contextlevel = :contextlevel
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_attempts} qa ON qa.quiz = q.id
" . $qubaid->from . "
WHERE " . $qubaid->where() . " AND qa.preview = 0";
$params = ['contextlevel' => CONTEXT_MODULE, 'modname' => 'quiz'] + $qubaid->from_where_params();
$resultset->add_from_sql($sql, $params);
return $resultset;
}
/**
* Get the list of users who have data within a context.
*
* @param userlist $userlist The userlist containing the list of users who have data in this context/plugin combination.
*/
public static function get_users_in_context(userlist $userlist) {
$context = $userlist->get_context();
if (!$context instanceof \context_module) {
return;
}
$params = [
'cmid' => $context->instanceid,
'modname' => 'quiz',
];
// Users who attempted the quiz.
$sql = "SELECT qa.userid
FROM {course_modules} cm
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_attempts} qa ON qa.quiz = q.id
WHERE cm.id = :cmid AND qa.preview = 0";
$userlist->add_from_sql('userid', $sql, $params);
// Users with quiz overrides.
$sql = "SELECT qo.userid
FROM {course_modules} cm
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_overrides} qo ON qo.quiz = q.id
WHERE cm.id = :cmid";
$userlist->add_from_sql('userid', $sql, $params);
// Question usages in context.
// This includes where a user is the manual marker on a question attempt.
$sql = "SELECT qa.uniqueid
FROM {course_modules} cm
JOIN {modules} m ON m.id = cm.module AND m.name = :modname
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_attempts} qa ON qa.quiz = q.id
WHERE cm.id = :cmid AND qa.preview = 0";
\core_question\privacy\provider::get_users_in_context_from_sql($userlist, 'qn', $sql, $params);
}
/**
* Export all user data for the specified user, in the specified contexts.
*
* @param approved_contextlist $contextlist The approved contexts to export information for.
*/
public static function export_user_data(approved_contextlist $contextlist) {
global $DB;
if (!count($contextlist)) {
return;
}
$user = $contextlist->get_user();
$userid = $user->id;
list($contextsql, $contextparams) = $DB->get_in_or_equal($contextlist->get_contextids(), SQL_PARAMS_NAMED);
$sql = "SELECT
q.*,
qg.id AS hasgrade,
qg.grade AS bestgrade,
qg.timemodified AS grademodified,
qo.id AS hasoverride,
qo.timeopen AS override_timeopen,
qo.timeclose AS override_timeclose,
qo.timelimit AS override_timelimit,
c.id AS contextid,
cm.id AS cmid
FROM {context} c
INNER JOIN {course_modules} cm ON cm.id = c.instanceid AND c.contextlevel = :contextlevel
INNER JOIN {modules} m ON m.id = cm.module AND m.name = :modname
INNER JOIN {quiz} q ON q.id = cm.instance
LEFT JOIN {quiz_overrides} qo ON qo.quiz = q.id AND qo.userid = :qouserid
LEFT JOIN {quiz_grades} qg ON qg.quiz = q.id AND qg.userid = :qguserid
WHERE c.id {$contextsql}";
$params = [
'contextlevel' => CONTEXT_MODULE,
'modname' => 'quiz',
'qguserid' => $userid,
'qouserid' => $userid,
];
$params += $contextparams;
// Fetch the individual quizzes.
$quizzes = $DB->get_recordset_sql($sql, $params);
foreach ($quizzes as $quiz) {
list($course, $cm) = get_course_and_cm_from_cmid($quiz->cmid, 'quiz');
$quizobj = new \quiz($quiz, $cm, $course);
$context = $quizobj->get_context();
$quizdata = \core_privacy\local\request\helper::get_context_data($context, $contextlist->get_user());
\core_privacy\local\request\helper::export_context_files($context, $contextlist->get_user());
if (!empty($quizdata->timeopen)) {
$quizdata->timeopen = transform::datetime($quiz->timeopen);
}
if (!empty($quizdata->timeclose)) {
$quizdata->timeclose = transform::datetime($quiz->timeclose);
}
if (!empty($quizdata->timelimit)) {
$quizdata->timelimit = $quiz->timelimit;
}
if (!empty($quiz->hasoverride)) {
$quizdata->override = (object) [];
if (!empty($quizdata->override_timeopen)) {
$quizdata->override->timeopen = transform::datetime($quiz->override_timeopen);
}
if (!empty($quizdata->override_timeclose)) {
$quizdata->override->timeclose = transform::datetime($quiz->override_timeclose);
}
if (!empty($quizdata->override_timelimit)) {
$quizdata->override->timelimit = $quiz->override_timelimit;
}
}
$quizdata->accessdata = (object) [];
$components = \core_component::get_plugin_list('quizaccess');
$exportparams = [
$quizobj,
$user,
];
foreach (array_keys($components) as $component) {
$classname = manager::get_provider_classname_for_component("quizaccess_$component");
if (class_exists($classname) && is_subclass_of($classname, quizaccess_provider::class)) {
$result = component_class_callback($classname, 'export_quizaccess_user_data', $exportparams);
if (count((array) $result)) {
$quizdata->accessdata->$component = $result;
}
}
}
if (empty((array) $quizdata->accessdata)) {
unset($quizdata->accessdata);
}
writer::with_context($context)
->export_data([], $quizdata);
}
$quizzes->close();
// Store all quiz attempt data.
static::export_quiz_attempts($contextlist);
}
/**
* Delete all data for all users in the specified context.
*
* @param context $context The specific context to delete data for.
*/
public static function delete_data_for_all_users_in_context(\context $context) {
if ($context->contextlevel != CONTEXT_MODULE) {
// Only quiz module will be handled.
return;
}
$cm = get_coursemodule_from_id('quiz', $context->instanceid);
if (!$cm) {
// Only quiz module will be handled.
return;
}
$quizobj = \quiz::create($cm->instance);
$quiz = $quizobj->get_quiz();
// Handle the 'quizaccess' subplugin.
manager::plugintype_class_callback(
'quizaccess',
quizaccess_provider::class,
'delete_subplugin_data_for_all_users_in_context',
[$quizobj]
);
// Delete all overrides - do not log.
quiz_delete_all_overrides($quiz, false);
// This will delete all question attempts, quiz attempts, and quiz grades for this quiz.
quiz_delete_all_attempts($quiz);
}
/**
* Delete all user data for the specified user, in the specified contexts.
*
* @param approved_contextlist $contextlist The approved contexts and user information to delete information for.
*/
public static function delete_data_for_user(approved_contextlist $contextlist) {
global $DB;
foreach ($contextlist as $context) {
if ($context->contextlevel != CONTEXT_MODULE) {
// Only quiz module will be handled.
continue;
}
$cm = get_coursemodule_from_id('quiz', $context->instanceid);
if (!$cm) {
// Only quiz module will be handled.
continue;
}
// Fetch the details of the data to be removed.
$quizobj = \quiz::create($cm->instance);
$quiz = $quizobj->get_quiz();
$user = $contextlist->get_user();
// Handle the 'quizaccess' subplugin.
manager::plugintype_class_callback(
'quizaccess',
quizaccess_provider::class,
'delete_quizaccess_data_for_user',
[$quizobj, $user]
);
// Remove overrides for this user.
$overrides = $DB->get_records('quiz_overrides' , [
'quiz' => $quizobj->get_quizid(),
'userid' => $user->id,
]);
foreach ($overrides as $override) {
quiz_delete_override($quiz, $override->id, false);
}
// This will delete all question attempts, quiz attempts, and quiz grades for this quiz.
quiz_delete_user_attempts($quizobj, $user);
}
}
/**
* Delete multiple users within a single context.
*
* @param approved_userlist $userlist The approved context and user information to delete information for.
*/
public static function delete_data_for_users(approved_userlist $userlist) {
global $DB;
$context = $userlist->get_context();
if ($context->contextlevel != CONTEXT_MODULE) {
// Only quiz module will be handled.
return;
}
$cm = get_coursemodule_from_id('quiz', $context->instanceid);
if (!$cm) {
// Only quiz module will be handled.
return;
}
$quizobj = \quiz::create($cm->instance);
$quiz = $quizobj->get_quiz();
$userids = $userlist->get_userids();
// Handle the 'quizaccess' subplugin.
manager::plugintype_class_callback(
'quizaccess',
quizaccess_user_provider::class,
'delete_quizaccess_data_for_users',
[$userlist]
);
foreach ($userids as $userid) {
// Remove overrides for this user.
$overrides = $DB->get_records('quiz_overrides' , [
'quiz' => $quizobj->get_quizid(),
'userid' => $userid,
]);
foreach ($overrides as $override) {
quiz_delete_override($quiz, $override->id, false);
}
// This will delete all question attempts, quiz attempts, and quiz grades for this user in the given quiz.
quiz_delete_user_attempts($quizobj, (object)['id' => $userid]);
}
}
/**
* Store all quiz attempts for the contextlist.
*
* @param approved_contextlist $contextlist
*/
protected static function export_quiz_attempts(approved_contextlist $contextlist) {
global $DB;
$userid = $contextlist->get_user()->id;
list($contextsql, $contextparams) = $DB->get_in_or_equal($contextlist->get_contextids(), SQL_PARAMS_NAMED);
$qubaid = \core_question\privacy\provider::get_related_question_usages_for_user('rel', 'mod_quiz', 'qa.uniqueid', $userid);
$sql = "SELECT
c.id AS contextid,
cm.id AS cmid,
qa.*
FROM {context} c
JOIN {course_modules} cm ON cm.id = c.instanceid AND c.contextlevel = :contextlevel
JOIN {modules} m ON m.id = cm.module AND m.name = 'quiz'
JOIN {quiz} q ON q.id = cm.instance
JOIN {quiz_attempts} qa ON qa.quiz = q.id
" . $qubaid->from. "
WHERE (
qa.userid = :qauserid OR
" . $qubaid->where() . "
) AND qa.preview = 0
";
$params = array_merge(
[
'contextlevel' => CONTEXT_MODULE,
'qauserid' => $userid,
],
$qubaid->from_where_params()
);
$attempts = $DB->get_recordset_sql($sql, $params);
foreach ($attempts as $attempt) {
$quiz = $DB->get_record('quiz', ['id' => $attempt->quiz]);
$context = \context_module::instance($attempt->cmid);
$attemptsubcontext = helper::get_quiz_attempt_subcontext($attempt, $contextlist->get_user());
$options = quiz_get_review_options($quiz, $attempt, $context);
if ($attempt->userid == $userid) {
// This attempt was made by the user.
// They 'own' all data on it.
// Store the question usage data.
\core_question\privacy\provider::export_question_usage($userid,
$context,
$attemptsubcontext,
$attempt->uniqueid,
$options,
true
);
// Store the quiz attempt data.
$data = (object) [
'state' => \quiz_attempt::state_name($attempt->state),
];
if (!empty($attempt->timestart)) {
$data->timestart = transform::datetime($attempt->timestart);
}
if (!empty($attempt->timefinish)) {
$data->timefinish = transform::datetime($attempt->timefinish);
}
if (!empty($attempt->timemodified)) {
$data->timemodified = transform::datetime($attempt->timemodified);
}
if (!empty($attempt->timemodifiedoffline)) {
$data->timemodifiedoffline = transform::datetime($attempt->timemodifiedoffline);
}
if (!empty($attempt->timecheckstate)) {
$data->timecheckstate = transform::datetime($attempt->timecheckstate);
}
if ($options->marks == \question_display_options::MARK_AND_MAX) {
$grade = quiz_rescale_grade($attempt->sumgrades, $quiz, false);
$data->grade = (object) [
'grade' => quiz_format_grade($quiz, $grade),
'feedback' => quiz_feedback_for_grade($grade, $quiz, $context),
];
}
writer::with_context($context)
->export_data($attemptsubcontext, $data);
} else {
// This attempt was made by another user.
// The current user may have marked part of the quiz attempt.
\core_question\privacy\provider::export_question_usage(
$userid,
$context,
$attemptsubcontext,
$attempt->uniqueid,
$options,
false
);
}
}
$attempts->close();
}
}
//
// SZCalendarCell.m
// SZCalendarPicker
//
// Created by Stephen Zhuang on 14/12/1.
// Copyright (c) 2014 Stephen Zhuang. All rights reserved.
//
#import "SZCalendarCell.h"
@implementation SZCalendarCell
- (UILabel *)dateLabel
{
if (!_dateLabel) {
_dateLabel = [[UILabel alloc] initWithFrame:self.bounds];
[_dateLabel setTextAlignment:NSTextAlignmentCenter];
[_dateLabel setFont:[UIFont systemFontOfSize:17]];
[self addSubview:_dateLabel];
}
return _dateLabel;
}
@end
# Makefile for busybox
#
# Copyright (C) 1999-2005 by Erik Andersen <[email protected]>
#
# Licensed under GPLv2, see file LICENSE in this source tree.
lib-y:=
INSERT
lib-$(CONFIG_CHVT) += chvt.o
lib-$(CONFIG_FGCONSOLE) += fgconsole.o
lib-$(CONFIG_CLEAR) += clear.o
lib-$(CONFIG_DEALLOCVT) += deallocvt.o
lib-$(CONFIG_DUMPKMAP) += dumpkmap.o
lib-$(CONFIG_SETCONSOLE) += setconsole.o
lib-$(CONFIG_KBD_MODE) += kbd_mode.o
lib-$(CONFIG_LOADFONT) += loadfont.o
lib-$(CONFIG_LOADKMAP) += loadkmap.o
lib-$(CONFIG_OPENVT) += openvt.o
lib-$(CONFIG_RESET) += reset.o
lib-$(CONFIG_RESIZE) += resize.o
lib-$(CONFIG_SETFONT) += loadfont.o
lib-$(CONFIG_SETKEYCODES) += setkeycodes.o
lib-$(CONFIG_SETLOGCONS) += setlogcons.o
lib-$(CONFIG_SHOWKEY) += showkey.o
/*
* Copyright (C) 2004 by Basler Vision Technologies AG
* Author: Thomas Koeller <[email protected]>
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
#if !defined(_ASM_RM9K_OCD_H)
#define _ASM_RM9K_OCD_H
#include <linux/types.h>
#include <linux/spinlock.h>
#include <asm/io.h>
extern volatile void __iomem * const ocd_base;
extern volatile void __iomem * const titan_base;
#define ocd_addr(__x__) (ocd_base + (__x__))
#define titan_addr(__x__) (titan_base + (__x__))
#define scram_addr(__x__) (scram_base + (__x__))
/* OCD register access */
#define ocd_readl(__offs__) __raw_readl(ocd_addr(__offs__))
#define ocd_readw(__offs__) __raw_readw(ocd_addr(__offs__))
#define ocd_readb(__offs__) __raw_readb(ocd_addr(__offs__))
#define ocd_writel(__val__, __offs__) \
__raw_writel((__val__), ocd_addr(__offs__))
#define ocd_writew(__val__, __offs__) \
__raw_writew((__val__), ocd_addr(__offs__))
#define ocd_writeb(__val__, __offs__) \
__raw_writeb((__val__), ocd_addr(__offs__))
/* TITAN register access - 32 bit-wide only */
#define titan_readl(__offs__) __raw_readl(titan_addr(__offs__))
#define titan_writel(__val__, __offs__) \
__raw_writel((__val__), titan_addr(__offs__))
/* Protect access to shared TITAN registers */
extern spinlock_t titan_lock;
extern int titan_irqflags;
#define lock_titan_regs() spin_lock_irqsave(&titan_lock, titan_irqflags)
#define unlock_titan_regs() spin_unlock_irqrestore(&titan_lock, titan_irqflags)
#endif /* !defined(_ASM_RM9K_OCD_H) */
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="description" content="">
<meta name="author" content="">
<title>HTML5 (+ Brython): TO DO app using Local Storage</title>
<!-- Bootstrap core CSS -->
<link href="//netdna.bootstrapcdn.com/bootstrap/3.1.0/css/bootstrap.min.css" rel="stylesheet">
<script type="text/python">
from browser.local_storage import storage
from browser import document as doc
from browser import window as win
from browser import html
import datetime
import json
try:
storage['tasklist']
except:
storage['tasklist'] = json.dumps({})
def get_date():
date = datetime.datetime.now()
return date.strftime('%Y/%m/%d-%H:%M:%S')
def get_form_data():
task = doc["taskform"].value
relevance = doc["relevanceform"].value
return task, relevance
class ToDo:
def __init__(self):
self.tasks = json.loads(storage['tasklist'])
for key in self.tasks:
self.add_task_row(key)
def add_task(self, ev):
key = get_date()
task, relevance = get_form_data()
self.tasks[key] = [task, relevance]
self._save()
self.add_task_row(key)
def _del_task(self,event):
# get row
elt = event.target
while elt.parent and elt.tagName != 'TR':
elt = elt.parent
key = elt.id
del self.tasks[key]
del doc[key]
self._save()
def add_task_row(self, key):
if self.tasks[key][1] == "High":
color = "danger"
elif self.tasks[key][1] == "Medium":
color = "warning"
else:
color = "success"
row = html.TR(Class = color)
row.id = key
self._temp_key = key
link = html.A(html.IMG(src = "./done.png"))
link.href = "#"
link.bind('click', self._del_task)
row <= html.TD(self.tasks[key][0]) + \
html.TD(self.tasks[key][1]) + \
html.TD(key) + \
html.TD(link)
doc['tablebody'] <= row
def clear(self, ev):
for key in self.tasks:
del doc[key]
self.tasks = {}
self._save()
def _save(self):
storage['tasklist'] = json.dumps(self.tasks)
todo = ToDo()
add_task = getattr(todo,'add_task')
clear_all = getattr(todo,'clear')
doc["button-add"].bind("click", add_task)
doc["link-clear"].bind("click", clear_all)
</script>
</head>
<body onload="brython()">
<div class="navbar navbar-inverse">
<div class="container">
<div class="navbar-header">
<button type="button" class="navbar-toggle" data-toggle="collapse" data-target=".navbar-collapse">
<span class="icon-bar"></span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
</button>
<a class="navbar-brand" href="#">To-Do-Brython</a>
</div>
</div>
</div>
<div class="container">
<div class="starter-template">
<h1>HTML5 (+ Brython): TO DO App</h1>
<p class="lead">This simple app only works in modern browsers.</p>
<form class="form-horizontal" role="form">
<div class="form-group">
<label for="taskform" class="col-lg-2 control-label">Task</label>
<div class="col-lg-10">
<input id="taskform" type="text" class="form-control" placeholder="Text input">
</div>
</div>
<div class="form-group">
<label for="relevanceform" class="col-lg-2 control-label">Relevance</label>
<div class="col-lg-10">
<select id="relevanceform" class="form-control">
<option>High</option>
<option>Medium</option>
<option>Low</option>
</select>
</div>
</div>
<div class="form-group">
<div class="col-lg-offset-2 col-lg-10">
<button id="button-add" type="button" class="btn btn-default">Add</button>
</div>
</div>
</form>
<table id="mytable" class="table table-striped table-bordered table-hover table-responsive">
<thead class="header">
<tr>
<th class="text-center">Task</th>
<th class="text-center">Relevance</th>
<th class="text-center">Created</th>
<th class="text-center">Done</th>
</tr>
</thead>
<tbody id="tablebody">
</tbody>
</table>
<div><a id="link-clear" href="#">Clear all!</a></div>
</div><!-- /.starter-template -->
</div><!-- /.container -->
<!-- jQuery (necessary for Bootstrap's JavaScript plugins) -->
<script src="http://code.jquery.com/jquery-2.1.0.min.js"></script>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="//netdna.bootstrapcdn.com/bootstrap/3.1.0/js/bootstrap.min.js"></script>
<!-- Brython scripts
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script type="text/javascript" src="../../../../../../src/brython.js"></script>
</body>
</html>
CREATE TABLE "ne_10m_admin_0_boundary_lines_land" (
gid serial,
"fid_ne_10m" int4,
"scalerank" numeric,
"featurecla" varchar(32),
"note_" varchar(32),
"name" varchar(100),
"comment" varchar(100),
"adm0_usa" int2,
"adm0_left" varchar(100),
"adm0_right" varchar(100),
"adm0_a3_l" varchar(3),
"adm0_a3_r" varchar(3),
"sov_a3_l" varchar(3),
"sov_a3_r" varchar(3),
"type" varchar(50),
"labelrank" int2);
ALTER TABLE "ne_10m_admin_0_boundary_lines_land" ADD PRIMARY KEY (gid);
SELECT AddGeometryColumn('','ne_10m_admin_0_boundary_lines_land','the_geom','3857','MULTILINESTRING',2);
| {% extends 'projects/view.html' %}
include ../../../../../pillar/src/templates/mixins/components
| {% set title = 'production-tools' %}
| {% block css %}
| {{ super() }}
link(href="{{ url_for('static_flamenco', filename='assets/css/main.css') }}", rel="stylesheet")
link(href="{{ url_for('static_flamenco', filename='assets/img/favicon.png') }}", rel="shortcut icon")
| {% endblock css %}
| {% block head %}
script(src="{{ url_for('static_flamenco', filename='assets/js/generated/tutti.min.js') }}")
script(src="{{ url_for('static_pillar', filename='assets/js/vendor/clipboard.min.js')}}")
| {% endblock head %}
| {% block body %}
#app-main
#col_sidebar
nav.sidebar(role='navigation')
ul
li
a.navbar-item.flamenco(href="{{ url_for('flamenco.index') }}",
title='Flamenco')
i.pi-flamenco
li
a.navbar-item.managers(href="{{ url_for('flamenco.managers.index') }}",
title='Your Flamenco Managers') Mngrs
| {% if session.get('flamenco_last_project') %}
| {% set flamenco_last_project = session.get('flamenco_last_project') %}
li
a.navbar-item.jobs(href="{{ url_for('flamenco.jobs.perproject.index', project_url=flamenco_last_project.url) }}",
title='Jobs for project {{ flamenco_last_project.name }}') Jobs
li
a.navbar-item.archive(href="{{ url_for('flamenco.jobs.archive.perproject.index', project_url=flamenco_last_project.url) }}",
title='Jobs archive for project {{ flamenco_last_project.name }}') Archive
| {% endif %}
| {% block flamencobody %}
| {% endblock flamencobody %}
| {% endblock body %}
| {% block footer_scripts_pre %}
script(src="{{ url_for('static_pillar', filename='assets/js/vendor/jquery.select2.min.js') }}", async=true)
| {% if project %}
script.
ProjectUtils.setProjectAttributes({projectId: "{{project._id}}", projectUrl: "{{project.url}}"});
| {% endif %}
| {% endblock footer_scripts_pre %}
/* Area: ffi_call
Purpose: Check return value complex, with many arguments
Limitations: none.
PR: none.
Originator: <[email protected]>. */
/* { dg-do run } */
#include "complex_defs_longdouble.inc"
#include "many_complex.inc"
# Format: //devtools/kokoro/config/proto/build.proto
# Location of the bash script. Should have value <github_scm.name>/<path_from_repository_root>.
# github_scm.name is specified in the job configuration (next section).
build_file: "orbitprofiler/kokoro/builds/build.bat"
action {
define_artifacts {
regex: "github/orbitprofiler/build/testresults/*.xml"
strip_prefix: "github/orbitprofiler/build/package"
}
}
//args: -Eunused
package p
type (
unused struct{}
)
func X() {}
TITLE FRAME DUMP
.mllit=1
a=1
b=2
c=3 ;column count-down
d=4 ;row count-down
e=5 ;line-number on XGP
f=6 ;TV buffer byte pointer (8 bit)
g=7 ;disk buffer byte pointer (16 bit)
t=10 ;temporary
u=11 ;duplicity factor 1, 2 or 3
p=17
tvpage==370 ;top 10 pages of core for tv
tv=tvpage*2000 ;address of first word in tv screen
dskc==1
start: move p,[-20,,pdl]
movei u,1 ;DEFAULTS TO 1
pushj p,readnm
skipe unames
.suset [.ssname,,unames]
.open dskc,dskfil
.value
move a,[-10,,tvpage]
setz b,
.call cormap
.value
pushj p,tblini ;initialise magic table
movei d,455. ;455. lines to do
movei e,400. ;two inches clear of top on XGP
move f,[441000,,tv] ;8 bits at a shove
cain u,1
move f,[442000,,tv] ;16 bits at a shove
pushj p,linout ;write out the whole frame
move g,[442000,,dskbuf]
movei a,2 ;2 (PDP-11 words) long
idpb a,g
addi e,400. ;two inches clear at bottom of XGP
move a,e
iori a,100000 ;sign-bit on -> paper cut
idpb a,g
move a,[-1,,dskbuf]
.iot dskc,a ;write this record
move g,[442000,,dskbuf]
movei a,2 ;2 (PDP11 words) long
idpb a,g
setz a, ;line-number=0 -> EOF
idpb a,g
move a,[-1,,dskbuf]
.iot dskc,a
.close dskc,
.break 16,124000
.value
linout: pushj p,dumlin ;set up buffer pointer, enter XGP line number
pushj p,headin ;put in heading, mode switches and so on
pushj p,gobble ;transform the line
cain u,3
jrst wrtnow
setz a,
idpb a,g ;PDP11 words worth of white
idpb a,g ;PDP11 words worth of white
idpb a,g ;PDP11 words worth of white
wrtnow: pushj p,wrtlin ;write the line
caig u,1
jrst gumpa
pushj p,dumlin ;update line-number
pushj p,wrtlin ;write it again
gumpa: caig u,2
jrst gumpb
pushj p,dumlin ;update line-number
pushj p,wrtlin ;write it again
gumpb: sojg d,linout
popj p,
headin: cain u,3
jrst imagen
setz a, ;(0,0) enter run-length mode
idpb a,g
move a,lftmar
andi a,377
idpb a,g ;(200,0) 8 PDP11 words worth of white
idpb a,g ;(200,0) 8 PDP11 words worth of white
caie u,1
jrst whteno
idpb a,g ;(200,0) 8 PDP11 words worth of white
idpb a,g ;(200,0) 8 PDP11 words worth of white
whteno: setz a, ;(0,0) escape to command mode
idpb a,g
movei a,2
idpb a,g ;(2,0) enter image mode - and 8 bits of white
popj p,
imagen: movei a,2*400 ;(0,2) enter image mode
idpb a,g
popj p,
wrtlin: movn a,nmbwrd(u)
lsh a,-1
hrlz a,a
hrri a,dskbuf
.iot dskc,a
aos e ;up line-count
popj p,
dumlin: move g,[442000,,dskbuf]
move a,nmbwrd(u) ;number of PDP11 words in line
idpb a,g
idpb e,g ;line-number
popj p,
gobble: movei c,44 ;36. PDP10 words per line
cain u,1
jrst gobonc
cain u,2
jrst gobtwc
jrst gobtrc
gobonc: ildb a,f ;16 bits from tv
circ a,-20 ;reverse bits into b
idpb b,g ;16 bits to disk
sojg c,gobonc
popj p,
gobtwc: ildb a,f ;get 8 bits from tv
move b,dbltbl(a) ;look up doubled and reversed
idpb b,g ;put 16 bits to disk
ildb a,f ;get 8 bits from tv
move b,dbltbl(a) ;look up doubled and reversed
idpb b,g ;put 16 bits to disk
sojg c,gobtwc
popj p,
gobtrc: ildb a,f ;get 8 bits from TV
move a,dbltbl(a) ;look up 24 bits
ildb b,f ;get next 8 bits
move b,dbltbl(b) ;look up 24 bits
lsh b,14 ;left-justify (24. + 12. = 36.)
lshc a,-10 ;move 8 across to second word
move t,b ;save it for now
circ a,-20 ;reverse it
idpb b,g ;put 16 bits into disk buffer
move b,t ;next batch
lshc a,20 ;suck out 16
move t,b ;save rest for now
circ a,-20 ;reverse it
idpb b,g
move b,t
lshc a,20 ;suck out remaining 16 bit
circ a,-20 ;reverse it
idpb b,g
sojg c,gobtrc
popj p,
; if white-out on XGP is a problem, it may help to generate
; white bits in duplicate and triplicate mode.
tblini: cain u,1
popj p,
movei d,377
tbllup: move a,d ;get number
movei c,10 ;8 bits to do
dbllup: lshc a,-1 ;move one bit over
ash b,-1 ;duplicate it (make lsh to generate white)
caie u,2
ash b,-1 ;duplicate it (make lsh to generate whit)
sojg c,dbllup
caie u,2
jrst shovth
lshc a,20 ;bring back into a
circ a,-20 ;reverse into b
movem b,dbltbl(d)
sojge d,tbllup
popj p,
shovth: lshc a,30
movem a,dbltbl(d) ;store it
sojge d,tbllup
popj p,
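; The tblini routine above builds dbltbl: for every 8-bit TV value it stores
; the bits reversed, with each bit repeated twice (duplicate mode, u=2) or
; three times (triplicate mode, u=3). A hypothetical modern restatement in
; Python — function name and bit conventions are my assumptions, not part of
; the original MIDAS source:

```python
def stretch_and_reverse(byte, factor):
    """Reverse the bits of an 8-bit value and repeat each bit
    `factor` times (2 for duplicate mode, 3 for triplicate),
    mimicking what tblini stores in dbltbl."""
    out = 0
    for i in range(8):
        bit = (byte >> i) & 1        # take source bits LSB-first: the reversal
        for _ in range(factor):
            out = (out << 1) | bit   # each source bit lands `factor` times
    return out

# duplicate mode turns 8 input bits into a 16-bit table entry
entry = stretch_and_reverse(0xFF, 2)
```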
readnm: .break 12,[5,,jcl] ;get job control language
move c,[440700,,jcl]
movei f,dskfil+1
scanon: move d,[440600,,g] ;first file name ?
setz g,
cntrlp: ildb a,c
skipn a
popj p,
caig a,40
jrst cntrlp
goblop: cain a,":
jrst colons
cain a,";
jrst semico
cain a,"/
jrst param
caig a,40
jrst spacer ;space
caige a,140
subi a,40 ;number
idpb a,d
ildb a,c
jrst goblop
spacer: skipn g
jrst scanon
movem g,(f)
movei f,dskfil+2
jrst scanon
semico: movem g,unames
jrst scanon
colons: hlrm g,dskfil ;oh, really?
jrst scanon
param: ildb u,c
subi u,60
caile u,3
movei u,3
skipg u
movei u,1
jrst spacer
dskfil: 7,,(sixbit/dsk/)
sixbit /frame/
sixbit />/
cormap: setz
'corblk
1000,,600000
1000,,-1
a
1000,,-2
setz b
nmbwrd: 0
2+1+4+2+44+3
2+1+2+2+44+44+3
2+1+44+44+44-3 ;number of PDP11 words in record (better be even!)
lftmar: 200 ;0 < lftmar < 400 (left margin of white)
unames: 0 ;potential place for luser name
dbltbl: block 400 ;tripled up bits
dskbuf: block 70 ;65 max of 53. words
pdl: block 20
jcl: block 20
-1
end start
| {
"pile_set_name": "Github"
} |
fn_args_layout = "Tall"
reorder_imports = true
| {
"pile_set_name": "Github"
} |
tree.databases.node.name=\u0411\u0430\u0437\u044B \u0434\u0430\u043D\u043D\u044B\u0445
tree.databases.node.tip=\u0411\u0430\u0437\u0430 \u0434\u0430\u043D\u043D\u044B\u0445 \u0441\u0435\u0440\u0432\u0435\u0440\u0430
tree.database.node.name=\u0411\u0430\u0437\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
tree.tables.node.name=\u0422\u0430\u0431\u043B\u0438\u0446\u044B
tree.tables.node.tip=\u0422\u0430\u0431\u043B\u0438\u0446\u044B
tree.table.node.name=\u0422\u0430\u0431\u043B\u0438\u0446\u0430
tree.column.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0430
tree.columns.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0438
tree.constraints.node.name=\u041E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F
tree.constraint.node.name=\u041E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u0435
tree.constraint_columns.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0438 \u043E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F
tree.check.constraints.node.name=Check \u043E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F
tree.check.constraint.node.name=Check \u043E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u0435
tree.foreign_keys.node.name=\u0412\u043D\u0435\u0448\u043D\u0438\u0435 \u043A\u043B\u044E\u0447\u0438
tree.foreign_key.node.name=\u0412\u043D\u0435\u0448\u043D\u0438\u0439 \u043A\u043B\u044E\u0447
tree.foreign_key_columns.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0438 \u0432\u043D\u0435\u0448\u043D\u0435\u0433\u043E \u043A\u043B\u044E\u0447\u0430
tree.references.node.name=\u0421\u0441\u044B\u043B\u043A\u0438
tree.reference_key.node.name=\u041A\u043B\u044E\u0447
tree.reference_key_columns.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0438
tree.triggers.node.name=\u0422\u0440\u0438\u0433\u0433\u0435\u0440\u044B
tree.triggers.node.tip=\u0422\u0440\u0438\u0433\u0433\u0435\u0440\u044B
tree.trigger.node.name=\u0422\u0440\u0438\u0433\u0433\u0435\u0440
tree.indexes.node.name=\u0418\u043D\u0434\u0435\u043A\u0441\u044B
tree.indexes.node.tip=\u0418\u043D\u0434\u0435\u043A\u0441\u044B
tree.index.node.name=\u0418\u043D\u0434\u0435\u043A\u0441
tree.index_columns.node.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0438 \u0438\u043D\u0434\u0435\u043A\u0441\u0430
tree.partitions.node.name=\u0420\u0430\u0437\u0434\u0435\u043B\u044B
tree.partition.node.name=\u0420\u0430\u0437\u0434\u0435\u043B
tree.subpartitions.node.name=\u041F\u043E\u0434\u0440\u0430\u0437\u0434\u0435\u043B\u044B
tree.subpartition.node.name=\u041F\u043E\u0434\u0440\u0430\u0437\u0434\u0435\u043B
tree.views.node.name=\u041F\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u044F
tree.views.node.tip=\u041F\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u044F
tree.view.node.name=\u041F\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435
tree.procedures.node.name=\u041F\u0440\u043E\u0446\u0435\u0434\u0443\u0440\u044B
tree.procedures.node.tip=\u041F\u0440\u043E\u0446\u0435\u0434\u0443\u0440\u044B
tree.procedure.node.name=\u041F\u0440\u043E\u0446\u0435\u0434\u0443\u0440\u0430
tree.users.node.name=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u0438
tree.users.node.tip=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u0438
tree.user.node.name=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044C
tree.administer.node.name=\u0410\u0434\u043C\u0438\u043D\u0438\u0441\u0442\u0440\u0438\u0440\u043E\u0432\u0430\u043D\u0438\u0435
tree.administer.node.tip=\u041E\u0431\u0441\u043B\u0443\u0436\u0438\u0432\u0430\u043D\u0438\u0435/\u041D\u0430\u0441\u0442\u0440\u043E\u0439\u043A\u0438
tree.sessions.node.name=\u0421\u0435\u0441\u0441\u0438\u0438
tree.privilege.node.name=\u041F\u0440\u0438\u0432\u0438\u043B\u0435\u0433\u0438\u044F
tree.user_privileges.node.name=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044C\u0441\u043A\u0438\u0435 \u043F\u0440\u0438\u0432\u0438\u043B\u0435\u0433\u0438\u0438
tree.system_info.node.name=\u0421\u0438\u0441\u0442\u0435\u043C\u043D\u0430\u044F \u0438\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0438\u044F
tree.system_info.node.tip=\u0421\u0438\u0441\u0442\u0435\u043C\u043D\u0430\u044F \u0438\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0438\u044F \u0431\u0430\u0437\u044B \u0434\u0430\u043D\u043D\u044B\u0445
tree.session_status.node.name=\u0421\u0442\u0430\u0442\u0443\u0441 \u0441\u0435\u0441\u0441\u0438\u0438
tree.variable.node.name=\u041F\u0435\u0440\u0435\u043C\u0435\u043D\u043D\u0430\u044F
tree.global_status.node.name=\u0413\u043B\u043E\u0431\u0430\u043B\u044C\u043D\u044B\u0439 \u0441\u0442\u0430\u0442\u0443\u0441
tree.session_variables.node.name=\u041F\u0435\u0440\u0435\u043C\u0435\u043D\u043D\u044B\u0435 \u0441\u0435\u0441\u0441\u0438\u0438
tree.global_variables.node.name=\u0413\u043B\u043E\u0431\u0430\u043B\u044C\u043D\u044B\u0435 \u043F\u0435\u0440\u0435\u043C\u0435\u043D\u043D\u044B\u0435
tree.engines.node.name=\u0414\u0432\u0438\u0436\u043A\u0438
tree.engine.node.name=\u0414\u0432\u0438\u0436\u043E\u043A
tree.charsets.node.name=\u041D\u0430\u0431\u043E\u0440\u044B \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
tree.charset.node.name=\u041D\u0430\u0431\u043E\u0440 \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
tree.collation.node.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435
tree.events.node.name=\u0421\u043E\u0431\u044B\u0442\u0438\u044F
tree.events.node.tip=\u0421\u043E\u0431\u044B\u0442\u0438\u044F
tree.packages.node.name=\u041F\u0430\u043A\u0435\u0442\u044B
tree.packages.node.description=\u041F\u0430\u043A\u0435\u0442\u044B MariaDB (\u0440\u0435\u0436\u0438\u043C Oracle)
manager.catalog.name=\u041C\u0435\u043D\u0435\u0434\u0436\u0435\u0440 \u043A\u0430\u0442\u0430\u043B\u043E\u0433\u043E\u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog.name.name=\u0418\u043C\u044F \u0441\u0445\u0435\u043C\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog$AdditionalInfo.defaultCharset.name=\u041A\u043E\u0434\u0438\u0440\u043E\u0432\u043A\u0430 \u043F\u043E \u0443\u043C\u043E\u043B\u0447\u0430\u043D\u0438\u044E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog$AdditionalInfo.defaultCollation.name=\u041F\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435 \u043F\u043E \u0443\u043C\u043E\u043B\u0447\u0430\u043D\u0438\u044E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog$AdditionalInfo.sqlPath.name=SQL \u043F\u0443\u0442\u044C
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog.databaseSize.name=\u0420\u0430\u0437\u043C\u0435\u0440
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCatalog.databaseSize.description=\u0420\u0430\u0437\u043C\u0435\u0440 \u0431\u0430\u0437\u044B \u0434\u0430\u043D\u043D\u044B\u0445 (\u0434\u043B\u0438\u043D\u0430 \u0434\u0430\u043D\u043D\u044B\u0445 + \u0434\u043B\u0438\u043D\u0430 \u0438\u043D\u0434\u0435\u043A\u0441\u0430 \u0432 \u0431\u0430\u0439\u0442\u0430\u0445)
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCharset.name.name=\u041A\u043E\u0434\u0438\u0440\u043E\u0432\u043A\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCharset.maxLength.name=\u041C\u0430\u043A\u0441.\u0434\u043B\u0438\u043D\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCharset.description.name=\u041E\u043F\u0438\u0441\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCollation.charset.name=\u041D\u0430\u0431\u043E\u0440 \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCollation.default.name=\u041F\u043E \u0443\u043C\u043E\u043B\u0447\u0430\u043D\u0438\u044E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCollation.compiled.name=\u0421\u043A\u043E\u043C\u043F\u0438\u043B\u0438\u0440\u043E\u0432\u0430\u043D
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCollation.sortLength.name=\u0414\u043B\u0438\u043D\u0430 \u0441\u043E\u0440\u0442\u0438\u0440\u043E\u0432\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLCollation.name.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEngine.name.name=\u0414\u0432\u0438\u0436\u043E\u043A
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEngine.support.name=\u041F\u043E\u0434\u0434\u0435\u0440\u0436\u043A\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEngine.supportsTransactions.name=\u041F\u043E\u0434\u0434\u0435\u0440\u0436\u043A\u0430 \u0442\u0440\u0430\u043D\u0437\u0430\u043A\u0446\u0438\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEngine.supportsXA.name=\u041F\u043E\u0434\u0434\u0435\u0440\u0436\u043A\u0430 \u0425\u0410
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEngine.supportsSavepoints.name=\u041F\u043E\u0434\u0434\u0435\u0440\u0436\u043A\u0430 \u0442\u043E\u0447\u0435\u043A \u0441\u043E\u0445\u0440\u0430\u043D\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLParameter.name.name=\u0418\u043C\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLParameter.value.name=\u0417\u043D\u0430\u0447\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.name.name=\u0418\u043C\u044F \u0440\u0430\u0437\u0434\u0435\u043B\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.position.name=\u041F\u043E\u0437\u0438\u0446\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.method.name=\u041C\u0435\u0442\u043E\u0434
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.expression.name=\u0412\u044B\u0440\u0430\u0436\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.description.name=\u041E\u043F\u0438\u0441\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.tableRows.name=\u0417\u0430\u043F\u0438\u0441\u0438 \u0442\u0430\u0431\u043B\u0438\u0446\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.avgRowLength.name=\u0421\u0440\u0435\u0434\u043D\u044F\u044F \u0434\u043B\u0438\u043D\u0430 \u0437\u0430\u043F\u0438\u0441\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.dataLength.name=\u0414\u043B\u0438\u043D\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.maxDataLength.name=\u041D\u0430\u0438\u0431.\u0434\u043B\u0438\u043D\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.indexLength.name=\u0414\u043B\u0438\u043D\u0430 \u0438\u043D\u0434\u0435\u043A\u0441\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.dataFree.name=\u0421\u0432\u043E\u0431\u043E\u0434\u043D\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.createTime.name=\u0412\u0440\u0435\u043C\u044F \u0441\u043E\u0437\u0434\u0430\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.updateTime.name=\u0412\u0440\u0435\u043C\u044F \u0438\u0437\u043C\u0435\u043D\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.checkTime.name=\u0412\u0440\u0435\u043C\u044F \u043F\u0440\u043E\u0432\u0435\u0440\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.checksum.name=\u041A\u043E\u043D\u0442\u0440.\u0441\u0443\u043C\u043C\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.comment.name=\u041A\u043E\u043C\u043C\u0435\u043D\u0442\u0430\u0440\u0438\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPartition.nodegroup.name=\u0423\u0437\u043B\u043E\u0432\u0430\u044F \u0433\u0440\u0443\u043F\u043F\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPrivilege.name.name=\u041F\u0440\u0438\u0432\u0438\u043B\u0435\u0433\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPrivilege.context.name=\u041A\u043E\u043D\u0442\u0435\u043A\u0441\u0442
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.procedureType.name=\u0422\u0438\u043F \u043F\u0440\u043E\u0446\u0435\u0434\u0443\u0440\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.resultType.name=\u0422\u0438\u043F \u0440\u0435\u0437\u0443\u043B\u044C\u0442\u0430\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.bodyType.name=\u0422\u0438\u043F \u0442\u0435\u043B\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.body.name=\u0422\u0435\u043B\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.deterministic.name=\u0414\u0435\u0442\u0435\u0440\u043C\u0438\u043D\u0438\u0440\u043E\u0432\u0430\u043D\u043D\u0430\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.deterministic.description=\u041E\u043F\u0435\u0440\u0430\u0446\u0438\u044F \u0441\u0447\u0438\u0442\u0430\u0435\u0442\u0441\u044F 'deterministic' \u0435\u0441\u043B\u0438 \u043E\u043D\u0430 \u0432\u0441\u0435\u0433\u0434\u0430 \u043F\u0440\u0438\u0432\u043E\u0434\u0438\u0442 \u043A \u043E\u0434\u0438\u043D\u0430\u043A\u043E\u0432\u043E\u043C\u0443 \u0440\u0435\u0437\u0443\u043B\u044C\u0442\u0430\u0442\u0443 \u043F\u0440\u0438 \u043E\u0434\u0438\u043D\u0430\u043A\u043E\u0432\u044B\u0445 \u0438\u0441\u0445\u043E\u0434\u043D\u044B\u0445 \u043F\u0430\u0440\u0430\u043C\u0435\u0442\u0440\u0430\u0445 \u0438 'not deterministic' \u0432 \u043F\u0440\u043E\u0442\u0438\u0432\u043D\u043E\u043C \u0441\u043B\u0443\u0447\u0430\u0435.
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedure.objectDefinitionText.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLProcedureParameter.parameterKind.name=\u0422\u0438\u043F \u043A\u043E\u043B\u043E\u043D\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.engine.name=\u0414\u0432\u0438\u0436\u043E\u043A
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.autoIncrement.name=\u0410\u0432\u0442\u043E-\u0443\u0432\u0435\u043B\u0438\u0447\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.charset.name=\u041D\u0430\u0431\u043E\u0440 \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.description.name=\u041E\u043F\u0438\u0441\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.rowCount.name=\u041A\u043E\u043B-\u0432\u043E \u0437\u0430\u043F\u0438\u0441\u0435\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.avgRowLength.name=\u0421\u0440\u0435\u0434\u043D\u044F\u044F \u0434\u043B\u0438\u043D\u0430 \u0437\u0430\u043F\u0438\u0441\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.dataLength.name=\u0420\u0430\u0437\u043C\u0435\u0440
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.createTime.name=\u0412\u0440\u0435\u043C\u044F \u0441\u043E\u0437\u0434\u0430\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.maxDataLength.name=\u041C\u0430\u043A\u0441. \u0434\u043B\u0438\u043D\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.dataFree.name=\u0421\u0432\u043E\u0431\u043E\u0434\u043D\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.collation.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.typeName.name=\u0422\u0438\u043F \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.indexLength.name=\u0414\u043B\u0438\u043D\u0430 \u0438\u043D\u0434\u0435\u043A\u0441\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.rowFormat.name=\u0424\u043E\u0440\u043C\u0430\u0442 \u0441\u0442\u0440\u043E\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.updateTime.name=\u0412\u0440\u0435\u043C\u044F \u043E\u0431\u043D\u043E\u0432\u043B\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTable$AdditionalInfo.checkTime.name=\u0412\u0440\u0435\u043C\u044F \u043F\u0440\u043E\u0432\u0435\u0440\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.fullTypeName.name=\u0422\u0438\u043F \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.maxLength.name=\u0414\u043B\u0438\u043D\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.required.name=Not NULL
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.autoGenerated.name=\u0410\u0432\u0442\u043E-\u0443\u0432\u0435\u043B\u0438\u0447\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.keyType.name=\u041A\u043B\u044E\u0447
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.charset.name=\u041D\u0430\u0431\u043E\u0440 \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.collation.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.comment.name=\u041A\u043E\u043C\u043C\u0435\u043D\u0442\u0430\u0440\u0438\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.genExpression.name=\u0412\u044B\u0440\u0430\u0436\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.extraInfo.name=\u0414\u043E\u043F\u043E\u043B\u043D\u0438\u0442\u0435\u043B\u044C\u043D\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableColumn.genExpression.description=\u0412\u044B\u0440\u0430\u0436\u0435\u043D\u0438\u0435 \u0444\u043E\u0440\u043C\u0438\u0440\u043E\u0432\u0430\u043D\u0438\u044F \u0432\u0438\u0440\u0442\u0443\u0430\u043B\u044C\u043D\u043E\u0439 \u043A\u043E\u043B\u043E\u043D\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableConstraintColumn.ordinalPosition.name=\u041F\u043E\u0437\u0438\u0446\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableConstraintColumn.attribute.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndex.unique.name=\u0423\u043D\u0438\u043A\u0430\u043B\u044C\u043D\u044B\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndex.description.name=\u041A\u043E\u043C\u043C\u0435\u043D\u0442\u0430\u0440\u0438\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndexColumn.ordinalPosition.name=\u041F\u043E\u0437\u0438\u0446\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndexColumn.ascending.name=\u041F\u043E \u0432\u043E\u0437\u0440\u0430\u0441\u0442\u0430\u043D\u0438\u044E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndexColumn.tableColumn.name=\u041A\u043E\u043B\u043E\u043D\u043A\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndexColumn.nullable.name=\u0412\u043E\u0437\u043C\u043E\u0436\u0435\u043D NULL
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndex.cardinality.name=\u041C\u043E\u0449\u043D\u043E\u0441\u0442\u044C
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndex.additionalInfo.name=\u0414\u043E\u043F. \u0438\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableIndexColumn.subPart.name=\u0414\u043E\u043F\u043E\u043B\u043D\u0438\u0442\u0435\u043B\u044C\u043D\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTrigger.table.name=\u0422\u0430\u0431\u043B\u0438\u0446\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTrigger.charsetClient.name=\u041A\u043E\u0434\u0438\u0440\u043E\u0432\u043A\u0430 \u043A\u043B\u0438\u0435\u043D\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTrigger.sqlMode.name=SQL \u0440\u0435\u0436\u0438\u043C
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTrigger.objectDefinitionText.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLUser.name.name=\u0418\u043C\u044F \u043F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLUser.host.name=\u041C\u0430\u0441\u043A\u0430 \u0445\u043E\u0441\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView$AdditionalInfo.checkOption.name=\u041E\u043F\u0446\u0438\u044F \u043F\u0440\u043E\u0432\u0435\u0440\u043A\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView$AdditionalInfo.definition.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView$AdditionalInfo.updatable.name=\u041E\u0431\u043D\u043E\u0432\u043B\u044F\u0435\u043C\u044B\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView$AdditionalInfo.definer.name=\u0412\u043B\u0430\u0434\u0435\u043B\u0435\u0446
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView$AdditionalInfo.algorithm.name=\u0410\u043B\u0433\u043E\u0440\u0438\u0442\u043C
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView.name.name=\u041D\u0430\u0437\u0432\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLView.definition.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.id.description=\u0418\u0434\u0435\u043D\u0442\u0438\u0444\u0438\u043A\u0430\u0442\u043E\u0440 SELECT'\u0430. \u042D\u0442\u043E \u043F\u043E\u0441\u043B\u0435\u0434\u043E\u0432\u0430\u0442\u0435\u043B\u044C\u043D\u044B\u0439 \u043D\u043E\u043C\u0435\u0440 SELECT'\u0430 \u0432 \u0437\u0430\u043F\u0440\u043E\u0441\u0435.
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.selectType.name=\u0422\u0438\u043F \u0441\u0435\u043B\u0435\u043A\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.selectType.description=\u0422\u0438\u043F \u0441\u0435\u043B\u0435\u043A\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.table.name=\u0422\u0430\u0431\u043B\u0438\u0446\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.nodeType.name=\u0422\u0438\u043F
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.nodeType.description=\u0422\u0438\u043F
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.possibleKeys.name=\u0412\u043E\u0437\u043C\u043E\u0436\u043D\u044B\u0435 \u043A\u043B\u044E\u0447\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.possibleKeys.description=\u041E\u0431\u043E\u0437\u043D\u0430\u0447\u0430\u0435\u0442, \u043A\u0430\u043A\u0438\u0435 \u0438\u043D\u0434\u0435\u043A\u0441\u044B \u043C\u043E\u0436\u0435\u0442 \u0438\u0441\u043F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u044C MySQL \u043F\u0440\u0438 \u043F\u043E\u0438\u0441\u043A\u0435 \u0437\u0430\u043F\u0438\u0441\u0435\u0439 \u0432 \u044D\u0442\u043E\u0439 \u0442\u0430\u0431\u043B\u0438\u0446\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.key.name=\u041A\u043B\u044E\u0447
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.key.description=\u041A\u043B\u044E\u0447 (\u0438\u043D\u0434\u0435\u043A\u0441), \u043A\u043E\u0442\u043E\u0440\u044B\u0439 \u0438\u0441\u043F\u043E\u043B\u044C\u0437\u0443\u0435\u0442 MySQL
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.keyLength.name=\u0414\u043B\u0438\u043D\u0430 \u043A\u043B\u044E\u0447\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.keyLength.description=\u0414\u043B\u0438\u043D\u0430 \u043A\u043B\u044E\u0447\u0430, \u043A\u043E\u0442\u043E\u0440\u044B\u0439 \u0438\u0441\u043F\u043E\u043B\u044C\u0437\u0443\u0435\u0442 MySQL
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.ref.description=\u041F\u043E\u043A\u0430\u0437\u044B\u0432\u0430\u0435\u0442, \u043A\u0430\u043A\u0438\u0435 \u043A\u043E\u043B\u043E\u043D\u043A\u0438 \u0438\u043B\u0438 \u043A\u043E\u043D\u0441\u0442\u0430\u043D\u0442\u044B \u0441\u0440\u0430\u0432\u043D\u0438\u0432\u0430\u044E\u0442\u0441\u044F \u0441 \u0438\u043C\u0435\u043D\u0430\u043C\u0438 \u0432 \u0438\u043D\u0434\u0435\u043A\u0441\u0435 \u043A\u043B\u044E\u0447\u0435\u0432\u043E\u0439 \u043A\u043E\u043B\u043E\u043D\u043A\u0438 \u043F\u0440\u0438 \u0432\u044B\u0431\u043E\u0440\u043A\u0435 \u0437\u0430\u043F\u0438\u0441\u0435\u0439 \u0438\u0437 \u0442\u0430\u0431\u043B\u0438\u0446\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.rowCount.name=\u0417\u0430\u043F\u0438\u0441\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.filtered.name=\u041E\u0442\u0444\u0438\u043B\u044C\u0442\u0440\u043E\u0432\u0430\u043D\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.filtered.description=\u041E\u0446\u0435\u043D\u043E\u0447\u043D\u044B\u0439 \u043F\u0440\u043E\u0446\u0435\u043D\u0442 \u0447\u0438\u0441\u043B\u0430 \u0437\u0430\u043F\u0438\u0441\u0435\u0439 \u0442\u0430\u0431\u043B\u0438\u0446\u044B, \u043A\u043E\u0442\u043E\u0440\u044B\u0435 \u0431\u0443\u0434\u0443\u0442 \u043E\u0442\u0444\u0438\u043B\u044C\u0442\u0440\u043E\u0432\u0430\u043D\u044B \u0443\u0441\u043B\u043E\u0432\u0438\u0435\u043C \u0442\u0430\u0431\u043B\u0438\u0446\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.plan.MySQLPlanNodePlain.extra.description=\u0414\u043E\u043F\u043E\u043B\u043D\u0438\u0442\u0435\u043B\u044C\u043D\u0430\u044F \u0438\u043D\u0444\u043E\u0440\u043C\u0430\u0446\u0438\u044F \u043E \u0442\u043E\u043C, \u043A\u0430\u043A MySQL \u0440\u0430\u0437\u0440\u0435\u0448\u0430\u0435\u0442 \u0437\u0430\u043F\u0440\u043E\u0441
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.pid.description=\u0418\u0434\u0435\u043D\u0442\u0438\u0444\u0438\u043A\u0430\u0442\u043E\u0440 \u043F\u0440\u043E\u0446\u0435\u0441\u0441\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.user.name=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044C
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.user.description=\u041F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044C \u0431\u0430\u0437\u044B \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.host.name=\u0425\u043E\u0441\u0442
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.host.description=\u0423\u0434\u0430\u043B\u0451\u043D\u043D\u044B\u0439 \u0445\u043E\u0441\u0442
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.db.name=\u0411\u0430\u0437\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.db.description=\u0411\u0430\u0437\u0430 \u0434\u0430\u043D\u043D\u044B\u0445
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.command.name=\u041A\u043E\u043C\u0430\u043D\u0434\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.command.description=\u0422\u0435\u043A\u0443\u0449\u0430\u044F \u043A\u043E\u043C\u0430\u043D\u0434\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.time.name=\u0412\u0440\u0435\u043C\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.time.description=\u0412\u0440\u0435\u043C\u044F \u0441\u0442\u0430\u0440\u0442\u0430 \u043A\u043E\u043C\u0430\u043D\u0434\u044B
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.state.name=\u0421\u043E\u0441\u0442\u043E\u044F\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.session.MySQLSession.state.description=\u0421\u043E\u0441\u0442\u043E\u044F\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPackage.name.name=\u041D\u0430\u0437\u0432\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLPackage.description.name=\u041D\u0430\u0437\u0432\u0430\u043D\u0438\u0435 \u043F\u0430\u043A\u0435\u0442\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.name.name=\u041D\u0430\u0437\u0432\u0430\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.name.description=\u041D\u0430\u0437\u0432\u0430\u043D\u0438\u0435 \u0441\u043E\u0431\u044B\u0442\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.definer.name=\u0412\u043B\u0430\u0434\u0435\u043B\u0435\u0446
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.definer.description=\u0423\u0447\u0435\u0442\u043D\u0430\u044F \u0437\u0430\u043F\u0438\u0441\u044C \u043F\u043E\u043B\u044C\u0437\u043E\u0432\u0430\u0442\u0435\u043B\u044F, \u0441\u043E\u0437\u0434\u0430\u0432\u0448\u0435\u0433\u043E \u0441\u043E\u0431\u044B\u0442\u0438\u0435, \u0432 \u0444\u043E\u0440\u043C\u0430\u0442\u0435 'user_name'@'host_name'.
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.timeZone.name=\u0427\u0430\u0441\u043E\u0432\u043E\u0439 \u043F\u043E\u044F\u0441
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.eventBody.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.eventDefinition.name=\u041E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.eventType.name=\u0422\u0438\u043F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.executeAt.name=\u0412\u044B\u043F\u043E\u043B\u043D\u0438\u0442\u044C \u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.intervalValue.name=\u0417\u043D\u0430\u0447\u0435\u043D\u0438\u0435 \u0438\u043D\u0442\u0435\u0440\u0432\u0430\u043B\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.intervalField.name=\u041F\u043E\u043B\u0435 \u0438\u043D\u0442\u0435\u0440\u0432\u0430\u043B\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.sqlMode.name=\u0420\u0435\u0436\u0438\u043C SQL
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.starts.name=\u041D\u0430\u0447\u0430\u043B\u043E
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.ends.name=\u0417\u0430\u0432\u0435\u0440\u0448\u0435\u043D\u0438\u0435
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.status.name=\u0421\u0442\u0430\u0442\u0443\u0441
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.onCompletion.name=\u041F\u043E \u0437\u0430\u0432\u0435\u0440\u0448\u0435\u043D\u0438\u0438
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.created.name=\u0412\u0440\u0435\u043C\u044F \u0441\u043E\u0437\u0434\u0430\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.lastAltered.name=\u0412\u0440\u0435\u043C\u044F \u0438\u0437\u043C\u0435\u043D\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.lastExecuted.name=\u0412\u0440\u0435\u043C\u044F \u0432\u044B\u043F\u043E\u043B\u043D\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.description.name=\u041A\u043E\u043C\u043C\u0435\u043D\u0442\u0430\u0440\u0438\u0439
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.originator.name=ID \u0441\u0435\u0440\u0432\u0435\u0440\u0430
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.characterSetClient.name=\u041D\u0430\u0431\u043E\u0440 \u0441\u0438\u043C\u0432\u043E\u043B\u043E\u0432
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.collationConnection.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435 (\u043F\u043E\u0434\u043A\u043B\u044E\u0447\u0435\u043D\u0438\u0435)
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLEvent.databaseCollation.name=\u0421\u043E\u043F\u043E\u0441\u0442\u0430\u0432\u043B\u0435\u043D\u0438\u0435 (\u0431\u0430\u0437\u0430 \u0434\u0430\u043D\u043D\u044B\u0445)
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableConstraint.checkClause.name = \u0423\u0441\u043B\u043E\u0432\u0438\u0435 \u043E\u0433\u0440\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F
meta.org.jkiss.dbeaver.ext.mysql.model.MySQLTableConstraint.checkClause.description = \u041F\u0440\u043E\u0432\u0435\u0440\u044F\u0435\u0442, \u0441\u043E\u043E\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0443\u044E\u0442 \u043B\u0438 \u0434\u0430\u043D\u043D\u044B\u0435 \u0437\u0430\u0434\u0430\u043D\u043D\u043E\u043C\u0443 \u0443\u0441\u043B\u043E\u0432\u0438\u044E.
| {
"pile_set_name": "Github"
} |
import wx
import glob

class DemoFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, -1,
                          "wx.ListCtrl in wx.LC_SMALL_ICON mode",
                          size=(600, 400))

        # load some images into an image list
        il = wx.ImageList(16, 16, True)
        il_max = -1  # stays -1 if no matching icon files are found
        for name in glob.glob("smicon??.png"):
            bmp = wx.Bitmap(name, wx.BITMAP_TYPE_PNG)
            il_max = il.Add(bmp)

        # create the list control
        self.list = wx.ListCtrl(self, -1,
                                style=wx.LC_SMALL_ICON
                                      | wx.LC_AUTOARRANGE
                                )

        # assign the image list to it
        self.list.AssignImageList(il, wx.IMAGE_LIST_SMALL)

        # create some items for the list
        for x in range(25):
            # use -1 (no image) when the image list is empty
            img = x % (il_max + 1) if il_max >= 0 else -1
            self.list.InsertImageStringItem(x,
                                            "This is item %02d" % x,
                                            img)

app = wx.PySimpleApp()
frame = DemoFrame()
frame.Show()
app.MainLoop()
| {
"pile_set_name": "Github"
} |
/*
* Copyright 2017 LINE Corporation
*
* LINE Corporation licenses this file to you under the Apache License,
* version 2.0 (the "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at:
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations
* under the License.
*/
package com.linecorp.centraldogma.internal.api.v1;
import static com.linecorp.centraldogma.internal.Util.validateProjectName;
import java.util.Set;
import javax.annotation.Nullable;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.MoreObjects;
import com.google.common.collect.ImmutableSet;
/**
* A request to create a new project.
*/
public class CreateProjectRequest {
private final String name;
private final Set<String> owners;
private final Set<String> members;
@JsonCreator
public CreateProjectRequest(@JsonProperty("name") String name,
@JsonProperty("owners") @Nullable Set<String> owners,
@JsonProperty("members") @Nullable Set<String> members) {
this.name = validateProjectName(name, "name");
this.owners = owners != null ? ImmutableSet.copyOf(owners) : ImmutableSet.of();
this.members = members != null ? ImmutableSet.copyOf(members) : ImmutableSet.of();
}
@JsonProperty
public String name() {
return name;
}
@JsonProperty
public Set<String> owners() {
return owners;
}
@JsonProperty
public Set<String> members() {
return members;
}
@Override
public String toString() {
return MoreObjects.toStringHelper(this)
.add("name", name())
.add("owners", owners())
.add("members", members())
.toString();
}
}
| {
"pile_set_name": "Github"
} |
/*
<documentation>
<summary>Read error log with filters</summary>
<returns>2 result sets with detailed info from error log.</returns>
<issues>No</issues>
<author>Konstantin Taranov</author>
<created>2019-06-27</created>
<modified>2019-08-27 by Konstantin Taranov</modified>
<version>1.1</version>
<sourceLink>https://github.com/ktaranov/sqlserver-kit/blob/master/Scripts/Jobs_Detailed_History.sql</sourceLink>
<originalLink>https://thelonedba.wordpress.com/2019/08/16/reading-sql-server-error-logs/</originalLink>
</documentation>
*/
DECLARE @daysback INT = 0; -- number of days to go back in the logs. 0 = today only
-- table variable for holding the details of the error logs.
-- Yes, I know, table variables are evil. This one is unlikely to hold more than a few dozen rather narrow lines
DECLARE @ErrorLogs TABLE
(
Archive INT NOT NULL,
LogDate DATETIME NOT NULL,
LogFileSizeBytes BIGINT NOT NULL,
ReadThis TINYINT NULL
);
-- useful trick if you don't know it: INSERT INTO a table the results of EXECing a SP.
INSERT INTO @ErrorLogs
(
Archive,
LogDate,
LogFileSizeBytes
)
EXEC sys.sp_enumerrorlogs;
DECLARE @lognum INT = 0;
DECLARE @logdate DATETIME;
--figure out which logfiles we need.
WITH NextLog
AS (SELECT Archive,
LogDate,
LogFileSizeBytes,
ReadThis,
ISNULL(LAG(LogDate) OVER (ORDER BY Archive), LogDate) AS nextlogdate
FROM @ErrorLogs)
UPDATE @ErrorLogs
SET ReadThis = CASE
WHEN e.Archive = 1 THEN
1 -- always read the first file; doesn't always get identified by the next line
WHEN n.nextlogdate >= DATEADD(DAY, DATEDIFF(DAY, '20100101', GETDATE()) - ABS(@daysback), '20100101') THEN
1
ELSE
0
END
FROM NextLog AS n
INNER JOIN @ErrorLogs AS e
ON e.Archive = n.Archive;
--just checking which files we're looking at... Probably comment this line out for production use
SELECT *
FROM @ErrorLogs
ORDER BY Archive;
IF OBJECT_ID('tempdb.dbo.#spErrorLog', 'U') IS NOT NULL
BEGIN
DROP TABLE #spErrorLog;
END;
CREATE TABLE #spErrorLog
(
logdate DATETIME NOT NULL,
ProcessInfo VARCHAR(20) NULL,
Text NVARCHAR(MAX) NOT NULL
);
WHILE @lognum <= (SELECT MAX(Archive) FROM @ErrorLogs WHERE ReadThis = 1)
BEGIN
INSERT INTO #spErrorLog
EXEC sys.sp_readerrorlog @p1 = @lognum;
SELECT @lognum = @lognum + 1;
SELECT @logdate = LogDate
FROM @ErrorLogs
WHERE Archive = @lognum;
END;
SELECT *
FROM #spErrorLog
WHERE 1 = 1
AND
(
ProcessInfo = 'Server' -- we like server-related boot messages.
OR
( -- filter out noisy messages that we don't really need to see here
Text NOT LIKE '%Backup Log%'
AND Text NOT LIKE 'Log was backed up%'
AND Text NOT LIKE 'CHECKDB for database % finished without errors on %This is an informational message only; no user action is required.'
AND Text NOT LIKE 'DBCC CHECKDB % WITH all_errormsgs, no_infomsgs, data_purity executed by % found 0 errors and repaired 0 errors.%'
AND Text NOT LIKE 'BACKUP DATABASE WITH DIFFERENTIAL successfully processed % pages%'
AND Text NOT LIKE 'Database differential changes were backed up. Database: % This is an informational message. No user action is required.'
AND Text NOT LIKE 'BACKUP DATABASE successfully processed % pages %'
AND Text NOT LIKE 'Database backed up. Database: % This is an informational message only. No user action is required.'
AND logdate >= DATEADD(DAY, DATEDIFF(DAY, '20100101', GETDATE()) - ABS(@daysback), '20100101')
)
)
ORDER BY logdate;
| {
"pile_set_name": "Github"
} |
-- Template updates for gameobject 1721 (Locked ball and chain)
UPDATE `gameobject_template` SET `flags`=`flags`|4 WHERE `entry`=1721;
| {
"pile_set_name": "Github"
} |
# Copyright 1999-2015 Gentoo Foundation
# Distributed under the terms of the GNU General Public License v2
EAPI=5
inherit eutils toolchain-funcs
DESCRIPTION="A Computer algebra package for Lie group computations"
HOMEPAGE="http://www-math.univ-poitiers.fr/~maavl/LiE/"
SRC_URI="http://wwwmathlabo.univ-poitiers.fr/~maavl/LiE/conLiE.tar.gz -> ${P}.tar.gz"
#### Remove the following line when moving this ebuild to the main tree!
RESTRICT="mirror"
LICENSE="LGPL-2.1"
##### See http://packages.debian.org/changelogs/pool/main/l/lie/lie_2.2.2+dfsg-1/lie.copyright
SLOT="0"
KEYWORDS="~amd64 ~x86"
IUSE="doc"
DEPEND="
sys-devel/bison
sys-libs/readline:0=
sys-libs/ncurses:0="
RDEPEND="sys-libs/readline:=
sys-libs/ncurses"
S="${WORKDIR}/LiE"
src_prepare() {
epatch \
"${FILESDIR}"/${P}-make.patch \
"${FILESDIR}"/parrallelmake-${P}.patch
}
src_compile() {
emake CC=$(tc-getCC)
}
src_install() {
default
use doc && dodoc "${S}"/manual/*
}
pkg_postinst() {
elog "This version of the LiE ebuild is still under development."
elog "Help us improve the ebuild in:"
elog "http://bugs.gentoo.org/show_bug.cgi?id=194393"
}
| {
"pile_set_name": "Github"
} |
r420 0x4f60
0x1434 SRC_Y_X
0x1438 DST_Y_X
0x143C DST_HEIGHT_WIDTH
0x146C DP_GUI_MASTER_CNTL
0x1474 BRUSH_Y_X
0x1478 DP_BRUSH_BKGD_CLR
0x147C DP_BRUSH_FRGD_CLR
0x1480 BRUSH_DATA0
0x1484 BRUSH_DATA1
0x1598 DST_WIDTH_HEIGHT
0x15C0 CLR_CMP_CNTL
0x15C4 CLR_CMP_CLR_SRC
0x15C8 CLR_CMP_CLR_DST
0x15CC CLR_CMP_MSK
0x15D8 DP_SRC_FRGD_CLR
0x15DC DP_SRC_BKGD_CLR
0x1600 DST_LINE_START
0x1604 DST_LINE_END
0x1608 DST_LINE_PATCOUNT
0x16C0 DP_CNTL
0x16CC DP_WRITE_MSK
0x16D0 DP_CNTL_XDIR_YDIR_YMAJOR
0x16E8 DEFAULT_SC_BOTTOM_RIGHT
0x16EC SC_TOP_LEFT
0x16F0 SC_BOTTOM_RIGHT
0x16F4 SRC_SC_BOTTOM_RIGHT
0x1714 DSTCACHE_CTLSTAT
0x1720 WAIT_UNTIL
0x172C RBBM_GUICNTL
0x1D98 VAP_VPORT_XSCALE
0x1D9C VAP_VPORT_XOFFSET
0x1DA0 VAP_VPORT_YSCALE
0x1DA4 VAP_VPORT_YOFFSET
0x1DA8 VAP_VPORT_ZSCALE
0x1DAC VAP_VPORT_ZOFFSET
0x2080 VAP_CNTL
0x2090 VAP_OUT_VTX_FMT_0
0x2094 VAP_OUT_VTX_FMT_1
0x20B0 VAP_VTE_CNTL
0x2138 VAP_VF_MIN_VTX_INDX
0x2140 VAP_CNTL_STATUS
0x2150 VAP_PROG_STREAM_CNTL_0
0x2154 VAP_PROG_STREAM_CNTL_1
0x2158 VAP_PROG_STREAM_CNTL_2
0x215C VAP_PROG_STREAM_CNTL_3
0x2160 VAP_PROG_STREAM_CNTL_4
0x2164 VAP_PROG_STREAM_CNTL_5
0x2168 VAP_PROG_STREAM_CNTL_6
0x216C VAP_PROG_STREAM_CNTL_7
0x2180 VAP_VTX_STATE_CNTL
0x2184 VAP_VSM_VTX_ASSM
0x2188 VAP_VTX_STATE_IND_REG_0
0x218C VAP_VTX_STATE_IND_REG_1
0x2190 VAP_VTX_STATE_IND_REG_2
0x2194 VAP_VTX_STATE_IND_REG_3
0x2198 VAP_VTX_STATE_IND_REG_4
0x219C VAP_VTX_STATE_IND_REG_5
0x21A0 VAP_VTX_STATE_IND_REG_6
0x21A4 VAP_VTX_STATE_IND_REG_7
0x21A8 VAP_VTX_STATE_IND_REG_8
0x21AC VAP_VTX_STATE_IND_REG_9
0x21B0 VAP_VTX_STATE_IND_REG_10
0x21B4 VAP_VTX_STATE_IND_REG_11
0x21B8 VAP_VTX_STATE_IND_REG_12
0x21BC VAP_VTX_STATE_IND_REG_13
0x21C0 VAP_VTX_STATE_IND_REG_14
0x21C4 VAP_VTX_STATE_IND_REG_15
0x21DC VAP_PSC_SGN_NORM_CNTL
0x21E0 VAP_PROG_STREAM_CNTL_EXT_0
0x21E4 VAP_PROG_STREAM_CNTL_EXT_1
0x21E8 VAP_PROG_STREAM_CNTL_EXT_2
0x21EC VAP_PROG_STREAM_CNTL_EXT_3
0x21F0 VAP_PROG_STREAM_CNTL_EXT_4
0x21F4 VAP_PROG_STREAM_CNTL_EXT_5
0x21F8 VAP_PROG_STREAM_CNTL_EXT_6
0x21FC VAP_PROG_STREAM_CNTL_EXT_7
0x2200 VAP_PVS_VECTOR_INDX_REG
0x2204 VAP_PVS_VECTOR_DATA_REG
0x2208 VAP_PVS_VECTOR_DATA_REG_128
0x221C VAP_CLIP_CNTL
0x2220 VAP_GB_VERT_CLIP_ADJ
0x2224 VAP_GB_VERT_DISC_ADJ
0x2228 VAP_GB_HORZ_CLIP_ADJ
0x222C VAP_GB_HORZ_DISC_ADJ
0x2230 VAP_PVS_FLOW_CNTL_ADDRS_0
0x2234 VAP_PVS_FLOW_CNTL_ADDRS_1
0x2238 VAP_PVS_FLOW_CNTL_ADDRS_2
0x223C VAP_PVS_FLOW_CNTL_ADDRS_3
0x2240 VAP_PVS_FLOW_CNTL_ADDRS_4
0x2244 VAP_PVS_FLOW_CNTL_ADDRS_5
0x2248 VAP_PVS_FLOW_CNTL_ADDRS_6
0x224C VAP_PVS_FLOW_CNTL_ADDRS_7
0x2250 VAP_PVS_FLOW_CNTL_ADDRS_8
0x2254 VAP_PVS_FLOW_CNTL_ADDRS_9
0x2258 VAP_PVS_FLOW_CNTL_ADDRS_10
0x225C VAP_PVS_FLOW_CNTL_ADDRS_11
0x2260 VAP_PVS_FLOW_CNTL_ADDRS_12
0x2264 VAP_PVS_FLOW_CNTL_ADDRS_13
0x2268 VAP_PVS_FLOW_CNTL_ADDRS_14
0x226C VAP_PVS_FLOW_CNTL_ADDRS_15
0x2284 VAP_PVS_STATE_FLUSH_REG
0x2288 VAP_PVS_VTX_TIMEOUT_REG
0x2290 VAP_PVS_FLOW_CNTL_LOOP_INDEX_0
0x2294 VAP_PVS_FLOW_CNTL_LOOP_INDEX_1
0x2298 VAP_PVS_FLOW_CNTL_LOOP_INDEX_2
0x229C VAP_PVS_FLOW_CNTL_LOOP_INDEX_3
0x22A0 VAP_PVS_FLOW_CNTL_LOOP_INDEX_4
0x22A4 VAP_PVS_FLOW_CNTL_LOOP_INDEX_5
0x22A8 VAP_PVS_FLOW_CNTL_LOOP_INDEX_6
0x22AC VAP_PVS_FLOW_CNTL_LOOP_INDEX_7
0x22B0 VAP_PVS_FLOW_CNTL_LOOP_INDEX_8
0x22B4 VAP_PVS_FLOW_CNTL_LOOP_INDEX_9
0x22B8 VAP_PVS_FLOW_CNTL_LOOP_INDEX_10
0x22BC VAP_PVS_FLOW_CNTL_LOOP_INDEX_11
0x22C0 VAP_PVS_FLOW_CNTL_LOOP_INDEX_12
0x22C4 VAP_PVS_FLOW_CNTL_LOOP_INDEX_13
0x22C8 VAP_PVS_FLOW_CNTL_LOOP_INDEX_14
0x22CC VAP_PVS_FLOW_CNTL_LOOP_INDEX_15
0x22D0 VAP_PVS_CODE_CNTL_0
0x22D4 VAP_PVS_CONST_CNTL
0x22D8 VAP_PVS_CODE_CNTL_1
0x22DC VAP_PVS_FLOW_CNTL_OPC
0x342C RB2D_DSTCACHE_CTLSTAT
0x4000 GB_VAP_RASTER_VTX_FMT_0
0x4004 GB_VAP_RASTER_VTX_FMT_1
0x4008 GB_ENABLE
0x4010 GB_MSPOS0
0x4014 GB_MSPOS1
0x401C GB_SELECT
0x4020 GB_AA_CONFIG
0x4024 GB_FIFO_SIZE
0x4100 TX_INVALTAGS
0x4200 GA_POINT_S0
0x4204 GA_POINT_T0
0x4208 GA_POINT_S1
0x420C GA_POINT_T1
0x4214 GA_TRIANGLE_STIPPLE
0x421C GA_POINT_SIZE
0x4230 GA_POINT_MINMAX
0x4234 GA_LINE_CNTL
0x4238 GA_LINE_STIPPLE_CONFIG
0x4260 GA_LINE_STIPPLE_VALUE
0x4264 GA_LINE_S0
0x4268 GA_LINE_S1
0x4278 GA_COLOR_CONTROL
0x427C GA_SOLID_RG
0x4280 GA_SOLID_BA
0x4288 GA_POLY_MODE
0x428C GA_ROUND_MODE
0x4290 GA_OFFSET
0x4294 GA_FOG_SCALE
0x4298 GA_FOG_OFFSET
0x42A0 SU_TEX_WRAP
0x42A4 SU_POLY_OFFSET_FRONT_SCALE
0x42A8 SU_POLY_OFFSET_FRONT_OFFSET
0x42AC SU_POLY_OFFSET_BACK_SCALE
0x42B0 SU_POLY_OFFSET_BACK_OFFSET
0x42B4 SU_POLY_OFFSET_ENABLE
0x42B8 SU_CULL_MODE
0x42C0 SU_DEPTH_SCALE
0x42C4 SU_DEPTH_OFFSET
0x42C8 SU_REG_DEST
0x4300 RS_COUNT
0x4304 RS_INST_COUNT
0x4310 RS_IP_0
0x4314 RS_IP_1
0x4318 RS_IP_2
0x431C RS_IP_3
0x4320 RS_IP_4
0x4324 RS_IP_5
0x4328 RS_IP_6
0x432C RS_IP_7
0x4330 RS_INST_0
0x4334 RS_INST_1
0x4338 RS_INST_2
0x433C RS_INST_3
0x4340 RS_INST_4
0x4344 RS_INST_5
0x4348 RS_INST_6
0x434C RS_INST_7
0x4350 RS_INST_8
0x4354 RS_INST_9
0x4358 RS_INST_10
0x435C RS_INST_11
0x4360 RS_INST_12
0x4364 RS_INST_13
0x4368 RS_INST_14
0x436C RS_INST_15
0x43A8 SC_EDGERULE
0x43B0 SC_CLIP_0_A
0x43B4 SC_CLIP_0_B
0x43B8 SC_CLIP_1_A
0x43BC SC_CLIP_1_B
0x43C0 SC_CLIP_2_A
0x43C4 SC_CLIP_2_B
0x43C8 SC_CLIP_3_A
0x43CC SC_CLIP_3_B
0x43D0 SC_CLIP_RULE
0x43E0 SC_SCISSOR0
0x43E8 SC_SCREENDOOR
0x4440 TX_FILTER1_0
0x4444 TX_FILTER1_1
0x4448 TX_FILTER1_2
0x444C TX_FILTER1_3
0x4450 TX_FILTER1_4
0x4454 TX_FILTER1_5
0x4458 TX_FILTER1_6
0x445C TX_FILTER1_7
0x4460 TX_FILTER1_8
0x4464 TX_FILTER1_9
0x4468 TX_FILTER1_10
0x446C TX_FILTER1_11
0x4470 TX_FILTER1_12
0x4474 TX_FILTER1_13
0x4478 TX_FILTER1_14
0x447C TX_FILTER1_15
0x4580 TX_CHROMA_KEY_0
0x4584 TX_CHROMA_KEY_1
0x4588 TX_CHROMA_KEY_2
0x458C TX_CHROMA_KEY_3
0x4590 TX_CHROMA_KEY_4
0x4594 TX_CHROMA_KEY_5
0x4598 TX_CHROMA_KEY_6
0x459C TX_CHROMA_KEY_7
0x45A0 TX_CHROMA_KEY_8
0x45A4 TX_CHROMA_KEY_9
0x45A8 TX_CHROMA_KEY_10
0x45AC TX_CHROMA_KEY_11
0x45B0 TX_CHROMA_KEY_12
0x45B4 TX_CHROMA_KEY_13
0x45B8 TX_CHROMA_KEY_14
0x45BC TX_CHROMA_KEY_15
0x45C0 TX_BORDER_COLOR_0
0x45C4 TX_BORDER_COLOR_1
0x45C8 TX_BORDER_COLOR_2
0x45CC TX_BORDER_COLOR_3
0x45D0 TX_BORDER_COLOR_4
0x45D4 TX_BORDER_COLOR_5
0x45D8 TX_BORDER_COLOR_6
0x45DC TX_BORDER_COLOR_7
0x45E0 TX_BORDER_COLOR_8
0x45E4 TX_BORDER_COLOR_9
0x45E8 TX_BORDER_COLOR_10
0x45EC TX_BORDER_COLOR_11
0x45F0 TX_BORDER_COLOR_12
0x45F4 TX_BORDER_COLOR_13
0x45F8 TX_BORDER_COLOR_14
0x45FC TX_BORDER_COLOR_15
0x4600 US_CONFIG
0x4604 US_PIXSIZE
0x4608 US_CODE_OFFSET
0x460C US_RESET
0x4610 US_CODE_ADDR_0
0x4614 US_CODE_ADDR_1
0x4618 US_CODE_ADDR_2
0x461C US_CODE_ADDR_3
0x4620 US_TEX_INST_0
0x4624 US_TEX_INST_1
0x4628 US_TEX_INST_2
0x462C US_TEX_INST_3
0x4630 US_TEX_INST_4
0x4634 US_TEX_INST_5
0x4638 US_TEX_INST_6
0x463C US_TEX_INST_7
0x4640 US_TEX_INST_8
0x4644 US_TEX_INST_9
0x4648 US_TEX_INST_10
0x464C US_TEX_INST_11
0x4650 US_TEX_INST_12
0x4654 US_TEX_INST_13
0x4658 US_TEX_INST_14
0x465C US_TEX_INST_15
0x4660 US_TEX_INST_16
0x4664 US_TEX_INST_17
0x4668 US_TEX_INST_18
0x466C US_TEX_INST_19
0x4670 US_TEX_INST_20
0x4674 US_TEX_INST_21
0x4678 US_TEX_INST_22
0x467C US_TEX_INST_23
0x4680 US_TEX_INST_24
0x4684 US_TEX_INST_25
0x4688 US_TEX_INST_26
0x468C US_TEX_INST_27
0x4690 US_TEX_INST_28
0x4694 US_TEX_INST_29
0x4698 US_TEX_INST_30
0x469C US_TEX_INST_31
0x46A4 US_OUT_FMT_0
0x46A8 US_OUT_FMT_1
0x46AC US_OUT_FMT_2
0x46B0 US_OUT_FMT_3
0x46B4 US_W_FMT
0x46B8 US_CODE_BANK
0x46BC US_CODE_EXT
0x46C0 US_ALU_RGB_ADDR_0
0x46C4 US_ALU_RGB_ADDR_1
0x46C8 US_ALU_RGB_ADDR_2
0x46CC US_ALU_RGB_ADDR_3
0x46D0 US_ALU_RGB_ADDR_4
0x46D4 US_ALU_RGB_ADDR_5
0x46D8 US_ALU_RGB_ADDR_6
0x46DC US_ALU_RGB_ADDR_7
0x46E0 US_ALU_RGB_ADDR_8
0x46E4 US_ALU_RGB_ADDR_9
0x46E8 US_ALU_RGB_ADDR_10
0x46EC US_ALU_RGB_ADDR_11
0x46F0 US_ALU_RGB_ADDR_12
0x46F4 US_ALU_RGB_ADDR_13
0x46F8 US_ALU_RGB_ADDR_14
0x46FC US_ALU_RGB_ADDR_15
0x4700 US_ALU_RGB_ADDR_16
0x4704 US_ALU_RGB_ADDR_17
0x4708 US_ALU_RGB_ADDR_18
0x470C US_ALU_RGB_ADDR_19
0x4710 US_ALU_RGB_ADDR_20
0x4714 US_ALU_RGB_ADDR_21
0x4718 US_ALU_RGB_ADDR_22
0x471C US_ALU_RGB_ADDR_23
0x4720 US_ALU_RGB_ADDR_24
0x4724 US_ALU_RGB_ADDR_25
0x4728 US_ALU_RGB_ADDR_26
0x472C US_ALU_RGB_ADDR_27
0x4730 US_ALU_RGB_ADDR_28
0x4734 US_ALU_RGB_ADDR_29
0x4738 US_ALU_RGB_ADDR_30
0x473C US_ALU_RGB_ADDR_31
0x4740 US_ALU_RGB_ADDR_32
0x4744 US_ALU_RGB_ADDR_33
0x4748 US_ALU_RGB_ADDR_34
0x474C US_ALU_RGB_ADDR_35
0x4750 US_ALU_RGB_ADDR_36
0x4754 US_ALU_RGB_ADDR_37
0x4758 US_ALU_RGB_ADDR_38
0x475C US_ALU_RGB_ADDR_39
0x4760 US_ALU_RGB_ADDR_40
0x4764 US_ALU_RGB_ADDR_41
0x4768 US_ALU_RGB_ADDR_42
0x476C US_ALU_RGB_ADDR_43
0x4770 US_ALU_RGB_ADDR_44
0x4774 US_ALU_RGB_ADDR_45
0x4778 US_ALU_RGB_ADDR_46
0x477C US_ALU_RGB_ADDR_47
0x4780 US_ALU_RGB_ADDR_48
0x4784 US_ALU_RGB_ADDR_49
0x4788 US_ALU_RGB_ADDR_50
0x478C US_ALU_RGB_ADDR_51
0x4790 US_ALU_RGB_ADDR_52
0x4794 US_ALU_RGB_ADDR_53
0x4798 US_ALU_RGB_ADDR_54
0x479C US_ALU_RGB_ADDR_55
0x47A0 US_ALU_RGB_ADDR_56
0x47A4 US_ALU_RGB_ADDR_57
0x47A8 US_ALU_RGB_ADDR_58
0x47AC US_ALU_RGB_ADDR_59
0x47B0 US_ALU_RGB_ADDR_60
0x47B4 US_ALU_RGB_ADDR_61
0x47B8 US_ALU_RGB_ADDR_62
0x47BC US_ALU_RGB_ADDR_63
0x47C0 US_ALU_ALPHA_ADDR_0
0x47C4 US_ALU_ALPHA_ADDR_1
0x47C8 US_ALU_ALPHA_ADDR_2
0x47CC US_ALU_ALPHA_ADDR_3
0x47D0 US_ALU_ALPHA_ADDR_4
0x47D4 US_ALU_ALPHA_ADDR_5
0x47D8 US_ALU_ALPHA_ADDR_6
0x47DC US_ALU_ALPHA_ADDR_7
0x47E0 US_ALU_ALPHA_ADDR_8
0x47E4 US_ALU_ALPHA_ADDR_9
0x47E8 US_ALU_ALPHA_ADDR_10
0x47EC US_ALU_ALPHA_ADDR_11
0x47F0 US_ALU_ALPHA_ADDR_12
0x47F4 US_ALU_ALPHA_ADDR_13
0x47F8 US_ALU_ALPHA_ADDR_14
0x47FC US_ALU_ALPHA_ADDR_15
0x4800 US_ALU_ALPHA_ADDR_16
0x4804 US_ALU_ALPHA_ADDR_17
0x4808 US_ALU_ALPHA_ADDR_18
0x480C US_ALU_ALPHA_ADDR_19
0x4810 US_ALU_ALPHA_ADDR_20
0x4814 US_ALU_ALPHA_ADDR_21
0x4818 US_ALU_ALPHA_ADDR_22
0x481C US_ALU_ALPHA_ADDR_23
0x4820 US_ALU_ALPHA_ADDR_24
0x4824 US_ALU_ALPHA_ADDR_25
0x4828 US_ALU_ALPHA_ADDR_26
0x482C US_ALU_ALPHA_ADDR_27
0x4830 US_ALU_ALPHA_ADDR_28
0x4834 US_ALU_ALPHA_ADDR_29
0x4838 US_ALU_ALPHA_ADDR_30
0x483C US_ALU_ALPHA_ADDR_31
0x4840 US_ALU_ALPHA_ADDR_32
0x4844 US_ALU_ALPHA_ADDR_33
0x4848 US_ALU_ALPHA_ADDR_34
0x484C US_ALU_ALPHA_ADDR_35
0x4850 US_ALU_ALPHA_ADDR_36
0x4854 US_ALU_ALPHA_ADDR_37
0x4858 US_ALU_ALPHA_ADDR_38
0x485C US_ALU_ALPHA_ADDR_39
0x4860 US_ALU_ALPHA_ADDR_40
0x4864 US_ALU_ALPHA_ADDR_41
0x4868 US_ALU_ALPHA_ADDR_42
0x486C US_ALU_ALPHA_ADDR_43
0x4870 US_ALU_ALPHA_ADDR_44
0x4874 US_ALU_ALPHA_ADDR_45
0x4878 US_ALU_ALPHA_ADDR_46
0x487C US_ALU_ALPHA_ADDR_47
0x4880 US_ALU_ALPHA_ADDR_48
0x4884 US_ALU_ALPHA_ADDR_49
0x4888 US_ALU_ALPHA_ADDR_50
0x488C US_ALU_ALPHA_ADDR_51
0x4890 US_ALU_ALPHA_ADDR_52
0x4894 US_ALU_ALPHA_ADDR_53
0x4898 US_ALU_ALPHA_ADDR_54
0x489C US_ALU_ALPHA_ADDR_55
0x48A0 US_ALU_ALPHA_ADDR_56
0x48A4 US_ALU_ALPHA_ADDR_57
0x48A8 US_ALU_ALPHA_ADDR_58
0x48AC US_ALU_ALPHA_ADDR_59
0x48B0 US_ALU_ALPHA_ADDR_60
0x48B4 US_ALU_ALPHA_ADDR_61
0x48B8 US_ALU_ALPHA_ADDR_62
0x48BC US_ALU_ALPHA_ADDR_63
0x48C0 US_ALU_RGB_INST_0
0x48C4 US_ALU_RGB_INST_1
0x48C8 US_ALU_RGB_INST_2
0x48CC US_ALU_RGB_INST_3
0x48D0 US_ALU_RGB_INST_4
0x48D4 US_ALU_RGB_INST_5
0x48D8 US_ALU_RGB_INST_6
0x48DC US_ALU_RGB_INST_7
0x48E0 US_ALU_RGB_INST_8
0x48E4 US_ALU_RGB_INST_9
0x48E8 US_ALU_RGB_INST_10
0x48EC US_ALU_RGB_INST_11
0x48F0 US_ALU_RGB_INST_12
0x48F4 US_ALU_RGB_INST_13
0x48F8 US_ALU_RGB_INST_14
0x48FC US_ALU_RGB_INST_15
0x4900 US_ALU_RGB_INST_16
0x4904 US_ALU_RGB_INST_17
0x4908 US_ALU_RGB_INST_18
0x490C US_ALU_RGB_INST_19
0x4910 US_ALU_RGB_INST_20
0x4914 US_ALU_RGB_INST_21
0x4918 US_ALU_RGB_INST_22
0x491C US_ALU_RGB_INST_23
0x4920 US_ALU_RGB_INST_24
0x4924 US_ALU_RGB_INST_25
0x4928 US_ALU_RGB_INST_26
0x492C US_ALU_RGB_INST_27
0x4930 US_ALU_RGB_INST_28
0x4934 US_ALU_RGB_INST_29
0x4938 US_ALU_RGB_INST_30
0x493C US_ALU_RGB_INST_31
0x4940 US_ALU_RGB_INST_32
0x4944 US_ALU_RGB_INST_33
0x4948 US_ALU_RGB_INST_34
0x494C US_ALU_RGB_INST_35
0x4950 US_ALU_RGB_INST_36
0x4954 US_ALU_RGB_INST_37
0x4958 US_ALU_RGB_INST_38
0x495C US_ALU_RGB_INST_39
0x4960 US_ALU_RGB_INST_40
0x4964 US_ALU_RGB_INST_41
0x4968 US_ALU_RGB_INST_42
0x496C US_ALU_RGB_INST_43
0x4970 US_ALU_RGB_INST_44
0x4974 US_ALU_RGB_INST_45
0x4978 US_ALU_RGB_INST_46
0x497C US_ALU_RGB_INST_47
0x4980 US_ALU_RGB_INST_48
0x4984 US_ALU_RGB_INST_49
0x4988 US_ALU_RGB_INST_50
0x498C US_ALU_RGB_INST_51
0x4990 US_ALU_RGB_INST_52
0x4994 US_ALU_RGB_INST_53
0x4998 US_ALU_RGB_INST_54
0x499C US_ALU_RGB_INST_55
0x49A0 US_ALU_RGB_INST_56
0x49A4 US_ALU_RGB_INST_57
0x49A8 US_ALU_RGB_INST_58
0x49AC US_ALU_RGB_INST_59
0x49B0 US_ALU_RGB_INST_60
0x49B4 US_ALU_RGB_INST_61
0x49B8 US_ALU_RGB_INST_62
0x49BC US_ALU_RGB_INST_63
0x49C0 US_ALU_ALPHA_INST_0
0x49C4 US_ALU_ALPHA_INST_1
0x49C8 US_ALU_ALPHA_INST_2
0x49CC US_ALU_ALPHA_INST_3
0x49D0 US_ALU_ALPHA_INST_4
0x49D4 US_ALU_ALPHA_INST_5
0x49D8 US_ALU_ALPHA_INST_6
0x49DC US_ALU_ALPHA_INST_7
0x49E0 US_ALU_ALPHA_INST_8
0x49E4 US_ALU_ALPHA_INST_9
0x49E8 US_ALU_ALPHA_INST_10
0x49EC US_ALU_ALPHA_INST_11
0x49F0 US_ALU_ALPHA_INST_12
0x49F4 US_ALU_ALPHA_INST_13
0x49F8 US_ALU_ALPHA_INST_14
0x49FC US_ALU_ALPHA_INST_15
0x4A00 US_ALU_ALPHA_INST_16
0x4A04 US_ALU_ALPHA_INST_17
0x4A08 US_ALU_ALPHA_INST_18
0x4A0C US_ALU_ALPHA_INST_19
0x4A10 US_ALU_ALPHA_INST_20
0x4A14 US_ALU_ALPHA_INST_21
0x4A18 US_ALU_ALPHA_INST_22
0x4A1C US_ALU_ALPHA_INST_23
0x4A20 US_ALU_ALPHA_INST_24
0x4A24 US_ALU_ALPHA_INST_25
0x4A28 US_ALU_ALPHA_INST_26
0x4A2C US_ALU_ALPHA_INST_27
0x4A30 US_ALU_ALPHA_INST_28
0x4A34 US_ALU_ALPHA_INST_29
0x4A38 US_ALU_ALPHA_INST_30
0x4A3C US_ALU_ALPHA_INST_31
0x4A40 US_ALU_ALPHA_INST_32
0x4A44 US_ALU_ALPHA_INST_33
0x4A48 US_ALU_ALPHA_INST_34
0x4A4C US_ALU_ALPHA_INST_35
0x4A50 US_ALU_ALPHA_INST_36
0x4A54 US_ALU_ALPHA_INST_37
0x4A58 US_ALU_ALPHA_INST_38
0x4A5C US_ALU_ALPHA_INST_39
0x4A60 US_ALU_ALPHA_INST_40
0x4A64 US_ALU_ALPHA_INST_41
0x4A68 US_ALU_ALPHA_INST_42
0x4A6C US_ALU_ALPHA_INST_43
0x4A70 US_ALU_ALPHA_INST_44
0x4A74 US_ALU_ALPHA_INST_45
0x4A78 US_ALU_ALPHA_INST_46
0x4A7C US_ALU_ALPHA_INST_47
0x4A80 US_ALU_ALPHA_INST_48
0x4A84 US_ALU_ALPHA_INST_49
0x4A88 US_ALU_ALPHA_INST_50
0x4A8C US_ALU_ALPHA_INST_51
0x4A90 US_ALU_ALPHA_INST_52
0x4A94 US_ALU_ALPHA_INST_53
0x4A98 US_ALU_ALPHA_INST_54
0x4A9C US_ALU_ALPHA_INST_55
0x4AA0 US_ALU_ALPHA_INST_56
0x4AA4 US_ALU_ALPHA_INST_57
0x4AA8 US_ALU_ALPHA_INST_58
0x4AAC US_ALU_ALPHA_INST_59
0x4AB0 US_ALU_ALPHA_INST_60
0x4AB4 US_ALU_ALPHA_INST_61
0x4AB8 US_ALU_ALPHA_INST_62
0x4ABC US_ALU_ALPHA_INST_63
0x4AC0 US_ALU_EXT_ADDR_0
0x4AC4 US_ALU_EXT_ADDR_1
0x4AC8 US_ALU_EXT_ADDR_2
0x4ACC US_ALU_EXT_ADDR_3
0x4AD0 US_ALU_EXT_ADDR_4
0x4AD4 US_ALU_EXT_ADDR_5
0x4AD8 US_ALU_EXT_ADDR_6
0x4ADC US_ALU_EXT_ADDR_7
0x4AE0 US_ALU_EXT_ADDR_8
0x4AE4 US_ALU_EXT_ADDR_9
0x4AE8 US_ALU_EXT_ADDR_10
0x4AEC US_ALU_EXT_ADDR_11
0x4AF0 US_ALU_EXT_ADDR_12
0x4AF4 US_ALU_EXT_ADDR_13
0x4AF8 US_ALU_EXT_ADDR_14
0x4AFC US_ALU_EXT_ADDR_15
0x4B00 US_ALU_EXT_ADDR_16
0x4B04 US_ALU_EXT_ADDR_17
0x4B08 US_ALU_EXT_ADDR_18
0x4B0C US_ALU_EXT_ADDR_19
0x4B10 US_ALU_EXT_ADDR_20
0x4B14 US_ALU_EXT_ADDR_21
0x4B18 US_ALU_EXT_ADDR_22
0x4B1C US_ALU_EXT_ADDR_23
0x4B20 US_ALU_EXT_ADDR_24
0x4B24 US_ALU_EXT_ADDR_25
0x4B28 US_ALU_EXT_ADDR_26
0x4B2C US_ALU_EXT_ADDR_27
0x4B30 US_ALU_EXT_ADDR_28
0x4B34 US_ALU_EXT_ADDR_29
0x4B38 US_ALU_EXT_ADDR_30
0x4B3C US_ALU_EXT_ADDR_31
0x4B40 US_ALU_EXT_ADDR_32
0x4B44 US_ALU_EXT_ADDR_33
0x4B48 US_ALU_EXT_ADDR_34
0x4B4C US_ALU_EXT_ADDR_35
0x4B50 US_ALU_EXT_ADDR_36
0x4B54 US_ALU_EXT_ADDR_37
0x4B58 US_ALU_EXT_ADDR_38
0x4B5C US_ALU_EXT_ADDR_39
0x4B60 US_ALU_EXT_ADDR_40
0x4B64 US_ALU_EXT_ADDR_41
0x4B68 US_ALU_EXT_ADDR_42
0x4B6C US_ALU_EXT_ADDR_43
0x4B70 US_ALU_EXT_ADDR_44
0x4B74 US_ALU_EXT_ADDR_45
0x4B78 US_ALU_EXT_ADDR_46
0x4B7C US_ALU_EXT_ADDR_47
0x4B80 US_ALU_EXT_ADDR_48
0x4B84 US_ALU_EXT_ADDR_49
0x4B88 US_ALU_EXT_ADDR_50
0x4B8C US_ALU_EXT_ADDR_51
0x4B90 US_ALU_EXT_ADDR_52
0x4B94 US_ALU_EXT_ADDR_53
0x4B98 US_ALU_EXT_ADDR_54
0x4B9C US_ALU_EXT_ADDR_55
0x4BA0 US_ALU_EXT_ADDR_56
0x4BA4 US_ALU_EXT_ADDR_57
0x4BA8 US_ALU_EXT_ADDR_58
0x4BAC US_ALU_EXT_ADDR_59
0x4BB0 US_ALU_EXT_ADDR_60
0x4BB4 US_ALU_EXT_ADDR_61
0x4BB8 US_ALU_EXT_ADDR_62
0x4BBC US_ALU_EXT_ADDR_63
0x4BC0 FG_FOG_BLEND
0x4BC4 FG_FOG_FACTOR
0x4BC8 FG_FOG_COLOR_R
0x4BCC FG_FOG_COLOR_G
0x4BD0 FG_FOG_COLOR_B
0x4BD4 FG_ALPHA_FUNC
0x4BD8 FG_DEPTH_SRC
0x4C00 US_ALU_CONST_R_0
0x4C04 US_ALU_CONST_G_0
0x4C08 US_ALU_CONST_B_0
0x4C0C US_ALU_CONST_A_0
0x4C10 US_ALU_CONST_R_1
0x4C14 US_ALU_CONST_G_1
0x4C18 US_ALU_CONST_B_1
0x4C1C US_ALU_CONST_A_1
0x4C20 US_ALU_CONST_R_2
0x4C24 US_ALU_CONST_G_2
0x4C28 US_ALU_CONST_B_2
0x4C2C US_ALU_CONST_A_2
0x4C30 US_ALU_CONST_R_3
0x4C34 US_ALU_CONST_G_3
0x4C38 US_ALU_CONST_B_3
0x4C3C US_ALU_CONST_A_3
0x4C40 US_ALU_CONST_R_4
0x4C44 US_ALU_CONST_G_4
0x4C48 US_ALU_CONST_B_4
0x4C4C US_ALU_CONST_A_4
0x4C50 US_ALU_CONST_R_5
0x4C54 US_ALU_CONST_G_5
0x4C58 US_ALU_CONST_B_5
0x4C5C US_ALU_CONST_A_5
0x4C60 US_ALU_CONST_R_6
0x4C64 US_ALU_CONST_G_6
0x4C68 US_ALU_CONST_B_6
0x4C6C US_ALU_CONST_A_6
0x4C70 US_ALU_CONST_R_7
0x4C74 US_ALU_CONST_G_7
0x4C78 US_ALU_CONST_B_7
0x4C7C US_ALU_CONST_A_7
0x4C80 US_ALU_CONST_R_8
0x4C84 US_ALU_CONST_G_8
0x4C88 US_ALU_CONST_B_8
0x4C8C US_ALU_CONST_A_8
0x4C90 US_ALU_CONST_R_9
0x4C94 US_ALU_CONST_G_9
0x4C98 US_ALU_CONST_B_9
0x4C9C US_ALU_CONST_A_9
0x4CA0 US_ALU_CONST_R_10
0x4CA4 US_ALU_CONST_G_10
0x4CA8 US_ALU_CONST_B_10
0x4CAC US_ALU_CONST_A_10
0x4CB0 US_ALU_CONST_R_11
0x4CB4 US_ALU_CONST_G_11
0x4CB8 US_ALU_CONST_B_11
0x4CBC US_ALU_CONST_A_11
0x4CC0 US_ALU_CONST_R_12
0x4CC4 US_ALU_CONST_G_12
0x4CC8 US_ALU_CONST_B_12
0x4CCC US_ALU_CONST_A_12
0x4CD0 US_ALU_CONST_R_13
0x4CD4 US_ALU_CONST_G_13
0x4CD8 US_ALU_CONST_B_13
0x4CDC US_ALU_CONST_A_13
0x4CE0 US_ALU_CONST_R_14
0x4CE4 US_ALU_CONST_G_14
0x4CE8 US_ALU_CONST_B_14
0x4CEC US_ALU_CONST_A_14
0x4CF0 US_ALU_CONST_R_15
0x4CF4 US_ALU_CONST_G_15
0x4CF8 US_ALU_CONST_B_15
0x4CFC US_ALU_CONST_A_15
0x4D00 US_ALU_CONST_R_16
0x4D04 US_ALU_CONST_G_16
0x4D08 US_ALU_CONST_B_16
0x4D0C US_ALU_CONST_A_16
0x4D10 US_ALU_CONST_R_17
0x4D14 US_ALU_CONST_G_17
0x4D18 US_ALU_CONST_B_17
0x4D1C US_ALU_CONST_A_17
0x4D20 US_ALU_CONST_R_18
0x4D24 US_ALU_CONST_G_18
0x4D28 US_ALU_CONST_B_18
0x4D2C US_ALU_CONST_A_18
0x4D30 US_ALU_CONST_R_19
0x4D34 US_ALU_CONST_G_19
0x4D38 US_ALU_CONST_B_19
0x4D3C US_ALU_CONST_A_19
0x4D40 US_ALU_CONST_R_20
0x4D44 US_ALU_CONST_G_20
0x4D48 US_ALU_CONST_B_20
0x4D4C US_ALU_CONST_A_20
0x4D50 US_ALU_CONST_R_21
0x4D54 US_ALU_CONST_G_21
0x4D58 US_ALU_CONST_B_21
0x4D5C US_ALU_CONST_A_21
0x4D60 US_ALU_CONST_R_22
0x4D64 US_ALU_CONST_G_22
0x4D68 US_ALU_CONST_B_22
0x4D6C US_ALU_CONST_A_22
0x4D70 US_ALU_CONST_R_23
0x4D74 US_ALU_CONST_G_23
0x4D78 US_ALU_CONST_B_23
0x4D7C US_ALU_CONST_A_23
0x4D80 US_ALU_CONST_R_24
0x4D84 US_ALU_CONST_G_24
0x4D88 US_ALU_CONST_B_24
0x4D8C US_ALU_CONST_A_24
0x4D90 US_ALU_CONST_R_25
0x4D94 US_ALU_CONST_G_25
0x4D98 US_ALU_CONST_B_25
0x4D9C US_ALU_CONST_A_25
0x4DA0 US_ALU_CONST_R_26
0x4DA4 US_ALU_CONST_G_26
0x4DA8 US_ALU_CONST_B_26
0x4DAC US_ALU_CONST_A_26
0x4DB0 US_ALU_CONST_R_27
0x4DB4 US_ALU_CONST_G_27
0x4DB8 US_ALU_CONST_B_27
0x4DBC US_ALU_CONST_A_27
0x4DC0 US_ALU_CONST_R_28
0x4DC4 US_ALU_CONST_G_28
0x4DC8 US_ALU_CONST_B_28
0x4DCC US_ALU_CONST_A_28
0x4DD0 US_ALU_CONST_R_29
0x4DD4 US_ALU_CONST_G_29
0x4DD8 US_ALU_CONST_B_29
0x4DDC US_ALU_CONST_A_29
0x4DE0 US_ALU_CONST_R_30
0x4DE4 US_ALU_CONST_G_30
0x4DE8 US_ALU_CONST_B_30
0x4DEC US_ALU_CONST_A_30
0x4DF0 US_ALU_CONST_R_31
0x4DF4 US_ALU_CONST_G_31
0x4DF8 US_ALU_CONST_B_31
0x4DFC US_ALU_CONST_A_31
0x4E08 RB3D_ABLENDCNTL_R3
0x4E10 RB3D_CONSTANT_COLOR
0x4E14 RB3D_COLOR_CLEAR_VALUE
0x4E18 RB3D_ROPCNTL_R3
0x4E1C RB3D_CLRCMP_FLIPE_R3
0x4E20 RB3D_CLRCMP_CLR_R3
0x4E24 RB3D_CLRCMP_MSK_R3
0x4E48 RB3D_DEBUG_CTL
0x4E4C RB3D_DSTCACHE_CTLSTAT_R3
0x4E50 RB3D_DITHER_CTL
0x4E54 RB3D_CMASK_OFFSET0
0x4E58 RB3D_CMASK_OFFSET1
0x4E5C RB3D_CMASK_OFFSET2
0x4E60 RB3D_CMASK_OFFSET3
0x4E64 RB3D_CMASK_PITCH0
0x4E68 RB3D_CMASK_PITCH1
0x4E6C RB3D_CMASK_PITCH2
0x4E70 RB3D_CMASK_PITCH3
0x4E74 RB3D_CMASK_WRINDEX
0x4E78 RB3D_CMASK_DWORD
0x4E7C RB3D_CMASK_RDINDEX
0x4EA0 RB3D_DISCARD_SRC_PIXEL_LTE_THRESHOLD
0x4EA4 RB3D_DISCARD_SRC_PIXEL_GTE_THRESHOLD
0x4F04 ZB_ZSTENCILCNTL
0x4F08 ZB_STENCILREFMASK
0x4F14 ZB_ZTOP
0x4F18 ZB_ZCACHE_CTLSTAT
0x4F28 ZB_DEPTHCLEARVALUE
0x4F58 ZB_ZPASS_DATA
| {
"pile_set_name": "Github"
} |
# THIS FILE IS AUTO-GENERATED. DO NOT EDIT
from verta._swagger.base_type import BaseType

class ModeldbGetExperimentRunsInProjectResponse(BaseType):
  def __init__(self, experiment_runs=None, total_records=None):
    required = {
      "experiment_runs": False,
      "total_records": False,
    }
    self.experiment_runs = experiment_runs
    self.total_records = total_records

    for k, v in required.items():
      if self[k] is None and v:
        raise ValueError('attribute {} is required'.format(k))

  @staticmethod
  def from_json(d):
    from .ModeldbExperimentRun import ModeldbExperimentRun

    tmp = d.get('experiment_runs', None)
    if tmp is not None:
      d['experiment_runs'] = [ModeldbExperimentRun.from_json(tmp) for tmp in tmp]
    tmp = d.get('total_records', None)
    if tmp is not None:
      d['total_records'] = tmp

    return ModeldbGetExperimentRunsInProjectResponse(**d)
| {
"pile_set_name": "Github"
} |
/*!
* socket.io-node
* Copyright(c) 2011 LearnBoost <[email protected]>
* MIT Licensed
*/
module.exports = require('./lib/socket.io');
| {
"pile_set_name": "Github"
} |
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/RelativeLayout1"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical" >
<TextView
android:id="@+id/textView1"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentLeft="true"
android:layout_alignParentTop="true"
android:text="@string/timmy_needs_upgrade"
android:textAppearance="?android:attr/textAppearanceMedium" />
<Button
android:id="@+id/button1"
android:layout_width="60sp"
android:layout_height="wrap_content"
android:layout_alignParentLeft="true"
android:layout_centerVertical="true"
android:onClick="onOk"
android:text="@string/ok" />
</RelativeLayout> | {
"pile_set_name": "Github"
} |
<?xml version="1.0" encoding="utf-8"?>
<ScrollView android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:android="http://schemas.android.com/apk/res/android">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical">
<android.support.design.widget.TextInputLayout
android:id="@+id/student_name_layout"
android:hint="Place to type name"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<EditText
android:id="@+id/student_name"
android:layout_width="match_parent"
android:layout_height="wrap_content"/>
</android.support.design.widget.TextInputLayout>
<android.support.design.widget.TextInputLayout
android:id="@+id/student_id_layout"
android:hint="Place to type id"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<EditText
android:inputType="number"
android:id="@+id/student_id"
android:layout_width="match_parent"
android:layout_height="wrap_content"/>
</android.support.design.widget.TextInputLayout>
<android.support.design.widget.TextInputLayout
android:id="@+id/student_grade_layout"
android:hint="Place to type grade"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<EditText
android:maxLength="1"
android:id="@+id/student_grade"
android:layout_width="match_parent"
android:layout_height="wrap_content"/>
</android.support.design.widget.TextInputLayout>
<Switch
android:text="Is student passing?"
android:id="@+id/student_is_passing"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
<Button
android:text="SAVE"
android:id="@+id/save_button"
android:layout_gravity="center"
android:layout_width="wrap_content"
android:layout_height="wrap_content" />
<Button
android:text="RESTORE"
android:id="@+id/restore_button"
android:layout_gravity="center"
android:layout_width="wrap_content"
android:layout_height="wrap_content" />
</LinearLayout>
</ScrollView> | {
"pile_set_name": "Github"
} |
<load target="js/ncenter_admin.js" />
<load target="css/ncenter_admin.css" />
<include target="header.html" />
<div class="x_form-horizontal" id="fo_ncenterlite">
<section class="section">
<h1>{$lang->etc}</h1>
<div class="x_control-group">
<label class="x_control-label">{$lang->ncenterlite_test_mention}</label>
<div class="x_controls">
<form action="./" method="post">
<fieldset>
<input type="hidden" name="module" value="ncenterlite" />
<input type="hidden" name="act" value="procNcenterliteAdminInsertDummyData" />
<button type="submit" class="x_btn">{$lang->ncenterlite_test_make_dummy}</button>
</fieldset>
</form>
<p>{$lang->ncenterlite_test_mention_about}</p>
</div>
</div>
<div class="x_control-group">
<label class="x_control-label">{$lang->ncenterlite_test_push}</label>
<div class="x_controls">
<form action="./" method="post">
<fieldset>
<input type="hidden" name="module" value="ncenterlite" />
<input type="hidden" name="act" value="procNcenterliteAdminInsertPushData" />
<button type="submit" class="x_btn">{$lang->ncenterlite_test_make_dummy}</button>
</fieldset>
</form>
<p>{$lang->ncenterlite_test_push_about}</p>
</div>
</div>
</section>
</div>
0.2.1 / 2014-01-29
==================
* fix: support max-age=0 for end-to-end revalidation
0.2.0 / 2013-08-11
==================
* fix: return false for no-cache
;; WARNING: This file was generated from umm-c-json-schema.json. Do not manually modify.
(ns cmr.umm-spec.models.umm-collection-models
"Defines UMM-C clojure records."
(:require [cmr.common.dev.record-pretty-printer :as record-pretty-printer]))
(defrecord UMM-C
[
;; Information required to properly cite the collection in professional scientific literature.
;; This element provides information for constructing a citation for the item itself, and is not
;; designed for listing bibliographic references of scientific research articles arising from
;; search results. A list of references related to the research results should be in the
;; Publication Reference element.
CollectionCitations
;; Controlled hierarchical keywords used to specify the spatial location of the collection. The
;; controlled vocabulary for spatial keywords is maintained in the Keyword Management System
;; (KMS). The Spatial Keyword hierarchy includes one or more of the following layers: Category
;; (e.g., Continent), Type (e.g. Africa), Subregion1 (e.g., Central Africa), Subregion2 (e.g.,
;; Cameroon), and Subregion3. DetailedLocation exists outside the hierarchy.
LocationKeywords
;; Dates related to activities involving the metadata record itself. For example, Future Review
;; date is the date that the metadata record is scheduled to be reviewed.
MetadataDates
;; The Version Description of the collection.
VersionDescription
;; This is deprecated and will be removed. Use LocationKeywords instead. Controlled hierarchical
;; keywords used to specify the spatial location of the collection. The controlled vocabulary for
;; spatial keywords is maintained in the Keyword Management System (KMS). The Spatial Keyword
;; hierarchy includes one or more of the following layers: Location_Category (e.g., Continent),
;; Location_Type (e.g. Africa), Location_Subregion1 (e.g., Central Africa), Location_Subregion2
;; (e.g., Cameroon), and Location_Subregion3.
SpatialKeywords
   ;; Identifies the topic categories from the EN ISO 19115-1:2014 Geographic Information -
   ;; Metadata - Part 1: Fundamentals (http://www.isotc211.org/) Topic Category Code List that pertain to
;; this collection, based on the Science Keywords associated with the collection. An ISO Topic
;; Category is a high-level thematic classification to assist in the grouping of and search for
;; available collections.
ISOTopicCategories
;; The short name associated with the collection.
ShortName
;; A brief description of the collection or service the metadata represents.
Abstract
;; The language used in the metadata record.
MetadataLanguage
;; Formerly called Internal Directory Name (IDN) Node (IDN_Node). This element has been used
;; historically by the GCMD internally to identify association, responsibility and/or ownership
;; of the dataset, service or supplemental information. Note: This field only occurs in the DIF.
;; When a DIF record is retrieved in the ECHO10 or ISO 19115 formats, this element will not be
;; translated.
DirectoryNames
;; Suggested usage or purpose for the collection data or service.
Purpose
;; Name of the two-dimensional tiling system for the collection. Previously called
;; TwoDCoordinateSystem.
TilingIdentificationSystems
;; Describes key bibliographic citations pertaining to the collection.
PublicationReferences
;; This element stores the DOI (Digital Object Identifier) that identifies the collection. Note:
;; The values should start with the directory indicator which in ESDIS' case is 10. If the DOI
;; was registered through ESDIS, the beginning of the string should be 10.5067. The DOI URL is
;; not stored here; it should be stored as a RelatedURL. The DOI organization that is responsible
;; for creating the DOI is described in the Authority element. For ESDIS records the value of
;; https://doi.org/ should be used. While this element is not required, NASA metadata providers
;; are strongly encouraged to include DOI and DOI Authority for their collections.
DOI
;; This element describes any data/service related URLs that include project home pages,
;; services, related data archives/servers, metadata extensions, direct links to online software
;; packages, web mapping services, links to images, or other data.
RelatedUrls
;; Dates related to activities involving the collection data. For example, Creation date is the
;; date that the collection data first entered the data archive system.
DataDates
;; Information about the personnel responsible for this collection and its metadata.
ContactPersons
;; Allows the author to constrain access to the collection. This includes any special
;; restrictions, legal prerequisites, limitations and/or warnings on obtaining collection data.
;; Some words that may be used in this element's value include: Public, In-house, Limited, None.
;; The value field is used for special ACL rules (Access Control Lists
;; (http://en.wikipedia.org/wiki/Access_control_list)). For example it can be used to hide
;; metadata when it isn't ready for public consumption.
AccessConstraints
SpatialExtent
;; Information about the personnel groups responsible for this collection and its metadata.
ContactGroups
   ;; The data's distinctive attributes of the collection (i.e. attributes used to describe the
;; unique characteristics of the collection which extend beyond those defined).
AdditionalAttributes
;; This element and all of its sub elements exist for display purposes. It allows a data provider
;; to provide archive and distribution information up front to an end user, to help them decide
;; if they can use the product.
ArchiveAndDistributionInformation
;; Controlled Science Keywords describing the collection. The controlled vocabulary for Science
;; Keywords is maintained in the Keyword Management System (KMS).
ScienceKeywords
;; Free text description of the quality of the collection data. Description may include: 1)
;; succinct description of the quality of data in the collection; 2) Any quality assurance
;; procedures followed in producing the data in the collection; 3) indicators of collection
;; quality or quality flags - both validated or invalidated; 4) recognized or potential problems
;; with quality; 5) established quality control mechanisms; and 6) established quantitative
;; quality measurements.
Quality
;; The title of the collection or service described by the metadata.
EntryTitle
;; This element describes the production status of the data set. There are five choices for Data
;; Providers: PLANNED refers to data sets to be collected in the future and are thus unavailable
;; at the present time. For Example: The Hydro spacecraft has not been launched, but information
;; on planned data sets may be available. ACTIVE refers to data sets currently in production or
;; data that is continuously being collected or updated. For Example: data from the AIRS
;; instrument on Aqua is being collected continuously. COMPLETE refers to data sets in which no
;; updates or further data collection will be made. For Example: Nimbus-7 SMMR data collection
;; has been completed. DEPRECATED refers to data sets that have been retired, but still can be
;; retrieved. Usually newer products exist that replace the retired data set. NOT APPLICABLE
;; refers to data sets in which a collection progress is not applicable such as a calibration
;; collection. There is a sixth value of NOT PROVIDED that should not be used by a data provider.
;; It is currently being used as a value when a correct translation cannot be done with the
;; current valid values, or when the value is not provided by the data provider.
CollectionProgress
;; For paleoclimate or geologic data, PaleoTemporalCoverage is the length of time represented by
;; the data collected. PaleoTemporalCoverage should be used when the data spans time frames
;; earlier than yyyy-mm-dd = 0001-01-01.
PaleoTemporalCoverages
;; The reference frame or system in which altitudes (elevations) are given. The information
;; contains the datum name, distance units and encoding method, which provide the definition for
;; the system. This field also stores the characteristics of the reference frame or system from
;; which depths are measured. The additional information in the field is geometry reference data
;; etc.
SpatialInformation
;; Identifies the collection as a Science Quality collection or a non-science-quality collection
;; such as a Near-Real-Time collection.
CollectionDataType
;; Designed to protect privacy and/or intellectual property by allowing the author to specify how
;; the collection may or may not be used after access is granted. This includes any special
;; restrictions, legal prerequisites, terms and conditions, and/or limitations on using the item.
;; Providers may request acknowledgement of the item from users and claim no responsibility for
;; quality and completeness. Note: Use Constraints describe how the item may be used once access
;; has been granted; and is distinct from Access Constraints, which refers to any constraints in
;; accessing the item.
UseConstraints
;; One or more words or phrases that describe the temporal resolution of the dataset.
TemporalKeywords
;; Allows authors to provide words or phrases outside of the controlled Science Keyword
;; vocabulary, to further describe the collection.
AncillaryKeywords
;; The identifier for the processing level of the collection (e.g., Level0, Level1A).
ProcessingLevel
;; Information about the relevant platform(s) used to acquire the data in the collection.
;; Platform types are controlled in the Keyword Management System (KMS), and include Spacecraft,
;; Aircraft, Vessel, Buoy, Platform, Station, Network, Human, etc.
Platforms
;; The name of the scientific program, field campaign, or project from which the data were
;; collected. This element is intended for the non-space assets such as aircraft, ground systems,
;; balloons, sondes, ships, etc. associated with campaigns. This element may also cover a long
   ;; term project that continuously creates new data sets, like MEaSUREs from ISCCP and NVAP or
;; CMARES from MISR. Project also includes the Campaign sub-element to support multiple campaigns
;; under the same project.
Projects
;; The Version of the collection.
Version
;; This class contains attributes which describe the temporal range of a specific collection.
;; Temporal Extent includes a specification of the Temporal Range Type of the collection, which
;; is one of Range Date Time, Single Date Time, or Periodic Date Time
TemporalExtents
;; Information about the data centers responsible for this collection and its metadata.
DataCenters
;; This element is used to identify other services, collections, visualizations, granules, and
;; other metadata types and resources that are associated with or dependent on the data described
;; by the metadata. This element is also used to identify a parent metadata record if it exists.
;; This usage should be reserved for instances where a group of metadata records are subsets that
;; can be better represented by one parent metadata record, which describes the entire set. In
;; some instances, a child may point to more than one parent. The EntryId is the same as the
;; element described elsewhere in this document where it contains and ID, and Version.
MetadataAssociations
;; Describes the language used in the preparation, storage, and description of the collection. It
;; is the language of the collection data themselves. It does not refer to the language used in
;; the metadata record (although this may be the same language).
DataLanguage
])
(record-pretty-printer/enable-record-pretty-printing UMM-C)
;; For paleoclimate or geologic data, PaleoTemporalCoverage is the length of time represented by the
;; data collected. PaleoTemporalCoverage should be used when the data spans time frames earlier than
;; yyyy-mm-dd = 0001-01-01.
(defrecord PaleoTemporalCoverageType
[
;; Hierarchy of terms indicating units of geologic time, i.e., eon (e.g, Phanerozoic), era (e.g.,
;; Cenozoic), period (e.g., Paleogene), epoch (e.g., Oligocene), and stage or age (e.g,
;; Chattian).
ChronostratigraphicUnits
;; A string indicating the number of years furthest back in time, including units, e.g., 100 Ga.
;; Units may be Ga (billions of years before present), Ma (millions of years before present), ka
;; (thousands of years before present) or ybp (years before present).
StartDate
;; A string indicating the number of years closest to the present time, including units, e.g., 10
;; ka. Units may be Ga (billions of years before present), Ma (millions of years before present),
;; ka (thousands of years before present) or ybp (years before present).
EndDate
])
(record-pretty-printer/enable-record-pretty-printing PaleoTemporalCoverageType)
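;; A hedged usage sketch (not generated from the schema): constructing a
;; PaleoTemporalCoverageType whose StartDate/EndDate strings use the units
;; documented above (Ga, Ma, ka, or ybp). The values are illustrative only.
(comment
  (map->PaleoTemporalCoverageType {:StartDate "100 Ma"
                                   :EndDate "10 ka"}))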
;; This sub-element either contains a license summary or free-text description that details the
;; permitted use or limitation of this collection.
(defrecord UseConstraintsDescriptionType
[
;; This sub-element either contains a license summary or free-text description that details the
;; permitted use or limitation of this collection.
Description
])
(record-pretty-printer/enable-record-pretty-printing UseConstraintsDescriptionType)
;; Information about a collection with horizontal spatial coverage.
(defrecord HorizontalSpatialDomainType
[
;; The appropriate numeric or alpha code used to identify the various zones in the collection's
;; grid coordinate system.
ZoneIdentifier
Geometry
;; Specifies the horizontal spatial extents coordinate system and its resolution.
ResolutionAndCoordinateSystem
])
(record-pretty-printer/enable-record-pretty-printing HorizontalSpatialDomainType)
;; Contains the excluded boundaries from the GPolygon.
(defrecord ExclusiveZoneType
[
Boundaries
])
(record-pretty-printer/enable-record-pretty-printing ExclusiveZoneType)
;; Information about a two-dimensional tiling system related to this collection.
(defrecord TilingIdentificationSystemType
[
TilingIdentificationSystemName
Coordinate1
Coordinate2
])
(record-pretty-printer/enable-record-pretty-printing TilingIdentificationSystemType)
;; Specifies the geographic and vertical (altitude, depth) coverage of the data.
(defrecord SpatialExtentType
[
;; Denotes whether the collection's spatial coverage requires horizontal, vertical, horizontal
;; and vertical, orbit, or vertical and orbit in the spatial domain and coordinate system
;; definitions.
SpatialCoverageType
HorizontalSpatialDomain
VerticalSpatialDomains
OrbitParameters
GranuleSpatialRepresentation
])
(record-pretty-printer/enable-record-pretty-printing SpatialExtentType)
;; This element defines a mapping to the GCMD KMS hierarchical location list. It replaces
;; SpatialKeywords. Each tier must have data in the tier above it.
(defrecord LocationKeywordType
[
;; Top-level controlled keyword hierarchical level that contains the largest general location
;; where the collection data was taken from.
Category
;; Second-tier controlled keyword hierarchical level that contains the regional location where
;; the collection data was taken from
Type
   ;; Third-tier controlled keyword hierarchical level that contains the regional sub-location where
;; the collection data was taken from
Subregion1
   ;; Fourth-tier controlled keyword hierarchical level that contains the regional sub-location
;; where the collection data was taken from
Subregion2
   ;; Fifth-tier controlled keyword hierarchical level that contains the regional sub-location where
;; the collection data was taken from
Subregion3
   ;; Uncontrolled keyword hierarchical level that contains the specific location where the
   ;; collection data was taken from. Exists outside the hierarchy.
DetailedLocation
])
(record-pretty-printer/enable-record-pretty-printing LocationKeywordType)
(defrecord LocalCoordinateSystemType
[
;; The information provided to register the local system to the Earth (e.g. control points,
;; satellite ephemeral data, and inertial navigation data).
GeoReferenceInformation
;; A description of the Local Coordinate System and geo-reference information.
Description
])
(record-pretty-printer/enable-record-pretty-printing LocalCoordinateSystemType)
;; This element defines a single artifact that is distributed by the data provider. This element
;; only includes the distributable artifacts that can be obtained by the user without the user
;; having to invoke a service. These should be documented in the UMM-S specification.
(defrecord FileDistributionInformationType
[
;; Allows the provider to state whether the distributable item's format is its native format or
;; another supported format.
FormatType
;; An approximate average size of the distributable item. This gives an end user an idea of the
;; magnitude for each distributable file if more than 1 exists.
AverageFileSize
;; Conveys the price one has to pay to obtain the distributable item.
Fees
;; This element defines a single format for a distributable artifact. Examples of format include:
;; ascii, binary, GRIB, BUFR, HDF4, HDF5, HDF-EOS4, HDF-EOS5, jpeg, png, tiff, geotiff, kml.
Format
;; An approximate total size of all of the distributable items within a collection. This gives an
;; end user an idea of the magnitude for all of distributable files combined.
TotalCollectionFileSize
;; The date of which this collection started to collect data. This date is used by users to be
;; able to calculate the current total collection file size. The date needs to be in the
;; yyyy-MM-ddTHH:mm:ssZ format; for example: 2018-01-01T10:00:00Z.
TotalCollectionFileSizeBeginDate
;; Allows the record provider to provide supporting documentation about the Format.
FormatDescription
;; Unit of measure for the total collection file size.
TotalCollectionFileSizeUnit
;; Provides the data provider a way to convey more information about the distributable item.
Description
;; Unit of measure for the average file size.
AverageFileSizeUnit
;; This element defines the media by which the end user can obtain the distributable item. Each
;; media type is listed separately. Examples of media include: CD-ROM, 9 track tape, diskettes,
;; hard drives, online, transparencies, hardcopy, etc.
Media
])
(record-pretty-printer/enable-record-pretty-printing FileDistributionInformationType)
;; This class defines the horizontal spatial extents coordinate system and the data product's
;; horizontal data resolution. The horizontal data resolution is defined as the smallest horizontal
;; distance between successive elements of data in a dataset. This is synonymous with terms such as
;; ground sample distance, sample spacing and pixel size. It is to be noted that the horizontal data
;; resolution could be different in the two horizontal dimensions. Also, it is different from the
;; spatial resolution of an instrument, which is the minimum distance between points that an
;; instrument can see as distinct.
(defrecord ResolutionAndCoordinateSystemType
[
   ;; This element holds a description of the resolution and coordinate system for people to
   ;; read.
Description
;; This element describes the geodetic model for the data product.
GeodeticModel
;; This class defines a number of the data products horizontal data resolution. The horizontal
;; data resolution is defined as the smallest horizontal distance between successive elements of
;; data in a dataset. This is synonymous with terms such as ground sample distance, sample
;; spacing and pixel size. It is to be noted that the horizontal data resolution could be
;; different in the two horizontal dimensions. Also, it is different from the spatial resolution
;; of an instrument, which is the minimum distance between points that an instrument can see as
;; distinct.
HorizontalDataResolution
;; This element describes the local coordinate system for the data product.
LocalCoordinateSystem
])
(record-pretty-printer/enable-record-pretty-printing ResolutionAndCoordinateSystemType)
;; Generic Resolutions object describes general resolution data for a data product where it is not
;; known if a data product is gridded or not.
(defrecord HorizontalDataGenericResolutionType
[
;; The minimum difference between two adjacent values on a horizontal plane in the X axis. In
;; most cases this is along the longitudinal axis.
XDimension
   ;; The minimum difference between two adjacent values on a horizontal plane in the Y axis. In most
;; cases this is along the latitudinal axis.
YDimension
;; Units of measure used for the XDimension and YDimension values.
Unit
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataGenericResolutionType)
(defrecord BoundingRectangleType
[
WestBoundingCoordinate
NorthBoundingCoordinate
EastBoundingCoordinate
SouthBoundingCoordinate
])
(record-pretty-printer/enable-record-pretty-printing BoundingRectangleType)
(defrecord LineType
[
Points
])
(record-pretty-printer/enable-record-pretty-printing LineType)
;; This class defines a number of the data products horizontal data resolution. The horizontal data
;; resolution is defined as the smallest horizontal distance between successive elements of data in
;; a dataset. This is synonymous with terms such as ground sample distance, sample spacing and pixel
;; size. It is to be noted that the horizontal data resolution could be different in the two
;; horizontal dimensions. Also, it is different from the spatial resolution of an instrument, which
;; is the minimum distance between points that an instrument can see as distinct.
(defrecord HorizontalDataResolutionType
[
;; Varies Resolution object describes a data product that has a number of resolution values.
VariesResolution
;; Point Resolution object describes a data product that is from a point source.
PointResolution
;; Non Gridded Resolutions object describes resolution data for non gridded data products.
NonGriddedResolutions
;; Non Gridded Range Resolutions object describes range resolution data for non gridded data
;; products.
NonGriddedRangeResolutions
;; Gridded Resolutions object describes resolution data for gridded data products.
GriddedResolutions
;; Gridded Range Resolutions object describes range resolution data for gridded data products.
GriddedRangeResolutions
;; Generic Resolutions object describes general resolution data for a data product where it is
;; not known if a data product is gridded or not.
GenericResolutions
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataResolutionType)
(defrecord ChronostratigraphicUnitType
[
Eon
Era
Epoch
Stage
DetailedClassification
Period
])
(record-pretty-printer/enable-record-pretty-printing ChronostratigraphicUnitType)
(defrecord VerticalSpatialDomainType
[
;; Describes the type of the area of vertical space covered by the collection locality.
Type
;; Describes the extent of the area of vertical space covered by the collection. Must be
;; accompanied by an Altitude Encoding Method description. The datatype for this attribute is the
;; value of the attribute VerticalSpatialDomainType. The unit for this attribute is the value of
;; either DepthDistanceUnits or AltitudeDistanceUnits.
Value
])
(record-pretty-printer/enable-record-pretty-printing VerticalSpatialDomainType)
(defrecord GeometryType
[
CoordinateSystem
Points
BoundingRectangles
GPolygons
Lines
])
(record-pretty-printer/enable-record-pretty-printing GeometryType)
;; The reference frame or system from which altitude is measured. The term 'altitude' is used
;; instead of the common term 'elevation' to conform to the terminology in Federal Information
;; Processing Standards 70-1 and 173. The information contains the datum name, distance units and
;; encoding method, which provide the definition for the system.
(defrecord AltitudeSystemDefinitionType
[
;; The identification given to the level surface taken as the surface of reference from which
;; measurements are compared.
DatumName
;; The units in which measurements are recorded.
DistanceUnits
;; The minimum distance possible between two adjacent values, expressed in distance units of
;; measure for the collection.
Resolutions
])
(record-pretty-printer/enable-record-pretty-printing AltitudeSystemDefinitionType)
;; The longitude and latitude values of a spatially referenced point in degrees.
(defrecord PointType
[
Longitude
Latitude
])
(record-pretty-printer/enable-record-pretty-printing PointType)
;; This element contains the Processing Level Id and the Processing Level Description
(defrecord ProcessingLevelType
[
;; Description of the meaning of the Processing Level Id, e.g., the Description for the Level4
;; Processing Level Id might be 'Model output or results from analyses of lower level data'
ProcessingLevelDescription
;; An identifier indicating the level at which the data in the collection are processed, ranging
;; from Level0 (raw instrument data at full resolution) to Level4 (model output or analysis
;; results). The value of Processing Level Id is chosen from a controlled vocabulary.
Id
])
(record-pretty-printer/enable-record-pretty-printing ProcessingLevelType)
;; Defines the minimum and maximum value for one dimension of a two dimensional coordinate system.
(defrecord TilingCoordinateType
[
MinimumValue
MaximumValue
])
(record-pretty-printer/enable-record-pretty-printing TilingCoordinateType)
(defrecord GPolygonType
[
Boundary
ExclusiveZone
])
(record-pretty-printer/enable-record-pretty-printing GPolygonType)
;; A boundary is set of points connected by straight lines representing a polygon on the earth. It
;; takes a minimum of three points to make a boundary. Points must be specified in counter-clockwise
;; order and closed (the first and last vertices are the same).
(defrecord BoundaryType
[
Points
])
(record-pretty-printer/enable-record-pretty-printing BoundaryType)
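;; A hedged usage sketch (not generated from the schema): a minimal closed,
;; counter-clockwise boundary. Three distinct vertices, with the first point
;; repeated at the end to close the ring; the coordinates are illustrative only.
(comment
  (map->BoundaryType
   {:Points [(map->PointType {:Longitude 0.0 :Latitude 0.0})
             (map->PointType {:Longitude 10.0 :Latitude 0.0})
             (map->PointType {:Longitude 5.0 :Latitude 8.0})
             (map->PointType {:Longitude 0.0 :Latitude 0.0})]}))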
;; This element describes the geodetic model for the data product.
(defrecord GeodeticModelType
[
;; The identification given to the reference system used for defining the coordinates of points.
HorizontalDatumName
;; Identification given to established representation of the Earth's shape.
EllipsoidName
;; Radius of the equatorial axis of the ellipsoid.
SemiMajorAxis
;; The ratio of the Earth's major axis to the difference between the major and the minor.
DenominatorOfFlatteningRatio
])
(record-pretty-printer/enable-record-pretty-printing GeodeticModelType)
;; The reference frame or system from which depth is measured. The information contains the datum
;; name, distance units and encoding method, which provide the definition for the system.
(defrecord DepthSystemDefinitionType
[
;; The identification given to the level surface taken as the surface of reference from which
;; measurements are compared.
DatumName
;; The units in which measurements are recorded.
DistanceUnits
;; The minimum distance possible between two adjacent values, expressed in distance units of
;; measure for the collection.
Resolutions
])
(record-pretty-printer/enable-record-pretty-printing DepthSystemDefinitionType)
;; This entity stores the reference frame or system from which horizontal and vertical spatial
;; domains are measured. The horizontal reference frame includes a Geodetic Model, Geographic
;; Coordinates, and Local Coordinates. The Vertical reference frame includes altitudes (elevations)
;; and depths.
(defrecord SpatialInformationType
[
VerticalCoordinateSystem
;; Denotes whether the spatial coverage of the collection is horizontal, vertical, horizontal and
;; vertical, orbit, or vertical and orbit.
SpatialCoverageType
])
(record-pretty-printer/enable-record-pretty-printing SpatialInformationType)
;; Non Gridded Range Resolutions object describes range resolution data for non gridded data
;; products.
(defrecord HorizontalDataResolutionNonGriddedRangeType
[
;; The minimum, minimum difference between two adjacent values on a horizontal plane in the X
;; axis. In most cases this is along the longitudinal axis.
MinimumXDimension
   ;; The minimum, minimum difference between two adjacent values on a horizontal plane in the Y
;; axis. In most cases this is along the latitudinal axis.
MinimumYDimension
;; The maximum, minimum difference between two adjacent values on a horizontal plane in the X
;; axis. In most cases this is along the longitudinal axis.
MaximumXDimension
   ;; The maximum, minimum difference between two adjacent values on a horizontal plane in the Y
;; axis. In most cases this is along the latitudinal axis.
MaximumYDimension
;; Units of measure used for the XDimension and YDimension values.
Unit
;; This element describes the angle of the measurement with respect to the instrument that gives
;; an understanding of the specified resolution.
ViewingAngleType
;; This element describes the instrument scanning direction.
ScanDirection
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataResolutionNonGriddedRangeType)
;; This element defines a single archive artifact which a data provider would like to inform an end
;; user that it exists.
(defrecord FileArchiveInformationType
[
;; Allows the provider to state whether the archivable item's format is its native format or
;; another supported format.
FormatType
;; An approximate average size of the archivable item. This gives an end user an idea of the
;; magnitude for each archivable file if more than 1 exists.
AverageFileSize
;; This element defines a single format for an archival artifact. Examples of format include:
;; ascii, binary, GRIB, BUFR, HDF4, HDF5, HDF-EOS4, HDF-EOS5, jpeg, png, tiff, geotiff, kml.
Format
;; An approximate total size of all of the archivable items within a collection. This gives an
;; end user an idea of the magnitude for all of archivable files combined.
TotalCollectionFileSize
;; The date of which this collection started to collect data. This date is used by users to be
;; able to calculate the current total collection file size. The date needs to be in the
;; yyyy-MM-ddTHH:mm:ssZ format; for example: 2018-01-01T10:00:00Z.
TotalCollectionFileSizeBeginDate
;; Allows the record provider to provide supporting documentation about the Format.
FormatDescription
;; Unit of measure for the total collection file size.
TotalCollectionFileSizeUnit
;; Provides the data provider a way to convey more information about the archivable item.
Description
;; Unit of measure for the average file size.
AverageFileSizeUnit
])
(record-pretty-printer/enable-record-pretty-printing FileArchiveInformationType)
;; Orbit parameters for the collection used by the Orbital Backtrack Algorithm.
(defrecord OrbitParametersType
[
;; Width of the swath at the equator in Kilometers.
SwathWidth
;; Orbital period in decimal minutes.
Period
;; Inclination of the orbit. This is the same as (180-declination) and also the same as the
;; highest latitude achieved by the satellite. Data Unit: Degree.
InclinationAngle
;; Indicates the number of orbits.
NumberOfOrbits
;; The latitude start of the orbit relative to the equator. This is used by the backtrack search
;; algorithm to treat the orbit as if it starts from the specified latitude. This is optional and
;; will default to 0 if not specified.
StartCircularLatitude
])
(record-pretty-printer/enable-record-pretty-printing OrbitParametersType)
;; Gridded Range Resolutions object describes range resolution data for gridded data products.
(defrecord HorizontalDataResolutionGriddedRangeType
[
;; The minimum, minimum difference between two adjacent values on a horizontal plane in the X
;; axis. In most cases this is along the longitudinal axis.
MinimumXDimension
;; The minimum, minimum difference between two adjacent values on a horizontal plane in the Y
;; axis. In most cases this is along the latitudinal axis.
MinimumYDimension
;; The maximum, minimum difference between two adjacent values on a horizontal plane in the X
;; axis. In most cases this is along the longitudinal axis.
MaximumXDimension
;; The maximum, minimum difference between two adjacent values on a horizontal plane in the Y
;; axis. In most cases this is along the latitudinal axis.
MaximumYDimension
;; Units of measure used for the XDimension and YDimension values.
Unit
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataResolutionGriddedRangeType)
;; This element and all of its sub elements exist for display purposes. It allows a data provider to
;; provide archive and distribution information up front to an end user, to help them decide if they
;; can use the product.
(defrecord ArchiveAndDistributionInformationType
[
;; This element defines a single archive artifact which a data provider would like to inform an
;; end user that it exists.
FileArchiveInformation
;; This element defines a single artifact that is distributed by the data provider. This element
;; only includes the distributable artifacts that can be obtained by the user without the user
;; having to invoke a service. These should be documented in the UMM-S specification.
FileDistributionInformation
])
(record-pretty-printer/enable-record-pretty-printing ArchiveAndDistributionInformationType)
;; Formerly called Internal Directory Name (IDN) Node (IDN_Node). This element has been used
;; historically by the GCMD internally to identify association, responsibility and/or ownership of
;; the dataset, service or supplemental information. Note: This field only occurs in the DIF. When a
;; DIF record is retrieved in the ECHO10 or ISO 19115 formats, this element will not be translated.
(defrecord DirectoryNameType
[
ShortName
LongName
])
(record-pretty-printer/enable-record-pretty-printing DirectoryNameType)
;; Non Gridded Resolutions object describes resolution data for non gridded data products.
(defrecord HorizontalDataResolutionNonGriddedType
[
;; The minimum difference between two adjacent values on a horizontal plane in the X axis. In
;; most cases this is along the longitudinal axis.
XDimension
;; The minimum difference between two adjacent values on a horizontal plane in the Y axis. In most
;; cases this is along the latitudinal axis.
YDimension
;; Units of measure used for the XDimension and YDimension values.
Unit
;; This element describes the angle of the measurement with respect to the instrument that gives
;; an understanding of the specified resolution.
ViewingAngleType
;; This element describes the instrument scanning direction.
ScanDirection
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataResolutionNonGriddedType)
;; This element defines how the data may or may not be used after access is granted to assure the
;; protection of privacy or intellectual property. This includes license text, license URL, or any
;; special restrictions, legal prerequisites, terms and conditions, and/or limitations on using the
;; data set. Data providers may request acknowledgement of the data from users and claim no
;; responsibility for quality and completeness of data.
(defrecord UseConstraintsType
[
Description
;; This element holds the URL and associated information to access the License on the web. If
;; this element is used the LicenseText element cannot be used.
LicenseUrl
;; This element holds the actual license text. If this element is used the LicenseUrl element
;; cannot be used.
LicenseText
])
(record-pretty-printer/enable-record-pretty-printing UseConstraintsType)
(defrecord VerticalCoordinateSystemType
[
AltitudeSystemDefinition
DepthSystemDefinition
])
(record-pretty-printer/enable-record-pretty-printing VerticalCoordinateSystemType)
;; Gridded Resolutions object describes resolution data for gridded data products.
(defrecord HorizontalDataResolutionGriddedType
[
;; The minimum difference between two adjacent values on a horizontal plane in the X axis. In
;; most cases this is along the longitudinal axis.
XDimension
;; The minimum difference between two adjacent values on a horizontal plane in the Y axis. In most
;; cases this is along the latitudinal axis.
YDimension
;; Units of measure used for the XDimension and YDimension values.
Unit
])
(record-pretty-printer/enable-record-pretty-printing HorizontalDataResolutionGriddedType) | {
"pile_set_name": "Github"
} |
'use strict';
Object.defineProperty(exports, '__esModule', { value: true });
var prefix = 'fas';
var iconName = 'swatchbook';
var width = 511;
var height = 512;
var ligatures = [];
var unicode = 'f5c3';
var svgPathData = 'M479.06 320H372.29L186.15 506.51c-2.06 2.07-4.49 3.58-6.67 5.49h299.58c17.64 0 31.94-14.33 31.94-32V352c0-17.67-14.3-32-31.94-32zm-44.5-152.9l-90.33-90.51c-12.47-12.5-32.69-12.5-45.17 0l-75.5 75.65V416c0 2.96-.67 5.73-.87 8.64l211.87-212.28c12.47-12.5 12.47-32.77 0-45.26zM191.62 32c0-17.67-14.3-32-31.94-32H31.94C14.3 0 0 14.33 0 32v384c0 53.02 42.9 96 95.81 96s95.81-42.98 95.81-96V32zM95.81 440c-13.23 0-23.95-10.75-23.95-24 0-13.26 10.73-24 23.95-24s23.95 10.74 23.95 24c.01 13.25-10.72 24-23.95 24zm31.94-184H63.88v-64h63.88v64zm0-128H63.88V64h63.88v64z';
exports.definition = {
prefix: prefix,
iconName: iconName,
icon: [
width,
height,
ligatures,
unicode,
svgPathData
]};
exports.faSwatchbook = exports.definition;
exports.prefix = prefix;
exports.iconName = iconName;
exports.width = width;
exports.height = height;
exports.ligatures = ligatures;
exports.unicode = unicode;
exports.svgPathData = svgPathData; | {
"pile_set_name": "Github"
} |
"use strict";
var hasOwnProperty = Object.prototype.hasOwnProperty;
function parseQuery(urlObj, options)
{
urlObj.query.string.full = stringify(urlObj.query.object, false);
// TWEAK :: condition only for speed optimization
if (options.removeEmptyQueries)
{
urlObj.query.string.stripped = stringify(urlObj.query.object, true);
}
}
function stringify(queryObj, removeEmptyQueries)
{
var count = 0;
var str = "";
for (var i in queryObj)
{
if ( i!=="" && hasOwnProperty.call(queryObj, i)===true )
{
var value = queryObj[i];
if (value !== "" || !removeEmptyQueries)
{
str += (++count===1) ? "?" : "&";
i = encodeURIComponent(i);
if (value !== "")
{
str += i +"="+ encodeURIComponent(value).replace(/%20/g,"+");
}
else
{
str += i;
}
}
}
}
return str;
}
module.exports = parseQuery;
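
The stringify helper above has three behaviors worth noting: the first pair gets `?` and later pairs get `&`; an empty value yields a bare key unless removeEmptyQueries is set; and encoded spaces in values become `+`. The same logic can be sketched in Python (names here are illustrative, not part of the module):

```python
from urllib.parse import quote

def stringify(query_obj, remove_empty_queries=False):
    # Mirrors the JS stringify(): '?' before the first pair, '&' before
    # the rest; empty keys are always skipped; empty values are skipped
    # only when remove_empty_queries is set, else the bare key is kept.
    parts = []
    for key, value in query_obj.items():
        if key == "":
            continue
        if value == "" and remove_empty_queries:
            continue
        encoded_key = quote(key, safe="")
        if value != "":
            # Percent-encode, then turn encoded spaces into '+' in values
            encoded_value = quote(value, safe="").replace("%20", "+")
            parts.append(encoded_key + "=" + encoded_value)
        else:
            parts.append(encoded_key)
    return "?" + "&".join(parts) if parts else ""
```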
| {
"pile_set_name": "Github"
} |
package com.eventyay.organizer.core.event.about;
import android.content.res.Resources;
import androidx.core.view.ViewCompat;
import com.google.android.material.appbar.AppBarLayout;
import com.google.android.material.appbar.CollapsingToolbarLayout;
import javax.inject.Inject;
import io.reactivex.Observable;
import io.reactivex.subjects.PublishSubject;
class ToolbarColorChanger {
private AppBarLayout appBarLayout;
private AppBarLayout.OnOffsetChangedListener onOffsetChangedListener;
@Inject
ToolbarColorChanger() {
// Default Implementation
}
void removeChangeListener() {
if (appBarLayout != null && onOffsetChangedListener != null)
appBarLayout.removeOnOffsetChangedListener(onOffsetChangedListener);
}
@SuppressWarnings("PMD.DataflowAnomalyAnalysis")
Observable<Integer> observeColor(AppBarLayout appBar, CollapsingToolbarLayout collapsingToolbar) {
appBarLayout = appBar;
Resources resources = appBar.getResources();
PublishSubject<Boolean> offsetChanger = PublishSubject.create();
onOffsetChangedListener = (appBarLayout, verticalOffset) ->
offsetChanger.onNext((collapsingToolbar.getHeight() + verticalOffset) <
(2 * ViewCompat.getMinimumHeight(collapsingToolbar)));
appBar.addOnOffsetChangedListener(onOffsetChangedListener);
return offsetChanger.distinctUntilChanged()
.map(collapsed -> {
if (collapsed)
return resources.getColor(android.R.color.black);
else
return resources.getColor(android.R.color.white);
});
}
}
| {
"pile_set_name": "Github"
} |
! Copyright (C) 2010 John Benediktsson
! See http://factorcode.org/license.txt for BSD license
USING: arrays ascii assocs combinators combinators.smart fry
http.client io.encodings.ascii io.files io.files.temp kernel
locals math math.ranges math.statistics memoize sequences
sequences.private sorting splitting urls ;
IN: spelling
! http://norvig.com/spell-correct.html
CONSTANT: ALPHABET "abcdefghijklmnopqrstuvwxyz"
: deletes ( word -- edits )
[ length <iota> ] keep '[ _ remove-nth ] map ;
: transposes ( word -- edits )
[ length [1,b) ] keep
'[ dup 1 - _ clone [ exchange-unsafe ] keep ] map ;
: replace1 ( i word -- words )
[ ALPHABET ] 2dip bounds-check
'[ _ _ clone [ set-nth-unsafe ] keep ] { } map-as ;
: replaces ( word -- edits )
[ length <iota> ] keep '[ _ replace1 ] map concat ;
: inserts ( word -- edits )
[ length [0,b] ] keep
'[ CHAR: ? over _ insert-nth replace1 ] map concat ;
: edits1 ( word -- edits )
[
{
[ deletes ]
[ transposes ]
[ replaces ]
[ inserts ]
} cleave
] append-outputs ;
: edits2 ( word -- edits )
edits1 [ edits1 ] map concat ;
: filter-known ( edits dictionary -- words )
'[ _ key? ] filter ;
:: corrections ( word dictionary -- words )
word 1array dictionary filter-known
[ word edits1 dictionary filter-known ] when-empty
[ word edits2 dictionary filter-known ] when-empty
[ dictionary at ] sort-with reverse! ;
: words ( string -- words )
>lower [ letter? not ] split-when harvest ;
: load-dictionary ( file -- assoc )
ascii file-contents words histogram ;
MEMO: default-dictionary ( -- counts )
URL" http://norvig.com/big.txt" "big.txt" temp-file
[ ?download-to ] [ load-dictionary ] bi ;
: (correct) ( word dictionary -- word/f )
corrections ?first ;
: correct ( word -- word/f )
default-dictionary (correct) ;
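
These Factor words port Norvig's spell corrector; for comparison, the edits1 construction above (deletes, transposes, replaces, and inserts over a 26-letter alphabet) reads in Python as — a sketch, not part of this vocabulary:

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    # All strings exactly one edit away from `word`, built from every
    # split of the word into a left and right half (Norvig's construction).
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [l + r[1:] for l, r in splits if r]
    transposes = [l + r[1] + r[0] + r[2:] for l, r in splits if len(r) > 1]
    replaces = [l + c + r[1:] for l, r in splits if r for c in ALPHABET]
    inserts = [l + c + r for l, r in splits for c in ALPHABET]
    return set(deletes + transposes + replaces + inserts)
```

As in the Factor code, candidates are then filtered against a word-frequency dictionary, falling back to edits2 (edits of edits) when no known word is one edit away.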
| {
"pile_set_name": "Github"
} |
package org.testng.eclipse.ui.conversion;
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import org.eclipse.jdt.core.dom.AST;
import org.eclipse.jdt.core.dom.ASTNode;
import org.eclipse.jdt.core.dom.Annotation;
import org.eclipse.jdt.core.dom.ArrayType;
import org.eclipse.jdt.core.dom.Block;
import org.eclipse.jdt.core.dom.CompilationUnit;
import org.eclipse.jdt.core.dom.Expression;
import org.eclipse.jdt.core.dom.ImportDeclaration;
import org.eclipse.jdt.core.dom.MarkerAnnotation;
import org.eclipse.jdt.core.dom.MemberValuePair;
import org.eclipse.jdt.core.dom.MethodDeclaration;
import org.eclipse.jdt.core.dom.MethodInvocation;
import org.eclipse.jdt.core.dom.Modifier;
import org.eclipse.jdt.core.dom.Name;
import org.eclipse.jdt.core.dom.NormalAnnotation;
import org.eclipse.jdt.core.dom.ReturnStatement;
import org.eclipse.jdt.core.dom.SimpleName;
import org.eclipse.jdt.core.dom.SimpleType;
import org.eclipse.jdt.core.dom.SingleMemberAnnotation;
import org.eclipse.jdt.core.dom.SuperConstructorInvocation;
import org.eclipse.jdt.core.dom.TypeDeclaration;
import org.eclipse.jdt.core.dom.TypeLiteral;
import org.eclipse.jdt.core.dom.rewrite.ASTRewrite;
import org.eclipse.jdt.core.dom.rewrite.ListRewrite;
import org.testng.eclipse.TestNGPlugin;
import org.testng.eclipse.util.PreferenceStoreUtil.SuiteMethodTreatment;
/**
* A rewriter that will convert the current JUnit file to TestNG
* using JDK5 annotations
*
 * @author Cédric Beust <[email protected]>
*/
public class AnnotationRewriter implements IRewriteProvider
{
private static final Set<String> IMPORTS_TO_REMOVE = new HashSet<String>() {{
add("junit.framework.Assert");
add("junit.framework.Test");
add("junit.framework.TestCase");
add("junit.framework.TestSuite");
add("org.junit.After");
add("org.junit.AfterClass");
add("org.junit.Before");
add("org.junit.BeforeClass");
add("org.junit.Ignore");
add("org.junit.Test");
add("org.junit.runner.RunWith");
add("org.junit.runners.Parameterized");
}};
private static final Set<String> STATIC_IMPORTS_TO_REMOVE = new HashSet<String>() {{
add("org.junit.Assert");
}};
public ASTRewrite createRewriter(CompilationUnit astRoot, AST ast) {
final ASTRewrite result = ASTRewrite.create(astRoot.getAST());
JUnitVisitor visitor = new JUnitVisitor();
astRoot.accept(visitor);
//
// Remove some JUnit imports.
//
List<ImportDeclaration> oldImports = visitor.getJUnitImports();
for (int i = 0; i < oldImports.size(); i++) {
Name importName = oldImports.get(i).getName();
String fqn = importName.getFullyQualifiedName();
if (IMPORTS_TO_REMOVE.contains(fqn)) {
result.remove(oldImports.get(i), null);
}
for (String s : STATIC_IMPORTS_TO_REMOVE) {
if (fqn.contains(s)) {
result.remove(oldImports.get(i), null);
}
}
}
//
// Add imports as needed
//
maybeAddImport(ast, result, astRoot, visitor.hasAsserts(), "org.testng.AssertJUnit");
maybeAddImport(ast, result, astRoot, visitor.hasFail(), "org.testng.Assert");
maybeAddImport(ast, result, astRoot, !visitor.getBeforeClasses().isEmpty(),
"org.testng.annotations.BeforeClass");
maybeAddImport(ast, result, astRoot, !visitor.getBeforeMethods().isEmpty(),
"org.testng.annotations.BeforeMethod");
maybeAddImport(ast, result, astRoot, visitor.hasTestMethods(), "org.testng.annotations.Test");
maybeAddImport(ast, result, astRoot, !visitor.getAfterMethods().isEmpty(),
"org.testng.annotations.AfterMethod");
maybeAddImport(ast, result, astRoot, !visitor.getAfterClasses().isEmpty(),
"org.testng.annotations.AfterClass");
//
// Add static imports
//
Set<String> staticImports = visitor.getStaticImports();
for (String si : staticImports) {
addImport(ast, result, astRoot, "org.testng.AssertJUnit." + si, true /* static import */);
}
//
// Remove "extends TestCase"
//
SimpleType td = visitor.getTestCase();
if (null != td) {
result.remove(td, null);
}
//
// Add the annotations as needed
//
maybeAddAnnotations(ast, visitor, result, visitor.getTestMethods(), "Test", null, null);
maybeAddAnnotations(ast, visitor, result, visitor.getDisabledTestMethods(), "Test", null,
createDisabledAttribute(ast));
maybeAddAnnotations(ast, visitor, result, visitor.getBeforeMethods(), "BeforeMethod",
"@Before" /* annotation to remove */);
maybeAddAnnotations(ast, visitor, result, visitor.getAfterMethods(), "AfterMethod",
"@After" /* annotation to remove */);
//
// suite() method: remove, comment out or leave untouched, depending on the setting
//
SuiteMethodTreatment smt = TestNGPlugin.getPluginPreferenceStore().getSuiteMethodTreatement();
MethodDeclaration suiteMethod = visitor.getSuite();
if (smt != SuiteMethodTreatment.DONT_TOUCH && suiteMethod != null) {
if (smt == SuiteMethodTreatment.REMOVE) {
// Remove suite()
result.remove(suiteMethod, null);
} else {
// Comment out suite()
TypeDeclaration type = visitor.getType();
ListRewrite lr = result.getListRewrite(type, TypeDeclaration.BODY_DECLARATIONS_PROPERTY);
lr.insertBefore(result.createStringPlaceholder("/*", ASTNode.METHOD_DECLARATION),
suiteMethod, null);
lr.insertAfter(result.createStringPlaceholder("*/", ASTNode.METHOD_DECLARATION),
suiteMethod, null);
}
}
//
// Remove all the nodes that need to be removed
//
for (ASTNode n : visitor.getNodesToRemove()) {
result.remove(n, null);
}
//
// Replace @Ignore with @Test(enabled = false)
//
for (Map.Entry<MethodDeclaration, Annotation> e : visitor.getIgnoredMethods().entrySet()) {
MethodDeclaration md = e.getKey();
Annotation ignored = e.getValue();
// Add the @Test(enabled = false)
NormalAnnotation test = ast.newNormalAnnotation();
test.setTypeName(ast.newName("Test"));
MemberValuePair mvp = ast.newMemberValuePair();
mvp.setName(ast.newSimpleName("enabled"));
mvp.setValue(ast.newBooleanLiteral(false));
test.values().add(mvp);
result.remove(ignored, null);
ListRewrite lr = result.getListRewrite(md, MethodDeclaration.MODIFIERS2_PROPERTY);
lr.insertFirst(test, null);
}
//
// Replace "Assert" with "AssertJUnit", unless the method is already imported statically.
//
Set<MethodInvocation> asserts = visitor.getAsserts();
for (MethodInvocation m : asserts) {
if (! staticImports.contains(m.getName().toString())) {
Expression exp = m.getExpression();
Name name = ast.newName("AssertJUnit");
if (exp != null) {
result.replace(exp, name, null);
} else {
result.set(m, MethodInvocation.EXPRESSION_PROPERTY, name, null);
}
}
}
//
// Replace "fail()" with "Assert.fail()"
//
for (MethodInvocation fail : visitor.getFails()) {
SimpleName exp = ast.newSimpleName("Assert");
result.set(fail, MethodInvocation.EXPRESSION_PROPERTY, exp, null);
}
//
// Replace @Test(expected) with @Test(expectedExceptions)
// and @Test(timeout) with @Test(timeOut)
//
for (Map.Entry<MemberValuePair, String> pair : visitor.getTestsWithExpected().entrySet()) {
result.replace(pair.getKey().getName(), ast.newSimpleName(pair.getValue()), null);
}
//
// Remove super invocation in the constructor
//
SuperConstructorInvocation sci = visitor.getSuperConstructorInvocation();
if (sci != null) {
result.remove(sci, null);
}
//
// Convert @RunWith(Parameterized.class)
//
SingleMemberAnnotation runWith = visitor.getRunWithParameterized();
if (runWith != null) {
// Remove @RunWith
result.remove(runWith, null);
// Add imports
addImport(ast, result, astRoot, "org.testng.ConversionUtils.wrapDataProvider",
true /* static import */);
addImport(ast, result, astRoot, "org.testng.annotations.Factory", false /* not static */);
// Add the factory method
MethodDeclaration parameterMethod = visitor.getParametersMethod();
ListRewrite lr = result.getListRewrite(visitor.getType(),
TypeDeclaration.BODY_DECLARATIONS_PROPERTY);
MethodDeclaration md = ast.newMethodDeclaration();
md.setName(ast.newSimpleName("factory" + capitalize(parameterMethod.getName().toString())));
// Add the "Factory" annotation
MarkerAnnotation factory = ast.newMarkerAnnotation();
factory.setTypeName(ast.newName("Factory"));
md.modifiers().add(factory);
// Make the method public
md.modifiers().addAll(ast.newModifiers(Modifier.PUBLIC | Modifier.STATIC));
ArrayType returnType = ast.newArrayType(ast.newSimpleType(ast.newName("Object")));
md.setReturnType2(returnType);
// Create the method invocation "ConversionUtils.wrapDataProvider(Foo.class, data())"
MethodInvocation mi = ast.newMethodInvocation();
mi.setName(ast.newSimpleName("wrapDataProvider"));
// Add parameters to wrapDataProvider()
// 1) the current class
TypeLiteral tl = ast.newTypeLiteral();
tl.setType(ast.newSimpleType(ast.newSimpleName(visitor.getType().getName().toString())));
mi.arguments().add(tl);
// 2) the call to the @Parameters method
MethodInvocation pmi = ast.newMethodInvocation();
pmi.setName(ast.newSimpleName(parameterMethod.getName().getFullyQualifiedName()));
mi.arguments().add(pmi);
// Create the return statement
ReturnStatement returnStatement = ast.newReturnStatement();
returnStatement.setExpression(mi);
Block block = ast.newBlock();
block.statements().add(returnStatement);
md.setBody(block);
lr.insertFirst(md, null);
}
return result;
}
private String capitalize(String s) {
return s.substring(0, 1).toUpperCase() + s.substring(1);
}
private Map<String, Boolean> createDisabledAttribute(AST ast) {
Map<String, Boolean> result = new HashMap<>();
result.put("enabled", false);
return result;
}
private void maybeAddImport(AST ast, ASTRewrite rewriter, CompilationUnit astRoot, boolean add,
String imp) {
if (add) {
addImport(ast, rewriter, astRoot, imp);
}
}
private void addImport(AST ast, ASTRewrite rewriter, CompilationUnit astRoot, String imp) {
addImport(ast, rewriter, astRoot, imp, false /* non static import */);
}
private void addImport(AST ast, ASTRewrite rewriter, CompilationUnit astRoot, String imp,
boolean isStatic) {
ListRewrite lr = rewriter.getListRewrite(astRoot, CompilationUnit.IMPORTS_PROPERTY);
ImportDeclaration id = ast.newImportDeclaration();
id.setStatic(isStatic);
id.setName(ast.newName(imp));
lr.insertFirst(id, null);
}
/**
* Add the given annotation if the method is non null
*/
private void maybeAddAnnotation(AST ast, JUnitVisitor visitor, ASTRewrite rewriter,
MethodDeclaration method, String annotation, String annotationToRemove,
Map<String, Boolean> attributes)
{
if (method != null) {
addAnnotation(ast, visitor, rewriter, method, createAnnotation(ast, annotation, attributes),
annotationToRemove);
}
}
/**
* @return a NormalAnnotation if the annotation to create has attributes or a
* MarkerAnnotation otherwise.
*/
private Annotation createAnnotation(AST ast, String name, Map<String, Boolean> attributes) {
Annotation result = null;
NormalAnnotation normalAnnotation = null;
if (attributes != null && attributes.size() > 0) {
normalAnnotation = ast.newNormalAnnotation();
result = normalAnnotation;
} else {
result = ast.newMarkerAnnotation();
}
result.setTypeName(ast.newName(name));
if (attributes != null) {
for (Entry<String, Boolean> a : attributes.entrySet()) {
MemberValuePair mvp = ast.newMemberValuePair();
mvp.setName(ast.newSimpleName(a.getKey()));
mvp.setValue(ast.newBooleanLiteral(a.getValue()));
normalAnnotation.values().add(mvp);
}
}
return result;
}
/**
* Add the given annotation if the method is non null
*/
private void maybeAddAnnotations(AST ast, JUnitVisitor visitor, ASTRewrite rewriter,
Collection<MethodDeclaration> methods, String annotation, String annotationToRemove) {
maybeAddAnnotations(ast, visitor, rewriter, methods, annotation, annotationToRemove, null);
}
private void maybeAddAnnotations(AST ast, JUnitVisitor visitor,
ASTRewrite rewriter, Collection<MethodDeclaration> methods, String annotation,
String annotationToRemove, Map<String, Boolean> attributes) {
for (MethodDeclaration method : methods) {
maybeAddAnnotation(ast, visitor, rewriter, method, annotation, annotationToRemove,
attributes);
}
}
private void addAnnotation(AST ast, JUnitVisitor visitor, ASTRewrite rewriter,
MethodDeclaration md, Annotation a, String annotationToRemove)
{
ListRewrite lr = rewriter.getListRewrite(md, MethodDeclaration.MODIFIERS2_PROPERTY);
// Remove the annotation if applicable
if (annotationToRemove != null) {
List modifiers = md.modifiers();
for (int k = 0; k < modifiers.size(); k++) {
Object old = modifiers.get(k);
if (old instanceof Annotation) {
String oldAnnotation = old.toString();
if (oldAnnotation.equals(annotationToRemove) || "@Override".equals(oldAnnotation)) {
lr.remove((Annotation) old, null);
break;
}
}
}
}
// Add the annotation
lr.insertFirst(a, null);
}
public String getName() {
return "Convert to TestNG (Annotations)";
}
}
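
The rewrites performed by this class boil down to a fixed mapping between JUnit and TestNG names, which can be summarized as data (an illustrative summary drawn from the code above, not an exhaustive list):

```python
# JUnit annotation -> TestNG replacement emitted by the rewriter
JUNIT_TO_TESTNG = {
    "org.junit.Test": "org.testng.annotations.Test",
    "org.junit.Before": "org.testng.annotations.BeforeMethod",
    "org.junit.After": "org.testng.annotations.AfterMethod",
    "org.junit.BeforeClass": "org.testng.annotations.BeforeClass",
    "org.junit.AfterClass": "org.testng.annotations.AfterClass",
    # @Ignore has no direct counterpart; the rewriter emits @Test(enabled = false)
    "org.junit.Ignore": "org.testng.annotations.Test(enabled = false)",
}

# @Test attribute renames applied by the rewriter
ATTRIBUTE_RENAMES = {
    "expected": "expectedExceptions",
    "timeout": "timeOut",
}
```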
| {
"pile_set_name": "Github"
} |
/********************************************************************************************
* FrodoKEM: Learning with Errors Key Encapsulation
*
* Abstract: Key Encapsulation Mechanism (KEM) based on Frodo
*********************************************************************************************/
#include <stdint.h>
#include <string.h>
#include "fips202.h"
#include "randombytes.h"
#include "api.h"
#include "common.h"
#include "params.h"
int PQCLEAN_FRODOKEM640AES_CLEAN_crypto_kem_keypair(uint8_t *pk, uint8_t *sk) {
// FrodoKEM's key generation
// Outputs: public key pk ( BYTES_SEED_A + (PARAMS_LOGQ*PARAMS_N*PARAMS_NBAR)/8 bytes)
// secret key sk (CRYPTO_BYTES + BYTES_SEED_A + (PARAMS_LOGQ*PARAMS_N*PARAMS_NBAR)/8 + 2*PARAMS_N*PARAMS_NBAR + BYTES_PKHASH bytes)
uint8_t *pk_seedA = &pk[0];
uint8_t *pk_b = &pk[BYTES_SEED_A];
uint8_t *sk_s = &sk[0];
uint8_t *sk_pk = &sk[CRYPTO_BYTES];
uint8_t *sk_S = &sk[CRYPTO_BYTES + CRYPTO_PUBLICKEYBYTES];
uint8_t *sk_pkh = &sk[CRYPTO_BYTES + CRYPTO_PUBLICKEYBYTES + 2 * PARAMS_N * PARAMS_NBAR];
uint16_t B[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t S[2 * PARAMS_N * PARAMS_NBAR] = {0}; // contains secret data
uint16_t *E = &S[PARAMS_N * PARAMS_NBAR]; // contains secret data
uint8_t randomness[2 * CRYPTO_BYTES + BYTES_SEED_A]; // contains secret data via randomness_s and randomness_seedSE
uint8_t *randomness_s = &randomness[0]; // contains secret data
uint8_t *randomness_seedSE = &randomness[CRYPTO_BYTES]; // contains secret data
uint8_t *randomness_z = &randomness[2 * CRYPTO_BYTES];
uint8_t shake_input_seedSE[1 + CRYPTO_BYTES]; // contains secret data
// Generate the secret value s, the seed for S and E, and the seed for the seed for A. Add seed_A to the public key
randombytes(randomness, CRYPTO_BYTES + CRYPTO_BYTES + BYTES_SEED_A);
shake(pk_seedA, BYTES_SEED_A, randomness_z, BYTES_SEED_A);
// Generate S and E, and compute B = A*S + E. Generate A on-the-fly
shake_input_seedSE[0] = 0x5F;
memcpy(&shake_input_seedSE[1], randomness_seedSE, CRYPTO_BYTES);
shake((uint8_t *)S, 2 * PARAMS_N * PARAMS_NBAR * sizeof(uint16_t), shake_input_seedSE, 1 + CRYPTO_BYTES);
for (size_t i = 0; i < 2 * PARAMS_N * PARAMS_NBAR; i++) {
S[i] = PQCLEAN_FRODOKEM640AES_CLEAN_LE_TO_UINT16(S[i]);
}
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(S, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(E, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_add_as_plus_e(B, S, E, pk);
// Encode the second part of the public key
PQCLEAN_FRODOKEM640AES_CLEAN_pack(pk_b, CRYPTO_PUBLICKEYBYTES - BYTES_SEED_A, B, PARAMS_N * PARAMS_NBAR, PARAMS_LOGQ);
// Add s, pk and S to the secret key
memcpy(sk_s, randomness_s, CRYPTO_BYTES);
memcpy(sk_pk, pk, CRYPTO_PUBLICKEYBYTES);
for (size_t i = 0; i < PARAMS_N * PARAMS_NBAR; i++) {
S[i] = PQCLEAN_FRODOKEM640AES_CLEAN_UINT16_TO_LE(S[i]);
}
memcpy(sk_S, S, 2 * PARAMS_N * PARAMS_NBAR);
// Add H(pk) to the secret key
shake(sk_pkh, BYTES_PKHASH, pk, CRYPTO_PUBLICKEYBYTES);
// Cleanup:
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)S, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)E, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(randomness, 2 * CRYPTO_BYTES);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(shake_input_seedSE, 1 + CRYPTO_BYTES);
return 0;
}
int PQCLEAN_FRODOKEM640AES_CLEAN_crypto_kem_enc(uint8_t *ct, uint8_t *ss, const uint8_t *pk) {
// FrodoKEM's key encapsulation
const uint8_t *pk_seedA = &pk[0];
const uint8_t *pk_b = &pk[BYTES_SEED_A];
uint8_t *ct_c1 = &ct[0];
uint8_t *ct_c2 = &ct[(PARAMS_LOGQ * PARAMS_N * PARAMS_NBAR) / 8];
uint16_t B[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t V[PARAMS_NBAR * PARAMS_NBAR] = {0}; // contains secret data
uint16_t C[PARAMS_NBAR * PARAMS_NBAR] = {0};
uint16_t Bp[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t Sp[(2 * PARAMS_N + PARAMS_NBAR)*PARAMS_NBAR] = {0}; // contains secret data
uint16_t *Ep = &Sp[PARAMS_N * PARAMS_NBAR]; // contains secret data
uint16_t *Epp = &Sp[2 * PARAMS_N * PARAMS_NBAR]; // contains secret data
uint8_t G2in[BYTES_PKHASH + BYTES_MU]; // contains secret data via mu
uint8_t *pkh = &G2in[0];
uint8_t *mu = &G2in[BYTES_PKHASH]; // contains secret data
uint8_t G2out[2 * CRYPTO_BYTES]; // contains secret data
uint8_t *seedSE = &G2out[0]; // contains secret data
uint8_t *k = &G2out[CRYPTO_BYTES]; // contains secret data
uint8_t Fin[CRYPTO_CIPHERTEXTBYTES + CRYPTO_BYTES]; // contains secret data via Fin_k
uint8_t *Fin_ct = &Fin[0];
uint8_t *Fin_k = &Fin[CRYPTO_CIPHERTEXTBYTES]; // contains secret data
uint8_t shake_input_seedSE[1 + CRYPTO_BYTES]; // contains secret data
// pkh <- G_1(pk), generate random mu, compute (seedSE || k) = G_2(pkh || mu)
shake(pkh, BYTES_PKHASH, pk, CRYPTO_PUBLICKEYBYTES);
randombytes(mu, BYTES_MU);
shake(G2out, CRYPTO_BYTES + CRYPTO_BYTES, G2in, BYTES_PKHASH + BYTES_MU);
// Generate Sp and Ep, and compute Bp = Sp*A + Ep. Generate A on-the-fly
shake_input_seedSE[0] = 0x96;
memcpy(&shake_input_seedSE[1], seedSE, CRYPTO_BYTES);
shake((uint8_t *)Sp, (2 * PARAMS_N + PARAMS_NBAR) * PARAMS_NBAR * sizeof(uint16_t), shake_input_seedSE, 1 + CRYPTO_BYTES);
for (size_t i = 0; i < (2 * PARAMS_N + PARAMS_NBAR) * PARAMS_NBAR; i++) {
Sp[i] = PQCLEAN_FRODOKEM640AES_CLEAN_LE_TO_UINT16(Sp[i]);
}
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Sp, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Ep, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_add_sa_plus_e(Bp, Sp, Ep, pk_seedA);
PQCLEAN_FRODOKEM640AES_CLEAN_pack(ct_c1, (PARAMS_LOGQ * PARAMS_N * PARAMS_NBAR) / 8, Bp, PARAMS_N * PARAMS_NBAR, PARAMS_LOGQ);
// Generate Epp, and compute V = Sp*B + Epp
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Epp, PARAMS_NBAR * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_unpack(B, PARAMS_N * PARAMS_NBAR, pk_b, CRYPTO_PUBLICKEYBYTES - BYTES_SEED_A, PARAMS_LOGQ);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_add_sb_plus_e(V, B, Sp, Epp);
// Encode mu, and compute C = V + enc(mu) (mod q)
PQCLEAN_FRODOKEM640AES_CLEAN_key_encode(C, (uint16_t *)mu);
PQCLEAN_FRODOKEM640AES_CLEAN_add(C, V, C);
PQCLEAN_FRODOKEM640AES_CLEAN_pack(ct_c2, (PARAMS_LOGQ * PARAMS_NBAR * PARAMS_NBAR) / 8, C, PARAMS_NBAR * PARAMS_NBAR, PARAMS_LOGQ);
// Compute ss = F(ct||KK)
memcpy(Fin_ct, ct, CRYPTO_CIPHERTEXTBYTES);
memcpy(Fin_k, k, CRYPTO_BYTES);
shake(ss, CRYPTO_BYTES, Fin, CRYPTO_CIPHERTEXTBYTES + CRYPTO_BYTES);
// Cleanup:
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)V, PARAMS_NBAR * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Sp, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Ep, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Epp, PARAMS_NBAR * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(mu, BYTES_MU);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(G2out, 2 * CRYPTO_BYTES);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(Fin_k, CRYPTO_BYTES);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(shake_input_seedSE, 1 + CRYPTO_BYTES);
return 0;
}
int PQCLEAN_FRODOKEM640AES_CLEAN_crypto_kem_dec(uint8_t *ss, const uint8_t *ct, const uint8_t *sk) {
// FrodoKEM's key decapsulation
uint16_t B[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t Bp[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t W[PARAMS_NBAR * PARAMS_NBAR] = {0}; // contains secret data
uint16_t C[PARAMS_NBAR * PARAMS_NBAR] = {0};
uint16_t CC[PARAMS_NBAR * PARAMS_NBAR] = {0};
uint16_t BBp[PARAMS_N * PARAMS_NBAR] = {0};
uint16_t Sp[(2 * PARAMS_N + PARAMS_NBAR)*PARAMS_NBAR] = {0}; // contains secret data
uint16_t *Ep = &Sp[PARAMS_N * PARAMS_NBAR]; // contains secret data
uint16_t *Epp = &Sp[2 * PARAMS_N * PARAMS_NBAR]; // contains secret data
const uint8_t *ct_c1 = &ct[0];
const uint8_t *ct_c2 = &ct[(PARAMS_LOGQ * PARAMS_N * PARAMS_NBAR) / 8];
const uint8_t *sk_s = &sk[0];
const uint8_t *sk_pk = &sk[CRYPTO_BYTES];
const uint8_t *sk_S = &sk[CRYPTO_BYTES + CRYPTO_PUBLICKEYBYTES];
uint16_t S[PARAMS_N * PARAMS_NBAR]; // contains secret data
const uint8_t *sk_pkh = &sk[CRYPTO_BYTES + CRYPTO_PUBLICKEYBYTES + 2 * PARAMS_N * PARAMS_NBAR];
const uint8_t *pk_seedA = &sk_pk[0];
const uint8_t *pk_b = &sk_pk[BYTES_SEED_A];
uint8_t G2in[BYTES_PKHASH + BYTES_MU]; // contains secret data via muprime
uint8_t *pkh = &G2in[0];
uint8_t *muprime = &G2in[BYTES_PKHASH]; // contains secret data
uint8_t G2out[2 * CRYPTO_BYTES]; // contains secret data
uint8_t *seedSEprime = &G2out[0]; // contains secret data
uint8_t *kprime = &G2out[CRYPTO_BYTES]; // contains secret data
uint8_t Fin[CRYPTO_CIPHERTEXTBYTES + CRYPTO_BYTES]; // contains secret data via Fin_k
uint8_t *Fin_ct = &Fin[0];
uint8_t *Fin_k = &Fin[CRYPTO_CIPHERTEXTBYTES]; // contains secret data
uint8_t shake_input_seedSEprime[1 + CRYPTO_BYTES]; // contains secret data
for (size_t i = 0; i < PARAMS_N * PARAMS_NBAR; i++) {
S[i] = sk_S[2 * i] | (sk_S[2 * i + 1] << 8);
}
// Compute W = C - Bp*S (mod q), and decode the randomness mu
PQCLEAN_FRODOKEM640AES_CLEAN_unpack(Bp, PARAMS_N * PARAMS_NBAR, ct_c1, (PARAMS_LOGQ * PARAMS_N * PARAMS_NBAR) / 8, PARAMS_LOGQ);
PQCLEAN_FRODOKEM640AES_CLEAN_unpack(C, PARAMS_NBAR * PARAMS_NBAR, ct_c2, (PARAMS_LOGQ * PARAMS_NBAR * PARAMS_NBAR) / 8, PARAMS_LOGQ);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_bs(W, Bp, S);
PQCLEAN_FRODOKEM640AES_CLEAN_sub(W, C, W);
PQCLEAN_FRODOKEM640AES_CLEAN_key_decode((uint16_t *)muprime, W);
// Generate (seedSE' || k') = G_2(pkh || mu')
memcpy(pkh, sk_pkh, BYTES_PKHASH);
shake(G2out, CRYPTO_BYTES + CRYPTO_BYTES, G2in, BYTES_PKHASH + BYTES_MU);
// Generate Sp and Ep, and compute BBp = Sp*A + Ep. Generate A on-the-fly
shake_input_seedSEprime[0] = 0x96;
memcpy(&shake_input_seedSEprime[1], seedSEprime, CRYPTO_BYTES);
shake((uint8_t *)Sp, (2 * PARAMS_N + PARAMS_NBAR) * PARAMS_NBAR * sizeof(uint16_t), shake_input_seedSEprime, 1 + CRYPTO_BYTES);
for (size_t i = 0; i < (2 * PARAMS_N + PARAMS_NBAR) * PARAMS_NBAR; i++) {
Sp[i] = PQCLEAN_FRODOKEM640AES_CLEAN_LE_TO_UINT16(Sp[i]);
}
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Sp, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Ep, PARAMS_N * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_add_sa_plus_e(BBp, Sp, Ep, pk_seedA);
// Generate Epp, and compute W = Sp*B + Epp
PQCLEAN_FRODOKEM640AES_CLEAN_sample_n(Epp, PARAMS_NBAR * PARAMS_NBAR);
PQCLEAN_FRODOKEM640AES_CLEAN_unpack(B, PARAMS_N * PARAMS_NBAR, pk_b, CRYPTO_PUBLICKEYBYTES - BYTES_SEED_A, PARAMS_LOGQ);
PQCLEAN_FRODOKEM640AES_CLEAN_mul_add_sb_plus_e(W, B, Sp, Epp);
// Encode mu, and compute CC = W + enc(mu') (mod q)
PQCLEAN_FRODOKEM640AES_CLEAN_key_encode(CC, (uint16_t *)muprime);
PQCLEAN_FRODOKEM640AES_CLEAN_add(CC, W, CC);
// Prepare input to F
memcpy(Fin_ct, ct, CRYPTO_CIPHERTEXTBYTES);
// Reducing BBp modulo q
for (size_t i = 0; i < PARAMS_N * PARAMS_NBAR; i++) {
BBp[i] = BBp[i] & ((1 << PARAMS_LOGQ) - 1);
}
// If (Bp == BBp & C == CC) then ss = F(ct || k'), else ss = F(ct || s)
// Needs to avoid branching on secret data as per:
// Qian Guo, Thomas Johansson, Alexander Nilsson. A key-recovery timing attack on post-quantum
// primitives using the Fujisaki-Okamoto transformation and its application on FrodoKEM. In CRYPTO 2020.
int8_t selector = PQCLEAN_FRODOKEM640AES_CLEAN_ct_verify(Bp, BBp, PARAMS_N * PARAMS_NBAR) | PQCLEAN_FRODOKEM640AES_CLEAN_ct_verify(C, CC, PARAMS_NBAR * PARAMS_NBAR);
// If (selector == 0) then load k' to do ss = F(ct || k'), else if (selector == -1) load s to do ss = F(ct || s)
PQCLEAN_FRODOKEM640AES_CLEAN_ct_select((uint8_t *)Fin_k, (uint8_t *)kprime, (uint8_t *)sk_s, CRYPTO_BYTES, selector);
shake(ss, CRYPTO_BYTES, Fin, CRYPTO_CIPHERTEXTBYTES + CRYPTO_BYTES);
// Cleanup:
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)W, PARAMS_NBAR * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Sp, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)S, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Ep, PARAMS_N * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes((uint8_t *)Epp, PARAMS_NBAR * PARAMS_NBAR * sizeof(uint16_t));
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(muprime, BYTES_MU);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(G2out, 2 * CRYPTO_BYTES);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(Fin_k, CRYPTO_BYTES);
PQCLEAN_FRODOKEM640AES_CLEAN_clear_bytes(shake_input_seedSEprime, 1 + CRYPTO_BYTES);
return 0;
}
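The branch-free selection above depends on constant-time verify/select helpers so that no branch is taken on secret data. A minimal sketch of how such primitives are typically implemented is below; the function names and exact formulas here are illustrative assumptions, not PQClean's actual `PQCLEAN_FRODOKEM640AES_CLEAN_ct_verify`/`ct_select` definitions, which live elsewhere in the library.

```c
#include <stddef.h>
#include <stdint.h>

/* Returns 0 if the two arrays are equal, -1 otherwise, without
 * branching on the (secret) comparison result. */
static int8_t ct_verify(const uint16_t *a, const uint16_t *b, size_t len) {
    uint16_t r = 0;
    for (size_t i = 0; i < len; i++) {
        r |= a[i] ^ b[i];               /* accumulate any difference */
    }
    /* Map r != 0 to 1 and r == 0 to 0 in constant time, then negate. */
    uint64_t t = (-(uint64_t)r) >> 63;
    return (int8_t)(-(int64_t)t);       /* 0 if equal, -1 if different */
}

/* out = (selector == 0) ? a : b, byte by byte, without branching.
 * 'selector' must be 0 or -1, as produced by ct_verify above. */
static void ct_select(uint8_t *out, const uint8_t *a, const uint8_t *b,
                      size_t len, int8_t selector) {
    uint8_t mask = (uint8_t)selector;   /* 0x00 or 0xFF */
    for (size_t i = 0; i < len; i++) {
        out[i] = (uint8_t)((a[i] & ~mask) | (b[i] & mask));
    }
}
```

With these semantics, `ct_select(Fin_k, kprime, sk_s, CRYPTO_BYTES, selector)` copies `kprime` when the ciphertext re-encryption matched (`selector == 0`) and the secret `s` otherwise, with an identical memory-access pattern in both cases.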
| {
"pile_set_name": "Github"
} |
.oc-dialog-content p {
padding : 10px;
width : 250px;
}
.oc-dialog-content p .ui-icon.ui-icon-notice {
display : none;
}
.oc-dialog-buttonrow.onebutton,
.oc-dialog-buttonrow.twobuttons {
button.primary {
border-color: rgba(0,0,0,0);
}
}
/*
Copyright (c) 2016 The Polymer Project Authors. All rights reserved.
This code may only be used under the BSD style license found at http://polymer.github.io/LICENSE.txt
The complete set of authors may be found at http://polymer.github.io/AUTHORS.txt
The complete set of contributors may be found at http://polymer.github.io/CONTRIBUTORS.txt
Code distributed by Google as part of the polymer project is also
subject to an additional IP rights grant found at http://polymer.github.io/PATENTS.txt
*/
$font-size-small: 13px;
$font-size-medium: 18px;
$small-padding: 20px;
$medium-padding: 40px;
$large-padding: 80px;
$mobile-media-breakpoint: 767px;
$pink-a400: #f50057;
$blue-600: #1e88e5;
$blue-800: #1565c0;
$blue-900: #0d47a1;
$blue-grey-900: #263238;
$blue-grey-50: #eceff1;
* {
box-sizing: border-box;
}
body {
margin: 0;
font-size: 16px;
font-family: Helvetica, Arial, sans-serif;
line-height: 24px;
font-weight: 300;
}
pw-shell {
display: block;
margin-top: 64px;
/* Set up so the footer is always at the bottom of the screen, regardless
* of the content size */
min-height: 100vh;
}
/* For nonversioned pages, the server needs to send both v1 and v2 links for
* the mobile drawer, since it doesn't know what version was previously
* selected, but the client should hide the irrelevant links. */
pw-shell[active-docs-version="1.0"] a[data-version="2.0"],
pw-shell[active-docs-version="1.0"] a[data-version="3.0"],
pw-shell[active-docs-version="2.0"] a[data-version="1.0"],
pw-shell[active-docs-version="2.0"] a[data-version="3.0"],
pw-shell[active-docs-version="3.0"] a[data-version="1.0"],
pw-shell[active-docs-version="3.0"] a[data-version="2.0"] {
display: none;
}
a {
color: $blue-600;
text-decoration: none;
font-weight: 500;
&:hover {
text-decoration: underline;
}
}
/* Don't overflow outside of the content area. */
img {
max-width: 100%;
}
section {
padding: $large-padding;
box-sizing: border-box;
@media (max-width: $mobile-media-breakpoint) {
padding: $medium-padding;
}
@media (max-width: 500px) {
padding-left: $small-padding;
padding-right: $small-padding;
}
&.short {
padding-top: $medium-padding;
padding-bottom: $medium-padding;
@media (max-width: $mobile-media-breakpoint) {
padding-top: $small-padding;
padding-bottom: $small-padding;
}
}
&.gradient-dark, &.gradient-light {
background-repeat: repeat-x;
background-position: bottom;
/* Use smaller gradient triangles */
background-size: 20px;
&.short {
background-size: 12px;
}
}
&.gradient-dark {
background-image: url("/images/textures/section-gradient-dark.png");
}
&.gradient-light {
background-image: url("/images/textures/section-gradient-light.png");
}
&.gradient-light-flipped {
background-image: url("/images/textures/section-gradient-light-flipped.png");
background-repeat: repeat-x;
background-position: top;
/* Use smaller gradient triangles */
background-size: 20px;
}
&.light {
background-color: white;
color: black;
}
&.dark {
background-color: $blue-grey-900;
color: white;
}
&.grey {
background-color: $blue-grey-50;
color: black;
}
&.pink {
background-color: $pink-a400;
}
&.blue {
background-color: $blue-600;
color: white;
a {
color: white;
}
.blue-button {
color: $blue-600;
background: white;
}
h3 {
margin-bottom: 5px;
}
}
& > * {
max-width: 900px;
margin: 0 auto;
}
}
.section-columns {
display: -ms-flexbox;
display: -webkit-flex;
display: flex;
-ms-flex-direction: row;
-webkit-flex-direction: row;
flex-direction: row;
@media (max-width: $mobile-media-breakpoint) {
display: block;
/* Switch to a vertical layout */
-ms-flex-direction: column;
-webkit-flex-direction: column;
flex-direction: column;
}
& > div:not(:last-child),
& > p:not(:last-child),
& > h2:not(:last-child) {
margin-right: $medium-padding;
overflow: hidden;
@media (max-width: $mobile-media-breakpoint) {
margin-right: 0px;
margin-bottom: $medium-padding;
}
}
}
h2 {
font-family: 'Roboto Slab', 'Roboto', 'Noto', sans-serif;
font-size: 24px;
font-weight: 300;
line-height: 38px;
margin: 10px 0;
@media (max-width: $mobile-media-breakpoint) {
font-size: 22px;
}
&.underline {
border-bottom: 1px solid $pink-a400;
text-overflow: ellipsis;
white-space: nowrap;
overflow: hidden;
}
}
h1 {
font-family: 'Roboto Slab', 'Roboto', 'Noto', sans-serif;
line-height: 38px;
margin: 10px 0;
font-size: 32px;
font-weight: 400;
@media (max-width: $mobile-media-breakpoint) {
font-size: 22px;
}
}
h3 {
@media (max-width: $mobile-media-breakpoint) {
font-size: $font-size-medium;
}
}
.blue-button {
color: white;
background: $blue-600;
text-transform: uppercase;
text-decoration: none;
font-size: $font-size-small;
font-weight: 400;
padding: 3px $small-padding;
margin-bottom: 5px;
border-radius: 2px;
text-align: center;
display: inline-block;
&:hover, &:focus {
text-decoration: none;
outline-color: $pink-a400;
}
&.vertical {
margin-bottom: $small-padding;
padding: 3px $medium-padding;
&:first-child {
margin-top: $small-padding;
}
}
&.full-width {
width: 100%;
}
&.flat {
color: $blue-600;
background: transparent;
}
}
.subsection:not(:last-of-type) {
border-bottom: 1px solid $pink-a400;
padding-bottom: $medium-padding;
margin-bottom: $medium-padding;
}
// TOC and deep links ----
.permalink {
display: none;
margin-left: 5px;
vertical-align: top;
}
.has-permalink:hover .permalink {
display: initial;
}
.no-permalink .permalink {
display: none !important;
}
details {
font-size: 16px;
font-weight: 500;
line-height: 24px;
cursor: pointer;
summary {
outline: none;
}
}
iron-doc-viewer, iron-doc-element, iron-doc-mixin, iron-doc-module, iron-doc-namespace, iron-doc-class, iron-doc-behavior {
padding: 20px 40px;
max-width: 800px;
margin: 0;
@media (max-width: $mobile-media-breakpoint) {
padding: 20px;
}
section {
padding: 0;
}
}
// -----
M 9270 M.map
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`elements/content-sidebar/SidebarVersions should not render the versions when file is a box note 1`] = `""`;
exports[`elements/content-sidebar/SidebarVersions should not render the versions when version_number <= 1 1`] = `""`;
exports[`elements/content-sidebar/SidebarVersions should not render the versions when version_number is falsy 1`] = `""`;
exports[`elements/content-sidebar/SidebarVersions should not render the versions when version_number is undefined 1`] = `""`;
exports[`elements/content-sidebar/SidebarVersions should render the versions when version_number > 1 1`] = `
<VersionHistoryLink
className="bcs-SidebarVersions"
data-resin-target="versionhistory"
data-testid="versionhistory"
versionCount={2}
/>
`;
//
// Copyright (c) 2008, 2013, Oracle and/or its affiliates. All rights reserved.
// DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
//
// This code is free software; you can redistribute it and/or modify it
// under the terms of the GNU General Public License version 2 only, as
// published by the Free Software Foundation.
//
// This code is distributed in the hope that it will be useful, but WITHOUT
// ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
// FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
// version 2 for more details (a copy is included in the LICENSE file that
// accompanied this code).
//
// You should have received a copy of the GNU General Public License version
// 2 along with this work; if not, write to the Free Software Foundation,
// Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
//
// Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
// or visit www.oracle.com if you need additional information or have any
// questions.
//
// ARM Architecture Description File
//----------REGISTER DEFINITION BLOCK------------------------------------------
// This information is used by the matcher and the register allocator to
// describe individual registers and classes of registers within the target
// architecture.
register %{
//----------Architecture Description Register Definitions----------------------
// General Registers
// "reg_def" name ( register save type, C convention save type,
// ideal register type, encoding, vm name );
// Register Save Types:
//
// NS = No-Save: The register allocator assumes that these registers
// can be used without saving upon entry to the method, &
// that they do not need to be saved at call sites.
//
// SOC = Save-On-Call: The register allocator assumes that these registers
// can be used without saving upon entry to the method,
// but that they must be saved at call sites.
//
// SOE = Save-On-Entry: The register allocator assumes that these registers
// must be saved before using them upon entry to the
// method, but they do not need to be saved at call
// sites.
//
// AS = Always-Save: The register allocator assumes that these registers
// must be saved before using them upon entry to the
// method, & that they must be saved at call sites.
//
// Ideal Register Type is used to determine how to save & restore a
// register. Op_RegI will get spilled with LoadI/StoreI, Op_RegP will get
// spilled with LoadP/StoreP. If the register supports both, use Op_RegI.
//
// The encoding number is the actual bit-pattern placed into the opcodes.
// ----------------------------
// Integer/Long Registers
// ----------------------------
reg_def R_R0 (SOC, SOC, Op_RegI, 0, R(0)->as_VMReg());
reg_def R_R1 (SOC, SOC, Op_RegI, 1, R(1)->as_VMReg());
reg_def R_R2 (SOC, SOC, Op_RegI, 2, R(2)->as_VMReg());
reg_def R_R3 (SOC, SOC, Op_RegI, 3, R(3)->as_VMReg());
reg_def R_R4 (SOC, SOE, Op_RegI, 4, R(4)->as_VMReg());
reg_def R_R5 (SOC, SOE, Op_RegI, 5, R(5)->as_VMReg());
reg_def R_R6 (SOC, SOE, Op_RegI, 6, R(6)->as_VMReg());
reg_def R_R7 (SOC, SOE, Op_RegI, 7, R(7)->as_VMReg());
reg_def R_R8 (SOC, SOE, Op_RegI, 8, R(8)->as_VMReg());
reg_def R_R9 (SOC, SOE, Op_RegI, 9, R(9)->as_VMReg());
reg_def R_R10(NS, SOE, Op_RegI, 10, R(10)->as_VMReg());
reg_def R_R11(NS, SOE, Op_RegI, 11, R(11)->as_VMReg());
reg_def R_R12(SOC, SOC, Op_RegI, 12, R(12)->as_VMReg());
reg_def R_R13(NS, NS, Op_RegI, 13, R(13)->as_VMReg());
reg_def R_R14(SOC, SOC, Op_RegI, 14, R(14)->as_VMReg());
reg_def R_R15(NS, NS, Op_RegI, 15, R(15)->as_VMReg());
// ----------------------------
// Float/Double Registers
// ----------------------------
// Float Registers
reg_def R_S0 ( SOC, SOC, Op_RegF, 0, S0->as_VMReg());
reg_def R_S1 ( SOC, SOC, Op_RegF, 1, S1_reg->as_VMReg());
reg_def R_S2 ( SOC, SOC, Op_RegF, 2, S2_reg->as_VMReg());
reg_def R_S3 ( SOC, SOC, Op_RegF, 3, S3_reg->as_VMReg());
reg_def R_S4 ( SOC, SOC, Op_RegF, 4, S4_reg->as_VMReg());
reg_def R_S5 ( SOC, SOC, Op_RegF, 5, S5_reg->as_VMReg());
reg_def R_S6 ( SOC, SOC, Op_RegF, 6, S6_reg->as_VMReg());
reg_def R_S7 ( SOC, SOC, Op_RegF, 7, S7->as_VMReg());
reg_def R_S8 ( SOC, SOC, Op_RegF, 8, S8->as_VMReg());
reg_def R_S9 ( SOC, SOC, Op_RegF, 9, S9->as_VMReg());
reg_def R_S10( SOC, SOC, Op_RegF, 10,S10->as_VMReg());
reg_def R_S11( SOC, SOC, Op_RegF, 11,S11->as_VMReg());
reg_def R_S12( SOC, SOC, Op_RegF, 12,S12->as_VMReg());
reg_def R_S13( SOC, SOC, Op_RegF, 13,S13->as_VMReg());
reg_def R_S14( SOC, SOC, Op_RegF, 14,S14->as_VMReg());
reg_def R_S15( SOC, SOC, Op_RegF, 15,S15->as_VMReg());
reg_def R_S16( SOC, SOE, Op_RegF, 16,S16->as_VMReg());
reg_def R_S17( SOC, SOE, Op_RegF, 17,S17->as_VMReg());
reg_def R_S18( SOC, SOE, Op_RegF, 18,S18->as_VMReg());
reg_def R_S19( SOC, SOE, Op_RegF, 19,S19->as_VMReg());
reg_def R_S20( SOC, SOE, Op_RegF, 20,S20->as_VMReg());
reg_def R_S21( SOC, SOE, Op_RegF, 21,S21->as_VMReg());
reg_def R_S22( SOC, SOE, Op_RegF, 22,S22->as_VMReg());
reg_def R_S23( SOC, SOE, Op_RegF, 23,S23->as_VMReg());
reg_def R_S24( SOC, SOE, Op_RegF, 24,S24->as_VMReg());
reg_def R_S25( SOC, SOE, Op_RegF, 25,S25->as_VMReg());
reg_def R_S26( SOC, SOE, Op_RegF, 26,S26->as_VMReg());
reg_def R_S27( SOC, SOE, Op_RegF, 27,S27->as_VMReg());
reg_def R_S28( SOC, SOE, Op_RegF, 28,S28->as_VMReg());
reg_def R_S29( SOC, SOE, Op_RegF, 29,S29->as_VMReg());
reg_def R_S30( SOC, SOE, Op_RegF, 30,S30->as_VMReg());
reg_def R_S31( SOC, SOE, Op_RegF, 31,S31->as_VMReg());
// Double Registers
// The rules of ADL require that double registers be defined in pairs.
// Each pair must be two 32-bit values, but not necessarily a pair of
// single float registers. In each pair, ADLC-assigned register numbers
// must be adjacent, with the lower number even. Finally, when the
// CPU stores such a register pair to memory, the word associated with
// the lower ADLC-assigned number must be stored to the lower address.
reg_def R_D16 (SOC, SOC, Op_RegD, 32, D16->as_VMReg());
reg_def R_D16x(SOC, SOC, Op_RegD,255, D16->as_VMReg()->next());
reg_def R_D17 (SOC, SOC, Op_RegD, 34, D17->as_VMReg());
reg_def R_D17x(SOC, SOC, Op_RegD,255, D17->as_VMReg()->next());
reg_def R_D18 (SOC, SOC, Op_RegD, 36, D18->as_VMReg());
reg_def R_D18x(SOC, SOC, Op_RegD,255, D18->as_VMReg()->next());
reg_def R_D19 (SOC, SOC, Op_RegD, 38, D19->as_VMReg());
reg_def R_D19x(SOC, SOC, Op_RegD,255, D19->as_VMReg()->next());
reg_def R_D20 (SOC, SOC, Op_RegD, 40, D20->as_VMReg());
reg_def R_D20x(SOC, SOC, Op_RegD,255, D20->as_VMReg()->next());
reg_def R_D21 (SOC, SOC, Op_RegD, 42, D21->as_VMReg());
reg_def R_D21x(SOC, SOC, Op_RegD,255, D21->as_VMReg()->next());
reg_def R_D22 (SOC, SOC, Op_RegD, 44, D22->as_VMReg());
reg_def R_D22x(SOC, SOC, Op_RegD,255, D22->as_VMReg()->next());
reg_def R_D23 (SOC, SOC, Op_RegD, 46, D23->as_VMReg());
reg_def R_D23x(SOC, SOC, Op_RegD,255, D23->as_VMReg()->next());
reg_def R_D24 (SOC, SOC, Op_RegD, 48, D24->as_VMReg());
reg_def R_D24x(SOC, SOC, Op_RegD,255, D24->as_VMReg()->next());
reg_def R_D25 (SOC, SOC, Op_RegD, 50, D25->as_VMReg());
reg_def R_D25x(SOC, SOC, Op_RegD,255, D25->as_VMReg()->next());
reg_def R_D26 (SOC, SOC, Op_RegD, 52, D26->as_VMReg());
reg_def R_D26x(SOC, SOC, Op_RegD,255, D26->as_VMReg()->next());
reg_def R_D27 (SOC, SOC, Op_RegD, 54, D27->as_VMReg());
reg_def R_D27x(SOC, SOC, Op_RegD,255, D27->as_VMReg()->next());
reg_def R_D28 (SOC, SOC, Op_RegD, 56, D28->as_VMReg());
reg_def R_D28x(SOC, SOC, Op_RegD,255, D28->as_VMReg()->next());
reg_def R_D29 (SOC, SOC, Op_RegD, 58, D29->as_VMReg());
reg_def R_D29x(SOC, SOC, Op_RegD,255, D29->as_VMReg()->next());
reg_def R_D30 (SOC, SOC, Op_RegD, 60, D30->as_VMReg());
reg_def R_D30x(SOC, SOC, Op_RegD,255, D30->as_VMReg()->next());
reg_def R_D31 (SOC, SOC, Op_RegD, 62, D31->as_VMReg());
reg_def R_D31x(SOC, SOC, Op_RegD,255, D31->as_VMReg()->next());
// ----------------------------
// Special Registers
// Condition Codes Flag Registers
reg_def APSR (SOC, SOC, Op_RegFlags, 0, VMRegImpl::Bad());
reg_def FPSCR(SOC, SOC, Op_RegFlags, 0, VMRegImpl::Bad());
// ----------------------------
// Specify the enum values for the registers. These enums are only used by the
// OptoReg "class". We can convert these enum values at will to VMReg when needed
// for visibility to the rest of the vm. The order of this enum influences the
// register allocator so having the freedom to set this order and not be stuck
// with the order that is natural for the rest of the vm is worth it.
// Registers are listed in this order so that R11/R12 form an aligned pair that can be used for longs
alloc_class chunk0(
R_R4, R_R5, R_R6, R_R7, R_R8, R_R9, R_R11, R_R12, R_R10, R_R13, R_R14, R_R15, R_R0, R_R1, R_R2, R_R3);
// Note that a register is not allocatable unless it is also mentioned
// in a widely-used reg_class below.
alloc_class chunk1(
R_S16, R_S17, R_S18, R_S19, R_S20, R_S21, R_S22, R_S23,
R_S24, R_S25, R_S26, R_S27, R_S28, R_S29, R_S30, R_S31,
R_S0, R_S1, R_S2, R_S3, R_S4, R_S5, R_S6, R_S7,
R_S8, R_S9, R_S10, R_S11, R_S12, R_S13, R_S14, R_S15,
R_D16, R_D16x,R_D17, R_D17x,R_D18, R_D18x,R_D19, R_D19x,
R_D20, R_D20x,R_D21, R_D21x,R_D22, R_D22x,R_D23, R_D23x,
R_D24, R_D24x,R_D25, R_D25x,R_D26, R_D26x,R_D27, R_D27x,
R_D28, R_D28x,R_D29, R_D29x,R_D30, R_D30x,R_D31, R_D31x
);
alloc_class chunk2(APSR, FPSCR);
//----------Architecture Description Register Classes--------------------------
// Several register classes are automatically defined based upon information in
// this architecture description.
// 1) reg_class inline_cache_reg ( as defined in frame section )
// 2) reg_class interpreter_method_oop_reg ( as defined in frame section )
// 3) reg_class stack_slots( /* one chunk of stack-based "registers" */ )
//
// ----------------------------
// Integer Register Classes
// ----------------------------
// Exclusions from i_reg:
// SP (R13), PC (R15)
// R10: reserved by HotSpot to the TLS register (invariant within Java)
reg_class int_reg(R_R0, R_R1, R_R2, R_R3, R_R4, R_R5, R_R6, R_R7, R_R8, R_R9, R_R11, R_R12, R_R14);
reg_class R0_regI(R_R0);
reg_class R1_regI(R_R1);
reg_class R2_regI(R_R2);
reg_class R3_regI(R_R3);
reg_class R12_regI(R_R12);
// ----------------------------
// Pointer Register Classes
// ----------------------------
reg_class ptr_reg(R_R0, R_R1, R_R2, R_R3, R_R4, R_R5, R_R6, R_R7, R_R8, R_R9, R_R11, R_R12, R_R14);
// Special class for storeP instructions, which can store SP or RPC to TLS.
// It is also used for memory addressing, allowing direct TLS addressing.
reg_class sp_ptr_reg(R_R0, R_R1, R_R2, R_R3, R_R4, R_R5, R_R6, R_R7, R_R8, R_R9, R_R11, R_R12, R_R14, R_R10 /* TLS*/, R_R13 /* SP*/);
#define R_Ricklass R_R8
#define R_Rmethod R_R9
#define R_Rthread R_R10
#define R_Rexception_obj R_R4
// Other special pointer regs
reg_class R0_regP(R_R0);
reg_class R1_regP(R_R1);
reg_class R2_regP(R_R2);
reg_class R4_regP(R_R4);
reg_class Rexception_regP(R_Rexception_obj);
reg_class Ricklass_regP(R_Ricklass);
reg_class Rmethod_regP(R_Rmethod);
reg_class Rthread_regP(R_Rthread);
reg_class IP_regP(R_R12);
reg_class LR_regP(R_R14);
reg_class FP_regP(R_R11);
// ----------------------------
// Long Register Classes
// ----------------------------
reg_class long_reg ( R_R0,R_R1, R_R2,R_R3, R_R4,R_R5, R_R6,R_R7, R_R8,R_R9, R_R11,R_R12);
// for ldrexd, strexd: first reg of pair must be even
reg_class long_reg_align ( R_R0,R_R1, R_R2,R_R3, R_R4,R_R5, R_R6,R_R7, R_R8,R_R9);
reg_class R0R1_regL(R_R0,R_R1);
reg_class R2R3_regL(R_R2,R_R3);
// ----------------------------
// Special Class for Condition Code Flags Register
reg_class int_flags(APSR);
reg_class float_flags(FPSCR);
// ----------------------------
// Float Point Register Classes
// ----------------------------
// Skip S14/S15, they are reserved for mem-mem copies
reg_class sflt_reg(R_S0, R_S1, R_S2, R_S3, R_S4, R_S5, R_S6, R_S7, R_S8, R_S9, R_S10, R_S11, R_S12, R_S13,
R_S16, R_S17, R_S18, R_S19, R_S20, R_S21, R_S22, R_S23, R_S24, R_S25, R_S26, R_S27, R_S28, R_S29, R_S30, R_S31);
// Paired floating point registers--they show up in the same order as the floats,
// but they are used with the "Op_RegD" type, and always occur in even/odd pairs.
reg_class dflt_reg(R_S0,R_S1, R_S2,R_S3, R_S4,R_S5, R_S6,R_S7, R_S8,R_S9, R_S10,R_S11, R_S12,R_S13,
R_S16,R_S17, R_S18,R_S19, R_S20,R_S21, R_S22,R_S23, R_S24,R_S25, R_S26,R_S27, R_S28,R_S29, R_S30,R_S31,
R_D16,R_D16x, R_D17,R_D17x, R_D18,R_D18x, R_D19,R_D19x, R_D20,R_D20x, R_D21,R_D21x, R_D22,R_D22x,
R_D23,R_D23x, R_D24,R_D24x, R_D25,R_D25x, R_D26,R_D26x, R_D27,R_D27x, R_D28,R_D28x, R_D29,R_D29x,
R_D30,R_D30x, R_D31,R_D31x);
reg_class dflt_low_reg(R_S0,R_S1, R_S2,R_S3, R_S4,R_S5, R_S6,R_S7, R_S8,R_S9, R_S10,R_S11, R_S12,R_S13,
R_S16,R_S17, R_S18,R_S19, R_S20,R_S21, R_S22,R_S23, R_S24,R_S25, R_S26,R_S27, R_S28,R_S29, R_S30,R_S31);
reg_class actual_dflt_reg %{
if (VM_Version::has_vfp3_32()) {
return DFLT_REG_mask();
} else {
return DFLT_LOW_REG_mask();
}
%}
reg_class S0_regF(R_S0);
reg_class D0_regD(R_S0,R_S1);
reg_class D1_regD(R_S2,R_S3);
reg_class D2_regD(R_S4,R_S5);
reg_class D3_regD(R_S6,R_S7);
reg_class D4_regD(R_S8,R_S9);
reg_class D5_regD(R_S10,R_S11);
reg_class D6_regD(R_S12,R_S13);
reg_class D7_regD(R_S14,R_S15);
reg_class D16_regD(R_D16,R_D16x);
reg_class D17_regD(R_D17,R_D17x);
reg_class D18_regD(R_D18,R_D18x);
reg_class D19_regD(R_D19,R_D19x);
reg_class D20_regD(R_D20,R_D20x);
reg_class D21_regD(R_D21,R_D21x);
reg_class D22_regD(R_D22,R_D22x);
reg_class D23_regD(R_D23,R_D23x);
reg_class D24_regD(R_D24,R_D24x);
reg_class D25_regD(R_D25,R_D25x);
reg_class D26_regD(R_D26,R_D26x);
reg_class D27_regD(R_D27,R_D27x);
reg_class D28_regD(R_D28,R_D28x);
reg_class D29_regD(R_D29,R_D29x);
reg_class D30_regD(R_D30,R_D30x);
reg_class D31_regD(R_D31,R_D31x);
reg_class vectorx_reg(R_S0,R_S1,R_S2,R_S3, R_S4,R_S5,R_S6,R_S7,
R_S8,R_S9,R_S10,R_S11, /* skip S14/S15 */
R_S16,R_S17,R_S18,R_S19, R_S20,R_S21,R_S22,R_S23,
R_S24,R_S25,R_S26,R_S27, R_S28,R_S29,R_S30,R_S31,
R_D16,R_D16x,R_D17,R_D17x, R_D18,R_D18x,R_D19,R_D19x,
R_D20,R_D20x,R_D21,R_D21x, R_D22,R_D22x,R_D23,R_D23x,
R_D24,R_D24x,R_D25,R_D25x, R_D26,R_D26x,R_D27,R_D27x,
R_D28,R_D28x,R_D29,R_D29x, R_D30,R_D30x,R_D31,R_D31x);
%}
source_hpp %{
// FIXME
const MachRegisterNumbers R_mem_copy_lo_num = R_S14_num;
const MachRegisterNumbers R_mem_copy_hi_num = R_S15_num;
const FloatRegister Rmemcopy = S14;
const MachRegisterNumbers R_hf_ret_lo_num = R_S0_num;
const MachRegisterNumbers R_hf_ret_hi_num = R_S1_num;
const MachRegisterNumbers R_Ricklass_num = R_R8_num;
const MachRegisterNumbers R_Rmethod_num = R_R9_num;
#define LDR_DOUBLE "FLDD"
#define LDR_FLOAT "FLDS"
#define STR_DOUBLE "FSTD"
#define STR_FLOAT "FSTS"
#define LDR_64 "LDRD"
#define STR_64 "STRD"
#define LDR_32 "LDR"
#define STR_32 "STR"
#define MOV_DOUBLE "FCPYD"
#define MOV_FLOAT "FCPYS"
#define FMSR "FMSR"
#define FMRS "FMRS"
#define LDREX "ldrex "
#define STREX "strex "
#define str_64 strd
#define ldr_64 ldrd
#define ldr_32 ldr
#define ldrex ldrex
#define strex strex
static inline bool is_memoryD(int offset) {
return offset < 1024 && offset > -1024;
}
static inline bool is_memoryfp(int offset) {
return offset < 1024 && offset > -1024;
}
static inline bool is_memoryI(int offset) {
return offset < 4096 && offset > -4096;
}
static inline bool is_memoryP(int offset) {
return offset < 4096 && offset > -4096;
}
static inline bool is_memoryHD(int offset) {
return offset < 256 && offset > -256;
}
static inline bool is_aimm(int imm) {
return AsmOperand::is_rotated_imm(imm);
}
static inline bool is_limmI(jint imm) {
return AsmOperand::is_rotated_imm(imm);
}
static inline bool is_limmI_low(jint imm, int n) {
int imml = imm & right_n_bits(n);
return is_limmI(imml) || is_limmI(imm);
}
static inline int limmI_low(jint imm, int n) {
int imml = imm & right_n_bits(n);
return is_limmI(imml) ? imml : imm;
}
%}
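The `is_aimm`/`is_limmI` predicates above delegate to `AsmOperand::is_rotated_imm`, which tests ARM's A32 data-processing immediate encoding: an 8-bit value rotated right by an even amount (0, 2, ..., 30). A standalone sketch of that test follows; this is an assumed reimplementation for illustration, not HotSpot's actual assembler code.

```c
#include <stdbool.h>
#include <stdint.h>

/* A32 data-processing immediates are an 8-bit constant rotated right by
 * an even amount.  Check whether 'imm' is encodable that way by trying
 * every even rotation and seeing if the un-rotated value fits in 8 bits. */
static bool is_rotated_imm(uint32_t imm) {
    for (int rot = 0; rot < 32; rot += 2) {
        /* Rotating left by 'rot' undoes a rotate-right by 'rot'.
         * Guard rot == 0: a shift by 32 is undefined behavior in C. */
        uint32_t v = (imm << rot) | (rot ? (imm >> (32 - rot)) : 0);
        if (v <= 0xFFu) {
            return true;    /* fits in 8 bits after un-rotating */
        }
    }
    return false;
}
```

This is why, e.g., `0xFF000000` is a valid immediate (0xFF rotated right by 8) while `0x101` is not: its two set bits can never be brought into a single byte by an even rotation. The `immIRot`/`immIRotn`/`immIRotneg` operands below use the same test on the constant, its bitwise complement, and its negation respectively.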
source %{
// Given a register encoding, produce a Integer Register object
static Register reg_to_register_object(int register_encoding) {
assert(R0->encoding() == R_R0_enc && R15->encoding() == R_R15_enc, "right coding");
return as_Register(register_encoding);
}
// Given a register encoding, produce a single-precision Float Register object
static FloatRegister reg_to_FloatRegister_object(int register_encoding) {
assert(S0->encoding() == R_S0_enc && S31->encoding() == R_S31_enc, "right coding");
return as_FloatRegister(register_encoding);
}
void Compile::pd_compiler2_init() {
  // Unimplemented
}
// Location of compiled Java return values. Same as C
OptoRegPair c2::return_value(int ideal_reg) {
assert( ideal_reg >= Op_RegI && ideal_reg <= Op_RegL, "only return normal values" );
#ifndef __ABI_HARD__
static int lo[Op_RegL+1] = { 0, 0, OptoReg::Bad, R_R0_num, R_R0_num, R_R0_num, R_R0_num, R_R0_num };
static int hi[Op_RegL+1] = { 0, 0, OptoReg::Bad, OptoReg::Bad, OptoReg::Bad, OptoReg::Bad, R_R1_num, R_R1_num };
#else
static int lo[Op_RegL+1] = { 0, 0, OptoReg::Bad, R_R0_num, R_R0_num, R_hf_ret_lo_num, R_hf_ret_lo_num, R_R0_num };
static int hi[Op_RegL+1] = { 0, 0, OptoReg::Bad, OptoReg::Bad, OptoReg::Bad, OptoReg::Bad, R_hf_ret_hi_num, R_R1_num };
#endif
return OptoRegPair( hi[ideal_reg], lo[ideal_reg]);
}
// !!!!! Special hack to get all type of calls to specify the byte offset
// from the start of the call to the point where the return address
// will point.
int MachCallStaticJavaNode::ret_addr_offset() {
bool far = (_method == NULL) ? maybe_far_call(this) : !cache_reachable();
return ((far ? 3 : 1) + (_method_handle_invoke ? 1 : 0)) *
NativeInstruction::instruction_size;
}
int MachCallDynamicJavaNode::ret_addr_offset() {
bool far = !cache_reachable();
// mov_oop is always 2 words
return (2 + (far ? 3 : 1)) * NativeInstruction::instruction_size;
}
int MachCallRuntimeNode::ret_addr_offset() {
// bl or movw; movt; blx
bool far = maybe_far_call(this);
return (far ? 3 : 1) * NativeInstruction::instruction_size;
}
%}
// The intptr_t operand types, defined by textual substitution.
// (Cf. opto/type.hpp. This lets us avoid many, many other ifdefs.)
#define immX immI
#define immXRot immIRot
#define iRegX iRegI
#define aimmX aimmI
#define limmX limmI
#define immX10x2 immI10x2
#define LShiftX LShiftI
#define shimmX immU5
// Compatibility interface
#define aimmP immPRot
#define immIMov immIRot
#define store_RegL iRegL
#define store_RegLd iRegLd
#define store_RegI iRegI
#define store_ptr_RegP iRegP
//----------ATTRIBUTES---------------------------------------------------------
//----------Operand Attributes-------------------------------------------------
op_attrib op_cost(1); // Required cost attribute
//----------OPERANDS-----------------------------------------------------------
// Operand definitions must precede instruction definitions for correct parsing
// in the ADLC because operands constitute user defined types which are used in
// instruction definitions.
//----------Simple Operands----------------------------------------------------
// Immediate Operands
operand immIRot() %{
predicate(AsmOperand::is_rotated_imm(n->get_int()));
match(ConI);
op_cost(0);
// formats are generated automatically for constants and base registers
format %{ %}
interface(CONST_INTER);
%}
operand immIRotn() %{
predicate(n->get_int() != 0 && AsmOperand::is_rotated_imm(~n->get_int()));
match(ConI);
op_cost(0);
// formats are generated automatically for constants and base registers
format %{ %}
interface(CONST_INTER);
%}
operand immIRotneg() %{
// if AsmOperand::is_rotated_imm() is true for this constant, it is
  // an immIRot and an optimal instruction combination exists to handle the
// constant as an immIRot
predicate(!AsmOperand::is_rotated_imm(n->get_int()) && AsmOperand::is_rotated_imm(-n->get_int()));
match(ConI);
op_cost(0);
// formats are generated automatically for constants and base registers
format %{ %}
interface(CONST_INTER);
%}
// Non-negative integer immediate that is encodable using the rotation scheme,
// and that when expanded fits in 31 bits.
operand immU31Rot() %{
predicate((0 <= n->get_int()) && AsmOperand::is_rotated_imm(n->get_int()));
match(ConI);
op_cost(0);
// formats are generated automatically for constants and base registers
format %{ %}
interface(CONST_INTER);
%}
operand immPRot() %{
predicate(n->get_ptr() == 0 || (AsmOperand::is_rotated_imm(n->get_ptr()) && ((ConPNode*)n)->type()->reloc() == relocInfo::none));
match(ConP);
op_cost(0);
// formats are generated automatically for constants and base registers
format %{ %}
interface(CONST_INTER);
%}
operand immLlowRot() %{
predicate(n->get_long() >> 32 == 0 && AsmOperand::is_rotated_imm((int)n->get_long()));
match(ConL);
op_cost(0);
format %{ %}
interface(CONST_INTER);
%}
operand immLRot2() %{
predicate(AsmOperand::is_rotated_imm((int)(n->get_long() >> 32)) &&
AsmOperand::is_rotated_imm((int)(n->get_long())));
match(ConL);
op_cost(0);
format %{ %}
interface(CONST_INTER);
%}
// Integer Immediate: 12-bit - for addressing mode
operand immI12() %{
predicate((-4096 < n->get_int()) && (n->get_int() < 4096));
match(ConI);
op_cost(0);
format %{ %}
interface(CONST_INTER);
%}
// Integer Immediate: 10-bit disp and disp+4 - for addressing float pair
operand immI10x2() %{
predicate((-1024 < n->get_int()) && (n->get_int() < 1024 - 4));
match(ConI);
op_cost(0);
format %{ %}
interface(CONST_INTER);
%}
// Integer Immediate: 12-bit disp and disp+4 - for addressing word pair
operand immI12x2() %{
predicate((-4096 < n->get_int()) && (n->get_int() < 4096 - 4));
match(ConI);
op_cost(0);
format %{ %}
interface(CONST_INTER);
%}
/*
* Copyright (c) 2011-Present VMware, Inc. or its affiliates, All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package reactor.netty.transport;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Metrics;
import io.micrometer.core.instrument.Timer;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import reactor.core.publisher.Mono;
import reactor.netty.DisposableServer;
import reactor.netty.http.client.HttpClient;
import reactor.netty.http.server.HttpServer;
import java.time.Duration;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import static org.assertj.core.api.Assertions.assertThat;
import static reactor.netty.Metrics.REMOTE_ADDRESS;
import static reactor.netty.Metrics.STATUS;
import static reactor.netty.Metrics.SUCCESS;
/**
* @author Violeta Georgieva
*/
public class AddressResolverGroupMetricsTest {
private MeterRegistry registry;
@Before
public void setUp() {
registry = new SimpleMeterRegistry();
Metrics.addRegistry(registry);
}
@After
public void tearDown() {
Metrics.removeRegistry(registry);
registry.clear();
registry.close();
}
@Test
public void test() throws Exception {
DisposableServer server =
HttpServer.create()
.host("localhost")
.port(0)
.handle((req, res) -> res.header("Connection", "close")
.sendString(Mono.just("test")))
.bindNow();
CountDownLatch latch = new CountDownLatch(1);
HttpClient.create()
.doOnResponse((res, conn) ->
conn.channel()
.closeFuture()
.addListener(f -> latch.countDown()))
.metrics(true, s -> s)
.get()
.uri("http://localhost:" + server.port())
.responseContent()
.aggregate()
.asString()
.block(Duration.ofSeconds(30));
assertThat(latch.await(30, TimeUnit.SECONDS)).isTrue();
assertThat(getTimerValue("localhost:" + server.port())).isGreaterThan(0);
server.disposeNow();
}
private double getTimerValue(String address) {
Timer timer = registry.find("reactor.netty.http.client.address.resolver")
.tags(REMOTE_ADDRESS, address, STATUS, SUCCESS).timer();
double result = -1;
if (timer != null) {
result = timer.totalTime(TimeUnit.NANOSECONDS);
}
return result;
}
}
// Microsoft Visual C++ generated resource script.
//
#include "resource.h"
#define APSTUDIO_READONLY_SYMBOLS
/////////////////////////////////////////////////////////////////////////////
//
// Generated from the TEXTINCLUDE 2 resource.
//
#include "winres.h"
/////////////////////////////////////////////////////////////////////////////
#undef APSTUDIO_READONLY_SYMBOLS
/////////////////////////////////////////////////////////////////////////////
// Neutral resources
#if !defined(AFX_RESOURCE_DLL) || defined(AFX_TARG_NEU)
LANGUAGE LANG_NEUTRAL, SUBLANG_NEUTRAL
#pragma code_page(1252)
#ifdef APSTUDIO_INVOKED
/////////////////////////////////////////////////////////////////////////////
//
// TEXTINCLUDE
//
1 TEXTINCLUDE
BEGIN
"resource.h\0"
END
2 TEXTINCLUDE
BEGIN
"#include ""winres.h""\r\n"
"\0"
END
3 TEXTINCLUDE
BEGIN
"\r\n"
"\0"
END
#endif // APSTUDIO_INVOKED
/////////////////////////////////////////////////////////////////////////////
//
// Dialog
//
IDD_ABOUTDLG DIALOGEX 0, 0, 270, 122
STYLE DS_ABSALIGN | DS_SETFONT | DS_MODALFRAME | DS_3DLOOK | DS_FIXEDSYS | DS_CENTER | WS_POPUP | WS_VISIBLE | WS_CAPTION | WS_SYSMENU
EXSTYLE WS_EX_WINDOWEDGE
CAPTION "About JSON Viewer"
FONT 8, "MS Shell Dlg", 400, 0, 0x1
BEGIN
DEFPUSHBUTTON "OK",IDOK,110,100,50,14
GROUPBOX "JSON Viewer",IDC_STATIC,11,7,245,90,BS_CENTER
LTEXT "Author:",IDC_STATIC,23,21,24,8
LTEXT "Kapil Ratnani",IDC_STATIC,100,21,43,8
LTEXT "Version:",IDC_STATIC,23,35,26,8
LTEXT "x.x.x.x",IDC_VERSION,100,35,41,9
LTEXT "Licence:",IDC_STATIC,23,48,28,8
LTEXT "GPL",IDC_STATIC,100,48,15,8
LTEXT "Special thanks to:",IDC_STATIC,23,63,58,8
LTEXT "Don Ho for Notepad++",IDC_STATIC,100,63,152,8
LTEXT "Website:",IDC_STATIC,23,78,29,8
LTEXT "https://github.com/kapilratnani/JSON-Viewer",IDC_WEB,100,78,154,8,SS_NOTIFY
END
IDD_TREEDLG DIALOGEX 0, 0, 208, 322
STYLE DS_SETFONT | DS_3DLOOK | DS_FIXEDSYS | DS_CENTER | WS_CHILD | WS_VISIBLE | WS_CAPTION | WS_VSCROLL | WS_HSCROLL
EXSTYLE WS_EX_TOPMOST | WS_EX_TOOLWINDOW
CAPTION "JSON Viewer"
FONT 8, "MS Shell Dlg", 400, 0, 0x1
BEGIN
CONTROL "",IDC_TREE,"SysTreeView32",TVS_HASBUTTONS | TVS_HASLINES | TVS_DISABLEDRAGDROP | TVS_SHOWSELALWAYS | TVS_FULLROWSELECT | TVS_SINGLEEXPAND | WS_TABSTOP,1,0,208,322
END
/////////////////////////////////////////////////////////////////////////////
//
// DESIGNINFO
//
#ifdef APSTUDIO_INVOKED
GUIDELINES DESIGNINFO
BEGIN
IDD_ABOUTDLG, DIALOG
BEGIN
LEFTMARGIN, 4
RIGHTMARGIN, 261
TOPMARGIN, 7
BOTTOMMARGIN, 114
END
IDD_TREEDLG, DIALOG
BEGIN
LEFTMARGIN, 7
RIGHTMARGIN, 195
TOPMARGIN, 7
BOTTOMMARGIN, 308
END
END
#endif // APSTUDIO_INVOKED
/////////////////////////////////////////////////////////////////////////////
//
// Version
//
VS_VERSION_INFO VERSIONINFO
FILEVERSION 1,40,0,0
PRODUCTVERSION 1,40,0,0
FILEFLAGSMASK 0x3fL
#ifdef _DEBUG
FILEFLAGS 0x1L
#else
FILEFLAGS 0x0L
#endif
FILEOS 0x40004L
FILETYPE 0x2L
FILESUBTYPE 0x0L
BEGIN
BLOCK "StringFileInfo"
BEGIN
BLOCK "040901b5"
BEGIN
VALUE "Comments", "JSONViewer plugin for Notepad++"
VALUE "CompanyName", "Kapil Ratnani"
VALUE "FileDescription", "Notepad++ plugin"
VALUE "FileVersion", "1.40.0.0"
VALUE "InternalName", "JSONViewer"
VALUE "OriginalFilename", "NPPJSONViewer.dll"
VALUE "ProductName", "JSONViewer plugin for Notepad++"
VALUE "ProductVersion", "1.40.0.0"
VALUE "SpecialBuild", "UNICODE"
END
END
BLOCK "VarFileInfo"
BEGIN
VALUE "Translation", 0x409, 437
END
END
/////////////////////////////////////////////////////////////////////////////
//
// AFX_DIALOG_LAYOUT
//
IDD_ABOUTDLG AFX_DIALOG_LAYOUT
BEGIN
0
END
IDD_TREEDLG AFX_DIALOG_LAYOUT
BEGIN
0
END
#endif // Neutral resources
/////////////////////////////////////////////////////////////////////////////
#ifndef APSTUDIO_INVOKED
/////////////////////////////////////////////////////////////////////////////
//
// Generated from the TEXTINCLUDE 3 resource.
//
/////////////////////////////////////////////////////////////////////////////
#endif // not APSTUDIO_INVOKED
//
// UIView+MRView.m
// test
//
//  Created by ๅๅฅๅพต on 2017/4/15.
//  Copyright © 2017 Mix_Reality. All rights reserved.
//
#import "UIView+MRView.h"
#import <objc/runtime.h>
static char kDTActionHandlerTapGestureKey;
static char kDTActionHandlerTapBlockKey;
@implementation UIView (MRView)
/**
 Add a tap gesture recognizer handled by a block
 */
- (void)tapActionWithBlock:(void (^)(void))block{
if(!block){
return;
}
self.userInteractionEnabled = YES;
UITapGestureRecognizer *gesture = objc_getAssociatedObject(self, &kDTActionHandlerTapGestureKey);
if (!gesture){
gesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(__handleActionForTapGesture:)];
[self addGestureRecognizer:gesture];
objc_setAssociatedObject(self, &kDTActionHandlerTapGestureKey, gesture, OBJC_ASSOCIATION_RETAIN);
}
objc_setAssociatedObject(self, &kDTActionHandlerTapBlockKey, block, OBJC_ASSOCIATION_COPY);
}
// This code checks for the tap gesture recognizer's associated object. If there
// is none, it creates one and sets up the association; at the same time it
// attaches the passed-in block under the designated key. Note the
// memory-management policy used for the `block` association (OBJC_ASSOCIATION_COPY).
// The gesture recognizer needs a `target` and an `action`, so next we define
// the handler method:
- (void)__handleActionForTapGesture:(UITapGestureRecognizer *)gesture
{
if (gesture.state == UIGestureRecognizerStateRecognized)
{
void(^action)(void) = objc_getAssociatedObject(self, &kDTActionHandlerTapBlockKey);
if (action)
{
action();
}
}
}
- (CGFloat)left {
return self.frame.origin.x;
}
- (void)setLeft:(CGFloat)x {
CGRect frame = self.frame;
frame.origin.x = x;
self.frame = frame;
}
- (CGFloat)top {
return self.frame.origin.y;
}
- (void)setTop:(CGFloat)y {
CGRect frame = self.frame;
frame.origin.y = y;
self.frame = frame;
}
- (CGFloat)right {
return self.frame.origin.x + self.frame.size.width;
}
- (void)setRight:(CGFloat)right {
CGRect frame = self.frame;
frame.origin.x = right - frame.size.width;
self.frame = frame;
}
- (CGFloat)bottom {
return self.frame.origin.y + self.frame.size.height;
}
- (void)setBottom:(CGFloat)bottom {
CGRect frame = self.frame;
frame.origin.y = bottom - frame.size.height;
self.frame = frame;
}
- (CGFloat)width {
return self.frame.size.width;
}
- (void)setWidth:(CGFloat)width {
CGRect frame = self.frame;
frame.size.width = width;
self.frame = frame;
}
- (CGFloat)height {
return self.frame.size.height;
}
- (void)setHeight:(CGFloat)height {
CGRect frame = self.frame;
frame.size.height = height;
self.frame = frame;
}
- (CGFloat)centerX {
return self.center.x;
}
- (void)setCenterX:(CGFloat)centerX {
self.center = CGPointMake(centerX, self.center.y);
}
- (CGFloat)centerY {
return self.center.y;
}
- (void)setCenterY:(CGFloat)centerY {
self.center = CGPointMake(self.center.x, centerY);
}
- (CGPoint)origin {
return self.frame.origin;
}
- (void)setOrigin:(CGPoint)origin {
CGRect frame = self.frame;
frame.origin = origin;
self.frame = frame;
}
- (CGSize)size {
return self.frame.size;
}
- (void)setSize:(CGSize)size {
CGRect frame = self.frame;
frame.size = size;
self.frame = frame;
}
@end
# Copyright (c) 2019-2020 hippo91 <[email protected]>
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/PyCQA/astroid/blob/master/COPYING.LESSER
# TODO(hippo91) : correct the methods signature.
"""Astroid hooks for numpy.core.numerictypes module."""
import astroid
def numpy_core_numerictypes_transform():
# TODO: Uniformize the generic API with the ndarray one.
# According to numpy doc the generic object should expose
# the same API than ndarray. This has been done here partially
# through the astype method.
return astroid.parse(
"""
# different types defined in numerictypes.py
class generic(object):
def __init__(self, value):
self.T = np.ndarray([0, 0])
self.base = None
self.data = None
self.dtype = None
self.flags = None
# Should be a numpy.flatiter instance but not available for now
        # Putting an array instead so that iteration and indexing are authorized
self.flat = np.ndarray([0, 0])
self.imag = None
self.itemsize = None
self.nbytes = None
self.ndim = None
self.real = None
self.size = None
self.strides = None
def all(self): return uninferable
def any(self): return uninferable
def argmax(self): return uninferable
def argmin(self): return uninferable
def argsort(self): return uninferable
def astype(self, dtype, order='K', casting='unsafe', subok=True, copy=True): return np.ndarray([0, 0])
def base(self): return uninferable
def byteswap(self): return uninferable
def choose(self): return uninferable
def clip(self): return uninferable
def compress(self): return uninferable
def conj(self): return uninferable
def conjugate(self): return uninferable
def copy(self): return uninferable
def cumprod(self): return uninferable
def cumsum(self): return uninferable
def data(self): return uninferable
def diagonal(self): return uninferable
def dtype(self): return uninferable
def dump(self): return uninferable
def dumps(self): return uninferable
def fill(self): return uninferable
def flags(self): return uninferable
def flat(self): return uninferable
def flatten(self): return uninferable
def getfield(self): return uninferable
def imag(self): return uninferable
def item(self): return uninferable
def itemset(self): return uninferable
def itemsize(self): return uninferable
def max(self): return uninferable
def mean(self): return uninferable
def min(self): return uninferable
def nbytes(self): return uninferable
def ndim(self): return uninferable
def newbyteorder(self): return uninferable
def nonzero(self): return uninferable
def prod(self): return uninferable
def ptp(self): return uninferable
def put(self): return uninferable
def ravel(self): return uninferable
def real(self): return uninferable
def repeat(self): return uninferable
def reshape(self): return uninferable
def resize(self): return uninferable
def round(self): return uninferable
def searchsorted(self): return uninferable
def setfield(self): return uninferable
def setflags(self): return uninferable
def shape(self): return uninferable
def size(self): return uninferable
def sort(self): return uninferable
def squeeze(self): return uninferable
def std(self): return uninferable
def strides(self): return uninferable
def sum(self): return uninferable
def swapaxes(self): return uninferable
def take(self): return uninferable
def tobytes(self): return uninferable
def tofile(self): return uninferable
def tolist(self): return uninferable
def tostring(self): return uninferable
def trace(self): return uninferable
def transpose(self): return uninferable
def var(self): return uninferable
def view(self): return uninferable
class dtype(object):
def __init__(self, obj, align=False, copy=False):
self.alignment = None
self.base = None
self.byteorder = None
self.char = None
self.descr = None
self.fields = None
self.flags = None
self.hasobject = None
self.isalignedstruct = None
self.isbuiltin = None
self.isnative = None
self.itemsize = None
self.kind = None
self.metadata = None
self.name = None
self.names = None
self.num = None
self.shape = None
self.str = None
self.subdtype = None
self.type = None
def newbyteorder(self, new_order='S'): return uninferable
def __neg__(self): return uninferable
class busdaycalendar(object):
def __init__(self, weekmask='1111100', holidays=None):
self.holidays = None
self.weekmask = None
class flexible(generic): pass
class bool_(generic): pass
class number(generic):
def __neg__(self): return uninferable
class datetime64(generic):
def __init__(self, nb, unit=None): pass
class void(flexible):
def __init__(self, *args, **kwargs):
self.base = None
self.dtype = None
self.flags = None
def getfield(self): return uninferable
def setfield(self): return uninferable
class character(flexible): pass
class integer(number):
def __init__(self, value):
self.denominator = None
self.numerator = None
class inexact(number): pass
class str_(str, character):
def maketrans(self, x, y=None, z=None): return uninferable
class bytes_(bytes, character):
def fromhex(self, string): return uninferable
def maketrans(self, frm, to): return uninferable
class signedinteger(integer): pass
class unsignedinteger(integer): pass
class complexfloating(inexact): pass
class floating(inexact): pass
class float64(floating, float):
def fromhex(self, string): return uninferable
class uint64(unsignedinteger): pass
class complex64(complexfloating): pass
class int16(signedinteger): pass
class float96(floating): pass
class int8(signedinteger): pass
class uint32(unsignedinteger): pass
class uint8(unsignedinteger): pass
class _typedict(dict): pass
class complex192(complexfloating): pass
class timedelta64(signedinteger):
def __init__(self, nb, unit=None): pass
class int32(signedinteger): pass
class uint16(unsignedinteger): pass
class float32(floating): pass
class complex128(complexfloating, complex): pass
class float16(floating): pass
class int64(signedinteger): pass
buffer_type = memoryview
bool8 = bool_
byte = int8
bytes0 = bytes_
cdouble = complex128
cfloat = complex128
clongdouble = complex192
clongfloat = complex192
complex_ = complex128
csingle = complex64
double = float64
float_ = float64
half = float16
int0 = int32
int_ = int32
intc = int32
intp = int32
long = int32
longcomplex = complex192
longdouble = float96
longfloat = float96
longlong = int64
object0 = object_
object_ = object_
short = int16
single = float32
singlecomplex = complex64
str0 = str_
string_ = bytes_
ubyte = uint8
uint = uint32
uint0 = uint32
uintc = uint32
uintp = uint32
ulonglong = uint64
unicode = str_
unicode_ = str_
ushort = uint16
void0 = void
"""
)
astroid.register_module_extender(
astroid.MANAGER, "numpy.core.numerictypes", numpy_core_numerictypes_transform
)
// XzHandler.h
#ifndef __XZ_HANDLER_H
#define __XZ_HANDLER_H
#include "../../../C/Xz.h"
#include "../ICoder.h"
namespace NArchive {
namespace NXz {
struct CXzUnpackerCPP
{
Byte *InBuf;
Byte *OutBuf;
CXzUnpacker p;
CXzUnpackerCPP();
~CXzUnpackerCPP();
};
struct CStatInfo
{
UInt64 InSize;
UInt64 OutSize;
UInt64 PhySize;
UInt64 NumStreams;
UInt64 NumBlocks;
bool UnpackSize_Defined;
bool NumStreams_Defined;
bool NumBlocks_Defined;
bool IsArc;
bool UnexpectedEnd;
bool DataAfterEnd;
bool Unsupported;
bool HeadersError;
bool DataError;
bool CrcError;
CStatInfo() { Clear(); }
void Clear();
};
struct CDecoder: public CStatInfo
{
CXzUnpackerCPP xzu;
SRes DecodeRes; // it's not HRESULT
CDecoder(): DecodeRes(SZ_OK) {}
/* Decode() can return ERROR code only if there is progress or stream error.
Decode() returns S_OK in case of xz decoding error, but DecodeRes and CStatInfo contain error information */
HRESULT Decode(ISequentialInStream *seqInStream, ISequentialOutStream *outStream, ICompressProgressInfo *compressProgress);
Int32 Get_Extract_OperationResult() const;
};
}}
#endif
// Temporary fixes for recharts v2.x.x-beta, until the typings are updated
declare module 'recharts' {
export * from '@types/recharts'
import { Point } from '@types/recharts'
// How do you call payload inside a payload?
export interface LegendPayloadPayload {
activeDot?: boolean
animateNewValues?: boolean
animationBegin?: number
animationDuration?: number
animationEasing?: string
connectNulls?: false
dataKey?: string
dot?: false
fill?: string
hide?: false
isAnimationActive?: false
legendType?: string
name?: string
points?: Point[]
length?: number
stroke?: string
strokeWidth?: number
type?: string
xAxisId?: number
yAxisId?: number
}
export interface LegendPayload {
color?: string
dataKey?: string
inactive?: boolean
payload?: LegendPayloadPayload
type: string
value: string
}
}
cairo
gobject-2.0
load("@io_bazel_rules_go//go:def.bzl", "go_library", "go_test")
go_library(
name = "go_default_library",
srcs = ["hash.go"],
importmap = "k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/hash",
importpath = "k8s.io/kubectl/pkg/util/hash",
visibility = ["//visibility:public"],
deps = ["//staging/src/k8s.io/api/core/v1:go_default_library"],
)
go_test(
name = "go_default_test",
srcs = ["hash_test.go"],
embed = [":go_default_library"],
deps = ["//staging/src/k8s.io/api/core/v1:go_default_library"],
)
filegroup(
name = "package-srcs",
srcs = glob(["**"]),
tags = ["automanaged"],
visibility = ["//visibility:private"],
)
filegroup(
name = "all-srcs",
srcs = [":package-srcs"],
tags = ["automanaged"],
visibility = ["//visibility:public"],
)
/** @file
Copyright (c) 2018, Linaro, Ltd. All rights reserved.
SPDX-License-Identifier: BSD-2-Clause-Patent
**/
#include <Uefi.h>
#include <Library/BaseLib.h>
#include <Library/BaseMemoryLib.h>
#include <Library/DebugLib.h>
#include <Library/DevicePathLib.h>
#include <Library/HiiLib.h>
#include <Library/UefiBootServicesTableLib.h>
#include <Library/UefiLib.h>
#include <Library/UefiRuntimeServicesTableLib.h>
#include <Protocol/AcpiTable.h>
#include <Protocol/LsConnector.h>
#include <Protocol/Mezzanine.h>
#include "LsConnectorDxe.h"
extern UINT8 LsConnectorHiiBin[];
extern UINT8 LsConnectorDxeStrings[];
typedef struct {
VENDOR_DEVICE_PATH VendorDevicePath;
EFI_DEVICE_PATH_PROTOCOL End;
} HII_VENDOR_DEVICE_PATH;
STATIC HII_VENDOR_DEVICE_PATH m96BoardsDxeVendorDevicePath = {
{
{
HARDWARE_DEVICE_PATH,
HW_VENDOR_DP,
{
(UINT8) (sizeof (VENDOR_DEVICE_PATH)),
(UINT8) ((sizeof (VENDOR_DEVICE_PATH)) >> 8)
}
},
NINETY_SIX_BOARDS_FORMSET_GUID
},
{
END_DEVICE_PATH_TYPE,
END_ENTIRE_DEVICE_PATH_SUBTYPE,
{
(UINT8) (END_DEVICE_PATH_LENGTH),
(UINT8) ((END_DEVICE_PATH_LENGTH) >> 8)
}
}
};
STATIC LS_CONNECTOR_PROTOCOL mLsConnector;
STATIC EFI_EVENT EndOfDxeEvent;
STATIC
EFI_STATUS
InstallHiiPages (
VOID
)
{
EFI_STATUS Status;
EFI_HII_HANDLE HiiHandle;
EFI_HANDLE DriverHandle;
DriverHandle = NULL;
Status = gBS->InstallMultipleProtocolInterfaces (&DriverHandle,
&gEfiDevicePathProtocolGuid,
&m96BoardsDxeVendorDevicePath,
NULL);
if (EFI_ERROR (Status)) {
return Status;
}
HiiHandle = HiiAddPackages (&g96BoardsFormsetGuid,
DriverHandle,
LsConnectorDxeStrings,
LsConnectorHiiBin,
NULL);
if (HiiHandle == NULL) {
gBS->UninstallMultipleProtocolInterfaces (DriverHandle,
&gEfiDevicePathProtocolGuid,
&m96BoardsDxeVendorDevicePath,
NULL);
return EFI_OUT_OF_RESOURCES;
}
return EFI_SUCCESS;
}
STATIC
VOID
EFIAPI
PublishOsDescription (
EFI_EVENT Event,
VOID *Context
)
{
VOID *Dtb;
MEZZANINE_PROTOCOL *Mezzanine;
EFI_STATUS Status;
EFI_ACPI_TABLE_PROTOCOL *AcpiProtocol;
Status = gBS->LocateProtocol (&g96BoardsMezzanineProtocolGuid, NULL,
(VOID **)&Mezzanine);
if (EFI_ERROR (Status)) {
DEBUG ((DEBUG_INFO, "%a: no mezzanine driver active\n", __FUNCTION__));
return;
}
Status = gBS->LocateProtocol (&gEfiAcpiTableProtocolGuid, NULL,
(VOID **)&AcpiProtocol);
if (!EFI_ERROR (Status)) {
Status = Mezzanine->InstallSsdtTable (Mezzanine, AcpiProtocol);
if (EFI_ERROR (Status)) {
DEBUG ((DEBUG_WARN, "%a: failed to install SSDT table - %r\n",
__FUNCTION__, Status));
}
return;
}
//
// Find the DTB in the configuration table array. If it isn't there, just
// bail without an error: the system may be able to proceed even without
// ACPI or DT description, so it isn't up to us to complain about this.
//
Status = EfiGetSystemConfigurationTable (&gFdtTableGuid, &Dtb);
if (Status == EFI_NOT_FOUND) {
return;
}
ASSERT_EFI_ERROR (Status);
Status = Mezzanine->ApplyDeviceTreeOverlay (Mezzanine, Dtb);
if (EFI_ERROR (Status)) {
DEBUG ((DEBUG_WARN, "%a: failed to apply DT overlay - %r\n", __FUNCTION__,
Status));
}
}
/**
The entry point for 96BoardsDxe driver.
@param[in] ImageHandle The image handle of the driver.
@param[in] SystemTable The system table.
@retval EFI_ALREADY_STARTED The driver already exists in system.
@retval EFI_OUT_OF_RESOURCES Fail to execute entry point due to lack of
resources.
@retval EFI_SUCCES All the related protocols are installed on
the driver.
**/
EFI_STATUS
EFIAPI
EntryPoint (
IN EFI_HANDLE ImageHandle,
IN EFI_SYSTEM_TABLE *SystemTable
)
{
EFI_STATUS Status;
NINETY_SIX_BOARDS_CONFIG_DATA ConfigData;
UINTN BufferSize;
//
// Get the current config settings from the EFI variable.
//
BufferSize = sizeof (ConfigData);
Status = gRT->GetVariable (NINETY_SIX_BOARDS_CONFIG_VARIABLE_NAME,
&g96BoardsFormsetGuid, NULL, &BufferSize, &ConfigData);
if (EFI_ERROR (Status)) {
DEBUG ((DEBUG_INFO, "%a: no config data found\n", __FUNCTION__));
ConfigData.MezzanineType = MEZZANINE_NONE;
}
if (!EFI_ERROR (Status) &&
ConfigData.MezzanineType >= MEZZANINE_MAX) {
DEBUG ((DEBUG_WARN,
"%a: invalid value for %s, defaulting to MEZZANINE_NONE\n",
__FUNCTION__, NINETY_SIX_BOARDS_CONFIG_VARIABLE_NAME));
ConfigData.MezzanineType = MEZZANINE_NONE;
Status = EFI_INVALID_PARAMETER; // trigger setvar below
}
//
// Write the newly selected value back to the variable store.
//
if (EFI_ERROR (Status)) {
ZeroMem (&ConfigData.Reserved, sizeof (ConfigData.Reserved));
Status = gRT->SetVariable (NINETY_SIX_BOARDS_CONFIG_VARIABLE_NAME,
&g96BoardsFormsetGuid,
EFI_VARIABLE_NON_VOLATILE | EFI_VARIABLE_BOOTSERVICE_ACCESS,
sizeof (ConfigData), &ConfigData);
if (EFI_ERROR (Status)) {
DEBUG ((DEBUG_ERROR, "%a: gRT->SetVariable () failed - %r\n",
__FUNCTION__, Status));
return Status;
}
}
switch (ConfigData.MezzanineType) {
case MEZZANINE_SECURE96:
mLsConnector.MezzanineType = MezzanineSecure96;
break;
default:
mLsConnector.MezzanineType = MezzanineUnknown;
}
Status = gBS->InstallProtocolInterface (&ImageHandle,
&g96BoardsLsConnectorProtocolGuid,
EFI_NATIVE_INTERFACE,
&mLsConnector);
if (EFI_ERROR (Status)) {
return Status;
}
Status = gBS->CreateEventEx (
EVT_NOTIFY_SIGNAL,
TPL_NOTIFY,
PublishOsDescription,
NULL,
&gEfiEndOfDxeEventGroupGuid,
&EndOfDxeEvent);
ASSERT_EFI_ERROR (Status);
return InstallHiiPages ();
}
fileFormatVersion: 2
guid: 48d4276c5eddf1a4494888f1b72180bc
NativeFormatImporter:
externalObjects: {}
mainObjectFileID: 8926484042661614526
userData:
assetBundleName:
assetBundleVariant:
no-superimposed-version-on-logo.patch
01-optional-icons-in-dialogs.patch
02-request-field-dialogs.patch
03-blist-dialogs.patch
04-saved-status-dialogs.patch
05-statusbox-icon-size.patch
06-account-dialogs.patch
07-roomlist-dialog.patch
define-pda-mode.patch
desktop-name-2.0.0.patch
docklet-icon-size.patch
08-prefs-dialog.patch
09-filetransfer-dialog.patch
10-pda-default-settings.patch
---
Description: An enhanced metafile is an array of records.
ms.assetid: af3261c7-2113-4777-97c0-504f23022550
title: Enhanced Metafile Records
ms.topic: article
ms.date: 05/31/2018
---
# Enhanced Metafile Records
An enhanced metafile is an array of records. A metafile record is a variable-length [**ENHMETARECORD**](/windows/win32/api/wingdi/ns-wingdi-enhmetarecord) structure. At the beginning of every enhanced metafile record is an [**EMR**](/windows/win32/api/wingdi/ns-wingdi-emr) structure, which contains two members. The first member, iType, identifies the record type, that is, the GDI function whose parameters are contained in the record. Because the structures are variable in length, the other member, nSize, contains the size of the record. Immediately following the nSize member are the remaining parameters, if any, of the GDI function. The remainder of the structure contains additional data that is dependent on the record type.
The first record in an enhanced metafile is always the [**ENHMETAHEADER**](/windows/win32/api/wingdi/ns-wingdi-enhmetaheader) structure, which is the enhanced-metafile header. The header specifies the following information:
- Size of the metafile, in bytes
- Dimensions of the picture frame, in device units
- Dimensions of the picture frame, in .01-millimeter units
- Number of records in the metafile
- Offset to an optional text description
- Size of the optional palette
- Resolution of the original device, in pixels
- Resolution of the original device, in millimeters
An optional text description can follow the header record. The text description describes the picture and the author's name. The optional palette specifies the colors used to create the enhanced metafile. The remaining records identify the GDI functions used to create the picture. The following hexadecimal output corresponds to a record generated for a call to the [**SetMapMode**](/windows/desktop/api/Wingdi/nf-wingdi-setmapmode) function.
```C++
00000011 0000000C 00000004
```
The value 0x00000011 specifies the record type (corresponds to the EMR\_SETMAPMODE constant defined in the file Wingdi.h). The value 0x0000000C specifies the length of the record, in bytes. The value 0x00000004 identifies the mapping mode (corresponds to the MM\_LOENGLISH constant defined in the [**SetMapMode**](/windows/desktop/api/Wingdi/nf-wingdi-setmapmode) function).
For a list of additional record types, see [Metafile Structures](metafile-structures.md).
; haribote-os boot asm
; TAB=4

BOTPAK	EQU		0x00280000		; where bootpack gets loaded
DSKCAC	EQU		0x00100000		; disk cache location
DSKCAC0	EQU		0x00008000		; disk cache location (real mode)

; BOOT_INFO related
CYLS	EQU		0x0ff0			; set by the boot sector
LEDS	EQU		0x0ff1
VMODE	EQU		0x0ff2			; color depth information
SCRNX	EQU		0x0ff4			; screen resolution X
SCRNY	EQU		0x0ff6			; screen resolution Y
VRAM	EQU		0x0ff8			; start address of the graphics buffer

		ORG		0xc200			; memory address where this program will be loaded

; set the video mode

		MOV		AL,0x13			; VGA graphics, 320x200x8bit
		MOV		AH,0x00
		INT		0x10
		MOV		BYTE [VMODE],8	; record the screen mode (referenced from C)
		MOV		WORD [SCRNX],320
		MOV		WORD [SCRNY],200
		MOV		DWORD [VRAM],0x000a0000

; get the keyboard LED state via the BIOS

		MOV		AH,0x02
		INT		0x16 			; keyboard BIOS
		MOV		[LEDS],AL

; disable all interrupts at the PIC
; according to the AT-compatible spec, if the PIC is to be initialized,
; it must be done before CLI, otherwise the machine sometimes hangs;
; the PIC itself is initialized later

		MOV		AL,0xff
		OUT		0x21,AL
		NOP						; some machines misbehave when OUT instructions run back to back (give the CPU one clock of rest)
		OUT		0xa1,AL

		CLI						; disable CPU-level interrupts

; enable the CPU to access memory above 1MB: set the A20 gate

		CALL	waitkbdout
		MOV		AL,0xd1
		OUT		0x64,AL
		CALL	waitkbdout
		MOV		AL,0xdf			; enable A20
		OUT		0x60,AL
		CALL	waitkbdout

; switch to protected mode

[INSTRSET "i486p"]				; declare that 486 instructions are used

		LGDT	[GDTR0]			; set the provisional GDT
		MOV		EAX,CR0
		AND		EAX,0x7fffffff	; clear bit 31 (to disable paging)
		OR		EAX,0x00000001	; set bit 0 (transition to protected mode)
		MOV		CR0,EAX
		JMP		pipelineflush
pipelineflush:
		MOV		AX,1*8			; readable/writable segment, 32bit
		MOV		DS,AX
		MOV		ES,AX
		MOV		FS,AX
		MOV		GS,AX
		MOV		SS,AX

; copy bootpack

		MOV		ESI,bootpack	; source
		MOV		EDI,BOTPAK		; destination
		MOV		ECX,512*1024/4
		CALL	memcpy

; copy the disk data back to its original position

; first, the boot sector

		MOV		ESI,0x7c00		; source
		MOV		EDI,DSKCAC		; destination
		MOV		ECX,512/4
		CALL	memcpy

; everything remaining

		MOV		ESI,DSKCAC0+512	; source
		MOV		EDI,DSKCAC+512	; destination
		MOV		ECX,0
		MOV		CL,BYTE [CYLS]
		IMUL	ECX,512*18*2/4	; convert the cylinder count to bytes/4 (transfers are counted in dwords)
		SUB		ECX,512/4		; subtract the IPL
		CALL	memcpy

; everything asmhead had to do is now done;
; from here on bootpack takes over

; start bootpack

		MOV		EBX,BOTPAK
		MOV		ECX,[EBX+16]
		ADD		ECX,3			; ECX += 3;
		SHR		ECX,2			; ECX /= 4;
		JZ		skip			; transfer finished (when there is nothing to transfer)
		MOV		ESI,[EBX+20]	; source
		ADD		ESI,EBX
		MOV		EDI,[EBX+12]	; destination
		CALL	memcpy
skip:
		MOV		ESP,[EBX+12]	; initialize the stack
		JMP		DWORD 2*8:0x0000001b

waitkbdout:
		IN		AL,0x64
		AND		AL,0x02
		JNZ		waitkbdout		; jump to waitkbdout while the AND result is nonzero
		RET

memcpy:
		MOV		EAX,[ESI]
		ADD		ESI,4
		MOV		[EDI],EAX
		ADD		EDI,4
		SUB		ECX,1
		JNZ		memcpy			; jump to memcpy while the subtraction result is nonzero
		RET
; memcpy can do without INSTRSET as long as the address-size prefix is not forgotten

		ALIGNB	16
GDT0:
		RESB	8				; NULL selector
		DW		0xffff,0x0000,0x9200,0x00cf	; readable/writable segment, 32bit
		DW		0xffff,0x0000,0x9a28,0x0047	; executable segment, 32bit (for bootpack)
		DW		0
GDTR0:
		DW		8*3-1
		DD		GDT0

		ALIGNB	16
bootpack:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>add-articles-to-reading-list</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืืกืคืช %#@v1@ ืืจืฉืืืช ืืงืจืืื</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืฃ ืืื</string>
<key>other</key>
<string>%1$d ืืคืื</string>
</dict>
</dict>
<key>article-deleted-accessibility-notification</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืขืจื ื ืืืง</string>
<key>other</key>
<string>ืืขืจืืื ื ืืืงื</string>
</dict>
</dict>
<key>diff-paragraph-moved-distance-line</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืฉืืจื ืืืช</string>
<key>other</key>
<string>%1$d ืฉืืจืืช</string>
</dict>
</dict>
<key>diff-paragraph-moved-distance-section</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืงืืข ืืื</string>
<key>other</key>
<string>%1$d ืืงืืขืื</string>
</dict>
</dict>
<key>diff-single-header-editor-number-edits-format</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจืืื ืืืช</string>
<key>other</key>
<string>%1$d ืขืจืืืืช</string>
</dict>
</dict>
<key>diff-single-header-subtitle-bytes-added</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืืช ืืื ื ืืกืฃ</string>
<key>other</key>
<string>%1$d ืืชืื ื ืืกืคื</string>
</dict>
</dict>
<key>diff-single-header-subtitle-bytes-removed</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืืช ืืื ืืืกืจ</string>
<key>other</key>
<string>%1$d ืืชืื ืืืกืจื</string>
</dict>
</dict>
<key>diff-unedited-lines-format</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืฉืืจื ืฉืื ื ืขืจืื</string>
<key>other</key>
<string>%1$d ืฉืืจืืช ืฉืื ื ืขืจืื</string>
</dict>
</dict>
<key>intermediate-edits-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืจืกืชึพืืื ืืื ืืืช ืื ืืืฆืืช</string>
<key>other</key>
<string>%1$d ืืจืกืืืชึพืืื ืืื ืื ืืืฆืืืช</string>
</dict>
</dict>
<key>intermediate-edits-editors-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@ ืืืช %#@v2@ %#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืื ื ืืืฆืืช</string>
<key>other</key>
<string>ืืื ื ืืืฆืืืช</string>
</dict>
<key>v2</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืฉืชืืฉ ืืื</string>
<key>other</key>
<string>%2$d ืืฉืชืืฉืื</string>
</dict>
</dict>
<key>intermediate-edits-editors-limited-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@ ืืืช %2$d+ %#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืื ื ืืืฆืืช</string>
<key>other</key>
<string>ืืื ื ืืืฆืืืช</string>
</dict>
</dict>
<key>intermediate-edits-limited-editors-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%1$d+ ืืจืกืืืชึพืืื ืืื ืืืช %#@v2@ ืืื ื ืืืฆืืืช</string>
<key>v2</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืฉืชืืฉ ืืื</string>
<key>other</key>
<string>%2$d ืืฉืชืืฉืื</string>
</dict>
</dict>
<key>move-articles-to-reading-list</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืขืืจืช %#@v1@ ืืจืฉืืืช ืืงืจืืื</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืฃ ืืื</string>
<key>other</key>
<string>%1$d ืืคืื</string>
</dict>
</dict>
<key>on-this-day-detail-header-title</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืืจืืข ืืืกืืืจื</string>
<key>other</key>
<string>%1$d ืืืจืืขืื ืืืกืืืจืืื</string>
</dict>
</dict>
<key>page-history-revision-size-diff-addition</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ื ืืก%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืฃ ืืืช ืืื</string>
<key>other</key>
<string>ืคื %1$d ืืชืื</string>
</dict>
</dict>
<key>page-history-revision-size-diff-subtraction</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืืกืจ%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string> ืืืช ืืื</string>
<key>other</key>
<string>ื %1$d ืืชืื</string>
</dict>
</dict>
<key>page-history-stats-text</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@ ืืื %2$@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจืืื ืืืช</string>
<key>other</key>
<string>%1$d ืขืจืืืืช</string>
</dict>
</dict>
<key>places-filter-top-articles-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจื ืืื</string>
<key>other</key>
<string>%1$d ืขืจืืื</string>
</dict>
</dict>
<key>reading-list-entry-limit-exceeded-title</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืจืืช ืืืืืืช %#@v1@ ืืืฉืืื.</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจื ืืื</string>
<key>other</key>
<string>%1$d ืขืจืืื</string>
</dict>
</dict>
<key>reading-list-entry-limit-reached</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืื ื ืืชื ืืืืกืืฃ %#@v1@ ืืจืฉืืื ืื. ืืืขืช ืืืืืื ืฉื %2$d ืขืจืืื ืืจืฉืืืช ืงืจืืื ืขืืืจ %3$@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจื</string>
<key>other</key>
<string>ืขืจืืื</string>
</dict>
</dict>
<key>reading-list-list-limit-exceeded-title</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืจืืช ืืืืืืช %#@v1@ ืืืฉืืื.</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืจืฉืืืช ืงืจืืื ืืืช</string>
<key>other</key>
<string>%1$d ืจืฉืืืืช ืงืจืืื</string>
</dict>
</dict>
<key>reading-lists-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืจืฉืืืช ืงืจืืื ืืืช</string>
<key>other</key>
<string>%1$d ืจืฉืืืืช ืงืจืืื</string>
</dict>
</dict>
<key>reading-lists-delete-reading-list-alert-message</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืื ื ืืชื ืืืื ืืช ืืคืขืืื ืืืืช. ืื ืืขืจืืื ืฉื ืฉืืจื %#@v1@ ืื ืืืฉืืจื.</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืจืฉืืื ืื ืืืื</string>
<key>other</key>
<string>ืืจืฉืืืืช ืืื ืืืื</string>
</dict>
</dict>
<key>reading-lists-delete-reading-list-alert-title</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืืืืง ืืช %#@v1@?</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืจืฉืืื</string>
<key>other</key>
<string>ืืจืฉืืืืช</string>
</dict>
</dict>
<key>reading-lists-large-sync-completed</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@ %#@v2@ ืกืื ืืจื ื ืืืืฉืืื ืฉืื</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืขืจื ืืื</string>
<key>other</key>
<string>%1$d ืขืจืืื</string>
</dict>
<key>v2</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืจืฉืืืช ืงืจืืื ืืืช</string>
<key>other</key>
<string>ืึพ%2$d ืจืฉืืืืช ืงืจืืื</string>
</dict>
</dict>
<key>relative-date-days-ago</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืชืืื</string>
<key>other</key>
<string>ืืคื ื %1$d ืืืื</string>
<key>zero</key>
<string>ืืืื</string>
</dict>
</dict>
<key>relative-date-hours-ago</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืคื ื ืฉืขื</string>
<key>other</key>
<string>ืืคื ื %1$d ืฉืขืืช</string>
<key>zero</key>
<string>ืืคื ื ืืื ืงืฆืจ</string>
</dict>
</dict>
<key>relative-date-minutes-ago</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืคื ื ืืงื</string>
<key>other</key>
<string>ืืคื ื %1$d ืืงืืช</string>
<key>zero</key>
<string>ืื ืขืชื</string>
</dict>
</dict>
<key>relative-date-months-ago</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืืืืฉ ืฉืขืืจ</string>
<key>other</key>
<string>ืืคื ื %1$d ืืืืฉืื</string>
<key>zero</key>
<string>ืืืืืฉ</string>
</dict>
</dict>
<key>relative-date-years-ago</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืฉื ื ืฉืขืืจื</string>
<key>other</key>
<string>ืืคื ื %1$d ืฉื ืื</string>
<key>zero</key>
<string>ืืฉื ื</string>
</dict>
</dict>
<key>replace-replace-all-results-count</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>%#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืคืจืื ืืื ืืืืืฃ</string>
<key>other</key>
<string>%1$d ืคืจืืืื ืืืืืคื</string>
</dict>
</dict>
<key>saved-unsave-article-and-remove-from-reading-lists-message</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืืืื ืืฉืืืจื ืฉื %#@v1@ ืืื ืจืฉืืืืช ืืงืจืืื ืืืฉืืืืืช %#@v1@</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืืื</string>
<key>other</key>
<string>ืืืืื</string>
</dict>
</dict>
<key>saved-unsave-article-and-remove-from-reading-lists-title</key>
<dict>
<key>NSStringLocalizedFormatKey</key>
<string>ืืืื ืืช ืืฉืืืจื ืฉื %#@v1@?</string>
<key>v1</key>
<dict>
<key>NSStringFormatSpecTypeKey</key>
<string>NSStringPluralRuleType</string>
<key>NSStringFormatValueTypeKey</key>
<string>d</string>
<key>one</key>
<string>ืืขืจื</string>
<key>other</key>
<string>ืืขืจืืื</string>
</dict>
</dict>
</dict>
</plist>
| {
"pile_set_name": "Github"
} |
#pragma once
#include <functional>
namespace torch { namespace jit {
// RAII guard that invokes an arbitrary cleanup callback when it goes out
// of scope, unless release() has been called first.
class ResourceGuard {
  std::function<void()> _destructor;
  bool _released;

public:
  ResourceGuard(std::function<void()> destructor)
  : _destructor(std::move(destructor))
  , _released(false) {}

  ~ResourceGuard() {
    if (!_released) _destructor();
  }

  // Disarm the guard; the cleanup callback will not be run.
  void release() {
    _released = true;
  }
};
}}
| {
"pile_set_name": "Github"
} |
//
// NEEDS TO BE CLEANED UP
// See _maps.scss
// Use #{rh-var($cssvar)}
// ========================================================================
// UI Styles
// ========================================================================
$rh-global--link--TextDecoration: underline;
$rh-global--link--TextDecoration--hover: underline;
// ========================================================================
// Borders
// ========================================================================
$rh-global--border-width: 1px !default;
$rh-global--border-style: solid !default;
$rh-global--border-radius: 2px;
// For border color, see _colors.scss
// ========================================================================
// Shadows
// ========================================================================
$rh-global--box-shadow--sm: 0 rh-size-prem(1) rh-size-prem(2) 0 rgba($rh-color--gray-1000, .2) !default;
$rh-global--box-shadow--md: 0 rh-size-prem(2) rh-size-prem(1) rh-size-prem(1) rgba($rh-color--gray-1000, .12), 0 rh-size-prem(4) rh-size-prem(11) rh-size-prem(6) rgba($rh-color--gray-1000, .05) !default;
$rh-global--box-shadow--lg: 0 rh-size-prem(3) rh-size-prem(7) rh-size-prem(3) rgba($rh-color--gray-1000, .13), 0 rh-size-prem(11) rh-size-prem(24) rh-size-prem(16) rgba($rh-color--gray-1000, .12) !default;
$rh-global--box-shadow--inset: inset 0 0 rh-size-prem(10) 0 rgba($rh-color--gray-1000, .25) !default;
// ========================================================================
// Animations
// ========================================================================
$rh-global--animation-timing: cubic-bezier(0.465, 0.183, 0.153, 0.946);
// ========================================================================
// Needs to be resolved! DO NOT USE!
// ========================================================================
$rh-global--border--BorderWidth: $rh-global--border-width;
$rh-global--border--BorderWidth--thin: $rh-global--border-width;
$rh-global--border--BorderStyle: $rh-global--border-style;
$rh-global--border--BorderColor: $rh-color--surface--border;
$rh-global--border--BorderColor--light: $rh-color--surface--border--lightest;
$rh-global--border--BorderColor--dark: $rh-color--surface--border--darkest;
$rh-global--border--BorderRadius: $rh-global--border-radius;
$rh-global--button-border--BorderRadius: $rh-global--border-radius;
// ========================================================================
// Shadows
// ========================================================================
$rh-global--shadow--BoxShadow--sm: $rh-global--box-shadow--sm;
$rh-global--shadow--BoxShadow--md: $rh-global--box-shadow--md;
$rh-global--shadow--BoxShadow--lg: $rh-global--box-shadow--lg;
$rh-global--shadow--BoxShadow--sm-right: rh-size-prem(4) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .12) !default;
$rh-global--shadow--BoxShadow--sm-left: rh-size-prem(-4) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .12) !default;
$rh-global--shadow--BoxShadow--sm-bottom: 0 rh-size-prem(4) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .12) !default;
$rh-global--shadow--BoxShadow--sm-top: 0 rh-size-prem(-4) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .12) !default;
$rh-global--shadow--BoxShadow--md-right: rh-size-prem(5) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .25) !default;
$rh-global--shadow--BoxShadow--md-left: rh-size-prem(-5) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .25) !default;
$rh-global--shadow--BoxShadow--md-bottom: 0 rh-size-prem(5) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .25) !default;
$rh-global--shadow--BoxShadow--md-top: 0 rh-size-prem(-5) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .25) !default;
$rh-global--shadow--BoxShadow--lg-right: rh-size-prem(12) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .07) !default;
$rh-global--shadow--BoxShadow--lg-left: rh-size-prem(-12) 0 rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .07) !default;
$rh-global--shadow--BoxShadow--lg-bottom: 0 rh-size-prem(12) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .07) !default;
$rh-global--shadow--BoxShadow--lg-top: 0 rh-size-prem(-12) rh-size-prem(10) rh-size-prem(-4) rgba($pf-color-black-1000, .07) !default;
$rh-global--shadow--BoxShadow--inset: $rh-global--box-shadow--inset;
| {
"pile_set_name": "Github"
} |
package com.example.admin.processingboilerplate;
import android.app.Application;
import android.test.ApplicationTestCase;
/**
* <a href="http://d.android.com/tools/testing/testing_android.html">Testing Fundamentals</a>
*/
public class ApplicationTest extends ApplicationTestCase<Application> {
public ApplicationTest() {
super(Application.class);
}
} | {
"pile_set_name": "Github"
} |
ParamikoSSHClient
=================
.. autoclass:: omniduct.remotes.ssh_paramiko.ParamikoSSHClient
:members:
:special-members: __init__
:inherited-members:
:show-inheritance:
| {
"pile_set_name": "Github"
} |
.include "macros.inc"
start
test_name CMPGU_1
mvi r1, 0
mvi r2, 0
cmpgu r3, r1, r2
check_r3 0
test_name CMPGU_2
mvi r1, 0
mvi r2, 1
cmpgu r3, r1, r2
check_r3 0
test_name CMPGU_3
mvi r1, 1
mvi r2, 0
cmpgu r3, r1, r2
check_r3 1
test_name CMPGU_4
mvi r1, 1
mvi r2, 1
cmpgu r3, r1, r2
check_r3 0
test_name CMPGU_5
mvi r1, 0
mvi r2, -1
cmpgu r3, r1, r2
check_r3 0
test_name CMPGU_6
mvi r1, -1
mvi r2, 0
cmpgu r3, r1, r2
check_r3 1
test_name CMPGU_7
mvi r1, -1
mvi r2, -1
cmpgu r3, r1, r2
check_r3 0
test_name CMPGU_8
mvi r3, 0
mvi r2, 1
cmpgu r3, r3, r2
check_r3 0
test_name CMPGU_9
mvi r3, 1
mvi r2, 0
cmpgu r3, r3, r2
check_r3 1
test_name CMPGU_10
mvi r3, 0
cmpgu r3, r3, r3
check_r3 0
end
| {
"pile_set_name": "Github"
} |
/*******************************************************************************
* Copyright 2012-2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
* Licensed under the Apache License, Version 2.0 (the "License"). You may not use
* this file except in compliance with the License. A copy of the License is located at
*
* http://aws.amazon.com/apache2.0
*
* or in the "license" file accompanying this file.
* This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
* CONDITIONS OF ANY KIND, either express or implied. See the License for the
* specific language governing permissions and limitations under the License.
* *****************************************************************************
*
* AWS Tools for Windows (TM) PowerShell (TM)
*
*/
using System;
using System.Collections.Generic;
using System.Linq;
using System.Management.Automation;
using System.Text;
using Amazon.PowerShell.Common;
using Amazon.Runtime;
using Amazon.Connect;
using Amazon.Connect.Model;
namespace Amazon.PowerShell.Cmdlets.CONN
{
/// <summary>
/// Retrieves a token for federation.
/// </summary>
[Cmdlet("Get", "CONNFederationToken")]
[OutputType("Amazon.Connect.Model.Credentials")]
[AWSCmdlet("Calls the Amazon Connect Service GetFederationToken API operation.", Operation = new[] {"GetFederationToken"}, SelectReturnType = typeof(Amazon.Connect.Model.GetFederationTokenResponse))]
[AWSCmdletOutput("Amazon.Connect.Model.Credentials or Amazon.Connect.Model.GetFederationTokenResponse",
"This cmdlet returns an Amazon.Connect.Model.Credentials object.",
"The service call response (type Amazon.Connect.Model.GetFederationTokenResponse) can also be referenced from properties attached to the cmdlet entry in the $AWSHistory stack."
)]
public partial class GetCONNFederationTokenCmdlet : AmazonConnectClientCmdlet, IExecutor
{
#region Parameter InstanceId
/// <summary>
/// <para>
/// <para>The identifier of the Amazon Connect instance.</para>
/// </para>
/// </summary>
#if !MODULAR
[System.Management.Automation.Parameter(Position = 0, ValueFromPipelineByPropertyName = true, ValueFromPipeline = true)]
#else
[System.Management.Automation.Parameter(Position = 0, ValueFromPipelineByPropertyName = true, ValueFromPipeline = true, Mandatory = true)]
[System.Management.Automation.AllowEmptyString]
[System.Management.Automation.AllowNull]
#endif
[Amazon.PowerShell.Common.AWSRequiredParameter]
public System.String InstanceId { get; set; }
#endregion
#region Parameter Select
/// <summary>
/// Use the -Select parameter to control the cmdlet output. The default value is 'Credentials'.
/// Specifying -Select '*' will result in the cmdlet returning the whole service response (Amazon.Connect.Model.GetFederationTokenResponse).
/// Specifying the name of a property of type Amazon.Connect.Model.GetFederationTokenResponse will result in that property being returned.
/// Specifying -Select '^ParameterName' will result in the cmdlet returning the selected cmdlet parameter value.
/// </summary>
[System.Management.Automation.Parameter(ValueFromPipelineByPropertyName = true)]
public string Select { get; set; } = "Credentials";
#endregion
#region Parameter PassThru
/// <summary>
/// Changes the cmdlet behavior to return the value passed to the InstanceId parameter.
/// The -PassThru parameter is deprecated, use -Select '^InstanceId' instead. This parameter will be removed in a future version.
/// </summary>
[System.Obsolete("The -PassThru parameter is deprecated, use -Select '^InstanceId' instead. This parameter will be removed in a future version.")]
[System.Management.Automation.Parameter(ValueFromPipelineByPropertyName = true)]
public SwitchParameter PassThru { get; set; }
#endregion
protected override void ProcessRecord()
{
base.ProcessRecord();
var context = new CmdletContext();
// allow for manipulation of parameters prior to loading into context
PreExecutionContextLoad(context);
#pragma warning disable CS0618, CS0612 //A class member was marked with the Obsolete attribute
if (ParameterWasBound(nameof(this.Select)))
{
context.Select = CreateSelectDelegate<Amazon.Connect.Model.GetFederationTokenResponse, GetCONNFederationTokenCmdlet>(Select) ??
throw new System.ArgumentException("Invalid value for -Select parameter.", nameof(this.Select));
if (this.PassThru.IsPresent)
{
throw new System.ArgumentException("-PassThru cannot be used when -Select is specified.", nameof(this.Select));
}
}
else if (this.PassThru.IsPresent)
{
context.Select = (response, cmdlet) => this.InstanceId;
}
#pragma warning restore CS0618, CS0612 //A class member was marked with the Obsolete attribute
context.InstanceId = this.InstanceId;
#if MODULAR
if (this.InstanceId == null && ParameterWasBound(nameof(this.InstanceId)))
{
WriteWarning("You are passing $null as a value for parameter InstanceId which is marked as required. In case you believe this parameter was incorrectly marked as required, report this by opening an issue at https://github.com/aws/aws-tools-for-powershell/issues.");
}
#endif
// allow further manipulation of loaded context prior to processing
PostExecutionContextLoad(context);
var output = Execute(context) as CmdletOutput;
ProcessOutput(output);
}
#region IExecutor Members
public object Execute(ExecutorContext context)
{
var cmdletContext = context as CmdletContext;
// create request
var request = new Amazon.Connect.Model.GetFederationTokenRequest();
if (cmdletContext.InstanceId != null)
{
request.InstanceId = cmdletContext.InstanceId;
}
CmdletOutput output;
// issue call
var client = Client ?? CreateClient(_CurrentCredentials, _RegionEndpoint);
try
{
var response = CallAWSServiceOperation(client, request);
object pipelineOutput = null;
pipelineOutput = cmdletContext.Select(response, this);
output = new CmdletOutput
{
PipelineOutput = pipelineOutput,
ServiceResponse = response
};
}
catch (Exception e)
{
output = new CmdletOutput { ErrorResponse = e };
}
return output;
}
public ExecutorContext CreateContext()
{
return new CmdletContext();
}
#endregion
#region AWS Service Operation Call
private Amazon.Connect.Model.GetFederationTokenResponse CallAWSServiceOperation(IAmazonConnect client, Amazon.Connect.Model.GetFederationTokenRequest request)
{
Utils.Common.WriteVerboseEndpointMessage(this, client.Config, "Amazon Connect Service", "GetFederationToken");
try
{
#if DESKTOP
return client.GetFederationToken(request);
#elif CORECLR
return client.GetFederationTokenAsync(request).GetAwaiter().GetResult();
#else
#error "Unknown build edition"
#endif
}
catch (AmazonServiceException exc)
{
var webException = exc.InnerException as System.Net.WebException;
if (webException != null)
{
throw new Exception(Utils.Common.FormatNameResolutionFailureMessage(client.Config, webException.Message), webException);
}
throw;
}
}
#endregion
internal partial class CmdletContext : ExecutorContext
{
public System.String InstanceId { get; set; }
public System.Func<Amazon.Connect.Model.GetFederationTokenResponse, GetCONNFederationTokenCmdlet, object> Select { get; set; } =
(response, cmdlet) => response.Credentials;
}
}
}
| {
"pile_set_name": "Github"
} |
@PACKAGE_INIT@
set(Notcurses_DIR "@PACKAGE_SOME_INSTALL_DIR@")
# Compute paths
get_filename_component(Notcurses_CMAKE_DIR "${CMAKE_CURRENT_LIST_FILE}" PATH)
set(Notcurses_INCLUDE_DIRS "@CONF_INCLUDE_DIRS@")
set(Notcurses_LIBRARY_DIRS "@CONF_LIBRARY_DIRS@")
set(Notcurses_LIBRARIES -lnotcurses)
| {
"pile_set_name": "Github"
} |
/* ----------------------------------------------------------------------------
* This file was automatically generated by SWIG (http://www.swig.org).
* Version 4.0.0
*
* Do not make changes to this file unless you know what you are doing--modify
* the SWIG interface file instead.
* ----------------------------------------------------------------------------- */
package xyz.redtorch.gateway.ctp.x64v6v3v19p1v.api;
public class CThostFtdcExchangeForQuoteField {
private transient long swigCPtr;
protected transient boolean swigCMemOwn;
protected CThostFtdcExchangeForQuoteField(long cPtr, boolean cMemoryOwn) {
swigCMemOwn = cMemoryOwn;
swigCPtr = cPtr;
}
protected static long getCPtr(CThostFtdcExchangeForQuoteField obj) {
return (obj == null) ? 0 : obj.swigCPtr;
}
@SuppressWarnings("deprecation")
protected void finalize() {
delete();
}
public synchronized void delete() {
if (swigCPtr != 0) {
if (swigCMemOwn) {
swigCMemOwn = false;
jctpv6v3v19p1x64apiJNI.delete_CThostFtdcExchangeForQuoteField(swigCPtr);
}
swigCPtr = 0;
}
}
public void setForQuoteLocalID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ForQuoteLocalID_set(swigCPtr, this, value);
}
public String getForQuoteLocalID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ForQuoteLocalID_get(swigCPtr, this);
}
public void setExchangeID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ExchangeID_set(swigCPtr, this, value);
}
public String getExchangeID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ExchangeID_get(swigCPtr, this);
}
public void setParticipantID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ParticipantID_set(swigCPtr, this, value);
}
public String getParticipantID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ParticipantID_get(swigCPtr, this);
}
public void setClientID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ClientID_set(swigCPtr, this, value);
}
public String getClientID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ClientID_get(swigCPtr, this);
}
public void setExchangeInstID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ExchangeInstID_set(swigCPtr, this, value);
}
public String getExchangeInstID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ExchangeInstID_get(swigCPtr, this);
}
public void setTraderID(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_TraderID_set(swigCPtr, this, value);
}
public String getTraderID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_TraderID_get(swigCPtr, this);
}
public void setInstallID(int value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InstallID_set(swigCPtr, this, value);
}
public int getInstallID() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InstallID_get(swigCPtr, this);
}
public void setInsertDate(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InsertDate_set(swigCPtr, this, value);
}
public String getInsertDate() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InsertDate_get(swigCPtr, this);
}
public void setInsertTime(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InsertTime_set(swigCPtr, this, value);
}
public String getInsertTime() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_InsertTime_get(swigCPtr, this);
}
public void setForQuoteStatus(char value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ForQuoteStatus_set(swigCPtr, this, value);
}
public char getForQuoteStatus() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_ForQuoteStatus_get(swigCPtr, this);
}
public void setIPAddress(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_IPAddress_set(swigCPtr, this, value);
}
public String getIPAddress() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_IPAddress_get(swigCPtr, this);
}
public void setMacAddress(String value) {
jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_MacAddress_set(swigCPtr, this, value);
}
public String getMacAddress() {
return jctpv6v3v19p1x64apiJNI.CThostFtdcExchangeForQuoteField_MacAddress_get(swigCPtr, this);
}
public CThostFtdcExchangeForQuoteField() {
this(jctpv6v3v19p1x64apiJNI.new_CThostFtdcExchangeForQuoteField(), true);
}
}
| {
"pile_set_name": "Github"
} |
/** @file
Unaligned access functions of BaseLib.
Copyright (c) 2006 - 2010, Intel Corporation. All rights reserved.<BR>
SPDX-License-Identifier: BSD-2-Clause-Patent
**/
#include "BaseLibInternals.h"
/**
Reads a 16-bit value from memory that may be unaligned.
This function returns the 16-bit value pointed to by Buffer. The function
guarantees that the read operation does not produce an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 16-bit value that may be unaligned.
@return The 16-bit value read from Buffer.
**/
UINT16
EFIAPI
ReadUnaligned16 (
IN CONST UINT16 *Buffer
)
{
ASSERT (Buffer != NULL);
return *Buffer;
}
/**
Writes a 16-bit value to memory that may be unaligned.
This function writes the 16-bit value specified by Value to Buffer. Value is
returned. The function guarantees that the write operation does not produce
an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 16-bit value that may be unaligned.
@param Value 16-bit value to write to Buffer.
@return The 16-bit value to write to Buffer.
**/
UINT16
EFIAPI
WriteUnaligned16 (
OUT UINT16 *Buffer,
IN UINT16 Value
)
{
ASSERT (Buffer != NULL);
return *Buffer = Value;
}
/**
Reads a 24-bit value from memory that may be unaligned.
This function returns the 24-bit value pointed to by Buffer. The function
guarantees that the read operation does not produce an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 24-bit value that may be unaligned.
@return The 24-bit value read from Buffer.
**/
UINT32
EFIAPI
ReadUnaligned24 (
IN CONST UINT32 *Buffer
)
{
ASSERT (Buffer != NULL);
return *Buffer & 0xffffff;
}
/**
Writes a 24-bit value to memory that may be unaligned.
This function writes the 24-bit value specified by Value to Buffer. Value is
returned. The function guarantees that the write operation does not produce
an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 24-bit value that may be unaligned.
@param Value 24-bit value to write to Buffer.
@return The 24-bit value to write to Buffer.
**/
UINT32
EFIAPI
WriteUnaligned24 (
OUT UINT32 *Buffer,
IN UINT32 Value
)
{
ASSERT (Buffer != NULL);
*Buffer = BitFieldWrite32 (*Buffer, 0, 23, Value);
return Value;
}
/**
Reads a 32-bit value from memory that may be unaligned.
This function returns the 32-bit value pointed to by Buffer. The function
guarantees that the read operation does not produce an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 32-bit value that may be unaligned.
@return The 32-bit value read from Buffer.
**/
UINT32
EFIAPI
ReadUnaligned32 (
IN CONST UINT32 *Buffer
)
{
ASSERT (Buffer != NULL);
return *Buffer;
}
/**
Writes a 32-bit value to memory that may be unaligned.
This function writes the 32-bit value specified by Value to Buffer. Value is
returned. The function guarantees that the write operation does not produce
an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 32-bit value that may be unaligned.
@param Value The 32-bit value to write to Buffer.
@return The 32-bit value to write to Buffer.
**/
UINT32
EFIAPI
WriteUnaligned32 (
OUT UINT32 *Buffer,
IN UINT32 Value
)
{
ASSERT (Buffer != NULL);
return *Buffer = Value;
}
/**
Reads a 64-bit value from memory that may be unaligned.
This function returns the 64-bit value pointed to by Buffer. The function
guarantees that the read operation does not produce an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 64-bit value that may be unaligned.
@return The 64-bit value read from Buffer.
**/
UINT64
EFIAPI
ReadUnaligned64 (
IN CONST UINT64 *Buffer
)
{
ASSERT (Buffer != NULL);
return *Buffer;
}
/**
Writes a 64-bit value to memory that may be unaligned.
This function writes the 64-bit value specified by Value to Buffer. Value is
returned. The function guarantees that the write operation does not produce
an alignment fault.
If the Buffer is NULL, then ASSERT().
@param Buffer A pointer to a 64-bit value that may be unaligned.
@param Value The 64-bit value to write to Buffer.
@return The 64-bit value to write to Buffer.
**/
UINT64
EFIAPI
WriteUnaligned64 (
OUT UINT64 *Buffer,
IN UINT64 Value
)
{
ASSERT (Buffer != NULL);
return *Buffer = Value;
}
| {
"pile_set_name": "Github"
} |
class UserMailer < ActionMailer::Base
default from: "[email protected]"
def password_reset(user)
@user = user
    mail to: user.email, subject: "Password Reset"
end
end
| {
"pile_set_name": "Github"
} |
/****************************************************************************
Copyright (c) 2012-2013 cocos2d-x.org
http://www.cocos2d-x.org
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
****************************************************************************/
#ifndef __PLUGIN_JNI_HELPER_H__
#define __PLUGIN_JNI_HELPER_H__
#include <jni.h>
#include <string>
namespace cocos2d {
typedef struct PluginJniMethodInfo_
{
JNIEnv * env;
jclass classID;
jmethodID methodID;
} PluginJniMethodInfo;
class PluginJniHelper
{
public:
static JavaVM* getJavaVM();
static void setJavaVM(JavaVM *javaVM);
static jclass getClassID(const char *className, JNIEnv *env=0);
static bool getStaticMethodInfo(PluginJniMethodInfo &methodinfo, const char *className, const char *methodName, const char *paramCode);
static bool getMethodInfo(PluginJniMethodInfo &methodinfo, const char *className, const char *methodName, const char *paramCode);
static std::string jstring2string(jstring str);
private:
static JavaVM *m_psJavaVM;
};
}
#endif // __PLUGIN_JNI_HELPER_H__
| {
"pile_set_name": "Github"
} |