Dataset columns: id (string), text (string), dataset_id (single value: "PypiClean").
/xandikos-0.2.8.tar.gz/xandikos-0.2.8/notes/structure.rst
Xandikos keeps a fairly clear separation between its components.
Modules
=======
The core WebDAV implementation lives in xandikos.webdav. It implements just the
WebDAV protocol and provides abstract classes for WebDAV resources that other
code can implement.
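As an illustration of that contract, backend code subclasses the abstract
resource classes and fills in storage-specific behaviour. The sketch below is
hypothetical (these are not the actual xandikos.webdav names), showing only the
shape of the interface::

    class Resource:
        """Abstract WebDAV resource, as the protocol module might define it."""

        def get_body(self):
            raise NotImplementedError

    class GitBlobResource(Resource):
        """Backend resource serving a file stored as a blob in a Git repository."""

        def __init__(self, repo, blob_id):
            self.repo = repo  # e.g. a dulwich.repo.Repo
            self.blob_id = blob_id

        def get_body(self):
            # Blobs are looked up through the repository's object store.
            return [self.repo.object_store[self.blob_id].data]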
Several WebDAV extensions (access, CardDAV, CalDAV) live in their own
Python files. They build on top of the WebDAV module and provide the extra
reporter and property implementations defined in those specifications.
The store is a simple object-store implementation on top of a Git repository,
a combination with several properties that make it useful as a WebDAV backend.
The business logic lives in xandikos.web; it ties together the other modules,
/pyghc-0.1.0.tar.gz/pyghc-0.1.0/github/types/__init__.py
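"""Public types for the pyghc GitHub client: re-exports the model classes defined in the submodules and pins the public surface via ``__all__``."""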
from .actor import Actor
from .app_permissions import AppPermissions
from .email import Email
from .event import Event
from .github_error import GithubError
from .headers import Headers
from .integration import Integration
from .issue import Issue
from .issue_comment import IssueComment
from .label import Label
from .license import SimpleLicense
from .list import List
from .milestone import Milestone
from .object import Object
from .page import Page
from .payload import Payload
from .pull_request import PullRequest
from .reaction_rollup import ReactionRollup
from .repo_permissions import RepoPermissions
from .repository import Repository, MinimalRepository, FullRepository, AdvancedSecurity, SecurityAndAnalysis, \
SecretScanning, CodeOfConduct, Repo, RepositorySubscription
from .response import Response, STATUS_CODES_MAPPING
from .user import SimpleUser, PrivateUser, PublicUser
from .stargazer import Stargazer
from .search import Match, SearchResultTextMatch, CodeSearchResultItem, SearchCodeResult, SearchRepositoriesResult, \
RepoSearchResultItem, RelatedObject, AliasObject, TopicSearchResultItem, SearchTopicsResult, SearchUsersResult, \
UserSearchResultItem, LabelSearchResultItem, LabelSearchResult
from .rate_limit import RateLimitOverview, RateLimit, Resources
__all__ = [
    'Actor',
    'Headers',
'SimpleLicense',
'List',
'Object',
'RepoPermissions',
'Repository',
'MinimalRepository',
'FullRepository',
'SecurityAndAnalysis',
'SecretScanning',
'CodeOfConduct',
'AdvancedSecurity',
'Repo',
'RepositorySubscription',
'SimpleUser',
'PrivateUser',
'PublicUser',
'STATUS_CODES_MAPPING',
'Response',
'GithubError',
'Email',
'Event',
'Label',
'Milestone',
'Page',
'PullRequest',
'AppPermissions',
'Issue',
'IssueComment',
'Integration',
'Payload',
'ReactionRollup',
'Stargazer',
'Match',
'CodeSearchResultItem',
'SearchResultTextMatch',
'SearchCodeResult',
'SearchRepositoriesResult',
'RepoSearchResultItem',
'RelatedObject',
'AliasObject',
'TopicSearchResultItem',
'SearchTopicsResult',
'UserSearchResultItem',
'SearchUsersResult',
'LabelSearchResult',
'LabelSearchResultItem',
'RateLimit',
'RateLimitOverview',
'Resources',
]
/naturalsort-1.5.1.tar.gz/naturalsort-1.5.1/README.rst
Simple natural order sorting API for Python
===========================================
.. image:: https://travis-ci.org/xolox/python-naturalsort.svg?branch=master
:target: https://travis-ci.org/xolox/python-naturalsort
.. image:: https://coveralls.io/repos/xolox/python-naturalsort/badge.png?branch=master
:target: https://coveralls.io/r/xolox/python-naturalsort?branch=master
The ``natsort.natsort()`` function in the ``naturalsort`` package is a very
simple alternative to Python's ``sorted()`` function that implements `natural
order sorting`_ in Python. The package is available on PyPI_, so getting
started is very simple::
$ pip install naturalsort
$ python
>>> from natsort import natsort
>>> versions = ['1.8.1-r26', '1.8.1-r30', '2.0-r2', '2.0-r7', '2.0-r11']
>>> natsort(['my-package-%s' % v for v in versions])
['my-package-1.8.1-r26',
 'my-package-1.8.1-r30',
 'my-package-2.0-r2',
 'my-package-2.0-r7',
 'my-package-2.0-r11']
Usage
-----
Here's an example of regular sorting (based on the ASCII_ order of individual
characters) compared to `natural order sorting`_::
>>> # Import the sorted() alternative.
>>> from natsort import natsort
>>>
>>> # This is plain old sorting (what we DON'T want).
>>> sorted(['1', '5', '10', '50'])
['1', '10', '5', '50']
>>>
>>> # This is natural order sorting (what we DO want).
>>> natsort(['1', '5', '10', '50'])
['1', '5', '10', '50']
>>>
>>> # natsort() accepts an optional ``reverse`` argument for consistency
>>> # with the built-in sorted() function.
>>> natsort(['1', '5', '10', '50'], reverse=True)
['50', '10', '5', '1']
Custom comparison keys
^^^^^^^^^^^^^^^^^^^^^^
The main use case that the naturalsort_ package was originally created for is
sorting of filenames with version numbers embedded in them. Unfortunately this
won't always work out of the box; you may need to define a custom comparison
key. Here's an example where a custom comparison key is required to get the
proper results::
>>> from natsort import natsort
>>> from pprint import pprint
>>> versions = ['package-name_1_all.deb',
...             'package-name_1.5_all.deb',
...             'package-name_2_all.deb']
This is what happens by default::
>>> pprint(natsort(versions))
['package-name_1.5_all.deb',
'package-name_1_all.deb',
'package-name_2_all.deb']
Here's how to get the right results::
>>> from os.path import basename, splitext
>>> def version_from_fname(filename):
...     filename, extension = splitext(basename(filename))
...     name, version, architecture = filename.split('_')
...     return version
...
>>> pprint(natsort(versions, key=version_from_fname))
['package-name_1_all.deb',
'package-name_1.5_all.deb',
'package-name_2_all.deb']
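For intuition, natural order sorting boils down to splitting each string into
digit and non-digit runs so that the digit runs compare as numbers. Here's a
minimal sketch of such a comparison key (illustrative only, not the actual
implementation used by naturalsort_)::

>>> import re
>>> def natural_key(value):
...     # Digit runs become integers; everything else stays a string.
...     return [int(part) if part.isdigit() else part
...             for part in re.split('([0-9]+)', value)]
...
>>> sorted(['1', '5', '10', '50'], key=natural_key)
['1', '5', '10', '50']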
Why another natsort module?!
----------------------------
The natsort_ package on PyPI is more advanced and configurable than my
naturalsort_ package, so depending on your use case you may prefer to use that
package instead. Here are the differences:
1. My naturalsort_ package implements only a small subset of the functionality
of the natsort_ package; specifically, the following calls result in the same
sorting order:
naturalsort package:
``natsort.natsort(['1-1', '1-2'])``
natsort package:
``natsort.natsorted(['1-1', '1-2'], number_type=None)``
This example shows the different goals of the two packages: The naturalsort_
package is intended to sort version numbers while the natsort_ package by
default interprets dashes as a negative sign and requires the keyword
argument ``number_type=None`` to disable this behavior.
2. The naturalsort_ package works on Python 2.4 and 2.5 while the natsort_
package requires at least Python 2.6.
Contact
-------
The latest version of naturalsort_ is available on PyPI_ and GitHub_. For
bug reports please create an issue on GitHub_. If you have questions,
suggestions, etc. feel free to send me an e-mail at `[email protected]`_.
License
-------
This software is licensed under the `MIT license`_.
© 2015 Peter Odding.
.. External references:
.. _ASCII: http://en.wikipedia.org/wiki/ASCII
.. _GitHub: https://github.com/xolox/python-naturalsort
.. _MIT license: http://en.wikipedia.org/wiki/MIT_License
.. _natsort: https://pypi.python.org/pypi/natsort
.. _natural order sorting: http://www.codinghorror.com/blog/2007/12/sorting-for-humans-natural-sort-order.html
.. _naturalsort: https://pypi.python.org/pypi/naturalsort
.. _[email protected]: mailto:[email protected]
.. _PyPI: https://pypi.python.org/pypi/naturalsort
/pulumiverse_mssql-0.0.7.tar.gz/pulumiverse_mssql-0.0.7/pulumiverse_mssql/server_permission.py
import copy
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['ServerPermissionArgs', 'ServerPermission']
@pulumi.input_type
class ServerPermissionArgs:
def __init__(__self__, *,
permission: pulumi.Input[str],
principal_id: pulumi.Input[str],
with_grant_option: Optional[pulumi.Input[bool]] = None):
"""
The set of arguments for constructing a ServerPermission resource.
:param pulumi.Input[str] permission: Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
:param pulumi.Input[str] principal_id: ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
:param pulumi.Input[bool] with_grant_option: When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
pulumi.set(__self__, "permission", permission)
pulumi.set(__self__, "principal_id", principal_id)
if with_grant_option is not None:
pulumi.set(__self__, "with_grant_option", with_grant_option)
@property
@pulumi.getter
def permission(self) -> pulumi.Input[str]:
"""
Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
"""
return pulumi.get(self, "permission")
@permission.setter
def permission(self, value: pulumi.Input[str]):
pulumi.set(self, "permission", value)
@property
@pulumi.getter(name="principalId")
def principal_id(self) -> pulumi.Input[str]:
"""
ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
"""
return pulumi.get(self, "principal_id")
@principal_id.setter
def principal_id(self, value: pulumi.Input[str]):
pulumi.set(self, "principal_id", value)
@property
@pulumi.getter(name="withGrantOption")
def with_grant_option(self) -> Optional[pulumi.Input[bool]]:
"""
When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
return pulumi.get(self, "with_grant_option")
@with_grant_option.setter
def with_grant_option(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "with_grant_option", value)
@pulumi.input_type
class _ServerPermissionState:
def __init__(__self__, *,
permission: Optional[pulumi.Input[str]] = None,
principal_id: Optional[pulumi.Input[str]] = None,
with_grant_option: Optional[pulumi.Input[bool]] = None):
"""
Input properties used for looking up and filtering ServerPermission resources.
:param pulumi.Input[str] permission: Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
:param pulumi.Input[str] principal_id: ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
:param pulumi.Input[bool] with_grant_option: When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
if permission is not None:
pulumi.set(__self__, "permission", permission)
if principal_id is not None:
pulumi.set(__self__, "principal_id", principal_id)
if with_grant_option is not None:
pulumi.set(__self__, "with_grant_option", with_grant_option)
@property
@pulumi.getter
def permission(self) -> Optional[pulumi.Input[str]]:
"""
Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
"""
return pulumi.get(self, "permission")
@permission.setter
def permission(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "permission", value)
@property
@pulumi.getter(name="principalId")
def principal_id(self) -> Optional[pulumi.Input[str]]:
"""
ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
"""
return pulumi.get(self, "principal_id")
@principal_id.setter
def principal_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "principal_id", value)
@property
@pulumi.getter(name="withGrantOption")
def with_grant_option(self) -> Optional[pulumi.Input[bool]]:
"""
When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
return pulumi.get(self, "with_grant_option")
@with_grant_option.setter
def with_grant_option(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "with_grant_option", value)
class ServerPermission(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
permission: Optional[pulumi.Input[str]] = None,
principal_id: Optional[pulumi.Input[str]] = None,
with_grant_option: Optional[pulumi.Input[bool]] = None,
__props__=None):
"""
Grants server-level permission.
## Example Usage
```python
import pulumi
import pulumiverse_mssql as mssql
example = mssql.get_sql_login(name="example_login")
connect_to_example = mssql.ServerPermission("connectToExample",
principal_id=example.principal_id,
permission="CONNECT SQL",
with_grant_option=True)
```
## Import
Import using `<principal_id>/<permission>`:
```sh
$ pulumi import mssql:index/serverPermission:ServerPermission example '7/CONNECT SQL'
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] permission: Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
:param pulumi.Input[str] principal_id: ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
:param pulumi.Input[bool] with_grant_option: When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServerPermissionArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Grants server-level permission.
## Example Usage
```python
import pulumi
import pulumiverse_mssql as mssql
example = mssql.get_sql_login(name="example_login")
connect_to_example = mssql.ServerPermission("connectToExample",
principal_id=example.principal_id,
permission="CONNECT SQL",
with_grant_option=True)
```
## Import
Import using `<principal_id>/<permission>`:
```sh
$ pulumi import mssql:index/serverPermission:ServerPermission example '7/CONNECT SQL'
```
:param str resource_name: The name of the resource.
:param ServerPermissionArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServerPermissionArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
permission: Optional[pulumi.Input[str]] = None,
principal_id: Optional[pulumi.Input[str]] = None,
with_grant_option: Optional[pulumi.Input[bool]] = None,
__props__=None):
opts = pulumi.ResourceOptions.merge(_utilities.get_resource_opts_defaults(), opts)
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServerPermissionArgs.__new__(ServerPermissionArgs)
if permission is None and not opts.urn:
raise TypeError("Missing required property 'permission'")
__props__.__dict__["permission"] = permission
if principal_id is None and not opts.urn:
raise TypeError("Missing required property 'principal_id'")
__props__.__dict__["principal_id"] = principal_id
__props__.__dict__["with_grant_option"] = with_grant_option
super(ServerPermission, __self__).__init__(
'mssql:index/serverPermission:ServerPermission',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
permission: Optional[pulumi.Input[str]] = None,
principal_id: Optional[pulumi.Input[str]] = None,
with_grant_option: Optional[pulumi.Input[bool]] = None) -> 'ServerPermission':
"""
Get an existing ServerPermission resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] permission: Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
:param pulumi.Input[str] principal_id: ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
:param pulumi.Input[bool] with_grant_option: When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServerPermissionState.__new__(_ServerPermissionState)
__props__.__dict__["permission"] = permission
__props__.__dict__["principal_id"] = principal_id
__props__.__dict__["with_grant_option"] = with_grant_option
return ServerPermission(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def permission(self) -> pulumi.Output[str]:
"""
Name of server-level SQL permission. For full list of supported permissions see [docs](https://learn.microsoft.com/en-us/sql/t-sql/statements/grant-server-permissions-transact-sql?view=azuresqldb-current#remarks)
"""
return pulumi.get(self, "permission")
@property
@pulumi.getter(name="principalId")
def principal_id(self) -> pulumi.Output[str]:
"""
ID of the principal who will be granted `permission`. Can be retrieved using `ServerRole` or `SqlLogin`.
"""
return pulumi.get(self, "principal_id")
@property
@pulumi.getter(name="withGrantOption")
def with_grant_option(self) -> pulumi.Output[bool]:
"""
When set to `true`, `principal_id` will be allowed to grant the `permission` to other principals. Defaults to `false`
"""
return pulumi.get(self, "with_grant_option")
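# Usage sketch (illustrative; it mirrors the docstring example above): after
# importing an existing grant with
#   pulumi import mssql:index/serverPermission:ServerPermission example '7/CONNECT SQL'
# the same resource can be re-hydrated in code via the static get():
#
#     existing = ServerPermission.get("connectToExample", id="7/CONNECT SQL")
#     pulumi.export("withGrantOption", existing.with_grant_option)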
/functions-cli-0.1.0a1.tar.gz/functions-cli-0.1.0a1/functions/docker.py
from typing import List, Optional
import docker
from docker.models.containers import Container as DockerContainer
from docker.models.images import Image as DockerImage
from pydantic import ValidationError
from functions.constants import DockerLabel
from functions.errors import FunctionValueError
from functions.system import load_config
from functions.types import FunctionConfig
# TODO: Find a better way of doing this
docker_client: Optional[docker.DockerClient] = None
if not docker_client:
    docker_client = docker.from_env()
# TODO: Add a docker labels class
def get_config_from_image(image: DockerImage) -> FunctionConfig:
# TODO: Change to function dir
config_path = image.labels.get(DockerLabel.FUNCTION_PATH)
    try:
        return load_config(config_path)
    except ValidationError as error:
        raise ValueError(
            "Could not load image configuration: missing config file. Try rebuilding the image."
        ) from error
def get_function_tag_from_labels(labels: dict) -> Optional[str]:
return labels.get(DockerLabel.FUNCTION_NAME)
def all_images() -> List[DockerImage]:
"""Returns all functions created by this package"""
return docker_client.images.list(
filters={"label": f"{DockerLabel.ORGANISATION}=Ventress"}
)
def all_functions() -> List[str]:
"""Returns the names of functions that are workable"""
functions = []
for image in all_images():
function_tag = get_function_tag_from_labels(image.labels)
if function_tag:
functions.append(function_tag)
return functions
def all_running_containers() -> List[DockerContainer]:
"""Returns all containers"""
return docker_client.containers.list(
filters={"label": f"{DockerLabel.ORGANISATION}=Ventress"}
)
def all_running_functions() -> List[str]:
"""Returns a list of all running functions"""
functions = []
for container in all_running_containers():
function_tag = get_function_tag_from_labels(container.labels)
if function_tag:
functions.append(function_tag)
return functions
def build_image(image_name: str) -> DockerImage:
# TODO: Export inline code
...
def remove_image(image_name: str):
docker_client.images.remove(image_name)
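# Sketch (assumption, not the package's actual flow): build_image() above is
# still a stub. With docker-py, a minimal build that all_images() would later
# discover via its label filter might look like:
#
#     image, _logs = docker_client.images.build(
#         path="path/to/function",  # hypothetical build context directory
#         tag=image_name,
#         labels={f"{DockerLabel.ORGANISATION}": "Ventress"},
#     )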
/django-mstats-0.1.2.tar.gz/django-mstats-0.1.2/django_mstats/static/Chart.min.js
(function(){"use strict";var t=this,i=t.Chart,e=function(t){this.canvas=t.canvas,this.ctx=t;this.width=t.canvas.width,this.height=t.canvas.height;return this.aspectRatio=this.width/this.height,s.retinaScale(this),this};e.defaults={global:{animation:!0,animationSteps:60,animationEasing:"easeOutQuart",showScale:!0,scaleOverride:!1,scaleSteps:null,scaleStepWidth:null,scaleStartValue:null,scaleLineColor:"rgba(0,0,0,.1)",scaleLineWidth:1,scaleShowLabels:!0,scaleLabel:"<%=value%>",scaleIntegersOnly:!0,scaleBeginAtZero:!1,scaleFontFamily:"'Helvetica Neue', 'Helvetica', 'Arial', sans-serif",scaleFontSize:12,scaleFontStyle:"normal",scaleFontColor:"#666",responsive:!1,showTooltips:!0,tooltipEvents:["mousemove","touchstart","touchmove","mouseout"],tooltipFillColor:"rgba(0,0,0,0.8)",tooltipFontFamily:"'Helvetica Neue', 'Helvetica', 'Arial', sans-serif",tooltipFontSize:14,tooltipFontStyle:"normal",tooltipFontColor:"#fff",tooltipTitleFontFamily:"'Helvetica Neue', 'Helvetica', 'Arial', sans-serif",tooltipTitleFontSize:14,tooltipTitleFontStyle:"bold",tooltipTitleFontColor:"#fff",tooltipYPadding:6,tooltipXPadding:6,tooltipCaretSize:8,tooltipCornerRadius:6,tooltipXOffset:10,tooltipTemplate:"<%if (label){%><%=label%>: <%}%><%= value %>",multiTooltipTemplate:"<%= value %>",multiTooltipKeyBackground:"#fff",onAnimationProgress:function(){},onAnimationComplete:function(){}}},e.types={};var s=e.helpers={},n=s.each=function(t,i,e){var s=Array.prototype.slice.call(arguments,3);if(t)if(t.length===+t.length){var n;for(n=0;n<t.length;n++)i.apply(e,[t[n],n].concat(s))}else for(var o in t)i.apply(e,[t[o],o].concat(s))},o=s.clone=function(t){var i={};return n(t,function(e,s){t.hasOwnProperty(s)&&(i[s]=e)}),i},a=s.extend=function(t){return n(Array.prototype.slice.call(arguments,1),function(i){n(i,function(e,s){i.hasOwnProperty(s)&&(t[s]=e)})}),t},h=s.merge=function(){var t=Array.prototype.slice.call(arguments,0);return t.unshift({}),a.apply(null,t)},l=s.indexOf=function(t,i){if(Array.prototype.indexOf)return t.indexOf(i);for(var e=0;e<t.length;e++)if(t[e]===i)return e;return-1},r=s.inherits=function(t){var i=this,e=t&&t.hasOwnProperty("constructor")?t.constructor:function(){return i.apply(this,arguments)},s=function(){this.constructor=e};return s.prototype=i.prototype,e.prototype=new s,e.extend=r,t&&a(e.prototype,t),e.__super__=i.prototype,e},c=s.noop=function(){},u=s.uid=function(){var t=0;return function(){return"chart-"+t++}}(),d=s.warn=function(t){window.console&&"function"==typeof window.console.warn&&console.warn(t)},p=s.amd="function"==typeof t.define&&t.define.amd,f=s.isNumber=function(t){return!isNaN(parseFloat(t))&&isFinite(t)},g=s.max=function(t){return Math.max.apply(Math,t)},m=s.min=function(t){return Math.min.apply(Math,t)},v=(s.cap=function(t,i,e){if(f(i)){if(t>i)return i}else if(f(e)&&e>t)return e;return t},s.getDecimalPlaces=function(t){return t%1!==0&&f(t)?t.toString().split(".")[1].length:0}),S=s.radians=function(t){return t*(Math.PI/180)},x=(s.getAngleFromPoint=function(t,i){var e=i.x-t.x,s=i.y-t.y,n=Math.sqrt(e*e+s*s),o=2*Math.PI+Math.atan2(s,e);return 0>e&&0>s&&(o+=2*Math.PI),{angle:o,distance:n}},s.aliasPixel=function(t){return t%2===0?0:.5}),C=(s.splineCurve=function(t,i,e,s){var n=Math.sqrt(Math.pow(i.x-t.x,2)+Math.pow(i.y-t.y,2)),o=Math.sqrt(Math.pow(e.x-i.x,2)+Math.pow(e.y-i.y,2)),a=s*n/(n+o),h=s*o/(n+o);return{inner:{x:i.x-a*(e.x-t.x),y:i.y-a*(e.y-t.y)},outer:{x:i.x+h*(e.x-t.x),y:i.y+h*(e.y-t.y)}}},s.calculateOrderOfMagnitude=function(t){return 
Math.floor(Math.log(t)/Math.LN10)}),y=(s.calculateScaleRange=function(t,i,e,s,n){var o=2,a=Math.floor(i/(1.5*e)),h=o>=a,l=g(t),r=m(t);l===r&&(l+=.5,r>=.5&&!s?r-=.5:l+=.5);for(var c=Math.abs(l-r),u=C(c),d=Math.ceil(l/(1*Math.pow(10,u)))*Math.pow(10,u),p=s?0:Math.floor(r/(1*Math.pow(10,u)))*Math.pow(10,u),f=d-p,v=Math.pow(10,u),S=Math.round(f/v);(S>a||a>2*S)&&!h;)if(S>a)v*=2,S=Math.round(f/v),S%1!==0&&(h=!0);else if(n&&u>=0){if(v/2%1!==0)break;v/=2,S=Math.round(f/v)}else v/=2,S=Math.round(f/v);return h&&(S=o,v=f/S),{steps:S,stepValue:v,min:p,max:p+S*v}},s.template=function(t,i){function e(t,i){var e=/\W/.test(t)?new Function("obj","var p=[],print=function(){p.push.apply(p,arguments);};with(obj){p.push('"+t.replace(/[\r\t\n]/g," ").split("<%").join(" ").replace(/((^|%>)[^\t]*)'/g,"$1\r").replace(/\t=(.*?)%>/g,"',$1,'").split(" ").join("');").split("%>").join("p.push('").split("\r").join("\\'")+"');}return p.join('');"):s[t]=s[t];return i?e(i):e}var s={};return e(t,i)}),b=(s.generateLabels=function(t,i,e,s){var o=new Array(i);return labelTemplateString&&n(o,function(i,n){o[n]=y(t,{value:e+s*(n+1)})}),o},s.easingEffects={linear:function(t){return t},easeInQuad:function(t){return t*t},easeOutQuad:function(t){return-1*t*(t-2)},easeInOutQuad:function(t){return(t/=.5)<1?.5*t*t:-0.5*(--t*(t-2)-1)},easeInCubic:function(t){return t*t*t},easeOutCubic:function(t){return 1*((t=t/1-1)*t*t+1)},easeInOutCubic:function(t){return(t/=.5)<1?.5*t*t*t:.5*((t-=2)*t*t+2)},easeInQuart:function(t){return t*t*t*t},easeOutQuart:function(t){return-1*((t=t/1-1)*t*t*t-1)},easeInOutQuart:function(t){return(t/=.5)<1?.5*t*t*t*t:-0.5*((t-=2)*t*t*t-2)},easeInQuint:function(t){return 1*(t/=1)*t*t*t*t},easeOutQuint:function(t){return 1*((t=t/1-1)*t*t*t*t+1)},easeInOutQuint:function(t){return(t/=.5)<1?.5*t*t*t*t*t:.5*((t-=2)*t*t*t*t+2)},easeInSine:function(t){return-1*Math.cos(t/1*(Math.PI/2))+1},easeOutSine:function(t){return 1*Math.sin(t/1*(Math.PI/2))},easeInOutSine:function(t){return-0.5*(Math.cos(Math.PI*t/1)-1)},easeInExpo:function(t){return 0===t?1:1*Math.pow(2,10*(t/1-1))},easeOutExpo:function(t){return 1===t?1:1*(-Math.pow(2,-10*t/1)+1)},easeInOutExpo:function(t){return 0===t?0:1===t?1:(t/=.5)<1?.5*Math.pow(2,10*(t-1)):.5*(-Math.pow(2,-10*--t)+2)},easeInCirc:function(t){return t>=1?t:-1*(Math.sqrt(1-(t/=1)*t)-1)},easeOutCirc:function(t){return 1*Math.sqrt(1-(t=t/1-1)*t)},easeInOutCirc:function(t){return(t/=.5)<1?-0.5*(Math.sqrt(1-t*t)-1):.5*(Math.sqrt(1-(t-=2)*t)+1)},easeInElastic:function(t){var i=1.70158,e=0,s=1;return 0===t?0:1==(t/=1)?1:(e||(e=.3),s<Math.abs(1)?(s=1,i=e/4):i=e/(2*Math.PI)*Math.asin(1/s),-(s*Math.pow(2,10*(t-=1))*Math.sin(2*(1*t-i)*Math.PI/e)))},easeOutElastic:function(t){var i=1.70158,e=0,s=1;return 0===t?0:1==(t/=1)?1:(e||(e=.3),s<Math.abs(1)?(s=1,i=e/4):i=e/(2*Math.PI)*Math.asin(1/s),s*Math.pow(2,-10*t)*Math.sin(2*(1*t-i)*Math.PI/e)+1)},easeInOutElastic:function(t){var i=1.70158,e=0,s=1;return 0===t?0:2==(t/=.5)?1:(e||(e=.3*1.5),s<Math.abs(1)?(s=1,i=e/4):i=e/(2*Math.PI)*Math.asin(1/s),1>t?-.5*s*Math.pow(2,10*(t-=1))*Math.sin(2*(1*t-i)*Math.PI/e):s*Math.pow(2,-10*(t-=1))*Math.sin(2*(1*t-i)*Math.PI/e)*.5+1)},easeInBack:function(t){var i=1.70158;return 1*(t/=1)*t*((i+1)*t-i)},easeOutBack:function(t){var i=1.70158;return 1*((t=t/1-1)*t*((i+1)*t+i)+1)},easeInOutBack:function(t){var i=1.70158;return(t/=.5)<1?.5*t*t*(((i*=1.525)+1)*t-i):.5*((t-=2)*t*(((i*=1.525)+1)*t+i)+2)},easeInBounce:function(t){return 
1-b.easeOutBounce(1-t)},easeOutBounce:function(t){return(t/=1)<1/2.75?7.5625*t*t:2/2.75>t?1*(7.5625*(t-=1.5/2.75)*t+.75):2.5/2.75>t?1*(7.5625*(t-=2.25/2.75)*t+.9375):1*(7.5625*(t-=2.625/2.75)*t+.984375)},easeInOutBounce:function(t){return.5>t?.5*b.easeInBounce(2*t):.5*b.easeOutBounce(2*t-1)+.5}}),w=s.requestAnimFrame=function(){return window.requestAnimationFrame||window.webkitRequestAnimationFrame||window.mozRequestAnimationFrame||window.oRequestAnimationFrame||window.msRequestAnimationFrame||function(t){return window.setTimeout(t,1e3/60)}}(),P=(s.cancelAnimFrame=function(){return window.cancelAnimationFrame||window.webkitCancelAnimationFrame||window.mozCancelAnimationFrame||window.oCancelAnimationFrame||window.msCancelAnimationFrame||function(t){return window.clearTimeout(t,1e3/60)}}(),s.animationLoop=function(t,i,e,s,n,o){var a=0,h=b[e]||b.linear,l=function(){a++;var e=a/i,r=h(e);t.call(o,r,e,a),s.call(o,r,e),i>a?o.animationFrame=w(l):n.apply(o)};w(l)},s.getRelativePosition=function(t){var i,e,s=t.originalEvent||t,n=t.currentTarget||t.srcElement,o=n.getBoundingClientRect();return s.touches?(i=s.touches[0].clientX-o.left,e=s.touches[0].clientY-o.top):(i=s.clientX-o.left,e=s.clientY-o.top),{x:i,y:e}},s.addEvent=function(t,i,e){t.addEventListener?t.addEventListener(i,e):t.attachEvent?t.attachEvent("on"+i,e):t["on"+i]=e}),L=s.removeEvent=function(t,i,e){t.removeEventListener?t.removeEventListener(i,e,!1):t.detachEvent?t.detachEvent("on"+i,e):t["on"+i]=c},k=(s.bindEvents=function(t,i,e){t.events||(t.events={}),n(i,function(i){t.events[i]=function(){e.apply(t,arguments)},P(t.chart.canvas,i,t.events[i])})},s.unbindEvents=function(t,i){n(i,function(i,e){L(t.chart.canvas,e,i)})}),F=s.getMaximumSize=function(t){var i=t.parentNode;return i.clientWidth},R=s.retinaScale=function(t){var i=t.ctx,e=t.canvas.width,s=t.canvas.height;window.devicePixelRatio&&(i.canvas.style.width=e+"px",i.canvas.style.height=s+"px",i.canvas.height=s*window.devicePixelRatio,i.canvas.width=e*window.devicePixelRatio,i.scale(window.devicePixelRatio,window.devicePixelRatio))},A=s.clear=function(t){t.ctx.clearRect(0,0,t.width,t.height)},T=s.fontString=function(t,i,e){return i+" "+t+"px "+e},M=s.longestText=function(t,i,e){t.font=i;var s=0;return n(e,function(i){var e=t.measureText(i).width;s=e>s?e:s}),s},W=s.drawRoundedRectangle=function(t,i,e,s,n,o){t.beginPath(),t.moveTo(i+o,e),t.lineTo(i+s-o,e),t.quadraticCurveTo(i+s,e,i+s,e+o),t.lineTo(i+s,e+n-o),t.quadraticCurveTo(i+s,e+n,i+s-o,e+n),t.lineTo(i+o,e+n),t.quadraticCurveTo(i,e+n,i,e+n-o),t.lineTo(i,e+o),t.quadraticCurveTo(i,e,i+o,e),t.closePath()};e.instances={},e.Type=function(t,i,s){this.options=i,this.chart=s,this.id=u(),e.instances[this.id]=this,i.responsive&&this.resize(),this.initialize.call(this,t)},a(e.Type.prototype,{initialize:function(){return this},clear:function(){return A(this.chart),this},stop:function(){return s.cancelAnimFrame.call(t,this.animationFrame),this},resize:function(t){this.stop();var i=this.chart.canvas,e=F(this.chart.canvas),s=e/this.chart.aspectRatio;return i.width=this.chart.width=e,i.height=this.chart.height=s,R(this.chart),"function"==typeof t&&t.apply(this,Array.prototype.slice.call(arguments,1)),this},reflow:c,render:function(t){return t&&this.reflow(),this.options.animation&&!t?s.animationLoop(this.draw,this.options.animationSteps,this.options.animationEasing,this.options.onAnimationProgress,this.options.onAnimationComplete,this):(this.draw(),this.options.onAnimationComplete.call(this)),this},generateLegend:function(){return 
y(this.options.legendTemplate,this)},destroy:function(){this.clear(),k(this,this.events),delete e.instances[this.id]},showTooltip:function(t,i){"undefined"==typeof this.activeElements&&(this.activeElements=[]);var o=function(t){var i=!1;return t.length!==this.activeElements.length?i=!0:(n(t,function(t,e){t!==this.activeElements[e]&&(i=!0)},this),i)}.call(this,t);if(o||i){if(this.activeElements=t,this.draw(),t.length>0)if(this.datasets&&this.datasets.length>1){for(var a,h,r=this.datasets.length-1;r>=0&&(a=this.datasets[r].points||this.datasets[r].bars||this.datasets[r].segments,h=l(a,t[0]),-1===h);r--);var c=[],u=[],d=function(){var t,i,e,n,o,a=[],l=[],r=[];return s.each(this.datasets,function(i){t=i.points||i.bars||i.segments,t[h]&&a.push(t[h])}),s.each(a,function(t){l.push(t.x),r.push(t.y),c.push(s.template(this.options.multiTooltipTemplate,t)),u.push({fill:t._saved.fillColor||t.fillColor,stroke:t._saved.strokeColor||t.strokeColor})},this),o=m(r),e=g(r),n=m(l),i=g(l),{x:n>this.chart.width/2?n:i,y:(o+e)/2}}.call(this,h);new e.MultiTooltip({x:d.x,y:d.y,xPadding:this.options.tooltipXPadding,yPadding:this.options.tooltipYPadding,xOffset:this.options.tooltipXOffset,fillColor:this.options.tooltipFillColor,textColor:this.options.tooltipFontColor,fontFamily:this.options.tooltipFontFamily,fontStyle:this.options.tooltipFontStyle,fontSize:this.options.tooltipFontSize,titleTextColor:this.options.tooltipTitleFontColor,titleFontFamily:this.options.tooltipTitleFontFamily,titleFontStyle:this.options.tooltipTitleFontStyle,titleFontSize:this.options.tooltipTitleFontSize,cornerRadius:this.options.tooltipCornerRadius,labels:c,legendColors:u,legendColorBackground:this.options.multiTooltipKeyBackground,title:t[0].label,chart:this.chart,ctx:this.chart.ctx}).draw()}else n(t,function(t){var i=t.tooltipPosition();new e.Tooltip({x:Math.round(i.x),y:Math.round(i.y),xPadding:this.options.tooltipXPadding,yPadding:this.options.tooltipYPadding,fillColor:this.options.tooltipFillColor,textColor:this.options.tooltipFontColor,fontFamily:this.options.tooltipFontFamily,fontStyle:this.options.tooltipFontStyle,fontSize:this.options.tooltipFontSize,caretHeight:this.options.tooltipCaretSize,cornerRadius:this.options.tooltipCornerRadius,text:y(this.options.tooltipTemplate,t),chart:this.chart}).draw()},this);return this}},toBase64Image:function(){return this.chart.canvas.toDataURL.apply(this.chart.canvas,arguments)}}),e.Type.extend=function(t){var i=this,s=function(){return i.apply(this,arguments)};if(s.prototype=o(i.prototype),a(s.prototype,t),s.extend=e.Type.extend,t.name||i.prototype.name){var n=t.name||i.prototype.name,l=e.defaults[i.prototype.name]?o(e.defaults[i.prototype.name]):{};e.defaults[n]=a(l,t.defaults),e.types[n]=s,e.prototype[n]=function(t,i){var o=h(e.defaults.global,e.defaults[n],i||{});return new s(t,o,this)}}else d("Name not provided for this chart, so it hasn't been registered");return i},e.Element=function(t){a(this,t),this.initialize.apply(this,arguments),this.save()},a(e.Element.prototype,{initialize:function(){},restore:function(t){return t?n(t,function(t){this[t]=this._saved[t]},this):a(this,this._saved),this},save:function(){return this._saved=o(this),delete this._saved._saved,this},update:function(t){return n(t,function(t,i){this._saved[i]=this[i],this[i]=t},this),this},transition:function(t,i){return 
n(t,function(t,e){this[e]=(t-this._saved[e])*i+this._saved[e]},this),this},tooltipPosition:function(){return{x:this.x,y:this.y}}}),e.Element.extend=r,e.Point=e.Element.extend({display:!0,inRange:function(t,i){var e=this.hitDetectionRadius+this.radius;return Math.pow(t-this.x,2)+Math.pow(i-this.y,2)<Math.pow(e,2)},draw:function(){if(this.display){var t=this.ctx;t.beginPath(),t.arc(this.x,this.y,this.radius,0,2*Math.PI),t.closePath(),t.strokeStyle=this.strokeColor,t.lineWidth=this.strokeWidth,t.fillStyle=this.fillColor,t.fill(),t.stroke()}}}),e.Arc=e.Element.extend({inRange:function(t,i){var e=s.getAngleFromPoint(this,{x:t,y:i}),n=e.angle>=this.startAngle&&e.angle<=this.endAngle,o=e.distance>=this.innerRadius&&e.distance<=this.outerRadius;return n&&o},tooltipPosition:function(){var t=this.startAngle+(this.endAngle-this.startAngle)/2,i=(this.outerRadius-this.innerRadius)/2+this.innerRadius;return{x:this.x+Math.cos(t)*i,y:this.y+Math.sin(t)*i}},draw:function(t){var i=this.ctx;i.beginPath(),i.arc(this.x,this.y,this.outerRadius,this.startAngle,this.endAngle),i.arc(this.x,this.y,this.innerRadius,this.endAngle,this.startAngle,!0),i.closePath(),i.strokeStyle=this.strokeColor,i.lineWidth=this.strokeWidth,i.fillStyle=this.fillColor,i.fill(),i.lineJoin="bevel",this.showStroke&&i.stroke()}}),e.Rectangle=e.Element.extend({draw:function(){var t=this.ctx,i=this.width/2,e=this.x-i,s=this.x+i,n=this.base-(this.base-this.y),o=this.strokeWidth/2;this.showStroke&&(e+=o,s-=o,n+=o),t.beginPath(),t.fillStyle=this.fillColor,t.strokeStyle=this.strokeColor,t.lineWidth=this.strokeWidth,t.moveTo(e,this.base),t.lineTo(e,n),t.lineTo(s,n),t.lineTo(s,this.base),t.fill(),this.showStroke&&t.stroke()},height:function(){return this.base-this.y},inRange:function(t,i){return t>=this.x-this.width/2&&t<=this.x+this.width/2&&i>=this.y&&i<=this.base}}),e.Tooltip=e.Element.extend({draw:function(){var t=this.chart.ctx;t.font=T(this.fontSize,this.fontStyle,this.fontFamily),this.xAlign="center",this.yAlign="above";var i=2,e=t.measureText(this.text).width+2*this.xPadding,s=this.fontSize+2*this.yPadding,n=s+this.caretHeight+i;this.x+e/2>this.chart.width?this.xAlign="left":this.x-e/2<0&&(this.xAlign="right"),this.y-n<0&&(this.yAlign="below");var o=this.x-e/2,a=this.y-n;switch(t.fillStyle=this.fillColor,this.yAlign){case"above":t.beginPath(),t.moveTo(this.x,this.y-i),t.lineTo(this.x+this.caretHeight,this.y-(i+this.caretHeight)),t.lineTo(this.x-this.caretHeight,this.y-(i+this.caretHeight)),t.closePath(),t.fill();break;case"below":a=this.y+i+this.caretHeight,t.beginPath(),t.moveTo(this.x,this.y+i),t.lineTo(this.x+this.caretHeight,this.y+i+this.caretHeight),t.lineTo(this.x-this.caretHeight,this.y+i+this.caretHeight),t.closePath(),t.fill()}switch(this.xAlign){case"left":o=this.x-e+(this.cornerRadius+this.caretHeight);break;case"right":o=this.x-(this.cornerRadius+this.caretHeight)}W(t,o,a,e,s,this.cornerRadius),t.fill(),t.fillStyle=this.textColor,t.textAlign="center",t.textBaseline="middle",t.fillText(this.text,o+e/2,a+s/2)}}),e.MultiTooltip=e.Element.extend({initialize:function(){this.font=T(this.fontSize,this.fontStyle,this.fontFamily),this.titleFont=T(this.titleFontSize,this.titleFontStyle,this.titleFontFamily),this.height=this.labels.length*this.fontSize+(this.labels.length-1)*(this.fontSize/2)+2*this.yPadding+1.5*this.titleFontSize,this.ctx.font=this.titleFont;var t=this.ctx.measureText(this.title).width,i=M(this.ctx,this.font,this.labels)+this.fontSize+3,e=g([i,t]);this.width=e+2*this.xPadding;var 
s=this.height/2;this.y-s<0?this.y=s:this.y+s>this.chart.height&&(this.y=this.chart.height-s),this.x>this.chart.width/2?this.x-=this.xOffset+this.width:this.x+=this.xOffset},getLineHeight:function(t){var i=this.y-this.height/2+this.yPadding,e=t-1;return 0===t?i+this.titleFontSize/2:i+(1.5*this.fontSize*e+this.fontSize/2)+1.5*this.titleFontSize},draw:function(){W(this.ctx,this.x,this.y-this.height/2,this.width,this.height,this.cornerRadius);var t=this.ctx;t.fillStyle=this.fillColor,t.fill(),t.closePath(),t.textAlign="left",t.textBaseline="middle",t.fillStyle=this.titleTextColor,t.font=this.titleFont,t.fillText(this.title,this.x+this.xPadding,this.getLineHeight(0)),t.font=this.font,s.each(this.labels,function(i,e){t.fillStyle=this.textColor,t.fillText(i,this.x+this.xPadding+this.fontSize+3,this.getLineHeight(e+1)),t.fillStyle=this.legendColorBackground,t.fillRect(this.x+this.xPadding,this.getLineHeight(e+1)-this.fontSize/2,this.fontSize,this.fontSize),t.fillStyle=this.legendColors[e].fill,t.fillRect(this.x+this.xPadding,this.getLineHeight(e+1)-this.fontSize/2,this.fontSize,this.fontSize)},this)}}),e.Scale=e.Element.extend({initialize:function(){this.fit()},buildYLabels:function(){this.yLabels=[];for(var t=v(this.stepValue),i=0;i<=this.steps;i++)this.yLabels.push(y(this.templateString,{value:(this.min+i*this.stepValue).toFixed(t)}));this.yLabelWidth=this.display&&this.showLabels?M(this.ctx,this.font,this.yLabels):0},addXLabel:function(t){this.xLabels.push(t),this.valuesCount++,this.fit()},removeXLabel:function(){this.xLabels.shift(),this.valuesCount--,this.fit()},fit:function(){this.startPoint=this.display?this.fontSize:0,this.endPoint=this.display?this.height-1.5*this.fontSize-5:this.height,this.startPoint+=this.padding,this.endPoint-=this.padding;var t,i=this.endPoint-this.startPoint;for(this.calculateYRange(i),this.buildYLabels(),this.calculateXLabelRotation();i>this.endPoint-this.startPoint;)i=this.endPoint-this.startPoint,t=this.yLabelWidth,this.calculateYRange(i),this.buildYLabels(),t<this.yLabelWidth&&this.calculateXLabelRotation()},calculateXLabelRotation:function(){this.ctx.font=this.font;var t,i,e=this.ctx.measureText(this.xLabels[0]).width,s=this.ctx.measureText(this.xLabels[this.xLabels.length-1]).width;if(this.xScalePaddingRight=s/2+3,this.xScalePaddingLeft=e/2>this.yLabelWidth+10?e/2:this.yLabelWidth+10,this.xLabelRotation=0,this.display){var n,o=M(this.ctx,this.font,this.xLabels);this.xLabelWidth=o;for(var a=Math.floor(this.calculateX(1)-this.calculateX(0))-6;this.xLabelWidth>a&&0===this.xLabelRotation||this.xLabelWidth>a&&this.xLabelRotation<=90&&this.xLabelRotation>0;)n=Math.cos(S(this.xLabelRotation)),t=n*e,i=n*s,t+this.fontSize/2>this.yLabelWidth+8&&(this.xScalePaddingLeft=t+this.fontSize/2),this.xScalePaddingRight=this.fontSize/2,this.xLabelRotation++,this.xLabelWidth=n*o;this.xLabelRotation>0&&(this.endPoint-=Math.sin(S(this.xLabelRotation))*o+3)}else this.xLabelWidth=0,this.xScalePaddingRight=this.padding,this.xScalePaddingLeft=this.padding},calculateYRange:c,drawingArea:function(){return this.startPoint-this.endPoint},calculateY:function(t){var i=this.drawingArea()/(this.min-this.max);return this.endPoint-i*(t-this.min)},calculateX:function(t){var i=(this.xLabelRotation>0,this.width-(this.xScalePaddingLeft+this.xScalePaddingRight)),e=i/(this.valuesCount-(this.offsetGridLines?0:1)),s=e*t+this.xScalePaddingLeft;return this.offsetGridLines&&(s+=e/2),Math.round(s)},update:function(t){s.extend(this,t),this.fit()},draw:function(){var 
t=this.ctx,i=(this.endPoint-this.startPoint)/this.steps,e=Math.round(this.xScalePaddingLeft);this.display&&(t.fillStyle=this.textColor,t.font=this.font,n(this.yLabels,function(n,o){var a=this.endPoint-i*o,h=Math.round(a);t.textAlign="right",t.textBaseline="middle",this.showLabels&&t.fillText(n,e-10,a),t.beginPath(),o>0?(t.lineWidth=this.gridLineWidth,t.strokeStyle=this.gridLineColor):(t.lineWidth=this.lineWidth,t.strokeStyle=this.lineColor),h+=s.aliasPixel(t.lineWidth),t.moveTo(e,h),t.lineTo(this.width,h),t.stroke(),t.closePath(),t.lineWidth=this.lineWidth,t.strokeStyle=this.lineColor,t.beginPath(),t.moveTo(e-5,h),t.lineTo(e,h),t.stroke(),t.closePath()},this),n(this.xLabels,function(i,e){var s=this.calculateX(e)+x(this.lineWidth),n=this.calculateX(e-(this.offsetGridLines?.5:0))+x(this.lineWidth),o=this.xLabelRotation>0;t.beginPath(),e>0?(t.lineWidth=this.gridLineWidth,t.strokeStyle=this.gridLineColor):(t.lineWidth=this.lineWidth,t.strokeStyle=this.lineColor),t.moveTo(n,this.endPoint),t.lineTo(n,this.startPoint-3),t.stroke(),t.closePath(),t.lineWidth=this.lineWidth,t.strokeStyle=this.lineColor,t.beginPath(),t.moveTo(n,this.endPoint),t.lineTo(n,this.endPoint+5),t.stroke(),t.closePath(),t.save(),t.translate(s,o?this.endPoint+12:this.endPoint+8),t.rotate(-1*S(this.xLabelRotation)),t.font=this.font,t.textAlign=o?"right":"center",t.textBaseline=o?"middle":"top",t.fillText(i,0,0),t.restore()},this))}}),e.RadialScale=e.Element.extend({initialize:function(){this.size=m([this.height,this.width]),this.drawingArea=this.display?this.size/2-(this.fontSize/2+this.backdropPaddingY):this.size/2},calculateCenterOffset:function(t){var i=this.drawingArea/(this.max-this.min);return(t-this.min)*i},update:function(){this.lineArc?this.drawingArea=this.display?this.size/2-(this.fontSize/2+this.backdropPaddingY):this.size/2:this.setScaleSize(),this.buildYLabels()},buildYLabels:function(){this.yLabels=[];for(var t=v(this.stepValue),i=0;i<=this.steps;i++)this.yLabels.push(y(this.templateString,{value:(this.min+i*this.stepValue).toFixed(t)}))},getCircumference:function(){return 2*Math.PI/this.valuesCount},setScaleSize:function(){var t,i,e,s,n,o,a,h,l,r,c,u,d=m([this.height/2-this.pointLabelFontSize-5,this.width/2]),p=this.width,g=0;for(this.ctx.font=T(this.pointLabelFontSize,this.pointLabelFontStyle,this.pointLabelFontFamily),i=0;i<this.valuesCount;i++)t=this.getPointPosition(i,d),e=this.ctx.measureText(y(this.templateString,{value:this.labels[i]})).width+5,0===i||i===this.valuesCount/2?(s=e/2,t.x+s>p&&(p=t.x+s,n=i),t.x-s<g&&(g=t.x-s,a=i)):i<this.valuesCount/2?t.x+e>p&&(p=t.x+e,n=i):i>this.valuesCount/2&&t.x-e<g&&(g=t.x-e,a=i);l=g,r=Math.ceil(p-this.width),o=this.getIndexAngle(n),h=this.getIndexAngle(a),c=r/Math.sin(o+Math.PI/2),u=l/Math.sin(h+Math.PI/2),c=f(c)?c:0,u=f(u)?u:0,this.drawingArea=d-(u+c)/2,this.setCenterPoint(u,c)},setCenterPoint:function(t,i){var e=this.width-i-this.drawingArea,s=t+this.drawingArea;this.xCenter=(s+e)/2,this.yCenter=this.height/2},getIndexAngle:function(t){var i=2*Math.PI/this.valuesCount;return t*i-Math.PI/2},getPointPosition:function(t,i){var e=this.getIndexAngle(t);return{x:Math.cos(e)*i+this.xCenter,y:Math.sin(e)*i+this.yCenter}},draw:function(){if(this.display){var t=this.ctx;if(n(this.yLabels,function(i,e){if(e>0){var s,n=e*(this.drawingArea/this.steps),o=this.yCenter-n;if(this.lineWidth>0)if(t.strokeStyle=this.lineColor,t.lineWidth=this.lineWidth,this.lineArc)t.beginPath(),t.arc(this.xCenter,this.yCenter,n,0,2*Math.PI),t.closePath(),t.stroke();else{t.beginPath();for(var 
a=0;a<this.valuesCount;a++)s=this.getPointPosition(a,this.calculateCenterOffset(this.min+e*this.stepValue)),0===a?t.moveTo(s.x,s.y):t.lineTo(s.x,s.y);t.closePath(),t.stroke()}if(this.showLabels){if(t.font=T(this.fontSize,this.fontStyle,this.fontFamily),this.showLabelBackdrop){var h=t.measureText(i).width;t.fillStyle=this.backdropColor,t.fillRect(this.xCenter-h/2-this.backdropPaddingX,o-this.fontSize/2-this.backdropPaddingY,h+2*this.backdropPaddingX,this.fontSize+2*this.backdropPaddingY)}t.textAlign="center",t.textBaseline="middle",t.fillStyle=this.fontColor,t.fillText(i,this.xCenter,o)}}},this),!this.lineArc){t.lineWidth=this.angleLineWidth,t.strokeStyle=this.angleLineColor;for(var i=this.valuesCount-1;i>=0;i--){if(this.angleLineWidth>0){var e=this.getPointPosition(i,this.calculateCenterOffset(this.max));t.beginPath(),t.moveTo(this.xCenter,this.yCenter),t.lineTo(e.x,e.y),t.stroke(),t.closePath()}var s=this.getPointPosition(i,this.calculateCenterOffset(this.max)+5);t.font=T(this.pointLabelFontSize,this.pointLabelFontStyle,this.pointLabelFontFamily),t.fillStyle=this.pointLabelFontColor;var o=this.labels.length,a=this.labels.length/2,h=a/2,l=h>i||i>o-h,r=i===h||i===o-h;t.textAlign=0===i?"center":i===a?"center":a>i?"left":"right",t.textBaseline=r?"middle":l?"bottom":"top",t.fillText(this.labels[i],s.x,s.y)}}}}}),s.addEvent(window,"resize",function(){var t;return function(){clearTimeout(t),t=setTimeout(function(){n(e.instances,function(t){t.options.responsive&&t.resize(t.render,!0)})},50)}}()),p?define(function(){return e}):"object"==typeof module&&module.exports&&(module.exports=e),t.Chart=e,e.noConflict=function(){return t.Chart=i,e}}).call(this),function(){"use strict";var t=this,i=t.Chart,e=i.helpers,s={scaleBeginAtZero:!0,scaleShowGridLines:!0,scaleGridLineColor:"rgba(0,0,0,.05)",scaleGridLineWidth:1,barShowStroke:!0,barStrokeWidth:2,barValueSpacing:5,barDatasetSpacing:1,legendTemplate:'<ul class="<%=name.toLowerCase()%>-legend"><% for (var i=0; i<datasets.length; i++){%><li><span style="background-color:<%=datasets[i].fillColor%>"></span><%if(datasets[i].label){%><%=datasets[i].label%><%}%></li><%}%></ul>'};i.Type.extend({name:"Bar",defaults:s,initialize:function(t){var s=this.options;this.ScaleClass=i.Scale.extend({offsetGridLines:!0,calculateBarX:function(t,i,e){var n=this.calculateBaseWidth(),o=this.calculateX(e)-n/2,a=this.calculateBarWidth(t);return o+a*i+i*s.barDatasetSpacing+a/2},calculateBaseWidth:function(){return this.calculateX(1)-this.calculateX(0)-2*s.barValueSpacing},calculateBarWidth:function(t){var i=this.calculateBaseWidth()-(t-1)*s.barDatasetSpacing;return i/t}}),this.datasets=[],this.options.showTooltips&&e.bindEvents(this,this.options.tooltipEvents,function(t){var i="mouseout"!==t.type?this.getBarsAtEvent(t):[];this.eachBars(function(t){t.restore(["fillColor","strokeColor"])}),e.each(i,function(t){t.fillColor=t.highlightFill,t.strokeColor=t.highlightStroke}),this.showTooltip(i)}),this.BarClass=i.Rectangle.extend({strokeWidth:this.options.barStrokeWidth,showStroke:this.options.barShowStroke,ctx:this.chart.ctx}),e.each(t.datasets,function(i){var s={label:i.label||null,fillColor:i.fillColor,strokeColor:i.strokeColor,bars:[]};this.datasets.push(s),e.each(i.data,function(n,o){e.isNumber(n)&&s.bars.push(new 
this.BarClass({value:n,label:t.labels[o],datasetLabel:i.label,strokeColor:i.strokeColor,fillColor:i.fillColor,highlightFill:i.highlightFill||i.fillColor,highlightStroke:i.highlightStroke||i.strokeColor}))},this)},this),this.buildScale(t.labels),this.BarClass.prototype.base=this.scale.endPoint,this.eachBars(function(t,i,s){e.extend(t,{width:this.scale.calculateBarWidth(this.datasets.length),x:this.scale.calculateBarX(this.datasets.length,s,i),y:this.scale.endPoint}),t.save()},this),this.render()},update:function(){this.scale.update(),e.each(this.activeElements,function(t){t.restore(["fillColor","strokeColor"])}),this.eachBars(function(t){t.save()}),this.render()},eachBars:function(t){e.each(this.datasets,function(i,s){e.each(i.bars,t,this,s)},this)},getBarsAtEvent:function(t){for(var i,s=[],n=e.getRelativePosition(t),o=function(t){s.push(t.bars[i])},a=0;a<this.datasets.length;a++)for(i=0;i<this.datasets[a].bars.length;i++)if(this.datasets[a].bars[i].inRange(n.x,n.y))return e.each(this.datasets,o),s;return s},buildScale:function(t){var i=this,s=function(){var t=[];return i.eachBars(function(i){t.push(i.value)}),t},n={templateString:this.options.scaleLabel,height:this.chart.height,width:this.chart.width,ctx:this.chart.ctx,textColor:this.options.scaleFontColor,fontSize:this.options.scaleFontSize,fontStyle:this.options.scaleFontStyle,fontFamily:this.options.scaleFontFamily,valuesCount:t.length,beginAtZero:this.options.scaleBeginAtZero,integersOnly:this.options.scaleIntegersOnly,calculateYRange:function(t){var i=e.calculateScaleRange(s(),t,this.fontSize,this.beginAtZero,this.integersOnly);e.extend(this,i)},xLabels:t,font:e.fontString(this.options.scaleFontSize,this.options.scaleFontStyle,this.options.scaleFontFamily),lineWidth:this.options.scaleLineWidth,lineColor:this.options.scaleLineColor,gridLineWidth:this.options.scaleShowGridLines?this.options.scaleGridLineWidth:0,gridLineColor:this.options.scaleShowGridLines?this.options.scaleGridLineColor:"rgba(0,0,0,0)",padding:this.options.showScale?0:this.options.barShowStroke?this.options.barStrokeWidth:0,showLabels:this.options.scaleShowLabels,display:this.options.showScale};this.options.scaleOverride&&e.extend(n,{calculateYRange:e.noop,steps:this.options.scaleSteps,stepValue:this.options.scaleStepWidth,min:this.options.scaleStartValue,max:this.options.scaleStartValue+this.options.scaleSteps*this.options.scaleStepWidth}),this.scale=new this.ScaleClass(n)},addData:function(t,i){e.each(t,function(t,s){e.isNumber(t)&&this.datasets[s].bars.push(new this.BarClass({value:t,label:i,x:this.scale.calculateBarX(this.datasets.length,s,this.scale.valuesCount+1),y:this.scale.endPoint,width:this.scale.calculateBarWidth(this.datasets.length),base:this.scale.endPoint,strokeColor:this.datasets[s].strokeColor,fillColor:this.datasets[s].fillColor}))},this),this.scale.addXLabel(i),this.update()},removeData:function(){this.scale.removeXLabel(),e.each(this.datasets,function(t){t.bars.shift()},this),this.update()},reflow:function(){e.extend(this.BarClass.prototype,{y:this.scale.endPoint,base:this.scale.endPoint});var t=e.extend({height:this.chart.height,width:this.chart.width});this.scale.update(t)},draw:function(t){var i=t||1;this.clear();this.chart.ctx;this.scale.draw(i),e.each(this.datasets,function(t,s){e.each(t.bars,function(t,e){t.base=this.scale.endPoint,t.transition({x:this.scale.calculateBarX(this.datasets.length,s,e),y:this.scale.calculateY(t.value),width:this.scale.calculateBarWidth(this.datasets.length)},i).draw()},this)},this)}})}.call(this),function(){"use 
strict";var t=this,i=t.Chart,e=i.helpers,s={segmentShowStroke:!0,segmentStrokeColor:"#fff",segmentStrokeWidth:2,percentageInnerCutout:50,animationSteps:100,animationEasing:"easeOutBounce",animateRotate:!0,animateScale:!1,legendTemplate:'<ul class="<%=name.toLowerCase()%>-legend"><% for (var i=0; i<segments.length; i++){%><li><span style="background-color:<%=segments[i].fillColor%>"></span><%if(segments[i].label){%><%=segments[i].label%><%}%></li><%}%></ul>'};i.Type.extend({name:"Doughnut",defaults:s,initialize:function(t){this.segments=[],this.outerRadius=(e.min([this.chart.width,this.chart.height])-this.options.segmentStrokeWidth/2)/2,this.SegmentArc=i.Arc.extend({ctx:this.chart.ctx,x:this.chart.width/2,y:this.chart.height/2}),this.options.showTooltips&&e.bindEvents(this,this.options.tooltipEvents,function(t){var i="mouseout"!==t.type?this.getSegmentsAtEvent(t):[];
e.each(this.segments,function(t){t.restore(["fillColor"])}),e.each(i,function(t){t.fillColor=t.highlightColor}),this.showTooltip(i)}),this.calculateTotal(t),e.each(t,function(t,i){this.addData(t,i,!0)},this),this.render()},getSegmentsAtEvent:function(t){var i=[],s=e.getRelativePosition(t);return e.each(this.segments,function(t){t.inRange(s.x,s.y)&&i.push(t)},this),i},addData:function(t,i,e){var s=i||this.segments.length;this.segments.splice(s,0,new this.SegmentArc({value:t.value,outerRadius:this.options.animateScale?0:this.outerRadius,innerRadius:this.options.animateScale?0:this.outerRadius/100*this.options.percentageInnerCutout,fillColor:t.color,highlightColor:t.highlight||t.color,showStroke:this.options.segmentShowStroke,strokeWidth:this.options.segmentStrokeWidth,strokeColor:this.options.segmentStrokeColor,startAngle:1.5*Math.PI,circumference:this.options.animateRotate?0:this.calculateCircumference(t.value),label:t.label})),e||(this.reflow(),this.update())},calculateCircumference:function(t){return 2*Math.PI*(t/this.total)},calculateTotal:function(t){this.total=0,e.each(t,function(t){this.total+=t.value},this)},update:function(){this.calculateTotal(this.segments),e.each(this.activeElements,function(t){t.restore(["fillColor"])}),e.each(this.segments,function(t){t.save()}),this.render()},removeData:function(t){var i=e.isNumber(t)?t:this.segments.length-1;this.segments.splice(i,1),this.reflow(),this.update()},reflow:function(){e.extend(this.SegmentArc.prototype,{x:this.chart.width/2,y:this.chart.height/2}),this.outerRadius=(e.min([this.chart.width,this.chart.height])-this.options.segmentStrokeWidth/2)/2,e.each(this.segments,function(t){t.update({outerRadius:this.outerRadius,innerRadius:this.outerRadius/100*this.options.percentageInnerCutout})},this)},draw:function(t){var i=t?t:1;this.clear(),e.each(this.segments,function(t,e){t.transition({circumference:this.calculateCircumference(t.value),outerRadius:this.outerRadius,innerRadius:this.outerRadius/100*this.options.percentageInnerCutout},i),t.endAngle=t.startAngle+t.circumference,t.draw(),0===e&&(t.startAngle=1.5*Math.PI),e<this.segments.length-1&&(this.segments[e+1].startAngle=t.endAngle)},this)}}),i.types.Doughnut.extend({name:"Pie",defaults:e.merge(s,{percentageInnerCutout:0})})}.call(this),function(){"use strict";var t=this,i=t.Chart,e=i.helpers,s={scaleShowGridLines:!0,scaleGridLineColor:"rgba(0,0,0,.05)",scaleGridLineWidth:1,bezierCurve:!0,bezierCurveTension:.4,pointDot:!0,pointDotRadius:4,pointDotStrokeWidth:1,pointHitDetectionRadius:20,datasetStroke:!0,datasetStrokeWidth:2,datasetFill:!0,legendTemplate:'<ul class="<%=name.toLowerCase()%>-legend"><% for (var i=0; i<datasets.length; i++){%><li><span style="background-color:<%=datasets[i].strokeColor%>"></span><%if(datasets[i].label){%><%=datasets[i].label%><%}%></li><%}%></ul>'};i.Type.extend({name:"Line",defaults:s,initialize:function(t){this.PointClass=i.Point.extend({strokeWidth:this.options.pointDotStrokeWidth,radius:this.options.pointDotRadius,display:this.options.pointDot,hitDetectionRadius:this.options.pointHitDetectionRadius,ctx:this.chart.ctx,inRange:function(t){return Math.pow(t-this.x,2)<Math.pow(this.radius+this.hitDetectionRadius,2)}}),this.datasets=[],this.options.showTooltips&&e.bindEvents(this,this.options.tooltipEvents,function(t){var 
i="mouseout"!==t.type?this.getPointsAtEvent(t):[];this.eachPoints(function(t){t.restore(["fillColor","strokeColor"])}),e.each(i,function(t){t.fillColor=t.highlightFill,t.strokeColor=t.highlightStroke}),this.showTooltip(i)}),e.each(t.datasets,function(i){var s={label:i.label||null,fillColor:i.fillColor,strokeColor:i.strokeColor,pointColor:i.pointColor,pointStrokeColor:i.pointStrokeColor,points:[]};this.datasets.push(s),e.each(i.data,function(n,o){e.isNumber(n)&&s.points.push(new this.PointClass({value:n,label:t.labels[o],datasetLabel:i.label,strokeColor:i.pointStrokeColor,fillColor:i.pointColor,highlightFill:i.pointHighlightFill||i.pointColor,highlightStroke:i.pointHighlightStroke||i.pointStrokeColor}))},this),this.buildScale(t.labels),this.eachPoints(function(t,i){e.extend(t,{x:this.scale.calculateX(i),y:this.scale.endPoint}),t.save()},this)},this),this.render()},update:function(){this.scale.update(),e.each(this.activeElements,function(t){t.restore(["fillColor","strokeColor"])}),this.eachPoints(function(t){t.save()}),this.render()},eachPoints:function(t){e.each(this.datasets,function(i){e.each(i.points,t,this)},this)},getPointsAtEvent:function(t){var i=[],s=e.getRelativePosition(t);return e.each(this.datasets,function(t){e.each(t.points,function(t){t.inRange(s.x,s.y)&&i.push(t)})},this),i},buildScale:function(t){var s=this,n=function(){var t=[];return s.eachPoints(function(i){t.push(i.value)}),t},o={templateString:this.options.scaleLabel,height:this.chart.height,width:this.chart.width,ctx:this.chart.ctx,textColor:this.options.scaleFontColor,fontSize:this.options.scaleFontSize,fontStyle:this.options.scaleFontStyle,fontFamily:this.options.scaleFontFamily,valuesCount:t.length,beginAtZero:this.options.scaleBeginAtZero,integersOnly:this.options.scaleIntegersOnly,calculateYRange:function(t){var i=e.calculateScaleRange(n(),t,this.fontSize,this.beginAtZero,this.integersOnly);e.extend(this,i)},xLabels:t,font:e.fontString(this.options.scaleFontSize,this.options.scaleFontStyle,this.options.scaleFontFamily),lineWidth:this.options.scaleLineWidth,lineColor:this.options.scaleLineColor,gridLineWidth:this.options.scaleShowGridLines?this.options.scaleGridLineWidth:0,gridLineColor:this.options.scaleShowGridLines?this.options.scaleGridLineColor:"rgba(0,0,0,0)",padding:this.options.showScale?0:this.options.pointDotRadius+this.options.pointDotStrokeWidth,showLabels:this.options.scaleShowLabels,display:this.options.showScale};this.options.scaleOverride&&e.extend(o,{calculateYRange:e.noop,steps:this.options.scaleSteps,stepValue:this.options.scaleStepWidth,min:this.options.scaleStartValue,max:this.options.scaleStartValue+this.options.scaleSteps*this.options.scaleStepWidth}),this.scale=new i.Scale(o)},addData:function(t,i){e.each(t,function(t,s){e.isNumber(t)&&this.datasets[s].points.push(new this.PointClass({value:t,label:i,x:this.scale.calculateX(this.scale.valuesCount+1),y:this.scale.endPoint,strokeColor:this.datasets[s].pointStrokeColor,fillColor:this.datasets[s].pointColor}))},this),this.scale.addXLabel(i),this.update()},removeData:function(){this.scale.removeXLabel(),e.each(this.datasets,function(t){t.points.shift()},this),this.update()},reflow:function(){var t=e.extend({height:this.chart.height,width:this.chart.width});this.scale.update(t)},draw:function(t){var i=t||1;this.clear();var 
s=this.chart.ctx;this.scale.draw(i),e.each(this.datasets,function(t){e.each(t.points,function(t,e){t.transition({y:this.scale.calculateY(t.value),x:this.scale.calculateX(e)},i)},this),this.options.bezierCurve&&e.each(t.points,function(i,s){i.controlPoints=0===s?e.splineCurve(i,i,t.points[s+1],0):s>=t.points.length-1?e.splineCurve(t.points[s-1],i,i,0):e.splineCurve(t.points[s-1],i,t.points[s+1],this.options.bezierCurveTension)},this),s.lineWidth=this.options.datasetStrokeWidth,s.strokeStyle=t.strokeColor,s.beginPath(),e.each(t.points,function(i,e){e>0?this.options.bezierCurve?s.bezierCurveTo(t.points[e-1].controlPoints.outer.x,t.points[e-1].controlPoints.outer.y,i.controlPoints.inner.x,i.controlPoints.inner.y,i.x,i.y):s.lineTo(i.x,i.y):s.moveTo(i.x,i.y)},this),s.stroke(),this.options.datasetFill&&(s.lineTo(t.points[t.points.length-1].x,this.scale.endPoint),s.lineTo(this.scale.calculateX(0),this.scale.endPoint),s.fillStyle=t.fillColor,s.closePath(),s.fill()),e.each(t.points,function(t){t.draw()})},this)}})}.call(this),function(){"use strict";var t=this,i=t.Chart,e=i.helpers,s={scaleShowLabelBackdrop:!0,scaleBackdropColor:"rgba(255,255,255,0.75)",scaleBeginAtZero:!0,scaleBackdropPaddingY:2,scaleBackdropPaddingX:2,scaleShowLine:!0,segmentShowStroke:!0,segmentStrokeColor:"#fff",segmentStrokeWidth:2,animationSteps:100,animationEasing:"easeOutBounce",animateRotate:!0,animateScale:!1,legendTemplate:'<ul class="<%=name.toLowerCase()%>-legend"><% for (var i=0; i<segments.length; i++){%><li><span style="background-color:<%=segments[i].fillColor%>"></span><%if(segments[i].label){%><%=segments[i].label%><%}%></li><%}%></ul>'};i.Type.extend({name:"PolarArea",defaults:s,initialize:function(t){this.segments=[],this.SegmentArc=i.Arc.extend({showStroke:this.options.segmentShowStroke,strokeWidth:this.options.segmentStrokeWidth,strokeColor:this.options.segmentStrokeColor,ctx:this.chart.ctx,innerRadius:0,x:this.chart.width/2,y:this.chart.height/2}),this.scale=new i.RadialScale({display:this.options.showScale,fontStyle:this.options.scaleFontStyle,fontSize:this.options.scaleFontSize,fontFamily:this.options.scaleFontFamily,fontColor:this.options.scaleFontColor,showLabels:this.options.scaleShowLabels,showLabelBackdrop:this.options.scaleShowLabelBackdrop,backdropColor:this.options.scaleBackdropColor,backdropPaddingY:this.options.scaleBackdropPaddingY,backdropPaddingX:this.options.scaleBackdropPaddingX,lineWidth:this.options.scaleShowLine?this.options.scaleLineWidth:0,lineColor:this.options.scaleLineColor,lineArc:!0,width:this.chart.width,height:this.chart.height,xCenter:this.chart.width/2,yCenter:this.chart.height/2,ctx:this.chart.ctx,templateString:this.options.scaleLabel,valuesCount:t.length}),this.updateScaleRange(t),this.scale.update(),e.each(t,function(t,i){this.addData(t,i,!0)},this),this.options.showTooltips&&e.bindEvents(this,this.options.tooltipEvents,function(t){var i="mouseout"!==t.type?this.getSegmentsAtEvent(t):[];e.each(this.segments,function(t){t.restore(["fillColor"])}),e.each(i,function(t){t.fillColor=t.highlightColor}),this.showTooltip(i)}),this.render()},getSegmentsAtEvent:function(t){var i=[],s=e.getRelativePosition(t);return e.each(this.segments,function(t){t.inRange(s.x,s.y)&&i.push(t)},this),i},addData:function(t,i,e){var s=i||this.segments.length;this.segments.splice(s,0,new 
this.SegmentArc({fillColor:t.color,highlightColor:t.highlight||t.color,label:t.label,value:t.value,outerRadius:this.options.animateScale?0:this.scale.calculateCenterOffset(t.value),circumference:this.options.animateRotate?0:this.scale.getCircumference(),startAngle:1.5*Math.PI})),e||(this.reflow(),this.update())},removeData:function(t){var i=e.isNumber(t)?t:this.segments.length-1;this.segments.splice(i,1),this.reflow(),this.update()},calculateTotal:function(t){this.total=0,e.each(t,function(t){this.total+=t.value},this),this.scale.valuesCount=this.segments.length},updateScaleRange:function(t){var i=[];e.each(t,function(t){i.push(t.value)});var s=this.options.scaleOverride?{steps:this.options.scaleSteps,stepValue:this.options.scaleStepWidth,min:this.options.scaleStartValue,max:this.options.scaleStartValue+this.options.scaleSteps*this.options.scaleStepWidth}:e.calculateScaleRange(i,e.min([this.chart.width,this.chart.height])/2,this.options.scaleFontSize,this.options.scaleBeginAtZero,this.options.scaleIntegersOnly);e.extend(this.scale,s,{size:e.min([this.chart.width,this.chart.height]),xCenter:this.chart.width/2,yCenter:this.chart.height/2})},update:function(){this.calculateTotal(this.segments),e.each(this.segments,function(t){t.save()}),this.render()},reflow:function(){e.extend(this.SegmentArc.prototype,{x:this.chart.width/2,y:this.chart.height/2}),this.updateScaleRange(this.segments),this.scale.update(),e.extend(this.scale,{xCenter:this.chart.width/2,yCenter:this.chart.height/2}),e.each(this.segments,function(t){t.update({outerRadius:this.scale.calculateCenterOffset(t.value)})},this)},draw:function(t){var i=t||1;this.clear(),e.each(this.segments,function(t,e){t.transition({circumference:this.scale.getCircumference(),outerRadius:this.scale.calculateCenterOffset(t.value)},i),t.endAngle=t.startAngle+t.circumference,0===e&&(t.startAngle=1.5*Math.PI),e<this.segments.length-1&&(this.segments[e+1].startAngle=t.endAngle),t.draw()},this),this.scale.draw()}})}.call(this),function(){"use strict";var t=this,i=t.Chart,e=i.helpers;i.Type.extend({name:"Radar",defaults:{scaleShowLine:!0,angleShowLineOut:!0,scaleShowLabels:!1,scaleBeginAtZero:!0,angleLineColor:"rgba(0,0,0,.1)",angleLineWidth:1,pointLabelFontFamily:"'Arial'",pointLabelFontStyle:"normal",pointLabelFontSize:10,pointLabelFontColor:"#666",pointDot:!0,pointDotRadius:3,pointDotStrokeWidth:1,pointHitDetectionRadius:20,datasetStroke:!0,datasetStrokeWidth:2,datasetFill:!0,legendTemplate:'<ul class="<%=name.toLowerCase()%>-legend"><% for (var i=0; i<datasets.length; i++){%><li><span style="background-color:<%=datasets[i].strokeColor%>"></span><%if(datasets[i].label){%><%=datasets[i].label%><%}%></li><%}%></ul>'},initialize:function(t){this.PointClass=i.Point.extend({strokeWidth:this.options.pointDotStrokeWidth,radius:this.options.pointDotRadius,display:this.options.pointDot,hitDetectionRadius:this.options.pointHitDetectionRadius,ctx:this.chart.ctx}),this.datasets=[],this.buildScale(t),this.options.showTooltips&&e.bindEvents(this,this.options.tooltipEvents,function(t){var i="mouseout"!==t.type?this.getPointsAtEvent(t):[];this.eachPoints(function(t){t.restore(["fillColor","strokeColor"])}),e.each(i,function(t){t.fillColor=t.highlightFill,t.strokeColor=t.highlightStroke}),this.showTooltip(i)}),e.each(t.datasets,function(i){var s={label:i.label||null,fillColor:i.fillColor,strokeColor:i.strokeColor,pointColor:i.pointColor,pointStrokeColor:i.pointStrokeColor,points:[]};this.datasets.push(s),e.each(i.data,function(n,o){if(e.isNumber(n)){var 
a;this.scale.animation||(a=this.scale.getPointPosition(o,this.scale.calculateCenterOffset(n))),s.points.push(new this.PointClass({value:n,label:t.labels[o],datasetLabel:i.label,x:this.options.animation?this.scale.xCenter:a.x,y:this.options.animation?this.scale.yCenter:a.y,strokeColor:i.pointStrokeColor,fillColor:i.pointColor,highlightFill:i.pointHighlightFill||i.pointColor,highlightStroke:i.pointHighlightStroke||i.pointStrokeColor}))}},this)},this),this.render()},eachPoints:function(t){e.each(this.datasets,function(i){e.each(i.points,t,this)},this)},getPointsAtEvent:function(t){var i=e.getRelativePosition(t),s=e.getAngleFromPoint({x:this.scale.xCenter,y:this.scale.yCenter},i),n=2*Math.PI/this.scale.valuesCount,o=Math.round((s.angle-1.5*Math.PI)/n),a=[];return(o>=this.scale.valuesCount||0>o)&&(o=0),s.distance<=this.scale.drawingArea&&e.each(this.datasets,function(t){a.push(t.points[o])}),a},buildScale:function(t){this.scale=new i.RadialScale({display:this.options.showScale,fontStyle:this.options.scaleFontStyle,fontSize:this.options.scaleFontSize,fontFamily:this.options.scaleFontFamily,fontColor:this.options.scaleFontColor,showLabels:this.options.scaleShowLabels,showLabelBackdrop:this.options.scaleShowLabelBackdrop,backdropColor:this.options.scaleBackdropColor,backdropPaddingY:this.options.scaleBackdropPaddingY,backdropPaddingX:this.options.scaleBackdropPaddingX,lineWidth:this.options.scaleShowLine?this.options.scaleLineWidth:0,lineColor:this.options.scaleLineColor,angleLineColor:this.options.angleLineColor,angleLineWidth:this.options.angleShowLineOut?this.options.angleLineWidth:0,pointLabelFontColor:this.options.pointLabelFontColor,pointLabelFontSize:this.options.pointLabelFontSize,pointLabelFontFamily:this.options.pointLabelFontFamily,pointLabelFontStyle:this.options.pointLabelFontStyle,height:this.chart.height,width:this.chart.width,xCenter:this.chart.width/2,yCenter:this.chart.height/2,ctx:this.chart.ctx,templateString:this.options.scaleLabel,labels:t.labels,valuesCount:t.datasets[0].data.length}),this.scale.setScaleSize(),this.updateScaleRange(t.datasets),this.scale.buildYLabels()},updateScaleRange:function(t){var i=function(){var i=[];return e.each(t,function(t){t.data?i=i.concat(t.data):e.each(t.points,function(t){i.push(t.value)})}),i}(),s=this.options.scaleOverride?{steps:this.options.scaleSteps,stepValue:this.options.scaleStepWidth,min:this.options.scaleStartValue,max:this.options.scaleStartValue+this.options.scaleSteps*this.options.scaleStepWidth}:e.calculateScaleRange(i,e.min([this.chart.width,this.chart.height])/2,this.options.scaleFontSize,this.options.scaleBeginAtZero,this.options.scaleIntegersOnly);e.extend(this.scale,s)},addData:function(t,i){this.scale.valuesCount++,e.each(t,function(t,s){if(e.isNumber(t)){var n=this.scale.getPointPosition(this.scale.valuesCount,this.scale.calculateCenterOffset(t));this.datasets[s].points.push(new 
this.PointClass({value:t,label:i,x:n.x,y:n.y,strokeColor:this.datasets[s].pointStrokeColor,fillColor:this.datasets[s].pointColor}))}},this),this.scale.labels.push(i),this.reflow(),this.update()},removeData:function(){this.scale.valuesCount--,this.scale.labels.shift(),e.each(this.datasets,function(t){t.points.shift()},this),this.reflow(),this.update()},update:function(){this.eachPoints(function(t){t.save()}),this.reflow(),this.render()},reflow:function(){e.extend(this.scale,{width:this.chart.width,height:this.chart.height,size:e.min([this.chart.width,this.chart.height]),xCenter:this.chart.width/2,yCenter:this.chart.height/2}),this.updateScaleRange(this.datasets),this.scale.setScaleSize(),this.scale.buildYLabels()},draw:function(t){var i=t||1,s=this.chart.ctx;this.clear(),this.scale.draw(),e.each(this.datasets,function(t){e.each(t.points,function(t,e){t.transition(this.scale.getPointPosition(e,this.scale.calculateCenterOffset(t.value)),i)},this),s.lineWidth=this.options.datasetStrokeWidth,s.strokeStyle=t.strokeColor,s.beginPath(),e.each(t.points,function(t,i){0===i?s.moveTo(t.x,t.y):s.lineTo(t.x,t.y)},this),s.closePath(),s.stroke(),s.fillStyle=t.fillColor,s.fill(),e.each(t.points,function(t){t.draw()})},this)}})}.call(this);
|
PypiClean
|
/pulumi_azure_native-2.5.1a1693590910.tar.gz/pulumi_azure_native-2.5.1a1693590910/pulumi_azure_native/machinelearningservices/v20230401/list_online_endpoint_keys.py
|
import copy
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
__all__ = [
'ListOnlineEndpointKeysResult',
'AwaitableListOnlineEndpointKeysResult',
'list_online_endpoint_keys',
'list_online_endpoint_keys_output',
]
@pulumi.output_type
class ListOnlineEndpointKeysResult:
"""
Keys for endpoint authentication.
"""
def __init__(__self__, primary_key=None, secondary_key=None):
if primary_key and not isinstance(primary_key, str):
raise TypeError("Expected argument 'primary_key' to be a str")
pulumi.set(__self__, "primary_key", primary_key)
if secondary_key and not isinstance(secondary_key, str):
raise TypeError("Expected argument 'secondary_key' to be a str")
pulumi.set(__self__, "secondary_key", secondary_key)
@property
@pulumi.getter(name="primaryKey")
def primary_key(self) -> Optional[str]:
"""
The primary key.
"""
return pulumi.get(self, "primary_key")
@property
@pulumi.getter(name="secondaryKey")
def secondary_key(self) -> Optional[str]:
"""
The secondary key.
"""
return pulumi.get(self, "secondary_key")
class AwaitableListOnlineEndpointKeysResult(ListOnlineEndpointKeysResult):
# pylint: disable=using-constant-test
def __await__(self):
if False:
yield self
return ListOnlineEndpointKeysResult(
primary_key=self.primary_key,
secondary_key=self.secondary_key)
def list_online_endpoint_keys(endpoint_name: Optional[str] = None,
resource_group_name: Optional[str] = None,
workspace_name: Optional[str] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> AwaitableListOnlineEndpointKeysResult:
"""
Keys for endpoint authentication.
:param str endpoint_name: Online Endpoint name.
:param str resource_group_name: The name of the resource group. The name is case insensitive.
:param str workspace_name: Name of Azure Machine Learning workspace.
"""
__args__ = dict()
__args__['endpointName'] = endpoint_name
__args__['resourceGroupName'] = resource_group_name
__args__['workspaceName'] = workspace_name
opts = pulumi.InvokeOptions.merge(_utilities.get_invoke_opts_defaults(), opts)
__ret__ = pulumi.runtime.invoke('azure-native:machinelearningservices/v20230401:listOnlineEndpointKeys', __args__, opts=opts, typ=ListOnlineEndpointKeysResult).value
return AwaitableListOnlineEndpointKeysResult(
primary_key=pulumi.get(__ret__, 'primary_key'),
secondary_key=pulumi.get(__ret__, 'secondary_key'))
@_utilities.lift_output_func(list_online_endpoint_keys)
def list_online_endpoint_keys_output(endpoint_name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
workspace_name: Optional[pulumi.Input[str]] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> pulumi.Output[ListOnlineEndpointKeysResult]:
"""
Keys for endpoint authentication.
:param str endpoint_name: Online Endpoint name.
:param str resource_group_name: The name of the resource group. The name is case insensitive.
:param str workspace_name: Name of Azure Machine Learning workspace.
"""
...
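# --- Usage sketch (illustrative, not part of the generated SDK) ---
# A minimal example of calling the helper above; the endpoint, resource
# group and workspace names are placeholders, not real Azure resources:
#
#     keys = list_online_endpoint_keys(
#         endpoint_name="my-endpoint",
#         resource_group_name="my-resource-group",
#         workspace_name="my-workspace",
#     )
#     pulumi.export("primaryKey", keys.primary_key)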
|
PypiClean
|
/love_course_2016_2019-2023.3.1.0-py3-none-any.whl/LoveCourse20162019/docs/chris/2020『社交直覺養成』线上视频课:第02课:两性真相“关系升级”的基础筹码是什么?.md
|
# 2020 "Social Intuition Development" Online Video Course: Lesson 02: The Truth About Attraction — What Is the Basic Leverage for "Relationship Escalation"?

March 2nd. Hello, this is Liang Shu. In the last lesson we covered the three core elements of a relationship and why they matter. In this lesson, let's talk about how the different combinations of those three elements steer a relationship in different directions.

First, the first type: logical thinking is present, the emotional-attraction window is present, but there is no relationship escalation. This combination pushes your relationship into the friend-zone model. Say you are college classmates, you get along well, you have been through plenty together, and when she is with you she is mostly happy, cheerful, and lively. But you never dare to seize the moment and escalate the relationship — to break through that last thin barrier and turn friendship into love. As a result, the emotional-attraction window gets wasted in large quantities, and in the end the relationship hardens in the friend zone, where a qualitative change becomes very unlikely. Either you bury it all inside and suffer alone, or you finally work up the courage to confess, most likely get rejected, and even the friendship turns a little awkward afterwards. That is the first type.

The second type is the familiar "nice-guy card". In this pattern logical thinking may or may not be present, there is definitely no emotional-attraction window, and yet there is relationship escalation. Say you like a girl from the class next door: you have barely spoken, never spent time alone, and hardly know each other. You get her WeChat or QQ through a friend, then pour out a confession online, and the girl, baffled, will most likely reply something like "I don't want to date right now, I need to focus on my studies," and so on. Or you are classmates who see each other every day but rarely one on one, so your understanding of each other stays very superficial; you nurse a secret crush, finally find a moment to confess, and get handed a nice-guy card: "You're great, I'm sure you'll meet someone better than me, I think we should just stay friends," and so on. That is the second type.

The third type: no logical thinking, no emotional attraction, only escalation. If it happens offline as physical escalation, this is what we bluntly call harassment — in Japan it is called chikan — in other words, taking liberties. If it happens online, it is verbal escalation: you add a girl on Momo, Tantan, or WeChat, and within a couple of exchanges you start with the dirty jokes and suggestive stickers, teasing her, even saying "why don't you come over to my place for a late-night snack." You get labeled a creep — listen carefully, a creep — or a scumbag, and then you are blocked on the spot. That is the third type.

The fourth type: not much logical thinking, but there is emotional attraction and there is escalation. This is usually the one-night-stand or short-term-relationship model. You meet a girl at a bar or a club, you drink and dance and have a great time, the mood and the atmosphere are all there, and everything happens naturally — and then, a day or a few days later, the two of you snap from intimate back to strangers, like a meteor across the night sky, gone in a flash. Of course, some couples do grow out of a one-night stand, but in those cases the logical-thinking part was thoroughly filled in afterwards — or the man laid the groundwork well before any morning-after regret could surface. Now you understand why those fleeting flings happen so easily after clubs, bars, KTV, festivals, and parties: in those settings everyone's emotions open up easily, and as long as the man seizes the moment and escalates appropriately, things simply take their course.

Naturally, a high-quality long-term relationship model requires all three at once: logical thinking, emotional attraction, and relationship escalation. Look at couples on the verge of breaking up, or marriages of very low quality — without exception, only the logical thinking is left. There is no emotional attraction, and so naturally no escalation either. In effect: "I accept that you are my girlfriend, but my mood is bad, I even feel weary of you and repelled by you, my body resists and rejects you, and I cannot accept escalating with you."

By now you will have realized that, among the three elements, the emotional-attraction window looks like the most important one, because it is the leverage and the soil for relationship escalation. With it, escalation becomes possible; without it, escalation is absolutely impossible; and if a long-term relationship loses it, then even a once-intimate couple may be headed for a breakup. You can also think of the emotional-attraction window as the fuel of the relationship car, logical thinking as the engine oil, and relationship escalation as the accelerator. Once more: the emotional-attraction window is the basic soil of escalation. Happy or delighted, positive or relaxed, excited or curious, thrilled, flirtatious, expectant — these are all common good emotional windows.

What are the common forms of relationship escalation? They split into online verbal flirting and offline physical escalation. Verbal flirting generally means chatting online about three kinds of topics: private objects, private body parts, and private behaviors. I will cover the details later, with examples, in the "Science of Expert Chatting" course. Offline escalation generally includes incidental touches, shoulder rubs, a hand at the waist, hugging, holding hands, kissing, and then intimacy, and so on. Good escalation is gradual, climbing slowly. You can hardly kiss a girl out of nowhere with no prior physical contact — it is abrupt, jarring, and strange, and most girls would be frightened.

Among these behaviors, which ones mark a qualitative change in the relationship? Exactly: holding hands, kissing, and intimacy. We cannot see the affection a girl holds for us in her heart, but we can sense it through how physically close she lets us get. Here is a little benchmark to share: holding hands counts as the relationship reaching 60 points, kissing as 70 points, and intimacy as 80 points. In terms of difficulty, which is easy and which is hard? Intimacy is the hardest, of course. This thing called chastity was once held by women to be more precious than life itself; society has opened up a great deal since, but completing this step still takes many internal and external conditions. Next in difficulty comes holding hands. That may sound odd — shouldn't kissing come next? I am telling you, no. Holding hands is harder than kissing, and it means a great deal more, because holding hands is the watershed where an ordinary relationship — classmate, colleague, friend — turns into a romantic one. And I do not mean a casual touch during a party game. I mean that at any time, you can take her hand whenever you want, without worrying about being shaken off or refused. If you can reach that step, then as long as you make no serious mistakes, with time you will certainly get the kiss. So kissing is actually less difficult than holding hands.

Finally, let's look at a few pictures. Suppose these girls are your dates — can you read their emotional window accurately and then do the right thing? The first one: what is this girl's emotion, playful pouting or anger? Here Liang Shu tells you: this is clearly playful pouting. The tiny bit of anger in it comes only from you not putting your attention on her — say, taking phone calls or answering messages during the date. So this is an approachable emotional window; it is a good one. Next picture: is she angry? She is not actually angry, but she is thoroughly bored. If you are sitting in a coffee shop, you should be thinking about moving to another venue; otherwise it will not be long before she says, "I have to get going." The next one: this girl is plainly angry, and the anger was certainly caused by you — whether you said the wrong thing or did the wrong thing, her anger meter is already near eighty percent. At this point you should drop the playful attitude, become serious at once, and apologize for your behavior or your words. Once her mood eases a little, find a reason to call her a car and have the driver take her home, because after an emotion like this it is very hard to carry on, and the two of you are better off parting early. In ordinary circumstances we rarely run into this third expression at all. But if you regularly turn a girl's expression into the fourth picture, then you might want to consider signing up for our next course.

All right, that is it for today's lesson. Liang Shu has homework for you: analyze your own past romantic experiences — open pursuits or secret crushes, the ones that worked out and the ones that did not, they all count — run them through the three elements of a relationship, sort them out, write it up, and send it to me.
|
PypiClean
|
/secretflow_lite-1.0.0b0-cp38-cp38-macosx_10_16_x86_64.whl/secretflow/ml/boost/sgb_v/factory/components/sampler/sampler.py
|
from dataclasses import dataclass
from typing import List, Tuple, Union
import numpy as np
from secretflow.data import FedNdarray, PartitionWay
from secretflow.device import PYUObject
from secretflow.ml.boost.sgb_v.factory.params import default_params
from secretflow.ml.boost.sgb_v.factory.sgb_actor import SGBActor
from ..component import Component, Devices, print_params
from .sample_actor import SampleActor
@dataclass
class SamplerParams:
"""
'row_sample_rate': float. Row sub sample ratio of the training instances.
default: 1
range: (0, 1]
'col_sample_rate': float. Col sub sample ratio of columns when constructing each tree.
default: 1
range: (0, 1]
'seed': int. Pseudorandom number generator seed.
default: 1212
'label_holder_feature_only': bool. affects col sampling.
default: False
if turned on, all non-label holder's col sample rate will be 0.
'enable_goss': bool. whether enable GOSS, see lightGBM's paper for more understanding in GOSS.
default: False
'top_rate': float. GOSS-specific parameter. The fraction of large gradients to sample.
default: 0.3
range: (0, 1), but top_rate + bottom_rate < 1
'bottom_rate': float. GOSS-specific parameter. The fraction of small gradients to sample.
default: 0.5
range: (0, 1), but top_rate + bottom_rate < 1
"""
row_sample_rate: float = default_params.row_sample_rate
col_sample_rate: float = default_params.col_sample_rate
seed: int = default_params.seed
label_holder_feature_only: bool = default_params.label_holder_feature_only
enable_goss: bool = default_params.enable_goss
top_rate: float = default_params.top_rate
bottom_rate: float = default_params.bottom_rate
class Sampler(Component):
def __init__(self):
self.params = SamplerParams()
def show_params(self):
print_params(self.params)
def set_params(self, params: dict):
subsample = float(params.get('row_sample_rate', 1))
assert (
subsample > 0 and subsample <= 1
), f"row_sample_rate should in (0, 1], got {subsample}"
colsample = float(params.get('col_sample_rate', 1))
assert (
colsample > 0 and colsample <= 1
), f"col_sample_rate should in (0, 1], got {colsample}"
top_rate = float(params.get('top_rate', 0.3))
assert (
top_rate > 0 and top_rate < 1
), f"top_rate should in (0, 1), got {top_rate}"
bottom_rate = float(params.get('bottom_rate', 0.5))
assert (
bottom_rate > 0 and bottom_rate < 1
), f"bottom_rate should in (0, 1), got {bottom_rate}"
assert (
bottom_rate + top_rate < 1
), f"the sum of top_rate and bottom_rate should be less than 1, got {bottom_rate + top_rate}"
self.params.row_sample_rate = subsample
self.params.col_sample_rate = colsample
self.params.seed = int(params.get('seed', 1212))
self.params.label_holder_feature_only = bool(
params.get('label_holder_feature_only', False)
)
self.params.enable_goss = bool(params.get('enable_goss', False))
self.params.top_rate = top_rate
self.params.bottom_rate = bottom_rate
def get_params(self, params: dict):
params['seed'] = self.params.seed
params['row_sample_rate'] = self.params.row_sample_rate
params['col_sample_rate'] = self.params.col_sample_rate
params['label_holder_feature_only'] = self.params.label_holder_feature_only
params['enable_goss'] = self.params.enable_goss
params['top_rate'] = self.params.top_rate
params['bottom_rate'] = self.params.bottom_rate
def set_devices(self, devices: Devices):
self.label_holder = devices.label_holder
self.workers = devices.workers
def set_actors(self, actors: SGBActor):
self.sample_actors = {actor.device: actor for actor in actors}
for actor in self.sample_actors.values():
actor.register_class('SampleActor', SampleActor, self.params.seed)
def generate_col_choices(
self, feature_buckets: List[PYUObject]
) -> Tuple[List[PYUObject], List[PYUObject]]:
"""Generate column sample choices.
Args:
feature_buckets (List[PYUObject]): Behind PYUObject is List[int], bucket num for each feature.
Returns:
Tuple[List[PYUObject], List[PYUObject]]: first list is column choices, second is total number of buckets after sampling
"""
colsample = self.params.col_sample_rate
if self.params.label_holder_feature_only:
col_choices, total_buckets = zip(
*[
self.sample_actors[fb.device].invoke_class_method_two_ret(
'SampleActor',
'generate_one_partition_col_choices',
colsample,
fb,
)
if fb.device == self.label_holder
else self.sample_actors[fb.device].invoke_class_method_two_ret(
'SampleActor', 'generate_one_partition_col_choices', 0, fb
)
for fb in feature_buckets
]
)
else:
col_choices, total_buckets = zip(
*[
self.sample_actors[fb.device].invoke_class_method_two_ret(
'SampleActor',
'generate_one_partition_col_choices',
colsample,
fb,
)
for fb in feature_buckets
]
)
return col_choices, total_buckets
def generate_row_choices(
self, row_num: int, g: PYUObject
) -> Tuple[Union[None, np.ndarray], Union[None, np.ndarray]]:
"""Sample rows,
either in a goss style or normal style based on config
Args:
row_num (int): row number
g (PYUObject): gradient
Returns:
Tuple[Union[None, np.ndarray], Union[None, np.ndarray]]:
1. row choices
2. weight (for info gain), None if not GOSS-enabled
"""
if self.params.enable_goss:
top_rate = self.params.top_rate
bottom_rate = self.params.bottom_rate
return self.sample_actors[g.device].invoke_class_method_two_ret(
'SampleActor', 'goss', row_num, g, top_rate, bottom_rate
)
else:
sample_rate = self.params.row_sample_rate
choices = self.sample_actors[g.device].invoke_class_method(
'SampleActor', 'generate_row_choices', row_num, sample_rate
)
return choices, None
def _should_row_subsampling(self) -> bool:
return self.params.row_sample_rate < 1 or self.params.enable_goss
def _apply_vector_sampling(
self,
x: PYUObject,
indices: Union[PYUObject, np.ndarray],
):
"""Sample x for a single partition. Assuming we have a column vector.
Assume the indices was generated from row sampling by sampler"""
if self.params.row_sample_rate < 1:
return x.device(lambda x, indices: x.reshape(-1, 1)[indices, :])(x, indices)
else:
return x.device(lambda x: x.reshape(-1, 1))(x)
def apply_vector_sampling_weighted(
self,
x: PYUObject,
indices: Union[PYUObject, np.ndarray],
weight: Union[PYUObject, None] = None,
):
if self.params.enable_goss:
return x.device(
lambda x, indices, weight: (
np.multiply(x.reshape(-1)[indices], weight.reshape(-1))
).reshape(-1, 1)
)(
x,
indices,
weight,
)
else:
return self._apply_vector_sampling(x, indices)
def apply_v_fed_sampling(
self,
X: FedNdarray,
row_choices: Union[None, np.ndarray, PYUObject] = None,
col_choices: List[Union[None, np.ndarray, PYUObject]] = [],
) -> FedNdarray:
"""Sample X based on row choices and col choices.
Assume the choices were generated by sampler.
Args:
X (FedNdarray): Array to sample from
row_choices (Union[None, np.ndarray, PYUObject]): row sampling choices. devices are assumed to be ordered as X.
col_choices (List[Union[None, np.ndarray,PYUObject]): col sampling choices. devices are assumed to be ordered as X.
Returns:
X_sub (FedNdarray): subsampled X
shape (Tuple[int, int]): shape of X_sub
"""
X_sub = X
# sample cols and rows of bucket_map
if self.params.col_sample_rate < 1 and self._should_row_subsampling():
# sub choices is stored in context owned by label_holder and shared to all workers.
X_sub = FedNdarray(
partitions={
pyu: pyu(lambda x, y, z: x[y, :][:, z])(
partition,
row_choices.to(pyu)
if isinstance(row_choices, PYUObject)
else row_choices,
col_choices[i],
)
for i, (pyu, partition) in enumerate(X.partitions.items())
},
partition_way=PartitionWay.VERTICAL,
)
# only sample cols
elif self.params.col_sample_rate < 1:
X_sub = FedNdarray(
partitions={
pyu: pyu(lambda x, y: x[:, y])(partition, col_choices[i])
for i, (pyu, partition) in enumerate(X.partitions.items())
},
partition_way=PartitionWay.VERTICAL,
)
# only sample rows
elif self._should_row_subsampling():
X_sub = FedNdarray(
partitions={
pyu: pyu(lambda x, y: x[y, :])(
partition,
row_choices.to(pyu)
if isinstance(row_choices, PYUObject)
else row_choices,
)
for pyu, partition in X.partitions.items()
},
partition_way=PartitionWay.VERTICAL,
)
return X_sub
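# --- Usage sketch (illustrative, not part of the library) ---
# A minimal example of configuring a Sampler through its dict-based
# interface; the surrounding SGB training pipeline (devices, actors,
# data) is assumed to be set up elsewhere:
#
#     sampler = Sampler()
#     sampler.set_params({
#         'row_sample_rate': 0.8,
#         'col_sample_rate': 0.9,
#         'seed': 42,
#     })
#     sampler.show_params()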
|
PypiClean
|
/mis_modulos-0.1.tar.gz/mis_modulos-0.1/wheel/cli/__init__.py
|
from __future__ import annotations
import argparse
import os
import sys
class WheelError(Exception):
pass
def unpack_f(args):
from .unpack import unpack
unpack(args.wheelfile, args.dest)
def pack_f(args):
from .pack import pack
pack(args.directory, args.dest_dir, args.build_number)
def convert_f(args):
from .convert import convert
convert(args.files, args.dest_dir, args.verbose)
def version_f(args):
from .. import __version__
print("wheel %s" % __version__)
def parser():
p = argparse.ArgumentParser()
s = p.add_subparsers(help="commands")
unpack_parser = s.add_parser("unpack", help="Unpack wheel")
unpack_parser.add_argument(
"--dest", "-d", help="Destination directory", default="."
)
unpack_parser.add_argument("wheelfile", help="Wheel file")
unpack_parser.set_defaults(func=unpack_f)
repack_parser = s.add_parser("pack", help="Repack wheel")
repack_parser.add_argument("directory", help="Root directory of the unpacked wheel")
repack_parser.add_argument(
"--dest-dir",
"-d",
default=os.path.curdir,
help="Directory to store the wheel (default %(default)s)",
)
repack_parser.add_argument(
"--build-number", help="Build tag to use in the wheel name"
)
repack_parser.set_defaults(func=pack_f)
convert_parser = s.add_parser("convert", help="Convert egg or wininst to wheel")
convert_parser.add_argument("files", nargs="*", help="Files to convert")
convert_parser.add_argument(
"--dest-dir",
"-d",
default=os.path.curdir,
help="Directory to store wheels (default %(default)s)",
)
convert_parser.add_argument("--verbose", "-v", action="store_true")
convert_parser.set_defaults(func=convert_f)
version_parser = s.add_parser("version", help="Print version and exit")
version_parser.set_defaults(func=version_f)
help_parser = s.add_parser("help", help="Show this help")
help_parser.set_defaults(func=lambda args: p.print_help())
return p
def main():
p = parser()
args = p.parse_args()
if not hasattr(args, "func"):
p.print_help()
else:
try:
args.func(args)
return 0
except WheelError as e:
print(e, file=sys.stderr)
return 1
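# --- Usage sketch (illustrative, not part of the upstream module) ---
# The entry point above can also be driven programmatically; for example,
# the following prints the installed wheel version and exits:
#
#     import sys
#     sys.argv = ["wheel", "version"]
#     raise SystemExit(main())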
|
PypiClean
|
/RotatingProxyBot-0.0.3.tar.gz/RotatingProxyBot-0.0.3/README.md
|
# RotatingProxyBot
A Bot that uses a Rotating Proxy to simulate many clients making a request to a single server
**Version 0.0.3**
### Author
**Daniel Gisolfi** - *All current work* - [dgisolfi](https://github.com/dgisolfi)
## Usage
```python
#!/usr/bin/python3
from RotatingProxyBot import ProxyBot
# Create new custom bot
bot = ProxyBot(
address='IP OR URL',
    method='POST',
desired_reqs=10,
reqs_per_int=2,
wait_time=60 # 1min
)
# Start submitting requests and rotating proxies
bot.enable()
```
## Building a list of Proxies
The bot will need a list of proxies to use for making requests, it can use either an API to retrieve proxies or a file to import them from.
### Proxies from an API
By default, the bot will retrieve a few thousand proxies from an [API](https://www.proxy-list.download/). To use a custom API, tell the bot what address to reach the API at by passing the following argument when creating a new instance: `proxy_api='http://api.com'`.
### Proxies from a File
To use a custom file of proxies rather than the default API, pass the following argument to the bot constructor: `proxy_file='filename.txt'`
For the bot to be able to import the list of proxies, the file should have the following structure:
```txt
0.0.0.0:80
1.1.1.1:90
2.2.2.2:20
```
## Methods
The following are some useful methods that are a part of the package
### ProxyBot
* **getRequest(proxy)** - if passed a specific proxy, this method will perform a `GET` request using the specified proxy to the address set at the creation of the bot. EX: `getRequest('0.0.0.0:80')`
* **postRequest(proxy)** - if passed a specific proxy, this method will perform a `POST` request using the specified proxy to the address set at the creation of the bot. EX: `postRequest('0.0.0.0:80')`
* **preformRotate()** - if called, the bot will request a new proxy from the RotatingProxy class and perform the specified request to the specified address, returning the response object (see the usage sketch after the method lists below)
* **enable()** - if called, will initiate the main loop of the bot, making the specified requests to the address using the set number of intervals and wait time
* **disable()** - if called, will shut down the main loop of the program and delete the bot
### RotatingProxy
* **buildProxyList()** - when called, will contact the configured API (or the default one) to retrieve an up-to-date list of proxies from which it can pull to make requests
* **importProxyList()** - will attempt to build a list of proxies from the file name provided.
* **rotate()** - will return the 0th proxy in the list and add it to the used proxy list.
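As a quick illustration of how these methods fit together, here is a minimal sketch; the `address` value is a placeholder, and the constructor arguments are described in the next section:

```python
#!/usr/bin/python3
from RotatingProxyBot import ProxyBot

bot = ProxyBot(address='http://example.com', method='GET')

# Rotate to a fresh proxy and fire a single GET request through it
response = bot.preformRotate()
print(response)
```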
## Additional Arguments for the Constructor
The following are keyword arguments that can be passed into the constructor of the ProxyBot Class.
* **id** - Assigns the instance of the ProxyBot with the given numeric ID
Example: `RotatingProxyBot(id=1)`
* **address** - the IP or URL for the bot to contact, will default to a tester API
Example: `RotatingProxyBot(address='0.0.0.0')`
* **method** - The request method to be used. Only `GET` and `POST` are supported. The default is `GET`
Example: `RotatingProxyBot(method='POST')`
* **params** - Parameters to be passed with the request, works with all request methods
Example: `RotatingProxyBot(params={'example':'test'})`
* **desired_reqs** - Desired number of requests to be completed
Example: `RotatingProxyBot(desired_reqs=10)`
* **keep_alive** - A boolean allowing the bot to continue making requests forever
*if set to `True`, don't set `desired_reqs`*
Example: `RotatingProxyBot(keep_alive=True)`
* **reqs_per_int** - Requests per interval; the number of requests to complete before waiting. This prevents the server from being flooded (DoSed)
Example: `RotatingProxyBot(reqs_per_int=2)`
* **wait_time** - Amount of time, in seconds, to wait until the next interval of requests
Example: `RotatingProxyBot(wait_time=600)`
|
PypiClean
|
/django-admin-adminlte-1.0.5.tar.gz/django-admin-adminlte-1.0.5/admin_adminlte/static/plugins/flot/plugins/jquery.flot.resize.js
|
/* Flot plugin for automatically redrawing plots as the placeholder resizes.

Copyright (c) 2007-2014 IOLA and Ole Laursen.
Licensed under the MIT license.
It works by listening for changes on the placeholder div (through the jQuery
resize event plugin) - if the size changes, it will redraw the plot.
There are no options. If you need to disable the plugin for some plots, you
can just fix the size of their placeholders.
*/
/* Inline dependency:
* jQuery resize event - v1.1 - 3/14/2010
* http://benalman.com/projects/jquery-resize-plugin/
*
* Copyright (c) 2010 "Cowboy" Ben Alman
* Dual licensed under the MIT and GPL licenses.
* http://benalman.com/about/license/
*/
(function($,e,t){"$:nomunge";var i=[],n=$.resize=$.extend($.resize,{}),a,r=false,s="setTimeout",u="resize",m=u+"-special-event",o="pendingDelay",l="activeDelay",f="throttleWindow";n[o]=200;n[l]=20;n[f]=true;$.event.special[u]={setup:function(){if(!n[f]&&this[s]){return false}var e=$(this);i.push(this);e.data(m,{w:e.width(),h:e.height()});if(i.length===1){a=t;h()}},teardown:function(){if(!n[f]&&this[s]){return false}var e=$(this);for(var t=i.length-1;t>=0;t--){if(i[t]==this){i.splice(t,1);break}}e.removeData(m);if(!i.length){if(r){cancelAnimationFrame(a)}else{clearTimeout(a)}a=null}},add:function(e){if(!n[f]&&this[s]){return false}var i;function a(e,n,a){var r=$(this),s=r.data(m)||{};s.w=n!==t?n:r.width();s.h=a!==t?a:r.height();i.apply(this,arguments)}if($.isFunction(e)){i=e;return a}else{i=e.handler;e.handler=a}}};function h(t){if(r===true){r=t||1}for(var s=i.length-1;s>=0;s--){var l=$(i[s]);if(l[0]==e||l.is(":visible")){var f=l.width(),c=l.height(),d=l.data(m);if(d&&(f!==d.w||c!==d.h)){l.trigger(u,[d.w=f,d.h=c]);r=t||true}}else{d=l.data(m);d.w=0;d.h=0}}if(a!==null){if(r&&(t==null||t-r<1e3)){a=e.requestAnimationFrame(h)}else{a=setTimeout(h,n[o]);r=false}}}if(!e.requestAnimationFrame){e.requestAnimationFrame=function(){return e.webkitRequestAnimationFrame||e.mozRequestAnimationFrame||e.oRequestAnimationFrame||e.msRequestAnimationFrame||function(t,i){return e.setTimeout(function(){t((new Date).getTime())},n[l])}}()}if(!e.cancelAnimationFrame){e.cancelAnimationFrame=function(){return e.webkitCancelRequestAnimationFrame||e.mozCancelRequestAnimationFrame||e.oCancelRequestAnimationFrame||e.msCancelRequestAnimationFrame||clearTimeout}()}})(jQuery,window);
/* eslint-enable */
(function ($) {
var options = { }; // no options
function init(plot) {
function onResize() {
var placeholder = plot.getPlaceholder();
// somebody might have hidden us and we can't plot
// when we don't have the dimensions
if (placeholder.width() === 0 || placeholder.height() === 0) return;
plot.resize();
plot.setupGrid();
plot.draw();
}
function bindEvents(plot, eventHolder) {
plot.getPlaceholder().resize(onResize);
}
function shutdown(plot, eventHolder) {
plot.getPlaceholder().unbind("resize", onResize);
}
plot.hooks.bindEvents.push(bindEvents);
plot.hooks.shutdown.push(shutdown);
}
$.plot.plugins.push({
init: init,
options: options,
name: 'resize',
version: '1.0'
});
})(jQuery);
|
PypiClean
|
/kivy-django-1.9.1.tar.gz/kivy-django-1.9.1/django/contrib/auth/management/commands/changepassword.py
|
from __future__ import unicode_literals
import getpass
from django.contrib.auth import get_user_model
from django.contrib.auth.password_validation import validate_password
from django.core.exceptions import ValidationError
from django.core.management.base import BaseCommand, CommandError
from django.db import DEFAULT_DB_ALIAS
from django.utils.encoding import force_str
class Command(BaseCommand):
help = "Change a user's password for django.contrib.auth."
requires_system_checks = False
def _get_pass(self, prompt="Password: "):
p = getpass.getpass(prompt=force_str(prompt))
if not p:
raise CommandError("aborted")
return p
def add_arguments(self, parser):
parser.add_argument('username', nargs='?',
help='Username to change password for; by default, it\'s the current username.')
parser.add_argument('--database', action='store', dest='database',
default=DEFAULT_DB_ALIAS,
help='Specifies the database to use. Default is "default".')
def handle(self, *args, **options):
if options.get('username'):
username = options['username']
else:
username = getpass.getuser()
UserModel = get_user_model()
try:
u = UserModel._default_manager.using(options.get('database')).get(**{
UserModel.USERNAME_FIELD: username
})
except UserModel.DoesNotExist:
raise CommandError("user '%s' does not exist" % username)
self.stdout.write("Changing password for user '%s'\n" % u)
MAX_TRIES = 3
count = 0
p1, p2 = 1, 2 # To make them initially mismatch.
password_validated = False
while (p1 != p2 or not password_validated) and count < MAX_TRIES:
p1 = self._get_pass()
p2 = self._get_pass("Password (again): ")
if p1 != p2:
self.stdout.write("Passwords do not match. Please try again.\n")
count += 1
# Don't validate passwords that don't match.
continue
try:
validate_password(p2, u)
except ValidationError as err:
self.stderr.write('\n'.join(err.messages))
count += 1
else:
password_validated = True
if count == MAX_TRIES:
raise CommandError("Aborting password change for user '%s' after %s attempts" % (u, count))
u.set_password(p1)
u.save()
return "Password changed successfully for user '%s'" % u
|
PypiClean
|
/sendmail-2.0.tar.gz/sendmail-2.0/sendmail.py
|
__author__ = 'Phil Budne <[email protected]>'
__version__ = '2.0'
__revision__ = '$Id: sendmail.py,v 1.10 2018/10/27 19:05:22 phil Exp $'
# Copyright (c) 2009,2018 Philip Budne ([email protected])
# Licensed under the MIT licence:
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation
# files (the "Software"), to deal in the Software without
# restriction, including without limitation the rights to use,
# copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following
# conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
import subprocess
import smtplib # for SMTPException
import sys
import os
OS_OK = getattr(os, 'OS_OK', 0)
class SendmailException(smtplib.SMTPException):
"""
subclass of smtplib.SMTPException for (crude) compatibility
"""
class Sendmail(object):
"""smtplib compatible object for queuing e-mail messages
using local sendmail program"""
# take as initializer arg? search for it?
SENDMAIL = '/usr/sbin/sendmail'
debug = False
def set_debuglevel(self, debug):
"""enable debug output"""
self.debug = debug
def sendmail(self, from_addr, to_addrs, msg, mail_options=()):
"""invoke local "sendmail" program to send a message.
`from_addr' is envelope sender string (may be empty)
`to_addrs' is list of envelope recipient addresses
string will be treated as a list with 1 address.
`msg' is headers and body of message to be sent
`mail_options' is iterable of options ('8bitmime')"""
# -i flag: do NOT treat bare dot as EOF
cmd = [self.SENDMAIL, '-i']
if from_addr: # envelope sender?
cmd.append('-f%s' % from_addr)
if isinstance(to_addrs, tuple): # be liberal
to_addrs = list(to_addrs)
elif not isinstance(to_addrs, list):
to_addrs = [to_addrs]
if sys.version_info[0] >= 3 or isinstance(msg, unicode):
msg = msg.encode('utf-8')
# need to force 8BIT (if length changed)?
if '8bitmime' in mail_options:
cmd.append('-B8BITMIME')
# avoid shell / quoting issues
proc = subprocess.Popen(cmd + to_addrs, shell=False,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
out, err = proc.communicate(input=msg)
ret = proc.returncode # not clearly documented?!
if self.debug:
print("ret: %d" % ret)
print("stdout:")
print(out)
print("stderr:")
print(err)
if ret != OS_OK:
# values suggested by Mateo Roldan:
raise SendmailException(ret, out, err)
return {}
def quit(self):
"""for SMTP compatibility"""
|
PypiClean
|
/ember-compressor-compiler-0.3.1.zip/ember-compressor-compiler-0.3.1/src/embercompressorcompiler/js/ember-template-compiler.js
|
(function() {
var Ember = { assert: function() {} };
// Version: v1.0.0-rc.6-242-g15a358a
// Last commit: 15a358a (2013-08-01 07:09:51 -0700)
(function() {
/**
@module ember
@submodule ember-handlebars-compiler
*/
// Eliminate dependency on any Ember to simplify precompilation workflow
var objectCreate = Object.create || function(parent) {
function F() {}
F.prototype = parent;
return new F();
};
var Handlebars = this.Handlebars || (Ember.imports && Ember.imports.Handlebars);
if (!Handlebars && typeof require === 'function') {
Handlebars = require('handlebars');
}
Ember.assert("Ember Handlebars requires Handlebars version 1.0.0. Include a SCRIPT tag in the HTML HEAD linking to the Handlebars file before you link to Ember.", Handlebars);
Ember.assert("Ember Handlebars requires Handlebars version 1.0.0, COMPILER_REVISION expected: 4, got: " + Handlebars.COMPILER_REVISION + " - Please note: Builds of master may have other COMPILER_REVISION values.", Handlebars.COMPILER_REVISION === 4);
/**
Prepares the Handlebars templating library for use inside Ember's view
system.
The `Ember.Handlebars` object is the standard Handlebars library, extended to
use Ember's `get()` method instead of direct property access, which allows
computed properties to be used inside templates.
To create an `Ember.Handlebars` template, call `Ember.Handlebars.compile()`.
This will return a function that can be used by `Ember.View` for rendering.
@class Handlebars
@namespace Ember
*/
Ember.Handlebars = objectCreate(Handlebars);
function makeBindings(options) {
var hash = options.hash,
hashType = options.hashTypes;
for (var prop in hash) {
if (hashType[prop] === 'ID') {
hash[prop + 'Binding'] = hash[prop];
hashType[prop + 'Binding'] = 'STRING';
delete hash[prop];
delete hashType[prop];
}
}
}
/**
Register a bound helper or custom view helper.
## Simple bound helper example
```javascript
Ember.Handlebars.helper('capitalize', function(value) {
return value.toUpperCase();
});
```
The above bound helper can be used inside of templates as follows:
```handlebars
{{capitalize name}}
```
In this case, when the `name` property of the template's context changes,
the rendered value of the helper will update to reflect this change.
For more examples of bound helpers, see documentation for
`Ember.Handlebars.registerBoundHelper`.
## Custom view helper example
Assuming a view subclass named `App.CalendarView` were defined, a helper
for rendering instances of this view could be registered as follows:
```javascript
Ember.Handlebars.helper('calendar', App.CalendarView):
```
The above bound helper can be used inside of templates as follows:
```handlebars
{{calendar}}
```
Which is functionally equivalent to:
```handlebars
{{view App.CalendarView}}
```
Options in the helper will be passed to the view in exactly the same
manner as with the `view` helper.
@method helper
@for Ember.Handlebars
@param {String} name
@param {Function|Ember.View} function or view class constructor
@param {String} dependentKeys*
*/
Ember.Handlebars.helper = function(name, value) {
if (Ember.Component.detect(value)) {
Ember.assert("You tried to register a component named '" + name + "', but component names must include a '-'", name.match(/-/));
var proto = value.proto();
if (!proto.layoutName && !proto.templateName) {
value.reopen({
layoutName: 'components/' + name
});
}
}
if (Ember.View.detect(value)) {
Ember.Handlebars.registerHelper(name, function(options) {
Ember.assert("You can only pass attributes (such as name=value) not bare values to a helper for a View", arguments.length < 2);
makeBindings(options);
return Ember.Handlebars.helpers.view.call(this, value, options);
});
} else {
Ember.Handlebars.registerBoundHelper.apply(null, arguments);
}
};
/**
@class helpers
@namespace Ember.Handlebars
*/
Ember.Handlebars.helpers = objectCreate(Handlebars.helpers);
/**
Override the opcode compiler and JavaScript compiler for Handlebars.
@class Compiler
@namespace Ember.Handlebars
@private
@constructor
*/
Ember.Handlebars.Compiler = function() {};
// Handlebars.Compiler doesn't exist in runtime-only
if (Handlebars.Compiler) {
Ember.Handlebars.Compiler.prototype = objectCreate(Handlebars.Compiler.prototype);
}
Ember.Handlebars.Compiler.prototype.compiler = Ember.Handlebars.Compiler;
/**
@class JavaScriptCompiler
@namespace Ember.Handlebars
@private
@constructor
*/
Ember.Handlebars.JavaScriptCompiler = function() {};
// Handlebars.JavaScriptCompiler doesn't exist in runtime-only
if (Handlebars.JavaScriptCompiler) {
Ember.Handlebars.JavaScriptCompiler.prototype = objectCreate(Handlebars.JavaScriptCompiler.prototype);
Ember.Handlebars.JavaScriptCompiler.prototype.compiler = Ember.Handlebars.JavaScriptCompiler;
}
Ember.Handlebars.JavaScriptCompiler.prototype.namespace = "Ember.Handlebars";
Ember.Handlebars.JavaScriptCompiler.prototype.initializeBuffer = function() {
return "''";
};
/**
@private
Override the default buffer for Ember Handlebars. By default, Handlebars
creates an empty String at the beginning of each invocation and appends to
it. Ember's Handlebars overrides this to append to a single shared buffer.
@method appendToBuffer
@param string {String}
*/
Ember.Handlebars.JavaScriptCompiler.prototype.appendToBuffer = function(string) {
return "data.buffer.push("+string+");";
};
var prefix = "ember" + (+new Date()), incr = 1;
/**
@private
Rewrite simple mustaches from `{{foo}}` to `{{bind "foo"}}`. This means that
all simple mustaches in Ember's Handlebars will also set up an observer to
keep the DOM up to date when the underlying property changes.
@method mustache
@for Ember.Handlebars.Compiler
@param mustache
*/
Ember.Handlebars.Compiler.prototype.mustache = function(mustache) {
if (mustache.isHelper && mustache.id.string === 'control') {
mustache.hash = mustache.hash || new Handlebars.AST.HashNode([]);
mustache.hash.pairs.push(["controlID", new Handlebars.AST.StringNode(prefix + incr++)]);
} else if (mustache.params.length || mustache.hash) {
// no changes required
} else {
var id = new Handlebars.AST.IdNode([{ part: '_triageMustache' }]);
// Update the mustache node to include a hash value indicating whether the original node
// was escaped. This will allow us to properly escape values when the underlying value
// changes and we need to re-render the value.
if (!mustache.escaped) {
mustache.hash = mustache.hash || new Handlebars.AST.HashNode([]);
mustache.hash.pairs.push(["unescaped", new Handlebars.AST.StringNode("true")]);
}
mustache = new Handlebars.AST.MustacheNode([id].concat([mustache.id]), mustache.hash, !mustache.escaped);
}
return Handlebars.Compiler.prototype.mustache.call(this, mustache);
};
/**
Used for precompilation of Ember Handlebars templates. This will not be used
during normal app execution.
@method precompile
@for Ember.Handlebars
@static
@param {String} string The template to precompile
*/
Ember.Handlebars.precompile = function(string) {
var ast = Handlebars.parse(string);
var options = {
knownHelpers: {
action: true,
unbound: true,
bindAttr: true,
template: true,
view: true,
_triageMustache: true
},
data: true,
stringParams: true
};
var environment = new Ember.Handlebars.Compiler().compile(ast, options);
return new Ember.Handlebars.JavaScriptCompiler().compile(environment, options, undefined, true);
};
// We don't support this for Handlebars runtime-only
if (Handlebars.compile) {
/**
The entry point for Ember Handlebars. This replaces the default
`Handlebars.compile` and turns on template-local data and String
parameters.
@method compile
@for Ember.Handlebars
@static
@param {String} string The template to compile
@return {Function}
*/
Ember.Handlebars.compile = function(string) {
var ast = Handlebars.parse(string);
var options = { data: true, stringParams: true };
var environment = new Ember.Handlebars.Compiler().compile(ast, options);
var templateSpec = new Ember.Handlebars.JavaScriptCompiler().compile(environment, options, undefined, true);
return Ember.Handlebars.template(templateSpec);
};
}
})();
exports.precompile = Ember.Handlebars.precompile;
exports.EmberHandlebars = Ember.Handlebars;
})();
|
PypiClean
|
/inotipy-0.1.1-py3-none-any.whl/inotipy.py
|
import os
import enum
import ctypes as ct
import struct
from weakref import \
ref as weak_ref, \
WeakValueDictionary
import asyncio
import atexit
libc = ct.CDLL("libc.so.6", use_errno = True)
NAME_MAX = 255 # from <linux/limits.h>
class inotify_event(ct.Structure) :
# from <sys/inotify.h>
_fields_ = \
[
("wd", ct.c_int),
("mask", ct.c_uint),
("cookie", ct.c_uint),
("len", ct.c_uint), # length of name, including trailing NULs, won’t exceed NAME_MAX
# name follows
]
#end inotify_event
class IN :
"definitions of flag bits that you will need."
# from <bits/inotify.h>: flags for inotify_init1
CLOEXEC = 0o2000000
NONBLOCK = 0o0004000
# from <sys/inotify.h>:
# mask bits for INOTIFY_ADD_WATCH:
ACCESS = 0x00000001
MODIFY = 0x00000002
ATTRIB = 0x00000004
CLOSE_WRITE = 0x00000008
CLOSE_NOWRITE = 0x00000010
CLOSE = CLOSE_WRITE | CLOSE_NOWRITE
OPEN = 0x00000020
MOVED_FROM = 0x00000040
MOVED_TO = 0x00000080
MOVE = MOVED_FROM | MOVED_TO
CREATE = 0x00000100
DELETE = 0x00000200
DELETE_SELF = 0x00000400
MOVE_SELF = 0x00000800
# events from kernel:
UNMOUNT = 0x00002000
Q_OVERFLOW = 0x00004000
IGNORED = 0x00008000
# special flags:
ONLYDIR = 0x01000000
DONT_FOLLOW = 0x02000000
EXCL_UNLINK = 0x04000000
MASK_ADD = 0x20000000
ISDIR = 0x40000000
ONESHOT = 0x80000000
ALL_EVENTS = \
(
ACCESS
|
MODIFY
|
ATTRIB
|
CLOSE_WRITE
|
CLOSE_NOWRITE
|
OPEN
|
MOVED_FROM
|
MOVED_TO
|
CREATE
|
DELETE
|
DELETE_SELF
|
MOVE_SELF
)
#end IN
@enum.unique
class EVENT_BIT(enum.IntEnum) :
"names for single bits in mask; value is bit number."
ACCESS = 0
MODIFY = 1
ATTRIB = 2
CLOSE_WRITE = 3
CLOSE_NOWRITE = 4
OPEN = 5
MOVED_FROM = 6
MOVED_TO = 7
CREATE = 8
DELETE = 9
DELETE_SELF = 10
MOVE_SELF = 11
UNMOUNT = 13
Q_OVERFLOW = 14
IGNORED = 15
ONLYDIR = 24
DONT_FOLLOW = 25
EXCL_UNLINK = 26
MASK_ADD = 29
ISDIR = 30
ONESHOT = 31
@property
def mask(self) :
"convert bit number to mask."
return \
1 << self.value
#end mask
#end EVENT_BIT
#+
# Library prototypes
#-
libc.inotify_init.restype = ct.c_int
libc.inotify_init.argtypes = ()
libc.inotify_init1.restype = ct.c_int
libc.inotify_init1.argtypes = (ct.c_int,)
libc.inotify_add_watch.restype = ct.c_int
libc.inotify_add_watch.argtypes = (ct.c_int, ct.c_char_p, ct.c_uint)
libc.inotify_rm_watch.restype = ct.c_int
libc.inotify_rm_watch.argtypes = (ct.c_int, ct.c_int)
#+
# High-level stuff follows
#-
def decode_mask(mask) :
mask_bits = []
for i in range(32) :
if 1 << i & mask != 0 :
try :
name = EVENT_BIT(i)
except ValueError :
name = "?%d" % i
#end try
mask_bits.append(name)
#end if
#end for
return \
mask_bits
#end decode_mask
class Watch :
"represents a file path being watched. Do not create directly; get from Watcher.watch()."
__slots__ = ("__weakref__", "_wd", "_parent", "pathname", "mask") # to forestall typos
_instances = WeakValueDictionary()
def __new__(celf, _wd, _parent) :
self = celf._instances.get((_wd, _parent._fd))
if self == None :
self = super().__new__(celf)
self._wd = _wd
self._parent = weak_ref(_parent)
celf._instances[(_wd, _parent._fd)] = self
#end if
# pathname, mask set by parent
_parent._watches[_wd] = self
return \
self
#end __new__
def __del__(self) :
self.remove()
#end __del__
@property
def valid(self) :
"is this Watch object still valid. It can become invalid after a" \
" remove() call, or after inotify sends an IN.IGNORED event for it."
return \
self._parent != None and self._wd != None
#end valid
def remove(self) :
"removes itself from being watched. Do not try to use this Watch" \
" object for anything else after making this call."
if self._wd != None and self._parent != None :
parent = self._parent()
if parent != None :
libc.inotify_rm_watch(parent._fd, self._wd) # ignoring any error
parent._watches.pop(self._wd, None)
#end if
self._wd = None
#end if
#end remove
def replace_mask(self, mask) :
"lets you change the mask associated with this Watch."
parent = self._parent()
assert parent != None, "parent has gone away"
wd = libc.inotify_add_watch(parent._fd, self.pathname.encode(), mask)
if wd < 0 :
errno = ct.get_errno()
raise OSError(errno, os.strerror(errno))
elif wd != self._wd :
raise RuntimeError("inconsistency in watch descriptors")
#end if
self.mask = mask
#end replace_mask
def __repr__(self) :
return \
"%s(%s, %s, %d:%d)" % (type(self).__name__, repr(self.pathname), decode_mask(self.mask), self._parent()._fd, self._wd)
#end __repr__
#end Watch
class Event :
"represents a watch event. Do not instantiate directly; get from Watcher.get()."
__slots__ = ("watch", "mask", "cookie", "pathname") # to forestall typos
def __init__(self, watch, mask, cookie, pathname) :
self.watch = watch
self.mask = mask
self.cookie = cookie
self.pathname = pathname
    #end __init__
def __repr__(self) :
return \
"%s(%s, %s, %d, %s)" % (type(self).__name__, (lambda : None, lambda : self.watch._wd)[self.watch != None](), decode_mask(self.mask), self.cookie, repr(self.pathname))
#end __repr__
#end Event
class Watcher :
"a context for watching one or more files or directories. Do not instantiate directly;" \
" use the create() method."
__slots__ = \
( # to forestall typos
"__weakref__",
"_fd",
"_watches",
"_loop",
"_reader_count",
"_awaiting",
"_notifs",
)
_instances = WeakValueDictionary()
def __new__(celf, _fd) :
self = celf._instances.get(_fd)
if self == None :
self = super().__new__(celf)
self._fd = _fd
self._loop = None # to begin with
self._watches = {}
self._reader_count = 0
self._awaiting = []
self._notifs = []
celf._instances[_fd] = self
#end if
return \
self
#end __new__
def _add_remove_watch(self, add) :
loop = self._loop()
if add :
assert loop != None, "loop has gone away"
loop.add_reader(self._fd, self._callback)
else :
if loop != None :
loop.remove_reader(self._fd)
#end if
#end if
#end _add_remove_watch
@classmethod
def create(celf, flags = 0, loop = None) :
"creates a new Watcher for collecting filesystem notifications. loop is the" \
" asyncio event loop into which to install reader callbacks; the default" \
" loop is used if this not specified."
if loop == None :
loop = asyncio.get_event_loop()
#end if
fd = libc.inotify_init1(flags)
if fd < 0 :
errno = ct.get_errno()
raise OSError(errno, os.strerror(errno))
#end if
result = celf(fd)
if result._loop == None :
result._loop = weak_ref(loop)
elif result._loop() != loop :
raise RuntimeError("watcher was not created on current event loop")
#end if
return \
result
#end create
def watch(self, pathname, mask) :
"adds a watch for the specified path, or replaces any previous" \
" watch settings if there is already a watch on that path. Returns" \
" the Watch object, either the same one as before or a new one for a" \
" new path."
wd = libc.inotify_add_watch(self._fd, pathname.encode(), mask)
if wd < 0 :
errno = ct.get_errno()
raise OSError(errno, os.strerror(errno))
#end if
result = Watch(wd, self)
result.pathname = pathname
result.mask = mask
return \
result
#end watch
@property
def watches(self) :
"returns a list of currently-associated Watch objects."
return \
sorted(self._watches.values(), key = lambda w : w.pathname)
#end watches
def __del__(self) :
if self._fd != None :
self._add_remove_watch(False)
os.close(self._fd)
#end if
self._fd = None
#end __del__
def fileno(self) :
return \
self._fd
#end fileno
def _callback(self) :
# called by asyncio when there is a notification event to be read.
fixed_size = ct.sizeof(inotify_event)
buf = os.read(self._fd, fixed_size + NAME_MAX + 1)
while len(buf) != 0 :
assert len(buf) >= fixed_size, "truncated inotify message: expected %d bytes, got %d" % (fixed_size, len(buf))
wd, mask, cookie, namelen = struct.unpack("@iIII", buf[:fixed_size])
assert len(buf) >= fixed_size + namelen, "truncated rest of inotify message: expected %d bytes, got %d" % (fixed_size + namelen, len(buf))
pathname = buf[fixed_size : fixed_size + namelen]
buf = buf[fixed_size + namelen:]
end = pathname.find(0)
if end >= 0 :
pathname = pathname[:end]
#end if
pathname = pathname.decode()
if wd >= 0 :
watch = self._watches[wd]
else :
assert mask & IN.Q_OVERFLOW != 0
watch = None
#end if
if mask & IN.IGNORED != 0 :
# watch is gone
watch._parent = None # Watch object doesn’t need to remove itself
self._watches.pop(wd)
#end if
wakeup = len(self._notifs) == 0
self._notifs.append(Event(watch, mask, cookie, pathname))
if wakeup and len(self._awaiting) != 0 :
# wake up task at head of queue
# also need to remove it from queue here, in case
# anybody else is also waiting behind it and I have
# additional incoming messages for them
self._awaiting.pop(0).set_result(True)
#end if
#end while
#end _callback
async def get(self, timeout = None) :
"waits for and returns the next available Event. Waits forever if" \
" necessary if timeout is None; else it is the number of seconds" \
" (fractions allowed) to wait; if no event becomes available during" \
" that time, None is returned."
awaiting = None
def timedout() :
awaiting.set_result(False)
#end timedout
#begin get
loop = self._loop()
assert loop != None, "loop has gone away"
while True :
if len(self._notifs) != 0 :
result = self._notifs.pop(0)
break
#end if
awaiting = loop.create_future()
timeout_task = None
if timeout != None :
if timeout <= 0 :
result = None
break
#end if
timeout_task = loop.call_later(timeout, timedout)
#end if
self._awaiting.append(awaiting)
if self._reader_count == 0 :
self._add_remove_watch(True)
#end if
self._reader_count += 1
got_one = await awaiting
self._reader_count -= 1
if self._reader_count == 0 :
self._add_remove_watch(False)
#end if
if timeout_task != None :
timeout_task.cancel()
#end if
try :
self._awaiting.pop(self._awaiting.index(awaiting))
except ValueError :
pass
#end try
if not got_one :
result = None
break
#end if
#end while
return \
result
#end get
#end Watcher
#+
# Cleanup
#-
def _atexit() :
# disable all __del__ methods at process termination to avoid segfaults
for cls in Watch, Watcher :
delattr(cls, "__del__")
#end for
#end _atexit
atexit.register(_atexit)
del _atexit
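#+
# Example usage
#-
if __name__ == "__main__" :
    # Minimal usage sketch, added for illustration; not part of the original
    # module. Assumes IN.MODIFY is defined alongside the other IN.* masks
    # referenced above; the path and timeout are example values only.

    async def demo() :
        watcher = Watcher.create()
        watcher.watch("/tmp", IN.MODIFY)
        print("watching /tmp; modify a file there to generate events")
        while True :
            event = await watcher.get(timeout = 10)
            if event == None :
                print("no event within 10 seconds, stopping")
                break
            #end if
            print("got event:", event)
        #end while
    #end demo

    asyncio.get_event_loop().run_until_complete(demo())
#end if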
|
PypiClean
|
/django-ra-erp-1.3.1.tar.gz/django-ra-erp-1.3.1/ra/static/adminlte/plugins/datatables-responsive/js/dataTables.responsive.js
|
* @summary Responsive
* @description Responsive tables plug-in for DataTables
* @version 2.2.3
* @file dataTables.responsive.js
* @author SpryMedia Ltd (www.sprymedia.co.uk)
* @contact www.sprymedia.co.uk/contact
* @copyright Copyright 2014-2018 SpryMedia Ltd.
*
* This source file is free software, available under the following license:
* MIT license - http://datatables.net/license/mit
*
* This source file is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
* or FITNESS FOR A PARTICULAR PURPOSE. See the license files for details.
*
* For details please refer to: http://www.datatables.net
*/
(function( factory ){
if ( typeof define === 'function' && define.amd ) {
// AMD
define( ['jquery', 'datatables.net'], function ( $ ) {
return factory( $, window, document );
} );
}
else if ( typeof exports === 'object' ) {
// CommonJS
module.exports = function (root, $) {
if ( ! root ) {
root = window;
}
if ( ! $ || ! $.fn.dataTable ) {
$ = require('datatables.net')(root, $).$;
}
return factory( $, root, root.document );
};
}
else {
// Browser
factory( jQuery, window, document );
}
}(function( $, window, document, undefined ) {
'use strict';
var DataTable = $.fn.dataTable;
/**
* Responsive is a plug-in for the DataTables library that makes use of
* DataTables' ability to change the visibility of columns, changing the
* visibility of columns so the displayed columns fit into the table container.
* The end result is that complex tables will be dynamically adjusted to fit
* into the viewport, be it on a desktop, tablet or mobile browser.
*
 * Responsive for DataTables has two modes of operation, which can be used
* individually or combined:
*
* * Class name based control - columns assigned class names that match the
* breakpoint logic can be shown / hidden as required for each breakpoint.
* * Automatic control - columns are automatically hidden when there is no
 *   room left to display them. Columns are removed from the right.
*
 * In addition to column visibility control, Responsive also has built-in
 * options to use DataTables' child row display to show / hide the information
* from the table that has been hidden. There are also two modes of operation
* for this child row display:
*
* * Inline - when the control element that the user can use to show / hide
* child rows is displayed inside the first column of the table.
* * Column - where a whole column is dedicated to be the show / hide control.
*
* Initialisation of Responsive is performed by:
*
* * Adding the class `responsive` or `dt-responsive` to the table. In this case
* Responsive will automatically be initialised with the default configuration
* options when the DataTable is created.
* * Using the `responsive` option in the DataTables configuration options. This
* can also be used to specify the configuration options, or simply set to
* `true` to use the defaults.
*
* @class
* @param {object} settings DataTables settings object for the host table
* @param {object} [opts] Configuration options
* @requires jQuery 1.7+
* @requires DataTables 1.10.3+
*
 * @example
 *     $('#example').DataTable( {
 *         responsive: true
 *     } );
*/
var Responsive = function ( settings, opts ) {
// Sanity check that we are using DataTables 1.10 or newer
if ( ! DataTable.versionCheck || ! DataTable.versionCheck( '1.10.10' ) ) {
throw 'DataTables Responsive requires DataTables 1.10.10 or newer';
}
this.s = {
dt: new DataTable.Api( settings ),
columns: [],
current: []
};
// Check if responsive has already been initialised on this table
if ( this.s.dt.settings()[0].responsive ) {
return;
}
// details is an object, but for simplicity the user can give it as a string
// or a boolean
if ( opts && typeof opts.details === 'string' ) {
opts.details = { type: opts.details };
}
else if ( opts && opts.details === false ) {
opts.details = { type: false };
}
else if ( opts && opts.details === true ) {
opts.details = { type: 'inline' };
}
this.c = $.extend( true, {}, Responsive.defaults, DataTable.defaults.responsive, opts );
settings.responsive = this;
this._constructor();
};
$.extend( Responsive.prototype, {
/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
* Constructor
*/
/**
* Initialise the Responsive instance
*
* @private
*/
_constructor: function ()
{
var that = this;
var dt = this.s.dt;
var dtPrivateSettings = dt.settings()[0];
var oldWindowWidth = $(window).width();
dt.settings()[0]._responsive = this;
// Use DataTables' throttle function to avoid processor thrashing on
// resize
$(window).on( 'resize.dtr orientationchange.dtr', DataTable.util.throttle( function () {
// iOS has a bug whereby resize can fire when only scrolling
// See: http://stackoverflow.com/questions/8898412
var width = $(window).width();
if ( width !== oldWindowWidth ) {
that._resize();
oldWindowWidth = width;
}
} ) );
// DataTables doesn't currently trigger an event when a row is added, so
// we need to hook into its private API to enforce the hidden rows when
// new data is added
dtPrivateSettings.oApi._fnCallbackReg( dtPrivateSettings, 'aoRowCreatedCallback', function (tr, data, idx) {
if ( $.inArray( false, that.s.current ) !== -1 ) {
$('>td, >th', tr).each( function ( i ) {
var idx = dt.column.index( 'toData', i );
if ( that.s.current[idx] === false ) {
$(this).css('display', 'none');
}
} );
}
} );
// Destroy event handler
dt.on( 'destroy.dtr', function () {
dt.off( '.dtr' );
$( dt.table().body() ).off( '.dtr' );
$(window).off( 'resize.dtr orientationchange.dtr' );
// Restore the columns that we've hidden
$.each( that.s.current, function ( i, val ) {
if ( val === false ) {
that._setColumnVis( i, true );
}
} );
} );
// Reorder the breakpoints array here in case they have been added out
// of order
this.c.breakpoints.sort( function (a, b) {
return a.width < b.width ? 1 :
a.width > b.width ? -1 : 0;
} );
this._classLogic();
this._resizeAuto();
// Details handler
var details = this.c.details;
if ( details.type !== false ) {
that._detailsInit();
// DataTables will trigger this event on every column it shows and
// hides individually
dt.on( 'column-visibility.dtr', function () {
// Use a small debounce to allow multiple columns to be set together
if ( that._timer ) {
clearTimeout( that._timer );
}
that._timer = setTimeout( function () {
that._timer = null;
that._classLogic();
that._resizeAuto();
that._resize();
that._redrawChildren();
}, 100 );
} );
// Redraw the details box on each draw which will happen if the data
// has changed. This is used until DataTables implements a native
// `updated` event for rows
dt.on( 'draw.dtr', function () {
that._redrawChildren();
} );
$(dt.table().node()).addClass( 'dtr-'+details.type );
}
dt.on( 'column-reorder.dtr', function (e, settings, details) {
that._classLogic();
that._resizeAuto();
that._resize();
} );
	// A change in column sizes means we need to recalculate the layout
dt.on( 'column-sizing.dtr', function () {
that._resizeAuto();
that._resize();
});
// On Ajax reload we want to reopen any child rows which are displayed
// by responsive
dt.on( 'preXhr.dtr', function () {
var rowIds = [];
dt.rows().every( function () {
if ( this.child.isShown() ) {
rowIds.push( this.id(true) );
}
} );
dt.one( 'draw.dtr', function () {
that._resizeAuto();
that._resize();
dt.rows( rowIds ).every( function () {
that._detailsDisplay( this, false );
} );
} );
});
dt.on( 'init.dtr', function (e, settings, details) {
that._resizeAuto();
that._resize();
// If columns were hidden, then DataTables needs to adjust the
// column sizing
if ( $.inArray( false, that.s.current ) ) {
dt.columns.adjust();
}
} );
// First pass - draw the table for the current viewport size
this._resize();
},
/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
* Private methods
*/
/**
* Calculate the visibility for the columns in a table for a given
* breakpoint. The result is pre-determined based on the class logic if
* class names are used to control all columns, but the width of the table
* is also used if there are columns which are to be automatically shown
* and hidden.
*
* @param {string} breakpoint Breakpoint name to use for the calculation
	 * @return {array} Array of boolean values indicating the visibility of each
* column.
* @private
*/
_columnsVisiblity: function ( breakpoint )
{
var dt = this.s.dt;
var columns = this.s.columns;
var i, ien;
// Create an array that defines the column ordering based first on the
// column's priority, and secondly the column index. This allows the
// columns to be removed from the right if the priority matches
var order = columns
.map( function ( col, idx ) {
return {
columnIdx: idx,
priority: col.priority
};
} )
.sort( function ( a, b ) {
if ( a.priority !== b.priority ) {
return a.priority - b.priority;
}
return a.columnIdx - b.columnIdx;
} );
// Class logic - determine which columns are in this breakpoint based
// on the classes. If no class control (i.e. `auto`) then `-` is used
// to indicate this to the rest of the function
var display = $.map( columns, function ( col, i ) {
if ( dt.column(i).visible() === false ) {
return 'not-visible';
}
return col.auto && col.minWidth === null ?
false :
col.auto === true ?
'-' :
$.inArray( breakpoint, col.includeIn ) !== -1;
} );
// Auto column control - first pass: how much width is taken by the
// ones that must be included from the non-auto columns
var requiredWidth = 0;
for ( i=0, ien=display.length ; i<ien ; i++ ) {
if ( display[i] === true ) {
requiredWidth += columns[i].minWidth;
}
}
// Second pass, use up any remaining width for other columns. For
// scrolling tables we need to subtract the width of the scrollbar. It
		// may not be required, which makes this sub-optimal, but it would
// require another full redraw to make complete use of those extra few
// pixels
var scrolling = dt.settings()[0].oScroll;
var bar = scrolling.sY || scrolling.sX ? scrolling.iBarWidth : 0;
var widthAvailable = dt.table().container().offsetWidth - bar;
var usedWidth = widthAvailable - requiredWidth;
// Control column needs to always be included. This makes it sub-
		// optimal in terms of using the available width, but it stops layout
// thrashing or overflow. Also we need to account for the control column
// width first so we know how much width is available for the other
// columns, since the control column might not be the first one shown
for ( i=0, ien=display.length ; i<ien ; i++ ) {
if ( columns[i].control ) {
usedWidth -= columns[i].minWidth;
}
}
// Allow columns to be shown (counting by priority and then right to
// left) until we run out of room
var empty = false;
for ( i=0, ien=order.length ; i<ien ; i++ ) {
var colIdx = order[i].columnIdx;
if ( display[colIdx] === '-' && ! columns[colIdx].control && columns[colIdx].minWidth ) {
// Once we've found a column that won't fit we don't let any
// others display either, or columns might disappear in the
// middle of the table
if ( empty || usedWidth - columns[colIdx].minWidth < 0 ) {
empty = true;
display[colIdx] = false;
}
else {
display[colIdx] = true;
}
usedWidth -= columns[colIdx].minWidth;
}
}
// Determine if the 'control' column should be shown (if there is one).
// This is the case when there is a hidden column (that is not the
// control column). The two loops look inefficient here, but they are
// trivial and will fly through. We need to know the outcome from the
		// first loop, before the action in the second can be taken
var showControl = false;
for ( i=0, ien=columns.length ; i<ien ; i++ ) {
if ( ! columns[i].control && ! columns[i].never && display[i] === false ) {
showControl = true;
break;
}
}
for ( i=0, ien=columns.length ; i<ien ; i++ ) {
if ( columns[i].control ) {
display[i] = showControl;
}
			// Replace the 'not-visible' marker with false, now that the control
			// column detection above has run
if ( display[i] === 'not-visible' ) {
display[i] = false;
}
}
// Finally we need to make sure that there is at least one column that
// is visible
if ( $.inArray( true, display ) === -1 ) {
display[0] = true;
}
return display;
},
/**
* Create the internal `columns` array with information about the columns
* for the table. This includes determining which breakpoints the column
* will appear in, based upon class names in the column, which makes up the
* vast majority of this method.
*
* @private
*/
_classLogic: function ()
{
var that = this;
var calc = {};
var breakpoints = this.c.breakpoints;
var dt = this.s.dt;
var columns = dt.columns().eq(0).map( function (i) {
var column = this.column(i);
var className = column.header().className;
var priority = dt.settings()[0].aoColumns[i].responsivePriority;
if ( priority === undefined ) {
var dataPriority = $(column.header()).data('priority');
priority = dataPriority !== undefined ?
dataPriority * 1 :
10000;
}
return {
className: className,
includeIn: [],
auto: false,
control: false,
never: className.match(/\bnever\b/) ? true : false,
priority: priority
};
} );
// Simply add a breakpoint to `includeIn` array, ensuring that there are
// no duplicates
var add = function ( colIdx, name ) {
var includeIn = columns[ colIdx ].includeIn;
if ( $.inArray( name, includeIn ) === -1 ) {
includeIn.push( name );
}
};
var column = function ( colIdx, name, operator, matched ) {
var size, i, ien;
if ( ! operator ) {
columns[ colIdx ].includeIn.push( name );
}
else if ( operator === 'max-' ) {
// Add this breakpoint and all smaller
size = that._find( name ).width;
for ( i=0, ien=breakpoints.length ; i<ien ; i++ ) {
if ( breakpoints[i].width <= size ) {
add( colIdx, breakpoints[i].name );
}
}
}
else if ( operator === 'min-' ) {
// Add this breakpoint and all larger
size = that._find( name ).width;
for ( i=0, ien=breakpoints.length ; i<ien ; i++ ) {
if ( breakpoints[i].width >= size ) {
add( colIdx, breakpoints[i].name );
}
}
}
else if ( operator === 'not-' ) {
// Add all but this breakpoint
for ( i=0, ien=breakpoints.length ; i<ien ; i++ ) {
if ( breakpoints[i].name.indexOf( matched ) === -1 ) {
add( colIdx, breakpoints[i].name );
}
}
}
};
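		// Example (illustrative, added): with the default breakpoints defined
		// later in this file, a header cell such as <th class="min-tablet-l">
		// is included in the 'desktop' and 'tablet-l' breakpoints, while the
		// class "not-mobile" includes the column in every breakpoint except
		// the two mobile ones. These class names only demonstrate the
		// matching logic applied below.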
// Loop over each column and determine if it has a responsive control
// class
columns.each( function ( col, i ) {
var classNames = col.className.split(' ');
var hasClass = false;
// Split the class name up so multiple rules can be applied if needed
for ( var k=0, ken=classNames.length ; k<ken ; k++ ) {
var className = $.trim( classNames[k] );
if ( className === 'all' ) {
// Include in all
hasClass = true;
col.includeIn = $.map( breakpoints, function (a) {
return a.name;
} );
return;
}
else if ( className === 'none' || col.never ) {
// Include in none (default) and no auto
hasClass = true;
return;
}
else if ( className === 'control' ) {
				// Special column that is only visible when one of the other
// columns is hidden. This is used for the details control
hasClass = true;
col.control = true;
return;
}
$.each( breakpoints, function ( j, breakpoint ) {
// Does this column have a class that matches this breakpoint?
var brokenPoint = breakpoint.name.split('-');
var re = new RegExp( '(min\\-|max\\-|not\\-)?('+brokenPoint[0]+')(\\-[_a-zA-Z0-9])?' );
var match = className.match( re );
if ( match ) {
hasClass = true;
if ( match[2] === brokenPoint[0] && match[3] === '-'+brokenPoint[1] ) {
// Class name matches breakpoint name fully
column( i, breakpoint.name, match[1], match[2]+match[3] );
}
else if ( match[2] === brokenPoint[0] && ! match[3] ) {
// Class name matched primary breakpoint name with no qualifier
column( i, breakpoint.name, match[1], match[2] );
}
}
} );
}
// If there was no control class, then automatic sizing is used
if ( ! hasClass ) {
col.auto = true;
}
} );
this.s.columns = columns;
},
/**
* Show the details for the child row
*
* @param {DataTables.Api} row API instance for the row
* @param {boolean} update Update flag
* @private
*/
_detailsDisplay: function ( row, update )
{
var that = this;
var dt = this.s.dt;
var details = this.c.details;
if ( details && details.type !== false ) {
var res = details.display( row, update, function () {
return details.renderer(
dt, row[0], that._detailsObj(row[0])
);
} );
if ( res === true || res === false ) {
$(dt.table().node()).triggerHandler( 'responsive-display.dt', [dt, row, res, update] );
}
}
},
/**
* Initialisation for the details handler
*
* @private
*/
_detailsInit: function ()
{
var that = this;
var dt = this.s.dt;
var details = this.c.details;
// The inline type always uses the first child as the target
if ( details.type === 'inline' ) {
details.target = 'td:first-child, th:first-child';
}
// Keyboard accessibility
dt.on( 'draw.dtr', function () {
that._tabIndexes();
} );
that._tabIndexes(); // Initial draw has already happened
$( dt.table().body() ).on( 'keyup.dtr', 'td, th', function (e) {
if ( e.keyCode === 13 && $(this).data('dtr-keyboard') ) {
$(this).click();
}
} );
// type.target can be a string jQuery selector or a column index
var target = details.target;
var selector = typeof target === 'string' ? target : 'td, th';
// Click handler to show / hide the details rows when they are available
$( dt.table().body() )
.on( 'click.dtr mousedown.dtr mouseup.dtr', selector, function (e) {
			// If the table is not collapsed (i.e. there are no hidden columns)
// then take no action
if ( ! $(dt.table().node()).hasClass('collapsed' ) ) {
return;
}
// Check that the row is actually a DataTable's controlled node
if ( $.inArray( $(this).closest('tr').get(0), dt.rows().nodes().toArray() ) === -1 ) {
return;
}
// For column index, we determine if we should act or not in the
// handler - otherwise it is already okay
if ( typeof target === 'number' ) {
var targetIdx = target < 0 ?
dt.columns().eq(0).length + target :
target;
if ( dt.cell( this ).index().column !== targetIdx ) {
return;
}
}
// $().closest() includes itself in its check
var row = dt.row( $(this).closest('tr') );
// Check event type to do an action
if ( e.type === 'click' ) {
// The renderer is given as a function so the caller can execute it
				// only when needed (i.e. if hiding, there is no point in running
// the renderer)
that._detailsDisplay( row, false );
}
else if ( e.type === 'mousedown' ) {
// For mouse users, prevent the focus ring from showing
$(this).css('outline', 'none');
}
else if ( e.type === 'mouseup' ) {
// And then re-allow at the end of the click
$(this).blur().css('outline', '');
}
} );
},
/**
* Get the details to pass to a renderer for a row
* @param {int} rowIdx Row index
* @private
*/
_detailsObj: function ( rowIdx )
{
var that = this;
var dt = this.s.dt;
return $.map( this.s.columns, function( col, i ) {
// Never and control columns should not be passed to the renderer
if ( col.never || col.control ) {
return;
}
return {
title: dt.settings()[0].aoColumns[ i ].sTitle,
data: dt.cell( rowIdx, i ).render( that.c.orthogonal ),
hidden: dt.column( i ).visible() && !that.s.current[ i ],
columnIndex: i,
rowIndex: rowIdx
};
} );
},
/**
* Find a breakpoint object from a name
*
* @param {string} name Breakpoint name to find
* @return {object} Breakpoint description object
* @private
*/
_find: function ( name )
{
var breakpoints = this.c.breakpoints;
for ( var i=0, ien=breakpoints.length ; i<ien ; i++ ) {
if ( breakpoints[i].name === name ) {
return breakpoints[i];
}
}
},
/**
* Re-create the contents of the child rows as the display has changed in
* some way.
*
* @private
*/
_redrawChildren: function ()
{
var that = this;
var dt = this.s.dt;
dt.rows( {page: 'current'} ).iterator( 'row', function ( settings, idx ) {
			that._detailsDisplay( dt.row( idx ), true );
} );
},
/**
* Alter the table display for a resized viewport. This involves first
* determining what breakpoint the window currently is in, getting the
* column visibilities to apply and then setting them.
*
* @private
*/
_resize: function ()
{
var that = this;
var dt = this.s.dt;
var width = $(window).width();
var breakpoints = this.c.breakpoints;
var breakpoint = breakpoints[0].name;
var columns = this.s.columns;
var i, ien;
var oldVis = this.s.current.slice();
// Determine what breakpoint we are currently at
for ( i=breakpoints.length-1 ; i>=0 ; i-- ) {
if ( width <= breakpoints[i].width ) {
breakpoint = breakpoints[i].name;
break;
}
}
// Show the columns for that break point
var columnsVis = this._columnsVisiblity( breakpoint );
this.s.current = columnsVis;
// Set the class before the column visibility is changed so event
// listeners know what the state is. Need to determine if there are
// any columns that are not visible but can be shown
var collapsedClass = false;
for ( i=0, ien=columns.length ; i<ien ; i++ ) {
if ( columnsVis[i] === false && ! columns[i].never && ! columns[i].control && ! dt.column(i).visible() === false ) {
collapsedClass = true;
break;
}
}
$( dt.table().node() ).toggleClass( 'collapsed', collapsedClass );
var changed = false;
var visible = 0;
dt.columns().eq(0).each( function ( colIdx, i ) {
if ( columnsVis[i] === true ) {
visible++;
}
if ( columnsVis[i] !== oldVis[i] ) {
changed = true;
that._setColumnVis( colIdx, columnsVis[i] );
}
} );
if ( changed ) {
this._redrawChildren();
// Inform listeners of the change
$(dt.table().node()).trigger( 'responsive-resize.dt', [dt, this.s.current] );
// If no records, update the "No records" display element
if ( dt.page.info().recordsDisplay === 0 ) {
$('td', dt.table().body()).eq(0).attr('colspan', visible);
}
}
},
/**
* Determine the width of each column in the table so the auto column hiding
* has that information to work with. This method is never going to be 100%
* perfect since column widths can change slightly per page, but without
* seriously compromising performance this is quite effective.
*
* @private
*/
_resizeAuto: function ()
{
var dt = this.s.dt;
var columns = this.s.columns;
// Are we allowed to do auto sizing?
if ( ! this.c.auto ) {
return;
}
// Are there any columns that actually need auto-sizing, or do they all
// have classes defined
if ( $.inArray( true, $.map( columns, function (c) { return c.auto; } ) ) === -1 ) {
return;
}
// Need to restore all children. They will be reinstated by a re-render
if ( ! $.isEmptyObject( _childNodeStore ) ) {
$.each( _childNodeStore, function ( key ) {
var idx = key.split('-');
_childNodesRestore( dt, idx[0]*1, idx[1]*1 );
} );
}
// Clone the table with the current data in it
var tableWidth = dt.table().node().offsetWidth;
var columnWidths = dt.columns;
var clonedTable = dt.table().node().cloneNode( false );
var clonedHeader = $( dt.table().header().cloneNode( false ) ).appendTo( clonedTable );
var clonedBody = $( dt.table().body() ).clone( false, false ).empty().appendTo( clonedTable ); // use jQuery because of IE8
// Header
var headerCells = dt.columns()
.header()
.filter( function (idx) {
return dt.column(idx).visible();
} )
.to$()
.clone( false )
.css( 'display', 'table-cell' )
.css( 'min-width', 0 );
// Body rows - we don't need to take account of DataTables' column
// visibility since we implement our own here (hence the `display` set)
$(clonedBody)
.append( $(dt.rows( { page: 'current' } ).nodes()).clone( false ) )
.find( 'th, td' ).css( 'display', '' );
// Footer
var footer = dt.table().footer();
if ( footer ) {
var clonedFooter = $( footer.cloneNode( false ) ).appendTo( clonedTable );
var footerCells = dt.columns()
.footer()
.filter( function (idx) {
return dt.column(idx).visible();
} )
.to$()
.clone( false )
.css( 'display', 'table-cell' );
$('<tr/>')
.append( footerCells )
.appendTo( clonedFooter );
}
$('<tr/>')
.append( headerCells )
.appendTo( clonedHeader );
// In the inline case extra padding is applied to the first column to
// give space for the show / hide icon. We need to use this in the
// calculation
if ( this.c.details.type === 'inline' ) {
$(clonedTable).addClass( 'dtr-inline collapsed' );
}
// It is unsafe to insert elements with the same name into the DOM
// multiple times. For example, cloning and inserting a checked radio
		// clears the checked state of the original radio.
$( clonedTable ).find( '[name]' ).removeAttr( 'name' );
// A position absolute table would take the table out of the flow of
// our container element, bypassing the height and width (Scroller)
		$( clonedTable ).css( 'position', 'relative' );
var inserted = $('<div/>')
.css( {
width: 1,
height: 1,
overflow: 'hidden',
clear: 'both'
} )
.append( clonedTable );
inserted.insertBefore( dt.table().node() );
// The cloned header now contains the smallest that each column can be
headerCells.each( function (i) {
var idx = dt.column.index( 'fromVisible', i );
columns[ idx ].minWidth = this.offsetWidth || 0;
} );
inserted.remove();
},
/**
* Set a column's visibility.
*
	 * We don't use DataTables' column visibility controls, so that DataTables'
	 * own column visibility state and Responsive's can co-exist. Since only
	 * IE8+ is supported (and all evergreen browsers of course) controlling the
	 * display attribute works well.
*
* @param {integer} col Column index
* @param {boolean} showHide Show or hide (true or false)
* @private
*/
_setColumnVis: function ( col, showHide )
{
var dt = this.s.dt;
var display = showHide ? '' : 'none'; // empty string will remove the attr
$( dt.column( col ).header() ).css( 'display', display );
$( dt.column( col ).footer() ).css( 'display', display );
dt.column( col ).nodes().to$().css( 'display', display );
		// If there are child nodes stored, we might need to reinsert them
if ( ! $.isEmptyObject( _childNodeStore ) ) {
dt.cells( null, col ).indexes().each( function (idx) {
_childNodesRestore( dt, idx.row, idx.column );
} );
}
},
/**
* Update the cell tab indexes for keyboard accessibility. This is called on
* every table draw - that is potentially inefficient, but also the least
	 * complex option given that column visibility can change on the fly. It's a
* shame user-focus was removed from CSS 3 UI, as it would have solved this
* issue with a single CSS statement.
*
* @private
*/
_tabIndexes: function ()
{
var dt = this.s.dt;
var cells = dt.cells( { page: 'current' } ).nodes().to$();
var ctx = dt.settings()[0];
var target = this.c.details.target;
cells.filter( '[data-dtr-keyboard]' ).removeData( '[data-dtr-keyboard]' );
if ( typeof target === 'number' ) {
dt.cells( null, target, { page: 'current' } ).nodes().to$()
.attr( 'tabIndex', ctx.iTabIndex )
.data( 'dtr-keyboard', 1 );
}
else {
// This is a bit of a hack - we need to limit the selected nodes to just
// those of this table
if ( target === 'td:first-child, th:first-child' ) {
target = '>td:first-child, >th:first-child';
}
$( target, dt.rows( { page: 'current' } ).nodes() )
.attr( 'tabIndex', ctx.iTabIndex )
.data( 'dtr-keyboard', 1 );
}
}
} );
/**
* List of default breakpoints. Each item in the array is an object with two
* properties:
*
* * `name` - the breakpoint name.
* * `width` - the breakpoint width
*
* @name Responsive.breakpoints
* @static
*/
Responsive.breakpoints = [
{ name: 'desktop', width: Infinity },
{ name: 'tablet-l', width: 1024 },
{ name: 'tablet-p', width: 768 },
{ name: 'mobile-l', width: 480 },
{ name: 'mobile-p', width: 320 }
];
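/*
 * Example (illustrative, added): the breakpoint list can be customised per
 * table via the `responsive.breakpoints` initialisation option, which feeds
 * `this.c.breakpoints` used above. The names and widths here are example
 * values only:
 *
 *     $('#example').DataTable( {
 *         responsive: {
 *             breakpoints: [
 *                 { name: 'desktop', width: Infinity },
 *                 { name: 'mobile',  width: 480 }
 *             ]
 *         }
 *     } );
 */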
/**
* Display methods - functions which define how the hidden data should be shown
* in the table.
*
* @namespace
 * @name Responsive.display
* @static
*/
Responsive.display = {
childRow: function ( row, update, render ) {
if ( update ) {
if ( $(row.node()).hasClass('parent') ) {
row.child( render(), 'child' ).show();
return true;
}
}
else {
if ( ! row.child.isShown() ) {
row.child( render(), 'child' ).show();
$( row.node() ).addClass( 'parent' );
return true;
}
else {
row.child( false );
$( row.node() ).removeClass( 'parent' );
return false;
}
}
},
childRowImmediate: function ( row, update, render ) {
if ( (! update && row.child.isShown()) || ! row.responsive.hasHidden() ) {
			// User interaction and the row is shown, or nothing to show
row.child( false );
$( row.node() ).removeClass( 'parent' );
return false;
}
else {
// Display
row.child( render(), 'child' ).show();
$( row.node() ).addClass( 'parent' );
return true;
}
},
// This is a wrapper so the modal options for Bootstrap and jQuery UI can
// have options passed into them. This specific one doesn't need to be a
// function but it is for consistency in the `modal` name
modal: function ( options ) {
return function ( row, update, render ) {
if ( ! update ) {
// Show a modal
var close = function () {
modal.remove(); // will tidy events for us
$(document).off( 'keypress.dtr' );
};
var modal = $('<div class="dtr-modal"/>')
.append( $('<div class="dtr-modal-display"/>')
.append( $('<div class="dtr-modal-content"/>')
.append( render() )
)
.append( $('<div class="dtr-modal-close">×</div>' )
.click( function () {
close();
} )
)
)
.append( $('<div class="dtr-modal-background"/>')
.click( function () {
close();
} )
)
.appendTo( 'body' );
$(document).on( 'keyup.dtr', function (e) {
if ( e.keyCode === 27 ) {
e.stopPropagation();
close();
}
} );
}
else {
$('div.dtr-modal-content')
.empty()
.append( render() );
}
if ( options && options.header ) {
$('div.dtr-modal-content').prepend(
'<h2>'+options.header( row )+'</h2>'
);
}
};
}
};
var _childNodeStore = {};
function _childNodes( dt, row, col ) {
var name = row+'-'+col;
if ( _childNodeStore[ name ] ) {
return _childNodeStore[ name ];
}
// https://jsperf.com/childnodes-array-slice-vs-loop
var nodes = [];
var children = dt.cell( row, col ).node().childNodes;
for ( var i=0, ien=children.length ; i<ien ; i++ ) {
nodes.push( children[i] );
}
_childNodeStore[ name ] = nodes;
return nodes;
}
function _childNodesRestore( dt, row, col ) {
var name = row+'-'+col;
if ( ! _childNodeStore[ name ] ) {
return;
}
var node = dt.cell( row, col ).node();
var store = _childNodeStore[ name ];
var parent = store[0].parentNode;
var parentChildren = parent.childNodes;
var a = [];
for ( var i=0, ien=parentChildren.length ; i<ien ; i++ ) {
a.push( parentChildren[i] );
}
for ( var j=0, jen=a.length ; j<jen ; j++ ) {
node.appendChild( a[j] );
}
_childNodeStore[ name ] = undefined;
}
/**
 * Renderer methods - functions which define how the hidden data should be
 * rendered for display in the table.
 *
 * @namespace
 * @name Responsive.renderer
* @static
*/
Responsive.renderer = {
listHiddenNodes: function () {
return function ( api, rowIdx, columns ) {
var ul = $('<ul data-dtr-index="'+rowIdx+'" class="dtr-details"/>');
var found = false;
var data = $.each( columns, function ( i, col ) {
if ( col.hidden ) {
$(
'<li data-dtr-index="'+col.columnIndex+'" data-dt-row="'+col.rowIndex+'" data-dt-column="'+col.columnIndex+'">'+
'<span class="dtr-title">'+
col.title+
'</span> '+
'</li>'
)
.append( $('<span class="dtr-data"/>').append( _childNodes( api, col.rowIndex, col.columnIndex ) ) )// api.cell( col.rowIndex, col.columnIndex ).node().childNodes ) )
.appendTo( ul );
found = true;
}
} );
return found ?
ul :
false;
};
},
listHidden: function () {
return function ( api, rowIdx, columns ) {
var data = $.map( columns, function ( col ) {
return col.hidden ?
'<li data-dtr-index="'+col.columnIndex+'" data-dt-row="'+col.rowIndex+'" data-dt-column="'+col.columnIndex+'">'+
'<span class="dtr-title">'+
col.title+
'</span> '+
'<span class="dtr-data">'+
col.data+
'</span>'+
'</li>' :
'';
} ).join('');
return data ?
$('<ul data-dtr-index="'+rowIdx+'" class="dtr-details"/>').append( data ) :
false;
}
},
tableAll: function ( options ) {
options = $.extend( {
tableClass: ''
}, options );
return function ( api, rowIdx, columns ) {
var data = $.map( columns, function ( col ) {
return '<tr data-dt-row="'+col.rowIndex+'" data-dt-column="'+col.columnIndex+'">'+
'<td>'+col.title+':'+'</td> '+
'<td>'+col.data+'</td>'+
'</tr>';
} ).join('');
return $('<table class="'+options.tableClass+' dtr-details" width="100%"/>').append( data );
}
}
};
/**
* Responsive default settings for initialisation
*
* @namespace
* @name Responsive.defaults
* @static
*/
Responsive.defaults = {
/**
* List of breakpoints for the instance. Note that this means that each
* instance can have its own breakpoints. Additionally, the breakpoints
	 * cannot be changed once an instance has been created.
*
* @type {Array}
* @default Takes the value of `Responsive.breakpoints`
*/
breakpoints: Responsive.breakpoints,
/**
* Enable / disable auto hiding calculations. It can help to increase
* performance slightly if you disable this option, but all columns would
* need to have breakpoint classes assigned to them
*
* @type {Boolean}
* @default `true`
*/
auto: true,
/**
* Details control. If given as a string value, the `type` property of the
* default object is set to that value, and the defaults used for the rest
* of the object - this is for ease of implementation.
*
* The object consists of the following properties:
*
* * `display` - A function that is used to show and hide the hidden details
* * `renderer` - function that is called for display of the child row data.
* The default function will show the data from the hidden columns
* * `target` - Used as the selector for what objects to attach the child
* open / close to
* * `type` - `false` to disable the details display, `inline` or `column`
* for the two control types
*
* @type {Object|string}
*/
details: {
display: Responsive.display.childRow,
renderer: Responsive.renderer.listHidden(),
target: 0,
type: 'inline'
},
/**
* Orthogonal data request option. This is used to define the data type
* requested when Responsive gets the data to show in the child row.
*
* @type {String}
*/
orthogonal: 'display'
};
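/*
 * Example (illustrative, added): the `details` defaults above can be replaced
 * at initialisation time. This sketch combines the modal display and the
 * tableAll renderer defined earlier; the header callback and table class are
 * example values only:
 *
 *     $('#example').DataTable( {
 *         responsive: {
 *             details: {
 *                 display: $.fn.dataTable.Responsive.display.modal( {
 *                     header: function ( row ) { return 'Details'; }
 *                 } ),
 *                 renderer: $.fn.dataTable.Responsive.renderer.tableAll( {
 *                     tableClass: 'display'
 *                 } )
 *             }
 *         }
 *     } );
 */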
/*
* API
*/
var Api = $.fn.dataTable.Api;
// Doesn't do anything - work around for a bug in DT... Not documented
Api.register( 'responsive()', function () {
return this;
} );
Api.register( 'responsive.index()', function ( li ) {
li = $(li);
return {
column: li.data('dtr-index'),
row: li.parent().data('dtr-index')
};
} );
Api.register( 'responsive.rebuild()', function () {
return this.iterator( 'table', function ( ctx ) {
if ( ctx._responsive ) {
ctx._responsive._classLogic();
}
} );
} );
Api.register( 'responsive.recalc()', function () {
return this.iterator( 'table', function ( ctx ) {
if ( ctx._responsive ) {
ctx._responsive._resizeAuto();
ctx._responsive._resize();
}
} );
} );
Api.register( 'responsive.hasHidden()', function () {
var ctx = this.context[0];
return ctx._responsive ?
$.inArray( false, ctx._responsive.s.current ) !== -1 :
false;
} );
Api.registerPlural( 'columns().responsiveHidden()', 'column().responsiveHidden()', function () {
return this.iterator( 'column', function ( settings, column ) {
return settings._responsive ?
settings._responsive.s.current[ column ] :
false;
}, 1 );
} );
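/*
 * Example (illustrative, added): the methods registered above chain off a
 * DataTables API instance, e.g. to recalculate column widths after a hidden
 * container becomes visible:
 *
 *     var table = $('#example').DataTable( { responsive: true } );
 *     table.columns.adjust().responsive.recalc();
 *     if ( table.responsive.hasHidden() ) {
 *         // at least one column is currently collapsed by Responsive
 *     }
 */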
/**
* Version information
*
* @name Responsive.version
* @static
*/
Responsive.version = '2.2.3';
$.fn.dataTable.Responsive = Responsive;
$.fn.DataTable.Responsive = Responsive;
// Attach a listener to the document which listens for DataTables initialisation
// events so we can automatically initialise
$(document).on( 'preInit.dt.dtr', function (e, settings, json) {
if ( e.namespace !== 'dt' ) {
return;
}
if ( $(settings.nTable).hasClass( 'responsive' ) ||
$(settings.nTable).hasClass( 'dt-responsive' ) ||
settings.oInit.responsive ||
DataTable.defaults.responsive
) {
var init = settings.oInit.responsive;
if ( init !== false ) {
new Responsive( settings, $.isPlainObject( init ) ? init : {} );
}
}
} );
return Responsive;
}));
|
PypiClean
|
/convex-api-py-0.2.6.tar.gz/convex-api-py-0.2.6/convex_api/utils.py
|
import binascii
import re
from cryptography.hazmat.backends.openssl.backend import backend
from cryptography.hazmat.primitives import hashes
def to_address(value):
"""
    Convert an address to an integer address value. Accepts an int, an Account, or text with a possible leading '#'.
    :param value: Address to convert
    :returns: Integer address or None if not a valid address
"""
if isinstance(value, int):
return int(value)
elif is_account(value):
return value.address
elif isinstance(value, str):
try:
address = int(re.sub(r'^#', '', value.strip()))
except ValueError:
return None
return address
def is_address(text):
"""
Returns True if the text value is a valid address.
:param str, int text: Possible address field.
:returns: True if the text field is a valid address.
"""
value = to_address(text)
if isinstance(value, int):
return value >= 0
return False
def is_public_key_hex(public_key):
"""
Returns True if the value passed is a valid public key.
    :param str public_key: Public key to check; this has to be a hex string with a possible `0x` at the front.
:returns: True if the passed value is a valid hex public key.
"""
if is_hexstr(add_0x_prefix(public_key)):
address_base = remove_0x_prefix(public_key)
if len(address_base) == 64:
return True
return False
def is_public_key(public_key):
"""
Returns True if the value passed is a valid public key.
    :param str public_key: Public key to check; this has to be a hex string with a possible `0x` at the front.
:returns: True if the passed value is a valid hex public key.
"""
if is_public_key_checksum(public_key):
return True
if is_public_key_hex(public_key):
return True
return False
def to_public_key_checksum(public_key):
"""
Convert a public key to a checksum key. This will first make all a-f chars lower case
then convert a-f chars to uppercase depending on the hash of the public key.
    :param str public_key: Key to convert to a checksum key.
:returns: Checksum key of the public_key.
"""
digest = hashes.Hash(hashes.SHA3_256(), backend=backend)
digest.update(to_bytes(hexstr=public_key))
public_key_hash = remove_0x_prefix(to_hex(digest.finalize()))
public_key_clean = remove_0x_prefix(public_key.lower())
checksum = ''
hash_index = 0
for value in public_key_clean:
if int(public_key_hash[hash_index], 16) > 7:
checksum += value.upper()
else:
checksum += value
hash_index += 1
if hash_index >= len(public_key_hash):
hash_index = 0
return add_0x_prefix(checksum)
def is_public_key_checksum(public_key):
"""
Returns True if the public_key passed is a valid checksum.
:param str public_key: Public key that is in the checksum format
:returns: True if the key passed has the correct checksum applied to it
"""
return remove_0x_prefix(public_key) and remove_0x_prefix(public_key) == remove_0x_prefix(to_public_key_checksum(public_key))
def is_hexstr(text):
"""
Return True if the text passed is a hex string.
    :param str text: Hex chars including the '0x' at the beginning.
:returns: True if all chars are hex
"""
return re.match('^0x[0-9a-f]+$', text, re.IGNORECASE)
def add_0x_prefix(text):
"""
    Prepend the 0x prefix to the hex chars
    :param str text: Text to prepend the `0x` to.
    :returns: The text with a `0x` prepended to the front
"""
if text:
return '0x' + remove_0x_prefix(text)
def remove_0x_prefix(text):
"""
Removes the '0x' from the front of the hex string.
:param str text: Hex string to remove the '0x' from.
    :returns: The hex string with the '0x' removed.
"""
if text:
        return re.sub(r'^0x', '', text, flags=re.IGNORECASE)
def to_bytes(data=None, hexstr=None):
"""
Convert byte data or hexstr to bytes.
:param bytes, int data: Data to convert to bytes
:param str hexstr: Hex string to convert to bytes
    :returns: Bytes of the hex data
"""
if data:
return data.to_bytes(32, 'big')
elif hexstr and is_hexstr(add_0x_prefix(hexstr)):
return binascii.unhexlify(remove_0x_prefix(hexstr))
def to_hex(value):
"""
Convert byte data to hex.
    :param value: Data to convert to hex
    :returns: Returns a hex string with a prepended '0x'
"""
return add_0x_prefix(binascii.hexlify(value).decode())
def is_account(value):
from convex_api import Account
return isinstance(value, Account)
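if __name__ == '__main__':
    # Minimal usage sketch, added for illustration; not part of the original
    # module. The address and key below are dummy example values.
    print(to_address('#42'))                        # 42
    print(is_address('#42'))                        # True
    dummy_key = '0x' + '2e' * 32                    # 64 hex chars, not a real key
    print(is_public_key_hex(dummy_key))             # True
    checksum_key = to_public_key_checksum(dummy_key)
    print(checksum_key)                             # same key with checksum casing
    print(is_public_key_checksum(checksum_key))     # True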
|
PypiClean
|
/sahara_plugin_mapr-9.0.0.0rc1-py3-none-any.whl/sahara_plugin_mapr/plugins/mapr/domain/service.py
|
from oslo_log import log as logging
from oslo_serialization import jsonutils as json
import sahara.plugins.exceptions as ex
import sahara.plugins.provisioning as p
from sahara.plugins import utils
from sahara_plugin_mapr.i18n import _
from sahara_plugin_mapr.plugins.mapr.util import commands as cmd
from sahara_plugin_mapr.plugins.mapr.util import event_log as el
from sahara_plugin_mapr.plugins.mapr.util import general as g
from sahara_plugin_mapr.plugins.mapr.util import service_utils as su
LOG = logging.getLogger(__name__)
SERVICE_UI = 'Web UI'
_INSTALL_PACKAGES_TIMEOUT = 3600
class Service(object, metaclass=g.Singleton):
def __init__(self):
self._name = None
self._ui_name = None
self._node_processes = []
self._version = None
self._dependencies = []
self._ui_info = []
self._cluster_defaults = []
self._node_defaults = []
self._validation_rules = []
self._priority = 1
@property
def name(self):
return self._name
@property
def ui_name(self):
return self._ui_name
@property
def version(self):
return self._version
@property
def node_processes(self):
return self._node_processes
@property
def dependencies(self):
return self._dependencies
@property
def cluster_defaults(self):
return self._cluster_defaults
@property
def node_defaults(self):
return self._node_defaults
@property
def validation_rules(self):
return self._validation_rules
def get_ui_info(self, cluster_context):
return self._ui_info
def install(self, cluster_context, instances):
service_instances = cluster_context.filter_instances(instances,
service=self)
@el.provision_step(_("Install %s service") % self.ui_name,
cluster_context_reference=0, instances_reference=1)
def _install(_context, _instances):
g.execute_on_instances(_instances,
self._install_packages_on_instance,
_context)
if service_instances:
_install(cluster_context, service_instances)
@el.provision_event(instance_reference=1)
def _install_packages_on_instance(self, instance, cluster_context):
processes = [p for p in self.node_processes if
p.ui_name in instance.node_group.node_processes]
        if processes:
packages = self._get_packages(cluster_context, processes)
cmd = cluster_context.distro.create_install_cmd(packages)
with instance.remote() as r:
r.execute_command(cmd, run_as_root=True,
timeout=_INSTALL_PACKAGES_TIMEOUT)
def _get_packages(self, cluster_context, node_processes):
result = []
result += self.dependencies
result += [(np.package, self.version) for np in node_processes]
return result
def _set_service_dir_owner(self, cluster_context, instances):
service_instances = cluster_context.filter_instances(instances,
service=self)
LOG.debug("Changing %s service dir owner", self.ui_name)
for instance in service_instances:
cmd.chown(instance, 'mapr:mapr', self.service_dir(cluster_context))
def post_install(self, cluster_context, instances):
pass
def post_start(self, cluster_context, instances):
pass
def configure(self, cluster_context, instances=None):
pass
def update(self, cluster_context, instances=None):
pass
def get_file_path(self, file_name):
template = 'plugins/mapr/services/%(service)s/resources/%(file_name)s'
args = {'service': self.name, 'file_name': file_name}
return template % args
def get_configs(self):
result = []
for d_file in self.cluster_defaults:
data = self._load_config_file(self.get_file_path(d_file))
result += [self._create_config_obj(c, self.ui_name) for c in data]
for d_file in self.node_defaults:
data = self._load_config_file(self.get_file_path(d_file))
result += [self._create_config_obj(c, self.ui_name, scope='node')
for c in data]
return result
def get_configs_dict(self):
result = dict()
for conf_obj in self.get_configs():
result.update({conf_obj.name: conf_obj.default_value})
return {self.ui_name: result}
def _load_config_file(self, file_path=None):
return json.loads(utils.get_file_text(file_path, 'sahara_plugin_mapr'))
def get_config_files(self, cluster_context, configs, instance=None):
return []
def _create_config_obj(self, item, target='general', scope='cluster',
high_priority=False):
def _prepare_value(value):
if isinstance(value, str):
return value.strip().lower()
return value
conf_name = _prepare_value(item.get('name', None))
conf_value = _prepare_value(item.get('value', None))
if not conf_name:
raise ex.HadoopProvisionError(_("Config missing 'name'"))
if conf_value is None:
raise ex.PluginInvalidDataException(
_("Config '%s' missing 'value'") % conf_name)
if high_priority or item.get('priority', 2) == 1:
priority = 1
else:
priority = 2
return p.Config(
name=conf_name,
applicable_target=target,
scope=scope,
config_type=item.get('config_type', "string"),
config_values=item.get('config_values', None),
default_value=conf_value,
is_optional=item.get('is_optional', True),
description=item.get('description', None),
priority=priority)
def get_version_config(self, versions):
return p.Config(
name='%s Version' % self._ui_name,
applicable_target=self.ui_name,
scope='cluster',
config_type='dropdown',
config_values=[(v, v) for v in sorted(versions, reverse=True)],
is_optional=False,
description=_('Specify the version of the service'),
priority=1)
def __eq__(self, other):
if isinstance(other, self.__class__):
version_eq = self.version == other.version
ui_name_eq = self.ui_name == other.ui_name
return version_eq and ui_name_eq
return NotImplemented
def restart(self, instances):
for node_process in self.node_processes:
filtered_instances = su.filter_by_node_process(instances,
node_process)
if filtered_instances:
node_process.restart(filtered_instances)
def service_dir(self, cluster_context):
args = {'mapr_home': cluster_context.mapr_home, 'name': self.name}
return '%(mapr_home)s/%(name)s' % args
def home_dir(self, cluster_context):
args = {
'service_dir': self.service_dir(cluster_context),
'name': self.name,
'version': self.version,
}
return '%(service_dir)s/%(name)s-%(version)s' % args
def conf_dir(self, cluster_context):
return '%s/conf' % self.home_dir(cluster_context)
def post_configure_sh(self, cluster_context, instances):
pass
def post_configure(self, cluster_context, instances):
pass
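# Illustrative sketch, added; not part of the original module. A concrete
# service subclasses Service and fills in the metadata fields initialised in
# Service.__init__; the name, UI name, version and dependency package below
# are example values only.
class _ExampleService(Service):
    def __init__(self):
        super(_ExampleService, self).__init__()
        self._name = 'example'
        self._ui_name = 'Example'
        self._version = '1.0.0'
        self._dependencies = [('mapr-example-internal', self._version)]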
|
PypiClean
|
/lsv2test-core-2.0.0.tar.gz/lsv2test-core-2.0.0/localstack/aws/protocol/validate.py
|
from typing import Any, Dict, List, NamedTuple
from botocore.model import OperationModel, Shape
from botocore.validate import ParamValidator as BotocoreParamValidator
from botocore.validate import ValidationErrors as BotocoreValidationErrors
from botocore.validate import type_check
from localstack.aws.api import ServiceRequest
class Error(NamedTuple):
"""
A wrapper around ``botocore.validate`` error tuples.
Attributes:
reason The error type
    name        The name of the parameter the error occurred at
attributes Error type-specific attributes
"""
reason: str
name: str
attributes: Dict[str, Any]
class ParameterValidationError(Exception):
error: Error
def __init__(self, error: Error) -> None:
self.error = error
super().__init__(self.message)
@property
def reason(self):
return self.error.reason
@property
def message(self) -> str:
"""
Returns a default message for the error formatted by BotocoreValidationErrors.
:return: the exception message.
"""
return BotocoreValidationErrors()._format_error(self.error)
class MissingRequiredField(ParameterValidationError):
@property
def required_name(self) -> str:
return self.error.attributes["required_name"]
# TODO: extend subclasses with properties from error arguments as needed. see ValidationErrors._format_error for
# which those are.
class UnknownField(ParameterValidationError):
pass
class InvalidType(ParameterValidationError):
pass
class InvalidRange(ParameterValidationError):
pass
class InvalidLength(ParameterValidationError):
pass
class JsonEncodingError(ParameterValidationError):
pass
class InvalidDocumentType(ParameterValidationError):
pass
class MoreThanOneInput(ParameterValidationError):
pass
class EmptyInput(ParameterValidationError):
pass
class ValidationErrors(BotocoreValidationErrors):
def __init__(self, shape: Shape, params: Dict[str, Any]):
super().__init__()
self.shape = shape
self.params = params
self._exceptions: List[ParameterValidationError] = []
@property
def exceptions(self):
return self._exceptions
def raise_first(self):
for error in self._exceptions:
raise error
def report(self, name, reason, **kwargs):
error = Error(reason, name, kwargs)
self._errors.append(error)
self._exceptions.append(self.to_exception(error))
def to_exception(self, error: Error) -> ParameterValidationError:
error_type, name, additional = error
if error_type == "missing required field":
return MissingRequiredField(error)
elif error_type == "unknown field":
return UnknownField(error)
elif error_type == "invalid type":
return InvalidType(error)
elif error_type == "invalid range":
return InvalidRange(error)
elif error_type == "invalid length":
return InvalidLength(error)
elif error_type == "unable to encode to json":
return JsonEncodingError(error)
elif error_type == "invalid type for document":
return InvalidDocumentType(error)
elif error_type == "more than one input":
return MoreThanOneInput(error)
elif error_type == "empty input":
return EmptyInput(error)
return ParameterValidationError(error)
class ParamValidator(BotocoreParamValidator):
def validate(self, params: Dict[str, Any], shape: Shape):
"""Validate parameters against a shape model.
This method will validate the parameters against a provided shape model.
All errors will be collected before returning to the caller. This means
that this method will not stop at the first error, it will return all
possible errors.
:param params: User provided dict of parameters
:param shape: A shape model describing the expected input.
        :return: A ``ValidationErrors`` object containing all collected errors.
"""
errors = ValidationErrors(shape, params)
self._validate(params, shape, errors, name="")
return errors
@type_check(valid_types=(dict,))
def _validate_structure(self, params, shape, errors, name):
# our parser sets the value of required members to None if they are not in the incoming request. we correct
# this behavior here to get the correct error messages.
for required_member in shape.metadata.get("required", []):
if required_member in params and params[required_member] is None:
params.pop(required_member)
super(ParamValidator, self)._validate_structure(params, shape, errors, name)
def validate_request(operation: OperationModel, request: ServiceRequest) -> ValidationErrors:
"""
Validates the service request with the input shape of the given operation.
:param operation: the operation
:param request: the input shape of the operation being validated
:return: ValidationError object
"""
return ParamValidator().validate(request, operation.input_shape)
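if __name__ == "__main__":
    # Minimal usage sketch, added for illustration; not part of the original
    # module. The service, operation and parameters are example values, and
    # this assumes botocore ships the "sqs" service model locally.
    import botocore.session

    session = botocore.session.get_session()
    operation = session.get_service_model("sqs").operation_model("SendMessage")
    # "MessageBody" is required but missing here, so raise_first() raises
    # MissingRequiredField.
    errors = validate_request(operation, {"QueueUrl": "https://example.com/queue"})
    errors.raise_first()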
|
PypiClean
|
/GTW-1.2.6.tar.gz/GTW-1.2.6/media/js/jquery.tablesorter.min.js
|
(function($){$.extend({tablesorter:new
function(){var parsers=[],widgets=[];this.defaults={cssHeader:"header",cssAsc:"headerSortUp",cssDesc:"headerSortDown",cssChildRow:"expand-child",sortInitialOrder:"asc",sortMultiSortKey:"shiftKey",sortForce:null,sortAppend:null,sortLocaleCompare:true,textExtraction:"simple",parsers:{},widgets:[],widgetZebra:{css:["even","odd"]},headers:{},widthFixed:false,cancelSelection:true,sortList:[],headerList:[],dateFormat:"us",decimal:'/\.|\,/g',onRenderHeader:null,selectorHeaders:'thead th',debug:false};function benchmark(s,d){log(s+","+(new Date().getTime()-d.getTime())+"ms");}this.benchmark=benchmark;function log(s){if(typeof console!="undefined"&&typeof console.debug!="undefined"){console.log(s);}else{alert(s);}}function buildParserCache(table,$headers){if(table.config.debug){var parsersDebug="";}if(table.tBodies.length==0)return;var rows=table.tBodies[0].rows;if(rows[0]){var list=[],cells=rows[0].cells,l=cells.length;for(var i=0;i<l;i++){var p=false;if($.metadata&&($($headers[i]).metadata()&&$($headers[i]).metadata().sorter)){p=getParserById($($headers[i]).metadata().sorter);}else if((table.config.headers[i]&&table.config.headers[i].sorter)){p=getParserById(table.config.headers[i].sorter);}if(!p){p=detectParserForColumn(table,rows,-1,i);}if(table.config.debug){parsersDebug+="column:"+i+" parser:"+p.id+"\n";}list.push(p);}}if(table.config.debug){log(parsersDebug);}return list;};function detectParserForColumn(table,rows,rowIndex,cellIndex){var l=parsers.length,node=false,nodeValue=false,keepLooking=true;while(nodeValue==''&&keepLooking){rowIndex++;if(rows[rowIndex]){node=getNodeFromRowAndCellIndex(rows,rowIndex,cellIndex);nodeValue=trimAndGetNodeText(table.config,node);if(table.config.debug){log('Checking if value was empty on row:'+rowIndex);}}else{keepLooking=false;}}for(var i=1;i<l;i++){if(parsers[i].is(nodeValue,table,node)){return parsers[i];}}return parsers[0];}function getNodeFromRowAndCellIndex(rows,rowIndex,cellIndex){return rows[rowIndex].cells[cellIndex];}function trimAndGetNodeText(config,node){return $.trim(getElementText(config,node));}function getParserById(name){var l=parsers.length;for(var i=0;i<l;i++){if(parsers[i].id.toLowerCase()==name.toLowerCase()){return parsers[i];}}return false;}function buildCache(table){if(table.config.debug){var cacheTime=new Date();}var totalRows=(table.tBodies[0]&&table.tBodies[0].rows.length)||0,totalCells=(table.tBodies[0].rows[0]&&table.tBodies[0].rows[0].cells.length)||0,parsers=table.config.parsers,cache={row:[],normalized:[]};for(var i=0;i<totalRows;++i){var c=$(table.tBodies[0].rows[i]),cols=[];if(c.hasClass(table.config.cssChildRow)){cache.row[cache.row.length-1]=cache.row[cache.row.length-1].add(c);continue;}cache.row.push(c);for(var j=0;j<totalCells;++j){cols.push(parsers[j].format(getElementText(table.config,c[0].cells[j]),table,c[0].cells[j]));}cols.push(cache.normalized.length);cache.normalized.push(cols);cols=null;};if(table.config.debug){benchmark("Building cache for "+totalRows+" rows:",cacheTime);}return cache;};function getElementText(config,node){var text="";if(!node)return"";if(!config.supportsTextContent)config.supportsTextContent=node.textContent||false;if(config.textExtraction=="simple"){if(config.supportsTextContent){text=node.textContent;}else{if(node.childNodes[0]&&node.childNodes[0].hasChildNodes()){text=node.childNodes[0].innerHTML;}else{text=node.innerHTML;}}}else{if(typeof(config.textExtraction)=="function"){text=config.textExtraction(node);}else{text=$(node).text();}}return text;}function 
appendToTable(table,cache){if(table.config.debug){var appendTime=new Date()}var c=cache,r=c.row,n=c.normalized,totalRows=n.length,checkCell=(n[0].length-1),tableBody=$(table.tBodies[0]),rows=[];for(var i=0;i<totalRows;i++){var pos=n[i][checkCell];rows.push(r[pos]);if(!table.config.appender){var l=r[pos].length;for(var j=0;j<l;j++){tableBody[0].appendChild(r[pos][j]);}}}if(table.config.appender){table.config.appender(table,rows);}rows=null;if(table.config.debug){benchmark("Rebuilt table:",appendTime);}applyWidget(table);setTimeout(function(){$(table).trigger("sortEnd");},0);};function buildHeaders(table){if(table.config.debug){var time=new Date();}var meta=($.metadata)?true:false;var header_index=computeTableHeaderCellIndexes(table);$tableHeaders=$(table.config.selectorHeaders,table).each(function(index){this.column=header_index[this.parentNode.rowIndex+"-"+this.cellIndex];this.order=formatSortingOrder(table.config.sortInitialOrder);this.count=this.order;if(checkHeaderMetadata(this)||checkHeaderOptions(table,index))this.sortDisabled=true;if(checkHeaderOptionsSortingLocked(table,index))this.order=this.lockedOrder=checkHeaderOptionsSortingLocked(table,index);if(!this.sortDisabled){var $th=$(this).addClass(table.config.cssHeader);if(table.config.onRenderHeader)table.config.onRenderHeader.apply($th);}table.config.headerList[index]=this;});if(table.config.debug){benchmark("Built headers:",time);log($tableHeaders);}return $tableHeaders;};function computeTableHeaderCellIndexes(t){var matrix=[];var lookup={};var thead=t.getElementsByTagName('THEAD')[0];var trs=thead.getElementsByTagName('TR');for(var i=0;i<trs.length;i++){var cells=trs[i].cells;for(var j=0;j<cells.length;j++){var c=cells[j];var rowIndex=c.parentNode.rowIndex;var cellId=rowIndex+"-"+c.cellIndex;var rowSpan=c.rowSpan||1;var colSpan=c.colSpan||1
var firstAvailCol;if(typeof(matrix[rowIndex])=="undefined"){matrix[rowIndex]=[];}for(var k=0;k<matrix[rowIndex].length+1;k++){if(typeof(matrix[rowIndex][k])=="undefined"){firstAvailCol=k;break;}}lookup[cellId]=firstAvailCol;for(var k=rowIndex;k<rowIndex+rowSpan;k++){if(typeof(matrix[k])=="undefined"){matrix[k]=[];}var matrixrow=matrix[k];for(var l=firstAvailCol;l<firstAvailCol+colSpan;l++){matrixrow[l]="x";}}}}return lookup;}function checkCellColSpan(table,rows,row){var arr=[],r=table.tHead.rows,c=r[row].cells;for(var i=0;i<c.length;i++){var cell=c[i];if(cell.colSpan>1){arr=arr.concat(checkCellColSpan(table,headerArr,row++));}else{if(table.tHead.length==1||(cell.rowSpan>1||!r[row+1])){arr.push(cell);}}}return arr;};function checkHeaderMetadata(cell){if(($.metadata)&&($(cell).metadata().sorter===false)){return true;};return false;}function checkHeaderOptions(table,i){if((table.config.headers[i])&&(table.config.headers[i].sorter===false)){return true;};return false;}function checkHeaderOptionsSortingLocked(table,i){if((table.config.headers[i])&&(table.config.headers[i].lockedOrder))return table.config.headers[i].lockedOrder;return false;}function applyWidget(table){var c=table.config.widgets;var l=c.length;for(var i=0;i<l;i++){getWidgetById(c[i]).format(table);}}function getWidgetById(name){var l=widgets.length;for(var i=0;i<l;i++){if(widgets[i].id.toLowerCase()==name.toLowerCase()){return widgets[i];}}};function formatSortingOrder(v){if(typeof(v)!="Number"){return(v.toLowerCase()=="desc")?1:0;}else{return(v==1)?1:0;}}function isValueInArray(v,a){var l=a.length;for(var i=0;i<l;i++){if(a[i][0]==v){return true;}}return false;}function setHeadersCss(table,$headers,list,css){$headers.removeClass(css[0]).removeClass(css[1]);var h=[];$headers.each(function(offset){if(!this.sortDisabled){h[this.column]=$(this);}});var l=list.length;for(var i=0;i<l;i++){h[list[i][0]].addClass(css[list[i][1]]);}}function fixColumnWidth(table,$headers){var c=table.config;if(c.widthFixed){var colgroup=$('<colgroup>');$("tr:first td",table.tBodies[0]).each(function(){colgroup.append($('<col>').css('width',$(this).width()));});$(table).prepend(colgroup);};}function updateHeaderSortCount(table,sortList){var c=table.config,l=sortList.length;for(var i=0;i<l;i++){var s=sortList[i],o=c.headerList[s[0]];o.count=s[1];o.count++;}}function multisort(table,sortList,cache){if(table.config.debug){var sortTime=new Date();}var dynamicExp="var sortWrapper = function(a,b) {",l=sortList.length;for(var i=0;i<l;i++){var c=sortList[i][0];var order=sortList[i][1];var s=(table.config.parsers[c].type=="text")?((order==0)?makeSortFunction("text","asc",c):makeSortFunction("text","desc",c)):((order==0)?makeSortFunction("numeric","asc",c):makeSortFunction("numeric","desc",c));var e="e"+i;dynamicExp+="var "+e+" = "+s;dynamicExp+="if("+e+") { return "+e+"; } ";dynamicExp+="else { ";}var orgOrderCol=cache.normalized[0].length-1;dynamicExp+="return a["+orgOrderCol+"]-b["+orgOrderCol+"];";for(var i=0;i<l;i++){dynamicExp+="}; ";}dynamicExp+="return 0; ";dynamicExp+="}; ";if(table.config.debug){benchmark("Evaling expression:"+dynamicExp,new Date());}eval(dynamicExp);cache.normalized.sort(sortWrapper);if(table.config.debug){benchmark("Sorting on "+sortList.toString()+" and dir "+order+" time:",sortTime);}return cache;};function makeSortFunction(type,direction,index){var a="a["+index+"]",b="b["+index+"]";if(type=='text'&&direction=='asc'){return"("+a+" == "+b+" ? 0 : ("+a+" === null ? Number.POSITIVE_INFINITY : ("+b+" === null ? 
Number.NEGATIVE_INFINITY : ("+a+" < "+b+") ? -1 : 1 )));";}else if(type=='text'&&direction=='desc'){return"("+a+" == "+b+" ? 0 : ("+a+" === null ? Number.POSITIVE_INFINITY : ("+b+" === null ? Number.NEGATIVE_INFINITY : ("+b+" < "+a+") ? -1 : 1 )));";}else if(type=='numeric'&&direction=='asc'){return"("+a+" === null && "+b+" === null) ? 0 :("+a+" === null ? Number.POSITIVE_INFINITY : ("+b+" === null ? Number.NEGATIVE_INFINITY : "+a+" - "+b+"));";}else if(type=='numeric'&&direction=='desc'){return"("+a+" === null && "+b+" === null) ? 0 :("+a+" === null ? Number.POSITIVE_INFINITY : ("+b+" === null ? Number.NEGATIVE_INFINITY : "+b+" - "+a+"));";}};function makeSortText(i){return"((a["+i+"] < b["+i+"]) ? -1 : ((a["+i+"] > b["+i+"]) ? 1 : 0));";};function makeSortTextDesc(i){return"((b["+i+"] < a["+i+"]) ? -1 : ((b["+i+"] > a["+i+"]) ? 1 : 0));";};function makeSortNumeric(i){return"a["+i+"]-b["+i+"];";};function makeSortNumericDesc(i){return"b["+i+"]-a["+i+"];";};function sortText(a,b){if(table.config.sortLocaleCompare)return a.localeCompare(b);return((a<b)?-1:((a>b)?1:0));};function sortTextDesc(a,b){if(table.config.sortLocaleCompare)return b.localeCompare(a);return((b<a)?-1:((b>a)?1:0));};function sortNumeric(a,b){return a-b;};function sortNumericDesc(a,b){return b-a;};function getCachedSortType(parsers,i){return parsers[i].type;};this.construct=function(settings){return this.each(function(){if(!this.tHead||!this.tBodies)return;var $this,$document,$headers,cache,config,shiftDown=0,sortOrder;this.config={};config=$.extend(this.config,$.tablesorter.defaults,settings);$this=$(this);$.data(this,"tablesorter",config);$headers=buildHeaders(this);this.config.parsers=buildParserCache(this,$headers);cache=buildCache(this);var sortCSS=[config.cssDesc,config.cssAsc];fixColumnWidth(this);$headers.click(function(e){var totalRows=($this[0].tBodies[0]&&$this[0].tBodies[0].rows.length)||0;if(!this.sortDisabled&&totalRows>0){$this.trigger("sortStart");var $cell=$(this);var i=this.column;this.order=this.count++%2;if(this.lockedOrder)this.order=this.lockedOrder;if(!e[config.sortMultiSortKey]){config.sortList=[];if(config.sortForce!=null){var a=config.sortForce;for(var j=0;j<a.length;j++){if(a[j][0]!=i){config.sortList.push(a[j]);}}}config.sortList.push([i,this.order]);}else{if(isValueInArray(i,config.sortList)){for(var j=0;j<config.sortList.length;j++){var s=config.sortList[j],o=config.headerList[s[0]];if(s[0]==i){o.count=s[1];o.count++;s[1]=o.count%2;}}}else{config.sortList.push([i,this.order]);}};setTimeout(function(){setHeadersCss($this[0],$headers,config.sortList,sortCSS);appendToTable($this[0],multisort($this[0],config.sortList,cache));},1);return false;}}).mousedown(function(){if(config.cancelSelection){this.onselectstart=function(){return false};return false;}});$this.bind("update",function(){var me=this;setTimeout(function(){me.config.parsers=buildParserCache(me,$headers);cache=buildCache(me);},1);}).bind("updateCell",function(e,cell){var config=this.config;var pos=[(cell.parentNode.rowIndex-1),cell.cellIndex];cache.normalized[pos[0]][pos[1]]=config.parsers[pos[1]].format(getElementText(config,cell),cell);}).bind("sorton",function(e,list){$(this).trigger("sortStart");config.sortList=list;var 
sortList=config.sortList;updateHeaderSortCount(this,sortList);setHeadersCss(this,$headers,sortList,sortCSS);appendToTable(this,multisort(this,sortList,cache));}).bind("appendCache",function(){appendToTable(this,cache);}).bind("applyWidgetId",function(e,id){getWidgetById(id).format(this);}).bind("applyWidgets",function(){applyWidget(this);});if($.metadata&&($(this).metadata()&&$(this).metadata().sortlist)){config.sortList=$(this).metadata().sortlist;}if(config.sortList.length>0){$this.trigger("sorton",[config.sortList]);}applyWidget(this);});};this.addParser=function(parser){var l=parsers.length,a=true;for(var i=0;i<l;i++){if(parsers[i].id.toLowerCase()==parser.id.toLowerCase()){a=false;}}if(a){parsers.push(parser);};};this.addWidget=function(widget){widgets.push(widget);};this.formatFloat=function(s){var i=parseFloat(s);return(isNaN(i))?0:i;};this.formatInt=function(s){var i=parseInt(s);return(isNaN(i))?0:i;};this.isDigit=function(s,config){return/^[-+]?\d*$/.test($.trim(s.replace(/[,.']/g,'')));};this.clearTableBody=function(table){if($.browser.msie){function empty(){while(this.firstChild)this.removeChild(this.firstChild);}empty.apply(table.tBodies[0]);}else{table.tBodies[0].innerHTML="";}};}});$.fn.extend({tablesorter:$.tablesorter.construct});var ts=$.tablesorter;ts.addParser({id:"text",is:function(s){return true;},format:function(s){return $.trim(s.toLocaleLowerCase());},type:"text"});ts.addParser({id:"digit",is:function(s,table){var c=table.config;return $.tablesorter.isDigit(s,c);},format:function(s){return $.tablesorter.formatFloat(s);},type:"numeric"});ts.addParser({id:"currency",is:function(s){return/^[£$€?.]/.test(s);},format:function(s){return $.tablesorter.formatFloat(s.replace(new RegExp(/[£$€]/g),""));},type:"numeric"});ts.addParser({id:"ipAddress",is:function(s){return/^\d{2,3}[\.]\d{2,3}[\.]\d{2,3}[\.]\d{2,3}$/.test(s);},format:function(s){var a=s.split("."),r="",l=a.length;for(var i=0;i<l;i++){var item=a[i];if(item.length==2){r+="0"+item;}else{r+=item;}}return $.tablesorter.formatFloat(r);},type:"numeric"});ts.addParser({id:"url",is:function(s){return/^(https?|ftp|file):\/\/$/.test(s);},format:function(s){return jQuery.trim(s.replace(new RegExp(/(https?|ftp|file):\/\//),''));},type:"text"});ts.addParser({id:"isoDate",is:function(s){return/^\d{4}[\/-]\d{1,2}[\/-]\d{1,2}$/.test(s);},format:function(s){return $.tablesorter.formatFloat((s!="")?new Date(s.replace(new RegExp(/-/g),"/")).getTime():"0");},type:"numeric"});ts.addParser({id:"percent",is:function(s){return/\%$/.test($.trim(s));},format:function(s){return $.tablesorter.formatFloat(s.replace(new RegExp(/%/g),""));},type:"numeric"});ts.addParser({id:"usLongDate",is:function(s){return s.match(new RegExp(/^[A-Za-z]{3,10}\.? 
[0-9]{1,2}, ([0-9]{4}|'?[0-9]{2}) (([0-2]?[0-9]:[0-5][0-9])|([0-1]?[0-9]:[0-5][0-9]\s(AM|PM)))$/));},format:function(s){return $.tablesorter.formatFloat(new Date(s).getTime());},type:"numeric"});ts.addParser({id:"shortDate",is:function(s){return/\d{1,2}[\/\-]\d{1,2}[\/\-]\d{2,4}/.test(s);},format:function(s,table){var c=table.config;s=s.replace(/\-/g,"/");if(c.dateFormat=="us"){s=s.replace(/(\d{1,2})[\/\-](\d{1,2})[\/\-](\d{4})/,"$3/$1/$2");}else if(c.dateFormat=="uk"){s=s.replace(/(\d{1,2})[\/\-](\d{1,2})[\/\-](\d{4})/,"$3/$2/$1");}else if(c.dateFormat=="dd/mm/yy"||c.dateFormat=="dd-mm-yy"){s=s.replace(/(\d{1,2})[\/\-](\d{1,2})[\/\-](\d{2})/,"$1/$2/$3");}return $.tablesorter.formatFloat(new Date(s).getTime());},type:"numeric"});ts.addParser({id:"time",is:function(s){return/^(([0-2]?[0-9]:[0-5][0-9])|([0-1]?[0-9]:[0-5][0-9]\s(am|pm)))$/.test(s);},format:function(s){return $.tablesorter.formatFloat(new Date("2000/01/01 "+s).getTime());},type:"numeric"});ts.addParser({id:"metadata",is:function(s){return false;},format:function(s,table,cell){var c=table.config,p=(!c.parserMetadataName)?'sortValue':c.parserMetadataName;return $(cell).metadata()[p];},type:"numeric"});ts.addWidget({id:"zebra",format:function(table){if(table.config.debug){var time=new Date();}var $tr,row=-1,odd;$("tr:visible",table.tBodies[0]).each(function(i){$tr=$(this);if(!$tr.hasClass(table.config.cssChildRow))row++;odd=(row%2==0);$tr.removeClass(table.config.widgetZebra.css[odd?0:1]).addClass(table.config.widgetZebra.css[odd?1:0])});if(table.config.debug){$.tablesorter.benchmark("Applying Zebra widget",time);}}});})(jQuery);
|
PypiClean
|
/karas_py-0.2.11.tar.gz/karas_py-0.2.11/karas/util/__init__.py
|
import hashlib
from time import time
from karas.exceptions import *
from karas.permission import PermissionEnum
class MetaBase(type):
def __new__(mcs, name, bases, namespace):
if not namespace.get("type"):
namespace["type"] = name
return super().__new__(mcs, name, bases, namespace)
class BaseModel(metaclass=MetaBase):
type: str
    def __init__(self, *args, **kws) -> None:
        self._data = kws or args
        for _k, _v in kws.items():
            # "from" is a Python keyword, so the field is exposed as "From".
            _k = _k if _k != "from" else "From"
            _type = self.__annotations__.get(_k)
            if _type and _v is not None and not isinstance(_v, _type):
                # Coerce raw payload values into their annotated types:
                # permission fields may arrive as enum names, message chains
                # as sequences, everything else as keyword mappings.
                if _k in ("origin", "current") and isinstance(_v, str):
                    _v = PermissionEnum[_v].value
                elif _k in ("origin", "current", "messageChain"):
                    _v = _type(*_v)
                else:
                    _v = _type(**_v)
            setattr(self, _k, _v)
    @classmethod
    def parse(cls, *_, **kwargs) -> "BaseModel":
        # Drop the "type" field (the metaclass derives it from the class
        # name) and hand the remaining fields to the constructor.
        _filter = {_K: _V for _K, _V in kwargs.items() if _K != "type"}
        return cls(**_filter)
    # Read-only view of the raw constructor payload.
    raw = property(lambda self: self._data)
def __str__(self) -> str:
return self.__dict__.__str__()
status_code_exception = {
0: None,
1: VerifyException,
2: BotNotFoundException,
3: SessionInvalidationException,
4: SessionUnauthorizedException,
5: TargetNotFoundException,
6: FileNotFoundException,
10: PermissionException,
20: BotMutedException,
30: MessageTooLongException,
400: InvalidArgumentException,
500: UnknownException
}
class DefaultNamespace:
def __init__(self) -> None:
pass
@classmethod
def gen(cls) -> str:
return hashlib.md5(str(time()).encode()).hexdigest()[:8]
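# ---------------------------------------------------------------------------
# Usage sketch (illustrative only, not part of the library): MetaBase stamps
# the class name into ``type``, and ``parse`` drops any incoming "type" field
# before construction.  ``_ExampleEvent`` is a hypothetical model.
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    class _ExampleEvent(BaseModel):
        id: int
    event = _ExampleEvent.parse(type="_ExampleEvent", id=42)
    print(event.type, event.id)  # -> _ExampleEvent 42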
|
PypiClean
|
/fdrtd_simon-0.5.7-py3-none-any.whl/fdrtd/plugins/simon/microprotocols/create.py
|
import fdrtd
import fdrtd.server.exceptions
def create(microprotocol):
if microprotocol == 'MinimumMaximum':
from fdrtd.plugins.simon.microprotocols.microprotocol_minimum_maximum import MicroprotocolMinimumMaximum
return MicroprotocolMinimumMaximum
if microprotocol == 'SecureSum':
from fdrtd.plugins.simon.microprotocols.microprotocol_secure_sum import MicroprotocolSecureSum
return MicroprotocolSecureSum
if microprotocol == 'SecureMatrixMultiplication':
from fdrtd.plugins.simon.microprotocols.microprotocol_secure_matrix_multiplication import MicroprotocolSecureMatrixMultiplication
return MicroprotocolSecureMatrixMultiplication
if microprotocol == 'SetIntersection':
from fdrtd.plugins.simon.microprotocols.microprotocol_set_intersection import MicroprotocolSetIntersection
return MicroprotocolSetIntersection
if microprotocol == 'SetIntersectionSize':
from fdrtd.plugins.simon.microprotocols.microprotocol_set_intersection_size import MicroprotocolSetIntersectionSize
return MicroprotocolSetIntersectionSize
if microprotocol == 'StatisticsBivariate':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_bivariate import MicroprotocolStatisticsBivariate
return MicroprotocolStatisticsBivariate
if microprotocol == 'StatisticsFrequency':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_frequency import MicroprotocolStatisticsFrequency
return MicroprotocolStatisticsFrequency
if microprotocol == 'StatisticsContingency':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_contingency import MicroprotocolStatisticsContingency
return MicroprotocolStatisticsContingency
if microprotocol == 'StatisticsUnivariate':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_univariate import MicroprotocolStatisticsUnivariate
return MicroprotocolStatisticsUnivariate
if microprotocol == 'StatisticsContingencyVertical':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_contingency_vertical import MicroprotocolStatisticsContingencyVertical
return MicroprotocolStatisticsContingencyVertical
if microprotocol == 'StatisticsRegressionOLSVertical':
from fdrtd.plugins.simon.microprotocols.microprotocol_statistics_regression_ols_vertical import MicroprotocolStatisticsRegressionOLSVertical
return MicroprotocolStatisticsRegressionOLSVertical
raise fdrtd.server.exceptions.NotAvailable(microprotocol)
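# ---------------------------------------------------------------------------
# Usage sketch (illustrative): ``create`` resolves a microprotocol class
# lazily by name and raises NotAvailable for unknown names.  Constructing an
# instance needs the surrounding Simon plugin machinery, so only the lookup
# is shown here.
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    secure_sum_cls = create('SecureSum')
    print(secure_sum_cls.__name__)  # -> MicroprotocolSecureSum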
|
PypiClean
|
/asyncssh-unofficial-0.9.2.tar.gz/asyncssh-unofficial-0.9.2/asyncssh/stream.py
|
import asyncio
from .constants import *
from .misc import *
from .channel import *
class SSHReader:
"""SSH read stream handler"""
def __init__(self, session, datatype=None):
self._session = session
self._chan = session._chan
self._datatype = datatype
@property
def channel(self):
"""The SSH channel associated with this stream"""
return self._chan
def get_extra_info(self, name, default=None):
"""Return additional information about this stream
This method returns extra information about the channel
associated with this stream.
"""
return self._chan.get_extra_info(name, default)
def read(self, n=-1):
"""Read data from the stream
This method is a coroutine which reads up to ``n`` bytes
or characters from the stream. If ``n`` is not provided or
set to ``-1``, it reads until EOF or until a signal is
received on the stream.
If EOF was received and the receive buffer is empty, an
empty ``bytes`` or ``string`` object is returned.
.. note:: Unlike traditional ``asyncio`` stream readers,
the data will be delivered as either bytes or
a string depending on whether an encoding was
specified when the underlying channel was opened.
"""
return self._session.read(n, self._datatype, exact=False)
def readline(self):
"""Read one line from the stream
This method is a coroutine which reads one line, ending in
``'\\n'``.
If EOF was received before ``'\\n'`` was found, the partial
line is returned. If EOF was received and the receive buffer
is empty, an empty ``bytes`` or ``string`` object is returned.
"""
return self._session.readline(self._datatype)
def readexactly(self, n):
"""Read an exact amount of data from the stream
This method is a coroutine which reads exactly n bytes or
characters from the stream.
If EOF is received before ``n`` bytes are read, an
:exc:`IncompleteReadError <asyncio.IncompleteReadError>` is
raised and its ``partial`` attribute contains the partially
read data.
"""
return self._session.read(n, self._datatype, exact=True)
def at_eof(self):
"""Return whether the stream is at EOF
This method returns ``True`` when EOF has been received and
all data in the stream has been read.
"""
return self._session._eof_received and \
not self._session._recv_buf[self._datatype]
class SSHWriter:
"""SSH write stream handler"""
def __init__(self, session, datatype=None):
self._session = session
self._chan = session._chan
self._datatype = datatype
@property
def channel(self):
"""The SSH channel associated with this stream"""
return self._chan
def get_extra_info(self, name, default=None):
"""Return additional information about this stream
This method returns extra information about the channel
associated with this stream.
"""
return self._chan.get_extra_info(name, default)
def can_write_eof(self):
"""Return whether the stream supports :meth:`write_eof`"""
return self._chan.can_write_eof()
def close(self):
"""Close the channel
.. note:: After this is called, no data can be read or written
from any of the streams associated with this channel.
"""
return self._chan.close()
@asyncio.coroutine
def drain(self):
"""Wait until the write buffer on the channel is flushed
This method is a coroutine which blocks the caller if the
stream is currently paused for writing, returning when
enough data has been sent on the channel to allow writing
to resume. This can be used to avoid buffering an excessive
amount of data in the channel's send buffer.
"""
return (yield from self._session.drain())
def write(self, data):
"""Write data to the stream
This method writes bytes or characters to the stream.
.. note:: Unlike traditional ``asyncio`` stream writers,
the data must be supplied as either bytes or
a string depending on whether an encoding was
specified when the underlying channel was opened.
"""
return self._chan.write(data, self._datatype)
def writelines(self, list_of_data):
"""Write a collection of data to the stream"""
return self._chan.writelines(list_of_data, self._datatype)
def write_eof(self):
"""Write EOF on the channel
This method sends an end-of-file indication on the channel,
after which no more data can be written.
.. note:: On an :class:`SSHServerChannel` where multiple
output streams are created, writing EOF on one
stream signals EOF for all of them, since it
applies to the channel as a whole.
"""
return self._chan.write_eof()
class SSHStreamSession:
"""SSH stream session handler"""
def __init__(self):
self._chan = None
self._exception = None
self._eof_received = False
self._connection_lost = False
self._recv_buf = { None: [] }
self._recv_buf_len = 0
self._read_waiter = { None: None }
self._write_paused = False
self._drain_waiters = []
@asyncio.coroutine
def _block_read(self, datatype):
if self._read_waiter[datatype]:
raise RuntimeError('read called while another coroutine is '
'already waiting to read')
waiter = asyncio.Future(loop=self._chan._loop)
self._read_waiter[datatype] = waiter
yield from waiter
def _unblock_read(self, datatype):
waiter = self._read_waiter[datatype]
if waiter:
waiter.set_result(None)
self._read_waiter[datatype] = None
def _unblock_drain(self):
for waiter in self._drain_waiters:
waiter.set_result(None)
self._drain_waiters = []
def connection_made(self, chan):
self._chan = chan
self._limit = self._chan._init_recv_window
for datatype in self._chan._read_datatypes:
self._recv_buf[datatype] = []
self._read_waiter[datatype] = None
def connection_lost(self, exc):
self._connection_lost = True
self._exception = exc
if not self._eof_received:
self.eof_received()
if self._write_paused:
self._unblock_drain()
def data_received(self, data, datatype):
self._recv_buf[datatype].append(data)
self._recv_buf_len += len(data)
self._unblock_read(datatype)
if self._recv_buf_len >= self._limit:
self._chan.pause_reading()
def eof_received(self):
self._eof_received = True
for datatype in self._read_waiter.keys():
self._unblock_read(datatype)
return True
def pause_writing(self):
self._write_paused = True
def resume_writing(self):
self._write_paused = False
self._unblock_drain()
@asyncio.coroutine
def read(self, n, datatype, exact):
recv_buf = self._recv_buf[datatype]
buf = '' if self._chan._encoding else b''
data = []
while True:
while recv_buf:
if isinstance(recv_buf[0], Exception):
if data:
break
else:
raise recv_buf.pop(0)
l = len(recv_buf[0])
if n > 0 and l > n:
data.append(recv_buf[0][:n])
recv_buf[0] = recv_buf[0][n:]
self._recv_buf_len -= n
n = 0
break
data.append(recv_buf.pop(0))
self._recv_buf_len -= l
n -= l
if n == 0 or (data and not exact) or self._eof_received:
break
yield from self._block_read(datatype)
buf = buf.join(data)
if n > 0 and exact:
raise asyncio.IncompleteReadError(buf, len(buf) + n)
return buf
@asyncio.coroutine
def readline(self, datatype):
recv_buf = self._recv_buf[datatype]
buf, sep = ('', '\n') if self._chan._encoding else (b'', b'\n')
data = []
while True:
while recv_buf:
if isinstance(recv_buf[0], Exception):
if data:
return buf.join(data)
else:
raise recv_buf.pop(0)
idx = recv_buf[0].find(sep) + 1
if idx > 0:
data.append(recv_buf[0][:idx])
recv_buf[0] = recv_buf[0][idx:]
self._recv_buf_len -= idx
return buf.join(data)
l = len(recv_buf[0])
data.append(recv_buf.pop(0))
self._recv_buf_len -= l
if self._eof_received:
return buf.join(data)
yield from self._block_read(datatype)
@asyncio.coroutine
def drain(self):
if self._write_paused and not self._connection_lost:
waiter = asyncio.Future(loop=self._chan._loop)
self._drain_waiters.append(waiter)
yield from waiter
if self._connection_lost:
exc = self._exception
if not exc and self._write_paused:
exc = BrokenPipeError()
raise exc
class SSHClientStreamSession(SSHStreamSession, SSHClientSession):
"""SSH client stream session handler"""
class SSHServerStreamSession(SSHStreamSession, SSHServerSession):
"""SSH server stream session handler"""
def __init__(self, handler_factory):
super().__init__()
self._handler_factory = handler_factory
def shell_requested(self):
return True
def exec_requested(self, command):
return True
def subsystem_requested(self, subsystem):
return True
def session_started(self):
if self._handler_factory:
handler = \
self._handler_factory(SSHReader(self), SSHWriter(self),
SSHWriter(self, EXTENDED_DATA_STDERR))
if asyncio.iscoroutine(handler):
asyncio.async(handler)
def break_received(self, msec):
self._recv_buf[None].append(BreakReceived(msec))
self._unblock_read(None)
return True
def signal_received(self, signal):
self._recv_buf[None].append(SignalReceived(signal))
self._unblock_read(None)
def terminal_size_changed(self, *args):
self._recv_buf[None].append(TerminalSizeChanged(*args))
self._unblock_read(None)
class SSHTCPStreamSession(SSHStreamSession, SSHTCPSession):
"""SSH TCP stream session handler"""
def __init__(self, handler_factory=None):
super().__init__()
self._handler_factory = handler_factory
def session_started(self):
if self._handler_factory:
handler = self._handler_factory(SSHReader(self), SSHWriter(self))
if asyncio.iscoroutine(handler):
asyncio.async(handler)
|
PypiClean
|
/great_expectations_cta-0.15.43.tar.gz/great_expectations_cta-0.15.43/great_expectations/experimental/datasources/metadatasource.py
|
from __future__ import annotations
import logging
from pprint import pformat as pf
from typing import TYPE_CHECKING, Set, Type
import pydantic
from great_expectations.experimental.datasources.sources import _SourceFactories
if TYPE_CHECKING:
from great_expectations.experimental.datasources.interfaces import Datasource
LOGGER = logging.getLogger(__name__)
class MetaDatasource(pydantic.main.ModelMetaclass):
__cls_set: Set[Type] = set()
def __new__(
meta_cls: Type[MetaDatasource], cls_name: str, bases: tuple[type], cls_dict
) -> MetaDatasource:
"""
MetaDatasource hook that runs when a new `Datasource` is defined.
        This method binds a factory method for the defined `Datasource` to the `_SourceFactories` class, which becomes
        available as part of the `DataContext`.
Also binds asset adding methods according to the declared `asset_types`.
"""
LOGGER.debug(f"1a. {meta_cls.__name__}.__new__() for `{cls_name}`")
cls = super().__new__(meta_cls, cls_name, bases, cls_dict)
if cls_name == "Datasource":
# NOTE: the above check is brittle and must be kept in-line with the Datasource.__name__
LOGGER.debug("1c. Skip factory registration of base `Datasource`")
return cls
LOGGER.debug(f" {cls_name} __dict__ ->\n{pf(cls.__dict__, depth=3)}")
meta_cls.__cls_set.add(cls)
LOGGER.info(f"Datasources: {len(meta_cls.__cls_set)}")
def _datasource_factory(name: str, **kwargs) -> Datasource:
# TODO: update signature to match Datasource __init__ (ex update __signature__)
LOGGER.info(f"5. Adding '{name}' {cls_name}")
return cls(name=name, **kwargs)
_datasource_factory.__doc__ = cls.__doc__
# TODO: generate schemas from `cls` if needed
if cls.__module__ == "__main__":
LOGGER.warning(
f"Datasource `{cls_name}` should not be defined as part of __main__ this may cause typing lookup collisions"
)
_SourceFactories.register_types_and_ds_factory(cls, _datasource_factory)
return cls
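# ---------------------------------------------------------------------------
# Usage sketch (kept as a comment, since merely defining a concrete subclass
# triggers factory registration; the class and fields are hypothetical):
#
#     class ExampleDatasource(Datasource):
#         type: str = "example"
#         asset_types = []
#
# Defining ExampleDatasource would run MetaDatasource.__new__, register a
# factory with _SourceFactories, and expose the source on the DataContext.
# ---------------------------------------------------------------------------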
|
PypiClean
|
/qrm_client-0.1.0.tar.gz/qrm_client-0.1.0/README.md
|
# qrm
Queue Resources Manager

## Build
[](https://codecov.io/gh/final-israel/qrm)


[](https://pypi.org/project/qrm-client/)
[](https://github.com/final-israel/vmn)
## Management Server
Add resource to qrm:
```bash
curl --header "Content-Type: application/json" --request POST --data '[{"name": "resource_2", "type": "server"}]' http://localhost:8080/add_resources
```
Remove resource from qrm:
```bash
curl --header "Content-Type: application/json" --request POST --data '[{"name": "resource_2", "type": "server"}]' http://localhost:8080/remove_resources
```
Set resource status:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"resource_name": "resource_2", "status": "active"}' http://localhost:8080/set_resource_status
curl --header "Content-Type: application/json" --request POST --data '{"resource_name": "resource_2", "status": "disabled"}' http://localhost:8080/set_resource_status
```
Add job to resource:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"resource_name": "resource_1", "job": {"token": 1, "job_name": "foo"}}' http://localhost:8080/add_job_to_resource
```
Remove job from resources:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"token": 1, "resources": ["resource_1"]}' http://localhost:8080/remove_job
```
Add tag to resource:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"resource_name": "resource_1", "tag": "foo"}' http://localhost:8080/add_tag_to_resource
```
Remove tag from resource:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"resource_name": "resource_1", "tag": "foo"}' http://localhost:8080/remove_tag_from_resource
```
Set server status (control the global qrm state):
```bash
curl --header "Content-Type: application/json" --request POST --data '{"status": "disabled"}' http://localhost:8080/set_server_status
curl --header "Content-Type: application/json" --request POST --data '{"status": "active"}' http://localhost:8080/set_server_status
```
Show the status of the server, its resources, and their jobs at this URL:
```console
http://127.0.0.1:8080/status
```
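The same endpoints can be driven from Python. A minimal sketch using the `requests` library (assuming `requests` is available; it is not a qrm dependency) against the default `localhost:8080`:
```python
import requests

BASE = "http://localhost:8080"

# Mirror the curl calls above: add a resource, then mark it active.
requests.post(f"{BASE}/add_resources",
              json=[{"name": "resource_2", "type": "server"}])
requests.post(f"{BASE}/set_resource_status",
              json={"resource_name": "resource_2", "status": "active"})
```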
## Communication Server
##### Server uptime info:
This web URL will show information about the server.
```console
http://127.0.0.1:5555/is_server_up
```
##### Is server up API:
```bash
curl http://localhost:5555/is_server_up
```
JSON response:
```json
{"status": true}
```
### API Version 1
#### To access the version 1 API, all API calls must end with the suffix "/v1"
Example: /new_request -> /new_request/v1
#### New_request:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"names": [{"names": ["r1"], "count": 1}], "tags": [], "token": "token1234"}' http://localhost:8080/new_request/v1
```
#### Get token status:
```bash
curl --header "Content-Type: application/json" http://localhost:8080/get_token_status/v1?token=<token>
```
#### Cancel token:
```bash
curl --header "Content-Type: application/json" --request POST --data '{"token": "token1234"}' http://localhost:8080/cancel_token/v1
```
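Putting the v1 calls together, a hedged Python sketch of a full request lifecycle (the `request_complete` field in the status response is an assumption about the schema):
```python
import time
import requests

BASE = "http://localhost:8080"
TOKEN = "token1234"

# Open a request for one resource named "r1".
requests.post(f"{BASE}/new_request/v1",
              json={"names": [{"names": ["r1"], "count": 1}],
                    "tags": [], "token": TOKEN})

# Poll the token status until the request is filled (field name assumed).
while True:
    status = requests.get(f"{BASE}/get_token_status/v1",
                          params={"token": TOKEN}).json()
    if status.get("request_complete"):
        break
    time.sleep(1)

# Release the held resources when done.
requests.post(f"{BASE}/cancel_token/v1", json={"token": TOKEN})
```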
|
PypiClean
|
/cdktf-cdktf-provider-google-9.0.1.tar.gz/cdktf-cdktf-provider-google-9.0.1/src/cdktf_cdktf_provider_google/data_catalog_policy_tag_iam_member/__init__.py
|
import abc
import builtins
import datetime
import enum
import typing
import jsii
import publication
import typing_extensions
from typeguard import check_type
from .._jsii import *
import cdktf as _cdktf_9a9027ec
import constructs as _constructs_77d1e7e8
class DataCatalogPolicyTagIamMember(
_cdktf_9a9027ec.TerraformResource,
metaclass=jsii.JSIIMeta,
jsii_type="@cdktf/provider-google.dataCatalogPolicyTagIamMember.DataCatalogPolicyTagIamMember",
):
'''Represents a {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member google_data_catalog_policy_tag_iam_member}.'''
def __init__(
self,
scope: _constructs_77d1e7e8.Construct,
id_: builtins.str,
*,
member: builtins.str,
policy_tag: builtins.str,
role: builtins.str,
condition: typing.Optional[typing.Union["DataCatalogPolicyTagIamMemberCondition", typing.Dict[builtins.str, typing.Any]]] = None,
id: typing.Optional[builtins.str] = None,
connection: typing.Optional[typing.Union[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.WinrmProvisionerConnection, typing.Dict[builtins.str, typing.Any]]]] = None,
count: typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]] = None,
depends_on: typing.Optional[typing.Sequence[_cdktf_9a9027ec.ITerraformDependable]] = None,
for_each: typing.Optional[_cdktf_9a9027ec.ITerraformIterator] = None,
lifecycle: typing.Optional[typing.Union[_cdktf_9a9027ec.TerraformResourceLifecycle, typing.Dict[builtins.str, typing.Any]]] = None,
provider: typing.Optional[_cdktf_9a9027ec.TerraformProvider] = None,
provisioners: typing.Optional[typing.Sequence[typing.Union[typing.Union[_cdktf_9a9027ec.FileProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.LocalExecProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.RemoteExecProvisioner, typing.Dict[builtins.str, typing.Any]]]]] = None,
) -> None:
'''Create a new {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member google_data_catalog_policy_tag_iam_member} Resource.
:param scope: The scope in which to define this construct.
:param id_: The scoped construct ID. Must be unique amongst siblings in the same scope
:param member: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#member DataCatalogPolicyTagIamMember#member}.
:param policy_tag: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#policy_tag DataCatalogPolicyTagIamMember#policy_tag}.
:param role: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#role DataCatalogPolicyTagIamMember#role}.
:param condition: condition block. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#condition DataCatalogPolicyTagIamMember#condition}
:param id: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#id DataCatalogPolicyTagIamMember#id}. Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
:param connection:
:param count:
:param depends_on:
:param for_each:
:param lifecycle:
:param provider:
:param provisioners:
'''
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__c1f18e43bc794c32462bb9dbdeb3da0337f0d77164e4070c9ef7612d259aca60)
check_type(argname="argument scope", value=scope, expected_type=type_hints["scope"])
check_type(argname="argument id_", value=id_, expected_type=type_hints["id_"])
config = DataCatalogPolicyTagIamMemberConfig(
member=member,
policy_tag=policy_tag,
role=role,
condition=condition,
id=id,
connection=connection,
count=count,
depends_on=depends_on,
for_each=for_each,
lifecycle=lifecycle,
provider=provider,
provisioners=provisioners,
)
jsii.create(self.__class__, self, [scope, id_, config])
@jsii.member(jsii_name="putCondition")
def put_condition(
self,
*,
expression: builtins.str,
title: builtins.str,
description: typing.Optional[builtins.str] = None,
) -> None:
'''
:param expression: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#expression DataCatalogPolicyTagIamMember#expression}.
:param title: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#title DataCatalogPolicyTagIamMember#title}.
:param description: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#description DataCatalogPolicyTagIamMember#description}.
'''
value = DataCatalogPolicyTagIamMemberCondition(
expression=expression, title=title, description=description
)
return typing.cast(None, jsii.invoke(self, "putCondition", [value]))
@jsii.member(jsii_name="resetCondition")
def reset_condition(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetCondition", []))
@jsii.member(jsii_name="resetId")
def reset_id(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetId", []))
@jsii.member(jsii_name="synthesizeAttributes")
def _synthesize_attributes(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "synthesizeAttributes", []))
@jsii.python.classproperty
@jsii.member(jsii_name="tfResourceType")
def TF_RESOURCE_TYPE(cls) -> builtins.str:
return typing.cast(builtins.str, jsii.sget(cls, "tfResourceType"))
@builtins.property
@jsii.member(jsii_name="condition")
def condition(self) -> "DataCatalogPolicyTagIamMemberConditionOutputReference":
return typing.cast("DataCatalogPolicyTagIamMemberConditionOutputReference", jsii.get(self, "condition"))
@builtins.property
@jsii.member(jsii_name="etag")
def etag(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "etag"))
@builtins.property
@jsii.member(jsii_name="conditionInput")
def condition_input(
self,
) -> typing.Optional["DataCatalogPolicyTagIamMemberCondition"]:
return typing.cast(typing.Optional["DataCatalogPolicyTagIamMemberCondition"], jsii.get(self, "conditionInput"))
@builtins.property
@jsii.member(jsii_name="idInput")
def id_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "idInput"))
@builtins.property
@jsii.member(jsii_name="memberInput")
def member_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "memberInput"))
@builtins.property
@jsii.member(jsii_name="policyTagInput")
def policy_tag_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "policyTagInput"))
@builtins.property
@jsii.member(jsii_name="roleInput")
def role_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "roleInput"))
@builtins.property
@jsii.member(jsii_name="id")
def id(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "id"))
@id.setter
def id(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__0760525791b591f10c8ed894533c308b1bce6f4fca73cc3d2928a60a37ca8f7b)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "id", value)
@builtins.property
@jsii.member(jsii_name="member")
def member(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "member"))
@member.setter
def member(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__89b5a7b95dc40403e14fc396612d0aeff0f350ccedaafa6427c2950e45320e1f)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "member", value)
@builtins.property
@jsii.member(jsii_name="policyTag")
def policy_tag(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "policyTag"))
@policy_tag.setter
def policy_tag(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__77ce1569ef073181705cf87512f3efc4e01c250e95be5fd252dbd8faa9114673)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "policyTag", value)
@builtins.property
@jsii.member(jsii_name="role")
def role(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "role"))
@role.setter
def role(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__05f247af8a2654f71bd28634e1a5b64cb413bf52be6125fca9a5daec0093b09a)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "role", value)
@jsii.data_type(
jsii_type="@cdktf/provider-google.dataCatalogPolicyTagIamMember.DataCatalogPolicyTagIamMemberCondition",
jsii_struct_bases=[],
name_mapping={
"expression": "expression",
"title": "title",
"description": "description",
},
)
class DataCatalogPolicyTagIamMemberCondition:
def __init__(
self,
*,
expression: builtins.str,
title: builtins.str,
description: typing.Optional[builtins.str] = None,
) -> None:
'''
:param expression: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#expression DataCatalogPolicyTagIamMember#expression}.
:param title: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#title DataCatalogPolicyTagIamMember#title}.
:param description: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#description DataCatalogPolicyTagIamMember#description}.
'''
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__993d6a56cca5bc28030db6ad40c472ef55b79c49cb5e6773ae24454a04569aac)
check_type(argname="argument expression", value=expression, expected_type=type_hints["expression"])
check_type(argname="argument title", value=title, expected_type=type_hints["title"])
check_type(argname="argument description", value=description, expected_type=type_hints["description"])
self._values: typing.Dict[builtins.str, typing.Any] = {
"expression": expression,
"title": title,
}
if description is not None:
self._values["description"] = description
@builtins.property
def expression(self) -> builtins.str:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#expression DataCatalogPolicyTagIamMember#expression}.'''
result = self._values.get("expression")
assert result is not None, "Required property 'expression' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def title(self) -> builtins.str:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#title DataCatalogPolicyTagIamMember#title}.'''
result = self._values.get("title")
assert result is not None, "Required property 'title' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def description(self) -> typing.Optional[builtins.str]:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#description DataCatalogPolicyTagIamMember#description}.'''
result = self._values.get("description")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "DataCatalogPolicyTagIamMemberCondition(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class DataCatalogPolicyTagIamMemberConditionOutputReference(
_cdktf_9a9027ec.ComplexObject,
metaclass=jsii.JSIIMeta,
jsii_type="@cdktf/provider-google.dataCatalogPolicyTagIamMember.DataCatalogPolicyTagIamMemberConditionOutputReference",
):
def __init__(
self,
terraform_resource: _cdktf_9a9027ec.IInterpolatingParent,
terraform_attribute: builtins.str,
) -> None:
'''
:param terraform_resource: The parent resource.
:param terraform_attribute: The attribute on the parent resource this class is referencing.
'''
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__8bc82e1923f9cab71cea6e98b840a31105baf5b9ae520daef91471e36dda26da)
check_type(argname="argument terraform_resource", value=terraform_resource, expected_type=type_hints["terraform_resource"])
check_type(argname="argument terraform_attribute", value=terraform_attribute, expected_type=type_hints["terraform_attribute"])
jsii.create(self.__class__, self, [terraform_resource, terraform_attribute])
@jsii.member(jsii_name="resetDescription")
def reset_description(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetDescription", []))
@builtins.property
@jsii.member(jsii_name="descriptionInput")
def description_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "descriptionInput"))
@builtins.property
@jsii.member(jsii_name="expressionInput")
def expression_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "expressionInput"))
@builtins.property
@jsii.member(jsii_name="titleInput")
def title_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "titleInput"))
@builtins.property
@jsii.member(jsii_name="description")
def description(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "description"))
@description.setter
def description(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__c46af1ae4610d09b0cdfec45bedace254063bd3d17c2b3055d0715c5f01e8df7)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "description", value)
@builtins.property
@jsii.member(jsii_name="expression")
def expression(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "expression"))
@expression.setter
def expression(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__658a1963b042e9676652b9ba45ae72e79b2bba97fea8e9a976a491c366774a26)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "expression", value)
@builtins.property
@jsii.member(jsii_name="title")
def title(self) -> builtins.str:
return typing.cast(builtins.str, jsii.get(self, "title"))
@title.setter
def title(self, value: builtins.str) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__58debdc1c79d1e0351bcd0ba36694b27c48f95fef3f718fbc3e8c9faca7fceb1)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "title", value)
@builtins.property
@jsii.member(jsii_name="internalValue")
def internal_value(self) -> typing.Optional[DataCatalogPolicyTagIamMemberCondition]:
return typing.cast(typing.Optional[DataCatalogPolicyTagIamMemberCondition], jsii.get(self, "internalValue"))
@internal_value.setter
def internal_value(
self,
value: typing.Optional[DataCatalogPolicyTagIamMemberCondition],
) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__a530122b69a252023bb9045b033830b0a138d112d938ab441e9a4ef5780c9c39)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "internalValue", value)
@jsii.data_type(
jsii_type="@cdktf/provider-google.dataCatalogPolicyTagIamMember.DataCatalogPolicyTagIamMemberConfig",
jsii_struct_bases=[_cdktf_9a9027ec.TerraformMetaArguments],
name_mapping={
"connection": "connection",
"count": "count",
"depends_on": "dependsOn",
"for_each": "forEach",
"lifecycle": "lifecycle",
"provider": "provider",
"provisioners": "provisioners",
"member": "member",
"policy_tag": "policyTag",
"role": "role",
"condition": "condition",
"id": "id",
},
)
class DataCatalogPolicyTagIamMemberConfig(_cdktf_9a9027ec.TerraformMetaArguments):
def __init__(
self,
*,
connection: typing.Optional[typing.Union[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.WinrmProvisionerConnection, typing.Dict[builtins.str, typing.Any]]]] = None,
count: typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]] = None,
depends_on: typing.Optional[typing.Sequence[_cdktf_9a9027ec.ITerraformDependable]] = None,
for_each: typing.Optional[_cdktf_9a9027ec.ITerraformIterator] = None,
lifecycle: typing.Optional[typing.Union[_cdktf_9a9027ec.TerraformResourceLifecycle, typing.Dict[builtins.str, typing.Any]]] = None,
provider: typing.Optional[_cdktf_9a9027ec.TerraformProvider] = None,
provisioners: typing.Optional[typing.Sequence[typing.Union[typing.Union[_cdktf_9a9027ec.FileProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.LocalExecProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.RemoteExecProvisioner, typing.Dict[builtins.str, typing.Any]]]]] = None,
member: builtins.str,
policy_tag: builtins.str,
role: builtins.str,
condition: typing.Optional[typing.Union[DataCatalogPolicyTagIamMemberCondition, typing.Dict[builtins.str, typing.Any]]] = None,
id: typing.Optional[builtins.str] = None,
) -> None:
'''
:param connection:
:param count:
:param depends_on:
:param for_each:
:param lifecycle:
:param provider:
:param provisioners:
:param member: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#member DataCatalogPolicyTagIamMember#member}.
:param policy_tag: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#policy_tag DataCatalogPolicyTagIamMember#policy_tag}.
:param role: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#role DataCatalogPolicyTagIamMember#role}.
:param condition: condition block. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#condition DataCatalogPolicyTagIamMember#condition}
:param id: Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#id DataCatalogPolicyTagIamMember#id}. Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
'''
if isinstance(lifecycle, dict):
lifecycle = _cdktf_9a9027ec.TerraformResourceLifecycle(**lifecycle)
if isinstance(condition, dict):
condition = DataCatalogPolicyTagIamMemberCondition(**condition)
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__e43f2cad212e48e01232b0b3e08b49ed40b295eef97c8b374e2cbb0b73b13091)
check_type(argname="argument connection", value=connection, expected_type=type_hints["connection"])
check_type(argname="argument count", value=count, expected_type=type_hints["count"])
check_type(argname="argument depends_on", value=depends_on, expected_type=type_hints["depends_on"])
check_type(argname="argument for_each", value=for_each, expected_type=type_hints["for_each"])
check_type(argname="argument lifecycle", value=lifecycle, expected_type=type_hints["lifecycle"])
check_type(argname="argument provider", value=provider, expected_type=type_hints["provider"])
check_type(argname="argument provisioners", value=provisioners, expected_type=type_hints["provisioners"])
check_type(argname="argument member", value=member, expected_type=type_hints["member"])
check_type(argname="argument policy_tag", value=policy_tag, expected_type=type_hints["policy_tag"])
check_type(argname="argument role", value=role, expected_type=type_hints["role"])
check_type(argname="argument condition", value=condition, expected_type=type_hints["condition"])
check_type(argname="argument id", value=id, expected_type=type_hints["id"])
self._values: typing.Dict[builtins.str, typing.Any] = {
"member": member,
"policy_tag": policy_tag,
"role": role,
}
if connection is not None:
self._values["connection"] = connection
if count is not None:
self._values["count"] = count
if depends_on is not None:
self._values["depends_on"] = depends_on
if for_each is not None:
self._values["for_each"] = for_each
if lifecycle is not None:
self._values["lifecycle"] = lifecycle
if provider is not None:
self._values["provider"] = provider
if provisioners is not None:
self._values["provisioners"] = provisioners
if condition is not None:
self._values["condition"] = condition
if id is not None:
self._values["id"] = id
@builtins.property
def connection(
self,
) -> typing.Optional[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, _cdktf_9a9027ec.WinrmProvisionerConnection]]:
'''
:stability: experimental
'''
result = self._values.get("connection")
return typing.cast(typing.Optional[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, _cdktf_9a9027ec.WinrmProvisionerConnection]], result)
@builtins.property
def count(
self,
) -> typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]]:
'''
:stability: experimental
'''
result = self._values.get("count")
return typing.cast(typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]], result)
@builtins.property
def depends_on(
self,
) -> typing.Optional[typing.List[_cdktf_9a9027ec.ITerraformDependable]]:
'''
:stability: experimental
'''
result = self._values.get("depends_on")
return typing.cast(typing.Optional[typing.List[_cdktf_9a9027ec.ITerraformDependable]], result)
@builtins.property
def for_each(self) -> typing.Optional[_cdktf_9a9027ec.ITerraformIterator]:
'''
:stability: experimental
'''
result = self._values.get("for_each")
return typing.cast(typing.Optional[_cdktf_9a9027ec.ITerraformIterator], result)
@builtins.property
def lifecycle(self) -> typing.Optional[_cdktf_9a9027ec.TerraformResourceLifecycle]:
'''
:stability: experimental
'''
result = self._values.get("lifecycle")
return typing.cast(typing.Optional[_cdktf_9a9027ec.TerraformResourceLifecycle], result)
@builtins.property
def provider(self) -> typing.Optional[_cdktf_9a9027ec.TerraformProvider]:
'''
:stability: experimental
'''
result = self._values.get("provider")
return typing.cast(typing.Optional[_cdktf_9a9027ec.TerraformProvider], result)
@builtins.property
def provisioners(
self,
) -> typing.Optional[typing.List[typing.Union[_cdktf_9a9027ec.FileProvisioner, _cdktf_9a9027ec.LocalExecProvisioner, _cdktf_9a9027ec.RemoteExecProvisioner]]]:
'''
:stability: experimental
'''
result = self._values.get("provisioners")
return typing.cast(typing.Optional[typing.List[typing.Union[_cdktf_9a9027ec.FileProvisioner, _cdktf_9a9027ec.LocalExecProvisioner, _cdktf_9a9027ec.RemoteExecProvisioner]]], result)
@builtins.property
def member(self) -> builtins.str:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#member DataCatalogPolicyTagIamMember#member}.'''
result = self._values.get("member")
assert result is not None, "Required property 'member' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def policy_tag(self) -> builtins.str:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#policy_tag DataCatalogPolicyTagIamMember#policy_tag}.'''
result = self._values.get("policy_tag")
assert result is not None, "Required property 'policy_tag' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def role(self) -> builtins.str:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#role DataCatalogPolicyTagIamMember#role}.'''
result = self._values.get("role")
assert result is not None, "Required property 'role' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def condition(self) -> typing.Optional[DataCatalogPolicyTagIamMemberCondition]:
'''condition block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#condition DataCatalogPolicyTagIamMember#condition}
'''
result = self._values.get("condition")
return typing.cast(typing.Optional[DataCatalogPolicyTagIamMemberCondition], result)
@builtins.property
def id(self) -> typing.Optional[builtins.str]:
'''Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/4.80.0/docs/resources/data_catalog_policy_tag_iam_member#id DataCatalogPolicyTagIamMember#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2.
If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
'''
result = self._values.get("id")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "DataCatalogPolicyTagIamMemberConfig(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
__all__ = [
"DataCatalogPolicyTagIamMember",
"DataCatalogPolicyTagIamMemberCondition",
"DataCatalogPolicyTagIamMemberConditionOutputReference",
"DataCatalogPolicyTagIamMemberConfig",
]
publication.publish()
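# ---------------------------------------------------------------------------
# Usage sketch (illustrative, not generated code): binding a member to a
# policy tag from a cdktf stack.  The stack name and all argument values
# below are hypothetical.
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    from cdktf import App, TerraformStack
    app = App()
    stack = TerraformStack(app, "example-stack")
    DataCatalogPolicyTagIamMember(
        stack, "example-member",
        member="user:jane@example.com",
        policy_tag="projects/example/locations/us/taxonomies/123/policyTags/456",
        role="roles/datacatalog.categoryFineGrainedReader",
    )
    app.synth()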
def _typecheckingstub__c1f18e43bc794c32462bb9dbdeb3da0337f0d77164e4070c9ef7612d259aca60(
scope: _constructs_77d1e7e8.Construct,
id_: builtins.str,
*,
member: builtins.str,
policy_tag: builtins.str,
role: builtins.str,
condition: typing.Optional[typing.Union[DataCatalogPolicyTagIamMemberCondition, typing.Dict[builtins.str, typing.Any]]] = None,
id: typing.Optional[builtins.str] = None,
connection: typing.Optional[typing.Union[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.WinrmProvisionerConnection, typing.Dict[builtins.str, typing.Any]]]] = None,
count: typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]] = None,
depends_on: typing.Optional[typing.Sequence[_cdktf_9a9027ec.ITerraformDependable]] = None,
for_each: typing.Optional[_cdktf_9a9027ec.ITerraformIterator] = None,
lifecycle: typing.Optional[typing.Union[_cdktf_9a9027ec.TerraformResourceLifecycle, typing.Dict[builtins.str, typing.Any]]] = None,
provider: typing.Optional[_cdktf_9a9027ec.TerraformProvider] = None,
provisioners: typing.Optional[typing.Sequence[typing.Union[typing.Union[_cdktf_9a9027ec.FileProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.LocalExecProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.RemoteExecProvisioner, typing.Dict[builtins.str, typing.Any]]]]] = None,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__0760525791b591f10c8ed894533c308b1bce6f4fca73cc3d2928a60a37ca8f7b(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__89b5a7b95dc40403e14fc396612d0aeff0f350ccedaafa6427c2950e45320e1f(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__77ce1569ef073181705cf87512f3efc4e01c250e95be5fd252dbd8faa9114673(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__05f247af8a2654f71bd28634e1a5b64cb413bf52be6125fca9a5daec0093b09a(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__993d6a56cca5bc28030db6ad40c472ef55b79c49cb5e6773ae24454a04569aac(
*,
expression: builtins.str,
title: builtins.str,
description: typing.Optional[builtins.str] = None,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__8bc82e1923f9cab71cea6e98b840a31105baf5b9ae520daef91471e36dda26da(
terraform_resource: _cdktf_9a9027ec.IInterpolatingParent,
terraform_attribute: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__c46af1ae4610d09b0cdfec45bedace254063bd3d17c2b3055d0715c5f01e8df7(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__658a1963b042e9676652b9ba45ae72e79b2bba97fea8e9a976a491c366774a26(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__58debdc1c79d1e0351bcd0ba36694b27c48f95fef3f718fbc3e8c9faca7fceb1(
value: builtins.str,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__a530122b69a252023bb9045b033830b0a138d112d938ab441e9a4ef5780c9c39(
value: typing.Optional[DataCatalogPolicyTagIamMemberCondition],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__e43f2cad212e48e01232b0b3e08b49ed40b295eef97c8b374e2cbb0b73b13091(
*,
connection: typing.Optional[typing.Union[typing.Union[_cdktf_9a9027ec.SSHProvisionerConnection, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.WinrmProvisionerConnection, typing.Dict[builtins.str, typing.Any]]]] = None,
count: typing.Optional[typing.Union[jsii.Number, _cdktf_9a9027ec.TerraformCount]] = None,
depends_on: typing.Optional[typing.Sequence[_cdktf_9a9027ec.ITerraformDependable]] = None,
for_each: typing.Optional[_cdktf_9a9027ec.ITerraformIterator] = None,
lifecycle: typing.Optional[typing.Union[_cdktf_9a9027ec.TerraformResourceLifecycle, typing.Dict[builtins.str, typing.Any]]] = None,
provider: typing.Optional[_cdktf_9a9027ec.TerraformProvider] = None,
provisioners: typing.Optional[typing.Sequence[typing.Union[typing.Union[_cdktf_9a9027ec.FileProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.LocalExecProvisioner, typing.Dict[builtins.str, typing.Any]], typing.Union[_cdktf_9a9027ec.RemoteExecProvisioner, typing.Dict[builtins.str, typing.Any]]]]] = None,
member: builtins.str,
policy_tag: builtins.str,
role: builtins.str,
condition: typing.Optional[typing.Union[DataCatalogPolicyTagIamMemberCondition, typing.Dict[builtins.str, typing.Any]]] = None,
id: typing.Optional[builtins.str] = None,
) -> None:
"""Type checking stubs"""
pass
|
PypiClean
|
/ryu-4.34.tar.gz/ryu-4.34/doc/source/writing_ryu_app.rst
|
*********************
The First Application
*********************
Whetting Your Appetite
======================
If you want to manage network gear (switches, routers, etc.) your
own way, you just need to write your own Ryu application. Your application
tells Ryu how you want to manage the gear. Then Ryu configures the
gear by using the OpenFlow protocol, etc.
Writing Ryu applications is easy. They're just Python scripts.
Start Writing
=============
Here we show a Ryu application that makes an OpenFlow switch work as a dumb
layer 2 switch.
Open a text editor and create a new file with the following content:
.. code-block:: python
from ryu.base import app_manager
class L2Switch(app_manager.RyuApp):
def __init__(self, *args, **kwargs):
super(L2Switch, self).__init__(*args, **kwargs)
Ryu applications are just Python scripts, so you can save the file with
any name, any extension, and in any place you want. Let's name the file
'l2.py' and put it in your home directory.
This application does nothing useful yet; however, it's a complete Ryu
application. In fact, you can run this Ryu application::
% ryu-manager ~/l2.py
loading app /Users/fujita/l2.py
instantiating app /Users/fujita/l2.py
All you have to do is define a new subclass of RyuApp to run
your Python script as a Ryu application.
Next let's add some functionality that sends a received packet to all
the ports.
.. code-block:: python
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER
from ryu.controller.handler import set_ev_cls
from ryu.ofproto import ofproto_v1_0
class L2Switch(app_manager.RyuApp):
OFP_VERSIONS = [ofproto_v1_0.OFP_VERSION]
def __init__(self, *args, **kwargs):
super(L2Switch, self).__init__(*args, **kwargs)
@set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
def packet_in_handler(self, ev):
msg = ev.msg
dp = msg.datapath
ofp = dp.ofproto
ofp_parser = dp.ofproto_parser
actions = [ofp_parser.OFPActionOutput(ofp.OFPP_FLOOD)]
out = ofp_parser.OFPPacketOut(
datapath=dp, buffer_id=msg.buffer_id, in_port=msg.in_port,
actions=actions)
dp.send_msg(out)
A new method 'packet_in_handler' is added to the L2Switch class. This is
called when Ryu receives an OpenFlow packet_in message. The trick is the
'set_ev_cls' decorator. This decorator tells Ryu when the decorated
function should be called.
The first argument of the decorator indicates which type of event this
function should be called for. As you might expect, every time Ryu gets a
packet_in message, this function is called.
The second argument indicates the state of the switch. You probably
want to ignore packet_in messages before the negotiation between Ryu
and the switch is finished. Using 'MAIN_DISPATCHER' as the second
argument means this function is called only after the negotiation
completes.
Next let's look at the first half of the 'packet_in_handler' function.
* ev.msg is an object that represents a packet_in data structure.
* msg.datapath is an object that represents a datapath (switch).
* dp.ofproto and dp.ofproto_parser are objects that represent the
OpenFlow protocol that Ryu and the switch negotiated.
Ready for the second half.
* OFPActionOutput class is used with a packet_out message to specify a
switch port that you want to send the packet out of. This
application uses the OFPP_FLOOD flag to indicate that the packet should
be sent out on all ports.
* OFPPacketOut class is used to build a packet_out message.
* If you call the Datapath class's send_msg method with an OpenFlow message
  class object, Ryu builds and sends the on-wire data format to the switch.
That's it; you have finished implementing your first Ryu application. You are
ready to run a Ryu application that does something useful.
Is a dumb L2 switch too dumb? Do you want to implement a learning L2
switch? Move to `the next step
<https://github.com/osrg/ryu/blob/master/ryu/app/simple_switch.py>`_. You
can learn from the existing Ryu applications at `ryu/app
<https://github.com/osrg/ryu/blob/master/ryu/app/>`_ directory and
`integrated tests
<https://github.com/osrg/ryu/blob/master/ryu/tests/integrated/>`_
directory.
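If you are curious what that next step looks like, here is a minimal,
untested sketch of just the learning part. It assumes the same OpenFlow 1.0
setup as above, plus a 'mac_to_port' dict initialized to {} in '__init__',
and uses Ryu's packet library to parse the Ethernet header.
.. code-block:: python

    from ryu.lib.packet import packet, ethernet

    # inside packet_in_handler, before building the packet_out message
    pkt = packet.Packet(msg.data)
    eth = pkt.get_protocol(ethernet.ethernet)

    # remember which port this source MAC address was seen on
    self.mac_to_port[eth.src] = msg.in_port

    # send straight to the destination if we have learned its port,
    # otherwise flood as before
    out_port = self.mac_to_port.get(eth.dst, ofp.OFPP_FLOOD)
    actions = [ofp_parser.OFPActionOutput(out_port)]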
|
PypiClean
|
/SCRIdb-1.2.11a2.tar.gz/SCRIdb-1.2.11a2/.eggs/jmespath-0.10.0-py3.7.egg/jmespath/ast.py
|
def comparator(name, first, second):
return {'type': 'comparator', 'children': [first, second], 'value': name}
def current_node():
return {'type': 'current', 'children': []}
def expref(expression):
return {'type': 'expref', 'children': [expression]}
def function_expression(name, args):
return {'type': 'function_expression', 'children': args, 'value': name}
def field(name):
return {"type": "field", "children": [], "value": name}
def filter_projection(left, right, comparator):
return {'type': 'filter_projection', 'children': [left, right, comparator]}
def flatten(node):
return {'type': 'flatten', 'children': [node]}
def identity():
return {"type": "identity", 'children': []}
def index(index):
return {"type": "index", "value": index, "children": []}
def index_expression(children):
return {"type": "index_expression", 'children': children}
def key_val_pair(key_name, node):
return {"type": "key_val_pair", 'children': [node], "value": key_name}
def literal(literal_value):
return {'type': 'literal', 'value': literal_value, 'children': []}
def multi_select_dict(nodes):
return {"type": "multi_select_dict", "children": nodes}
def multi_select_list(nodes):
return {"type": "multi_select_list", "children": nodes}
def or_expression(left, right):
return {"type": "or_expression", "children": [left, right]}
def and_expression(left, right):
return {"type": "and_expression", "children": [left, right]}
def not_expression(expr):
return {"type": "not_expression", "children": [expr]}
def pipe(left, right):
return {'type': 'pipe', 'children': [left, right]}
def projection(left, right):
return {'type': 'projection', 'children': [left, right]}
def subexpression(children):
return {"type": "subexpression", 'children': children}
def slice(start, end, step):
return {"type": "slice", "children": [start, end, step]}
def value_projection(left, right):
return {'type': 'value_projection', 'children': [left, right]}
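# Composition sketch (illustrative; these constructors are normally driven by
# jmespath's parser rather than called directly). The expression "foo.bar"
# parses to a plain-dict AST along the lines of:
#
#   subexpression([field('foo'), field('bar')])
#
# which expands to
#
#   {'type': 'subexpression', 'children': [
#       {'type': 'field', 'children': [], 'value': 'foo'},
#       {'type': 'field', 'children': [], 'value': 'bar'}]}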
|
PypiClean
|
/xs_transformers-1.0.7-py3-none-any.whl/xs_transformers/models/transfo_xl/modeling_tf_transfo_xl.py
|
from dataclasses import dataclass
from typing import List, Optional, Tuple, Union
import numpy as np
import tensorflow as tf
from ...modeling_tf_utils import (
TFModelInputType,
TFPreTrainedModel,
TFSequenceClassificationLoss,
get_initializer,
keras_serializable,
unpack_inputs,
)
from ...tf_utils import shape_list, stable_softmax
from ...utils import (
ModelOutput,
add_code_sample_docstrings,
add_start_docstrings,
add_start_docstrings_to_model_forward,
logging,
)
from .configuration_transfo_xl import TransfoXLConfig
from .modeling_tf_transfo_xl_utilities import TFAdaptiveSoftmaxMask
logger = logging.get_logger(__name__)
_CHECKPOINT_FOR_DOC = "transfo-xl-wt103"
_CONFIG_FOR_DOC = "TransfoXLConfig"
_TOKENIZER_FOR_DOC = "TransfoXLTokenizer"
TF_TRANSFO_XL_PRETRAINED_MODEL_ARCHIVE_LIST = [
"transfo-xl-wt103",
# See all Transformer XL models at https://huggingface.co/models?filter=transfo-xl
]
class TFPositionalEmbedding(tf.keras.layers.Layer):
    def __init__(self, demb, **kwargs):
        super().__init__(**kwargs)
        # standard sinusoidal inverse frequencies: 1 / 10000^(2i / d_emb)
        self.inv_freq = 1 / (10000 ** (tf.range(0, demb, 2.0) / demb))
def call(self, pos_seq, bsz=None):
self.inv_freq = tf.cast(self.inv_freq, dtype=pos_seq.dtype)
sinusoid_inp = tf.einsum("i,j->ij", pos_seq, self.inv_freq)
pos_emb = tf.concat([tf.sin(sinusoid_inp), tf.cos(sinusoid_inp)], -1)
if bsz is not None:
return tf.tile(pos_emb[:, None, :], [1, bsz, 1])
else:
return pos_emb[:, None, :]
class TFPositionwiseFF(tf.keras.layers.Layer):
def __init__(
self,
d_model,
d_inner,
dropout,
pre_lnorm=False,
layer_norm_epsilon=1e-5,
init_std=0.02,
**kwargs,
):
super().__init__(**kwargs)
self.d_model = d_model
self.d_inner = d_inner
self.dropout = dropout
self.layer_1 = tf.keras.layers.Dense(
d_inner,
kernel_initializer=get_initializer(init_std),
activation=tf.nn.relu,
name="CoreNet_._0",
)
self.drop_1 = tf.keras.layers.Dropout(dropout)
self.layer_2 = tf.keras.layers.Dense(
d_model, kernel_initializer=get_initializer(init_std), name="CoreNet_._3"
)
self.drop_2 = tf.keras.layers.Dropout(dropout)
self.layer_norm = tf.keras.layers.LayerNormalization(
epsilon=layer_norm_epsilon, name="layer_norm"
)
self.pre_lnorm = pre_lnorm
def call(self, inp, training=False):
if self.pre_lnorm:
# layer normalization + positionwise feed-forward
core_out = self.layer_norm(inp)
core_out = self.layer_1(core_out)
core_out = self.drop_1(core_out, training=training)
core_out = self.layer_2(core_out)
core_out = self.drop_2(core_out, training=training)
# residual connection
output = core_out + inp
else:
# positionwise feed-forward
core_out = self.layer_1(inp)
core_out = self.drop_1(core_out, training=training)
core_out = self.layer_2(core_out)
core_out = self.drop_2(core_out, training=training)
# residual connection + layer normalization
output = self.layer_norm(inp + core_out)
return output
class TFRelPartialLearnableMultiHeadAttn(tf.keras.layers.Layer):
def __init__(
self,
n_head,
d_model,
d_head,
dropout,
dropatt=0.0,
pre_lnorm=False,
r_r_bias=None,
r_w_bias=None,
layer_norm_epsilon=1e-5,
init_std=0.02,
output_attentions=False,
**kwargs,
):
super().__init__(**kwargs)
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.output_attentions = output_attentions
self.qkv_net = tf.keras.layers.Dense(
3 * n_head * d_head,
kernel_initializer=get_initializer(init_std),
use_bias=False,
name="qkv_net",
)
self.drop = tf.keras.layers.Dropout(dropout)
self.dropatt = tf.keras.layers.Dropout(dropatt)
self.o_net = tf.keras.layers.Dense(
d_model,
kernel_initializer=get_initializer(init_std),
use_bias=False,
name="o_net",
)
self.layer_norm = tf.keras.layers.LayerNormalization(
epsilon=layer_norm_epsilon, name="layer_norm"
)
self.scale = 1 / (d_head**0.5)
self.pre_lnorm = pre_lnorm
if r_r_bias is not None and r_w_bias is not None: # Biases are shared
self.r_r_bias = r_r_bias
self.r_w_bias = r_w_bias
else:
self.r_r_bias = None
self.r_w_bias = None
self.r_net = tf.keras.layers.Dense(
self.n_head * self.d_head,
kernel_initializer=get_initializer(init_std),
use_bias=False,
name="r_net",
)
def build(self, input_shape):
if self.r_r_bias is None or self.r_w_bias is None: # Biases are not shared
self.r_r_bias = self.add_weight(
shape=(self.n_head, self.d_head),
initializer="zeros",
trainable=True,
name="r_r_bias",
)
self.r_w_bias = self.add_weight(
shape=(self.n_head, self.d_head),
initializer="zeros",
trainable=True,
name="r_w_bias",
)
super().build(input_shape)
    def _rel_shift(self, x):
        # Transformer-XL relative-shift trick: pad one zero column along the
        # key axis, reshape so the padding rolls each row by one step, then
        # slice it off again. This realigns scores from (query pos, key pos)
        # indexing to (query pos, relative distance) without an explicit gather.
        x_size = shape_list(x)
        x = tf.pad(x, [[0, 0], [1, 0], [0, 0], [0, 0]])
        x = tf.reshape(x, [x_size[1] + 1, x_size[0], x_size[2], x_size[3]])
        x = tf.slice(x, [1, 0, 0, 0], [-1, -1, -1, -1])
        x = tf.reshape(x, x_size)
        return x
def call(self, w, r, attn_mask, mems, head_mask, output_attentions, training=False):
qlen, rlen, bsz = shape_list(w)[0], shape_list(r)[0], shape_list(w)[1]
if mems is not None:
mems = tf.cast(mems, dtype=w.dtype)
cat = tf.concat([mems, w], 0)
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(cat))
else:
w_heads = self.qkv_net(cat)
r_head_k = self.r_net(r)
w_head_q, w_head_k, w_head_v = tf.split(w_heads, 3, axis=-1)
w_head_q = w_head_q[-qlen:]
else:
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(w))
else:
w_heads = self.qkv_net(w)
r_head_k = self.r_net(r)
w_head_q, w_head_k, w_head_v = tf.split(w_heads, 3, axis=-1)
klen = shape_list(w_head_k)[0]
w_head_q = tf.reshape(
w_head_q, (qlen, bsz, self.n_head, self.d_head)
) # qlen x bsz x n_head x d_head
w_head_k = tf.reshape(
w_head_k, (klen, bsz, self.n_head, self.d_head)
) # qlen x bsz x n_head x d_head
w_head_v = tf.reshape(
w_head_v, (klen, bsz, self.n_head, self.d_head)
) # qlen x bsz x n_head x d_head
r_head_k = tf.reshape(
r_head_k, (rlen, self.n_head, self.d_head)
) # qlen x n_head x d_head
        # compute attention score: in the Transformer-XL paper's notation, AC is
        # the content-based term (a + c) and BD the position-based term (b + d)
        rw_head_q = w_head_q + self.r_w_bias  # qlen x bsz x n_head x d_head
AC = tf.einsum(
"ibnd,jbnd->ijbn", rw_head_q, w_head_k
) # qlen x klen x bsz x n_head
rr_head_q = w_head_q + self.r_r_bias
BD = tf.einsum(
"ibnd,jnd->ijbn", rr_head_q, r_head_k
) # qlen x klen x bsz x n_head
BD = self._rel_shift(BD)
# [qlen x klen x bsz x n_head]
attn_score = AC + BD
attn_score = attn_score * self.scale
# compute attention probability
if attn_mask is not None:
attn_mask_t = attn_mask[:, :, None, None]
attn_mask_t = tf.cast(attn_mask_t, dtype=attn_score.dtype)
attn_score = attn_score * (1.0 - attn_mask_t) - 1e30 * attn_mask_t
# [qlen x klen x bsz x n_head]
attn_prob = stable_softmax(attn_score, axis=1)
attn_prob = self.dropatt(attn_prob, training=training)
# Mask heads if we want to
if head_mask is not None:
attn_prob = attn_prob * head_mask
# compute attention vector
attn_vec = tf.einsum("ijbn,jbnd->ibnd", attn_prob, w_head_v)
# [qlen x bsz x n_head x d_head]
attn_vec_sizes = shape_list(attn_vec)
attn_vec = tf.reshape(
attn_vec, (attn_vec_sizes[0], attn_vec_sizes[1], self.n_head * self.d_head)
)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out, training=training)
if self.pre_lnorm:
# residual connection
outputs = [w + attn_out]
else:
# residual connection + layer normalization
outputs = [self.layer_norm(w + attn_out)]
if output_attentions:
outputs.append(attn_prob)
return outputs
class TFRelPartialLearnableDecoderLayer(tf.keras.layers.Layer):
def __init__(
self,
n_head,
d_model,
d_head,
d_inner,
dropout,
dropatt=0.0,
pre_lnorm=False,
r_w_bias=None,
r_r_bias=None,
layer_norm_epsilon=1e-5,
init_std=0.02,
output_attentions=False,
**kwargs,
):
super().__init__(**kwargs)
self.dec_attn = TFRelPartialLearnableMultiHeadAttn(
n_head,
d_model,
d_head,
dropout,
dropatt=dropatt,
pre_lnorm=pre_lnorm,
r_w_bias=r_w_bias,
r_r_bias=r_r_bias,
init_std=init_std,
layer_norm_epsilon=layer_norm_epsilon,
output_attentions=output_attentions,
name="dec_attn",
)
self.pos_ff = TFPositionwiseFF(
d_model,
d_inner,
dropout,
pre_lnorm=pre_lnorm,
init_std=init_std,
layer_norm_epsilon=layer_norm_epsilon,
name="pos_ff",
)
def call(
self,
dec_inp,
r,
dec_attn_mask,
mems,
head_mask,
output_attentions,
training=False,
):
attn_outputs = self.dec_attn(
dec_inp,
r,
dec_attn_mask,
mems,
head_mask,
output_attentions,
training=training,
)
ff_output = self.pos_ff(attn_outputs[0], training=training)
outputs = [ff_output] + attn_outputs[1:]
return outputs
class TFTransfoEmbeddings(tf.keras.layers.Layer):
def __init__(self, vocab_size, emb_size, init_std, **kwargs):
super().__init__(**kwargs)
self.vocab_size = vocab_size
self.emb_size = emb_size
self.init_std = init_std
def build(self, input_shape):
self.weight = self.add_weight(
shape=(self.vocab_size, self.emb_size),
initializer=get_initializer(self.init_std),
name="embeddings",
)
super().build(input_shape)
def call(self, inputs):
return tf.gather(self.weight, inputs)
class TFAdaptiveEmbedding(tf.keras.layers.Layer):
def __init__(
self,
n_token,
d_embed,
d_proj,
cutoffs,
div_val=1,
init_std=0.02,
sample_softmax=False,
**kwargs,
):
super().__init__(**kwargs)
self.n_token = n_token
self.d_embed = d_embed
self.init_std = init_std
self.cutoffs = cutoffs + [n_token]
self.div_val = div_val
self.d_proj = d_proj
self.emb_scale = d_proj**0.5
self.cutoff_ends = [0] + self.cutoffs
self.emb_layers = []
self.emb_projs = []
if div_val == 1:
raise NotImplementedError # Removed these to avoid maintaining dead code - They are not used in our pretrained checkpoint
else:
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
d_emb_i = d_embed // (div_val**i)
self.emb_layers.append(
TFTransfoEmbeddings(
r_idx - l_idx,
d_emb_i,
init_std,
name=f"emb_layers_._{i}",
)
)
def build(self, input_shape):
for i in range(len(self.cutoffs)):
d_emb_i = self.d_embed // (self.div_val**i)
self.emb_projs.append(
self.add_weight(
shape=(d_emb_i, self.d_proj),
initializer=get_initializer(self.init_std),
trainable=True,
name=f"emb_projs_._{i}",
)
)
super().build(input_shape)
    def call(self, inp):
        if self.div_val == 1:
            raise NotImplementedError  # Removed these to avoid maintaining dead code - They are not used in our pretrained checkpoint
        else:
            # Adaptive embedding: tokens are bucketed by frequency (cutoffs);
            # rarer buckets use smaller embeddings (d_embed // div_val**i)
            # that are projected up to d_proj and scattered back together.
            inp_flat = tf.reshape(inp, (-1,))
emb_flat = tf.zeros([shape_list(inp_flat)[0], self.d_proj])
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
mask_i = (inp_flat >= l_idx) & (inp_flat < r_idx)
inp_i = tf.boolean_mask(inp_flat, mask_i) - l_idx
emb_i = self.emb_layers[i](inp_i)
emb_i = tf.einsum("id,de->ie", emb_i, self.emb_projs[i])
mask_idx = tf.where(mask_i)
scatter = tf.scatter_nd(mask_idx, emb_i, shape_list(emb_flat))
emb_flat = tf.cast(emb_flat, dtype=scatter.dtype)
emb_flat += scatter
embed_shape = shape_list(inp) + [self.d_proj]
embed = tf.reshape(emb_flat, embed_shape)
embed *= self.emb_scale
return embed
@keras_serializable
class TFTransfoXLMainLayer(tf.keras.layers.Layer):
config_class = TransfoXLConfig
def __init__(self, config, **kwargs):
super().__init__(**kwargs)
self.config = config
self.output_hidden_states = config.output_hidden_states
self.output_attentions = config.output_attentions
self.return_dict = config.use_return_dict
self.n_token = config.vocab_size
self.d_embed = config.d_embed
self.d_model = config.d_model
self.n_head = config.n_head
self.d_head = config.d_head
self.untie_r = config.untie_r
self.word_emb = TFAdaptiveEmbedding(
config.vocab_size,
config.d_embed,
config.d_model,
config.cutoffs,
div_val=config.div_val,
init_std=config.init_std,
name="word_emb",
)
self.drop = tf.keras.layers.Dropout(config.dropout)
self.n_layer = config.n_layer
self.mem_len = config.mem_len
self.attn_type = config.attn_type
self.layers = []
if config.attn_type == 0: # the default attention
for i in range(config.n_layer):
self.layers.append(
TFRelPartialLearnableDecoderLayer(
config.n_head,
config.d_model,
config.d_head,
config.d_inner,
config.dropout,
dropatt=config.dropatt,
pre_lnorm=config.pre_lnorm,
r_w_bias=None if self.untie_r else self.r_w_bias,
r_r_bias=None if self.untie_r else self.r_r_bias,
layer_norm_epsilon=config.layer_norm_epsilon,
init_std=config.init_std,
output_attentions=self.output_attentions,
name=f"layers_._{i}",
)
)
else: # learnable embeddings and absolute embeddings
raise NotImplementedError # Removed these to avoid maintaining dead code - They are not used in our pretrained checkpoint
self.same_length = config.same_length
self.clamp_len = config.clamp_len
if self.attn_type == 0: # default attention
self.pos_emb = TFPositionalEmbedding(self.d_model, name="pos_emb")
else: # learnable embeddings and absolute embeddings
raise NotImplementedError # Removed these to avoid maintaining dead code - They are not used in our pretrained checkpoint
def build(self, input_shape):
if not self.untie_r:
self.r_w_bias = self.add_weight(
shape=(self.n_head, self.d_head),
initializer="zeros",
trainable=True,
name="r_w_bias",
)
self.r_r_bias = self.add_weight(
shape=(self.n_head, self.d_head),
initializer="zeros",
trainable=True,
name="r_r_bias",
)
super().build(input_shape)
def get_input_embeddings(self):
return self.word_emb
def set_input_embeddings(self, value):
raise NotImplementedError
def backward_compatible(self):
self.sample_softmax = -1
def reset_memory_length(self, mem_len):
self.mem_len = mem_len
def _prune_heads(self, heads):
raise NotImplementedError
def init_mems(self, bsz):
if self.mem_len > 0:
mems = []
for i in range(self.n_layer):
empty = tf.zeros([self.mem_len, bsz, self.d_model])
mems.append(empty)
return mems
else:
return None
def _update_mems(self, hids, mems, mlen, qlen):
# does not deal with None
if mems is None:
return None
# mems is not None
assert len(hids) == len(mems), "len(hids) != len(mems)"
# There are `mlen + qlen` steps that can be cached into mems
new_mems = []
end_idx = mlen + tf.math.maximum(0, qlen)
beg_idx = tf.math.maximum(0, end_idx - tf.convert_to_tensor(self.mem_len))
        for i in range(len(hids)):
            mems[i] = tf.cast(mems[i], dtype=hids[i].dtype)
            cat = tf.concat([mems[i], hids[i]], axis=0)
            # the cached memory must not receive gradients; tf.stop_gradient is
            # functional, so its return value has to be used (a bare call is a no-op)
            new_mems.append(tf.stop_gradient(cat[beg_idx:end_idx]))
return new_mems
@unpack_inputs
def call(
self,
input_ids: Optional[TFModelInputType] = None,
mems: Optional[List[tf.Tensor]] = None,
head_mask: Optional[Union[np.ndarray, tf.Tensor]] = None,
inputs_embeds: Optional[Union[np.ndarray, tf.Tensor]] = None,
output_attentions: Optional[bool] = None,
output_hidden_states: Optional[bool] = None,
return_dict: Optional[bool] = None,
labels: Optional[Union[np.ndarray, tf.Tensor]] = None,
training: bool = False,
):
# the original code for Transformer-XL used shapes [len, bsz] but we want a unified interface in the library
# so we transpose here from shape [bsz, len] to shape [len, bsz]
if input_ids is not None and inputs_embeds is not None:
raise ValueError(
"You cannot specify both input_ids and inputs_embeds at the same time"
)
elif input_ids is not None:
input_ids = tf.transpose(input_ids, perm=(1, 0))
qlen, bsz = shape_list(input_ids)
elif inputs_embeds is not None:
inputs_embeds = tf.transpose(inputs_embeds, perm=(1, 0, 2))
qlen, bsz = shape_list(inputs_embeds)[:2]
else:
raise ValueError("You have to specify either input_ids or inputs_embeds")
if mems is None:
mems = self.init_mems(bsz)
# Prepare head mask if needed
# 1.0 in head_mask indicate we keep the head
# attention_probs has shape bsz x n_heads x N x N
# input head_mask has shape [num_heads] or [num_hidden_layers x num_heads] (a head_mask for each layer)
# and head_mask is converted to shape [num_hidden_layers x qlen x klen x bsz x n_head]
if head_mask is not None:
raise NotImplementedError
else:
head_mask = [None] * self.n_layer
if inputs_embeds is not None:
word_emb = inputs_embeds
else:
word_emb = self.word_emb(input_ids)
mlen = shape_list(mems[0])[0] if mems is not None else 0
klen = mlen + qlen
# Compute decoder attention mask
# ::: PyTorch masking code for reference :::
# if self.same_length:
# all_ones = word_emb.new_ones((qlen, klen), dtype=torch.uint8)
# mask_len = klen - self.mem_len
# if mask_len > 0:
# mask_shift_len = qlen - mask_len
# else:
# mask_shift_len = qlen
# dec_attn_mask = (torch.triu(all_ones, 1+mlen)
# + torch.tril(all_ones, -mask_shift_len))[:, :, None] # -1
# else:
# dec_attn_mask = torch.triu(
# word_emb.new_ones((qlen, klen), dtype=torch.uint8), diagonal=1+mlen)[:,:,None]
# TensorFlow version
        dec_attn_mask = 1 - tf.linalg.band_part(
            tf.ones([qlen, klen], dtype=tf.int32), -1, mlen
        )  # (qlen, klen): 1's mark positions more than mlen above the diagonal (future tokens)
if self.same_length:
mask_len = klen - self.mem_len
if mask_len > 0:
mask_shift_len = qlen - mask_len
else:
mask_shift_len = qlen
if mask_shift_len >= 1:
dec_attn_mask += 1 - tf.linalg.band_part(
tf.ones([qlen, klen], dtype=tf.int32), mask_shift_len - 1, -1
)
else:
dec_attn_mask += tf.linalg.band_part(
tf.ones([qlen, klen], dtype=tf.int32), -1, -mask_shift_len
)
hids = []
attentions = [] if output_attentions else None
if self.attn_type == 0: # default
pos_seq = tf.range(klen - 1, -1, -1.0)
if self.clamp_len > 0:
pos_seq = tf.minimum(pos_seq, self.clamp_len)
pos_emb = self.pos_emb(pos_seq)
core_out = self.drop(word_emb, training=training)
pos_emb = self.drop(pos_emb, training=training)
for i, layer in enumerate(self.layers):
hids.append(core_out)
mems_i = None if mems is None else mems[i]
layer_outputs = layer(
core_out,
pos_emb,
dec_attn_mask,
mems_i,
head_mask[i],
output_attentions,
training=training,
)
core_out = layer_outputs[0]
if output_attentions:
attentions.append(layer_outputs[1])
else: # learnable embeddings and absolute embeddings
raise NotImplementedError # Removed these to avoid maintaining dead code - They are not used in our pretrained checkpoint
core_out = self.drop(core_out, training=training)
new_mems = self._update_mems(hids, mems, mlen, qlen)
# We transpose back here to shape [bsz, len, hidden_dim]
core_out = tf.transpose(core_out, perm=(1, 0, 2))
if output_hidden_states:
# Transpose to library standard shape [bsz, len, hidden_dim] and add last layer
hids = tuple(tf.transpose(t, perm=(1, 0, 2)) for t in hids)
hids = hids + (core_out,)
else:
hids = None
if output_attentions:
# Transpose to library standard shape [bsz, n_heads, query_seq_len, key_seq_len]
attentions = tuple(tf.transpose(t, perm=(2, 3, 0, 1)) for t in attentions)
if not return_dict:
return tuple(
v for v in [core_out, new_mems, hids, attentions] if v is not None
)
return TFTransfoXLModelOutput(
last_hidden_state=core_out,
mems=new_mems,
hidden_states=hids,
attentions=attentions,
)
class TFTransfoXLPreTrainedModel(TFPreTrainedModel):
"""
An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained
models.
"""
config_class = TransfoXLConfig
base_model_prefix = "transformer"
@tf.function(
input_signature=[
{
"input_ids": tf.TensorSpec((None, None), tf.int64, name="input_ids"),
}
]
)
def serving(self, inputs):
output = self.call(inputs)
return self.serving_output(output)
@dataclass
class TFTransfoXLModelOutput(ModelOutput):
"""
Base class for model's outputs that may also contain a past key/values (to speed up sequential decoding).
Args:
last_hidden_state (`tf.Tensor` of shape `(batch_size, sequence_length, hidden_size)`):
Sequence of hidden-states at the output of the last layer of the model.
mems (`List[tf.Tensor]` of length `config.n_layers`):
Contains pre-computed hidden-states (key and values in the attention blocks). Can be used (see `mems`
input) to speed up sequential decoding. The token ids which have their past given to this model should not
be passed as input ids as they have already been computed.
hidden_states (`tuple(tf.Tensor)`, *optional*, returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`):
Tuple of `tf.Tensor` (one for the output of the embeddings + one for the output of each layer) of shape
`(batch_size, sequence_length, hidden_size)`.
Hidden-states of the model at the output of each layer plus the initial embedding outputs.
attentions (`tuple(tf.Tensor)`, *optional*, returned when `output_attentions=True` is passed or when `config.output_attentions=True`):
Tuple of `tf.Tensor` (one for each layer) of shape `(batch_size, num_heads, sequence_length,
sequence_length)`.
Attentions weights after the attention softmax, used to compute the weighted average in the self-attention
heads.
"""
last_hidden_state: tf.Tensor = None
mems: List[tf.Tensor] = None
hidden_states: Optional[Tuple[tf.Tensor]] = None
attentions: Optional[Tuple[tf.Tensor]] = None
@dataclass
class TFTransfoXLLMHeadModelOutput(ModelOutput):
"""
Base class for model's outputs that may also contain a past key/values (to speed up sequential decoding).
Args:
losses (`tf.Tensor` of shape *(batch_size, sequence_length-1)*, *optional*, returned when `labels` is provided):
Language modeling losses (not reduced).
prediction_scores (`tf.Tensor` of shape `(batch_size, sequence_length, config.vocab_size)`):
Prediction scores of the language modeling head (scores for each vocabulary token after SoftMax).
mems (`List[tf.Tensor]` of length `config.n_layers`):
Contains pre-computed hidden-states (key and values in the attention blocks). Can be used (see `mems`
input) to speed up sequential decoding. The token ids which have their past given to this model should not
be passed as input ids as they have already been computed.
hidden_states (`tuple(tf.Tensor)`, *optional*, returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`):
Tuple of `tf.Tensor` (one for the output of the embeddings + one for the output of each layer) of shape
`(batch_size, sequence_length, hidden_size)`.
Hidden-states of the model at the output of each layer plus the initial embedding outputs.
attentions (`tuple(tf.Tensor)`, *optional*, returned when `output_attentions=True` is passed or when `config.output_attentions=True`):
Tuple of `tf.Tensor` (one for each layer) of shape `(batch_size, num_heads, sequence_length,
sequence_length)`.
Attentions weights after the attention softmax, used to compute the weighted average in the self-attention
heads.
"""
prediction_scores: tf.Tensor = None
mems: List[tf.Tensor] = None
hidden_states: Optional[Tuple[tf.Tensor]] = None
attentions: Optional[Tuple[tf.Tensor]] = None
@dataclass
class TFTransfoXLSequenceClassifierOutputWithPast(ModelOutput):
"""
Base class for outputs of sentence classification models.
Args:
loss (`tf.Tensor` of shape `(1,)`, *optional*, returned when `labels` is provided):
Classification (or regression if config.num_labels==1) loss.
logits (`tf.Tensor` of shape `(batch_size, config.num_labels)`):
Classification (or regression if config.num_labels==1) scores (before SoftMax).
mems (`List[tf.Tensor]` of length `config.n_layers`):
Contains pre-computed hidden-states (key and values in the attention blocks). Can be used (see `mems`
input) to speed up sequential decoding. The token ids which have their past given to this model should not
be passed as input ids as they have already been computed.
hidden_states (`tuple(tf.Tensor)`, *optional*, returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`):
Tuple of `tf.Tensor` (one for the output of the embeddings + one for the output of each layer) of shape
`(batch_size, sequence_length, hidden_size)`.
Hidden-states of the model at the output of each layer plus the initial embedding outputs.
attentions (`tuple(tf.Tensor)`, *optional*, returned when `output_attentions=True` is passed or when `config.output_attentions=True`):
Tuple of `tf.Tensor` (one for each layer) of shape `(batch_size, num_heads, sequence_length,
sequence_length)`.
Attentions weights after the attention softmax, used to compute the weighted average in the self-attention
heads.
"""
loss: Optional[tf.Tensor] = None
logits: tf.Tensor = None
mems: List[tf.Tensor] = None
hidden_states: Optional[Tuple[tf.Tensor]] = None
attentions: Optional[Tuple[tf.Tensor]] = None
TRANSFO_XL_START_DOCSTRING = r"""
This model inherits from [`TFPreTrainedModel`]. Check the superclass documentation for the generic methods the
    library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads
etc.)
This model is also a [tf.keras.Model](https://www.tensorflow.org/api_docs/python/tf/keras/Model) subclass. Use it
as a regular TF 2.0 Keras Model and refer to the TF 2.0 documentation for all matter related to general usage and
behavior.
<Tip>
TensorFlow models and layers in `transformers` accept two formats as input:
- having all inputs as keyword arguments (like PyTorch models), or
- having all inputs as a list, tuple or dict in the first positional argument.
The reason the second format is supported is that Keras methods prefer this format when passing inputs to models
and layers. Because of this support, when using methods like `model.fit()` things should "just work" for you - just
pass your inputs and labels in any format that `model.fit()` supports! If, however, you want to use the second
format outside of Keras methods like `fit()` and `predict()`, such as when creating your own layers or models with
the Keras `Functional` API, there are three possibilities you can use to gather all the input Tensors in the first
positional argument:
- a single Tensor with `input_ids` only and nothing else: `model(input_ids)`
- a list of varying length with one or several input Tensors IN THE ORDER given in the docstring:
`model([input_ids, attention_mask])` or `model([input_ids, attention_mask, token_type_ids])`
- a dictionary with one or several input Tensors associated to the input names given in the docstring:
`model({"input_ids": input_ids, "token_type_ids": token_type_ids})`
Note that when creating models and layers with
[subclassing](https://keras.io/guides/making_new_layers_and_models_via_subclassing/) then you don't need to worry
about any of this, as you can just pass inputs like you would to any other Python function!
</Tip>
Parameters:
config ([`TransfoXLConfig`]): Model configuration class with all the parameters of the model.
Initializing with a config file does not load the weights associated with the model, only the
configuration. Check out the [`~PreTrainedModel.from_pretrained`] method to load the model weights.
"""
TRANSFO_XL_INPUTS_DOCSTRING = r"""
Args:
input_ids (`tf.Tensor` or `Numpy array` of shape `(batch_size, sequence_length)`):
Indices of input sequence tokens in the vocabulary.
            Indices can be obtained using [`TransfoXLTokenizer`]. See [`PreTrainedTokenizer.__call__`] and
[`PreTrainedTokenizer.encode`] for details.
[What are input IDs?](../glossary#input-ids)
mems (`List[tf.Tensor]` of length `config.n_layers`):
Contains pre-computed hidden-states (key and values in the attention blocks) as computed by the model (see
`mems` output below). Can be used to speed up sequential decoding. The token ids which have their mems
given to this model should not be passed as `input_ids` as they have already been computed.
head_mask (`tf.Tensor` or `Numpy array` of shape `(num_heads,)` or `(num_layers, num_heads)`, *optional*):
Mask to nullify selected heads of the self-attention modules. Mask values selected in `[0, 1]`:
- 1 indicates the head is **not masked**,
- 0 indicates the head is **masked**.
inputs_embeds (`tf.Tensor` or `Numpy array` of shape `(batch_size, sequence_length, hidden_size)`, *optional*):
Optionally, instead of passing `input_ids` you can choose to directly pass an embedded representation. This
is useful if you want more control over how to convert `input_ids` indices into associated vectors than the
model's internal embedding lookup matrix.
output_attentions (`bool`, *optional*):
Whether or not to return the attentions tensors of all attention layers. See `attentions` under returned
tensors for more detail. This argument can be used only in eager mode, in graph mode the value in the
config will be used instead.
output_hidden_states (`bool`, *optional*):
Whether or not to return the hidden states of all layers. See `hidden_states` under returned tensors for
more detail. This argument can be used only in eager mode, in graph mode the value in the config will be
used instead.
return_dict (`bool`, *optional*):
Whether or not to return a [`~utils.ModelOutput`] instead of a plain tuple. This argument can be used in
eager mode, in graph mode the value will always be set to True.
training (`bool`, *optional*, defaults to `False`):
Whether or not to use the model in training mode (some modules like dropout modules have different
behaviors between training and evaluation).
"""
@add_start_docstrings(
"The bare Bert Model transformer outputting raw hidden-states without any specific head on top.",
TRANSFO_XL_START_DOCSTRING,
)
class TFTransfoXLModel(TFTransfoXLPreTrainedModel):
def __init__(self, config, *inputs, **kwargs):
super().__init__(config, *inputs, **kwargs)
self.transformer = TFTransfoXLMainLayer(config, name="transformer")
@unpack_inputs
@add_start_docstrings_to_model_forward(TRANSFO_XL_INPUTS_DOCSTRING)
@add_code_sample_docstrings(
processor_class=_TOKENIZER_FOR_DOC,
checkpoint=_CHECKPOINT_FOR_DOC,
output_type=TFTransfoXLModelOutput,
config_class=_CONFIG_FOR_DOC,
)
def call(
self,
input_ids: Optional[TFModelInputType] = None,
mems: Optional[List[tf.Tensor]] = None,
head_mask: Optional[Union[np.ndarray, tf.Tensor]] = None,
inputs_embeds: Optional[Union[np.ndarray, tf.Tensor]] = None,
output_attentions: Optional[bool] = None,
output_hidden_states: Optional[bool] = None,
return_dict: Optional[bool] = None,
training: bool = False,
):
outputs = self.transformer(
input_ids=input_ids,
mems=mems,
head_mask=head_mask,
inputs_embeds=inputs_embeds,
output_attentions=output_attentions,
output_hidden_states=output_hidden_states,
return_dict=return_dict,
training=training,
)
return outputs
def serving_output(self, output):
hs = (
tf.convert_to_tensor(output.hidden_states)
if self.config.output_hidden_states
else None
)
attns = (
tf.convert_to_tensor(output.attentions)
if self.config.output_attentions
else None
)
return TFTransfoXLModelOutput(
last_hidden_state=output.last_hidden_state,
mems=tf.convert_to_tensor(output.mems),
hidden_states=hs,
attentions=attns,
)
@add_start_docstrings(
"""
The Transformer-XL Model with a language modeling head on top (adaptive softmax with weights tied to the adaptive
input embeddings)
""",
TRANSFO_XL_START_DOCSTRING,
)
class TFTransfoXLLMHeadModel(TFTransfoXLPreTrainedModel):
def __init__(self, config):
super().__init__(config)
self.transformer = TFTransfoXLMainLayer(config, name="transformer")
self.sample_softmax = config.sample_softmax
assert self.sample_softmax <= 0, (
"Sampling from the softmax is not implemented yet. Please look at issue: #3310:"
" https://github.com/huggingface/transformers/issues/3310"
)
self.crit = TFAdaptiveSoftmaxMask(
config.vocab_size,
config.d_embed,
config.d_model,
config.cutoffs,
div_val=config.div_val,
name="crit",
)
def _resize_token_embeddings(self, new_num_tokens):
raise NotImplementedError()
def get_output_embeddings(self):
"""Double-check if you are using adaptive softmax."""
if len(self.crit.out_layers) > 0:
return self.crit.out_layers[-1]
return None
def reset_memory_length(self, mem_len):
self.transformer.reset_memory_length(mem_len)
def init_mems(self, bsz):
return self.transformer.init_mems(bsz)
@unpack_inputs
@add_start_docstrings_to_model_forward(TRANSFO_XL_INPUTS_DOCSTRING)
@add_code_sample_docstrings(
processor_class=_TOKENIZER_FOR_DOC,
checkpoint=_CHECKPOINT_FOR_DOC,
output_type=TFTransfoXLLMHeadModelOutput,
config_class=_CONFIG_FOR_DOC,
)
def call(
self,
input_ids: Optional[TFModelInputType] = None,
mems: Optional[List[tf.Tensor]] = None,
head_mask: Optional[Union[np.ndarray, tf.Tensor]] = None,
inputs_embeds: Optional[Union[np.ndarray, tf.Tensor]] = None,
output_attentions: Optional[bool] = None,
output_hidden_states: Optional[bool] = None,
return_dict: Optional[bool] = None,
labels: Optional[Union[np.ndarray, tf.Tensor]] = None,
training: bool = False,
):
if input_ids is not None:
bsz, tgt_len = shape_list(input_ids)[:2]
else:
bsz, tgt_len = shape_list(inputs_embeds)[:2]
transformer_outputs = self.transformer(
input_ids,
mems,
head_mask,
inputs_embeds,
output_attentions,
output_hidden_states,
return_dict,
training=training,
)
last_hidden = transformer_outputs[0]
pred_hid = last_hidden[:, -tgt_len:]
softmax_output = self.crit(pred_hid, labels, training=training)
prediction_scores = softmax_output if labels is None else ()
if not return_dict:
return (prediction_scores,) + transformer_outputs[1:]
return TFTransfoXLLMHeadModelOutput(
prediction_scores=prediction_scores,
mems=transformer_outputs.mems,
hidden_states=transformer_outputs.hidden_states,
attentions=transformer_outputs.attentions,
)
def serving_output(self, output):
hs = (
tf.convert_to_tensor(output.hidden_states)
if self.config.output_hidden_states
else None
)
attns = (
tf.convert_to_tensor(output.attentions)
if self.config.output_attentions
else None
)
return TFTransfoXLLMHeadModelOutput(
prediction_scores=output.prediction_scores,
mems=tf.convert_to_tensor(output.mems),
hidden_states=hs,
attentions=attns,
)
    def prepare_inputs_for_generation(self, input_ids, past=None, **model_kwargs):
        # if past is defined then only the last token is needed as input;
        # everything before it is already cached in the mems
        if past:
            input_ids = tf.expand_dims(input_ids[:, -1], axis=-1)
        return {"input_ids": input_ids, "mems": past}
@staticmethod
def _reorder_cache(mems: List[tf.Tensor], beam_idx: tf.Tensor) -> List[tf.Tensor]:
return [tf.gather(layer_past, beam_idx, axis=1) for layer_past in mems]
@add_start_docstrings(
"""
The Transfo XL Model transformer with a sequence classification head on top (linear layer).
[`TFTransfoXLForSequenceClassification`] uses the last token in order to do the classification, as other causal
    models (e.g. GPT-1, GPT-2) do.
Since it does classification on the last token, it requires to know the position of the last token. If a
`pad_token_id` is defined in the configuration, it finds the last token that is not a padding token in each row. If
no `pad_token_id` is defined, it simply takes the last value in each row of the batch. Since it cannot guess the
padding tokens when `inputs_embeds` are passed instead of `input_ids`, it does the same (take the last value in
each row of the batch).
""",
TRANSFO_XL_START_DOCSTRING,
)
class TFTransfoXLForSequenceClassification(
TFTransfoXLPreTrainedModel, TFSequenceClassificationLoss
):
def __init__(self, config, *inputs, **kwargs):
super().__init__(config, *inputs, **kwargs)
self.num_labels = config.num_labels
self.score = tf.keras.layers.Dense(
config.num_labels,
kernel_initializer=get_initializer(config.init_range),
name="score",
use_bias=False,
)
self.transformer = TFTransfoXLMainLayer(config, name="transformer")
def get_output_embeddings(self):
return self.transformer.word_emb
@unpack_inputs
@add_start_docstrings_to_model_forward(TRANSFO_XL_INPUTS_DOCSTRING)
@add_code_sample_docstrings(
processor_class=_TOKENIZER_FOR_DOC,
checkpoint=_CHECKPOINT_FOR_DOC,
output_type=TFTransfoXLSequenceClassifierOutputWithPast,
config_class=_CONFIG_FOR_DOC,
)
def call(
self,
input_ids: Optional[TFModelInputType] = None,
mems: Optional[List[tf.Tensor]] = None,
head_mask: Optional[Union[np.ndarray, tf.Tensor]] = None,
inputs_embeds: Optional[Union[np.ndarray, tf.Tensor]] = None,
output_attentions: Optional[bool] = None,
output_hidden_states: Optional[bool] = None,
return_dict: Optional[bool] = None,
labels: Optional[Union[np.ndarray, tf.Tensor]] = None,
training: Optional[bool] = False,
) -> Union[Tuple, TFTransfoXLSequenceClassifierOutputWithPast]:
r"""
labels (`tf.Tensor` of shape `(batch_size, sequence_length)`, *optional*):
            Labels for computing the cross entropy classification loss. Indices should be in `[0, ...,
            config.num_labels - 1]`.
"""
transformer_outputs = self.transformer(
input_ids=input_ids,
mems=mems,
head_mask=head_mask,
inputs_embeds=inputs_embeds,
output_attentions=output_attentions,
output_hidden_states=output_hidden_states,
return_dict=return_dict,
training=training,
)
hidden_states = transformer_outputs[0]
logits = self.score(hidden_states)
in_logits = None
if self.config.pad_token_id is None:
sequence_lengths = -1
else:
if input_ids is not None:
sequence_lengths = (
tf.reduce_sum(
tf.cast(
tf.math.not_equal(input_ids, self.config.pad_token_id),
dtype=input_ids.dtype,
),
-1,
keepdims=False,
)
- 1
)
in_logits = tf.gather(logits, sequence_lengths, batch_dims=1, axis=1)
else:
sequence_lengths = -1
logger.warning(
f"{self.__class__.__name__} will not detect padding tokens in `inputs_embeds`. Results may be "
"unexpected if using padding tokens in conjunction with `inputs_embeds.`"
)
loss = None
if labels is not None:
if input_ids is not None:
batch_size, sequence_length = shape_list(input_ids)[:2]
else:
batch_size, sequence_length = shape_list(inputs_embeds)[:2]
assert (
self.config.pad_token_id is not None or batch_size == 1
), "Cannot handle batch sizes > 1 if no padding token is defined."
if not tf.is_tensor(sequence_lengths):
in_logits = logits[0:batch_size, sequence_lengths]
loss = self.hf_compute_loss(
tf.reshape(labels, [-1, 1]),
tf.reshape(in_logits, [-1, self.num_labels]),
)
pooled_logits = in_logits if in_logits is not None else logits
if not return_dict:
output = (pooled_logits,) + transformer_outputs[1:]
return ((loss,) + output) if loss is not None else output
return TFTransfoXLSequenceClassifierOutputWithPast(
loss=loss,
logits=pooled_logits,
mems=transformer_outputs.mems,
hidden_states=transformer_outputs.hidden_states,
attentions=transformer_outputs.attentions,
)
def serving_output(self, output):
hs = (
tf.convert_to_tensor(output.hidden_states)
if self.config.output_hidden_states
else None
)
attns = (
tf.convert_to_tensor(output.attentions)
if self.config.output_attentions
else None
)
return TFTransfoXLSequenceClassifierOutputWithPast(
logits=output.logits,
mems=tf.convert_to_tensor(output.mems),
hidden_states=hs,
attentions=attns,
)
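# Usage sketch (illustrative; it assumes the standard transformers-style
# `from_pretrained` API inherited from TFPreTrainedModel, and the
# `transfo-xl-wt103` checkpoint named in _CHECKPOINT_FOR_DOC above):
#
#   model = TFTransfoXLModel.from_pretrained("transfo-xl-wt103")
#   outputs = model(input_ids=tf.constant([[14049, 2, 617]]))
#   hidden = outputs.last_hidden_state   # (batch, seq_len, d_model)
#   # `mems` can be fed back in to continue the same sequence cheaply:
#   outputs = model(input_ids=tf.constant([[3]]), mems=outputs.mems)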
|
PypiClean
|
/DeepPhysX-22.6.tar.gz/DeepPhysX-22.6/src/Pipelines/BasePipeline.py
|
from typing import Dict, Optional, Any, List, Union
from DeepPhysX.Core.Network.BaseNetworkConfig import BaseNetworkConfig
from DeepPhysX.Core.Dataset.BaseDatasetConfig import BaseDatasetConfig
from DeepPhysX.Core.Environment.BaseEnvironmentConfig import BaseEnvironmentConfig
from DeepPhysX.Core.Manager.Manager import Manager
from DeepPhysX.Core.Manager.NetworkManager import NetworkManager
from DeepPhysX.Core.Manager.DataManager import DataManager
from DeepPhysX.Core.Manager.StatsManager import StatsManager
from DeepPhysX.Core.Manager.DatasetManager import DatasetManager
from DeepPhysX.Core.Manager.EnvironmentManager import EnvironmentManager
from DeepPhysX.Core.Manager.VisualizerManager import VisualizerManager
class BasePipeline:
"""
| Base class defining Pipelines common variables.
:param BaseNetworkConfig network_config: Specialisation containing the parameters of the network manager
:param BaseDatasetConfig dataset_config: Specialisation containing the parameters of the dataset manager
:param BaseEnvironmentConfig environment_config: Specialisation containing the parameters of the environment manager
:param str session_name: Name of the newly created directory if session_dir is not defined
:param Optional[str] session_dir: Name of the directory in which to write all the necessary data
    :param Optional[str] pipeline: Value is either 'training' or 'prediction'
"""
def __init__(self,
network_config: Optional[BaseNetworkConfig] = None,
dataset_config: Optional[BaseDatasetConfig] = None,
environment_config: Optional[BaseEnvironmentConfig] = None,
session_name: str = 'default',
session_dir: Optional[str] = None,
pipeline: Optional[str] = None):
self.name: str = self.__class__.__name__
# Check the arguments
if network_config is not None and not isinstance(network_config, BaseNetworkConfig):
raise TypeError(f"[{self.name}] The network configuration must be a BaseNetworkConfig")
if environment_config is not None and not isinstance(environment_config, BaseEnvironmentConfig):
raise TypeError(f"[{self.name}] The environment configuration must be a BaseEnvironmentConfig")
if dataset_config is not None and not isinstance(dataset_config, BaseDatasetConfig):
raise TypeError(f"[{self.name}] The dataset configuration must be a BaseDatasetConfig")
        if type(session_name) != str:
            raise TypeError(f"[{self.name}] The session name must be a str.")
if session_dir is not None and type(session_dir) != str:
raise TypeError(f"[{self.name}] The session directory must be a str.")
self.type: str = pipeline # Either training or prediction
self.debug: bool = False
self.new_session: bool = True
self.record_data: Optional[Dict[str, bool]] = None # Can be of type {'in': bool, 'out': bool}
# Dataset variables
self.dataset_config: BaseDatasetConfig = dataset_config
# Network variables
self.network_config: BaseNetworkConfig = network_config
# Simulation variables
self.environment_config: BaseEnvironmentConfig = environment_config
# Main manager
self.manager: Optional[Manager] = None
def get_any_manager(self, manager_names: Union[str, List[str]]) -> Optional[Any]:
"""
| Return the desired Manager associated with the pipeline if it exists.
        :param Union[str, List[str]] manager_names: Name of the desired Manager, or the ordered attribute path
                                                    leading to that Manager
:return: Manager associated with the Pipeline
"""
# If manager variable is not defined, cannot access other manager
if self.manager is None:
return None
# Direct access to manager
if type(manager_names) == str:
return getattr(self.manager, manager_names) if hasattr(self.manager, manager_names) else None
# Intermediates to access manager
accessed_manager = self.manager
for next_manager in manager_names:
if hasattr(accessed_manager, next_manager):
accessed_manager = getattr(accessed_manager, next_manager)
else:
return None
return accessed_manager
def get_network_manager(self) -> NetworkManager:
"""
| Return the NetworkManager associated with the pipeline.
:return: NetworkManager associated with the pipeline
"""
return self.get_any_manager('network_manager')
def get_data_manager(self) -> DataManager:
"""
| Return the DataManager associated with the pipeline.
:return: DataManager associated with the pipeline
"""
return self.get_any_manager('data_manager')
def get_stats_manager(self) -> StatsManager:
"""
| Return the StatsManager associated with the pipeline.
:return: StatsManager associated with the pipeline
"""
return self.get_any_manager('stats_manager')
def get_dataset_manager(self) -> DatasetManager:
"""
| Return the DatasetManager associated with the pipeline.
:return: DatasetManager associated with the pipeline
"""
return self.get_any_manager(['data_manager', 'dataset_manager'])
def get_environment_manager(self) -> EnvironmentManager:
"""
| Return the EnvironmentManager associated with the pipeline.
:return: EnvironmentManager associated with the pipeline
"""
return self.get_any_manager(['data_manager', 'environment_manager'])
def get_visualizer_manager(self) -> VisualizerManager:
"""
| Return the VisualizerManager associated with the pipeline.
:return: VisualizerManager associated with the pipeline
"""
return self.get_any_manager(['data_manager', 'environment_manager', 'visualizer_manager'])
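# Access sketch (illustrative; `pipeline` stands for any concrete subclass
# instance whose main Manager has been set up):
#
#   pipeline.get_any_manager('network_manager')                    # direct child
#   pipeline.get_any_manager(['data_manager', 'dataset_manager'])  # nested path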
|
PypiClean
|
/django-admin-modernize-1.0.3.tar.gz/django-admin-modernize-1.0.3/admin_modernize/static/assets/libs/bootstrap/js/dist/dom/selector-engine.js
|
(function (global, factory) {
typeof exports === 'object' && typeof module !== 'undefined' ? module.exports = factory(require('../util/index')) :
typeof define === 'function' && define.amd ? define(['../util/index'], factory) :
(global = typeof globalThis !== 'undefined' ? globalThis : global || self, global.SelectorEngine = factory(global.Index));
})(this, (function (index) { 'use strict';
/**
* --------------------------------------------------------------------------
* Bootstrap (v5.2.3): dom/selector-engine.js
* Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)
* --------------------------------------------------------------------------
*/
/**
* Constants
*/
const SelectorEngine = {
find(selector, element = document.documentElement) {
return [].concat(...Element.prototype.querySelectorAll.call(element, selector));
},
findOne(selector, element = document.documentElement) {
return Element.prototype.querySelector.call(element, selector);
},
children(element, selector) {
return [].concat(...element.children).filter(child => child.matches(selector));
},
parents(element, selector) {
const parents = [];
let ancestor = element.parentNode.closest(selector);
while (ancestor) {
parents.push(ancestor);
ancestor = ancestor.parentNode.closest(selector);
}
return parents;
},
prev(element, selector) {
let previous = element.previousElementSibling;
while (previous) {
if (previous.matches(selector)) {
return [previous];
}
previous = previous.previousElementSibling;
}
return [];
},
// TODO: this is now unused; remove later along with prev()
next(element, selector) {
let next = element.nextElementSibling;
while (next) {
if (next.matches(selector)) {
return [next];
}
next = next.nextElementSibling;
}
return [];
},
focusableChildren(element) {
const focusables = ['a', 'button', 'input', 'textarea', 'select', 'details', '[tabindex]', '[contenteditable="true"]'].map(selector => `${selector}:not([tabindex^="-"])`).join(',');
return this.find(focusables, element).filter(el => !index.isDisabled(el) && index.isVisible(el));
}
};
return SelectorEngine;
}));
//# sourceMappingURL=selector-engine.js.map
|
PypiClean
|
/image_bootstrap-2.0.5-py3-none-any.whl/directory_bootstrap/distros/centos.py
|
import os
import re
import urllib.request, urllib.parse, urllib.error
from textwrap import dedent
from directory_bootstrap.distros.yum_based import YumBasedDirectoryBootstrapper
from directory_bootstrap.shared.loaders._bs4 import BeautifulSoup
def _abs_filename_to_url(abs_filename):
return 'file://%s' % urllib.request.pathname2url(abs_filename)
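# For example (illustrative): _abs_filename_to_url('/var/cache/key.asc')
# returns 'file:///var/cache/key.asc'.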
class CentOsBootstrapper(YumBasedDirectoryBootstrapper):
DISTRO_KEY = 'centos'
DISTRO_NAME_LONG = 'CentOS'
EXAMPLE_RELEASE = '7.4.1708' # 2017-12-29
def _write_yum_conf(self, abs_yum_conf_path, abs_gpg_public_key_filename):
self._messenger.info('Writing file "%s"...' % abs_yum_conf_path)
gpg_public_key_file_url = _abs_filename_to_url(abs_gpg_public_key_filename)
with open(abs_yum_conf_path, 'w') as f:
print(dedent("""\
[base]
name=CentOS-$releasever - Base
baseurl=http://mirror.centos.org/centos/$releasever/os/$basearch/
gpgcheck=1
gpgkey=%(gpgkey)s
#released updates
[updates]
name=CentOS-$releasever - Updates
baseurl=http://mirror.centos.org/centos/$releasever/updates/$basearch/
gpgcheck=1
gpgkey=%(gpgkey)s
#additional packages that may be useful
[extras]
name=CentOS-$releasever - Extras
baseurl=http://mirror.centos.org/centos/$releasever/extras/$basearch/
gpgcheck=1
gpgkey=%(gpgkey)s
""" % {
'gpgkey': gpg_public_key_file_url,
}), file=f)
def _find_latest_release(self):
html = self.get_url_content('https://wiki.centos.org/Download')
soup = BeautifulSoup(html, 'lxml')
minor_version_matcher = re.compile(r'^ ?([0-9]+) \(([0-9]+)\) ?$')
candidates = []
prev = None
for paragraph in soup.find_all('p'):
m = minor_version_matcher.match(paragraph.text)
if not m:
prev = paragraph
continue
try:
major_version = int(prev.text.strip())
except ValueError:
prev = paragraph
continue
if major_version > 7:
# CentOS >=8 needs DNF and we only support YUM right now
continue
version = '%s.%s.%s' % (major_version, m.group(1), m.group(2))
candidates.append(version)
return sorted(candidates)[-1]
def _download_release_public_key(self):
self._messenger.info('Downloading related GnuPG public key...')
release_major = self._releasever.split('.')[0]
if int(release_major) > 7:
rel_gpg_public_key_filename = 'RPM-GPG-KEY-CentOS-Official'
else:
rel_gpg_public_key_filename = 'RPM-GPG-KEY-CentOS-%s' % release_major
abs_gpg_public_key_filename = os.path.join(self._abs_cache_dir, rel_gpg_public_key_filename)
self.download_url_to_file(
'https://www.centos.org/keys/%s' % rel_gpg_public_key_filename,
abs_gpg_public_key_filename)
return abs_gpg_public_key_filename
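# Illustrative sketch of the helpers above (the path is a placeholder):
#
#   >>> _abs_filename_to_url('/tmp/RPM-GPG-KEY-CentOS-7')
#   'file:///tmp/RPM-GPG-KEY-CentOS-7'
#
# The wiki scraper pairs a paragraph holding the major version (e.g. "7")
# with a following paragraph shaped like " 4 (1708) ", yielding "7.4.1708".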
|
PypiClean
|
/jdcloud_sdk-1.6.243.tar.gz/jdcloud_sdk-1.6.243/jdcloud_sdk/services/apigateway/apis/CreateBackendConfigRequest.py
|
# Copyright 2018 JDCLOUD.COM
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# NOTE: This class is auto generated by the jdcloud code generator program.
from jdcloud_sdk.core.jdcloudrequest import JDCloudRequest
class CreateBackendConfigRequest(JDCloudRequest):
"""
Create a backend configuration
"""
def __init__(self, parameters, header=None, version="v1"):
super(CreateBackendConfigRequest, self).__init__(
'/regions/{regionId}/apiGroups/{apiGroupId}/backendConfig', 'POST', header, version)
self.parameters = parameters
class CreateBackendConfigParameters(object):
def __init__(self, regionId, apiGroupId, environment, backendServiceType, sort, ):
"""
:param regionId: Region ID
:param apiGroupId: API group ID
:param environment: Environment: test, preview, online
:param backendServiceType: Backend service type: mock, HTTP/HTTPS
:param sort: Sort order; a value of 0 marks the default backend configuration
"""
self.regionId = regionId
self.apiGroupId = apiGroupId
self.backendConfigId = None
self.baseGroupId = None
self.environment = environment
self.backendUrl = None
self.backendServiceType = backendServiceType
self.headerParams = None
self.queryParams = None
self.description = None
self.createTime = None
self.sort = sort
self.userSort = None
self.jdsfId = None
self.jdsfParam = None
self.jdsfRegion = None
self.jdsfPin = None
def setBackendConfigId(self, backendConfigId):
"""
:param backendConfigId: (Optional) API ID
"""
self.backendConfigId = backendConfigId
def setBaseGroupId(self, baseGroupId):
"""
:param baseGroupId: (Optional) Group ID
"""
self.baseGroupId = baseGroupId
def setBackendUrl(self, backendUrl):
"""
:param backendUrl: (Optional) Backend address
"""
self.backendUrl = backendUrl
def setHeaderParams(self, headerParams):
"""
:param headerParams: (Optional) List of header parameters
"""
self.headerParams = headerParams
def setQueryParams(self, queryParams):
"""
:param queryParams: (Optional) List of query parameters
"""
self.queryParams = queryParams
def setDescription(self, description):
"""
:param description: (Optional) Description
"""
self.description = description
def setCreateTime(self, createTime):
"""
:param createTime: (Optional) Release time, as a millisecond timestamp
"""
self.createTime = createTime
def setUserSort(self, userSort):
"""
:param userSort: (Optional) Sort order, used for display purposes
"""
self.userSort = userSort
def setJdsfId(self, jdsfId):
"""
:param jdsfId: (Optional) VPC gateway ID
"""
self.jdsfId = jdsfId
def setJdsfParam(self, jdsfParam):
"""
:param jdsfParam: (Optional) VPC backend address
"""
self.jdsfParam = jdsfParam
def setJdsfRegion(self, jdsfRegion):
"""
:param jdsfRegion: (Optional) Region the VPC gateway belongs to
"""
self.jdsfRegion = jdsfRegion
def setJdsfPin(self, jdsfPin):
"""
:param jdsfPin: (Optional) Pin of the VPC gateway creator
"""
self.jdsfPin = jdsfPin
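# Illustrative usage sketch (all IDs and values are placeholders):
#
#   params = CreateBackendConfigParameters('cn-north-1', 'group-xxxxxx',
#                                          'test', 'mock', 0)
#   params.setDescription('default mock backend')
#   request = CreateBackendConfigRequest(params)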
|
PypiClean
|
/mvg_console-2021.2.12-py3-none-any.whl/mvg_console/main.py
|
import mvg_api as mvg
from .departure import Departure
from colr import color
from prettytable import PrettyTable
from typer import Typer
MVG_BG = "#2a4779"
MVG_FG = "#ffffff"
######
# Types of Transports
# 1. UBAHN
# 2. BUS
# 3. REGIONAL_BUS
# 4. TRAM
# 5. SBAHN
# 6. NACHT_BUS
#######
app = Typer()
__package_name__ = "mvg_console"
__version__ = "2021.02.12"
__description__ = "A Command Line Tool to get the MVG departures for a station."
def display_title_bar():
""" Print a title bar. """
color_it_mvg = lambda x: color(x, fore=MVG_FG, back=MVG_BG)
bar_mvg_colored = color_it_mvg("*" * 48)
fifteen_stars = "*" * 15
print(bar_mvg_colored)
print(color_it_mvg(fifteen_stars + " MVG - Departures " + fifteen_stars))
print(bar_mvg_colored + "\n")
def display_departures(station_name, limit=10, mode=None):
station_id = mvg.get_id_for_station(station_name)
assert station_id is not None, f"Station {station_name} not found!"
departuresJSON = mvg.get_departures(station_id)
departures = []
if mode is not None:
for d in departuresJSON:
if mode.upper() in d['product']:
departures += [Departure(d)]
else:
departures = [ Departure(i) for i in departuresJSON ]
departures = departures[:limit]
print('\nStation: '+station_name+'\n')
table = PrettyTable(['Line', 'Destination', 'Departure (min)'])
rows = []
for dep in departures:
rows.append( [dep.get_label_colored(), dep.destination, dep.departure_time_minutes] )
table.add_rows(rows)
print(table)
def get_nearest_stations(address):
location = mvg.get_locations(address)
assert len(location) > 0, f"Location: {address} not found!"
lat = location[0]['latitude']
lng = location[0]['longitude']
stations_json = mvg.get_nearby_stations(lat,lng)[:5]
print('Nearest Stations to '+address+' :')
print(''.join([str(idx+1)+". "+station['name']+": "+', '.join(station['products'])+'\n' for idx,station in enumerate(stations_json)]))
return
@app.command("dest", help="Prints the departures from the station.")
def departures(station: str, limit: int=10, mode=None):
display_departures(station_name=station, limit=limit, mode=mode)
@app.command("search", help="Displays the nearest stations to the search query.")
def search(query: str):
get_nearest_stations(query)
@app.command("version", help="Displays version info.")
def version():
print(__package_name__)
print(__description__)
print(f"Version: {__version__}")
if __name__ == "__main__":
app()
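# Illustrative CLI usage (station names are examples; the exact console
# entry-point name is an assumption -- running the module directly also
# works via the __main__ guard above):
#
#   python -m mvg_console.main dest "Hauptbahnhof" --limit 5 --mode ubahn
#   python -m mvg_console.main search "Marienplatz"
#   python -m mvg_console.main version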
|
PypiClean
|
/automodel-server-0.1.16.tar.gz/automodel-server-0.1.16/modellerstep/Malign.py
|
from Modeller_Caller import modeller_caller
import os
class Malign(object):
"""docstring for Malign"""
def __init__(self, path):
self.path = path
self.myscript = ""
def create_script_in_folder(self, template, best_sequence, loop_sequence):
script = """from modeller import *
import os
os.chdir('""" + self.path + """')
log.verbose()
env = environ()
env.io.atom_files_directory = './:../atom_files/'
aln = alignment(env)
for (code, chain) in (('""" + template + """', 'A'), ('""" + best_sequence + """', ''), ('""" + loop_sequence + """', '')):
mdl = model(env, file=code, model_segment=('FIRST:'+chain, 'LAST:'+chain))
aln.append_model(mdl, atom_files=code, align_codes=code+chain)
for (weights, write_fit, whole) in (((1., 0., 0., 0., 1., 0.), False, True),
((1., 0.5, 1., 1., 1., 0.), False, True),
((1., 1., 1., 1., 1., 0.), True, False)):
aln.salign(rms_cutoff=3.5, normalize_pp_scores=False,
rr_file='$(LIB)/as1.sim.mat', overhang=30,
gap_penalties_1d=(-450, -50),
gap_penalties_3d=(0, 3), gap_gap_score=0, gap_residue_score=0,
dendrogram_file='ali.tree',
alignment_type='tree', # If 'progressive', the tree is not
# computed and all structures will be
# aligned sequentially to the first
feature_weights=weights, # For a multiple sequence alignment only
# the first feature needs to be non-zero
improve_alignment=True, fit=True, write_fit=write_fit,
write_whole_pdb=whole, output='ALIGNMENT QUALITY')
#aln.write(file='ali.pap', alignment_format='PAP')
aln.write(file='ali.ali', alignment_format='PIR')
aln.salign(rms_cutoff=1.0, normalize_pp_scores=False,
rr_file='$(LIB)/as1.sim.mat', overhang=30,
gap_penalties_1d=(-450, -50), gap_penalties_3d=(0, 3),
gap_gap_score=0, gap_residue_score=0, dendrogram_file='1is3A.tree',
alignment_type='progressive', feature_weights=[0]*6,
improve_alignment=False, fit=False, write_fit=True,
write_whole_pdb=False, output='QUALITY')
"""
script_path = self.path + os.sep + "salign.py"
loop_script = open(script_path, "w")
loop_script.write(script)
loop_script.close()
self.myscript = script_path
def get_model(self):
processo = modeller_caller()
processo.run(self.myscript)
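# Illustrative usage sketch (the path and sequence codes are placeholders):
#
#   malign = Malign('/tmp/modelling_job')
#   malign.create_script_in_folder('1abcA', 'best_seq', 'loop_seq')
#   malign.get_model()  # runs the generated salign.py via modeller_caller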
|
PypiClean
|
/galileo-db-0.10.4.tar.gz/galileo-db-0.10.4/galileodb/cli/recorder.py
|
import argparse
import logging
import os
import signal
import time
import redis
from galileodb.factory import create_experiment_database_from_env
from galileodb.model import Experiment, generate_experiment_id
from galileodb.recorder import Recorder
logger = logging.getLogger(__name__)
def create_experiment(args):
experiment_id = generate_experiment_id()
if args.name:
name = args.name
else:
name = experiment_id
if args.creator:
creator = args.creator
else:
creator = 'galileodb-recorder-' + str(os.getpid())
now = time.time()
return Experiment(experiment_id, name=name, creator=creator, start=now, created=now, status='RUNNING')
def create_redis():
host = os.getenv('galileo_redis_host', 'localhost')
password = os.getenv('galileo_redis_password', None)
port = int(os.getenv('galileo_redis_port', 6379))
logger.info('connecting to redis event bus on %s:%d', host, port)
return redis.Redis(host=host, port=port, password=password, decode_responses=True)
def run(args):
signal.signal(signal.SIGTERM, handle_sigterm)
# connect to redis eventbus
rds = create_redis()
# connect to experiment database
exp_db = create_experiment_database_from_env()
exp_db.open()
# create and save the experiment
exp = create_experiment(args)
exp_db.save_experiment(exp)
# main control loop
recorder = Recorder(rds, exp_db, exp.id)
try:
logger.info('starting experiment recorder for exp %s', exp.id)
recorder.start()
logger.debug('storing node info keys')
recorder.telemetry_recorder.save_nodeinfos()
recorder.join()
except KeyboardInterrupt:
logger.debug('interrupt received')
pass
finally:
exp_db.finalize_experiment(exp, 'FINISHED')
logger.info('shutting down experiment recorder')
if recorder:
try:
recorder.stop(5)
except Exception:
pass
exp_db.close()
logger.info('experiment %s exiting', exp.id)
def main():
parser = argparse.ArgumentParser()
parser.add_argument('--name', required=False, help='set name of experiment', default='')
parser.add_argument('--creator', required=False, help='set name of creator', default='')
args = parser.parse_args()
logging.basicConfig(level=logging._nameToLevel[os.getenv('galileo_log_level', 'INFO')])
run(args)
def handle_sigterm(signal_number, _stack_frame):
logger.debug('received signal %s', signal_number)
raise KeyboardInterrupt
if __name__ == '__main__':
main()
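# Illustrative invocation sketch (values are examples; the env vars shown
# are the ones read in create_redis() and main() above):
#
#   galileo_redis_host=localhost galileo_log_level=DEBUG \
#   python -m galileodb.cli.recorder --name exp1 --creator alice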
|
PypiClean
|
/osmosis_protobuf-0.3.1.tar.gz/osmosis_protobuf-0.3.1/src/osmosis_protobuf/osmosis/valset_pref/v1beta1/query_pb2.py
|
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
_sym_db = _symbol_database.Default()
from ....gogoproto import gogo_pb2 as gogoproto_dot_gogo__pb2
from ....google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
from ....osmosis.valset_pref.v1beta1 import state_pb2 as osmosis_dot_valset__pref_dot_v1beta1_dot_state__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\'osmosis/valset_pref/v1beta1/query.proto\x12\x1aosmosis.valsetpref.v1beta1\x1a\x14gogoproto/gogo.proto\x1a\x1cgoogle/api/annotations.proto\x1a\'osmosis/valset_pref/v1beta1/state.proto"2\n\x1fUserValidatorPreferencesRequest\x12\x0f\n\x07address\x18\x01 \x01(\t"n\n UserValidatorPreferencesResponse\x12J\n\x0bpreferences\x18\x01 \x03(\x0b2/.osmosis.valsetpref.v1beta1.ValidatorPreferenceB\x04\xc8\xde\x1f\x002\xcf\x01\n\x05Query\x12\xc5\x01\n\x18UserValidatorPreferences\x12;.osmosis.valsetpref.v1beta1.UserValidatorPreferencesRequest\x1a<.osmosis.valsetpref.v1beta1.UserValidatorPreferencesResponse".\x82\xd3\xe4\x93\x02(\x12&/osmosis/valset_pref/v1beta1/{address}BIZCgithub.com/osmosis-labs/osmosis/v16/x/valset_pref/client/queryproto\xc8\xe1\x1e\x00b\x06proto3')
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'osmosis.valset_pref.v1beta1.query_pb2', _globals)
if _descriptor._USE_C_DESCRIPTORS == False:
DESCRIPTOR._options = None
DESCRIPTOR._serialized_options = b'ZCgithub.com/osmosis-labs/osmosis/v16/x/valset_pref/client/queryproto\xc8\xe1\x1e\x00'
_USERVALIDATORPREFERENCESRESPONSE.fields_by_name['preferences']._options = None
_USERVALIDATORPREFERENCESRESPONSE.fields_by_name['preferences']._serialized_options = b'\xc8\xde\x1f\x00'
_QUERY.methods_by_name['UserValidatorPreferences']._options = None
_QUERY.methods_by_name['UserValidatorPreferences']._serialized_options = b'\x82\xd3\xe4\x93\x02(\x12&/osmosis/valset_pref/v1beta1/{address}'
_globals['_USERVALIDATORPREFERENCESREQUEST']._serialized_start = 164
_globals['_USERVALIDATORPREFERENCESREQUEST']._serialized_end = 214
_globals['_USERVALIDATORPREFERENCESRESPONSE']._serialized_start = 216
_globals['_USERVALIDATORPREFERENCESRESPONSE']._serialized_end = 326
_globals['_QUERY']._serialized_start = 329
_globals['_QUERY']._serialized_end = 536
|
PypiClean
|
/matrix_sydent-2.5.6-py3-none-any.whl/sydent/sydent.py
|
import gc
import logging
import logging.handlers
import os
import sqlite3
from typing import Optional
import attr
import prometheus_client
import twisted.internet.reactor
from matrix_common.versionstring import get_distribution_version_string
from signedjson.types import SigningKey
from twisted.internet import address, task
from twisted.internet.interfaces import (
IReactorCore,
IReactorPluggableNameResolver,
IReactorSSL,
IReactorTCP,
IReactorTime,
)
from twisted.python import log
from twisted.web.http import Request
from zope.interface import Interface
from sydent.config import SydentConfig
from sydent.db.hashing_metadata import HashingMetadataStore
from sydent.db.sqlitedb import SqliteDatabase
from sydent.db.valsession import ThreePidValSessionStore
from sydent.hs_federation.verifier import Verifier
from sydent.http.httpcommon import SslComponents
from sydent.http.httpsclient import ReplicationHttpsClient
from sydent.http.httpserver import (
ClientApiHttpServer,
InternalApiHttpServer,
ReplicationHttpsServer,
)
from sydent.replication.pusher import Pusher
from sydent.threepid.bind import ThreepidBinder
from sydent.util.hash import sha256_and_url_safe_base64
from sydent.util.ratelimiter import Ratelimiter
from sydent.util.tokenutils import generateAlphanumericTokenOfLength
from sydent.validators.emailvalidator import EmailValidator
from sydent.validators.msisdnvalidator import MsisdnValidator
logger = logging.getLogger(__name__)
class SydentReactor(
IReactorCore,
IReactorTCP,
IReactorSSL,
IReactorTime,
IReactorPluggableNameResolver,
Interface,
):
pass
class Sydent:
def __init__(
self,
sydent_config: SydentConfig,
reactor: SydentReactor = twisted.internet.reactor, # type: ignore[assignment]
use_tls_for_federation: bool = True,
):
self.config = sydent_config
self.reactor = reactor
self.use_tls_for_federation = use_tls_for_federation
logger.info("Starting Sydent server")
self.db: sqlite3.Connection = SqliteDatabase(self).db
if self.config.general.sentry_enabled:
import sentry_sdk
sentry_sdk.init(
dsn=self.config.general.sentry_dsn,
release=get_distribution_version_string("matrix-sydent"),
)
with sentry_sdk.configure_scope() as scope:
scope.set_tag("sydent_server_name", self.config.general.server_name)
# workaround for https://github.com/getsentry/sentry-python/issues/803: we
# disable automatic GC and run it periodically instead.
gc.disable()
cb = task.LoopingCall(run_gc)
cb.clock = self.reactor
cb.start(1.0)
# See if a pepper already exists in the database
# Note: This MUST be run before we start serving requests, otherwise lookups for
# 3PID hashes may come in before we've completed generating them
hashing_metadata_store = HashingMetadataStore(self)
lookup_pepper = hashing_metadata_store.get_lookup_pepper()
if not lookup_pepper:
# No pepper defined in the database, generate one
lookup_pepper = generateAlphanumericTokenOfLength(5)
# Store it in the database and rehash 3PIDs
hashing_metadata_store.store_lookup_pepper(
sha256_and_url_safe_base64, lookup_pepper
)
self.validators: Validators = Validators(
EmailValidator(self), MsisdnValidator(self)
)
self.keyring: Keyring = Keyring(self.config.crypto.signing_key)
self.keyring.ed25519.alg = "ed25519"
self.sig_verifier: Verifier = Verifier(self)
self.threepidBinder: ThreepidBinder = ThreepidBinder(self)
self.sslComponents: SslComponents = SslComponents(self)
self.clientApiHttpServer = ClientApiHttpServer(self, lookup_pepper)
self.replicationHttpsServer = ReplicationHttpsServer(self)
self.replicationHttpsClient: ReplicationHttpsClient = ReplicationHttpsClient(
self
)
self.pusher: Pusher = Pusher(self)
self.email_sender_ratelimiter: Ratelimiter[str] = Ratelimiter(
self.reactor,
burst=self.config.email.email_sender_ratelimit_burst,
rate_hz=self.config.email.email_sender_ratelimit_rate_hz,
)
def run(self) -> None:
self.clientApiHttpServer.setup()
self.replicationHttpsServer.setup()
self.pusher.setup()
self.maybe_start_prometheus_server()
# A dedicated validation session store just to clean up old sessions every N minutes
self.cleanupValSession = ThreePidValSessionStore(self)
cb = task.LoopingCall(self.cleanupValSession.deleteOldSessions)
cb.clock = self.reactor
cb.start(10 * 60.0)
if self.config.http.internal_port is not None:
internalport = self.config.http.internal_port
interface = self.config.http.internal_bind_address
self.internalApiHttpServer = InternalApiHttpServer(self)
self.internalApiHttpServer.setup(interface, internalport)
if self.config.general.pidfile:
with open(self.config.general.pidfile, "w") as pidfile:
pidfile.write(str(os.getpid()) + "\n")
self.reactor.run()
def maybe_start_prometheus_server(self) -> None:
if self.config.general.prometheus_enabled:
assert self.config.general.prometheus_addr is not None
assert self.config.general.prometheus_port is not None
prometheus_client.start_http_server(
port=self.config.general.prometheus_port,
addr=self.config.general.prometheus_addr,
)
def ip_from_request(self, request: Request) -> Optional[str]:
if self.config.http.obey_x_forwarded_for and request.requestHeaders.hasHeader(
"X-Forwarded-For"
):
# Type safety: hasHeaders returning True means that getRawHeaders
# returns a nonempty list
return request.requestHeaders.getRawHeaders("X-Forwarded-For")[0] # type: ignore[index]
client = request.getClientAddress()
if isinstance(client, (address.IPv4Address, address.IPv6Address)):
return client.host
else:
return None
def brand_from_request(self, request: Request) -> Optional[str]:
"""
If the brand GET parameter is passed, returns that as a string, otherwise returns None.
:param request: The incoming request.
:return: The brand to use or None if no hint is found.
"""
if b"brand" in request.args:
return request.args[b"brand"][0].decode("utf-8")
return None
def get_branded_template(
self,
brand: Optional[str],
template_name: str,
) -> str:
"""
Calculate a branded template filename to use.
Attempt to use the hinted brand from the request if the brand
is valid. Otherwise, fall back to the default brand.
:param brand: The hint of which brand to use.
:type brand: str or None
:param template_name: The name of the template file to load.
:type template_name: str
:return: The template filename to use.
:rtype: str
"""
# If a brand hint is provided, attempt to use it if it is valid.
if brand:
if brand not in self.config.general.valid_brands:
brand = None
# If the brand hint is not valid or not provided, fall back to the default brand.
if not brand:
brand = self.config.general.default_brand
root_template_path = self.config.general.templates_path
# Grab jinja template if it exists
if os.path.exists(
os.path.join(root_template_path, brand, template_name + ".j2")
):
return os.path.join(brand, template_name + ".j2")
else:
return os.path.join(root_template_path, brand, template_name)
@attr.s(frozen=True, slots=True, auto_attribs=True)
class Validators:
email: EmailValidator
msisdn: MsisdnValidator
@attr.s(frozen=True, slots=True, auto_attribs=True)
class Keyring:
ed25519: SigningKey
def get_config_file_path() -> str:
return os.environ.get("SYDENT_CONF", "sydent.conf")
def run_gc() -> None:
threshold = gc.get_threshold()
counts = gc.get_count()
for i in reversed(range(len(threshold))):
if threshold[i] < counts[i]:
gc.collect(i)
def setup_logging(config: SydentConfig) -> None:
"""
Setup logging using the options specified in the config
:param config: the configuration to use
"""
log_path = config.general.log_path
log_level = config.general.log_level
log_format = "%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s"
formatter = logging.Formatter(log_format)
handler: logging.Handler
if log_path != "":
handler = logging.handlers.TimedRotatingFileHandler(
log_path, when="midnight", backupCount=365
)
handler.setFormatter(formatter)
else:
handler = logging.StreamHandler()
handler.setFormatter(formatter)
rootLogger = logging.getLogger("")
rootLogger.setLevel(log_level)
rootLogger.addHandler(handler)
observer = log.PythonLoggingObserver()
observer.start()
def main() -> None:
sydent_config = SydentConfig()
sydent_config.parse_config_file(get_config_file_path())
setup_logging(sydent_config)
syd = Sydent(sydent_config)
syd.run()
if __name__ == "__main__":
main()
|
PypiClean
|
/chromatinhd-0.0.21.tar.gz/chromatinhd-0.0.21/docs/source/quickstart/2_pred.py
|
# %% [markdown]
# # ChromatinHD-*pred*
# %% tags=["hide_code", "hide_output"]
# autoreload
import IPython
if IPython.get_ipython() is not None:
IPython.get_ipython().run_line_magic("load_ext", "autoreload")
IPython.get_ipython().run_line_magic("autoreload", "2")
# %% tags=["hide_output"]
import chromatinhd as chd
import matplotlib.pyplot as plt
# %% [markdown]
# ChromatinHD-<i>pred</i> uses accessibility fragments to predict gene expression. As such, it can detect features that are predictive of gene expression, such as broad or narrow positioning of fragments, or fragment sizes.
# %% [markdown]
# We first load in all the input data which was created in the [data preparation tutorial](../1_data).
# %%
import pathlib
dataset_folder = pathlib.Path("example")
fragments = chd.data.Fragments(dataset_folder / "fragments")
transcriptome = chd.data.Transcriptome(dataset_folder / "transcriptome")
folds = chd.data.folds.Folds(dataset_folder / "folds" / "5x1")
# %% [markdown]
# ## Train the models
# %% [markdown]
# The basic ChromatinHD-*pred* model
# %%
models = chd.models.pred.model.additive.Models(dataset_folder / "models" / "additive", reset=True)
# %% tags=["hide_output"]
models.train_models(fragments, transcriptome, folds, device="cuda")
# %% [markdown]
# ## Some quality checks
# %% [markdown]
# We will first check whether the model learned something, by comparing the predictive performance with a baseline
# %%
gene_cors = models.get_gene_cors(fragments, transcriptome, folds, device="cuda")
gene_cors["symbol"] = gene_cors.index.map(transcriptome.symbol)
# %%
gene_cors.sort_values("deltacor", ascending=False).head(10)
# %%
import pandas as pd
import matplotlib.pyplot as plt
fig, ax = plt.subplots(figsize=(4, 4))
for name, group in gene_cors.iterrows():
ax.plot([0, 1], group[["cor_n_fragments", "cor_predicted"]], color="#3338", zorder=0, marker="o", markersize=2)
ax.boxplot(
gene_cors[["cor_n_fragments", "cor_predicted"]].values,
positions=[0, 1],
widths=0.1,
showfliers=False,
showmeans=True,
meanline=True,
meanprops={"color": "red", "linewidth": 2},
)
ax.set_xticks([0, 1])
ax.set_xticklabels(["# fragments", "ChromatinHD-pred"])
ax.set_ylabel("$cor$")
# %% [markdown]
# Note that every gene gains from the ChromatinHD model, even if some only gain a little. The genes with a low $\Delta cor$ are often those with only a few fragments:
# %%
fig, ax = plt.subplots(figsize=(4, 4))
ax.scatter(gene_cors["n_fragments"], gene_cors["deltacor"])
ax.set_ylabel("$\\Delta$ cor")
ax.set_xlabel("# fragments")
ax.set_xscale("log")
# %% [markdown]
# ## Predictivity per position
# %% [markdown]
# To determine which regions were important for the model to predict gene expression, we will censor fragments from windows of various sizes, and then check whether the model performance on a set of test cells decreased. This functionality is implemented in the `GeneMultiWindow` class. To speed up interpretation, we run the censoring only for a subset of genes.
# %%
censorer = chd.models.pred.interpret.MultiWindowCensorer(fragments.regions.window)
genemultiwindow = chd.models.pred.interpret.GeneMultiWindow(models.path / "interpret" / "genemultiwindow")
# %%
genemultiwindow.score(
fragments,
transcriptome,
models,
folds,
transcriptome.gene_id(
[
"CCL4",
"IL1B",
"EBF1",
"PAX5",
"CD79A",
"RHEX",
]
),
censorer=censorer,
)
# %%
genemultiwindow.interpolate()
# %%
symbol = "EBF1"
fig = chd.grid.Figure(chd.grid.Grid(padding_height=0.05))
width = 10
region = fragments.regions.coordinates.loc[transcriptome.gene_id(symbol)]
panel_genes = chd.plot.genome.genes.Genes.from_region(region, width=width)
fig.main.add_under(panel_genes)
panel_pileup = chd.models.pred.plot.Pileup.from_genemultiwindow(
genemultiwindow, transcriptome.gene_id(symbol), width=width
)
fig.main.add_under(panel_pileup)
panel_predictivity = chd.models.pred.plot.Predictivity.from_genemultiwindow(
genemultiwindow, transcriptome.gene_id(symbol), width=width
)
fig.main.add_under(panel_predictivity)
fig.plot()
# %% [markdown]
# ## Co-predictivity per position
# %% [markdown]
# In a similar fashion we can determine the co-predictivity per position.
# %%
censorer = chd.models.pred.interpret.WindowCensorer(fragments.regions.window)
genepairwindow = chd.models.pred.interpret.GenePairWindow(models.path / "interpret" / "genepairwindow", reset=True)
genepairwindow.score(fragments, transcriptome, models, folds, censorer=censorer, genes=transcriptome.gene_id(["CCL4"]))
# %%
symbol = "CCL4"
fig = chd.grid.Figure(chd.grid.Grid(padding_height=0.05))
width = 10
# genes
region = fragments.regions.coordinates.loc[transcriptome.gene_id(symbol)]
panel_genes = chd.plot.genome.genes.Genes.from_region(region, width=width)
fig.main.add_under(panel_genes)
# pileup
panel_pileup = chd.models.pred.plot.Pileup.from_genemultiwindow(
genemultiwindow, transcriptome.gene_id(symbol), width=width
)
fig.main.add_under(panel_pileup)
# predictivity
panel_predictivity = chd.models.pred.plot.Predictivity.from_genemultiwindow(
genemultiwindow, transcriptome.gene_id(symbol), width=width
)
fig.main.add_under(panel_predictivity)
# copredictivity
panel_copredictivity = chd.models.pred.plot.Copredictivity.from_genepairwindow(
genepairwindow, transcriptome.gene_id(symbol), width=width
)
fig.main.add_under(panel_copredictivity)
fig.plot()
# %%
|
PypiClean
|
/monk_cuda92-0.0.1-py3-none-any.whl/monk/gluon/finetune/level_13_updates_main.py
|
from monk.gluon.finetune.imports import *
from monk.system.imports import *
from monk.gluon.finetune.level_12_losses_main import prototype_losses
class prototype_updates(prototype_losses):
'''
Main class for all parametric update functions
Args:
verbose (int): Set verbosity levels
0 - Print Nothing
1 - Print desired details
'''
@accepts("self", verbose=int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def __init__(self, verbose=1):
super().__init__(verbose=verbose);
##########################################################################################################################################################
@warning_checks(None, ["gte", 32, "lte", 1024], post_trace=False)
@error_checks(None, ["gt", 0], post_trace=False)
@accepts("self", int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_input_size(self, input_size):
'''
Update input size.
Args:
input_size (int): New input size
Returns:
None
'''
self.system_dict = set_input_size(input_size, self.system_dict);
self.custom_print("Update: Input size - {}".format(self.system_dict["dataset"]["params"]["input_size"]));
self.custom_print("");
@warning_checks(None, ["lte", 128], post_trace=False)
@error_checks(None, ["gt", 0], post_trace=False)
@accepts("self", int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_batch_size(self, batch_size):
'''
Update batch size.
Args:
batch_size (int): New batch size
Returns:
None
'''
self.system_dict = set_batch_size(batch_size, self.system_dict);
self.custom_print("Update: Batch size - {}".format(self.system_dict["dataset"]["params"]["batch_size"]));
self.custom_print("");
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_shuffle_data(self, shuffle):
'''
Update to shuffle data or not.
Args:
shuffle (bool): If True, will shuffle data
Returns:
None
'''
self.system_dict = set_data_shuffle(shuffle, self.system_dict);
self.custom_print("Update: Data shuffle - {}".format(self.system_dict["dataset"]["params"]["train_shuffle"]));
self.custom_print("");
@warning_checks(None, ["lte", psutil.cpu_count()], post_trace=False)
@error_checks(None, ["gt", 0], post_trace=False)
@accepts("self", int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_num_processors(self, num_processors):
'''
Update num processors for data loader.
Args:
num_processors (int): Max CPUs for data sampling
Returns:
None
'''
self.system_dict = set_num_processors(num_processors, self.system_dict);
self.custom_print("Update: Num processors - {}".format(self.system_dict["dataset"]["params"]["num_workers"]));
self.custom_print("");
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_weighted_sampling(self, sample):
'''
Function inactive
'''
self.system_dict = set_weighted_sampling(sample, self.system_dict);
self.custom_print("Update: Weighted Sampling - {}".format(self.system_dict["dataset"]["params"]["weighted_sample"]));
self.custom_print("");
@warning_checks(None, ["gt", 0.5, "lt", 1], post_trace=False)
@error_checks(None, ["gt", 0, "lt", 1], post_trace=False)
@accepts("self", float, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_trainval_split(self, value):
'''
Update training-validation split
Args:
value (float): Fraction of the dataset assigned to training
Division happens as follows:
train - value * 100 percent of the dataset
val - (1 - value) * 100 percent of the dataset
Returns:
None
'''
if(self.system_dict["dataset"]["dataset_type"] == "train"):
dataset_path = self.system_dict["dataset"]["train_path"];
path_to_csv=False;
elif(self.system_dict["dataset"]["dataset_type"] == "train-val"):
dataset_path = [self.system_dict["dataset"]["train_path"], self.system_dict["dataset"]["val_path"]];
path_to_csv=False;
elif(self.system_dict["dataset"]["dataset_type"] == "csv_train"):
dataset_path = self.system_dict["dataset"]["train_path"];
path_to_csv = self.system_dict["dataset"]["csv_train"];
elif(self.system_dict["dataset"]["dataset_type"] == "csv_train-val"):
dataset_path = [self.system_dict["dataset"]["train_path"], self.system_dict["dataset"]["val_path"]];
path_to_csv = [self.system_dict["dataset"]["csv_train"], self.system_dict["dataset"]["csv_val"]];
else:
msg = "Dataset Type invalid.\n";
msg += "Cannot update split";
ConstraintWarning(msg);
return;
self.system_dict = set_dataset_train_path(self.system_dict, dataset_path, value, path_to_csv, self.system_dict["dataset"]["params"]["delimiter"]);
@warning_checks(None, dataset_path=None, split=["gt", 0.5, "lt", 1], path_to_csv=None, delimiter=None, post_trace=False)
@error_checks(None, dataset_path=["folder", 'r'], split=["gt", 0, "lt", 1], path_to_csv=["file", 'r'], delimiter=["in", [",", ";", "-", " "]], post_trace=False)
@accepts("self", dataset_path=[str, list], split=float, path_to_csv=[str, list, bool], delimiter=str, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_dataset(self, dataset_path=False, split=0.9, path_to_csv=False, delimiter=","):
'''
Update dataset path
Args:
dataset_path (str, list): Path to Dataset folder
1) Single string if validation data does not exist
2) List [train_path, val_path] in case of separate train and val data
path_to_csv (str, list): Path to csv file pointing towards images
1) Single string if validation data does not exist
2) List [train_path, val_path] in case of separate train and val data
split (float): Fraction of the dataset assigned to training
Division happens as follows:
train - split * 100 percent of the dataset
val - (1 - split) * 100 percent of the dataset
delimiter (str): Delimiter for csv file
Returns:
None
'''
self.system_dict = set_dataset_train_path(self.system_dict, dataset_path, split, path_to_csv, delimiter);
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", str, force=bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_model_name(self, model_name, force=False):
'''
Update model name
Args:
model_name (str): Select from available models. Check via List_Models() function
force (bool): If True, apply the update without asking for confirmation
Returns:
None
'''
if(not force):
if(self.system_dict["training"]["status"]):
ConstraintWarning("Model trained using {}\n".format(self.system_dict["model"]["params"]["model_name"]));
ConstraintWarning("Changing the model will overwrite previously trained models if training is executed.\n");
inp = input("Do you wish to continue further (y/n):");
if(inp == "y"):
self.system_dict = set_model_name(model_name, self.system_dict);
self.custom_print("Update: Model name - {}".format(self.system_dict["model"]["params"]["model_name"]));
self.custom_print("");
else:
self.custom_print("Model not updated.");
self.custom_print("");
else:
self.system_dict = set_model_name(model_name, self.system_dict);
self.custom_print("Update: Model name - {}".format(self.system_dict["model"]["params"]["model_name"]));
self.custom_print("");
else:
self.system_dict = set_model_name(model_name, self.system_dict);
self.custom_print("Update: Model name - {}".format(self.system_dict["model"]["params"]["model_name"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", [str, list], force=bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_model_path(self, model_path, force=False):
'''
Update model path for inferencing
Args:
model_path (str): Path to model weights.
force (bool): If True, apply the update without asking for confirmation
Returns:
None
'''
if(not force):
if(self.system_dict["training"]["status"]):
ConstraintWarning("Model trained using {}\n".format(self.system_dict["model"]["params"]["model_name"]));
ConstraintWarning("Changing the model will overwrite previously trained models if training is executed.\n");
inp = input("Do you wish to continue further (y/n):");
if(inp == "y"):
self.system_dict = set_model_path(model_path, self.system_dict);
self.custom_print("Update: Model path - {}".format(self.system_dict["model"]["params"]["model_path"]));
self.custom_print("");
else:
self.custom_print("Model not updated.");
self.custom_print("");
else:
self.system_dict = set_model_path(model_path, self.system_dict);
self.custom_print("Update: Model path - {}".format(self.system_dict["model"]["params"]["model_path"]));
self.custom_print("");
else:
self.system_dict = set_model_path(model_path, self.system_dict);
self.custom_print("Update: Model path - {}".format(self.system_dict["model"]["params"]["model_path"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_use_gpu(self, gpu):
'''
Update to use gpu or cpu
Args:
gpu (bool): If True, then use GPU
Returns:
None
'''
self.system_dict = set_device(gpu, self.system_dict);
self.custom_print("Update: Use Gpu - {}".format(self.system_dict["model"]["params"]["use_gpu"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_use_pretrained(self, pretrained):
'''
Update to use pretrained wights or randomly initialized weights
Args:
pretrained (bool): If True, use pretrained weights
else, use randomly initialized weights
Returns:
None
'''
self.system_dict = set_pretrained(pretrained, self.system_dict);
self.custom_print("Update: Use pretrained - {}".format(self.system_dict["model"]["params"]["use_pretrained"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_freeze_base_network(self, freeze):
'''
Update whether freeze base network or not
Args:
freeze (bool): If True, then base network is non-trainable, works as a feature extractor
Returns:
None
'''
self.system_dict = set_freeze_base_network(freeze, self.system_dict);
self.custom_print("Update: Freeze Base Network - {}".format(self.system_dict["model"]["params"]["freeze_base_network"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@error_checks(None, ["gte", 0], post_trace=False)
@accepts("self", int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_freeze_layers(self, num_freeze):
'''
Update to freeze certain layers in the network
Args:
num_freeze (int): Number of layers to freeze in network starting from top
Returns:
None
'''
self.system_dict["model"]["params"]["num_freeze"] = num_freeze;
self.custom_print("Update: Freeze layers - {}".format(self.system_dict["model"]["params"]["num_freeze"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@warning_checks(None, ["lt", 100], post_trace=False)
@error_checks(None, ["gt", 0], post_trace=False)
@accepts("self", int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_num_epochs(self, num_epochs):
'''
Update number of epochs to train the network
Args:
num_epochs (int): New number of epochs
Returns:
None
'''
self.system_dict = set_num_epochs(num_epochs, self.system_dict);
self.custom_print("Update: Num Epochs - {}".format(self.system_dict["hyper-parameters"]["num_epochs"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@warning_checks(None, ["lt", 1], post_trace=False)
@error_checks(None, ["gt", 0], post_trace=False)
@accepts("self", [int, float], post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_learning_rate(self, learning_rate):
'''
Update base learning rate for training
Args:
learning_rate (float): New base learning rate
Returns:
None
'''
self.system_dict["hyper-parameters"]["learning_rate"] = learning_rate;
self.system_dict["hyper-parameters"]["optimizer"]["params"]["lr"] = learning_rate;
self.custom_print("Update: Learning Rate - {}".format(self.system_dict["hyper-parameters"]["learning_rate"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_display_progress_realtime(self, value):
'''
Update display progress param
Args:
value (bool): If True, then real time progress is displayed
Returns:
None
'''
self.system_dict = set_display_progress_realtime(value, self.system_dict);
self.custom_print("Update: Display progress realtime - {}".format(self.system_dict["training"]["settings"]["display_progress_realtime"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_display_progress(self, value):
'''
Update display progress param
Args:
value (bool): If True, then per epoch progress is displayed
Returns:
None
'''
self.system_dict = set_display_progress(value, self.system_dict);
self.custom_print("Update: Display progress - {}".format(self.system_dict["training"]["settings"]["display_progress"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@error_checks(None, None, prefix=["name", ["A-Z", "a-z", "0-9", "-", "_"]], post_trace=False)
@accepts("self", bool, prefix=str, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_save_intermediate_models(self, value, prefix="intermediate_model_"):
'''
Update whether to save intermediate models or not
Args:
value (bool): If True, saves model weight post every epoch
prefix (str): Appends a prefix to intermediate weights
Returns:
None
'''
if(value):
if(not os.access(self.system_dict["model_dir"], os.W_OK)):
msg = "Folder \"{}\" has no write access.\n".format(self.system_dict["model_dir"])
msg += "Cannot save intermediate models";
raise ConstraintError(msg);
self.system_dict = set_save_intermediate_models(value, self.system_dict);
self.system_dict = set_intermediate_model_prefix(prefix, self.system_dict);
self.custom_print("Update: Save Intermediate models - {}".format(self.system_dict["training"]["settings"]["save_intermediate_models"]));
if(self.system_dict["training"]["settings"]["save_intermediate_models"]):
self.custom_print("Update: Intermediate model prefix - {}".format(self.system_dict["training"]["settings"]["intermediate_model_prefix"]));
self.custom_print("");
##########################################################################################################################################################
##########################################################################################################################################################
@accepts("self", bool, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def update_save_training_logs(self, value):
'''
Update whether to save training logs or not
Args:
value (bool): If True, saves all training and validation metrics. Required for comparison.
Returns:
None
'''
self.system_dict = set_save_training_logs(value, self.system_dict);
self.custom_print("Update: Save Training logs - {}".format(self.system_dict["training"]["settings"]["save_training_logs"]));
self.custom_print("");
##########################################################################################################################################################
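# Illustrative usage sketch (assumes "ptf" is a project initialised through
# monk's Prototype workflow, which populates system_dict before any update):
#
#   ptf.update_batch_size(16)
#   ptf.update_num_epochs(10)
#   ptf.update_learning_rate(0.001)
#   ptf.update_freeze_base_network(True)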
|
PypiClean
|
/pulumi_azure_nextgen-0.6.2a1613157620.tar.gz/pulumi_azure_nextgen-0.6.2a1613157620/pulumi_azure_nextgen/domainregistration/v20190801/_inputs.py
|
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union
from ... import _utilities, _tables
from ._enums import *
__all__ = [
'AddressArgs',
'ContactArgs',
'DomainPurchaseConsentArgs',
]
@pulumi.input_type
class AddressArgs:
def __init__(__self__, *,
address1: pulumi.Input[str],
city: pulumi.Input[str],
country: pulumi.Input[str],
postal_code: pulumi.Input[str],
state: pulumi.Input[str],
address2: Optional[pulumi.Input[str]] = None):
"""
Address information for domain registration.
:param pulumi.Input[str] address1: First line of an Address.
:param pulumi.Input[str] city: The city for the address.
:param pulumi.Input[str] country: The country for the address.
:param pulumi.Input[str] postal_code: The postal code for the address.
:param pulumi.Input[str] state: The state or province for the address.
:param pulumi.Input[str] address2: The second line of the Address. Optional.
"""
pulumi.set(__self__, "address1", address1)
pulumi.set(__self__, "city", city)
pulumi.set(__self__, "country", country)
pulumi.set(__self__, "postal_code", postal_code)
pulumi.set(__self__, "state", state)
if address2 is not None:
pulumi.set(__self__, "address2", address2)
@property
@pulumi.getter
def address1(self) -> pulumi.Input[str]:
"""
First line of an Address.
"""
return pulumi.get(self, "address1")
@address1.setter
def address1(self, value: pulumi.Input[str]):
pulumi.set(self, "address1", value)
@property
@pulumi.getter
def city(self) -> pulumi.Input[str]:
"""
The city for the address.
"""
return pulumi.get(self, "city")
@city.setter
def city(self, value: pulumi.Input[str]):
pulumi.set(self, "city", value)
@property
@pulumi.getter
def country(self) -> pulumi.Input[str]:
"""
The country for the address.
"""
return pulumi.get(self, "country")
@country.setter
def country(self, value: pulumi.Input[str]):
pulumi.set(self, "country", value)
@property
@pulumi.getter(name="postalCode")
def postal_code(self) -> pulumi.Input[str]:
"""
The postal code for the address.
"""
return pulumi.get(self, "postal_code")
@postal_code.setter
def postal_code(self, value: pulumi.Input[str]):
pulumi.set(self, "postal_code", value)
@property
@pulumi.getter
def state(self) -> pulumi.Input[str]:
"""
The state or province for the address.
"""
return pulumi.get(self, "state")
@state.setter
def state(self, value: pulumi.Input[str]):
pulumi.set(self, "state", value)
@property
@pulumi.getter
def address2(self) -> Optional[pulumi.Input[str]]:
"""
The second line of the Address. Optional.
"""
return pulumi.get(self, "address2")
@address2.setter
def address2(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address2", value)
@pulumi.input_type
class ContactArgs:
def __init__(__self__, *,
email: pulumi.Input[str],
name_first: pulumi.Input[str],
name_last: pulumi.Input[str],
phone: pulumi.Input[str],
address_mailing: Optional[pulumi.Input['AddressArgs']] = None,
fax: Optional[pulumi.Input[str]] = None,
job_title: Optional[pulumi.Input[str]] = None,
name_middle: Optional[pulumi.Input[str]] = None,
organization: Optional[pulumi.Input[str]] = None):
"""
Contact information for domain registration. If the 'Domain Privacy' option is not selected, the contact information is made publicly available through the Whois
directories, as per ICANN requirements.
:param pulumi.Input[str] email: Email address.
:param pulumi.Input[str] name_first: First name.
:param pulumi.Input[str] name_last: Last name.
:param pulumi.Input[str] phone: Phone number.
:param pulumi.Input['AddressArgs'] address_mailing: Mailing address.
:param pulumi.Input[str] fax: Fax number.
:param pulumi.Input[str] job_title: Job title.
:param pulumi.Input[str] name_middle: Middle name.
:param pulumi.Input[str] organization: Organization contact belongs to.
"""
pulumi.set(__self__, "email", email)
pulumi.set(__self__, "name_first", name_first)
pulumi.set(__self__, "name_last", name_last)
pulumi.set(__self__, "phone", phone)
if address_mailing is not None:
pulumi.set(__self__, "address_mailing", address_mailing)
if fax is not None:
pulumi.set(__self__, "fax", fax)
if job_title is not None:
pulumi.set(__self__, "job_title", job_title)
if name_middle is not None:
pulumi.set(__self__, "name_middle", name_middle)
if organization is not None:
pulumi.set(__self__, "organization", organization)
@property
@pulumi.getter
def email(self) -> pulumi.Input[str]:
"""
Email address.
"""
return pulumi.get(self, "email")
@email.setter
def email(self, value: pulumi.Input[str]):
pulumi.set(self, "email", value)
@property
@pulumi.getter(name="nameFirst")
def name_first(self) -> pulumi.Input[str]:
"""
First name.
"""
return pulumi.get(self, "name_first")
@name_first.setter
def name_first(self, value: pulumi.Input[str]):
pulumi.set(self, "name_first", value)
@property
@pulumi.getter(name="nameLast")
def name_last(self) -> pulumi.Input[str]:
"""
Last name.
"""
return pulumi.get(self, "name_last")
@name_last.setter
def name_last(self, value: pulumi.Input[str]):
pulumi.set(self, "name_last", value)
@property
@pulumi.getter
def phone(self) -> pulumi.Input[str]:
"""
Phone number.
"""
return pulumi.get(self, "phone")
@phone.setter
def phone(self, value: pulumi.Input[str]):
pulumi.set(self, "phone", value)
@property
@pulumi.getter(name="addressMailing")
def address_mailing(self) -> Optional[pulumi.Input['AddressArgs']]:
"""
Mailing address.
"""
return pulumi.get(self, "address_mailing")
@address_mailing.setter
def address_mailing(self, value: Optional[pulumi.Input['AddressArgs']]):
pulumi.set(self, "address_mailing", value)
@property
@pulumi.getter
def fax(self) -> Optional[pulumi.Input[str]]:
"""
Fax number.
"""
return pulumi.get(self, "fax")
@fax.setter
def fax(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "fax", value)
@property
@pulumi.getter(name="jobTitle")
def job_title(self) -> Optional[pulumi.Input[str]]:
"""
Job title.
"""
return pulumi.get(self, "job_title")
@job_title.setter
def job_title(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "job_title", value)
@property
@pulumi.getter(name="nameMiddle")
def name_middle(self) -> Optional[pulumi.Input[str]]:
"""
Middle name.
"""
return pulumi.get(self, "name_middle")
@name_middle.setter
def name_middle(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name_middle", value)
@property
@pulumi.getter
def organization(self) -> Optional[pulumi.Input[str]]:
"""
Organization contact belongs to.
"""
return pulumi.get(self, "organization")
@organization.setter
def organization(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "organization", value)
@pulumi.input_type
class DomainPurchaseConsentArgs:
def __init__(__self__, *,
agreed_at: Optional[pulumi.Input[str]] = None,
agreed_by: Optional[pulumi.Input[str]] = None,
agreement_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
Domain purchase consent object, representing acceptance of applicable legal agreements.
:param pulumi.Input[str] agreed_at: Timestamp when the agreements were accepted.
:param pulumi.Input[str] agreed_by: Client IP address.
:param pulumi.Input[Sequence[pulumi.Input[str]]] agreement_keys: List of applicable legal agreement keys. This list can be retrieved using ListLegalAgreements API under <code>TopLevelDomain</code> resource.
"""
if agreed_at is not None:
pulumi.set(__self__, "agreed_at", agreed_at)
if agreed_by is not None:
pulumi.set(__self__, "agreed_by", agreed_by)
if agreement_keys is not None:
pulumi.set(__self__, "agreement_keys", agreement_keys)
@property
@pulumi.getter(name="agreedAt")
def agreed_at(self) -> Optional[pulumi.Input[str]]:
"""
Timestamp when the agreements were accepted.
"""
return pulumi.get(self, "agreed_at")
@agreed_at.setter
def agreed_at(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "agreed_at", value)
@property
@pulumi.getter(name="agreedBy")
def agreed_by(self) -> Optional[pulumi.Input[str]]:
"""
Client IP address.
"""
return pulumi.get(self, "agreed_by")
@agreed_by.setter
def agreed_by(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "agreed_by", value)
@property
@pulumi.getter(name="agreementKeys")
def agreement_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of applicable legal agreement keys. This list can be retrieved using ListLegalAgreements API under <code>TopLevelDomain</code> resource.
"""
return pulumi.get(self, "agreement_keys")
@agreement_keys.setter
def agreement_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "agreement_keys", value)
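# Illustrative usage sketch (all values are placeholders):
#
#   address = AddressArgs(address1='1 Main St', city='Redmond',
#                         country='US', postal_code='98052', state='WA')
#   contact = ContactArgs(email='[email protected]', name_first='Ada',
#                         name_last='Lovelace', phone='+1.5550100',
#                         address_mailing=address)
#   consent = DomainPurchaseConsentArgs(agreed_by='203.0.113.5',
#                                       agreement_keys=['DNRA'])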
|
PypiClean
|
/Nuitka_winsvc-1.7.10-cp310-cp310-win_amd64.whl/nuitka/utils/CStrings.py
|
import codecs
import re
from nuitka.__past__ import unicode
def _identifierEncode(c):
"""Nuitka handler to encode unicode to ASCII identifiers for C compiler."""
return "$%02x$" % ord(c.object[c.end - 1]), c.end
codecs.register_error("c_identifier", _identifierEncode)
def _encodePythonStringToC(value):
"""Encode a string, so that it gives a C string literal.
This doesn't handle limits.
"""
assert type(value) is bytes, type(value)
result = ""
octal = False
for c in value:
if str is bytes:
cv = ord(c)
else:
cv = c
if c in b'\\\t\r\n"?':
result += r"\%o" % cv
octal = True
elif 32 <= cv <= 127:
if octal and c in b"0123456789":
result += '" "'
result += chr(cv)
octal = False
else:
result += r"\%o" % cv
octal = True
result = result.replace('" "\\', "\\")
return '"%s"' % result
def encodePythonUnicodeToC(value):
"""Encode a string, so that it gives a wide C string literal."""
assert type(value) is unicode, type(value)
result = ""
for c in value:
cv = ord(c)
result += r"\%o" % cv
return 'L"%s"' % result
def encodePythonStringToC(value):
"""Encode bytes, so that it gives a C string literal."""
# Not all compilers allow arbitrary large C strings, therefore split it up
# into chunks. That changes nothing to the meanings, but is easier on the
# parser. Currently only MSVC is known to have this issue, but the
# workaround can be used universally.
result = _encodePythonStringToC(value[:16000])
value = value[16000:]
while value:
result += " "
result += _encodePythonStringToC(value[:16000])
value = value[16000:]
return result
def encodePythonIdentifierToC(value):
"""Encode an identifier from a given Python string."""
# Python identifiers allow almost all characters except a very
# few, many more than C identifiers support. This encoding attempts to
# be bi-directional, so we can reverse it.
def r(match):
c = match.group()
if c == ".":
return "$"
else:
return "$$%d$" % ord(c)
return "".join(re.sub("[^a-zA-Z0-9_]", r, c) for c in value)
|
PypiClean
|
/dscript-0.2.4.tar.gz/dscript-0.2.4/docs/source/api/dscript.commands.rst
|
dscript.commands
================
dscript.commands.predict
------------------------
See `Prediction <../usage.html#prediction>`_ for full usage details.
.. automodule:: dscript.commands.predict
:members:
:undoc-members:
:show-inheritance:
dscript.commands.embed
----------------------
See `Embedding <../usage.html#embedding>`_ for full usage details.
.. automodule:: dscript.commands.embed
:members:
:undoc-members:
:show-inheritance:
dscript.commands.train
----------------------
See `Training <../usage.html#training>`_ for full usage details.
.. automodule:: dscript.commands.train
:members:
:undoc-members:
:show-inheritance:
dscript.commands.evaluate
-------------------------
See `Evaluation <../usage.html#evaluation>`_ for full usage details.
.. automodule:: dscript.commands.evaluate
:members:
:undoc-members:
:show-inheritance:
|
PypiClean
|
/pyelftools-0.29.tar.gz/pyelftools-0.29/elftools/elf/gnuversions.py
|
from ..construct import CString
from ..common.utils import struct_parse, elf_assert
from .sections import Section, Symbol
class Version(object):
""" Version object - representing a version definition or dependency
entry from a "Version Needed" or a "Version Dependency" table section.
This kind of entry contains a pointer to an array of auxiliary entries
that store the information about version names or dependencies.
These entries are not stored in this object and should be accessed
through the appropriate method of a section object which will return
an iterator of VersionAuxiliary objects.
Similarly to Section objects, allows dictionary-like access to
verdef/verneed entry
"""
def __init__(self, entry, name=None):
self.entry = entry
self.name = name
def __getitem__(self, name):
""" Implement dict-like access to entry
"""
return self.entry[name]
class VersionAuxiliary(object):
""" Version Auxiliary object - representing an auxiliary entry of a version
definition or dependency entry
Similarly to Section objects, allows dictionary-like access to the
verdaux/vernaux entry
"""
def __init__(self, entry, name):
self.entry = entry
self.name = name
def __getitem__(self, name):
""" Implement dict-like access to entries
"""
return self.entry[name]
class GNUVersionSection(Section):
""" Common ancestor class for ELF SUNW|GNU Version Needed/Dependency
sections class which contains shareable code
"""
def __init__(self, header, name, elffile, stringtable,
field_prefix, version_struct, version_auxiliaries_struct):
super(GNUVersionSection, self).__init__(header, name, elffile)
self.stringtable = stringtable
self.field_prefix = field_prefix
self.version_struct = version_struct
self.version_auxiliaries_struct = version_auxiliaries_struct
def num_versions(self):
""" Number of version entries in the section
"""
return self['sh_info']
def _field_name(self, name, auxiliary=False):
""" Return the real field's name of version or a version auxiliary
entry
"""
middle = 'a_' if auxiliary else '_'
return self.field_prefix + middle + name
def _iter_version_auxiliaries(self, entry_offset, count):
""" Yield all auxiliary entries of a version entry
"""
name_field = self._field_name('name', auxiliary=True)
next_field = self._field_name('next', auxiliary=True)
for _ in range(count):
entry = struct_parse(
self.version_auxiliaries_struct,
self.stream,
stream_pos=entry_offset)
name = self.stringtable.get_string(entry[name_field])
version_aux = VersionAuxiliary(entry, name)
yield version_aux
entry_offset += entry[next_field]
def iter_versions(self):
""" Yield all the version entries in the section
Each time it returns the main version structure
and an iterator to walk through its auxiliary entries
"""
aux_field = self._field_name('aux')
count_field = self._field_name('cnt')
next_field = self._field_name('next')
entry_offset = self['sh_offset']
for _ in range(self.num_versions()):
entry = struct_parse(
self.version_struct,
self.stream,
stream_pos=entry_offset)
elf_assert(entry[count_field] > 0,
'Expected number of version auxiliary entries (%s) to be > 0 '
'for the following version entry: %s' % (
count_field, str(entry)))
version = Version(entry)
aux_entries_offset = entry_offset + entry[aux_field]
version_auxiliaries_iter = self._iter_version_auxiliaries(
aux_entries_offset, entry[count_field])
yield version, version_auxiliaries_iter
entry_offset += entry[next_field]
class GNUVerNeedSection(GNUVersionSection):
""" ELF SUNW or GNU Version Needed table section.
Has an associated StringTableSection that's passed in the constructor.
"""
def __init__(self, header, name, elffile, stringtable):
super(GNUVerNeedSection, self).__init__(
header, name, elffile, stringtable, 'vn',
elffile.structs.Elf_Verneed, elffile.structs.Elf_Vernaux)
self._has_indexes = None
def has_indexes(self):
""" Return True if at least one version definition entry has an index
that is stored in the vna_other field.
This information is used for symbol versioning
"""
if self._has_indexes is None:
self._has_indexes = False
for _, vernaux_iter in self.iter_versions():
for vernaux in vernaux_iter:
if vernaux['vna_other']:
self._has_indexes = True
break
return self._has_indexes
def iter_versions(self):
for verneed, vernaux in super(GNUVerNeedSection, self).iter_versions():
verneed.name = self.stringtable.get_string(verneed['vn_file'])
yield verneed, vernaux
def get_version(self, index):
""" Get the version information located at index #n in the table
Return both the verneed structure and the vernaux structure
that contains the name of the version
"""
for verneed, vernaux_iter in self.iter_versions():
for vernaux in vernaux_iter:
if vernaux['vna_other'] == index:
return verneed, vernaux
return None
class GNUVerDefSection(GNUVersionSection):
""" ELF SUNW or GNU Version Definition table section.
Has an associated StringTableSection that's passed in the constructor.
"""
def __init__(self, header, name, elffile, stringtable):
super(GNUVerDefSection, self).__init__(
header, name, elffile, stringtable, 'vd',
elffile.structs.Elf_Verdef, elffile.structs.Elf_Verdaux)
def get_version(self, index):
""" Get the version information located at index #n in the table
Return both the verdef structure and an iterator to retrieve
both the version names and dependencies in the form of
verdaux entries
"""
for verdef, verdaux_iter in self.iter_versions():
if verdef['vd_ndx'] == index:
return verdef, verdaux_iter
return None
class GNUVerSymSection(Section):
""" ELF SUNW or GNU Versym table section.
Has an associated SymbolTableSection that's passed in the constructor.
"""
def __init__(self, header, name, elffile, symboltable):
super(GNUVerSymSection, self).__init__(header, name, elffile)
self.symboltable = symboltable
def num_symbols(self):
""" Number of symbols in the table
"""
return self['sh_size'] // self['sh_entsize']
def get_symbol(self, n):
""" Get the symbol at index #n from the table (Symbol object)
It begins at 1 and not 0 since the first entry is used to
store the current version of the syminfo table
"""
# Grab the symbol's entry from the stream
entry_offset = self['sh_offset'] + n * self['sh_entsize']
entry = struct_parse(
self.structs.Elf_Versym,
self.stream,
stream_pos=entry_offset)
# Find the symbol name in the associated symbol table
name = self.symboltable.get_symbol(n).name
return Symbol(entry, name)
def iter_symbols(self):
""" Yield all the symbols in the table
"""
for i in range(self.num_symbols()):
yield self.get_symbol(i)
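# A minimal usage sketch (the binary path is hypothetical; the section names are
# the conventional GNU ones and may be absent in a given file):
#
#   from elftools.elf.elffile import ELFFile
#
#   with open('/usr/bin/env', 'rb') as f:
#       elf = ELFFile(f)
#       verneed = elf.get_section_by_name('.gnu.version_r')
#       if verneed is not None:
#           for version, aux_iter in verneed.iter_versions():
#               print(version.name, [aux.name for aux in aux_iter])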
|
PypiClean
|
/chompack-2.3.2.tar.gz/chompack-2.3.2/examples/cholesky.py
|
from cvxopt import matrix, spmatrix, normal, amd, blas
from chompack import symbolic, cspmatrix, cholesky, llt, completion, projected_inverse, hessian, trsm, trmm,\
merge_size_fill, tril, symmetrize, perm, eye
import random
def sp_rand(m,n,a):
"""
Generates an mxn sparse 'd' matrix with round(a*m*n) nonzeros.
"""
if m == 0 or n == 0: return spmatrix([], [], [], (m,n))
nnz = min(max(0, int(round(a*m*n))), m*n)
nz = matrix(random.sample(range(m*n), nnz), tc='i')
return spmatrix(normal(nnz,1), nz%m, matrix([int(ii) for ii in nz/m]), (m,n))
random.seed(1)
# Generate random sparse matrix of order ...
n = 200
# and with density ...
rho = 0.02
As = sp_rand(n,n,rho) + spmatrix(10.0,range(n),range(n))
# We will use (greedy) clique merging in this example:
fmerge = merge_size_fill(16,4)
# Compute symbolic factorization with AMD ordering and clique merging; only lower triangular part of As is accessed
print("Computing symbolic factorization..")
p = amd.order
symb = symbolic(As, p = p, merge_function = fmerge)
print("Order of matrix : %i" % (symb.n))
print("Number of nonzeros : %i" % (symb.nnz))
print("Number of supernodes : %i" % (symb.Nsn))
print("Largest supernode : %i" % (max([symb.snptr[k+1]-symb.snptr[k] for k in range(symb.Nsn)])))
print("Largest clique : %i\n" % (symb.clique_number))
A = cspmatrix(symb) # create new cspmatrix from symbolic factorization
A += As # add spmatrix 'As' to cspmatrix 'A'; this ignores the upper triangular entries in As
print("Computing Cholesky factorization..")
L = A.copy() # make a copy of A
cholesky(L) # compute Cholesky factorization; overwrites L
print("Computing Cholesky product..")
At = L.copy() # make a copy of L
llt(At) # compute Cholesky product; overwrites At
print("Computing projected inverse..")
Y = L.copy() # make a copy of L
projected_inverse(Y) # compute projected inverse; overwrites Y
print("Computing completion..")
Lc = Y.copy() # make a copy of Y
completion(Lc, factored_updates = False) # compute completion; overwrites Lc
print("Computing completion with factored updates..")
Lc2 = Y.copy() # make a copy of Y
completion(Lc2, factored_updates = True) # compute completion (with factored updates); overwrites Lc2
print("Applying Hessian factors..")
U = At.copy()
fupd = False
hessian(L, Y, U, adj = False, inv = False, factored_updates = fupd)
hessian(L, Y, U, adj = True, inv = False, factored_updates = fupd)
hessian(L, Y, U, adj = True, inv = True, factored_updates = fupd)
hessian(L, Y, U, adj = False, inv = True, factored_updates = fupd)
print("\nEvaluating errors:\n")
# Compute norm of error: A - L*L.T
tmp = (A-At).spmatrix().V
print("Cholesky factorization/product : err = %.3e" % (blas.nrm2(tmp)))
# Compute norm of error: L - Lc
tmp = (L.spmatrix()-Lc.spmatrix()).V
print("Projected inverse/completion : err = %.3e" % (blas.nrm2(tmp)))
# Compute norm of error: L - Lc2
tmp = (L.spmatrix()-Lc2.spmatrix()).V
print("Projected inverse/completion (upd) : err = %.3e" % (blas.nrm2(tmp)))
# Compute norm of error: At - U
tmp = (At-U).spmatrix().V
print("Hessian factors NN/TN/TI/NI : err = %.3e" % (blas.nrm2(tmp)))
# Test triangular matrix products and solve
p = L.symb.p
B = eye(n)
trsm(L, B)
print("trsm, trans = 'N' : err = %.3e" % (blas.nrm2(L.spmatrix(reordered = True)*B[p,p] - eye(n))))
B = eye(n)
trsm(L, B, trans = 'T')
print("trsm, trans = 'T' : err = %.3e" % (blas.nrm2(L.spmatrix(reordered = True).T*B[p,p] - eye(n))))
B = eye(n)
trmm(L, B)
print("trmm, trans = 'N' : err = %.3e" % (blas.nrm2(L.spmatrix(reordered = True) - B[p,p])))
B = eye(n)
trmm(L, B, trans = 'T')
print("trmm, trans = 'T' : err = %.3e" % (blas.nrm2(L.spmatrix(reordered = True).T - B[p,p])))
B = eye(n)
trmm(L,B,trans='T')
trmm(L,B)
print("llt(L) - trmm N/T : err = %.3e" % (blas.nrm2(tril(B - As))))
|
PypiClean
|
/spire.xls-13.4.0-py3-none-any.whl/spire/xls/XlsBordersCollection.py
|
from enum import Enum
from plum import dispatch
from typing import TypeVar,Union,Generic,List,Tuple
from spire.common import *
from spire.xls import *
from ctypes import *
import abc
class XlsBordersCollection(CollectionBase[XlsBorder], IBorders):
"""
"""
@property
def KnownColor(self)->'ExcelColors':
"""
<summary>
Returns or sets the primary excel color of the object.
</summary>
"""
GetDllLibXls().XlsBordersCollection_get_KnownColor.argtypes=[c_void_p]
GetDllLibXls().XlsBordersCollection_get_KnownColor.restype=c_int
ret = GetDllLibXls().XlsBordersCollection_get_KnownColor(self.Ptr)
objwraped = ExcelColors(ret)
return objwraped
@KnownColor.setter
def KnownColor(self, value:'ExcelColors'):
GetDllLibXls().XlsBordersCollection_set_KnownColor.argtypes=[c_void_p, c_int]
GetDllLibXls().XlsBordersCollection_set_KnownColor(self.Ptr, value.value)
@property
def Color(self)->'Color':
"""
<summary>
Returns or sets the primary color of the object.
</summary>
"""
GetDllLibXls().XlsBordersCollection_get_Color.argtypes=[c_void_p]
GetDllLibXls().XlsBordersCollection_get_Color.restype=c_void_p
intPtr = GetDllLibXls().XlsBordersCollection_get_Color(self.Ptr)
ret = None if intPtr==None else Color(intPtr)
return ret
@Color.setter
def Color(self, value:'Color'):
GetDllLibXls().XlsBordersCollection_set_Color.argtypes=[c_void_p, c_void_p]
GetDllLibXls().XlsBordersCollection_set_Color(self.Ptr, value.Ptr)
def get_Item(self ,index:'BordersLineType')->'IBorder':
"""
"""
enumindex:c_int = index.value
GetDllLibXls().XlsBordersCollection_get_Item.argtypes=[c_void_p ,c_int]
GetDllLibXls().XlsBordersCollection_get_Item.restype=c_void_p
intPtr = GetDllLibXls().XlsBordersCollection_get_Item(self.Ptr, enumindex)
ret = None if intPtr==None else IBorder(intPtr)
return ret
@property
def LineStyle(self)->'LineStyleType':
"""
<summary>
Returns or sets the line style for the border.
</summary>
"""
GetDllLibXls().XlsBordersCollection_get_LineStyle.argtypes=[c_void_p]
GetDllLibXls().XlsBordersCollection_get_LineStyle.restype=c_int
ret = GetDllLibXls().XlsBordersCollection_get_LineStyle(self.Ptr)
objwraped = LineStyleType(ret)
return objwraped
@LineStyle.setter
def LineStyle(self, value:'LineStyleType'):
GetDllLibXls().XlsBordersCollection_set_LineStyle.argtypes=[c_void_p, c_int]
GetDllLibXls().XlsBordersCollection_set_LineStyle(self.Ptr, value.value)
@property
def Value(self)->'LineStyleType':
"""
"""
GetDllLibXls().XlsBordersCollection_get_Value.argtypes=[c_void_p]
GetDllLibXls().XlsBordersCollection_get_Value.restype=c_int
ret = GetDllLibXls().XlsBordersCollection_get_Value(self.Ptr)
objwraped = LineStyleType(ret)
return objwraped
@Value.setter
def Value(self, value:'LineStyleType'):
GetDllLibXls().XlsBordersCollection_set_Value.argtypes=[c_void_p, c_int]
GetDllLibXls().XlsBordersCollection_set_Value(self.Ptr, value.value)
|
PypiClean
|
/ym-xadmin-0.0.5.tar.gz/ym-xadmin-0.0.5/xadmin/js/xadmin.main.js
|
;(function ($) {
$.fn.exform = function () {
this.each(function () {
var form = $(this);
for (var i = $.fn.exform.renders.length - 1; i >= 0; i--) {
$.fn.exform.renders[i](form)
}
;
form.addClass('rended');
})
}
$.fn.exform.renders = [];
$(function () {
$('.exform:not(.rended)').exform();
});
$.getCookie = function (name) {
var cookieValue = null;
if (document.cookie && document.cookie != '') {
var cookies = document.cookie.split(';');
for (var i = 0; i < cookies.length; i++) {
var cookie = jQuery.trim(cookies[i]);
// Does this cookie string begin with the name we want?
if (cookie.substring(0, name.length + 1) == (name + '=')) {
cookieValue = decodeURIComponent(cookie.substring(name.length + 1));
break;
}
}
}
return cookieValue;
}
//dropdown submenu plugin
$(document)
.on('click.xa.dropdown.data-api touchstart.xa.dropdown.data-api', '.dropdown-submenu', function (e) {
e.stopPropagation();
})
.on('click.xa.dropdown.data-api', function (e) {
$('.dropdown-submenu.open').removeClass('open');
});
if ('ontouchstart' in document.documentElement) {
$('.dropdown-submenu a').on('click.xa.dropdown.data-api', function (e) {
$(this).parent().toggleClass('open');
});
} else {
$('.dropdown-submenu').on('click.xa.dropdown.data-api mouseover.xa.dropdown.data-api', function (e) {
$(this).parent().find('>.dropdown-submenu.open').removeClass('open');
$(this).addClass('open');
});
}
//toggle class button
$('body').on('click.xa.togglebtn.data-api', '[data-toggle=class]', function (e) {
var $this = $(this), href
var target = $this.attr('data-target')
|| e.preventDefault()
|| (href = $this.attr('href')) && href.replace(/.*(?=#[^\s]+$)/, '') //strip for ie7
var className = $this.attr('data-class-name')
$(target).toggleClass(className)
})
// loading btn
// $('.btn.btn-loading,.btn[type=submit]')
// .click(function () {
// var btn = $(this)
// btn.button('loading')
// })
//.nav-content bar nav-menu
$('.navbar-xs .navbar-nav > li')
.on('shown.bs.dropdown', function (e) {
$(this).find('>.dropdown-menu').css('max-height', $(window).height() - 120);
$(this).parent().find('>li').addClass('hidden-xs');
$(this).removeClass('hidden-xs');
})
.on('hidden.bs.dropdown', function (e) {
$(this).parent().find('>li').removeClass('hidden-xs');
});
// dashboard widget
$('.widget-form').each(function (e) {
var el = $(this);
el.find('.btn-remove').click(function () {
el.find('input[name=_delete]').val('on');
el.submit();
});
});
// g-search
$('#g-search .dropdown-menu a').click(function () {
$('#g-search').attr('action', $(this).data('action')).submit();
})
// save settings
$.save_user_settings = function (key, value, success, error) {
var csrftoken = $.getCookie('csrftoken');
$.ajax({
type: 'POST',
url: window.__admin_path_prefix__ + 'settings/user',
data: {'key': key, 'value': value},
success: success,
error: error,
beforeSend: function (xhr, settings) {
xhr.setRequestHeader("X-CSRFToken", csrftoken);
}
});
}
// xadmin admin backend: upload a file via an AJAX button
$(".submit_form").click(function (othis) {
var id = $(this).attr("data-id");
var clearFile = $('#myfile_' + id)
var files = $('#myfile_' + id)[0].files[0]
var filedata = new FormData();
filedata.append('file', files);
filedata.append("id", id);
$.ajax({
url: "/api/ImportUser",
type: "POST",
data: filedata,
processData: false,
contentType: false,
success: function (data) {
// alert("success")
console.log(data.msg)
alert(data.msg)
// After the file is submitted, clear the input's contents
clearFile.after(clearFile.clone().val(""));
clearFile.remove();
}
})
})
})(jQuery)
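// A minimal usage sketch of the settings helper above (the key name is
// hypothetical; the endpoint and CSRF handling come from $.save_user_settings):
//
//   $.save_user_settings('dashboard:collapsed', 'true',
//       function () { console.log('saved'); },
//       function () { console.log('save failed'); });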
|
PypiClean
|
/langchain_xfyun-0.0.275b2-py3-none-any.whl/langchain_xfyun/schema/prompt_template.py
|
from __future__ import annotations
import json
from abc import ABC, abstractmethod
from pathlib import Path
from typing import Any, Callable, Dict, List, Mapping, Optional, Union
import yaml
from langchain_xfyun.load.serializable import Serializable
from langchain_xfyun.pydantic_v1 import Field, root_validator
from langchain_xfyun.schema.document import Document
from langchain_xfyun.schema.output_parser import BaseOutputParser
from langchain_xfyun.schema.prompt import PromptValue
from langchain_xfyun.schema.runnable import Runnable, RunnableConfig
class BasePromptTemplate(Serializable, Runnable[Dict, PromptValue], ABC):
"""Base class for all prompt templates, returning a prompt."""
input_variables: List[str]
"""A list of the names of the variables the prompt template expects."""
output_parser: Optional[BaseOutputParser] = None
"""How to parse the output of calling an LLM on this formatted prompt."""
partial_variables: Mapping[str, Union[str, Callable[[], str]]] = Field(
default_factory=dict
)
@property
def lc_serializable(self) -> bool:
return True
class Config:
"""Configuration for this pydantic object."""
arbitrary_types_allowed = True
def invoke(self, input: Dict, config: RunnableConfig | None = None) -> PromptValue:
return self._call_with_config(
lambda inner_input: self.format_prompt(**inner_input),
input,
config,
run_type="prompt",
)
@abstractmethod
def format_prompt(self, **kwargs: Any) -> PromptValue:
"""Create Chat Messages."""
@root_validator()
def validate_variable_names(cls, values: Dict) -> Dict:
"""Validate variable names do not include restricted names."""
if "stop" in values["input_variables"]:
raise ValueError(
"Cannot have an input variable named 'stop', as it is used internally,"
" please rename."
)
if "stop" in values["partial_variables"]:
raise ValueError(
"Cannot have an partial variable named 'stop', as it is used "
"internally, please rename."
)
overall = set(values["input_variables"]).intersection(
values["partial_variables"]
)
if overall:
raise ValueError(
f"Found overlapping input and partial variables: {overall}"
)
return values
def partial(self, **kwargs: Union[str, Callable[[], str]]) -> BasePromptTemplate:
"""Return a partial of the prompt template."""
prompt_dict = self.__dict__.copy()
prompt_dict["input_variables"] = list(
set(self.input_variables).difference(kwargs)
)
prompt_dict["partial_variables"] = {**self.partial_variables, **kwargs}
return type(self)(**prompt_dict)
def _merge_partial_and_user_variables(self, **kwargs: Any) -> Dict[str, Any]:
# Get partial params:
partial_kwargs = {
k: v if isinstance(v, str) else v()
for k, v in self.partial_variables.items()
}
return {**partial_kwargs, **kwargs}
@abstractmethod
def format(self, **kwargs: Any) -> str:
"""Format the prompt with the inputs.
Args:
kwargs: Any arguments to be passed to the prompt template.
Returns:
A formatted string.
Example:
.. code-block:: python
prompt.format(variable1="foo")
"""
@property
def _prompt_type(self) -> str:
"""Return the prompt type key."""
raise NotImplementedError
def dict(self, **kwargs: Any) -> Dict:
"""Return dictionary representation of prompt."""
prompt_dict = super().dict(**kwargs)
prompt_dict["_type"] = self._prompt_type
return prompt_dict
def save(self, file_path: Union[Path, str]) -> None:
"""Save the prompt.
Args:
file_path: Path of the file to save the prompt to.
Example:
.. code-block:: python
prompt.save(file_path="path/prompt.yaml")
"""
if self.partial_variables:
raise ValueError("Cannot save prompt with partial variables.")
# Convert file to Path object.
if isinstance(file_path, str):
save_path = Path(file_path)
else:
save_path = file_path
directory_path = save_path.parent
directory_path.mkdir(parents=True, exist_ok=True)
# Fetch dictionary to save
prompt_dict = self.dict()
if save_path.suffix == ".json":
with open(file_path, "w") as f:
json.dump(prompt_dict, f, indent=4)
elif save_path.suffix == ".yaml":
with open(file_path, "w") as f:
yaml.dump(prompt_dict, f, default_flow_style=False)
else:
raise ValueError(f"{save_path} must be json or yaml")
def format_document(doc: Document, prompt: BasePromptTemplate) -> str:
"""Format a document into a string based on a prompt template.
First, this pulls information from the document from two sources:
1. `page_content`:
This takes the information from the `document.page_content`
and assigns it to a variable named `page_content`.
2. metadata:
This takes information from `document.metadata` and assigns
it to variables of the same name.
Those variables are then passed into the `prompt` to produce a formatted string.
Args:
doc: Document, the page_content and metadata will be used to create
the final string.
prompt: BasePromptTemplate, will be used to format the page_content
and metadata into the final string.
Returns:
The formatted string of the document.
Example:
.. code-block:: python
from langchain_xfyun.schema import Document
from langchain_xfyun.prompts import PromptTemplate
doc = Document(page_content="This is a joke", metadata={"page": "1"})
prompt = PromptTemplate.from_template("Page {page}: {page_content}")
format_document(doc, prompt)
>>> "Page 1: This is a joke"
"""
base_info = {"page_content": doc.page_content, **doc.metadata}
missing_metadata = set(prompt.input_variables).difference(base_info)
if len(missing_metadata) > 0:
required_metadata = [
iv for iv in prompt.input_variables if iv != "page_content"
]
raise ValueError(
f"Document prompt requires documents to have metadata variables: "
f"{required_metadata}. Received document with missing metadata: "
f"{list(missing_metadata)}."
)
document_info = {k: base_info[k] for k in prompt.input_variables}
return prompt.format(**document_info)
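# A minimal sketch of partial() from the base class above, using the concrete
# PromptTemplate that the format_document docstring also imports:
#
#   from langchain_xfyun.prompts import PromptTemplate
#
#   prompt = PromptTemplate.from_template("{greeting}, {name}!")
#   partial_prompt = prompt.partial(greeting="Hello")
#   partial_prompt.format(name="Ada")   # -> "Hello, Ada!"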
|
PypiClean
|
/djangocms-googlecalendar-0.1.1.tar.gz/djangocms-googlecalendar-0.1.1/djangocms_googlecalendar/static/djangocms_googlecalendar/fullcalendar-3.4.0/locale/nl.js
|
!function(e){"function"==typeof define&&define.amd?define(["jquery","moment"],e):"object"==typeof exports?module.exports=e(require("jquery"),require("moment")):e(jQuery,moment)}(function(e,a){!function(){var e="jan._feb._mrt._apr._mei_jun._jul._aug._sep._okt._nov._dec.".split("_"),n="jan_feb_mrt_apr_mei_jun_jul_aug_sep_okt_nov_dec".split("_"),t=[/^jan/i,/^feb/i,/^maart|mrt.?$/i,/^apr/i,/^mei$/i,/^jun[i.]?$/i,/^jul[i.]?$/i,/^aug/i,/^sep/i,/^okt/i,/^nov/i,/^dec/i],r=/^(januari|februari|maart|april|mei|april|ju[nl]i|augustus|september|oktober|november|december|jan\.?|feb\.?|mrt\.?|apr\.?|ju[nl]\.?|aug\.?|sep\.?|okt\.?|nov\.?|dec\.?)/i;a.defineLocale("nl",{months:"januari_februari_maart_april_mei_juni_juli_augustus_september_oktober_november_december".split("_"),monthsShort:function(a,t){return a?/-MMM-/.test(t)?n[a.month()]:e[a.month()]:e},monthsRegex:r,monthsShortRegex:r,monthsStrictRegex:/^(januari|februari|maart|mei|ju[nl]i|april|augustus|september|oktober|november|december)/i,monthsShortStrictRegex:/^(jan\.?|feb\.?|mrt\.?|apr\.?|mei|ju[nl]\.?|aug\.?|sep\.?|okt\.?|nov\.?|dec\.?)/i,monthsParse:t,longMonthsParse:t,shortMonthsParse:t,weekdays:"zondag_maandag_dinsdag_woensdag_donderdag_vrijdag_zaterdag".split("_"),weekdaysShort:"zo._ma._di._wo._do._vr._za.".split("_"),weekdaysMin:"Zo_Ma_Di_Wo_Do_Vr_Za".split("_"),weekdaysParseExact:!0,longDateFormat:{LT:"HH:mm",LTS:"HH:mm:ss",L:"DD-MM-YYYY",LL:"D MMMM YYYY",LLL:"D MMMM YYYY HH:mm",LLLL:"dddd D MMMM YYYY HH:mm"},calendar:{sameDay:"[vandaag om] LT",nextDay:"[morgen om] LT",nextWeek:"dddd [om] LT",lastDay:"[gisteren om] LT",lastWeek:"[afgelopen] dddd [om] LT",sameElse:"L"},relativeTime:{future:"over %s",past:"%s geleden",s:"een paar seconden",m:"één minuut",mm:"%d minuten",h:"één uur",hh:"%d uur",d:"één dag",dd:"%d dagen",M:"één maand",MM:"%d maanden",y:"één jaar",yy:"%d jaar"},dayOfMonthOrdinalParse:/\d{1,2}(ste|de)/,ordinal:function(e){return e+(1===e||8===e||e>=20?"ste":"de")},week:{dow:1,doy:4}})}(),e.fullCalendar.datepickerLocale("nl","nl",{closeText:"Sluiten",prevText:"←",nextText:"→",currentText:"Vandaag",monthNames:["januari","februari","maart","april","mei","juni","juli","augustus","september","oktober","november","december"],monthNamesShort:["jan","feb","mrt","apr","mei","jun","jul","aug","sep","okt","nov","dec"],dayNames:["zondag","maandag","dinsdag","woensdag","donderdag","vrijdag","zaterdag"],dayNamesShort:["zon","maa","din","woe","don","vri","zat"],dayNamesMin:["zo","ma","di","wo","do","vr","za"],weekHeader:"Wk",dateFormat:"dd-mm-yy",firstDay:1,isRTL:!1,showMonthAfterYear:!1,yearSuffix:""}),e.fullCalendar.locale("nl",{buttonText:{month:"Maand",week:"Week",day:"Dag",list:"Agenda"},allDayText:"Hele dag",eventLimitText:"extra",noEventsMessage:"Geen evenementen om te laten zien"})});
|
PypiClean
|
/hnzhu010506-0.0.1-py3-none-any.whl/poco/services/command_runners.py
|
import os
import platform
from subprocess import check_call, CalledProcessError
from .console_logger import ColorPrint
from .file_utils import FileUtils
from .project_utils import ProjectUtils
from .state import StateHolder
from .environment_utils import EnvironmentUtils
class AbstractPlanRunner(object):
@staticmethod
def run_script_with_check(cmd, working_directory, envs):
# check_call raises CalledProcessError on a non-zero exit status, so
# failures surface as exceptions (HelmRunner below relies on this).
check_call(" ".join(cmd), cwd=working_directory, env=envs, shell=True)
@staticmethod
def get_file_list(base_dir, working_dir, dir_list):
file_list = list()
for file in FileUtils.get_filtered_sorted_alter_from_base_dir(base_dir=base_dir,
actual_dir=working_dir,
target_directories=dir_list,
filter_ends=('.yml', '.yaml')):
file_list.append(ProjectUtils.get_file(file=file))
return file_list
@staticmethod
def get_file(repo_dir, working_directory, file):
return ProjectUtils.get_file(file=FileUtils.get_compose_file_relative_path(
repo_dir=repo_dir, working_directory=working_directory,
file_name=file))
@staticmethod
def get_files_list(plan, repo_dir, working_directory):
files = list()
if isinstance(plan, dict) and 'kubernetes-file' in plan:
for file in ProjectUtils.get_list_value(plan['kubernetes-file']):
files.append(AbstractPlanRunner.get_file(repo_dir=repo_dir, working_directory=working_directory,
file=file))
return files
class ScriptPlanRunner(AbstractPlanRunner):
def __init__(self, project_compose, working_directory):
self.project_compose = project_compose
self.working_directory = working_directory
def run(self, plan, script_type):
scripts = self.get_native_scripts(plan=plan, script_type=script_type)
if len(scripts) > 0:
for script in scripts:
base_image = self.get_script_image(script)
ColorPrint.print_with_lvl(message="Executing " + script_type + " in " + base_image + " image")
command = self.get_script_command(script)
cmd = self.get_script_base(base_image, command)
ColorPrint.print_with_lvl(message= "Docker command: " + str(cmd), lvl=1)
self.run_script_with_check(cmd=cmd, working_directory=self.working_directory, envs=os.environ.copy())
def get_script_image(self, script):
base_image = "alpine:latest"
if isinstance(script, dict) and 'image' in script:
base_image = script['image']
return base_image
def get_script_command(self, script):
command = script
if isinstance(script, dict) and 'command' in script:
command = script['command']
return self.get_script_command_array(command)
def get_script_command_array(self, command):
command_array = list()
if isinstance(command, list):
for c in command:
ColorPrint.print_with_lvl(" - " + str(c))
command_array.append(c)
else:
ColorPrint.print_with_lvl(" - " + str(command))
command_array.append("/bin/sh")
command_array.append("-c")
command_array.append("\"")
command_array.append(command)
command_array.append("\"")
return command_array
def get_native_scripts(self, plan, script_type):
"""Get scripts """
scripts = list()
if not script_type == 'script' and script_type in self.project_compose:
scripts.extend(ProjectUtils.get_list_value(self.project_compose[script_type]))
if script_type in plan:
scripts.extend(ProjectUtils.get_list_value(plan[script_type]))
return scripts
def get_script_base(self, base_image, command):
command_array = list()
command_array.append("docker")
command_array.append("run")
"""Add host system to environment"""
command_array.append("-e")
command_array.append("HOST_SYSTEM="+platform.system())
if not platform.system() == 'Windows':
command_array.append("-u")
command_array.append(EnvironmentUtils.get_variable("POCO_UID", "1000") + ":" + EnvironmentUtils.get_variable("POCO_GID", "1000"))
command_array.append("--rm")
command_array.append("-v")
command_array.append(str(self.working_directory) + ":/usr/local")
command_array.append("-w")
command_array.append("/usr/local")
command_array.append(base_image)
for c in command:
command_array.append(c)
return command_array
class KubernetesRunner(AbstractPlanRunner):
def __init__(self, working_directory, repo_dir):
self.working_directory = working_directory
self.repo_dir = repo_dir
def run(self, plan, commands, envs):
files = AbstractPlanRunner.get_files_list(plan=plan, repo_dir=self.repo_dir,
working_directory=self.working_directory)
if isinstance(plan, dict) and len(files) == 0 and 'kubernetes-dir' in plan:
files.extend(self.get_file_list(self.repo_dir, self.working_directory,
ProjectUtils.get_list_value(plan['kubernetes-dir'])))
"""Kubernetes commands"""
for kube_file in files:
cmd = list()
cmd.append("kubectl")
cmd.extend(ProjectUtils.get_list_value(commands))
cmd.append("-f")
cmd.append(str(kube_file))
ColorPrint.print_with_lvl(message="Kubernetes command: " + str(cmd), lvl=1)
self.run_script_with_check(cmd=cmd, working_directory=self.working_directory, envs=envs)
class HelmRunner(AbstractPlanRunner):
def __init__(self, working_directory, repo_dir):
self.working_directory = working_directory
self.repo_dir = repo_dir
def run(self, plan, commands, envs):
files = AbstractPlanRunner.get_files_list(plan=plan, repo_dir=self.repo_dir,
working_directory=self.working_directory)
dirs = list()
if isinstance(plan, dict) and 'helm-dir' in plan:
directories = ProjectUtils.get_list_value(plan['helm-dir'])
if len(directories) > 1:
ColorPrint.print_with_lvl(message="Helm plan use only the first directory from helm-dir")
dirs.append(os.path.join(FileUtils.get_relative_path(self.repo_dir, self.working_directory),
directories[0]))
"""Helm command"""
cmd = list()
cmd.append("helm")
cmd.extend(ProjectUtils.get_list_value(commands))
cmd.append("poco-" + StateHolder.name)
HelmRunner.build_command(cmd=cmd, dirs=dirs, files=files)
ColorPrint.print_with_lvl(message="Helm command: " + str(cmd), lvl=1)
try:
self.run_script_with_check(cmd=cmd, working_directory=self.working_directory, envs=envs)
except CalledProcessError:
pass
@staticmethod
def build_command(cmd, dirs, files):
if "install" in cmd or "upgrade" in cmd:
if len(dirs) > 0:
cmd.append(str(dirs[0]))
for file in files:
cmd.append("-f")
cmd.append(str(file))
class DockerPlanRunner(AbstractPlanRunner):
def __init__(self, project_compose, working_directory, repo_dir):
self.working_directory = working_directory
self.project_compose = project_compose
self.repo_dir = repo_dir
def run(self, plan, commands, envs):
"""Get compose file(s) from config depends on selected plan"""
docker_files = self.get_docker_files(plan=plan)
"""Compose docker command array with project name and compose files"""
cmd = list()
cmd.append("docker-compose")
cmd.append("--project-name")
cmd.append(StateHolder.name)
for compose_file in docker_files:
cmd.append("-f")
cmd.append(str(compose_file))
cmd.extend(ProjectUtils.get_list_value(commands))
ColorPrint.print_with_lvl(message="Docker command: " + str(cmd), lvl=1)
self.run_script_with_check(cmd=cmd, working_directory=self.working_directory, envs=envs)
def get_docker_files(self, plan):
docker_files = list()
if isinstance(plan, dict) and 'docker-compose-file' in plan:
self.parse_file_list(ProjectUtils.get_list_value(plan['docker-compose-file']), docker_files)
elif isinstance(plan, dict) and 'docker-compose-dir' in plan:
docker_files.extend(self.get_file_list(self.repo_dir, self.working_directory,
ProjectUtils.get_list_value(plan['docker-compose-dir'])))
else:
self.parse_file_list(ProjectUtils.get_list_value(plan), docker_files)
return docker_files
def parse_file_list(self, services, docker_files):
for service in services:
docker_files.append(self.get_docker_compose(service=service))
def get_docker_compose(self, service):
"""Get back the docker compose file"""
file_name = self.get_compose_file_name(service=service)
return ProjectUtils.get_file(file=FileUtils.get_compose_file_relative_path(
repo_dir=self.repo_dir, working_directory=self.working_directory,
file_name=file_name))
def get_compose_file_name(self, service):
"""Get back docker compose file name"""
if self.project_compose is None:
return service
if 'containers' in self.project_compose and service in self.project_compose['containers']:
return self.project_compose['containers'].get(service)
return service
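# A minimal usage sketch of DockerPlanRunner (paths, plan contents and environment
# are hypothetical; StateHolder.name must be set by the surrounding application):
#
#   runner = DockerPlanRunner(project_compose=None,
#                             working_directory='/srv/myproj',
#                             repo_dir='/srv/myproj')
#   plan = {'docker-compose-file': 'docker-compose.yml'}
#   runner.run(plan, ['up', '-d'], os.environ.copy())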
|
PypiClean
|
/fnal-column-analysis-tools-0.4.23.tar.gz/fnal-column-analysis-tools-0.4.23/fnal_column_analysis_tools/lookup_tools/json_converters.py
|
from fnal_column_analysis_tools.util import numpy as np
import json
def extract_json_histo_structure(parselevel, axis_names, axes):
if 'value' in parselevel.keys():
return
name = list(parselevel)[0].split(':')[0]
bins_pairs = [key.split(':')[-1].strip('[]').split(',') for key in parselevel.keys()]
bins = []
for pair in bins_pairs:
bins.extend([float(val) for val in pair])
bins.sort()
bins = np.unique(np.array(bins))
axis_names.append(name.encode())
axes[axis_names[-1]] = bins
extract_json_histo_structure(parselevel[list(parselevel)[0]], axis_names, axes)
def extract_json_histo_values(parselevel, binlows, values, val_names):
if 'value' in parselevel.keys():
binvals = {}
binvals.update(parselevel)
keylows = tuple(binlows)
values[keylows] = binvals
for key in parselevel.keys():
val_names.add(key)
return
for key in parselevel.keys():
lowside = float(key.split(':')[-1].strip('[]').split(',')[0])
thelows = [lowside]
if len(binlows) != 0:
thelows = binlows + thelows
extract_json_histo_values(parselevel[key], thelows, values, val_names)
def convert_histo_json_file(filename):
file = open(filename)
info = json.load(file)
file.close()
names_and_orders = {}
names_and_axes = {}
names_and_binvalues = {}
names_and_valnames = {}
# first pass, convert info['dir']['hist_title'] to dir/hist_title
# and un-nest everything from the json structure, make binnings, etc.
for dir in info.keys():
for htitle in info[dir].keys():
axis_order = [] # keep the axis order
axes = {}
bins_and_values = {}
val_names = set()
extract_json_histo_structure(info[dir][htitle], axis_order, axes)
extract_json_histo_values(info[dir][htitle], [], bins_and_values, val_names)
histname = '%s/%s' % (dir, htitle)
names_and_axes[histname] = axes
names_and_orders[histname] = axis_order
names_and_binvalues[histname] = bins_and_values
names_and_valnames[histname] = val_names
wrapped_up = {}
for name, axes in names_and_axes.items():
theshape = tuple([axes[axis].size - 1 for axis in names_and_orders[name]])
valsdict = {}
for vname in names_and_valnames[name]:
valsdict[vname] = np.zeros(shape=theshape).flatten()
flatidx = np.arange(np.zeros(shape=theshape).size)
binidx = np.unravel_index(flatidx, shape=theshape)
for vname in valsdict:
for iflat in flatidx:
binlows = []
for idim, axis in enumerate(names_and_orders[name]):
binlows.append(axes[axis][binidx[idim][iflat]])
thevals = names_and_binvalues[name][tuple(binlows)]
valsdict[vname][iflat] = thevals[vname]
valsdict[vname] = valsdict[vname].reshape(theshape)
bins_in_order = []
for axis in names_and_orders[name]:
bins_in_order.append(axes[axis])
for vname in valsdict:
wrapped_up[(name + '_' + vname, 'dense_lookup')] = (valsdict[vname],
tuple(bins_in_order))
return wrapped_up
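# A minimal usage sketch (the file name is hypothetical; the JSON must follow the
# nested 'axis:[lo,high]' key convention parsed above):
#
#   lookups = convert_histo_json_file('scalefactors.json')
#   for (name, kind), (values, bin_edges) in lookups.items():
#       # kind is always 'dense_lookup'; values is an ndarray shaped by the axes
#       print(name, values.shape, [edges.size for edges in bin_edges])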
|
PypiClean
|
/gamification-engine-0.4.0.tar.gz/gamification-engine-0.4.0/gengine/app/jsscripts/node_modules/caniuse-lite/data/regions/VI.js
|
module.exports={D:{"11":0.049,"38":0.0196,"45":0.0735,"46":0.0049,"47":0.0049,"49":0.1715,"53":0.1519,"56":0.0098,"57":0.0098,"58":0.1176,"59":0.2303,"60":0.0196,"61":0.0098,"62":0.049,"63":0.0833,"64":0.0539,"65":0.1274,"66":0.0833,"67":0.1029,"68":0.0539,"69":0.1274,"70":0.9114,"71":22.7164,"72":0.1127,"73":0.0343,"74":0.0098,_:"4 5 6 7 8 9 10 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 39 40 41 42 43 44 48 50 51 52 54 55 75"},C:{"43":0.0196,"47":0.0098,"48":0.0147,"52":0.0539,"60":0.0294,"61":0.0196,"62":0.0833,"63":0.1225,"64":3.2291,"65":0.098,_:"2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 44 45 46 49 50 51 53 54 55 56 57 58 59 66 67 3.5 3.6"},F:{"57":0.4508,_:"9 11 12 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 9.5-9.6 10.5 10.6 11.1 11.5 11.6 12.1","10.0-10.1":0},E:{"4":0.0098,"8":0.0245,"9":0.0098,"10":0.0147,"11":0.0735,"12":2.8616,_:"0 5 6 7 3.1 3.2 5.1 6.1 7.1","9.1":0.1813,"10.1":0.2891,"11.1":0.5341},G:{"8":0.71712565822405,"3.2":0.019159845830413,"4.0-4.1":0.010948483331665,"4.2-4.3":0.0054742416658324,"5.0-5.1":0.030108329162078,"6.0-6.1":0.019159845830413,"7.0-7.1":0.060216658324157,"8.1-8.4":0.13138179997998,"9.0-9.2":0.13138179997998,"9.3":0.59942946240865,"10.0-10.2":0.39414539993993,"10.3":0.81839912904195,"11.0-11.2":1.0975854539994,"11.3-11.4":2.709749624587,"12.0-12.1":20.596834267694},I:{"3":0.00020039292730845,"4":0.026852652259332,_:"67","2.1":0.0014027504911591,"2.2":0.0044086444007859,"2.3":0.0022043222003929,"4.1":0.020039292730845,"4.2-4.3":0.051901768172888,"4.4":0,"4.4.3-4.4.4":0.096990176817289},K:{_:"0 10 11 12 11.1 11.5 12.1"},A:{"9":0.0098,"10":0.0294,"11":4.1307,_:"6 7 8 5.5"},B:{"12":0.0147,"13":0.0441,"14":0.1372,"15":0.0735,"16":0.2058,"17":6.4827,_:"18"},P:{_:"4 5.0-5.4 6.2-6.4 7.2-7.4 8.2"},N:{"10":0.021894,"11":0.142311},J:{"7":0,"10":0},R:{_:"0"},M:{"0":0.2703},O:{"0":0.1071},Q:{_:"1.2"},H:{"0":0.11105206185567},L:{"0":21.5209}};
|
PypiClean
|
/PyQt6-6.5.2.tar.gz/PyQt6-6.5.2/uic/enum_map.py
|
# Map enum member names to fully scoped names.
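# A minimal sketch of how a lookup against this table behaves (the helper below
# is hypothetical; the uic generator consults this map when resolving enum names
# from .ui files):
#
#   def scoped_enum(name):
#       return EnumMap.get(name, name)
#
#   scoped_enum('Qt::AlignLeft')    # -> 'Qt::AlignmentFlag::AlignLeft'
#   scoped_enum('Qt::Key_Escape')   # not mapped here, returned unchanged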
EnumMap = {
'Qt::AlignHCenter': 'Qt::AlignmentFlag::AlignHCenter',
'Qt::AlignJustify': 'Qt::AlignmentFlag::AlignJustify',
'Qt::AlignLeft': 'Qt::AlignmentFlag::AlignLeft',
'Qt::AlignRight': 'Qt::AlignmentFlag::AlignRight',
'Qt::AlignBaseline': 'Qt::AlignmentFlag::AlignBaseline',
'Qt::AlignBottom': 'Qt::AlignmentFlag::AlignBottom',
'Qt::AlignTop': 'Qt::AlignmentFlag::AlignTop',
'Qt::AlignVCenter': 'Qt::AlignmentFlag::AlignVCenter',
'Qt::AlignAbsolute': 'Qt::AlignmentFlag::AlignAbsolute',
'Qt::AlignLeading': 'Qt::AlignmentFlag::AlignLeading',
'Qt::AlignTrailing': 'Qt::AlignmentFlag::AlignTrailing',
'Qt::AlignCenter': 'Qt::AlignmentFlag::AlignCenter',
'Qt::AlignHorizontal_Mask': 'Qt::AlignmentFlag::AlignHorizontal_Mask',
'Qt::AlignVertical_Mask': 'Qt::AlignmentFlag::AlignVertical_Mask',
'Qt::DownArrow': 'Qt::ArrowType::DownArrow',
'Qt::LeftArrow': 'Qt::ArrowType::LeftArrow',
'Qt::NoArrow': 'Qt::ArrowType::NoArrow',
'Qt::RightArrow': 'Qt::ArrowType::RightArrow',
'Qt::UpArrow': 'Qt::ArrowType::UpArrow',
'Qt::Checked': 'Qt::CheckState::Checked',
'Qt::PartiallyChecked': 'Qt::CheckState::PartiallyChecked',
'Qt::Unchecked': 'Qt::CheckState::Unchecked',
'Qt::ActionsContextMenu': 'Qt::ContextMenuPolicy::ActionsContextMenu',
'Qt::CustomContextMenu': 'Qt::ContextMenuPolicy::CustomContextMenu',
'Qt::DefaultContextMenu': 'Qt::ContextMenuPolicy::DefaultContextMenu',
'Qt::NoContextMenu': 'Qt::ContextMenuPolicy::NoContextMenu',
'Qt::PreventContextMenu': 'Qt::ContextMenuPolicy::PreventContextMenu',
'Qt::LogicalMoveStyle': 'Qt::CursorMoveStyle::LogicalMoveStyle',
'Qt::VisualMoveStyle': 'Qt::CursorMoveStyle::VisualMoveStyle',
'Qt::Monday': 'Qt::DayOfWeek::Monday',
'Qt::Tuesday': 'Qt::DayOfWeek::Tuesday',
'Qt::Wednesday': 'Qt::DayOfWeek::Wednesday',
'Qt::Thursday': 'Qt::DayOfWeek::Thursday',
'Qt::Friday': 'Qt::DayOfWeek::Friday',
'Qt::Saturday': 'Qt::DayOfWeek::Saturday',
'Qt::Sunday': 'Qt::DayOfWeek::Sunday',
'Qt::AllDockWidgetAreas': 'Qt::DockWidgetArea::AllDockWidgetAreas',
'Qt::LeftDockWidgetArea': 'Qt::DockWidgetArea::LeftDockWidgetArea',
'Qt::RightDockWidgetArea': 'Qt::DockWidgetArea::RightDockWidgetArea',
'Qt::TopDockWidgetArea': 'Qt::DockWidgetArea::TopDockWidgetArea',
'Qt::BottomDockWidgetArea': 'Qt::DockWidgetArea::BottomDockWidgetArea',
'Qt::NoDockWidgetArea': 'Qt::DockWidgetArea::NoDockWidgetArea',
'Qt::ActionMask': 'Qt::DropAction::ActionMask',
'Qt::CopyAction': 'Qt::DropAction::CopyAction',
'Qt::IgnoreAction': 'Qt::DropAction::IgnoreAction',
'Qt::LinkAction': 'Qt::DropAction::LinkAction',
'Qt::MoveAction': 'Qt::DropAction::MoveAction',
'Qt::TargetMoveAction': 'Qt::DropAction::TargetMoveAction',
'Qt::ClickFocus': 'Qt::FocusPolicy::ClickFocus',
'Qt::NoFocus': 'Qt::FocusPolicy::NoFocus',
'Qt::TabFocus': 'Qt::FocusPolicy::TabFocus',
'Qt::StrongFocus': 'Qt::FocusPolicy::StrongFocus',
'Qt::WheelFocus': 'Qt::FocusPolicy::WheelFocus',
'Qt::ImhDate': 'Qt::InputMethodHint::ImhDate',
'Qt::ImhDialableCharactersOnly': 'Qt::InputMethodHint::ImhDialableCharactersOnly',
'Qt::ImhDigitsOnly': 'Qt::InputMethodHint::ImhDigitsOnly',
'Qt::ImhEmailCharactersOnly': 'Qt::InputMethodHint::ImhEmailCharactersOnly',
'Qt::ImhExclusiveInputMask': 'Qt::InputMethodHint::ImhExclusiveInputMask',
'Qt::ImhFormattedNumbersOnly': 'Qt::InputMethodHint::ImhFormattedNumbersOnly',
'Qt::ImhHiddenText': 'Qt::InputMethodHint::ImhHiddenText',
'Qt::ImhLatinOnly': 'Qt::InputMethodHint::ImhLatinOnly',
'Qt::ImhLowercaseOnly': 'Qt::InputMethodHint::ImhLowercaseOnly',
'Qt::ImhMultiLine': 'Qt::InputMethodHint::ImhMultiLine',
'Qt::ImhNoAutoUppercase': 'Qt::InputMethodHint::ImhNoAutoUppercase',
'Qt::ImhNoEditMenu': 'Qt::InputMethodHint::ImhNoEditMenu',
'Qt::ImhNoPredictiveText': 'Qt::InputMethodHint::ImhNoPredictiveText',
'Qt::ImhNoTextHandles': 'Qt::InputMethodHint::ImhNoTextHandles',
'Qt::ImhNone': 'Qt::InputMethodHint::ImhNone',
'Qt::ImhPreferLatin': 'Qt::InputMethodHint::ImhPreferLatin',
'Qt::ImhPreferLowercase': 'Qt::InputMethodHint::ImhPreferLowercase',
'Qt::ImhPreferNumbers': 'Qt::InputMethodHint::ImhPreferNumbers',
'Qt::ImhPreferUppercase': 'Qt::InputMethodHint::ImhPreferUppercase',
'Qt::ImhSensitiveData': 'Qt::InputMethodHint::ImhSensitiveData',
'Qt::ImhTime': 'Qt::InputMethodHint::ImhTime',
'Qt::ImhUppercaseOnly': 'Qt::InputMethodHint::ImhUppercaseOnly',
'Qt::ImhUrlCharactersOnly': 'Qt::InputMethodHint::ImhUrlCharactersOnly',
'Qt::ItemIsAutoTristate': 'Qt::ItemFlag::ItemIsAutoTristate',
'Qt::ItemIsDragEnabled': 'Qt::ItemFlag::ItemIsDragEnabled',
'Qt::ItemIsDropEnabled': 'Qt::ItemFlag::ItemIsDropEnabled',
'Qt::ItemIsEditable': 'Qt::ItemFlag::ItemIsEditable',
'Qt::ItemIsEnabled': 'Qt::ItemFlag::ItemIsEnabled',
'Qt::ItemIsSelectable': 'Qt::ItemFlag::ItemIsSelectable',
'Qt::ItemIsUserCheckable': 'Qt::ItemFlag::ItemIsUserCheckable',
'Qt::ItemIsUserTristate': 'Qt::ItemFlag::ItemIsUserTristate',
'Qt::ItemNeverHasChildren': 'Qt::ItemFlag::ItemNeverHasChildren',
'Qt::NoItemFlags': 'Qt::ItemFlag::NoItemFlags',
'Qt::ContainsItemBoundingRect': 'Qt::ItemSelectionMode::ContainsItemBoundingRect',
'Qt::ContainsItemShape': 'Qt::ItemSelectionMode::ContainsItemShape',
'Qt::IntersectsItemBoundingRect': 'Qt::ItemSelectionMode::IntersectsItemBoundingRect',
'Qt::IntersectsItemShape': 'Qt::ItemSelectionMode::IntersectsItemShape',
'Qt::LayoutDirectionAuto': 'Qt::LayoutDirection::LayoutDirectionAuto',
'Qt::LeftToRight': 'Qt::LayoutDirection::LeftToRight',
'Qt::RightToLeft': 'Qt::LayoutDirection::RightToLeft',
'Qt::Horizontal': 'Qt::Orientation::Horizontal',
'Qt::Vertical': 'Qt::Orientation::Vertical',
'Qt::CustomDashLine': 'Qt::PenStyle::CustomDashLine',
'Qt::DashDotDotLine': 'Qt::PenStyle::DashDotDotLine',
'Qt::DashDotLine': 'Qt::PenStyle::DashDotLine',
'Qt::DashLine': 'Qt::PenStyle::DashLine',
'Qt::DotLine': 'Qt::PenStyle::DotLine',
'Qt::NoPen': 'Qt::PenStyle::NoPen',
'Qt::SolidLine': 'Qt::PenStyle::SolidLine',
'Qt::ScrollBarAlwaysOff': 'Qt::ScrollBarPolicy::ScrollBarAlwaysOff',
'Qt::ScrollBarAlwaysOn': 'Qt::ScrollBarPolicy::ScrollBarAlwaysOn',
'Qt::ScrollBarAsNeeded': 'Qt::ScrollBarPolicy::ScrollBarAsNeeded',
'Qt::ApplicationShortcut': 'Qt::ShortcutContext::ApplicationShortcut',
'Qt::WidgetShortcut': 'Qt::ShortcutContext::WidgetShortcut',
'Qt::WidgetWithChildrenShortcut': 'Qt::ShortcutContext::WidgetWithChildrenShortcut',
'Qt::WindowShortcut': 'Qt::ShortcutContext::WindowShortcut',
'Qt::ElideLeft': 'Qt::TextElideMode::ElideLeft',
'Qt::ElideRight': 'Qt::TextElideMode::ElideRight',
'Qt::ElideMiddle': 'Qt::TextElideMode::ElideMiddle',
'Qt::ElideNone': 'Qt::TextElideMode::ElideNone',
'Qt::NoTextInteraction': 'Qt::TextInteractionFlag::NoTextInteraction',
'Qt::TextSelectableByMouse': 'Qt::TextInteractionFlag::TextSelectableByMouse',
'Qt::TextSelectableByKeyboard': 'Qt::TextInteractionFlag::TextSelectableByKeyboard',
'Qt::LinksAccessibleByMouse': 'Qt::TextInteractionFlag::LinksAccessibleByMouse',
'Qt::LinksAccessibleByKeyboard': 'Qt::TextInteractionFlag::LinksAccessibleByKeyboard',
'Qt::TextEditable': 'Qt::TextInteractionFlag::TextEditable',
'Qt::TextEditorInteraction': 'Qt::TextInteractionFlag::TextEditorInteraction',
'Qt::TextBrowserInteraction': 'Qt::TextInteractionFlag::TextBrowserInteraction',
'Qt::AutoText': 'Qt::TextFormat::AutoText',
'Qt::MarkdownText': 'Qt::TextFormat::MarkdownText',
'Qt::PlainText': 'Qt::TextFormat::PlainText',
'Qt::RichText': 'Qt::TextFormat::RichText',
'Qt::LocalTime': 'Qt::TimeSpec::LocalTime',
'Qt::OffsetFromUTC': 'Qt::TimeSpec::OffsetFromUTC',
'Qt::TimeZone': 'Qt::TimeSpec::TimeZone',
'Qt::UTC': 'Qt::TimeSpec::UTC',
'Qt::LeftToolBarArea': 'Qt::ToolBarArea::LeftToolBarArea',
'Qt::RightToolBarArea': 'Qt::ToolBarArea::RightToolBarArea',
'Qt::TopToolBarArea': 'Qt::ToolBarArea::TopToolBarArea',
'Qt::BottomToolBarArea': 'Qt::ToolBarArea::BottomToolBarArea',
'Qt::AllToolBarAreas': 'Qt::ToolBarArea::AllToolBarAreas',
'Qt::NoToolBarArea': 'Qt::ToolBarArea::NoToolBarArea',
'Qt::ToolButtonFollowStyle': 'Qt::ToolButtonStyle::ToolButtonFollowStyle',
'Qt::ToolButtonIconOnly': 'Qt::ToolButtonStyle::ToolButtonIconOnly',
'Qt::ToolButtonTextBesideIcon': 'Qt::ToolButtonStyle::ToolButtonTextBesideIcon',
'Qt::ToolButtonTextOnly': 'Qt::ToolButtonStyle::ToolButtonTextOnly',
'Qt::ToolButtonTextUnderIcon': 'Qt::ToolButtonStyle::ToolButtonTextUnderIcon',
'Qt::ApplicationModal': 'Qt::WindowModality::ApplicationModal',
'Qt::NonModal': 'Qt::WindowModality::NonModal',
'Qt::WindowModal': 'Qt::WindowModality::WindowModal',
'QAbstractItemView::NoDragDrop': 'QAbstractItemView::DragDropMode::NoDragDrop',
'QAbstractItemView::DragOnly': 'QAbstractItemView::DragDropMode::DragOnly',
'QAbstractItemView::DropOnly': 'QAbstractItemView::DragDropMode::DropOnly',
'QAbstractItemView::DragDrop': 'QAbstractItemView::DragDropMode::DragDrop',
'QAbstractItemView::InternalMove': 'QAbstractItemView::DragDropMode::InternalMove',
'QAbstractItemView::NoEditTriggers': 'QAbstractItemView::EditTrigger::NoEditTriggers',
'QAbstractItemView::CurrentChanged': 'QAbstractItemView::EditTrigger::CurrentChanged',
'QAbstractItemView::DoubleClicked': 'QAbstractItemView::EditTrigger::DoubleClicked',
'QAbstractItemView::SelectedClicked': 'QAbstractItemView::EditTrigger::SelectedClicked',
'QAbstractItemView::EditKeyPressed': 'QAbstractItemView::EditTrigger::EditKeyPressed',
'QAbstractItemView::AnyKeyPressed': 'QAbstractItemView::EditTrigger::AnyKeyPressed',
'QAbstractItemView::AllEditTriggers': 'QAbstractItemView::EditTrigger::AllEditTriggers',
'QAbstractItemView::ScrollPerItem': 'QAbstractItemView::ScrollMode::ScrollPerItem',
'QAbstractItemView::ScrollPerPixel': 'QAbstractItemView::ScrollMode::ScrollPerPixel',
'QAbstractItemView::SelectColumns': 'QAbstractItemView::SelectionBehavior::SelectColumns',
'QAbstractItemView::SelectItems': 'QAbstractItemView::SelectionBehavior::SelectItems',
'QAbstractItemView::SelectRows': 'QAbstractItemView::SelectionBehavior::SelectRows',
'QAbstractItemView::NoSelection': 'QAbstractItemView::SelectionMode::NoSelection',
'QAbstractItemView::SingleSelection': 'QAbstractItemView::SelectionMode::SingleSelection',
'QAbstractItemView::MultiSelection': 'QAbstractItemView::SelectionMode::MultiSelection',
'QAbstractItemView::ExtendedSelection': 'QAbstractItemView::SelectionMode::ExtendedSelection',
'QAbstractItemView::ContiguousSelection': 'QAbstractItemView::SelectionMode::ContiguousSelection',
'QAbstractScrollArea::AdjustIgnored': 'QAbstractScrollArea::SizeAdjustPolicy::AdjustIgnored',
'QAbstractScrollArea::AdjustToContents': 'QAbstractScrollArea::SizeAdjustPolicy::AdjustToContents',
'QAbstractScrollArea::AdjustToContentsOnFirstShow': 'QAbstractScrollArea::SizeAdjustPolicy::AdjustToContentsOnFirstShow',
'QAbstractSpinBox::NoButtons': 'QAbstractSpinBox::ButtonSymbols::NoButtons',
'QAbstractSpinBox::PlusMinus': 'QAbstractSpinBox::ButtonSymbols::PlusMinus',
'QAbstractSpinBox::UpDownArrows': 'QAbstractSpinBox::ButtonSymbols::UpDownArrows',
'QAbstractSpinBox::CorrectToNearestValue': 'QAbstractSpinBox::CorrectionMode::CorrectToNearestValue',
'QAbstractSpinBox::CorrectToPreviousValue': 'QAbstractSpinBox::CorrectionMode::CorrectToPreviousValue',
'QAbstractSpinBox::AdaptiveDecimalStepType': 'QAbstractSpinBox::StepType::AdaptiveDecimalStepType',
'QAbstractSpinBox::DefaultStepType': 'QAbstractSpinBox::StepType::DefaultStepType',
'QAction::NoRole': 'QAction::MenuRole::NoRole',
'QAction::TextHeuristicRole': 'QAction::MenuRole::TextHeuristicRole',
'QAction::ApplicationSpecificRole': 'QAction::MenuRole::ApplicationSpecificRole',
'QAction::AboutQtRole': 'QAction::MenuRole::AboutQtRole',
'QAction::AboutRole': 'QAction::MenuRole::AboutRole',
'QAction::PreferencesRole': 'QAction::MenuRole::PreferencesRole',
'QAction::QuitRole': 'QAction::MenuRole::QuitRole',
'QCalendarWidget::LongDayNames': 'QCalendarWidget::HorizontalHeaderFormat::LongDayNames',
'QCalendarWidget::NoHorizontalHeader': 'QCalendarWidget::HorizontalHeaderFormat::NoHorizontalHeader',
'QCalendarWidget::ShortDayNames': 'QCalendarWidget::HorizontalHeaderFormat::ShortDayNames',
'QCalendarWidget::SingleLetterDayNames': 'QCalendarWidget::HorizontalHeaderFormat::SingleLetterDayNames',
'QCalendarWidget::NoSelection': 'QCalendarWidget::SelectionMode::NoSelection',
'QCalendarWidget::SingleSelection': 'QCalendarWidget::SelectionMode::SingleSelection',
'QCalendarWidget::ISOWeekNumbers': 'QCalendarWidget::VerticalHeaderFormat::ISOWeekNumbers',
'QCalendarWidget::NoVerticalHeader': 'QCalendarWidget::VerticalHeaderFormat::NoVerticalHeader',
'QComboBox::InsertAfterCurrent': 'QComboBox::InsertPolicy::InsertAfterCurrent',
'QComboBox::InsertAlphabetically': 'QComboBox::InsertPolicy::InsertAlphabetically',
'QComboBox::InsertAtBottom': 'QComboBox::InsertPolicy::InsertAtBottom',
'QComboBox::InsertAtCurrent': 'QComboBox::InsertPolicy::InsertAtCurrent',
'QComboBox::InsertAtTop': 'QComboBox::InsertPolicy::InsertAtTop',
'QComboBox::InsertBeforeCurrent': 'QComboBox::InsertPolicy::InsertBeforeCurrent',
'QComboBox::NoInsert': 'QComboBox::InsertPolicy::NoInsert',
'QComboBox::AdjustToContents': 'QComboBox::SizeAdjustPolicy::AdjustToContents',
'QComboBox::AdjustToContentsOnFirstShow': 'QComboBox::SizeAdjustPolicy::AdjustToContentsOnFirstShow',
'QComboBox::AdjustToMinimumContentsLengthWithIcon': 'QComboBox::SizeAdjustPolicy::AdjustToMinimumContentsLengthWithIcon',
'QDateTimeEdit::NoSection': 'QDateTimeEdit::Section::NoSection',
'QDateTimeEdit::AmPmSection': 'QDateTimeEdit::Section::AmPmSection',
'QDateTimeEdit::MSecSection': 'QDateTimeEdit::Section::MSecSection',
'QDateTimeEdit::SecondSection': 'QDateTimeEdit::Section::SecondSection',
'QDateTimeEdit::MinuteSection': 'QDateTimeEdit::Section::MinuteSection',
'QDateTimeEdit::HourSection': 'QDateTimeEdit::Section::HourSection',
'QDateTimeEdit::DaySection': 'QDateTimeEdit::Section::DaySection',
'QDateTimeEdit::MonthSection': 'QDateTimeEdit::Section::MonthSection',
'QDateTimeEdit::YearSection': 'QDateTimeEdit::Section::YearSection',
'QDialogButtonBox::NoButton': 'QDialogButtonBox::StandardButton::NoButton',
'QDialogButtonBox::Ok': 'QDialogButtonBox::StandardButton::Ok',
'QDialogButtonBox::Save': 'QDialogButtonBox::StandardButton::Save',
'QDialogButtonBox::SaveAll': 'QDialogButtonBox::StandardButton::SaveAll',
'QDialogButtonBox::Open': 'QDialogButtonBox::StandardButton::Open',
'QDialogButtonBox::Yes': 'QDialogButtonBox::StandardButton::Yes',
'QDialogButtonBox::YesToAll': 'QDialogButtonBox::StandardButton::YesToAll',
'QDialogButtonBox::No': 'QDialogButtonBox::StandardButton::No',
'QDialogButtonBox::NoToAll': 'QDialogButtonBox::StandardButton::NoToAll',
'QDialogButtonBox::Abort': 'QDialogButtonBox::StandardButton::Abort',
'QDialogButtonBox::Retry': 'QDialogButtonBox::StandardButton::Retry',
'QDialogButtonBox::Ignore': 'QDialogButtonBox::StandardButton::Ignore',
'QDialogButtonBox::Close': 'QDialogButtonBox::StandardButton::Close',
'QDialogButtonBox::Cancel': 'QDialogButtonBox::StandardButton::Cancel',
'QDialogButtonBox::Discard': 'QDialogButtonBox::StandardButton::Discard',
'QDialogButtonBox::Help': 'QDialogButtonBox::StandardButton::Help',
'QDialogButtonBox::Apply': 'QDialogButtonBox::StandardButton::Apply',
'QDialogButtonBox::Reset': 'QDialogButtonBox::StandardButton::Reset',
'QDialogButtonBox::RestoreDefaults': 'QDialogButtonBox::StandardButton::RestoreDefaults',
'QDockWidget::DockWidgetClosable': 'QDockWidget::DockWidgetFeature::DockWidgetClosable',
'QDockWidget::DockWidgetFloatable': 'QDockWidget::DockWidgetFeature::DockWidgetFloatable',
'QDockWidget::DockWidgetMovable': 'QDockWidget::DockWidgetFeature::DockWidgetMovable',
'QDockWidget::DockWidgetVerticalTitleBar': 'QDockWidget::DockWidgetFeature::DockWidgetVerticalTitleBar',
'QDockWidget::NoDockWidgetFeatures': 'QDockWidget::DockWidgetFeature::NoDockWidgetFeatures',
'QFontComboBox::AllFonts': 'QFontComboBox::FontFilter::AllFonts',
'QFontComboBox::MonospacedFonts': 'QFontComboBox::FontFilter::MonospacedFonts',
'QFontComboBox::NonScalableFonts': 'QFontComboBox::FontFilter::NonScalableFonts',
'QFontComboBox::ProportionalFonts': 'QFontComboBox::FontFilter::ProportionalFonts',
'QFontComboBox::ScalableFonts': 'QFontComboBox::FontFilter::ScalableFonts',
'QFontDatabase::Any': 'QFontDatabase::WritingSystem::Any',
'QFontDatabase::Latin': 'QFontDatabase::WritingSystem::Latin',
'QFontDatabase::Greek': 'QFontDatabase::WritingSystem::Greek',
'QFontDatabase::Cyrillic': 'QFontDatabase::WritingSystem::Cyrillic',
'QFontDatabase::Armenian': 'QFontDatabase::WritingSystem::Armenian',
'QFontDatabase::Hebrew': 'QFontDatabase::WritingSystem::Hebrew',
'QFontDatabase::Arabic': 'QFontDatabase::WritingSystem::Arabic',
'QFontDatabase::Syriac': 'QFontDatabase::WritingSystem::Syriac',
'QFontDatabase::Thaana': 'QFontDatabase::WritingSystem::Thaana',
'QFontDatabase::Devanagari': 'QFontDatabase::WritingSystem::Devanagari',
'QFontDatabase::Bengali': 'QFontDatabase::WritingSystem::Bengali',
'QFontDatabase::Gurmukhi': 'QFontDatabase::WritingSystem::Gurmukhi',
'QFontDatabase::Gujarati': 'QFontDatabase::WritingSystem::Gujarati',
'QFontDatabase::Oriya': 'QFontDatabase::WritingSystem::Oriya',
'QFontDatabase::Tamil': 'QFontDatabase::WritingSystem::Tamil',
'QFontDatabase::Telugu': 'QFontDatabase::WritingSystem::Telugu',
'QFontDatabase::Kannada': 'QFontDatabase::WritingSystem::Kannada',
'QFontDatabase::Malayalam': 'QFontDatabase::WritingSystem::Malayalam',
'QFontDatabase::Sinhala': 'QFontDatabase::WritingSystem::Sinhala',
'QFontDatabase::Thai': 'QFontDatabase::WritingSystem::Thai',
'QFontDatabase::Lao': 'QFontDatabase::WritingSystem::Lao',
'QFontDatabase::Tibetan': 'QFontDatabase::WritingSystem::Tibetan',
'QFontDatabase::Myanmar': 'QFontDatabase::WritingSystem::Myanmar',
'QFontDatabase::Georgian': 'QFontDatabase::WritingSystem::Georgian',
'QFontDatabase::Khmer': 'QFontDatabase::WritingSystem::Khmer',
'QFontDatabase::SimplifiedChinese': 'QFontDatabase::WritingSystem::SimplifiedChinese',
'QFontDatabase::TraditionalChinese': 'QFontDatabase::WritingSystem::TraditionalChinese',
'QFontDatabase::Japanese': 'QFontDatabase::WritingSystem::Japanese',
'QFontDatabase::Korean': 'QFontDatabase::WritingSystem::Korean',
'QFontDatabase::Vietnamese': 'QFontDatabase::WritingSystem::Vietnamese',
'QFontDatabase::Other': 'QFontDatabase::WritingSystem::Other',
'QFontDatabase::Symbol': 'QFontDatabase::WritingSystem::Symbol',
'QFontDatabase::Ogham': 'QFontDatabase::WritingSystem::Ogham',
'QFontDatabase::Runic': 'QFontDatabase::WritingSystem::Runic',
'QFontDatabase::Nko': 'QFontDatabase::WritingSystem::Nko',
'QFormLayout::AllNonFixedFieldsGrow': 'QFormLayout::FieldGrowthPolicy::AllNonFixedFieldsGrow',
'QFormLayout::ExpandingFieldsGrow': 'QFormLayout::FieldGrowthPolicy::ExpandingFieldsGrow',
'QFormLayout::FieldsStayAtSizeHint': 'QFormLayout::FieldGrowthPolicy::FieldsStayAtSizeHint',
'QFormLayout::DontWrapRows': 'QFormLayout::RowWrapPolicy::DontWrapRows',
'QFormLayout::WrapLongRows': 'QFormLayout::RowWrapPolicy::WrapLongRows',
'QFormLayout::WrapAllRows': 'QFormLayout::RowWrapPolicy::WrapAllRows',
'QFrame::Box': 'QFrame::Shape::Box',
'QFrame::HLine': 'QFrame::Shape::HLine',
'QFrame::NoFrame': 'QFrame::Shape::NoFrame',
'QFrame::Panel': 'QFrame::Shape::Panel',
'QFrame::StyledPanel': 'QFrame::Shape::StyledPanel',
'QFrame::VLine': 'QFrame::Shape::VLine',
'QFrame::WinPanel': 'QFrame::Shape::WinPanel',
'QFrame::Plain': 'QFrame::Shadow::Plain',
'QFrame::Raised': 'QFrame::Shadow::Raised',
'QFrame::Sunken': 'QFrame::Shadow::Sunken',
'QGraphicsView::CacheNone': 'QGraphicsView::CacheMode::CacheNone',
'QGraphicsView::CacheBackground': 'QGraphicsView::CacheMode::CacheBackground',
'QGraphicsView::DontAdjustForAntialiasing': 'QGraphicsView::OptimizationFlags::DontAdjustForAntialiasing',
'QGraphicsView::DontSavePainterState': 'QGraphicsView::OptimizationFlags::DontSavePainterState',
'QGraphicsView::NoAnchor': 'QGraphicsView::ViewportAnchor::NoAnchor',
'QGraphicsView::AnchorViewCenter': 'QGraphicsView::ViewportAnchor::AnchorViewCenter',
'QGraphicsView::AnchorUnderMouse': 'QGraphicsView::ViewportAnchor::AnchorUnderMouse',
'QGraphicsView::BoundingRectViewportUpdate': 'QGraphicsView::ViewportUpdateMode::BoundingRectViewportUpdate',
'QGraphicsView::FullViewportUpdate': 'QGraphicsView::ViewportUpdateMode::FullViewportUpdate',
'QGraphicsView::MinimalViewportUpdate': 'QGraphicsView::ViewportUpdateMode::MinimalViewportUpdate',
'QGraphicsView::NoViewportUpdate': 'QGraphicsView::ViewportUpdateMode::NoViewportUpdate',
'QGraphicsView::SmartViewportUpdate': 'QGraphicsView::ViewportUpdateMode::SmartViewportUpdate',
'QLayout::SetDefaultConstraint': 'QLayout::SizeConstraint::SetDefaultConstraint',
'QLayout::SetFixedSize': 'QLayout::SizeConstraint::SetFixedSize',
'QLayout::SetMaximumSize': 'QLayout::SizeConstraint::SetMaximumSize',
'QLayout::SetMinAndMaxSize': 'QLayout::SizeConstraint::SetMinAndMaxSize',
'QLayout::SetMinimumSize': 'QLayout::SizeConstraint::SetMinimumSize',
'QLayout::SetNoConstraint': 'QLayout::SizeConstraint::SetNoConstraint',
'QLCDNumber::Bin': 'QLCDNumber::Mode::Bin',
'QLCDNumber::Dec': 'QLCDNumber::Mode::Dec',
'QLCDNumber::Hex': 'QLCDNumber::Mode::Hex',
'QLCDNumber::Oct': 'QLCDNumber::Mode::Oct',
'QLCDNumber::Filled': 'QLCDNumber::SegmentStyle::Filled',
'QLCDNumber::Flat': 'QLCDNumber::SegmentStyle::Flat',
'QLCDNumber::Outline': 'QLCDNumber::SegmentStyle::Outline',
'QLineEdit::NoEcho': 'QLineEdit::EchoMode::NoEcho',
'QLineEdit::Normal': 'QLineEdit::EchoMode::Normal',
'QLineEdit::Password': 'QLineEdit::EchoMode::Password',
'QLineEdit::PasswordEchoOnEdit': 'QLineEdit::EchoMode::PasswordEchoOnEdit',
'QListView::LeftToRight': 'QListView::Flow::LeftToRight',
'QListView::TopToBottom': 'QListView::Flow::TopToBottom',
'QListView::Batched': 'QListView::LayoutMode::Batched',
'QListView::SinglePass': 'QListView::LayoutMode::SinglePass',
'QListView::Free': 'QListView::Movement::Free',
'QListView::Snap': 'QListView::Movement::Snap',
'QListView::Static': 'QListView::Movement::Static',
'QListView::Adjust': 'QListView::ResizeMode::Adjust',
'QListView::Fixed': 'QListView::ResizeMode::Fixed',
'QListView::IconMode': 'QListView::ViewMode::IconMode',
'QListView::ListMode': 'QListView::ViewMode::ListMode',
'QMdiArea::SubWindowView': 'QMdiArea::ViewMode::SubWindowView',
'QMdiArea::TabbedView': 'QMdiArea::ViewMode::TabbedView',
'QMdiArea::ActivationHistoryOrder': 'QMdiArea::WindowOrder::ActivationHistoryOrder',
'QMdiArea::CreationOrder': 'QMdiArea::WindowOrder::CreationOrder',
'QMdiArea::StackingOrder': 'QMdiArea::WindowOrder::StackingOrder',
'QPainter::Antialiasing': 'QPainter::RenderHint::Antialiasing',
'QPainter::LosslessImageRendering': 'QPainter::RenderHint::LosslessImageRendering',
'QPainter::SmoothPixmapTransform': 'QPainter::RenderHint::SmoothPixmapTransform',
'QPainter::TextAntialiasing': 'QPainter::RenderHint::TextAntialiasing',
'QPlainTextEdit::NoWrap': 'QPlainTextEdit::LineWrapMode::NoWrap',
'QPlainTextEdit::WidgetWidth': 'QPlainTextEdit::LineWrapMode::WidgetWidth',
'QProgressBar::BottomToTop': 'QProgressBar::Direction::BottomToTop',
'QProgressBar::TopToBottom': 'QProgressBar::Direction::TopToBottom',
'QQuickWidget::SizeRootObjectToView': 'QQuickWidget::ResizeMode::SizeRootObjectToView',
'QQuickWidget::SizeViewToRootObject': 'QQuickWidget::ResizeMode::SizeViewToRootObject',
'QSizePolicy::Fixed': 'QSizePolicy::Policy::Fixed',
'QSizePolicy::Minimum': 'QSizePolicy::Policy::Minimum',
'QSizePolicy::Maximum': 'QSizePolicy::Policy::Maximum',
'QSizePolicy::Preferred': 'QSizePolicy::Policy::Preferred',
'QSizePolicy::MinimumExpanding': 'QSizePolicy::Policy::MinimumExpanding',
'QSizePolicy::Expanding': 'QSizePolicy::Policy::Expanding',
'QSizePolicy::Ignored': 'QSizePolicy::Policy::Ignored',
'QSlider::NoTicks': 'QSlider::TickPosition::NoTicks',
'QSlider::TicksAbove': 'QSlider::TickPosition::TicksAbove',
'QSlider::TicksBelow': 'QSlider::TickPosition::TicksBelow',
'QSlider::TicksBothSides': 'QSlider::TickPosition::TicksBothSides',
'QSlider::TicksLeft': 'QSlider::TickPosition::TicksLeft',
'QSlider::TicksRight': 'QSlider::TickPosition::TicksRight',
'QTabWidget::North': 'QTabWidget::TabPosition::North',
'QTabWidget::South': 'QTabWidget::TabPosition::South',
'QTabWidget::West': 'QTabWidget::TabPosition::West',
'QTabWidget::East': 'QTabWidget::TabPosition::East',
'QTabWidget::Rounded': 'QTabWidget::TabShape::Rounded',
'QTabWidget::Triangular': 'QTabWidget::TabShape::Triangular',
'QTextEdit::AutoAll': 'QTextEdit::AutoFormattingFlag::AutoAll',
'QTextEdit::AutoBulletList': 'QTextEdit::AutoFormattingFlag::AutoBulletList',
'QTextEdit::AutoNone': 'QTextEdit::AutoFormattingFlag::AutoNone',
'QTextEdit::FixedColumnWidth': 'QTextEdit::LineWrapMode::FixedColumnWidth',
'QTextEdit::FixedPixelWidth': 'QTextEdit::LineWrapMode::FixedPixelWidth',
'QTextEdit::NoWrap': 'QTextEdit::LineWrapMode::NoWrap',
'QTextEdit::WidgetWidth': 'QTextEdit::LineWrapMode::WidgetWidth',
'QToolButton::DelayedPopup': 'QToolButton::ToolButtonPopupMode::DelayedPopup',
'QToolButton::InstantPopup': 'QToolButton::ToolButtonPopupMode::InstantPopup',
'QToolButton::MenuButtonPopup': 'QToolButton::ToolButtonPopupMode::MenuButtonPopup',
'QWizard::CancelButtonOnLeft': 'QWizard::WizardOption::CancelButtonOnLeft',
'QWizard::DisabledBackButtonOnLastPage': 'QWizard::WizardOption::DisabledBackButtonOnLastPage',
'QWizard::ExtendedWatermarkPixmap': 'QWizard::WizardOption::ExtendedWatermarkPixmap',
'QWizard::HaveCustomButton1': 'QWizard::WizardOption::HaveCustomButton1',
'QWizard::HaveCustomButton2': 'QWizard::WizardOption::HaveCustomButton2',
'QWizard::HaveCustomButton3': 'QWizard::WizardOption::HaveCustomButton3',
'QWizard::HaveFinishButtonOnEarlyPages': 'QWizard::WizardOption::HaveFinishButtonOnEarlyPages',
'QWizard::HaveHelpButton': 'QWizard::WizardOption::HaveHelpButton',
'QWizard::HaveNextButtonOnLastPage': 'QWizard::WizardOption::HaveNextButtonOnLastPage',
'QWizard::HelpButtonOnRight': 'QWizard::WizardOption::HelpButtonOnRight',
'QWizard::IgnoreSubTitles': 'QWizard::WizardOption::IgnoreSubTitles',
'QWizard::IndependentPages': 'QWizard::WizardOption::IndependentPages',
'QWizard::NoBackButtonOnLastPage': 'QWizard::WizardOption::NoBackButtonOnLastPage',
'QWizard::NoBackButtonOnStartPage': 'QWizard::WizardOption::NoBackButtonOnStartPage',
'QWizard::NoCancelButton': 'QWizard::WizardOption::NoCancelButton',
'QWizard::NoCancelButtonOnLastPage': 'QWizard::WizardOption::NoCancelButtonOnLastPage',
'QWizard::NoDefaultButton': 'QWizard::WizardOption::NoDefaultButton',
'QWizard::AeroStyle': 'QWizard::WizardStyle::AeroStyle',
'QWizard::ClassicStyle': 'QWizard::WizardStyle::ClassicStyle',
'QWizard::MacStyle': 'QWizard::WizardStyle::MacStyle',
'QWizard::ModernStyle': 'QWizard::WizardStyle::ModernStyle',
}
|
PypiClean
|
/nni-3.0rc1-py3-none-macosx_10_9_x86_64.whl/nni_node/node_modules/lodash/release.md
|
npm run build
npm run doc
npm i
git clone --depth=10 --branch=master [email protected]:lodash-archive/lodash-cli.git ./node_modules/lodash-cli
mkdir -p ./node_modules/lodash-cli/node_modules/lodash; cd $_; cp ../../../../lodash.js ./lodash.js; cp ../../../../package.json ./package.json
cd ../../; npm i --production; cd ../../
node ./node_modules/lodash-cli/bin/lodash core exports=node -o ./npm-package/core.js
node ./node_modules/lodash-cli/bin/lodash modularize exports=node -o ./npm-package
cp lodash.js npm-package/lodash.js
cp dist/lodash.min.js npm-package/lodash.min.js
cp LICENSE npm-package/LICENSE
1. Clone two repos
Bump the lodash version in package.json, README, package-lock.json, and lodash.js
npm run build
npm run doc
2. Update mappings in lodash-cli.
3. Copy lodash into lodash-cli's node_modules along with its package.json.
node ./node_modules/lodash-cli/bin/lodash core exports=node -o ./npm-package/core.js
node ./node_modules/lodash-cli/bin/lodash modularize exports=node -o ./npm-package
1. Clone the two repositories:
```sh
$ git clone https://github.com/lodash/lodash.git
$ git clone https://github.com/bnjmnt4n/lodash-cli.git
```
2. Update lodash-cli to accommodate changes in the lodash source. This typically involves adding new function dependency mappings in lib/mappings.js. Sometimes, additional changes might be needed for more involved functions.
3. In the lodash repository, update references to the lodash version in README.md, lodash.js, package.json, and package-lock.json.
4. Run:
```sh
npm run build
npm run doc
node ../lodash-cli/bin/lodash core -o ./dist/lodash.core.js
```
5. Add a commit and tag the release
mkdir ../lodash-temp
cp lodash.js dist/lodash.min.js dist/lodash.core.js dist/lodash.core.min.js ../lodash-temp/
node ../lodash-cli/bin/lodash modularize exports=node -o .
cp ../lodash-temp/lodash.core.js core.js
cp ../lodash-temp/lodash.core.min.js core.min.js
cp ../lodash-temp/lodash.js lodash.js
cp ../lodash-temp/lodash.min.js lodash.min.js
❯ node ../lodash-cli/bin/lodash modularize exports=es -o .
|
PypiClean
|
/markdown-checklists-0.6.3.tar.gz/markdown-checklists-0.6.3/README
|
[Markdown Checklists](https://github.com/tobiashochguertel/markdown-checklists)
[](https://travis-ci.org/tobiashochguertel/markdown-checklists)
<!--[](https://coveralls.io/r/FND/markdown-checklist)-->
a [Python Markdown](http://pythonhosted.org/Markdown/) extension for lists of
tasks with checkboxes inspired by [GitHub task lists](https://github.com/blog/1375-task-lists-in-gfm-issues-pulls-comments).
Markdown-Checklists is forked from [Markdown Checklist](https://github.com/FND/markdown-checklist) and extended with additional features.
## Features
* a dash can be used instead of an asterisk for list items
* both upper- and lowercase "x" are accepted to activate checkboxes
## Additional Features
* The Makefile provides a task to install a Markdown renderer extension (with additional template files) for the Sublime Text 3 plugin [OmniMarkupPreviewer](https://github.com/timonwong/OmniMarkupPreviewer).
* `class` attribute for `<ul>`-Tag for Checklists.
* `class` attribute for `<li>`-Tag of Checklists.
* Generating a key (hash) for each checklist item's text; the hash is used to make checklist items checkable.
* `id` attribute for `<input>`-Tag of Checklists.
* `for` attribute for `<label>`-Tag of Checklists.
## Example HTML Output
```
<h1>Hello World</h1>
<ul class="checklist">
<li class="task-list-item"><input type="checkbox" id="ca052d1d7e0a2f787f4ef9937840dcf91e647b08b208df4bbce2e78d527a4f8c"><label for="ca052d1d7e0a2f787f4ef9937840dcf91e647b08b208df4bbce2e78d527a4f8c"> foo</label></li>
<li class="task-list-item"><input type="checkbox" id="375719a43941c6a5e7f957c74b6f1d7e20cfefd0040181aaf6d3074c8eaac311" checked><label for="375719a43941c6a5e7f957c74b6f1d7e20cfefd0040181aaf6d3074c8eaac311"> bar</label></li>
<li class="task-list-item"><input type="checkbox" id="7d80d75283fdbf2a3d8a0e2eed45e9d844d1a7482372cd8bc59581725373c179"><label for="7d80d75283fdbf2a3d8a0e2eed45e9d844d1a7482372cd8bc59581725373c179"> baz</label></li>
<li class="task-list-item"><input type="checkbox"></li>
<li class="task-list-item"><input type="checkbox" checked></li>
</ul>
<p>lorem ipsum</p>
```
## Installation
$ pip install markdown-checklists
### Markdown-Renderer Extension for OmniMarkupPreviewer
*installs the extension to the current user.*
$ make OmniMarkupPreviewerInstall
## Usage
import markdown
html = markdown.markdown(source, extensions=['markdown_checklists.extension'])
or
import markdown
from markdown_checklists.extension import ChecklistsExtension
html = markdown.markdown(source, extensions=[ChecklistsExtension()])
There is also a small JavaScript/jQuery library to make checkboxes interactive:
new Checklists("article", function(checkbox, callback) {
var uri = checkbox.closest("article").find("h1 a").attr("href");
jQuery.get(uri, callback);
}, function(markdown, checkbox, callback) {
var uri = checkbox.closest("article").find("h1 a").attr("href");
jQuery.ajax({
type: "put",
uri: uri,
data: markdown,
success: callback
});
});
See included `checklists.js` for details.
|
PypiClean
|
/m360-ptelegraf-3.2.2.tar.gz/m360-ptelegraf-3.2.2/m360/agents/tomcat/manager.py
|
import requests
import bs4
import re
import m360.base.manager as base
from logging import getLogger
LOG=getLogger('m360.agents.tomcat')
requests.packages.urllib3.disable_warnings()
class Manager(base.Manager):
@staticmethod
def list_processes(text_url, username, password):
'''
This function queries the Tomcat Manager for a list of running
Tomcat container processes.
:param:
        :returns: a dict of process information. Keys are the context paths; values
are also dictionaries with uniform keys 'running'(boolean),
'session_count'(int), and 'context_path'(string).
:raises:
'''
url = text_url + '/text/list'
session = requests.Session()
session.verify = False
resp = session.get(url, auth=(username, password))
resp.raise_for_status()
# construct dict of dicts
processes_by_context = {}
for line in resp.text.split('\n'):
if line.startswith('/'):
segments = line.split(':')
# context path as key
context = segments[0]
# dict of segment name/values
val = {}
val['running'] = segments[1] and segments[1] == 'running'
val['session_count'] = int(segments[2])
val['context_path'] = segments[3]
# make dict entry
processes_by_context[context] = val
return processes_by_context
@staticmethod
def server_info(text_url, username, password):
uris = {'/text/serverinfo': Manager.server_info_tomcat8, # TOMCAT8,
'/html': Manager.server_info_tomcat5 # TOMCAT5
}
resp = None
for uri in uris:
url = text_url + uri
try:
session = requests.Session()
session.verify = False
resp = session.get(url, auth=(username, password))
resp.raise_for_status()
                break  # take the first URL that responds
            except requests.HTTPError as e:
                LOG.warning("Could not connect to endpoint: %s, error: %s", url, e,
                            extra=Manager.getcaller())
                resp = None
            except requests.ConnectionError as e:
                LOG.warning("Could not connect to endpoint: %s, error: %s", url, e,
                            extra=Manager.getcaller())
                resp = None
return uris[uri](resp.text, resp.encoding) if resp else {}
@staticmethod
def server_info_tomcat5(text, encode):
servinfo = {}
# scrape the HTML
soup = bs4.BeautifulSoup(text, "html.parser")
tables = soup.find_all('table')
# TOMCAT5
# TOMCAT8
lasttable = tables[-1:][0]
lines = Manager.scrape_table_rows(lasttable)
for i in range(len(lines[1])):
var = lines[1][i]
servinfo[var.strip()] = lines[2][i].strip()
return servinfo
@staticmethod
def server_info_tomcat8(text, encode):
servinfo = {}
lines = text.split('\n')
if lines[0].startswith('OK -'):
for line in lines[1:]:
parts = line.split(':')
# lines without one ':' should be ignored
if len(parts) > 1:
if len(parts) > 2:
# there was an extra colon in the value.
# Rebuild the value
parts = [parts[0], ':'.join(parts[1:])]
servinfo[parts[0].strip()] = parts[1].strip()
return servinfo
@staticmethod
def server_resources(text_url, username, password):
uris = {'/text/resources': Manager.jndi_resources, # TOMCAT8,
'/resources': Manager.jndi_resources # TOMCAT5
}
resp = None
for uri in uris:
url = text_url + uri
try:
session = requests.Session()
session.verify = False
resp = session.get(url, auth=(username, password))
resp.raise_for_status()
                break  # take the first URL that responds
            except requests.HTTPError as e:
                LOG.warning("Could not connect to endpoint: %s, error: %s", url, e,
                            extra=Manager.getcaller())
                resp = None
            except requests.ConnectionError as e:
                LOG.warning("Could not connect to endpoint: %s, error: %s", url, e,
                            extra=Manager.getcaller())
                resp = None
return uris[uri](resp.text) if resp else {}
@staticmethod
def jndi_resources(text):
jndis = text.split('\n')
        if jndis[0].startswith('OK -'):
            if len(jndis) == 1:
                return []
            return jndis[1:]
        return []
@staticmethod
def find_named_sibling(tag, desired_name, how_many_tries=5, beforetag=None):
'''
Convenience method. Given a BeautifulSoup tag, finds the first occurrence
of the desired_name in a sibling tag. Will look for up to how_many_tries.
'''
sib = tag
for indx in range(how_many_tries):
if sib.next_sibling:
sib = sib.next_sibling
if sib.name == desired_name:
return sib
if beforetag:
if sib.name == beforetag:
return None
else:
                LOG.debug("Ran out of siblings at index %d", indx, extra=Manager.getcaller())
return None
@staticmethod
def scrape_table_rows(tbl, filt=None):
'''
Walks through the rows of a table. If a given row is not eliminated
by filter function 'filt', the row is converted into a list of string
values of the th or td tags in the row.
The 'filt' function must return True if the row is desired, else False.
'''
retrows = []
rows = tbl.find_all('tr')
if filt:
rows = [row for row in rows if filt(row)]
for row in rows:
retrows.append([cell.string.strip() for cell in row.find_all(['td', 'th'])])
return retrows
@staticmethod
def scrape_paragraph(p, filt=None):
'''
        Walks through the text of a paragraph, parsing "field : value [MB|KB|GB|B]"
        pairs into a dict. Values with a unit suffix are converted to bytes.
        The 'filt' argument is accepted for symmetry with scrape_table_rows but is
        currently unused.
'''
retrows = {}
        factors = {"MB": 1024 * 1024, "KB": 1024, "GB": 1024 * 1024 * 1024, "B": 1}
# field : value [MB|KB|GB|B]"
for text in p.contents:
if type(text) != bs4.NavigableString:
continue
if text != "":
text = text.replace(":", " : ")
arraytext = text.split()
field = ""
value = ""
fieldcomplete = False
for word in arraytext:
if word != ":" and word not in ["MB", "KB", "GB", "B"]:
if fieldcomplete and value == "":
value = word
else:
if value != "":
retrows[field] = value
value = ""
field = ""
fieldcomplete = False
field = field + word.capitalize()
elif word == ":":
fieldcomplete = True
else:
value = float(value) * factors[word]
fieldcomplete = False
retrows[field] = value
field = ""
value = ""
if value != "":
retrows[field] = value
return retrows
@staticmethod
def skip_ready_threads(row):
'''
Filter method: returns True IFF the parameter has a first cell that does
not contain the string value "R". Intended to eliminate thread table
rows that are in "Ready" state (i.e., not working.)
:param row: A beautiful soup tag for an HTML row
:returns: False if the parameter is None, has no first cell, or has a
first cell with string value "R"; else True
'''
if row:
firstcell = row.find(['th', 'td'])
if firstcell:
firstcontent = firstcell.string
return firstcontent != 'R'
else:
return False
else:
return False
@staticmethod
def server_jmx(status_url, username, password, metrics=True, rootnode="Catalina", beanname="", **kargs):
data = {}
        extrarrgs = ",".join([arg + "=" + value if value != "*" else "*" for (arg, value) in kargs.items()])
args = "{0}:type={1},{2}".format(rootnode, beanname, extrarrgs)
try:
session = requests.Session()
session.verify = False
LOG.debug("Llamando al jmxproxy: %s", status_url + "/jmxproxy?qry=" + args, extra=Manager.getcaller())
resp = session.get(status_url + "/jmxproxy", params={"qry": args}, auth=(username, password))
resp.raise_for_status()
except Exception as e:
LOG.error("Error al invocar al jmxproxy: %s,Error: %s", status_url + "/jmxproxy?qry=" + args,
str(e), extra=Manager.getcaller())
return data
        if resp.text.startswith("OK"):
            text = resp.text.replace("\r", "").split("\n")
indexes = [i for i in range(len(text)) if text[i].startswith("Name:")]
indexes.append(len(text) - 1)
beanblocks = [text[indexes[x]:indexes[x + 1]] for x in range(len(indexes) - 1)]
for bean in beanblocks:
if bean[0].startswith("Name"):
m = re.search(r'name="?([^\s,"]+)"?', bean[0], re.IGNORECASE)
if m:
name = m.group(1)
else:
m = re.search(r'host=([^\s,]+)', bean[0], re.IGNORECASE)
if m:
name = m.group(1)
else:
break
data[name] = {}
for j in range(1, len(bean)):
if bean[j] == "":
continue
if bean[j].startswith("modeler"):
continue
keyvalue = bean[j].split(":")
if (len(keyvalue) > 1):
key = keyvalue[0].strip()
try:
if metrics:
val = float(keyvalue[1].strip())
else:
val = keyvalue[1].strip()
data[name][key] = val
except Exception as e:
pass
return data
@staticmethod
def server_status(status_url, username, password):
session = requests.Session()
session.verify = False
resp = session.get(status_url + "/status", auth=(username, password))
resp.raise_for_status()
# scrape the HTML
soup = bs4.BeautifulSoup(resp.text, "html.parser")
# html headers are defined as header name (page content) and filter (function)
header_defs = {r"JVM": None,
r"http": Manager.skip_ready_threads,
r"jk": Manager.skip_ready_threads,
r"ajp": Manager.skip_ready_threads}
hdrs = soup.find_all('h1')
headertables = {}
for hdr in hdrs:
            # Tomcat 8 wraps the header name in double quotes
headername = str(hdr.string).replace('"', '')
headers = [(headername, h) for h in header_defs if re.match(h, headername)]
if headers:
p = Manager.find_named_sibling(hdr, 'p')
if p:
data = Manager.scrape_paragraph(p, filt=header_defs[headers[0][1]])
headertables[headers[0][0]] = data
# TOMCAT 8x: Memory Pool.
if headername == "JVM":
table = Manager.find_named_sibling(hdr, 'table', beforetag="h1")
if table:
headertables["MemoryPool"] = []
lines = Manager.scrape_table_rows(table)
if len(lines) > 1:
for j in range(len(lines) - 1):
metricas = {}
for i in range(len(lines[0])):
metricas[lines[0][i].replace(' ', '')] = Manager.to_bytes(lines[j + 1][i])
headertables["MemoryPool"].append(metricas)
return headertables
@staticmethod
def to_bytes(value):
        ret = value.replace(' ', '')
        ret = re.sub(r"\(.*\)", "", ret)
if re.search("[0-9](MB|KB|GB|B)$", ret):
factor = 1
if ret.endswith("MB"):
factor = 1024 * 1024
elif ret.endswith("KB"):
factor = 1024
elif ret.endswith("GB"):
factor = 1024 * 1024 * 1024
try:
ret = float(re.sub("[A-Za-z]", "", ret)) * factor
except Exception as e:
return ret
return ret
|
PypiClean
|
/ClueDojo-1.4.3-1.tar.gz/ClueDojo-1.4.3-1/src/cluedojo/static/dojox/validate/README
|
-------------------------------------------------------------------------------
dojox.validate
-------------------------------------------------------------------------------
Version 0.02
Release date: 07/12/2007
-------------------------------------------------------------------------------
Project state: experimental / beta
-------------------------------------------------------------------------------
Credits
port: Peter Higgins (dante)
contributions: Kun Xi (bookstack at gmail com), Jared Jurkiewicz
-------------------------------------------------------------------------------
Project description
Provides a set of validation functions to match
values against known constants for use in form
validation, such as email address, TLD, ipAddress,
country-specific phone numbers and SSN, among
others.
It is broken into many parts. dojox.validate._base
is required by most named modules in this project.
ca.js provides Canadian specific functionality
check.js provides an experimental form-management utility,
which will likely be deprecated in favor
creditCard.js provides validation functions for most standard
credit card types.
isbn.js validates ISBN numbers
regexp.js provides a strange place to put regular expressions
related to validation. It was formerly polluting namespaces
and created in `dojox.regexp`. This is now `dojox.validate.regexp`
to confine values to the dojox.validate project.
us.js provides US-Specific validation. Zip, Social, etc.
web.js provides url and email address validation, as well as a number
of web/network related validation functions.
br.js provides Brazil specific validators for CNPJ and CPF numbers.
-------------------------------------------------------------------------------
Dependencies:
Requires Base Dojo and dojo.regexp.
-------------------------------------------------------------------------------
Installation instructions
Grab the following from the Dojo SVN Repository:
http://svn.dojotoolkit.org/src/dojox/trunk/validate.js
http://svn.dojotoolkit.org/src/dojox/trunk/validate
Install into the following directory structure:
/dojox/validate/
...which should be at the same level as your Dojo checkout.
|
PypiClean
|
/apachecn_sec_zh_pt2-2022.9.27.0-py3-none-any.whl/ApachecnSecZhPt2/docs/real-world-crypto/02.md
|
# 2. Hash functions

This chapter covers

* Hash functions and their security properties
* The widely adopted hash functions in use today
* Other types of hashing that exist

Giving a globally unique identifier to anything: that is the promise of the first cryptographic construction you will learn about in this chapter, the *hash function*. Hash functions are everywhere in cryptography. Everywhere! Informally, they take any data you want as input and produce a unique string of bytes in return. Given the same input, a hash function always reproduces the same string of bytes. That might not sound like much, but this simple construction is extremely useful for building many other cryptographic constructions. In this chapter, you will learn everything there is to know about hash functions and why they are so versatile.

## 2.1 What is a hash function?

In front of you, a download button takes up a large chunk of the page. You can read the word *DOWNLOAD*, and clicking it seems to redirect you to a different website that hosts the file. Below it lies a long string of unintelligible letters:

```
f63e68ac0bf052ae923c03f5b12aedc6cca49874c1c9b0ccf3f39b662d1f487b
```

It is followed by what looks like an acronym of some sort: `sha256sum`. Sound familiar? You have perhaps downloaded something in a past life that was also accompanied by such a strange string (figure 2.1).


Figure 2.1 A web page linking to an external website containing a file. The external website cannot modify the content of the file, because the first page provides a hash, or digest, of the file, which ensures the integrity of the downloaded file.

If you were wondering what that long string is for, here is how to use it:

1. Click the button to download the file.
2. *Hash* the downloaded file with the SHA-256 algorithm.
3. Compare the output (the digest) with the long string displayed on the web page.

This allows you to verify that you downloaded the right file.

NOTE The output of a hash function is often called a *digest* or a *hash*. I use the two words interchangeably throughout this book. Others might call it a *checksum* or a *sum*, which I avoid, as those terms are primarily used for non-cryptographic hash functions and could lead to more confusion. Keep this in mind when different codebases or documents use different terms.

To try hashing something, you can use the popular OpenSSL library. It offers a multi-purpose command-line interface (CLI) that comes by default on many systems, including macOS. For example, you can do this by opening a terminal and writing the following line:

```
$ openssl dgst -sha256 downloaded_file
f63e68ac0bf052ae923c03f5b12aedc6cca49874c1c9b0ccf3f39b662d1f487b
```

With this command, we used the SHA-256 hash function to transform the input (the downloaded file) into a unique identifier (the value echoed back by the command). What do these extra steps provide? They provide *integrity and authenticity*. They tell you that what you downloaded is indeed the file you were supposed to download.

All of this works thanks to a security property of hash functions called *second preimage resistance*. This math-inspired term means that from the long output of the hash function, `f63e...`, you cannot find another file that will hash to the same output, `f63e...`. In practice, it means that this digest is closely tied to the file you are downloading and that no attacker should be able to fool you by giving you a different file.

Hexadecimal notation

By the way, the long output string `f63e...` represents binary data displayed in *hexadecimal* (a base-16 encoding that uses the digits 0 through 9 and the letters *a* through *f* to represent groups of bits). We could display the binary data with 0s and 1s (base 2), but that would take up more space. Instead, hexadecimal encoding lets us write 2 alphanumeric characters for every 8 bits (1 byte) encountered. It is human readable and takes up less space. There are other ways to encode binary data, but the two most widely used encodings are hexadecimal and base64. The larger the base, the less space a binary string takes to display, but at some point we run out of human-readable characters.
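To make the encoding trade-off concrete, here is a minimal Python sketch (standard library only) that displays the same 32-byte digest in binary, hexadecimal, and base64; the sample input string is arbitrary:

```
import base64
import hashlib

digest = hashlib.sha256(b"downloaded file contents").digest()  # 32 raw bytes

# Base 2: 256 characters, one per bit
print(''.join(f'{byte:08b}' for byte in digest))
# Base 16: 64 characters, two per byte
print(digest.hex())
# Base 64: 44 characters (including '=' padding)
print(base64.b64encode(digest).decode())
```

The higher the base, the shorter the printed string, which is exactly the trade-off described above.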
Note that this long digest is controlled by the owner(s) of the web page, and it could easily be replaced by anyone who can modify the page. (If you are not convinced, take a moment to think about it.) This means that we need to trust the page that provides the digest, its owners, and the mechanism used to retrieve the page (while we don't need to trust the page that serves the downloaded file). In this sense, *the hash function alone does not provide integrity*. The integrity and authenticity of the downloaded file come from the digest combined with a trusted mechanism for obtaining that digest (HTTPS in this case). We will talk about HTTPS in chapter 9, but for now, imagine that it magically allows you to communicate securely with a website.

Back to our hash function, which can be visualized as the black box in figure 2.2. Our black box takes a single input and gives out a single output.


Figure 2.2 A hash function takes an arbitrary-length input (a file, a message, a video, and so on) and produces a fixed-length output (for example, 256 bits for SHA-256). Hashing the same input produces the same digest or hash.

The *input* of this function can be of any size. It can even be empty. The *output* is always of the same length and *deterministic*: it always produces the same result if given the same input. In our example, SHA-256 always provides an output of 256 bits (32 bytes), which is always encoded as 64 alphanumeric characters in hexadecimal. One major property of a hash function is that one cannot revert the algorithm, meaning that one cannot find the input from just the output. We say that hash functions are one-way.

To illustrate how a hash function works in practice, we'll use the same OpenSSL CLI to hash different inputs with the SHA-256 hash function. The following terminal session illustrates this.

```
$ echo -n "hello" | openssl dgst -sha256
2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
$ echo -n "hello" | openssl dgst -sha256                ❶
2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
$ echo -n "hella" | openssl dgst -sha256                ❷
70de66401b1399d79b843521ee726dcec1e9a8cb5708ec1520f1f3bb4b1dd984
$ echo -n "this is a very very very very very very      ❸
➥ very very very long sentence" | openssl dgst -sha256
1166e94d8c45fd8b269ae9451c51547dddec4fc09a91f15a9e27b14afee30006
```

❶ Hashing the same input produces the same result.
❷ A tiny change in the input completely changes the output.
❸ The output is always of the same size, no matter the size of the input.

In the next section, we will see what the exact security properties of hash functions are.
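If you would rather experiment from a program than from a terminal, a minimal Python equivalent of the session above looks like this (standard library only):

```
import hashlib

for message in (b"hello", b"hello", b"hella"):
    # sha256() returns a hash object; hexdigest() encodes the 32 bytes in base 16
    print(message, hashlib.sha256(message).hexdigest())
```

Running it shows the same behaviors: identical inputs give identical digests, and flipping one character yields a completely different digest of the same length.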
## 2.2 Security properties of a hash function

Hash functions in applied cryptography are constructions commonly defined to provide three specific security properties. This definition has changed over time, as we will see in the next sections. But for now, let's define the three strong foundations that make up a hash function. This is important, as you need to understand where hash functions are useful and where they will not work.

The first one is *preimage resistance*. This property ensures that no one should be able to reverse the hash function in order to recover the input, given an output. In figure 2.3, we illustrate this "one-wayness" by imagining that our hash function works like a blender, making it impossible to recover the ingredients from the produced smoothie.


Figure 2.3 Given the digest produced by a hash function (represented here as a blender), it is impossible (or technically so hard that we assume it will never happen) to reverse it and find the original input that was used. This security property is called *preimage resistance*.

WARNING Is this true if your input is small? Let's say the input is either *yes* or *no*; then it is easy for someone to hash all possible short words and figure out what the input was. What if your input space is small? That is, you always hash slight variations of sentences like "I will be home on Monday at 3 a.m." Here, one can predict this and, without knowing the exact day of the week or the hour, still hash all possible sentences until the correct output is produced. Thus, this first preimage security property has an obvious caveat: *you cannot hide something that is too small or too predictable*.

The second property is *second preimage resistance*. We already saw this security property when we wanted to protect the integrity of a file. The property says the following: if I give you an input and the digest it hashes to, you should not be able to find a different input that hashes to the same digest. Figure 2.4 illustrates this principle.


Figure 2.4 Given an input and its associated digest, one should never be able to find a different input that hashes to the same output. This security property is called *second preimage resistance*.

NOTE *We do not control the first input.* This emphasis is important for understanding the next security property of hash functions.

Finally, the third property is *collision resistance*. It guarantees that no one should be able to produce two different inputs that hash to the same output (as illustrated in figure 2.5). Here, an attacker can choose both inputs, unlike with the previous property, which fixes one of the inputs.


Figure 2.5 One should never be able to find two inputs (represented on the left as two random blobs of data) that hash to the same output value (on the right). This security property is called *collision resistance*.

People often confuse collision resistance and second preimage resistance. Take a moment to understand the difference.

Random oracles

In addition, hash functions are usually designed so that their digests are *unpredictable and random*. This is useful because one cannot always prove a protocol secure based on one of the hash function security properties we discussed (collision resistance, for example). Many protocols are instead proven in the *random oracle model*, where a fictional and ideal participant called a random oracle is used. In this type of protocol, any input can be sent as a request to the random oracle, which is said to return a completely random output in response, and, like a hash function, giving it the same input twice returns the same output twice.

Proofs in this model are sometimes controversial, as we cannot be sure that random oracles can be replaced with real hash functions in practice. Yet, many legitimate protocols are proven secure using this method, where hash functions are seen as more ideal than they probably are.

## 2.3 Security considerations for hash functions

So far, we have seen three security properties of a hash function:

* Preimage resistance
* Second preimage resistance
* Collision resistance

These security properties are often meaningless on their own; it all depends on how you make use of the hash function. Nonetheless, it is important to understand some limitations here before we look at some real-world hash functions.

First, these security properties assume that you are (reasonably) using the hash function. Imagine that I hash either the word *yes* or the word *no*, and I then publish the digest. If you have some idea of what I am doing, you can simply hash both words and compare the results with what I gave you. Because there are no secrets involved, and because the hashing algorithm we used is public, you are free to do that. One could indeed argue that this breaks the preimage resistance of the hash function, but I would argue that your input was not "random" enough. Furthermore, because a hash function accepts inputs of arbitrary length and always produces an output of the same length, there are also infinitely many inputs that hash to the same output. Again, you could say, "Well, isn't this breaking second preimage resistance?" Second preimage resistance merely says that it is extremely hard to find another such input, so hard that we assume it is practically impossible, though not theoretically impossible.

Second, *the size of the digest does matter*. This is not a peculiarity of hash functions by any means; all cryptographic algorithms must care about the size of their parameters in practice. Let's imagine the following extreme example. We have a hash function that produces outputs of length 2 bits in a uniformly random fashion (meaning that it outputs `00` 25% of the time, `01` 25% of the time, and so on). You won't have to do too much work to produce a collision: after hashing a few random input strings, you should be able to find two that hash to the same output. For this reason, there is a *minimum output size* that a hash function *must* produce in practice: 256 bits (or 32 bytes). With an output this large, collisions should be out of reach unless there is a breakthrough in computing.

How was this number obtained? In real-world cryptography, algorithms aim for a minimum of 128 bits of security. This means that an attacker who wants to break an algorithm (providing 128 bits of security) would have to perform about 2^128 operations (for example, trying all possible input strings of length 128 bits would take 2^128 operations). For a hash function to provide all three security properties mentioned earlier, it needs to provide at least 128 bits of security against all three attacks. The easiest attack is usually to find collisions, due to the *birthday bound*.

The birthday bound

The birthday bound takes its name from probability theory, in which the birthday problem reveals some unintuitive results. How many people do you need in a room so that with at least a 50% chance, two people share the same birthday (that's a collision)? It turns out that 23 people chosen at random are enough to reach these odds! Weird, right?

This is called the *birthday paradox*. In practice, when we randomly generate strings from a space of 2^N possibilities, you can expect with a 50% chance that someone will find a collision after approximately 2^(N/2) strings have been generated.

If our hash function generates random outputs of 256 bits, the space of all outputs is of size 2^256. This means that collisions can be found with good probability after generating 2^128 digests (due to the birthday bound). This is the number we are aiming for, and it is why hash functions must provide a minimum of 256-bit outputs.
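You can verify the birthday bound empirically by truncating a real hash function to a tiny output. The sketch below (standard-library Python) truncates SHA-256 to 16 bits, so a collision is expected after roughly 2^8 = 256 attempts:

```
import hashlib

seen = {}
attempt = 0
while True:
    attempt += 1
    message = str(attempt).encode()
    # Keep only the first 2 bytes (16 bits) of the digest
    tiny_digest = hashlib.sha256(message).digest()[:2]
    if tiny_digest in seen:
        print(f"collision after {attempt} hashes:",
              seen[tiny_digest], "and", message)
        break
    seen[tiny_digest] = message
```

Runs typically finish after a few hundred hashes, in line with the 2^(N/2) estimate, which is why truncating digests too aggressively is dangerous.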
Some constraints sometimes push developers to reduce the size of a digest by *truncating* it (removing some of its bytes). In theory, this is possible, but it can greatly reduce security. In order to achieve 128 bits of security at a minimum, a digest must not be truncated below

* 256 bits for collision resistance
* 128 bits for preimage and second preimage resistance

This means that depending on which property one relies on, the output of a hash function can be truncated to obtain a shorter digest.

## 2.4 Hash functions in practice

As we said earlier, hash functions are rarely used alone in practice. They are most often combined with other elements to create either a cryptographic primitive or a cryptographic protocol. We will see many examples of using hash functions to build more complex objects in this book, but this section describes a few different ways hash functions are used in the real world.

### 2.4.1 Commitments

Imagine that you know that a stock in the market will increase in value and reach $50 in the coming month, but you really can't tell your friends about it (for some legal reason, perhaps). You still want to be able to tell your friends, after the fact, that you knew about it, because you're smug (don't deny it). What you can do is commit to a sentence like, "Stock *X* will reach $50 next month." To do this, hash the sentence and give your friends the output. A month later, reveal the sentence. Your friends will be able to hash the sentence and observe that it indeed produces the same output.

This is what we call a *commitment scheme*. Commitments in cryptography generally try to achieve two properties:

* *Hiding*: A commitment must hide the underlying value.
* *Binding*: A commitment must hide a single value. In other words, if you commit to a value *x*, you shouldn't be able to later successfully reveal a different value *y*.

Exercise

Can you tell whether a hash function, used as a commitment scheme, provides hiding and binding?
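Here is a minimal Python sketch of the commit/reveal flow described above. Note the added random nonce: as the exercise hints, hashing a low-entropy sentence alone does not hide it well, so this sketch strengthens hiding under that assumption (the nonce is an addition for illustration, not part of the story above):

```
import hashlib
import secrets

def commit(message: bytes) -> tuple[bytes, bytes]:
    """Return (digest, nonce). Publish the digest; keep message and nonce."""
    nonce = secrets.token_bytes(16)  # random value to hide predictable messages
    digest = hashlib.sha256(nonce + message).digest()
    return digest, nonce

def reveal(digest: bytes, message: bytes, nonce: bytes) -> bool:
    """Anyone can recompute the hash and check the commitment."""
    return hashlib.sha256(nonce + message).digest() == digest

digest, nonce = commit(b"stock X will reach $50 next month")
# ... a month later ...
assert reveal(digest, b"stock X will reach $50 next month", nonce)
```

Binding comes from second preimage resistance: finding a different (message, nonce) pair with the same digest should be impractical.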
### 2.4.2 Subresource integrity

It happens (often) that web pages import external JavaScript files. For example, many websites use content delivery networks (CDNs) to import JavaScript libraries or web framework-related files in their pages. These CDNs are placed in strategic locations in order to quickly deliver these files to visitors. Yet, if the CDN goes rogue and decides to serve malicious JavaScript files, that could be a real problem. To counter this, web pages can use a feature called *subresource integrity* that allows the inclusion of a digest in the import tag:

```
<script src="https://code.jquery.com/jquery-2.1.4.min.js"
integrity="sha256-8WqyJLuWKRBVhxXIL1jBDD7SDxU936oZkCnxQbWwJVw="></script>
```

This is exactly the same scenario we talked about in the introduction of this chapter. Once the JavaScript file is retrieved, the browser hashes it (with SHA-256) and verifies that it corresponds to the digest hardcoded in the page. If it checks out, the JavaScript file gets executed, as its integrity has been verified.
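A site operator has to compute that `integrity` value ahead of time. A possible way to do it in Python, assuming the file has already been fetched locally (the function name and filename below are illustrative), is the following; note that SRI digests are base64-encoded, not hexadecimal:

```
import base64
import hashlib

def sri_sha256(path: str) -> str:
    """Compute a subresource integrity string for a local file."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return "sha256-" + base64.b64encode(digest).decode()

print(sri_sha256("jquery-2.1.4.min.js"))  # e.g., sha256-8WqyJLuWKRBV...
```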
### 2.4.3 BitTorrent

Users (called *peers*) around the world use the BitTorrent protocol to share files directly among each other (what we also call *peer-to-peer*). To distribute a file, it is cut into chunks, and each chunk is individually hashed. These hashes are then shared as a source of trust to represent the file to download.

BitTorrent has several mechanisms that allow a peer to obtain the different chunks of a file from different peers. In the end, the integrity of the entire file is verified by hashing each downloaded chunk and matching the output against its respective known digest (before reassembling the file from its chunks). For example, the following "magnet link" represents the Ubuntu operating system v19.04. It is a digest (represented in hexadecimal) obtained by hashing metadata about the file as well as all the chunks' digests.

```
magnet:?xt=urn:btih:b7b0fbab74a85d4ac170662c645982a862826455
```
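The chunk-then-hash idea is easy to sketch. The snippet below splits a file into fixed-size pieces and hashes each one, which is conceptually what a torrent's metadata stores (real BitTorrent uses SHA-1 piece hashes inside a specific metadata encoding; this simplified version uses SHA-256):

```
import hashlib

def piece_hashes(path: str, piece_size: int = 256 * 1024) -> list[str]:
    """Hash a file piece by piece, as a torrent's metadata does conceptually."""
    hashes = []
    with open(path, "rb") as f:
        while piece := f.read(piece_size):
            hashes.append(hashlib.sha256(piece).hexdigest())
    return hashes

# A downloader can then verify each received piece independently
# by comparing its digest against the trusted list.
```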
### 2.4.4 Tor

The Tor browser's goal is to give individuals the ability to browse the internet anonymously. Another feature is that one can create hidden web pages whose physical locations are difficult to track. Connections to these pages are secured via a protocol that uses the web page's public key. (We will see how that works in chapter 9 when we talk about session encryption.) For example, Silk Road, which used to be the eBay of drugs until it was seized by the FBI, was accessible via `silkroad6ownowfk.onion` in the Tor browser. This base32 string actually represents the hash of Silk Road's public key. Thus, by knowing the onion address, you can verify the public key of the hidden web page you're visiting and make sure you're talking to the right page (and not an impersonator). If this is not clear, don't worry; I will come back to this in chapter 9.

Exercise

By the way, that string can't possibly represent 256 bits (32 bytes), right? How is this secure then, according to what you learned in section 2.3? Also, can you guess how Dread Pirate Roberts (the pseudonym of Silk Road's webmaster) managed to obtain a hash containing the name of the website?

In all of the examples in this section, a hash function provided *content integrity* or *authenticity* in situations where

* Someone might tamper with the content being hashed.
* The hash is delivered to you securely.

We sometimes also say that we *authenticate* something or someone. It is important to understand that if the hash is not obtained securely, then anyone can replace it with the hash of something else! Thus, it does not provide integrity by itself. The next chapter on message authentication codes will fix this by introducing *secrets*. Let's now look at what actual hash function algorithms you can use.

## 2.5 Standardized hash functions

We mentioned SHA-256 in our earlier examples, which is only one of the hash functions we can use. Before we go ahead and list the recommended hash functions of our time, let's first mention other algorithms that people use in real-world applications that are not considered cryptographic hash functions.

First, functions like CRC32 are *not* cryptographic hash functions but error-detecting code functions. While they helpfully detect some simple errors, they provide none of the previously mentioned security properties and are not to be confused with the hash functions we are discussing (even though they might sometimes share the name). Their outputs are usually referred to as *checksums*.

Second, popular hash functions like MD5 and SHA-1 are considered broken nowadays. While both were widely accepted, standardized hash functions in the 1990s, MD5 and SHA-1 were shown to be broken in 2004 and 2016, respectively, when collisions were published by different research teams. These attacks succeeded partly due to advances in computing power, but mostly because flaws were found in the way the hash functions were designed.

Deprecation is hard

Both MD5 and SHA-1 were considered good hash functions until researchers demonstrated their lack of collision resistance. Still today, their preimage and second preimage resistance have not been affected by any attack. This doesn't matter for us, as we only want to talk about secure algorithms in this book. Nonetheless, you will still see people using MD5 and SHA-1 in systems that rely only on the preimage resistance of these algorithms and not on their collision resistance. They often claim that they cannot upgrade to more secure hash functions for legacy and backward-compatibility reasons. As this book aims to stand the test of time and be a beam of light for the future of real-world cryptography, this is the last time I mention these hash functions.

The next two sections introduce SHA-2 and SHA-3, the two most widely used hash functions. Figure 2.6 introduces these functions.


Figure 2.6 SHA-2 and SHA-3, the two most widely adopted hash functions. SHA-2 is based on the Merkle–Damgård construction, while SHA-3 is based on the sponge construction.

### 2.5.1 The SHA-2 hash function

Now that we have seen what hash functions are and have some idea of their potential use cases, it remains to be seen which hash functions we can use in practice. In the next two sections, I introduce two widely accepted hash functions, and I also give high-level explanations of how they work internally. The high-level explanations are not meant to provide deeper insight into how to use hash functions, as the black-box descriptions I gave should be enough. But it is still interesting to see how these cryptographic primitives were designed by cryptographers in the first place.

The most widely adopted hash function is the *Secure Hash Algorithm 2* (SHA-2). SHA-2 was invented by the NSA and standardized by NIST in 2001. It was meant to add itself alongside the aging Secure Hash Algorithm 1 (SHA-1), already standardized by NIST. SHA-2 provides 4 different versions, producing outputs of 224, 256, 384, or 512 bits. Their respective names omit the version of the algorithm: SHA-224, SHA-256, SHA-384, and SHA-512. In addition, two other versions, SHA-512/224 and SHA-512/256, provide 224-bit and 256-bit outputs, respectively, by truncating the result of SHA-512.

In the following terminal session, we call each variant of SHA-2 with the OpenSSL CLI. Notice that calling the different variants with the same input produces completely different outputs of the specified lengths.

```
$ echo -n "hello world" | openssl dgst -sha224
2f05477fc24bb4faefd86517156dafdecec45b8ad3cf2522a563582b
$ echo -n "hello world" | openssl dgst -sha256
b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
$ echo -n "hello world" | openssl dgst -sha384
fdbd8e75a67f29f701a4e040385e2e23986303ea10239211af907fcbb83578b3
➥ e417cb71ce646efd0819dd8c088de1bd
$ echo -n "hello world" | openssl dgst -sha512
309ecc489c12d6eb4cc40f50c902f2b4d0ed77ee511a7c7a9bcd3ca86d4cd86f
➥ 989dd35bc5ff499670da34255b45b0cfd830e81f605dcf7dc5542e93ae9cd76f
```

Nowadays, people mostly use SHA-256, which provides the minimum 128 bits of security needed for our three security properties, while more paranoid applications use SHA-512. Now, let's take a look at a simplified explanation of how SHA-2 works.

The XOR operation

To understand what comes next, you need to understand the exclusive OR (XOR) operation. XOR is a bitwise operation, meaning that it operates on bits. The following diagram shows how it works. XOR is ubiquitous in cryptography, so make sure you remember it.


XOR, the exclusive OR operation (often denoted ⊕), operates on 2 bits. It is similar to the OR operation, except for the case where both operands are 1.

It all starts with a special function called a *compression function*. A compression function takes two inputs of some size and produces one output of the size of one of the inputs. Put simply, it takes some data and returns less data. Figure 2.7 illustrates this.


Figure 2.7 A compression function takes two different inputs of size *X* and *Y* (here, both 16 bytes) and returns an output of size *X* or *Y*.

While there are different ways of building a compression function, SHA-2 uses the *Davies–Meyer* method (see figure 2.8), which relies on a *block cipher* (a cipher that can encrypt a fixed-size block of data). I mentioned the AES block cipher in chapter 1, but you haven't learned about it yet. For now, accept the compression function as a black box until you read chapter 4 on authenticated encryption.


Figure 2.8 An illustration of a compression function built via the Davies–Meyer construction. The compression function's first input (the *input block*) is used as the key to a block cipher. The second input (the *intermediate value*) is used as the input to be encrypted by the block cipher. It is then used again by XORing itself with the output of the block cipher.

SHA-2 is a *Merkle–Damgård* construction, an algorithm (invented independently by Ralph Merkle and Ivan Damgård) that hashes a message by iteratively calling such a compression function. Specifically, it works in the following two steps.

First, it applies *padding* to the input we want to hash, then cuts the input into blocks that fit into the compression function. Padding means appending specific bytes to the input in order to make its length a multiple of some block size. Cutting the padded input into chunks of the same block size allows us to fit these into the first argument of the compression function. For example, SHA-256 has a block size of 512 bits. Figure 2.9 illustrates this step.


Figure 2.9 The first step of the Merkle–Damgård construction is to add some padding to the input message. After this step, the input's length should be a multiple of the input size of the compression function in use (for example, 8 bytes). To do this, we add 5 bytes of padding at the end to make the input 32 bytes, and then we cut the message into 4 blocks of 8 bytes.

Second, it iteratively applies the compression function to the message blocks, using the previous output of the compression function as the second argument to the compression function. The final output is the *digest*. Figure 2.10 illustrates this step.


Figure 2.10 The Merkle–Damgård construction iteratively applies a compression function to each block of the input to be hashed, along with the output of the previous compression function call. The last call to the compression function directly returns the digest.

And this is how SHA-2 works: by iteratively calling its compression function on fragments of the input until everything is processed into a final digest.

NOTE The Merkle–Damgård construction is proven to be collision resistant if the compression function itself is collision resistant. The security of the *arbitrary-length-input* hash function is thus reduced to the security of a *fixed-size* compression function, which is easier to design and analyze. Herein lies the ingenuity of the Merkle–Damgård construction.

At the very beginning, the second argument of the compression function is usually fixed and standardized to be a "nothing-up-my-sleeve" value. Specifically, SHA-256 uses the square roots of the first prime numbers to derive this value. A nothing-up-my-sleeve value is there to convince the cryptographic community that it was not chosen to weaken the hash function (for example, in order to create a backdoor). This is a popular concept in cryptography.

WARNING While SHA-2 is a perfectly fine hash function, it is not suitable for hashing secrets. This is because of a downside of the Merkle–Damgård construction that makes SHA-2 vulnerable to an attack (called a *length-extension attack*) when it is used to hash secrets. We will talk about this in more detail in the next chapter.
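Before moving on, here is a toy Merkle–Damgård sketch in Python to make the two steps concrete. The "compression function" is a stand-in built from SHA-256 purely for illustration (a real design like SHA-2 uses a Davies–Meyer compression function based on a dedicated block cipher, and a padding that also encodes the message length):

```
import hashlib

BLOCK_SIZE = 8  # toy block size in bytes

def toy_compress(block: bytes, state: bytes) -> bytes:
    """Stand-in compression: maps (8-byte block, 8-byte state) to an 8-byte state."""
    return hashlib.sha256(block + state).digest()[:8]

def toy_md_hash(message: bytes) -> bytes:
    # Step 1: pad with a 0x80 byte then zeros, so the length is a multiple of BLOCK_SIZE
    message += b"\x80"
    message += b"\x00" * (-len(message) % BLOCK_SIZE)
    blocks = [message[i:i + BLOCK_SIZE] for i in range(0, len(message), BLOCK_SIZE)]
    # Step 2: iterate the compression function, starting from a fixed initial state
    state = b"\x00" * 8  # real designs use a "nothing-up-my-sleeve" value here
    for block in blocks:
        state = toy_compress(block, state)
    return state  # the digest

print(toy_md_hash(b"hello world").hex())
```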
### 2.5.2 The SHA-3 hash function

As I mentioned earlier, both the MD5 and SHA-1 hash functions were broken somewhat recently. Both of these functions used the same Merkle–Damgård construction I described in the previous section. Because of this, and because SHA-2 is vulnerable to length-extension attacks, NIST decided in 2007 to organize an open competition for a new standard: *SHA-3*. This section introduces the newer standard and attempts to give a high-level explanation of its inner workings.

In 2007, 64 candidates from different international research teams entered the SHA-3 competition. Five years later, Keccak, one of the submissions, was nominated as the winner and took the name SHA-3. In 2015, SHA-3 was standardized in FIPS publication 202 (https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf).

SHA-3 observes the three security properties we discussed previously and provides as much security as the SHA-2 variants. In addition, it is not vulnerable to length-extension attacks and can be used to hash secrets. For this reason, it is now the recommended hash function. It provides the same variants as SHA-2, this time indicating the full name SHA-3 in its named variants: SHA-3-224, SHA-3-256, SHA-3-384, and SHA-3-512. Thus, similarly to SHA-2, SHA-3-256 provides 256 bits of output, for example. Let me now spend a few pages explaining how SHA-3 works.

SHA-3 is a cryptographic algorithm built on top of a *permutation*. The easiest way to understand a permutation is to imagine the following: you have a set of elements on the left and the same set of elements on the right. Now trace an arrow from each element on the left to an element on the right. Each element can have only one arrow starting from it and one arrow ending on it. You now have a permutation. Figure 2.11 illustrates this principle. By definition, any permutation is also *reversible*, meaning that from the output, we can find the input.


Figure 2.11 An example of a permutation acting on four different shapes. You can use the permutation, described by the arrows in the middle picture, to transform a given shape.

SHA-3 is built with the *sponge construction*, a construction different from Merkle–Damgård that was invented as part of the SHA-3 competition. It is based on a particular permutation called *keccak-f* that takes an input and returns an output of the same size.

NOTE We won't explain how keccak-f is designed, but you'll get an idea of it in chapter 4, because it broadly resembles the AES algorithm (except that it has no key). This is no accident, as one of the inventors of AES was also one of the inventors of SHA-3.

In the next few pages, I use an 8-bit permutation to illustrate how the sponge construction works. Because a permutation is fixed and unchanging, you can imagine that figure 2.12 fully illustrates the mapping this permutation creates over all possible 8-bit inputs. Compared to our previous explanation of a permutation, you can also imagine that each possible 8-bit string is one of the different shapes we represented (`000...` is a triangle, `100...` is a square, and so on).


Figure 2.12 The sponge construction makes use of a particular permutation *f*. By operating on an input, our example permutation creates a mapping between all possible 8-bit inputs and all possible 8-bit outputs.

To use the permutation in our sponge construction, we also need to arbitrarily divide its input and output into a *rate* and a *capacity*. This is a bit odd, but stay with me. Figure 2.13 illustrates this process.


Figure 2.13 The permutation *f* randomizes an input of 8 bits into an output of the same size. In a sponge construction, the input and output of this permutation are divided into two parts: the rate (of size *r*) and the capacity (of size *c*).

Where we set the limit between the rate and the capacity is arbitrary. Different versions of SHA-3 use different parameters. We informally point out that the capacity is to be treated like a secret, and the larger it is, the more secure the sponge construction.

Now, like all good hash functions, we need to be able to hash something, right? Otherwise, it would be a bit useless. To do this, we simply XOR (⊕) the input with the rate of the permutation's input. At the beginning, this is just a bunch of 0s. As we pointed out earlier, the capacity is treated like a secret, so we won't XOR anything with it. Figure 2.14 illustrates this.


Figure 2.14 To absorb the 5 bits of input `00101`, a sponge construction with a rate of 5 bits can simply XOR the 5 bits with the rate (initialized to 0s). The permutation then randomizes the state.

The obtained output should now look random (although we can trivially find what the input was, since a permutation is reversible by definition). What if we want to ingest a larger input? Well, similarly to what we did with SHA-2, we would

1. Pad the input if necessary, then divide the input into blocks of the rate size.
2. Iteratively call the permutation, XORing each block with the permutation's input and permuting the *state* (the intermediate value output by the last operation) after each block has been XORed.

To simplify, I ignore padding in the rest of this explanation, but padding is an important step for distinguishing between inputs like `0` and `00`. Figure 2.15 depicts these two steps.


Figure 2.15 To absorb inputs larger than the rate size, a sponge construction iteratively XORs input blocks with the rate and permutes the result.

So far so good, but we still haven't produced a digest. To do this, we can simply use the rate of the last state of the sponge (again, we are not touching the capacity). To obtain a longer digest, we can continue permuting and reading from the rate part of the state, as illustrated in figure 2.16.


Figure 2.16 To obtain a digest from the sponge construction, we iteratively permute the state and retrieve as much of the rate (the upper part of the state) as needed.

And this is how SHA-3 works. Because it is a *sponge construction*, ingesting the input is naturally called *absorbing*, and producing the digest is called *squeezing*. The sponge is specified with a 1600-bit permutation, using different values for *r* and *c* depending on the security advertised by the different versions of SHA-3.

SHA-3 is a random oracle

I talked about random oracles previously: an ideal and fictional construction that returns perfectly random responses to queries and repeats itself if queried with the same input twice. It turns out that the sponge construction behaves closely to a random oracle, as long as the permutation used by the construction looks random enough. How do we prove such a security property of a permutation? Our best approach is to attempt to break it, many times, until we gain strong confidence in its design (which is what happened during the SHA-3 competition). The fact that SHA-3 can be modeled as a random oracle instantly gives it the security properties we would expect from a hash function.
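To see the absorb and squeeze phases in code, here is a toy sponge in Python. The scrambling function below is another SHA-256-based stand-in that only mirrors the structure; unlike the real keccak-f, it is not actually a permutation (SHA-256 is not bijective), so treat this purely as an illustration of the data flow:

```
import hashlib

RATE, CAPACITY = 8, 24  # toy sizes in bytes (state = 32 bytes)

def toy_scramble(state: bytes) -> bytes:
    """Stand-in for keccak-f: scrambles the 32-byte state (not truly bijective)."""
    return hashlib.sha256(state).digest()

def toy_sponge(message: bytes, output_length: int) -> bytes:
    state = bytes(RATE + CAPACITY)
    # Absorb: XOR each rate-sized block into the rate, then scramble
    message += b"\x01" + b"\x00" * (-(len(message) + 1) % RATE)  # simplistic padding
    for i in range(0, len(message), RATE):
        block = message[i:i + RATE]
        mixed = bytes(a ^ b for a, b in zip(state[:RATE], block))
        state = toy_scramble(mixed + state[RATE:])
    # Squeeze: read the rate, scrambling again if more output is needed
    output = b""
    while len(output) < output_length:
        output += state[:RATE]
        state = toy_scramble(state)
    return output[:output_length]

print(toy_sponge(b"hello world", 32).hex())
```

Notice that the capacity bytes are never read or written directly from the outside, which is what makes the construction resistant to length-extension attacks.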
### 2.5.3 SHAKE and cSHAKE: Two extendable output functions (XOFs)

I introduced the two major hash function standards: SHA-2 and SHA-3. These are well-defined hash functions that take arbitrary-length inputs and produce random-looking, fixed-length outputs. As you will see in later chapters, cryptographic protocols often need this type of primitive but do not want to be constrained by the fixed size of a hash function's digest. For this reason, the SHA-3 standard introduced a more versatile primitive called an *extendable output function*, or *XOF* (pronounced "zoff"). This section introduces the two standardized XOFs: SHAKE and cSHAKE.

*SHAKE*, specified in FIPS 202 along with SHA-3, can be seen as a hash function that returns an output of arbitrary length. SHAKE is fundamentally the same construction as SHA-3, except that it is faster and permutes as many times as needed during the squeezing phase. Producing outputs of different sizes is quite useful, not only for creating a digest, but also for creating random numbers, deriving keys, and so on. I will talk about the different applications of SHAKE again in this book; for now, picture SHAKE as SHA-3, except that it provides an output of any length you might want.

This construction is so useful in cryptography that one year after SHA-3 was standardized, NIST published its Special Publication 800-185 containing a *customizable SHAKE* called *cSHAKE*. cSHAKE is pretty much exactly like SHAKE, except that it also takes a customization string. This customization string can be empty, or it can be any string you want. Let's first look at an example of using cSHAKE in pseudocode:

```
cSHAKE(input="hello world", output_length=256, custom_string="my_hash")
-> 72444fde79690f0cac19e866d7e6505c
cSHAKE(input="hello world", output_length=256, custom_string="your_hash")
-> 688a49e8c2a1e1ab4e78f887c1c73957
```

As you can see, although cSHAKE is deterministic like SHAKE and SHA-3, the two digests are different. This is because a different customization string was used. A *customization string* allows you to customize your XOF! This is useful in protocols where, for example, different hash functions must be used in order for a proof to work. We call this *domain separation*.

As a golden rule of cryptography: if the same cryptographic primitive is used in different use cases, do not use it with the same key (if it takes a key) and/or apply domain separation. You will see more examples of domain separation when we look at cryptographic protocols in later chapters.

WARNING NIST tends to specify its algorithms with parameters in bits instead of bytes. Imagine if you requested an output of 16 bytes but got 2 bytes instead because the program thought you were requesting 16 bits of output. This issue is sometimes referred to as a *bit attack*.

As with everything in cryptography, the length of cryptographic strings like keys, parameters, and outputs is strongly tied to the security of the system. It is important not to ask SHAKE or cSHAKE for outputs that are too short. *You can never go wrong with 256-bit outputs*, as they provide 128 bits of security against collision attacks. But real-world cryptography sometimes operates in constrained environments that use shorter cryptographic values, which can be done if the security of the system is carefully analyzed. For example, if collision resistance does not matter in the protocol that makes use of the value, preimage resistance only requires an output of 128 bits from SHAKE or cSHAKE.
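Python's standard library ships SHAKE (though not cSHAKE, for which third-party packages are needed), which makes the variable-length output easy to demonstrate. Note that hashlib takes the output length in *bytes*, a nice illustration of the bits-versus-bytes pitfall mentioned above:

```
import hashlib

xof = hashlib.shake_256(b"hello world")
print(xof.hexdigest(16))   # 16 bytes = 128 bits of output
print(xof.hexdigest(32))   # 32 bytes = 256 bits
print(xof.hexdigest(64))   # 64 bytes = 512 bits

# SHAKE outputs form a stream: shorter outputs are prefixes of longer ones
assert xof.hexdigest(32).startswith(xof.hexdigest(16))
```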
### 2.5.4 Avoid ambiguous hashing with TupleHash

In this chapter, I have talked about different types of cryptographic primitives and cryptographic algorithms, including

* The SHA-2 hash function, which is vulnerable to length-extension attacks but is still widely used when no secrets are hashed
* The SHA-3 hash function, which is the recommended hash function nowadays
* The SHAKE and cSHAKE XOFs, which are more versatile tools than hash functions, as they offer variable-length outputs

Let me mention one more useful function, *TupleHash*, which is based on cSHAKE and specified in the same standard as cSHAKE. TupleHash is an interesting function that allows one to hash a *tuple* (a list). To explain what TupleHash is and why it is useful, let me tell you a story.

A few years ago, I was tasked with reviewing a cryptocurrency as part of my work. It included the basics of a cryptocurrency: accounts, payments, and so on. Transactions between users contained metadata about who sends how much to whom. They also included a small fee to compensate the network for processing the transaction.

Alice, for example, can send a transaction to the network, but for the network to accept it, she needs to provide proof that the transaction came from her. For this, she can hash the transaction and sign the hash (I gave a similar example in chapter 1). Anyone can hash the transaction and verify the signature on the hash to see that this is the transaction Alice meant to send. Figure 2.17 shows that a man-in-the-middle (MITM) attacker who intercepts the transaction before it reaches the network would not be able to tamper with it. This is because the hash would change, and the signature would not verify the new transaction digest.


Figure 2.17 Alice sends a transaction along with a signature over the transaction's hash. If a MITM attacker attempts to tamper with the transaction, the hash will be different, and thus the attached signature will be incorrect.

As you will see in chapter 7, such an attacker certainly cannot forge Alice's signature on a new digest. And due to the second preimage resistance of the hash function used, the attacker cannot find a totally different transaction that hashes to the same digest.

Is our MITM attacker harmless then? We're not out of the woods yet. Unfortunately, in the cryptocurrency I was auditing, a transaction was hashed by simply concatenating each field:

```
$ echo -n "Alice""Bob""100""15" | openssl dgst -sha3-256
34d6b397c7f2e8a303fc8e39d283771c0397dad74cef08376e27483efc29bb02
```

This, which looks perfectly fine, actually completely broke the cryptocurrency's payment system. It trivially allows an attacker to break the second preimage resistance of the hash function. Take a moment to think about how you could find a different transaction that hashes to the same digest, `34d6...`.

What happens if we move one digit from the *fee* field into the *amount* field? You can see that the following transaction hashes to the same digest that Alice signed:

```
$ echo -n "Alice""Bob""1001""5" | openssl dgst -sha3-256
34d6b397c7f2e8a303fc8e39d283771c0397dad74cef08376e27483efc29bb02
```

Thus, a MITM attacker who wants Bob to receive a bit more money would be able to modify the transaction without invalidating the signature. As you may have guessed, this is the problem that TupleHash solves. It allows you to unambiguously hash a list of fields by using unambiguous encodings. What happens in reality is close to the following (with `||` denoting string concatenation):

```
cSHAKE(input="5"||"Alice"||"3"||"Bob"||"3"||"100"||"2"||"10",
➥ output_length=256, custom_string="TupleHash"+"anything you want")
```

This time, the input is constructed by prefixing each field of the transaction with its length. Take a minute to understand why this solves our problem. In general, you can use any hash function safely by always making sure to *serialize* the input before hashing it. Serializing the input means that there always exists a way to *deserialize* it (meaning, to recover the original input). If the data can be deserialized, there can be no ambiguity about field delimitation.
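A minimal Python sketch of this length-prefixed serialization idea might look as follows (a simplified illustration, not the actual TupleHash encoding, which NIST SP 800-185 defines with specific bit-level encodings):

```
import hashlib

def hash_fields(*fields: bytes) -> str:
    """Hash a tuple of fields unambiguously by length-prefixing each one."""
    serialized = b"".join(
        len(field).to_bytes(4, "big") + field  # 4-byte length prefix per field
        for field in fields
    )
    return hashlib.sha3_256(serialized).hexdigest()

# The two transactions that collided under naive concatenation
# now hash to different digests:
print(hash_fields(b"Alice", b"Bob", b"100", b"15"))
print(hash_fields(b"Alice", b"Bob", b"1001", b"5"))
```

Because every field carries its own length, no shifting of bytes between adjacent fields can produce the same serialized string.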
## 2.6 Hashing passwords

You have seen several useful functions in this chapter that either are hash functions or extend hash functions, but before you move on to the next chapter, I need to mention *password hashing*.

Imagine the following scenario: you have a website (which makes you a webmaster), and you want your users to be able to register and log in, so you create two web pages for those two features. Suddenly, you wonder, how will you store their passwords? Will you store them in cleartext in a database? At first, there seems to be nothing wrong with that, you think. It is not perfect, though. People tend to reuse the same password everywhere, and if (or when) you get breached and an attacker manages to dump all of your users' passwords, it will be bad for your users and bad for your platform's reputation. You think a bit more and realize that an attacker who steals this database would then be able to log in as any user. Storing passwords in cleartext is now less than ideal, and you would like a better way to deal with this.

One solution is to hash your passwords and store only the digests. When someone logs in to your website, the flow would be similar to the following:

1. You receive the user's password.
2. You hash the password they give you and get rid of the password.
3. You compare the digest with what you had stored previously; if they match, the user is logged in.

This flow allows you to handle users' passwords for only a limited amount of time. Still, an attacker who breaks into your server could quietly log passwords from this flow until you detect their presence. We acknowledge that this is still not a perfect situation, but we did improve the website's security. In security, we also call this *defense in depth*: the act of layering imperfect defenses in the hope that an attacker will not defeat them all. This is also what real-world cryptography is about. But other problems exist with this solution:

* *If an attacker retrieves the hashed passwords, a brute-force attack or an exhaustive search (trying all possible passwords) can be carried out.* Each attempt would be tested against the whole database. Ideally, we would want an attacker to be able to attack only one hashed password at a time.
* *Hash functions are supposed to be as fast as possible.* An attacker can leverage this to brute force at high speed (many, many passwords per second). Ideally, we would have a mechanism to slow down such attacks.

The first problem is usually solved with *salts*, which are random values that are public and different for each user. We use the salt along with the user's password when hashing it, which, in some sense, is like using a per-user customization string with cSHAKE: it effectively creates a different hash function for every user. Because each user uses a different hash function, an attacker cannot precompute large tables of passwords (called *rainbow tables*) in hopes of testing them against the whole database of stolen password hashes.

The second problem is solved with *password hashes*, which are designed to be slow. The current state-of-the-art choice is *Argon2*, the winner of the Password Hashing Competition (https://password-hashing.net) that ran from 2013 to 2015. At the time of this writing (2021), Argon2 is on track to be standardized as an RFC (https://datatracker.ietf.org/doc/draft-irtf-cfrg-argon2/). In practice, other non-standard algorithms like PBKDF2, bcrypt, and scrypt are also used. The problem is that these can be used with insecure parameters and are thus not straightforward to configure in practice.

In addition, only Argon2 and scrypt can resist heavy optimization by attackers, as the other schemes are not memory hard. The term *memory hard* means that the algorithm can be optimized only through the optimization of memory access. In other words, optimizing the rest of the algorithm doesn't gain you much. Since optimizing memory access is limited even with dedicated hardware (there is only so much cache you can put around a CPU), memory-hard functions are slow to run on any type of device. This is a desirable property when you want to prevent attackers from obtaining a non-negligible speed advantage in computing a function.
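The salt-plus-slow-hash flow can be sketched with scrypt, one of the memory-hard options mentioned above, which ships in Python's standard library (availability depends on the underlying OpenSSL; the cost parameters below are illustrative, not a vetted recommendation, and Argon2 itself is available through third-party bindings):

```
import hashlib
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) to store. The salt is public but per-user."""
    salt = secrets.token_bytes(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)  # cost parameters: higher = slower
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return secrets.compare_digest(candidate, digest)  # constant-time comparison
```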
Figure 2.18 reviews the different types of hash functions you saw in this chapter.


Figure 2.18 In this chapter, you saw four types of hash functions: (1) the normal kind, which provides a unique random identifier for arbitrary-length inputs; (2) extendable output functions, which are similar but provide arbitrary-length outputs; (3) tuple hash functions, which unambiguously hash lists of values; and (4) password hashing functions, which cannot easily be optimized, in order to store passwords securely.

## Summary

* Hash functions provide collision resistance, preimage resistance, and second preimage resistance.
* Preimage resistance means that one shouldn't be able to find the input that produced a digest.
* Second preimage resistance means that from an input and its digest, one shouldn't be able to find a different input that hashes to the same digest.
* Collision resistance means that one shouldn't be able to find two random inputs that hash to the same output.
* The most widely adopted hash function is SHA-2, while the recommended hash function is SHA-3 due to SHA-2's lack of resistance to length-extension attacks.
* SHAKE is an extendable output function (XOF) that acts like a hash function but provides an arbitrary-length digest.
* cSHAKE (customizable SHAKE) allows one to easily create instances of SHAKE that behave like different XOFs. This is called domain separation.
* Objects should be serialized before being hashed in order to avoid breaking the second preimage resistance of a hash function. Algorithms like TupleHash automatically take care of this.
* Hashing passwords makes use of slower hash functions designed specifically for that purpose. Argon2 is the state-of-the-art choice.
|
PypiClean
|
/remeha_tz-0.12.tar.gz/remeha_tz-0.12/remeha_core.py
|
import logging
from datetime import datetime
import struct
import sys
from datamap import datamap
log = logging.getLogger(__name__)
def eprint(*arguments, **kwargs):
print(*arguments, file=sys.stderr, **kwargs)
def parse_data(input_data, map_resolve=True):
for n, x in enumerate(input_data):
value = datamap[n][1](x)
# print("Value: " + str(value))
if isinstance(value, list):
for sub_index, sub_value in enumerate(value):
# if i[0]:
yield [datamap[n][2][sub_index], sub_value]
elif map_resolve and isinstance(datamap[n][4], dict) and value in datamap[n][4]:
yield [datamap[n][2], datamap[n][4][value]]
else:
yield [datamap[n][2], value]
class Frame:
def __init__(self, io_source=None, frame_data=None):
self.isValid = False
self.frame = None
if io_source:
self.frame = io_source.read(7) # read header first
elif frame_data:
self.frame = frame_data
else:
print("Error need to provide a reading source or the data directly")
return
self.timestamp = datetime.now()
if len(self.frame) < 7 or not (self.frame[0] == 2 or self.frame[0] == 7):
print("Could not read the whole frame header")
return
size = 2 + self.frame[4] # start and end magic are not factored in
if io_source:
self.frame += io_source.read(size - 7)
if len(self.frame) < size:
eprint("Could not read a whole frame of size %d" % size)
return
if (self.frame[0] == 2 and self.frame[-1] != 3) or (self.frame[0] == 7 and self.frame[-1] != 0xd2):
eprint("Frame start/stop magic incorrect")
return
if self.get_checksum() != struct.unpack("<H", self.frame[-3:-1])[0]:
eprint("Checksum incorrect. Should be {:x}, but is {:x}"
.format(self.get_checksum(), struct.unpack("<H", self.frame[-3:-1])[0]))
return
self.isValid = True
self.unpack_with_format = "<" + ''.join(entry[0] for entry in datamap)
def __str__(self):
return ''.join('{:02x} '.format(x) for x in self.frame)
def get_checksum(self):
crc = 0xFFFF
for data_byte in self.frame[1:-3]:
crc ^= data_byte
for counter in range(0, 8):
if (crc & 0x0001) != 0:
crc >>= 1
crc ^= 0xA001
else:
crc >>= 1
return crc
def get_framedata(self):
return self.frame
    def get_parseddata(self):
return list(struct.unpack(self.unpack_with_format, self.frame[7:-3]))
def get_data(self):
return self.frame[7:-3]
def get_type(self):
return struct.unpack("<H", self.frame[5:7])[0]
def get_readable_type(self):
return self.frame[8:-3]
def get_source_address(self):
return self.frame[1]
class FrameDecoder:
def __init__(self):
self.unpack_with_format = "<" + ''.join(entry[0] for entry in datamap)
self.data_hash = {}
for value_index, element in enumerate(datamap):
name_i = element[2]
if isinstance(name_i, list):
for sub_value_index, sub_element in enumerate(name_i):
self.data_hash[sub_element] = [[value_index, sub_value_index]] + element
else:
self.data_hash[name_i] = [value_index] + element
def decode(self, unpacked_raw_values, value_name):
decoder_data = self.data_hash[value_name]
if isinstance(decoder_data[5], dict):
index_in_unpacked_raw_values = decoder_data[0]
converted_value = decoder_data[2](unpacked_raw_values[index_in_unpacked_raw_values])
return decoder_data[5][converted_value]
else:
return decoder_data[2](unpacked_raw_values[decoder_data[0]])
def decode_all(self, packed_raw_values):
return list(parse_data(struct.unpack(self.unpack_with_format, packed_raw_values)))
def get_unpacked_data(self, packed_raw_values):
return list(struct.unpack(self.unpack_with_format, packed_raw_values))
|
PypiClean
|
/pixel_classifier_torch-0.2-py3-none-any.whl/pixel_classifier_torch-0.2.dist-info/LICENSE.md
|
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
PypiClean
|
/great_expectations_cta-0.15.43.tar.gz/great_expectations_cta-0.15.43/great_expectations/expectations/core/expect_column_values_to_be_in_set.py
|
from typing import TYPE_CHECKING, List, Optional, Union
from great_expectations.core import (
ExpectationConfiguration,
ExpectationValidationResult,
)
from great_expectations.expectations.expectation import (
ColumnMapExpectation,
InvalidExpectationConfigurationError,
)
from great_expectations.render import (
LegacyDescriptiveRendererType,
LegacyRendererType,
RenderedBulletListContent,
RenderedStringTemplateContent,
ValueListContent,
)
from great_expectations.render.renderer.renderer import renderer
from great_expectations.render.renderer_configuration import (
RendererConfiguration,
RendererValueType,
)
from great_expectations.render.util import (
num_to_str,
parse_row_condition_string_pandas_engine,
substitute_none_for_missing,
)
from great_expectations.rule_based_profiler.config import (
ParameterBuilderConfig,
RuleBasedProfilerConfig,
)
from great_expectations.rule_based_profiler.parameter_container import (
DOMAIN_KWARGS_PARAMETER_FULLY_QUALIFIED_NAME,
FULLY_QUALIFIED_PARAMETER_NAME_METADATA_KEY,
FULLY_QUALIFIED_PARAMETER_NAME_SEPARATOR_CHARACTER,
FULLY_QUALIFIED_PARAMETER_NAME_VALUE_KEY,
PARAMETER_KEY,
VARIABLES_KEY,
)
try:
import sqlalchemy as sa # noqa: F401
except ImportError:
pass
from great_expectations.expectations.expectation import (
render_evaluation_parameter_string,
)
if TYPE_CHECKING:
from great_expectations.render.renderer_configuration import RendererParams
class ExpectColumnValuesToBeInSet(ColumnMapExpectation):
"""Expect each column value to be in a given set.
For example:
::
# my_df.my_col = [1,2,2,3,3,3]
>>> my_df.expect_column_values_to_be_in_set(
"my_col",
[2,3]
)
{
"success": false
"result": {
"unexpected_count": 1
"unexpected_percent": 16.66666666666666666,
"unexpected_percent_nonmissing": 16.66666666666666666,
"partial_unexpected_list": [
1
],
},
}
expect_column_values_to_be_in_set is a \
[Column Map Expectation](https://docs.greatexpectations.io/docs/guides/expectations/creating_custom_expectations/how_to_create_custom_column_map_expectations).
Args:
column (str): \
The column name.
value_set (set-like): \
A set of objects used for comparison.
Keyword Args:
mostly (None or a float between 0 and 1): \
Successful if at least mostly fraction of values match the expectation. \
For more detail, see [mostly](https://docs.greatexpectations.io/docs/reference/expectations/standard_arguments/#mostly).
        parse_strings_as_datetimes (boolean or None): If True, values provided in value_set will be parsed as \
datetimes before making comparisons.
Other Parameters:
result_format (str or None): \
Which output mode to use: BOOLEAN_ONLY, BASIC, COMPLETE, or SUMMARY. \
For more detail, see [result_format](https://docs.greatexpectations.io/docs/reference/expectations/result_format).
include_config (boolean): \
If True, then include the expectation config as part of the result object.
catch_exceptions (boolean or None): \
If True, then catch exceptions and include them as part of the result object. \
For more detail, see [catch_exceptions](https://docs.greatexpectations.io/docs/reference/expectations/standard_arguments/#catch_exceptions).
meta (dict or None): \
A JSON-serializable dictionary (nesting allowed) that will be included in the output without \
modification. For more detail, see [meta](https://docs.greatexpectations.io/docs/reference/expectations/standard_arguments/#meta).
Returns:
An [ExpectationSuiteValidationResult](https://docs.greatexpectations.io/docs/terms/validation_result)
Exact fields vary depending on the values passed to result_format, include_config, catch_exceptions, and meta.
See Also:
[expect_column_values_to_not_be_in_set](https://greatexpectations.io/expectations/expect_column_values_to_not_be_in_set)
"""
# This dictionary contains metadata for display in the public gallery
library_metadata = {
"maturity": "production",
"tags": ["core expectation", "column map expectation"],
"contributors": ["@great_expectations"],
"requirements": [],
"has_full_test_suite": True,
"manually_reviewed_code": True,
}
map_metric = "column_values.in_set"
args_keys = (
"column",
"value_set",
)
success_keys = (
"value_set",
"mostly",
"parse_strings_as_datetimes",
"auto",
"profiler_config",
)
value_set_estimator_parameter_builder_config: ParameterBuilderConfig = (
ParameterBuilderConfig(
module_name="great_expectations.rule_based_profiler.parameter_builder",
class_name="ValueSetMultiBatchParameterBuilder",
name="value_set_estimator",
metric_domain_kwargs=DOMAIN_KWARGS_PARAMETER_FULLY_QUALIFIED_NAME,
metric_value_kwargs=None,
evaluation_parameter_builder_configs=None,
)
)
validation_parameter_builder_configs: List[ParameterBuilderConfig] = [
value_set_estimator_parameter_builder_config,
]
default_profiler_config = RuleBasedProfilerConfig(
name="expect_column_values_to_be_in_set", # Convention: use "expectation_type" as profiler name.
config_version=1.0,
variables={},
rules={
"default_expect_column_values_to_be_in_set_rule": {
"variables": {
"mostly": 1.0,
},
"domain_builder": {
"class_name": "ColumnDomainBuilder",
"module_name": "great_expectations.rule_based_profiler.domain_builder",
},
"expectation_configuration_builders": [
{
"expectation_type": "expect_column_values_to_be_in_set",
"class_name": "DefaultExpectationConfigurationBuilder",
"module_name": "great_expectations.rule_based_profiler.expectation_configuration_builder",
"validation_parameter_builder_configs": validation_parameter_builder_configs,
"column": f"{DOMAIN_KWARGS_PARAMETER_FULLY_QUALIFIED_NAME}{FULLY_QUALIFIED_PARAMETER_NAME_SEPARATOR_CHARACTER}column",
"value_set": f"{PARAMETER_KEY}{value_set_estimator_parameter_builder_config.name}{FULLY_QUALIFIED_PARAMETER_NAME_SEPARATOR_CHARACTER}{FULLY_QUALIFIED_PARAMETER_NAME_VALUE_KEY}",
"mostly": f"{VARIABLES_KEY}mostly",
"meta": {
"profiler_details": f"{PARAMETER_KEY}{value_set_estimator_parameter_builder_config.name}{FULLY_QUALIFIED_PARAMETER_NAME_SEPARATOR_CHARACTER}{FULLY_QUALIFIED_PARAMETER_NAME_METADATA_KEY}",
},
},
],
},
},
)
default_kwarg_values = {
"value_set": [],
"parse_strings_as_datetimes": False,
"auto": False,
"profiler_config": default_profiler_config,
}
@classmethod
def _prescriptive_template(
cls,
renderer_configuration: RendererConfiguration,
) -> RendererConfiguration:
add_param_args = (
("column", RendererValueType.STRING),
("value_set", RendererValueType.ARRAY),
("mostly", RendererValueType.NUMBER),
("parse_strings_as_datetimes", RendererValueType.BOOLEAN),
)
for name, param_type in add_param_args:
renderer_configuration.add_param(name=name, param_type=param_type)
params: RendererParams = renderer_configuration.params
template_str = ""
if params.value_set:
renderer_configuration = cls._add_value_set_params(
renderer_configuration=renderer_configuration
)
value_set_str: str = cls._get_value_set_string(
renderer_configuration=renderer_configuration
)
template_str += f"values must belong to this set: {value_set_str}"
if params.mostly and params.mostly.value < 1.0:
renderer_configuration = cls._add_mostly_pct_param(
renderer_configuration=renderer_configuration
)
template_str += ", at least $mostly_pct % of the time."
else:
template_str += "."
if params.parse_strings_as_datetimes:
template_str += " Values should be parsed as datetimes."
if renderer_configuration.include_column_name:
template_str = f"$column {template_str}"
renderer_configuration.template_str = template_str
return renderer_configuration
@classmethod
@renderer(renderer_type=LegacyRendererType.PRESCRIPTIVE)
@render_evaluation_parameter_string
def _prescriptive_renderer(
cls,
configuration: Optional[ExpectationConfiguration] = None,
result: Optional[ExpectationValidationResult] = None,
runtime_configuration: Optional[dict] = None,
) -> List[RenderedStringTemplateContent]:
renderer_configuration = RendererConfiguration(
configuration=configuration,
result=result,
runtime_configuration=runtime_configuration,
)
params = substitute_none_for_missing(
renderer_configuration.kwargs,
[
"column",
"value_set",
"mostly",
"parse_strings_as_datetimes",
"row_condition",
"condition_parser",
],
)
if params["value_set"] is None or len(params["value_set"]) == 0:
values_string = "[ ]"
else:
for i, v in enumerate(params["value_set"]):
params[f"v__{str(i)}"] = v
values_string = " ".join(
[f"$v__{str(i)}" for i, v in enumerate(params["value_set"])]
)
template_str = f"values must belong to this set: {values_string}"
if params["mostly"] is not None and params["mostly"] < 1.0:
params["mostly_pct"] = num_to_str(
params["mostly"] * 100, precision=15, no_scientific=True
)
# params["mostly_pct"] = "{:.14f}".format(params["mostly"]*100).rstrip("0").rstrip(".")
template_str += ", at least $mostly_pct % of the time."
else:
template_str += "."
if params.get("parse_strings_as_datetimes"):
template_str += " Values should be parsed as datetimes."
if renderer_configuration.include_column_name:
template_str = f"$column {template_str}"
if params["row_condition"] is not None:
(
conditional_template_str,
conditional_params,
) = parse_row_condition_string_pandas_engine(params["row_condition"])
template_str = f"{conditional_template_str}, then {template_str}"
params.update(conditional_params)
styling = (
runtime_configuration.get("styling", {}) if runtime_configuration else {}
)
return [
RenderedStringTemplateContent(
**{
"content_block_type": "string_template",
"string_template": {
"template": template_str,
"params": params,
"styling": styling,
},
}
)
]
@classmethod
@renderer(renderer_type=LegacyDescriptiveRendererType.EXAMPLE_VALUES_BLOCK)
def _descriptive_example_values_block_renderer(
cls,
configuration: Optional[ExpectationConfiguration] = None,
result: Optional[ExpectationValidationResult] = None,
runtime_configuration: Optional[dict] = None,
) -> Optional[Union[RenderedBulletListContent, ValueListContent]]:
assert result, "Must pass in result."
if "partial_unexpected_counts" in result.result:
partial_unexpected_counts = result.result["partial_unexpected_counts"]
values = [str(v["value"]) for v in partial_unexpected_counts]
elif "partial_unexpected_list" in result.result:
values = [str(item) for item in result.result["partial_unexpected_list"]]
else:
return
classes = ["col-3", "mt-1", "pl-1", "pr-1"]
if any(len(value) > 80 for value in values):
content_block_type = "bullet_list"
content_block_class = RenderedBulletListContent
else:
content_block_type = "value_list"
content_block_class = ValueListContent
new_block = content_block_class(
**{
"content_block_type": content_block_type,
"header": RenderedStringTemplateContent(
**{
"content_block_type": "string_template",
"string_template": {
"template": "Example Values",
"tooltip": {"content": "expect_column_values_to_be_in_set"},
"tag": "h6",
},
}
),
content_block_type: [
{
"content_block_type": "string_template",
"string_template": {
"template": "$value",
"params": {"value": value},
"styling": {
"default": {
"classes": ["badge", "badge-info"]
if content_block_type == "value_list"
else [],
"styles": {"word-break": "break-all"},
},
},
},
}
for value in values
],
"styling": {
"classes": classes,
},
}
)
return new_block
def validate_configuration(
self, configuration: Optional[ExpectationConfiguration] = None
) -> None:
"""Validates that a value_set has been provided."""
super().validate_configuration(configuration)
# supports extensibility by allowing value_set to not be provided in config but captured via child-class default_kwarg_values, e.g. parameterized expectations
value_set = configuration.kwargs.get(
"value_set"
) or self.default_kwarg_values.get("value_set")
try:
assert (
"value_set" in configuration.kwargs or value_set
), "value_set is required"
assert isinstance(
value_set, (list, set, dict)
), "value_set must be a list, set, or dict"
if isinstance(value_set, dict):
assert (
"$PARAMETER" in value_set
), 'Evaluation Parameter dict for value_set kwarg must have "$PARAMETER" key.'
except AssertionError as e:
raise InvalidExpectationConfigurationError(str(e))
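# ---------------------------------------------------------------------------
# Illustrative usage (not part of the original module): a minimal sketch
# assuming the legacy pandas dataset interface; `df` is a hypothetical
# dataset built from an in-memory DataFrame.
#
#   import great_expectations as ge
#   import pandas as pd
#
#   df = ge.dataset.PandasDataset(pd.DataFrame({"my_col": [1, 2, 2, 3, 3, 3]}))
#   result = df.expect_column_values_to_be_in_set("my_col", [2, 3], mostly=0.8)
#   print(result.success)  # True: 5 of 6 values (~83%) fall within {2, 3}
# ---------------------------------------------------------------------------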
|
PypiClean
|
/spectral-cube-0.6.2.tar.gz/spectral-cube-0.6.2/docs/index.rst
|
Spectral Cube documentation
===========================
The spectral-cube package provides an easy way to read, manipulate, analyze,
and write data cubes with two positional dimensions and one spectral dimension,
optionally with Stokes parameters. It provides the following main features:
- A uniform interface to spectral cubes, robust to the
wide range of conventions of axis order, spatial projections,
and spectral units that exist in the wild.
- Easy extraction of cube sub-regions using physical coordinates.
- Ability to easily create, combine, and apply masks to datasets.
- Basic summary statistic methods like moments and array aggregates.
- Designed to work with datasets too large to load into memory.
You can find the latest version and the issue tracker on `github
<https://github.com/radio-astro-tools/spectral-cube>`_.
Quick start
-----------
Here's a simple script demonstrating the spectral-cube package::
>>> import astropy.units as u
>>> from astropy.utils import data
>>> from spectral_cube import SpectralCube
>>> fn = data.get_pkg_data_filename('tests/data/example_cube.fits', 'spectral_cube')
>>> cube = SpectralCube.read(fn)
>>> print(cube)
SpectralCube with shape=(7, 4, 3) and unit=Jy / beam:
n_x: 3 type_x: RA---ARC unit_x: deg range: 52.231466 deg: 52.231544 deg
n_y: 4 type_y: DEC--ARC unit_y: deg range: 31.243639 deg: 31.243739 deg
n_s: 7 type_s: VRAD unit_s: m / s range: 14322.821 m / s: 14944.909 m / s
# extract the subcube between 98 and 100 GHz
>>> slab = cube.spectral_slab(98 * u.GHz, 100 * u.GHz) # doctest: +SKIP
# Ignore elements fainter than 1 Jy/beam
    >>> masked_slab = slab.with_mask(slab > 1 * u.Jy / u.beam)  # doctest: +SKIP
# Compute the first moment and write to file
>>> m1 = masked_slab.moment(order=1) # doctest: +SKIP
>>> m1.write('moment_1.fits') # doctest: +SKIP
Using spectral-cube
-------------------
The package centers around the
:class:`~spectral_cube.SpectralCube` class. In the following
sections, we look at how to read data into this class, manipulate spectral
cubes, extract moment maps or subsets of spectral cubes, and write spectral
cubes to files.
Getting started
^^^^^^^^^^^^^^^
.. toctree::
:maxdepth: 2
installing.rst
creating_reading.rst
accessing.rst
Cube Analysis
^^^^^^^^^^^^^
.. toctree::
:maxdepth: 2
moments.rst
errors.rst
writing.rst
beam_handling.rst
masking.rst
arithmetic.rst
metadata.rst
smoothing.rst
reprojection.rst
Subsets
^^^^^^^
.. toctree::
:maxdepth: 2
manipulating.rst
spectral_extraction.rst
continuum_subtraction.rst
Visualization
^^^^^^^^^^^^^
.. toctree::
:maxdepth: 2
quick_looks.rst
visualization.rst
Other Examples
^^^^^^^^^^^^^^
.. toctree::
:maxdepth: 2
examples.rst
There is also an `astropy tutorial
<http://learn.astropy.org/rst-tutorials/FITS-cubes.html>`__ on accessing and
manipulating FITS cubes with spectral-cube.
Advanced
^^^^^^^^
.. toctree::
:maxdepth: 1
dask.rst
yt_example.rst
big_data.rst
developing_with_spectralcube.rst
api.rst
|
PypiClean
|
/cometml_api-0.1.tar.gz/cometml_api-0.1/README.md
|
# comet_ml_api
This is a set of _unofficial_ Python bindings for the CometML REST API. There are functions for all current endpoints as well as a couple of functions that build on the basic endpoint ones to provide, e.g., simpler output or filtering. I haven't used all of the endpoints myself, so some of the functions haven't been tested at all (e.g. `get_html`). Documentation is currently limited but the functions provided are generally quite simple.
See [Endpoints][endpoints] for the official documentation of the REST API.
## Authentication
See [this page][api key] first for how to generate your REST API key. There are two possible methods of authentication using that key:
1. Store the key in `~/.comet_rest_key`. When you import the `api` script, it will automatically load a key that it finds in this file.
2. Call the `set_api_key` function and pass your key in (see the sketch below). The key will be saved in a global variable so that the same key is used for all subsequent requests unless you explicitly set a new key.
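A minimal sketch of the second method (the key string is a placeholder):
```python
from comet_ml_api import api

api.set_api_key("YOUR_REST_API_KEY")  # stored globally for subsequent requests
workspaces = api.get_workspaces()  # this call now authenticates with that key
```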
## Endpoints
Most of the basic endpoint functions are named `get_<endpoint>` where `<endpoint>` is the corresponding endpoint (e.g. `get_workspaces` to access `https://www.comet.ml/api/rest/v1/workspaces`). However, there are a few exceptions.
### Metrics
There are two endpoints for metrics: `metrics` and `metrics-raw`. I call both of these "raw" because they return data that isn't well-formatted for immediate plotting/analysis. As the `metrics` endpoint only returns the min, max, and most recent data points for a given metric, I call that one a summary, hence `get_raw_metric_summaries`; the function for `metrics-raw` is `get_raw_metrics`. There is also a helper function `get_metrics` which converts the metrics into a better format for visualization or analysis.
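For instance (the experiment key is a placeholder; the exact signatures are assumed from the descriptions above):
```python
from comet_ml_api import api

raw = api.get_raw_metrics("EXPERIMENT_KEY")  # as returned by the metrics-raw endpoint
metrics = api.get_metrics("EXPERIMENT_KEY")  # reshaped for plotting/analysis
```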
### Params
Similarly to metrics, the raw parameters data may not be in the most usable format right away. I thus also call this endpoint `get_raw_params` and have a helper function `get_params` which provides a more concise output.
### Other
The same pattern as for params applies: though the endpoint is `log-other`, the functions are `get_raw_others` and `get_others`.
### Images
The `images` endpoint doesn't return the images themselves, just the data about them (including, it seems, the URLs from which the actual images can be downloaded). I call this one `get_image_data`, but I haven't tested it.
## Example Usage
```python
from comet_ml_api import api
workspaces = api.get_workspaces()
project_ids = api.get_project_names_and_ids(workspaces[0]) # {name: id}
experiments = api.get_experiments(project_ids.popitem()[1])
api.get_params(experiments[0]["experiment_key"])
```
[endpoints]: https://www.comet.ml/docs/rest-api/endpoints/
[api key]: https://www.comet.ml/docs/rest-api/getting-started/
|
PypiClean
|
/DTSR-0.2.0.tar.gz/DTSR-0.2.0/dtsr/bin/optimize_spillover.py
|
import argparse
import os
import sys
import itertools
import re
import pandas as pd
from dtsr.baselines import LM
import gc
from dtsr.config import Config
from dtsr.io import read_data
from dtsr.formula import Formula
from dtsr.data import preprocess_data, compute_splitID, compute_partition
from dtsr.util import mse, mae
pd.options.mode.chained_assignment = None
splitter = re.compile(' *[-+|:] *')
pred_name = re.compile('[^(]+\((.+)\)')
def clean_parens(l):
for i, x in enumerate(l):
x_new = x
if x.startswith('('):
x_new = x[1:]
if x.endswith(')'):
x_new = x[:-1]
l[i] = x_new
return l
def get_preds(bform):
preds = set()
l = bform.split('~')[1].strip()
if l.startswith('('):
l = splitter.split(l.strip())
        p_list = clean_parens(l)
else:
p_list = splitter.split(l.strip())
for p in p_list:
name = p
while pred_name.match(name):
name = pred_name.match(name).group(1)
if name not in preds:
preds.add(name)
return preds
def update_preds(preds, perm):
preds_new = preds[:]
for i,x in enumerate(preds):
if perm[i] > 0:
preds_new[i] = x + 'S%d' %perm[i]
return preds_new
def permute_spillover(bform, preds, perms):
forms = []
for perm in perms:
form_name = []
for i in range(len(preds)):
form_name.append(preds[i][:2] + str(perm[i]))
preds_new = update_preds(preds, perm)
l = bform
for i in range(len(preds)):
l = re.sub(r'([+^ (])'+preds[i]+'([+$ )])', r'\1'+preds_new[i]+r'\2', l)
        forms.append(l)
    return forms
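# Illustration (hypothetical inputs): with preds = ['wlen', 'freq'] and
# perms = [(0, 0), (1, 0)], the formula 'y ~ (wlen) + (freq)' expands to
# ['y ~ (wlen) + (freq)', 'y ~ (wlenS1) + (freq)'], i.e. one candidate
# formula per assignment of spillover positions (S1 = lagged by one word).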
if __name__ == '__main__':
argparser = argparse.ArgumentParser('''
Trains model(s) from formula string(s) given data.
''')
argparser.add_argument('config_path', help='Path to configuration (*.ini) file')
args, unknown = argparser.parse_known_args()
p = Config(args.config_path)
models = p.model_list[:]
lm_formula = p.models['LMnoS_noRE']['formula']
preds = get_preds(lm_formula)
preds = list(preds)
preds.sort(reverse=True, key=lambda x: len(x))
n_pred = len(preds)
perms = list(itertools.product(range(0, 4), repeat=n_pred))
forms = permute_spillover(lm_formula, preds, perms)
dtsr_formula_list = [Formula(p.models[m]['formula']) for m in p.model_list if m.startswith('DTSR')]
dtsr_formula_name_list = [m for m in p.model_list if m.startswith('DTSR')]
X, y = read_data(p.X_train, p.y_train, p.series_ids, categorical_columns=list(set(p.series_ids + [v for x in dtsr_formula_list for v in x.rangf])))
X, y, select, _, _, _, _ = preprocess_data(
X,
y,
dtsr_formula_list,
p.series_ids,
filter_map=p.filter_map,
compute_history=False,
history_length=p.history_length
)
from dtsr.baselines import py2ri
X['splitID'] = compute_splitID(X, p.split_ids)
part = compute_partition(X, p.modulus, 3)
part_select = part[0]
X_baseline = X[part_select]
X_baseline = X_baseline.reset_index(drop=True)[select]
n_train_sample = len(y)
sys.stderr.write('\nNumber of training samples: %d\n' %n_train_sample)
for i in range(len(dtsr_formula_list)):
x = dtsr_formula_list[i]
if x.dv not in X_baseline.columns:
X_baseline[x.dv] = y[x.dv]
for c in X_baseline.columns:
if X_baseline[c].dtype.name == 'category':
X_baseline[c] = X_baseline[c].astype(str)
X_baseline = py2ri(X_baseline)
if not os.path.exists(p.outdir + '/spillover/'):
os.makedirs(p.outdir + '/spillover/')
for formula in forms:
m = '_'.join(sorted(list(get_preds(formula))))
dv = formula.strip().split('~')[0].strip()
if os.path.exists(p.outdir + '/spillover/' + m + '.txt'):
sys.stderr.write('Model %s already exists. Skipping...\n' % m)
continue
else:
sys.stderr.write('Fitting model %s...\n' % m)
lm = LM(formula, X_baseline)
gc.collect()
lm_preds = lm.predict(X_baseline)
lm_mse = mse(y[dv], lm_preds)
lm_mae = mae(y[dv], lm_preds)
summary = '=' * 50 + '\n'
summary += 'Linear regression\n\n'
summary += 'Model name: %s\n\n' % m
summary += 'Formula:\n'
summary += ' ' + formula + '\n'
summary += str(lm.summary()) + '\n'
summary += 'Training set loss:\n'
summary += ' MSE: %.4f\n' % lm_mse
summary += ' MAE: %.4f\n' % lm_mae
summary += '=' * 50 + '\n'
with open(p.outdir + '/spillover/' + m + '.txt', 'w') as f_out:
f_out.write(summary)
sys.stderr.write(summary)
sys.stderr.write('\n\n')
|
PypiClean
|
/sftpgo-client-0.3.1.tar.gz/sftpgo-client-0.3.1/sftpgo_client/base/api/admins/generate_admin_totp_secret.py
|
from typing import Any, Dict, Optional, Union, cast
import httpx
from ...client import AuthenticatedClient
from ...models.generate_admin_totp_secret_json_body import (
GenerateAdminTotpSecretJsonBody,
)
from ...models.generate_admin_totp_secret_response_200 import (
GenerateAdminTotpSecretResponse200,
)
from ...types import Response
def _get_kwargs(
*,
client: AuthenticatedClient,
json_body: GenerateAdminTotpSecretJsonBody,
) -> Dict[str, Any]:
url = "{}/admin/totp/generate".format(client.base_url)
headers: Dict[str, str] = client.get_headers()
cookies: Dict[str, Any] = client.get_cookies()
json_json_body = json_body.to_dict()
return {
"method": "post",
"url": url,
"headers": headers,
"cookies": cookies,
"timeout": client.get_timeout(),
"json": json_json_body,
}
def _parse_response(
*, response: httpx.Response
) -> Optional[Union[Any, GenerateAdminTotpSecretResponse200]]:
if response.status_code == 200:
response_200 = GenerateAdminTotpSecretResponse200.from_dict(response.json())
return response_200
if response.status_code == 400:
response_400 = cast(Any, None)
return response_400
if response.status_code == 401:
response_401 = cast(Any, None)
return response_401
if response.status_code == 403:
response_403 = cast(Any, None)
return response_403
if response.status_code == 500:
response_500 = cast(Any, None)
return response_500
return None
def _build_response(
*, response: httpx.Response
) -> Response[Union[Any, GenerateAdminTotpSecretResponse200]]:
return Response(
status_code=response.status_code,
content=response.content,
headers=response.headers,
parsed=_parse_response(response=response),
)
def sync_detailed(
*,
client: AuthenticatedClient,
json_body: GenerateAdminTotpSecretJsonBody,
) -> Response[Union[Any, GenerateAdminTotpSecretResponse200]]:
"""Generate a new TOTP secret
Generates a new TOTP secret, including the QR code as png, using the specified configuration for the
logged in admin
Args:
json_body (GenerateAdminTotpSecretJsonBody):
Returns:
Response[Union[Any, GenerateAdminTotpSecretResponse200]]
"""
kwargs = _get_kwargs(
client=client,
json_body=json_body,
)
response = httpx.request(
verify=client.verify_ssl,
**kwargs,
)
return _build_response(response=response)
def sync(
*,
client: AuthenticatedClient,
json_body: GenerateAdminTotpSecretJsonBody,
) -> Optional[Union[Any, GenerateAdminTotpSecretResponse200]]:
"""Generate a new TOTP secret
Generates a new TOTP secret, including the QR code as png, using the specified configuration for the
logged in admin
Args:
json_body (GenerateAdminTotpSecretJsonBody):
Returns:
Response[Union[Any, GenerateAdminTotpSecretResponse200]]
"""
return sync_detailed(
client=client,
json_body=json_body,
).parsed
async def asyncio_detailed(
*,
client: AuthenticatedClient,
json_body: GenerateAdminTotpSecretJsonBody,
) -> Response[Union[Any, GenerateAdminTotpSecretResponse200]]:
"""Generate a new TOTP secret
Generates a new TOTP secret, including the QR code as png, using the specified configuration for the
logged in admin
Args:
json_body (GenerateAdminTotpSecretJsonBody):
Returns:
Response[Union[Any, GenerateAdminTotpSecretResponse200]]
"""
kwargs = _get_kwargs(
client=client,
json_body=json_body,
)
async with httpx.AsyncClient(verify=client.verify_ssl) as _client:
response = await _client.request(**kwargs)
return _build_response(response=response)
async def asyncio(
*,
client: AuthenticatedClient,
json_body: GenerateAdminTotpSecretJsonBody,
) -> Optional[Union[Any, GenerateAdminTotpSecretResponse200]]:
"""Generate a new TOTP secret
Generates a new TOTP secret, including the QR code as png, using the specified configuration for the
logged in admin
Args:
json_body (GenerateAdminTotpSecretJsonBody):
Returns:
Response[Union[Any, GenerateAdminTotpSecretResponse200]]
"""
return (
await asyncio_detailed(
client=client,
json_body=json_body,
)
).parsed
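# ---------------------------------------------------------------------------
# Illustrative usage (not part of the generated module): the base URL and
# token are placeholders, and the absolute import paths are inferred from the
# relative imports above.
#
#   from sftpgo_client.base.client import AuthenticatedClient
#   from sftpgo_client.base.models.generate_admin_totp_secret_json_body import (
#       GenerateAdminTotpSecretJsonBody,
#   )
#
#   client = AuthenticatedClient(
#       base_url="https://sftpgo.example.com/api/v2", token="ADMIN_TOKEN"
#   )
#   secret = sync(client=client, json_body=GenerateAdminTotpSecretJsonBody())
# ---------------------------------------------------------------------------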
|
PypiClean
|
/bpelib-v0.1.2.tar.gz/bpelib-v0.1.2/bpe/bpe.py
|
import os
import re
import pickle
import collections
import numpy as np
from tqdm import tqdm
from typing import List, Tuple, Dict, Iterable
from abc import ABC, abstractmethod, abstractproperty
try:
import bpelib.bpe.libc as bpelibc
except ImportError as import_error:
print(import_error)
print("Continuing without c libraries ...")
bpelibc = None
class WordFreq(Dict[bytes, int]):
def merge_pair(self, pair: Tuple[bytes, bytes]) -> None:
"""
        Merges the given pair (typically the most frequent one) throughout this word-freq dictionary, in place.
        :param pair: the byte pair to merge, usually the most frequent bigram in the dictionary
"""
# create a valid regular expression from the provided most-frequent-pair
bigrams = re.escape(b' '.join(pair))
# create a regex object with negative lookbehind and lookahead around our regex string
# no non-space character is allowed to precede or follow our regex string
p = re.compile(rb'(?<!\S)' + bigrams + rb'(?!\S)')
for word in self.copy():
if p.search(word):
w_out = p.sub(b''.join(pair), word)
self[w_out] = self.pop(word)
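# Example (illustrative): given WordFreq({b'l o w </w>': 5, b'l o w e r </w>': 2})
# and pair (b'l', b'o'), merge_pair rewrites the keys in place, leaving
# {b'lo w </w>': 5, b'lo w e r </w>': 2}.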
class _BPE_template(ABC):
"""
Template class for Byte Pair Encoder
Learns to encode any given word, to a series of learned substrings.
Note: do not use this class directly
"""
# there should not be any unknown bytes
__all_bytes: Dict[bytes, int] = {bytes([i]): i for i in range(256)}
# whether to show the learning process or not
mute = False
@staticmethod
def all_bytes() -> Dict[bytes, int]:
return _BPE_template.__all_bytes.copy()
def __eq__(self, other: '_BPE_template'):
if not isinstance(other, _BPE_template):
return False
return (self._vocab == other._vocab and self._merges == other._merges and
self._vocab_size == other._vocab_size and self._eow == other._eow and self._sow == other._sow)
@abstractmethod
def __call__(self, *args, **kwargs): ...
def _vocab_add(self, item: bytes) -> None:
"""
Adds a key to the internal vocabulary, with incrementing integer value.
:param item: the item to add to the vocabulary
"""
if item not in self._vocab:
self._vocab[item] = self._vocab_size
self._vocab_size += 1
def init(self) -> None:
"""
Initialize or reinitialize the learned merge operations and vocabulary.
"""
# initialize the vocabulary
self._vocab = BPE.all_bytes()
self._vocab_size = len(self._vocab)
self._vocab_add(self._sow)
self._vocab_add(self._eow)
# initialize the dictionary containing the merge operations
self._merges: Dict[Tuple[bytes, bytes], int] = dict()
def _learn_encoding_impl(self, corpus: Iterable[bytes], max_vocab_size: int):
# returns frequency of each word
self._word_freq = collections.Counter(corpus)
# convert word_freq object to dictionary
self._word_freq = WordFreq(self._word_freq)
        max_it_num = max_vocab_size - 258  # 256 single bytes plus the sow/eow tokens are already in the vocabulary
iterator = tqdm(range(max_it_num),
desc='Learning BPE ...', ncols=100, ascii=' >>>>>>>>>>>>>=', disable=BPE.mute)
for i in iterator:
# compute frequency of bigrams in a corpus
pairs = BPE._make_pairs(self._word_freq)
# no more byte pairs found -> end loop
if not pairs:
iterator.n = max_it_num
iterator.close()
break
# compute the best pair
best = max(pairs, key=pairs.get)
# merge the most frequent pair in corpus
self._word_freq.merge_pair(best)
# append to merge dict
self._merges[best] = i
# convert a tuple to a string, then append to vocabulary
new_byte = b''.join(best)
self._vocab_add(new_byte)
@staticmethod
def _make_pairs(word_freq: WordFreq) -> Dict[Tuple[bytes, bytes], int]:
"""
Computes frequency of a pair of characters or character sequences.
:param word_freq: dictionary containing each word and its frequencies
:return: frequency of each pair
"""
# any unknown key defaults to 'calling int()', so basically it is zero
pairs = collections.defaultdict(int)
# iterate all word frequency dictionary items
for word, freq in word_freq.items():
# unique symbols in a word
symbols = word.split()
# counting pairs
for i in range(len(symbols) - 1):
pairs[symbols[i], symbols[i + 1]] += freq
return pairs
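    # Example (illustrative): for word_freq {b'lo w </w>': 5}, _make_pairs
    # returns {(b'lo', b'w'): 5, (b'w', b'</w>'): 5}.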
@abstractmethod
def encode(self, word): ...
@abstractmethod
def decode(self, word): ...
def save(self, directory: str = './') -> None:
"""
Saves vocabulary and merge operations to a specified folder.
:param directory: the folder to save the BPE data
"""
        # create the target directory (and any missing parents) as needed
        directory = os.path.abspath(directory)
        os.makedirs(directory, exist_ok=True)
# save the files
        with open(os.path.join(directory, '__vocab__'), 'wb') as f:
            pickle.dump(self._vocab, f, pickle.HIGHEST_PROTOCOL)
        with open(os.path.join(directory, '__merges__'), 'wb') as f:
            pickle.dump(self._merges, f, pickle.HIGHEST_PROTOCOL)
def load(self, directory: str) -> None:
"""
        Loads the vocabulary and the learned merge operations from a previously saved model.
:param directory: the folder where we last saved our data
"""
        with open(os.path.join(directory, '__vocab__'), 'rb') as f:
            self._vocab = pickle.load(f)
        with open(os.path.join(directory, '__merges__'), 'rb') as f:
            self._merges = pickle.load(f)
self._vocab_size = len(self._vocab)
@property
def vocab(self): ...
@property
def merges(self): ...
@property
def sow(self): ...
@property
def eow(self): ...
class BPE(_BPE_template):
"""
Byte Pair Encoder class for string inputs
    Learns to encode any given word into a series of learned substrings.
"""
def __call__(self, word: str, encode: bool = None, encoding: str = None) -> str:
# try guessing what we want (a very basic guess based on sow and eow tokens)
if encode is None:
# get or set encoding
encoding = encoding if encoding else self._enc
# try finding Start or End Of Word token
if word.find(self._sow.decode(encoding)) < 0 or word.find(self._eow.decode(encoding)) < 0:
return self.encode(word, encoding)
else:
return self.decode(word, encoding)
if encode:
return self.encode(word, encoding)
else:
return self.decode(word, encoding)
def __init__(self,
corpus: Iterable[str] = None,
max_vocab_size: int = 4096,
sow: str = '<w/>',
eow: str = '</w>',
encoding: str = 'utf8'
):
"""
Constructor of Byte Pair Encoding class, will start learning bpe if corpus is provided.
:param corpus: a list of tokens or words
:param max_vocab_size: allowed dictionary size
:param sow: start of word token
:param eow: end of word token
:param encoding: the encoding of the corpus, or the text to be learned
"""
self._max_vocab_size = max_vocab_size
self._sow = bytes(sow, encoding=encoding)
self._eow = bytes(eow, encoding=encoding)
self._enc = encoding
self._word_freq = None
self._vocab: Dict[bytes, int] = dict()
self._vocab_size: int = 0
self._merges: Dict[Tuple[bytes, bytes], int] = dict()
self.init()
if corpus:
# learn the byte pair encoding
self.learn_encoding(corpus, max_vocab_size, encoding)
def learn_encoding(self,
corpus: Iterable[str],
max_vocab_size: int = None,
encoding: str = None
):
"""
Starts the learning of byte pair encodings in a given corpus.
:param corpus: a list of tokens or words
:param max_vocab_size: allowed dictionary size
:param encoding: the encoding of the corpus, or the text to be learned
"""
# get or set encoding
encoding = encoding if encoding else self._enc
# get or set the maximum size of the vocabulary and bpe merges
max_vocab_size = max_vocab_size if max_vocab_size else self._max_vocab_size
        # convert each token to bytes; tokens are then split byte-by-byte and space-joined,
        # with Start/End Of Word tokens added around each one
        b_corpus = [bytes(token, encoding=encoding) for token in tqdm(
            corpus, desc='Conversion to bytes ...', ncols=100, ascii=' >>>>>>>>>>>>>=', disable=self.mute
        )]
        if bpelibc:
            iterator_obj = tqdm(b_corpus, desc='Splitting bytes ...',
                                ncols=100, ascii=' >>>>>>>>>>>>>=', disable=self.mute)
            b_corpus = bpelibc.split_bytes_in_iterable_and_add_boundary(iterator_obj, self._sow, self._eow)
        else:
            # dtype='S1' splits the buffer into single-byte bytes objects that b' '.join accepts
            b_corpus = [
                self._sow + b' ' +
                b' '.join(np.frombuffer(token, dtype='S1')) +
                b' ' + self._eow for token in tqdm(
                    b_corpus, desc='Splitting bytes ...', ncols=100, ascii=' >>>>>>>>>>>>>=', disable=self.mute
                )]
self._learn_encoding_impl(b_corpus, max_vocab_size)
def encode(self, word: str, encoding: str = None) -> str:
"""
Helps to encode a word, with the learned Byte Pair Encoding.
:param word: the word or words to be encoded
:param encoding: original encoding of the string, if not provided,
the one from the constructor will be used (or the default one -> utf8)
:return: the encoded word
"""
# get or set encoding
encoding = encoding if encoding else self._enc
# convert the given string to bytes
b_word = bytes(word, encoding=encoding)
# support the case of multiple words
words = word.split(' ')
b_words = b_word.split(b' ')
# iterate over the list of words
for i, (w, bw) in enumerate(zip(words, b_words)):
# if the byte word could be found in the vocabulary
if self._sow + bw + self._eow in self._vocab:
words[i] = self._sow.decode(encoding=encoding) + w + self._eow.decode(encoding=encoding)
continue
# split byte by byte
            bw = self._sow + b' ' + b' '.join(np.frombuffer(bw, dtype='S1')) + b' ' + self._eow
# dummy dictionary
word_freq = WordFreq({bw: 1})
# compute frequency
pairs = self._make_pairs(word_freq)
# extract keys
pairs = pairs.keys()
# find the pairs available in the learned operations
            match_items = [(pair, self._merges[pair]) for pair in pairs if pair in self._merges]
# continue until there are pairs to merge
while len(match_items) != 0:
items, indices = zip(*match_items)
                # choose the earliest-learned merge (dict values record the learning order;
                # earlier merges were the most frequent during training)
best_index = indices.index(min(indices))
best_merge = items[best_index]
# merge the best pair
word_freq.merge_pair(best_merge)
# compute frequency
pairs = self._make_pairs(word_freq)
# extract keys
pairs = pairs.keys()
# find the pairs available in the learned operations
                match_items = [(pair, self._merges[pair]) for pair in pairs if pair in self._merges]
            # extract the single remaining word from the dictionary
            words[i] = list(word_freq.keys())[0].decode(encoding=encoding)
# return the joined, encoded words
return ' '.join(words)
def decode(self, word: str, encoding: str = None) -> str:
"""
Helps to decode an encoded word.
:param word: a word to decode
:param encoding: original encoding of the string, if not provided,
the one from the constructor will be used (or the default one -> utf8)
:return: the decoded word
"""
# get or set encoding
encoding = encoding if encoding else self._enc
# convert the given string to bytes
b_word = bytes(word, encoding=encoding)
re_sow = re.escape(self._sow)
re_eow = re.escape(self._eow)
# create a regex object with positive lookbehind and lookahead around our regex string
p = re.compile(rb'(?<=' + re_sow + rb').*?(?=' + re_eow + rb')')
b_words: List[bytes] = p.findall(b_word)
b_words = [b.replace(b' ', b'') for b in b_words]
words = [b.decode(encoding=encoding) for b in b_words]
return ' '.join(words)
@property
def vocab(self) -> Dict[bytes, int]:
return self._vocab.copy()
@property
def merges(self) -> Dict[Tuple[bytes, bytes], int]:
return self._merges.copy()
@property
def sow(self) -> str:
return self._sow.decode(encoding=self._enc)
@property
def eow(self) -> str:
return self._eow.decode(encoding=self._enc)
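# Illustrative usage sketch for BPE (corpus and vocabulary size are hypothetical):
#     corpus = ['low', 'lower', 'newest', 'widest'] * 100
#     bpe = BPE(corpus, max_vocab_size=300)
#     enc = bpe.encode('lowest')              # e.g. '<w/> low est </w>' after training
#     assert bpe.decode(enc) == 'lowest'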
class BPEB(_BPE_template):
"""
Byte Pair Encoder class for byte inputs
    Learns to encode any given word into a series of learned substrings.
"""
def __call__(self, word: bytes, encode: bool = None) -> bytes:
# try guessing what we want (a very basic guess based on sow and eow tokens)
if encode is None:
# try finding Start or End Of Word token
if word.find(self._sow) < 0 or word.find(self._eow) < 0:
return self.encode(word)
else:
return self.decode(word)
if encode:
return self.encode(word)
else:
return self.decode(word)
def __init__(self,
corpus: Iterable[bytes] = None,
max_vocab_size: int = 4096,
sow: bytes = b'<w/>',
eow: bytes = b'</w>'
):
"""
Constructor of Byte Pair Encoding class, will start learning bpe if corpus is provided.
:param corpus: a list of tokens or words
:param max_vocab_size: allowed dictionary size
:param sow: start of word token
:param eow: end of word token
"""
self._max_vocab_size = max_vocab_size
self._sow = sow
self._eow = eow
self._word_freq = None
self._vocab: Dict[bytes, int] = dict()
self._vocab_size: int = 0
self._merges: Dict[Tuple[bytes, bytes], int] = dict()
self.init()
if corpus:
# learn the byte pair encoding
self.learn_encoding(corpus, max_vocab_size)
def learn_encoding(self,
corpus: Iterable[bytes],
max_vocab_size: int = None
):
"""
Starts the learning of byte pair encodings in a given corpus.
:param corpus: a list of tokens or words
:param max_vocab_size: allowed dictionary size
"""
# get or set the maximum size of the vocabulary and bpe merges
max_vocab_size = max_vocab_size if max_vocab_size else self._max_vocab_size
        # split each token byte-by-byte so the bytes can be 'joined' with spaces,
        # adding Start/End Of Word tokens around each token
        if bpelibc:
            iterator_obj = tqdm(corpus, desc='Splitting bytes ...',
                                ncols=100, ascii=' >>>>>>>>>>>>>=', disable=self.mute)
            b_corpus = bpelibc.split_bytes_in_iterable_and_add_boundary(iterator_obj, self._sow, self._eow)
        else:
            # dtype='S1' splits the buffer into single-byte bytes objects that b' '.join accepts
            b_corpus = [
                self._sow + b' ' +
                b' '.join(np.frombuffer(token, dtype='S1')) +
                b' ' + self._eow for token in tqdm(
                    corpus, desc='Splitting bytes ...', ncols=100, ascii=' >>>>>>>>>>>>>=', disable=self.mute
                )]
self._learn_encoding_impl(b_corpus, max_vocab_size)
def encode(self, word: bytes) -> bytes:
"""
Helps to encode a word, with the learned Byte Pair Encoding.
:param word: the word or words to be encoded
:return: the encoded word
"""
# support the case of multiple words
words = word.split(b' ')
b_words = word.split(b' ')
# iterate over the list of words
for i, (w, bw) in enumerate(zip(words, b_words)):
# if the byte word could be found in the vocabulary
if self._sow + bw + self._eow in self._vocab:
words[i] = self._sow + w + self._eow
continue
# split byte by byte
            bw = self._sow + b' ' + b' '.join(np.frombuffer(bw, dtype='S1')) + b' ' + self._eow
# dummy dictionary
word_freq = WordFreq({bw: 1})
# compute frequency
pairs = self._make_pairs(word_freq)
# extract keys
pairs = pairs.keys()
# find the pairs available in the learned operations
            match_items = [(pair, self._merges[pair]) for pair in pairs if pair in self._merges]
# continue until there are pairs to merge
while len(match_items) != 0:
items, indices = zip(*match_items)
                # choose the earliest-learned merge (dict values record the learning order;
                # earlier merges were the most frequent during training)
best_index = indices.index(min(indices))
best_merge = items[best_index]
# merge the best pair
word_freq.merge_pair(best_merge)
# compute frequency
pairs = self._make_pairs(word_freq)
# extract keys
pairs = pairs.keys()
# find the pairs available in the learned operations
                match_items = [(pair, self._merges[pair]) for pair in pairs if pair in self._merges]
            # extract the single remaining word from the dictionary
            words[i] = list(word_freq.keys())[0]
# return the joined, encoded words
return b' '.join(words)
def decode(self, word: bytes) -> bytes:
"""
Helps to decode an encoded word.
:param word: a word to decode
:return: the decoded word
"""
re_sow = re.escape(self._sow)
re_eow = re.escape(self._eow)
# create a regex object with positive lookbehind and lookahead around our regex string
p = re.compile(rb'(?<=' + re_sow + rb').*?(?=' + re_eow + rb')')
b_words: List[bytes] = p.findall(word)
b_words = [b.replace(b' ', b'') for b in b_words]
return b' '.join(b_words)
@property
def vocab(self) -> Dict[bytes, int]:
return self._vocab.copy()
@property
def merges(self) -> Dict[Tuple[bytes, bytes], int]:
return self._merges.copy()
@property
def sow(self) -> bytes:
return self._sow
@property
def eow(self) -> bytes:
return self._eow
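# Illustrative usage sketch for the byte-level variant (inputs are hypothetical):
#     bpeb = BPEB([b'low', b'lower', b'newest'] * 100, max_vocab_size=300)
#     enc = bpeb.encode(b'lowest')            # e.g. b'<w/> low est </w>'
#     assert bpeb.decode(enc) == b'lowest'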
def bigram(corpus: List[str]) -> Dict[Tuple[str, str], int]:
    bgrams = [b for word in corpus for b in zip(word, word[1:])]
bgrams = collections.Counter(bgrams)
return dict(bgrams)
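# Example: bigram(['ab', 'bc']) == {('a', 'b'): 1, ('b', 'c'): 1}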
|
PypiClean
|
/tvb-ext-xircuits-1.1.0.tar.gz/tvb-ext-xircuits-1.1.0/xai_components/xai_tvb_models/stefanescu_jirsa.py
|
import numpy
from tvb.simulator.models.base import Model
from typing import Union
from xai_components.base import xai_component, InArg, OutArg
from xai_components.base_tvb import ComponentWithWidget
from xai_components.utils import print_component_summary, set_values
@xai_component(color='rgb(101, 179, 46)')
class ReducedSetFitzHughNagumo(ComponentWithWidget):
tau: InArg[Union[float, numpy.ndarray]]
a: InArg[Union[float, numpy.ndarray]]
b: InArg[Union[float, numpy.ndarray]]
K11: InArg[Union[float, numpy.ndarray]]
K12: InArg[Union[float, numpy.ndarray]]
K21: InArg[Union[float, numpy.ndarray]]
sigma: InArg[Union[float, numpy.ndarray]]
mu: InArg[Union[float, numpy.ndarray]]
variables_of_interest: InArg[list]
reducedSetFitzHughNagumo: OutArg[Model]
@property
def tvb_ht_class(self):
from tvb.simulator.models.stefanescu_jirsa import ReducedSetFitzHughNagumo
return ReducedSetFitzHughNagumo
def execute(self, ctx) -> None:
reducedSetFitzHughNagumo = self.tvb_ht_class()
set_values(self, reducedSetFitzHughNagumo)
self.reducedSetFitzHughNagumo.value = reducedSetFitzHughNagumo
print_component_summary(self.reducedSetFitzHughNagumo.value)
@xai_component(color='rgb(101, 179, 46)')
class ReducedSetHindmarshRose(ComponentWithWidget):
r: InArg[Union[float, numpy.ndarray]]
a: InArg[Union[float, numpy.ndarray]]
b: InArg[Union[float, numpy.ndarray]]
c: InArg[Union[float, numpy.ndarray]]
d: InArg[Union[float, numpy.ndarray]]
s: InArg[Union[float, numpy.ndarray]]
xo: InArg[Union[float, numpy.ndarray]]
K11: InArg[Union[float, numpy.ndarray]]
K12: InArg[Union[float, numpy.ndarray]]
K21: InArg[Union[float, numpy.ndarray]]
sigma: InArg[Union[float, numpy.ndarray]]
mu: InArg[Union[float, numpy.ndarray]]
variables_of_interest: InArg[list]
reducedSetHindmarshRose: OutArg[Model]
@property
def tvb_ht_class(self):
from tvb.simulator.models.stefanescu_jirsa import ReducedSetHindmarshRose
return ReducedSetHindmarshRose
def execute(self, ctx) -> None:
reducedSetHindmarshRose = self.tvb_ht_class()
set_values(self, reducedSetHindmarshRose)
self.reducedSetHindmarshRose.value = reducedSetHindmarshRose
print_component_summary(self.reducedSetHindmarshRose.value)
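# Illustrative sketch of manual use (hypothetical wiring; normally the Xircuits
# runtime populates the InArg values and calls execute for you):
#     comp = ReducedSetHindmarshRose()
#     comp.K11.value = 0.5
#     comp.execute(ctx=None)
#     model = comp.reducedSetHindmarshRose.value  # a configured TVB Model instance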
|
PypiClean
|
/cdktf_cdktf_provider_datadog-9.0.0-py3-none-any.whl/cdktf_cdktf_provider_datadog/provider/__init__.py
|
import abc
import builtins
import datetime
import enum
import typing
import jsii
import publication
import typing_extensions
from typeguard import check_type
from .._jsii import *
import cdktf as _cdktf_9a9027ec
import constructs as _constructs_77d1e7e8
class DatadogProvider(
_cdktf_9a9027ec.TerraformProvider,
metaclass=jsii.JSIIMeta,
jsii_type="@cdktf/provider-datadog.provider.DatadogProvider",
):
'''Represents a {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs datadog}.'''
def __init__(
self,
scope: _constructs_77d1e7e8.Construct,
id: builtins.str,
*,
alias: typing.Optional[builtins.str] = None,
api_key: typing.Optional[builtins.str] = None,
api_url: typing.Optional[builtins.str] = None,
app_key: typing.Optional[builtins.str] = None,
http_client_retry_backoff_base: typing.Optional[jsii.Number] = None,
http_client_retry_backoff_multiplier: typing.Optional[jsii.Number] = None,
http_client_retry_enabled: typing.Optional[builtins.str] = None,
http_client_retry_max_retries: typing.Optional[jsii.Number] = None,
http_client_retry_timeout: typing.Optional[jsii.Number] = None,
validate: typing.Optional[builtins.str] = None,
) -> None:
'''Create a new {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs datadog} Resource.
:param scope: The scope in which to define this construct.
:param id: The scoped construct ID. Must be unique amongst siblings in the same scope
:param alias: Alias name. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#alias DatadogProvider#alias}
:param api_key: (Required unless validate is false) Datadog API key. This can also be set via the DD_API_KEY environment variable. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_key DatadogProvider#api_key}
:param api_url: The API URL. This can also be set via the DD_HOST environment variable. Note that this URL must not end with the ``/api/`` path. For example, ``https://api.datadoghq.com/`` is a correct value, while ``https://api.datadoghq.com/api/`` is not. And if you're working with "EU" version of Datadog, use ``https://api.datadoghq.eu/``. Other Datadog region examples: ``https://api.us5.datadoghq.com/``, ``https://api.us3.datadoghq.com/`` and ``https://api.ddog-gov.com/``. See https://docs.datadoghq.com/getting_started/site/ for all available regions. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_url DatadogProvider#api_url}
:param app_key: (Required unless validate is false) Datadog APP key. This can also be set via the DD_APP_KEY environment variable. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#app_key DatadogProvider#app_key}
:param http_client_retry_backoff_base: The HTTP request retry back off base. Defaults to 2. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_base DatadogProvider#http_client_retry_backoff_base}
:param http_client_retry_backoff_multiplier: The HTTP request retry back off multiplier. Defaults to 2. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_multiplier DatadogProvider#http_client_retry_backoff_multiplier}
:param http_client_retry_enabled: Enables request retries on HTTP status codes 429 and 5xx. Valid values are [``true``, ``false``]. Defaults to ``true``. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_enabled DatadogProvider#http_client_retry_enabled}
:param http_client_retry_max_retries: The HTTP request maximum retry number. Defaults to 3. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_max_retries DatadogProvider#http_client_retry_max_retries}
:param http_client_retry_timeout: The HTTP request retry timeout period. Defaults to 60 seconds. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_timeout DatadogProvider#http_client_retry_timeout}
:param validate: Enables validation of the provided API key during provider initialization. Valid values are [``true``, ``false``]. Default is true. When false, api_key won't be checked. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#validate DatadogProvider#validate}
'''
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__0805d396d4fc3c279b51fac738ada9fac3aa1057e300db07efda8efdcdde344e)
check_type(argname="argument scope", value=scope, expected_type=type_hints["scope"])
check_type(argname="argument id", value=id, expected_type=type_hints["id"])
config = DatadogProviderConfig(
alias=alias,
api_key=api_key,
api_url=api_url,
app_key=app_key,
http_client_retry_backoff_base=http_client_retry_backoff_base,
http_client_retry_backoff_multiplier=http_client_retry_backoff_multiplier,
http_client_retry_enabled=http_client_retry_enabled,
http_client_retry_max_retries=http_client_retry_max_retries,
http_client_retry_timeout=http_client_retry_timeout,
validate=validate,
)
jsii.create(self.__class__, self, [scope, id, config])
@jsii.member(jsii_name="resetAlias")
def reset_alias(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetAlias", []))
@jsii.member(jsii_name="resetApiKey")
def reset_api_key(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetApiKey", []))
@jsii.member(jsii_name="resetApiUrl")
def reset_api_url(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetApiUrl", []))
@jsii.member(jsii_name="resetAppKey")
def reset_app_key(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetAppKey", []))
@jsii.member(jsii_name="resetHttpClientRetryBackoffBase")
def reset_http_client_retry_backoff_base(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetHttpClientRetryBackoffBase", []))
@jsii.member(jsii_name="resetHttpClientRetryBackoffMultiplier")
def reset_http_client_retry_backoff_multiplier(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetHttpClientRetryBackoffMultiplier", []))
@jsii.member(jsii_name="resetHttpClientRetryEnabled")
def reset_http_client_retry_enabled(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetHttpClientRetryEnabled", []))
@jsii.member(jsii_name="resetHttpClientRetryMaxRetries")
def reset_http_client_retry_max_retries(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetHttpClientRetryMaxRetries", []))
@jsii.member(jsii_name="resetHttpClientRetryTimeout")
def reset_http_client_retry_timeout(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetHttpClientRetryTimeout", []))
@jsii.member(jsii_name="resetValidate")
def reset_validate(self) -> None:
return typing.cast(None, jsii.invoke(self, "resetValidate", []))
@jsii.member(jsii_name="synthesizeAttributes")
def _synthesize_attributes(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "synthesizeAttributes", []))
@jsii.python.classproperty
@jsii.member(jsii_name="tfResourceType")
def TF_RESOURCE_TYPE(cls) -> builtins.str:
return typing.cast(builtins.str, jsii.sget(cls, "tfResourceType"))
@builtins.property
@jsii.member(jsii_name="aliasInput")
def alias_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "aliasInput"))
@builtins.property
@jsii.member(jsii_name="apiKeyInput")
def api_key_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "apiKeyInput"))
@builtins.property
@jsii.member(jsii_name="apiUrlInput")
def api_url_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "apiUrlInput"))
@builtins.property
@jsii.member(jsii_name="appKeyInput")
def app_key_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "appKeyInput"))
@builtins.property
@jsii.member(jsii_name="httpClientRetryBackoffBaseInput")
def http_client_retry_backoff_base_input(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryBackoffBaseInput"))
@builtins.property
@jsii.member(jsii_name="httpClientRetryBackoffMultiplierInput")
def http_client_retry_backoff_multiplier_input(
self,
) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryBackoffMultiplierInput"))
@builtins.property
@jsii.member(jsii_name="httpClientRetryEnabledInput")
def http_client_retry_enabled_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "httpClientRetryEnabledInput"))
@builtins.property
@jsii.member(jsii_name="httpClientRetryMaxRetriesInput")
def http_client_retry_max_retries_input(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryMaxRetriesInput"))
@builtins.property
@jsii.member(jsii_name="httpClientRetryTimeoutInput")
def http_client_retry_timeout_input(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryTimeoutInput"))
@builtins.property
@jsii.member(jsii_name="validateInput")
def validate_input(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "validateInput"))
@builtins.property
@jsii.member(jsii_name="alias")
def alias(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "alias"))
@alias.setter
def alias(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__41c6a45870fdb947f3ddde847982a16aa4f1f568f90c0090891bcb1ee998ba63)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "alias", value)
@builtins.property
@jsii.member(jsii_name="apiKey")
def api_key(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "apiKey"))
@api_key.setter
def api_key(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__ee1caef371827550f1c55b1ba8b19bb96a191d14b34fa991469a985a09dc4e77)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "apiKey", value)
@builtins.property
@jsii.member(jsii_name="apiUrl")
def api_url(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "apiUrl"))
@api_url.setter
def api_url(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__e52d452c54d0316f9b9282499928ab8d6656b1d0a66772d66565d0efc2ccfb4f)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "apiUrl", value)
@builtins.property
@jsii.member(jsii_name="appKey")
def app_key(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "appKey"))
@app_key.setter
def app_key(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__cc7f9a1134524ccf17343d743bf37c227cbfb744f6493c0fa0f0b472b5882088)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "appKey", value)
@builtins.property
@jsii.member(jsii_name="httpClientRetryBackoffBase")
def http_client_retry_backoff_base(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryBackoffBase"))
@http_client_retry_backoff_base.setter
def http_client_retry_backoff_base(
self,
value: typing.Optional[jsii.Number],
) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__52aee4624da24aea1acb1552acce2042bfe7eb8670eef2c1a7f00b77314ac916)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "httpClientRetryBackoffBase", value)
@builtins.property
@jsii.member(jsii_name="httpClientRetryBackoffMultiplier")
def http_client_retry_backoff_multiplier(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryBackoffMultiplier"))
@http_client_retry_backoff_multiplier.setter
def http_client_retry_backoff_multiplier(
self,
value: typing.Optional[jsii.Number],
) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__b6b79cf228c79ca64f06762c5f536370611bc19c760e1a92bbd54f82ab46e4f4)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "httpClientRetryBackoffMultiplier", value)
@builtins.property
@jsii.member(jsii_name="httpClientRetryEnabled")
def http_client_retry_enabled(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "httpClientRetryEnabled"))
@http_client_retry_enabled.setter
def http_client_retry_enabled(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__b21e0d0de23a863bd7a81f5e1da490ba483abf1d7f5fa3ca794043556d49537a)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "httpClientRetryEnabled", value)
@builtins.property
@jsii.member(jsii_name="httpClientRetryMaxRetries")
def http_client_retry_max_retries(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryMaxRetries"))
@http_client_retry_max_retries.setter
def http_client_retry_max_retries(
self,
value: typing.Optional[jsii.Number],
) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__c43b2b70794dd23b6193b86230dd086d39a380ffb516505934c1f8ef002f6a95)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "httpClientRetryMaxRetries", value)
@builtins.property
@jsii.member(jsii_name="httpClientRetryTimeout")
def http_client_retry_timeout(self) -> typing.Optional[jsii.Number]:
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "httpClientRetryTimeout"))
@http_client_retry_timeout.setter
def http_client_retry_timeout(self, value: typing.Optional[jsii.Number]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__711a4fe27325ad950c40c539b130cfe134e821ab9c2c2e5188263f30f2e2733a)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "httpClientRetryTimeout", value)
@builtins.property
@jsii.member(jsii_name="validate")
def validate(self) -> typing.Optional[builtins.str]:
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "validate"))
@validate.setter
def validate(self, value: typing.Optional[builtins.str]) -> None:
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__85e8d0f8795f2588aedb2c310fb93daff65345b1a56d3fb6f1156b175f757dc7)
check_type(argname="argument value", value=value, expected_type=type_hints["value"])
jsii.set(self, "validate", value)
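# Illustrative usage sketch (stack name and credentials are placeholders; the provider
# is typically instantiated once per TerraformStack):
#     from cdktf import App, TerraformStack
#     app = App()
#     stack = TerraformStack(app, "example")
#     DatadogProvider(stack, "datadog", api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")
#     app.synth()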
@jsii.data_type(
jsii_type="@cdktf/provider-datadog.provider.DatadogProviderConfig",
jsii_struct_bases=[],
name_mapping={
"alias": "alias",
"api_key": "apiKey",
"api_url": "apiUrl",
"app_key": "appKey",
"http_client_retry_backoff_base": "httpClientRetryBackoffBase",
"http_client_retry_backoff_multiplier": "httpClientRetryBackoffMultiplier",
"http_client_retry_enabled": "httpClientRetryEnabled",
"http_client_retry_max_retries": "httpClientRetryMaxRetries",
"http_client_retry_timeout": "httpClientRetryTimeout",
"validate": "validate",
},
)
class DatadogProviderConfig:
def __init__(
self,
*,
alias: typing.Optional[builtins.str] = None,
api_key: typing.Optional[builtins.str] = None,
api_url: typing.Optional[builtins.str] = None,
app_key: typing.Optional[builtins.str] = None,
http_client_retry_backoff_base: typing.Optional[jsii.Number] = None,
http_client_retry_backoff_multiplier: typing.Optional[jsii.Number] = None,
http_client_retry_enabled: typing.Optional[builtins.str] = None,
http_client_retry_max_retries: typing.Optional[jsii.Number] = None,
http_client_retry_timeout: typing.Optional[jsii.Number] = None,
validate: typing.Optional[builtins.str] = None,
) -> None:
'''
:param alias: Alias name. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#alias DatadogProvider#alias}
:param api_key: (Required unless validate is false) Datadog API key. This can also be set via the DD_API_KEY environment variable. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_key DatadogProvider#api_key}
:param api_url: The API URL. This can also be set via the DD_HOST environment variable. Note that this URL must not end with the ``/api/`` path. For example, ``https://api.datadoghq.com/`` is a correct value, while ``https://api.datadoghq.com/api/`` is not. And if you're working with "EU" version of Datadog, use ``https://api.datadoghq.eu/``. Other Datadog region examples: ``https://api.us5.datadoghq.com/``, ``https://api.us3.datadoghq.com/`` and ``https://api.ddog-gov.com/``. See https://docs.datadoghq.com/getting_started/site/ for all available regions. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_url DatadogProvider#api_url}
:param app_key: (Required unless validate is false) Datadog APP key. This can also be set via the DD_APP_KEY environment variable. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#app_key DatadogProvider#app_key}
:param http_client_retry_backoff_base: The HTTP request retry back off base. Defaults to 2. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_base DatadogProvider#http_client_retry_backoff_base}
:param http_client_retry_backoff_multiplier: The HTTP request retry back off multiplier. Defaults to 2. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_multiplier DatadogProvider#http_client_retry_backoff_multiplier}
:param http_client_retry_enabled: Enables request retries on HTTP status codes 429 and 5xx. Valid values are [``true``, ``false``]. Defaults to ``true``. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_enabled DatadogProvider#http_client_retry_enabled}
:param http_client_retry_max_retries: The HTTP request maximum retry number. Defaults to 3. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_max_retries DatadogProvider#http_client_retry_max_retries}
:param http_client_retry_timeout: The HTTP request retry timeout period. Defaults to 60 seconds. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_timeout DatadogProvider#http_client_retry_timeout}
:param validate: Enables validation of the provided API key during provider initialization. Valid values are [``true``, ``false``]. Default is true. When false, api_key won't be checked. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#validate DatadogProvider#validate}
'''
if __debug__:
type_hints = typing.get_type_hints(_typecheckingstub__f0627dc31b14e1dd1b33388294ae7ff9183e0249175790cf0444bf682521e659)
check_type(argname="argument alias", value=alias, expected_type=type_hints["alias"])
check_type(argname="argument api_key", value=api_key, expected_type=type_hints["api_key"])
check_type(argname="argument api_url", value=api_url, expected_type=type_hints["api_url"])
check_type(argname="argument app_key", value=app_key, expected_type=type_hints["app_key"])
check_type(argname="argument http_client_retry_backoff_base", value=http_client_retry_backoff_base, expected_type=type_hints["http_client_retry_backoff_base"])
check_type(argname="argument http_client_retry_backoff_multiplier", value=http_client_retry_backoff_multiplier, expected_type=type_hints["http_client_retry_backoff_multiplier"])
check_type(argname="argument http_client_retry_enabled", value=http_client_retry_enabled, expected_type=type_hints["http_client_retry_enabled"])
check_type(argname="argument http_client_retry_max_retries", value=http_client_retry_max_retries, expected_type=type_hints["http_client_retry_max_retries"])
check_type(argname="argument http_client_retry_timeout", value=http_client_retry_timeout, expected_type=type_hints["http_client_retry_timeout"])
check_type(argname="argument validate", value=validate, expected_type=type_hints["validate"])
self._values: typing.Dict[builtins.str, typing.Any] = {}
if alias is not None:
self._values["alias"] = alias
if api_key is not None:
self._values["api_key"] = api_key
if api_url is not None:
self._values["api_url"] = api_url
if app_key is not None:
self._values["app_key"] = app_key
if http_client_retry_backoff_base is not None:
self._values["http_client_retry_backoff_base"] = http_client_retry_backoff_base
if http_client_retry_backoff_multiplier is not None:
self._values["http_client_retry_backoff_multiplier"] = http_client_retry_backoff_multiplier
if http_client_retry_enabled is not None:
self._values["http_client_retry_enabled"] = http_client_retry_enabled
if http_client_retry_max_retries is not None:
self._values["http_client_retry_max_retries"] = http_client_retry_max_retries
if http_client_retry_timeout is not None:
self._values["http_client_retry_timeout"] = http_client_retry_timeout
if validate is not None:
self._values["validate"] = validate
@builtins.property
def alias(self) -> typing.Optional[builtins.str]:
'''Alias name.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#alias DatadogProvider#alias}
'''
result = self._values.get("alias")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def api_key(self) -> typing.Optional[builtins.str]:
'''(Required unless validate is false) Datadog API key. This can also be set via the DD_API_KEY environment variable.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_key DatadogProvider#api_key}
'''
result = self._values.get("api_key")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def api_url(self) -> typing.Optional[builtins.str]:
'''The API URL.
This can also be set via the DD_HOST environment variable. Note that this URL must not end with the ``/api/`` path. For example, ``https://api.datadoghq.com/`` is a correct value, while ``https://api.datadoghq.com/api/`` is not. And if you're working with "EU" version of Datadog, use ``https://api.datadoghq.eu/``. Other Datadog region examples: ``https://api.us5.datadoghq.com/``, ``https://api.us3.datadoghq.com/`` and ``https://api.ddog-gov.com/``. See https://docs.datadoghq.com/getting_started/site/ for all available regions.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#api_url DatadogProvider#api_url}
'''
result = self._values.get("api_url")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def app_key(self) -> typing.Optional[builtins.str]:
'''(Required unless validate is false) Datadog APP key. This can also be set via the DD_APP_KEY environment variable.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#app_key DatadogProvider#app_key}
'''
result = self._values.get("app_key")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def http_client_retry_backoff_base(self) -> typing.Optional[jsii.Number]:
'''The HTTP request retry back off base. Defaults to 2.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_base DatadogProvider#http_client_retry_backoff_base}
'''
result = self._values.get("http_client_retry_backoff_base")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def http_client_retry_backoff_multiplier(self) -> typing.Optional[jsii.Number]:
'''The HTTP request retry back off multiplier. Defaults to 2.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_backoff_multiplier DatadogProvider#http_client_retry_backoff_multiplier}
'''
result = self._values.get("http_client_retry_backoff_multiplier")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def http_client_retry_enabled(self) -> typing.Optional[builtins.str]:
'''Enables request retries on HTTP status codes 429 and 5xx. Valid values are [``true``, ``false``]. Defaults to ``true``.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_enabled DatadogProvider#http_client_retry_enabled}
'''
result = self._values.get("http_client_retry_enabled")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def http_client_retry_max_retries(self) -> typing.Optional[jsii.Number]:
'''The HTTP request maximum retry number. Defaults to 3.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_max_retries DatadogProvider#http_client_retry_max_retries}
'''
result = self._values.get("http_client_retry_max_retries")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def http_client_retry_timeout(self) -> typing.Optional[jsii.Number]:
'''The HTTP request retry timeout period. Defaults to 60 seconds.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#http_client_retry_timeout DatadogProvider#http_client_retry_timeout}
'''
result = self._values.get("http_client_retry_timeout")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def validate(self) -> typing.Optional[builtins.str]:
'''Enables validation of the provided API key during provider initialization.
Valid values are [``true``, ``false``]. Default is true. When false, api_key won't be checked.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/datadog/datadog/3.29.0/docs#validate DatadogProvider#validate}
'''
result = self._values.get("validate")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "DatadogProviderConfig(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
__all__ = [
"DatadogProvider",
"DatadogProviderConfig",
]
publication.publish()
def _typecheckingstub__0805d396d4fc3c279b51fac738ada9fac3aa1057e300db07efda8efdcdde344e(
scope: _constructs_77d1e7e8.Construct,
id: builtins.str,
*,
alias: typing.Optional[builtins.str] = None,
api_key: typing.Optional[builtins.str] = None,
api_url: typing.Optional[builtins.str] = None,
app_key: typing.Optional[builtins.str] = None,
http_client_retry_backoff_base: typing.Optional[jsii.Number] = None,
http_client_retry_backoff_multiplier: typing.Optional[jsii.Number] = None,
http_client_retry_enabled: typing.Optional[builtins.str] = None,
http_client_retry_max_retries: typing.Optional[jsii.Number] = None,
http_client_retry_timeout: typing.Optional[jsii.Number] = None,
validate: typing.Optional[builtins.str] = None,
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__41c6a45870fdb947f3ddde847982a16aa4f1f568f90c0090891bcb1ee998ba63(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__ee1caef371827550f1c55b1ba8b19bb96a191d14b34fa991469a985a09dc4e77(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__e52d452c54d0316f9b9282499928ab8d6656b1d0a66772d66565d0efc2ccfb4f(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__cc7f9a1134524ccf17343d743bf37c227cbfb744f6493c0fa0f0b472b5882088(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__52aee4624da24aea1acb1552acce2042bfe7eb8670eef2c1a7f00b77314ac916(
value: typing.Optional[jsii.Number],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__b6b79cf228c79ca64f06762c5f536370611bc19c760e1a92bbd54f82ab46e4f4(
value: typing.Optional[jsii.Number],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__b21e0d0de23a863bd7a81f5e1da490ba483abf1d7f5fa3ca794043556d49537a(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__c43b2b70794dd23b6193b86230dd086d39a380ffb516505934c1f8ef002f6a95(
value: typing.Optional[jsii.Number],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__711a4fe27325ad950c40c539b130cfe134e821ab9c2c2e5188263f30f2e2733a(
value: typing.Optional[jsii.Number],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__85e8d0f8795f2588aedb2c310fb93daff65345b1a56d3fb6f1156b175f757dc7(
value: typing.Optional[builtins.str],
) -> None:
"""Type checking stubs"""
pass
def _typecheckingstub__f0627dc31b14e1dd1b33388294ae7ff9183e0249175790cf0444bf682521e659(
*,
alias: typing.Optional[builtins.str] = None,
api_key: typing.Optional[builtins.str] = None,
api_url: typing.Optional[builtins.str] = None,
app_key: typing.Optional[builtins.str] = None,
http_client_retry_backoff_base: typing.Optional[jsii.Number] = None,
http_client_retry_backoff_multiplier: typing.Optional[jsii.Number] = None,
http_client_retry_enabled: typing.Optional[builtins.str] = None,
http_client_retry_max_retries: typing.Optional[jsii.Number] = None,
http_client_retry_timeout: typing.Optional[jsii.Number] = None,
validate: typing.Optional[builtins.str] = None,
) -> None:
"""Type checking stubs"""
pass
|
PypiClean
|
/FreePyBX-1.0-RC1.tar.gz/FreePyBX-1.0-RC1/freepybx/public/js/dojox/widget/DocTester.js.uncompressed.js
|
define("dojox/widget/DocTester", ["dijit","dojo","dojox","dojo/require!dojo/string,dijit/_Widget,dijit/_Templated,dojox/form/BusyButton,dojox/testing/DocTest"], function(dijit,dojo,dojox){
dojo.provide("dojox.widget.DocTester");
dojo.require("dojo.string");
dojo.require("dijit._Widget");
dojo.require("dijit._Templated");
dojo.require("dojox.form.BusyButton");
dojo.require("dojox.testing.DocTest");
dojo.declare('dojox.widget.DocTester',
[dijit._Widget, dijit._Templated],
{
// summary: A widget to run DocTests inside an HTML page.
//
templateString: dojo.cache("dojox.widget", "DocTester/DocTester.html", "<div dojoAttachPoint=\"domNode\" class=\"dojoxDocTester\">\n\t<div dojoAttachPoint=\"containerNode\"></div>\n\t<button dojoType=\"dojox.form.BusyButton\" busyLabel=\"Testing...\" dojoAttachPoint=\"runButtonNode\">Run tests</button>\n\t<button dojoType=\"dijit.form.Button\" dojoAttachPoint=\"resetButtonNode\" style=\"display:none;\">Reset</button>\n\t<span>\n\t\t<span dojoAttachPoint=\"numTestsNode\">0</span> tests,\n\t\t<span dojoAttachPoint=\"numTestsOkNode\">0</span> passed,\n\t\t<span dojoAttachPoint=\"numTestsNokNode\">0</span> failed\n\t</span>\n</div>"),
widgetsInTemplate: true,
_fillContent:function(/*DomNode*/source){
		// summary: Overridden from _Templated.js, which actually just takes care of filling the containerNode.
var src = source.innerHTML;
this.doctests = new dojox.testing.DocTest();
this.tests = this.doctests.getTestsFromString(this._unescapeHtml(src));
var lineNumbers = dojo.map(this.tests, 'return item.line-1');
var lines = src.split("\n");
var actualResultHtml = '<div class="actualResult">FAILED, actual result was: <span class="result"></span></div>';
var content = '<pre class="testCase testNum0 odd">';
for (var i=0; i<lines.length; i++){
var index = dojo.indexOf(lineNumbers, i);
if (index>0 && index!=-1){
var evenOdd = index%2 ? "even" : "odd";
content += actualResultHtml;
content += '</pre><pre class="testCase testNum'+ index +' '+evenOdd+'">';
}
content += lines[i].replace(/^\s+/, "")+"\n";
}
content += actualResultHtml + '</pre>';
this.containerNode.innerHTML = content;
},
postCreate:function(){
this.inherited("postCreate", arguments);
dojo.connect(this.runButtonNode, "onClick", dojo.hitch(this, "runTests"));
dojo.connect(this.resetButtonNode, "onClick", dojo.hitch(this, "reset"));
this.numTestsNode.innerHTML = this.tests.length;
},
runTests:function(){
var results = {ok:0, nok:0};
for (var i=0; i<this.tests.length; i++){
var ret = this.doctests.runTest(this.tests[i].commands, this.tests[i].expectedResult);
dojo.query(".testNum"+i, this.domNode).addClass(ret.success ? "resultOk" : "resultNok");
if (!ret.success){
results.nok++;
this.numTestsNokNode.innerHTML = results.nok;
var act = dojo.query(".testNum"+i+" .actualResult", this.domNode)[0];
dojo.style(act, "display", "inline");
dojo.query(".result", act)[0].innerHTML = dojo.toJson(ret.actualResult);
} else {
results.ok++;
this.numTestsOkNode.innerHTML = results.ok;
}
}
this.runButtonNode.cancel();
dojo.style(this.runButtonNode.domNode, "display", "none");
dojo.style(this.resetButtonNode.domNode, "display", "");
},
reset:function(){
// summary: Reset the DocTester visuals and enable the "Run tests" button again.
dojo.style(this.runButtonNode.domNode, "display", "");
dojo.style(this.resetButtonNode.domNode, "display", "none");
this.numTestsOkNode.innerHTML = "0";
this.numTestsNokNode.innerHTML = "0";
dojo.query(".actualResult", this.domNode).style("display", "none");
dojo.query(".testCase", this.domNode).removeClass("resultOk").removeClass("resultNok");
},
_unescapeHtml:function(/*string*/str){
// TODO Should become dojo.html.unentities() or so, when exists use instead
		// summary:
		//		Removes escape sequences for special characters in XML: &<>"
		str = String(str).replace(/&amp;/gm, "&").replace(/&lt;/gm, "<")
			.replace(/&gt;/gm, ">").replace(/&quot;/gm, '"');
return str; // string
}
}
);
});
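// Illustrative usage sketch (markup is hypothetical): the widget parses doctest-style
// ">>>" examples from its content, runs them via dojox.testing.DocTest, and colors
// each test case by its result:
//
//   <div dojoType="dojox.widget.DocTester">
//     >>> 1+1
//     2
//   </div>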
|
PypiClean
|
/langchain_ibis-0.0.100-py3-none-any.whl/langchain/document_loaders/notebook.py
|
import json
from pathlib import Path
from typing import Any, List
from langchain.docstore.document import Document
from langchain.document_loaders.base import BaseLoader
def concatenate_cells(
cell: dict, include_outputs: bool, max_output_length: int, traceback: bool
) -> str:
    """Combine cell information into a readable format ready to be used."""
cell_type = cell["cell_type"]
source = cell["source"]
output = cell["outputs"]
if include_outputs and cell_type == "code" and output:
if "ename" in output[0].keys():
error_name = output[0]["ename"]
error_value = output[0]["evalue"]
if traceback:
traceback = output[0]["traceback"]
return (
f"'{cell_type}' cell: '{source}'\n, gives error '{error_name}',"
f" with description '{error_value}'\n"
f"and traceback '{traceback}'\n\n"
)
else:
return (
f"'{cell_type}' cell: '{source}'\n, gives error '{error_name}',"
f"with description '{error_value}'\n\n"
)
elif output[0]["output_type"] == "stream":
output = output[0]["text"]
min_output = min(max_output_length, len(output))
return (
f"'{cell_type}' cell: '{source}'\n with "
f"output: '{output[:min_output]}'\n\n"
)
else:
return f"'{cell_type}' cell: '{source}'\n\n"
return ""
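# Example (hypothetical cell): a code cell 'print(1)' with stream output '1\n' renders
# as "'code' cell: 'print(1)'\n with output: '1\n'\n\n" when include_outputs is True.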
def remove_newlines(x: Any) -> Any:
    """Recursively remove newlines, whatever data structure they are stored in."""
import pandas as pd
if isinstance(x, str):
return x.replace("\n", "")
elif isinstance(x, list):
return [remove_newlines(elem) for elem in x]
elif isinstance(x, pd.DataFrame):
return x.applymap(remove_newlines)
else:
return x
class NotebookLoader(BaseLoader):
"""Loader that loads .ipynb notebook files."""
def __init__(
self,
path: str,
include_outputs: bool = False,
max_output_length: int = 10,
remove_newline: bool = False,
traceback: bool = False,
):
"""Initialize with path."""
self.file_path = path
self.include_outputs = include_outputs
self.max_output_length = max_output_length
self.remove_newline = remove_newline
self.traceback = traceback
def load(
self,
) -> List[Document]:
"""Load documents."""
try:
import pandas as pd
except ImportError:
raise ValueError(
"pandas is needed for Notebook Loader, "
"please install with `pip install pandas`"
)
p = Path(self.file_path)
with open(p, encoding="utf8") as f:
d = json.load(f)
data = pd.json_normalize(d["cells"])
filtered_data = data[["cell_type", "source", "outputs"]]
if self.remove_newline:
filtered_data = filtered_data.applymap(remove_newlines)
text = filtered_data.apply(
lambda x: concatenate_cells(
x, self.include_outputs, self.max_output_length, self.traceback
),
axis=1,
).str.cat(sep=" ")
metadata = {"source": str(p)}
return [Document(page_content=text, metadata=metadata)]
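# Illustrative usage sketch (the notebook path is hypothetical):
#     loader = NotebookLoader("example.ipynb", include_outputs=True, max_output_length=20)
#     docs = loader.load()  # one Document whose page_content concatenates all cells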
|
PypiClean
|
/great_expectations_cta-0.15.43.tar.gz/great_expectations_cta-0.15.43/great_expectations/rule_based_profiler/parameter_builder/simple_date_format_string_parameter_builder.py
|
from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Union
import great_expectations.exceptions as ge_exceptions
from great_expectations.core.domain import Domain
from great_expectations.rule_based_profiler.attributed_resolved_metrics import (
AttributedResolvedMetrics,
)
from great_expectations.rule_based_profiler.config import ParameterBuilderConfig
from great_expectations.rule_based_profiler.helpers.util import (
NP_EPSILON,
get_parameter_value_and_validate_return_type,
)
from great_expectations.rule_based_profiler.metric_computation_result import (
MetricComputationResult,
MetricValues,
)
from great_expectations.rule_based_profiler.parameter_builder import ParameterBuilder
from great_expectations.rule_based_profiler.parameter_container import (
FULLY_QUALIFIED_PARAMETER_NAME_METADATA_KEY,
FULLY_QUALIFIED_PARAMETER_NAME_VALUE_KEY,
ParameterContainer,
)
from great_expectations.types.attributes import Attributes
if TYPE_CHECKING:
from great_expectations.data_context.data_context.abstract_data_context import (
AbstractDataContext,
)
logger = logging.getLogger(__name__)
DEFAULT_CANDIDATE_STRINGS: Set[str] = {
"%H:%M:%S",
"%H:%M:%S,%f",
"%H:%M:%S.%f",
"%Y %b %d %H:%M:%S.%f",
"%Y %b %d %H:%M:%S.%f %Z",
"%Y %b %d %H:%M:%S.%f*%Z",
"%Y%m%d %H:%M:%S.%f",
"%Y-%m-%d",
"%Y-%m-%d %H:%M:%S",
"%Y-%m-%d %H:%M:%S %z",
"%Y-%m-%d %H:%M:%S%z",
"%Y-%m-%d %H:%M:%S,%f",
"%Y-%m-%d %H:%M:%S,%f%z",
"%Y-%m-%d %H:%M:%S.%f",
"%Y-%m-%d %H:%M:%S.%f%z",
"%Y-%m-%d'T'%H:%M:%S",
"%Y-%m-%d'T'%H:%M:%S%z",
"%Y-%m-%d'T'%H:%M:%S'%z'",
"%Y-%m-%d'T'%H:%M:%S.%f",
"%Y-%m-%d'T'%H:%M:%S.%f'%z'",
"%Y-%m-%d*%H:%M:%S",
"%Y-%m-%d*%H:%M:%S:%f",
"%Y-%m-%dT%z",
"%Y/%m/%d",
"%Y/%m/%d*%H:%M:%S",
"%b %d %H:%M:%S",
"%b %d %H:%M:%S %Y",
"%b %d %H:%M:%S %z",
"%b %d %H:%M:%S %z %Y",
"%b %d %Y %H:%M:%S",
"%b %d, %Y %H:%M:%S %p",
"%d %b %Y %H:%M:%S",
"%d %b %Y %H:%M:%S*%f",
"%d-%b-%Y %H:%M:%S",
"%d-%b-%Y %H:%M:%S.%f",
"%d-%m-%Y",
"%d/%b %H:%M:%S,%f",
"%d/%b/%Y %H:%M:%S",
"%d/%b/%Y:%H:%M:%S",
"%d/%b/%Y:%H:%M:%S %z",
"%d/%m/%Y",
"%m%d_%H:%M:%S",
"%m%d_%H:%M:%S.%f",
"%m-%d-%Y",
"%m/%d/%Y",
"%m/%d/%Y %H:%M:%S %p",
"%m/%d/%Y %H:%M:%S %p:%f",
"%m/%d/%Y %H:%M:%S %z",
"%m/%d/%Y*%H:%M:%S",
"%m/%d/%Y*%H:%M:%S*%f",
"%m/%d/%y %H:%M:%S %z",
"%m/%d/%y*%H:%M:%S",
"%y%m%d %H:%M:%S",
"%y-%m-%d",
"%y-%m-%d %H:%M:%S",
"%y-%m-%d %H:%M:%S,%f",
"%y-%m-%d %H:%M:%S,%f %z",
"%y/%m/%d",
"%y/%m/%d %H:%M:%S",
}
class SimpleDateFormatStringParameterBuilder(ParameterBuilder):
"""
Detects the domain date format from a set of candidate date format strings by computing the
column_values.match_strftime_format.unexpected_count metric for each candidate format and returning the format that
has the lowest unexpected_count ratio.
"""
def __init__(
self,
name: str,
metric_domain_kwargs: Optional[Union[str, dict]] = None,
metric_value_kwargs: Optional[Union[str, dict]] = None,
threshold: Union[str, float] = 1.0,
candidate_strings: Optional[Union[Iterable[str], str]] = None,
evaluation_parameter_builder_configs: Optional[
List[ParameterBuilderConfig]
] = None,
data_context: Optional[AbstractDataContext] = None,
) -> None:
"""
Configure this SimpleDateFormatStringParameterBuilder
Args:
name: the name of this parameter -- this is user-specified parameter name (from configuration);
it is not the fully-qualified parameter name; a fully-qualified parameter name must start with "$parameter."
and may contain one or more subsequent parts (e.g., "$parameter.<my_param_from_config>.<metric_name>").
metric_domain_kwargs: used in MetricConfiguration
metric_value_kwargs: used in MetricConfiguration
threshold: the ratio of values that must match a format string for it to be accepted
candidate_strings: a list of candidate date format strings that will replace the default
evaluation_parameter_builder_configs: ParameterBuilder configurations, executing and making whose respective
ParameterBuilder objects' outputs available (as fully-qualified parameter names) is pre-requisite.
These "ParameterBuilder" configurations help build parameters needed for this "ParameterBuilder".
data_context: AbstractDataContext associated with this ParameterBuilder
"""
super().__init__(
name=name,
evaluation_parameter_builder_configs=evaluation_parameter_builder_configs,
data_context=data_context,
)
self._metric_domain_kwargs = metric_domain_kwargs
self._metric_value_kwargs = metric_value_kwargs
self._threshold = threshold
if candidate_strings is not None and isinstance(candidate_strings, list):
self._candidate_strings = set(candidate_strings)
else:
self._candidate_strings = DEFAULT_CANDIDATE_STRINGS
@property
def metric_domain_kwargs(self) -> Optional[Union[str, dict]]:
return self._metric_domain_kwargs
@property
def metric_value_kwargs(self) -> Optional[Union[str, dict]]:
return self._metric_value_kwargs
@metric_value_kwargs.setter
def metric_value_kwargs(self, value: Optional[Union[str, dict]]) -> None:
self._metric_value_kwargs = value
@property
def threshold(self) -> Union[str, float]:
return self._threshold
@property
def candidate_strings(
self,
) -> Union[str, Union[List[str], Set[str]]]:
return self._candidate_strings
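    # Illustrative outcome of the selection math below (hypothetical numbers): with 100
    # non-null values of which 3 fail "%Y-%m-%d", success_ratio = (100 - 3) / (100 +
    # NP_EPSILON) ≈ 0.97, below the default threshold of 1.0, so that format string
    # would not be selected.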
def _build_parameters(
self,
domain: Domain,
variables: Optional[ParameterContainer] = None,
parameters: Optional[Dict[str, ParameterContainer]] = None,
recompute_existing_parameter_values: bool = False,
) -> Attributes:
"""
Builds ParameterContainer object that holds ParameterNode objects with attribute name-value pairs and details.
Check the percentage of values matching each string, and return the best fit, or None if no string exceeds the
configured threshold.
Returns:
Attributes object, containing computed parameter values and parameter computation details metadata.
"""
metric_computation_result: MetricComputationResult
metric_computation_result = self.get_metrics(
metric_name="column_values.nonnull.count",
metric_domain_kwargs=self.metric_domain_kwargs,
metric_value_kwargs=self.metric_value_kwargs,
domain=domain,
variables=variables,
parameters=parameters,
)
# This should never happen.
if len(metric_computation_result.attributed_resolved_metrics) != 1:
raise ge_exceptions.ProfilerExecutionError(
message=f'Result of metric computations for {self.__class__.__name__} must be a list with exactly 1 element of type "AttributedResolvedMetrics" ({metric_computation_result.attributed_resolved_metrics} found).'
)
attributed_resolved_metrics: AttributedResolvedMetrics
attributed_resolved_metrics = (
metric_computation_result.attributed_resolved_metrics[0]
)
metric_values: MetricValues
metric_values = attributed_resolved_metrics.conditioned_metric_values
if metric_values is None:
raise ge_exceptions.ProfilerExecutionError(
message=f"Result of metric computations for {self.__class__.__name__} is empty."
)
# Now obtain 1-dimensional vector of values of computed metric (each element corresponds to a Batch ID).
metric_values = metric_values[:, 0]
nonnull_count: int = sum(metric_values)
# Obtain candidate_strings from "rule state" (i.e., variables and parameters); from instance variable otherwise.
candidate_strings: Union[
List[str],
Set[str],
] = get_parameter_value_and_validate_return_type(
domain=domain,
parameter_reference=self.candidate_strings,
expected_return_type=None,
variables=variables,
parameters=parameters,
)
# Gather "metric_value_kwargs" for all candidate "strftime_format" strings.
format_string: str
match_strftime_metric_value_kwargs_list: List[dict] = []
match_strftime_metric_value_kwargs: dict
for format_string in candidate_strings:
if self.metric_value_kwargs:
match_strftime_metric_value_kwargs = {
**self.metric_value_kwargs,
**{"strftime_format": format_string},
}
else:
match_strftime_metric_value_kwargs = {
"strftime_format": format_string,
}
match_strftime_metric_value_kwargs_list.append(
match_strftime_metric_value_kwargs
)
# Obtain resolved metrics and metadata for all metric configurations and available Batch objects simultaneously.
metric_computation_result = self.get_metrics(
metric_name="column_values.match_strftime_format.unexpected_count",
metric_domain_kwargs=self.metric_domain_kwargs,
metric_value_kwargs=match_strftime_metric_value_kwargs_list,
domain=domain,
variables=variables,
parameters=parameters,
)
format_string_success_ratios: dict = {}
for (
attributed_resolved_metrics
) in metric_computation_result.attributed_resolved_metrics:
# Now obtain 1-dimensional vector of values of computed metric (each element corresponds to a Batch ID).
metric_values = attributed_resolved_metrics.conditioned_metric_values[:, 0]
match_strftime_unexpected_count: int = sum(metric_values)
success_ratio: float = (nonnull_count - match_strftime_unexpected_count) / (
nonnull_count + NP_EPSILON
)
format_string_success_ratios[
attributed_resolved_metrics.metric_attributes["strftime_format"]
] = success_ratio
# Obtain threshold from "rule state" (i.e., variables and parameters); from instance variable otherwise.
threshold: float = get_parameter_value_and_validate_return_type(
domain=domain,
parameter_reference=self.threshold,
expected_return_type=float,
variables=variables,
parameters=parameters,
)
        # Get the best-matching datetime format string whose success ratio exceeds the threshold.
best_format_string: str
best_ratio: float
(
best_format_string,
best_ratio,
) = ParameterBuilder._get_best_candidate_above_threshold(
format_string_success_ratios, threshold
)
        # Dict of all evaluated candidate format strings and their ratios, sorted by ratio.
sorted_format_strings_and_ratios: dict = (
ParameterBuilder._get_sorted_candidates_and_ratios(
format_string_success_ratios
)
)
return Attributes(
{
FULLY_QUALIFIED_PARAMETER_NAME_VALUE_KEY: best_format_string,
FULLY_QUALIFIED_PARAMETER_NAME_METADATA_KEY: {
"success_ratio": best_ratio,
"candidate_strings": sorted_format_strings_and_ratios,
},
}
)
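# --- Illustration (not part of the original module) ---
# A minimal sketch of the per-candidate success-ratio computation that
# _build_parameters() performs through the metrics layer above, assuming plain
# Python values instead of Great Expectations metric results. The function and
# variable names below are illustrative only.
from datetime import datetime

def _candidate_success_ratio(values, strftime_format):
    """Fraction of non-null values that parse with the given format string."""
    nonnull = [value for value in values if value is not None]
    if not nonnull:
        return 0.0
    matched = 0
    for value in nonnull:
        try:
            datetime.strptime(value, strftime_format)
            matched += 1
        except (ValueError, TypeError):
            continue
    return matched / len(nonnull)

# Example: for ISO-like dates, "%Y-%m-%d" reaches ratio 1.0 and would be the
# winning candidate under a high threshold.
# _candidate_success_ratio(["2020-01-02", "2020-03-04"], "%Y-%m-%d")  # -> 1.0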
|
PypiClean
|
/azure-cli-dla-0.2.6.tar.gz/azure-cli-dla-0.2.6/azure/cli/command_modules/dla/_params.py
|
from argcomplete.completers import FilesCompleter
from knack.arguments import CLIArgumentType
from azure.cli.core.commands.parameters import (
tags_type, get_resource_name_completion_list, resource_group_name_type, get_enum_type)
from azure.cli.command_modules.dla._validators import validate_resource_group_name, datetime_format
# pylint: disable=line-too-long, too-many-statements
def load_arguments(self, _):
from azure.mgmt.datalake.analytics.account.models import (FirewallState, TierType, FirewallAllowAzureIpsState,
AADObjectType)
from azure.mgmt.datalake.analytics.job.models import (CompileMode, JobState, JobResult)
datalake_analytics_name_type = CLIArgumentType(help='Name of the Data Lake Analytics account.', options_list=('--account_name',), completer=get_resource_name_completion_list('Microsoft.DataLakeAnalytics/accounts'), id_part='name')
# PARAMETER REGISTRATIONS
# common
with self.argument_context('dla') as c:
c.argument('resource_group_name', resource_group_name_type, id_part=None, required=False, help='If not specified, will attempt to discover the resource group for the specified Data Lake Analytics account.', validator=validate_resource_group_name)
c.argument('top', help='Maximum number of items to return.', type=int)
c.argument('skip', help='The number of items to skip over before returning elements.', type=int)
c.argument('count', help='The Boolean value of true or false to request a count of the matching resources included with the resources in the response, e.g. Categories?$count=true.', type=bool)
c.argument('account_name', datalake_analytics_name_type, options_list=['--account', '-n'])
# account
with self.argument_context('dla account') as c:
c.argument('tags', tags_type)
c.argument('tier', arg_type=get_enum_type(TierType), help='The desired commitment tier for this account to use.')
with self.argument_context('dla account create') as c:
c.argument('resource_group_name', resource_group_name_type, validator=None)
c.argument('account_name', datalake_analytics_name_type, options_list=('--account', '-n'), completer=None)
with self.argument_context('dla account update') as c:
c.argument('firewall_state', help='Enable/disable existing firewall rules.', arg_type=get_enum_type(FirewallState))
c.argument('allow_azure_ips', help='Allow/block Azure originating IPs through the firewall', arg_type=get_enum_type(FirewallAllowAzureIpsState))
c.argument('max_job_count', help='The maximum supported jobs running under the account at the same time.', type=int)
c.argument('max_degree_of_parallelism', help='The maximum supported degree of parallelism for this account.', type=int)
c.argument('query_store_retention', help='The number of days that job metadata is retained.', type=int)
with self.argument_context('dla account list') as c:
c.argument('resource_group_name', resource_group_name_type, validator=None)
# storage
with self.argument_context('dla account blob-storage') as c:
        c.argument('access_key', help='The access key associated with this Azure Storage account that will be used to connect to it.')
        c.argument('suffix', help='The optional suffix for the storage account.')
c.argument('storage_account_name', help='Name of an existing storage account to link to.')
# job
with self.argument_context('dla job submit') as c:
c.argument('compile_mode', arg_type=get_enum_type(CompileMode), help='Indicates the type of compilation to be done on this job. Valid values are: \'Semantic\' (Only performs semantic checks and necessary sanity checks), \'Full\' (full compilation) and \'SingleBox\' (Full compilation performed locally)')
c.argument('compile_only', help='Indicates that the submission should only build the job and not execute if set to true.', action='store_true')
c.argument('script', completer=FilesCompleter(), help="The script to submit. This is either the script contents or use `@<file path>` to load the script from a file")
c.argument('pipeline_id', help='Job relationship pipeline GUID.')
c.argument('pipeline_name', help='Friendly name of the job relationship pipeline.')
c.argument('pipeline_uri', help='Unique pipeline URI which links to the originating service for this pipeline.')
c.argument('run_id', help='GUID of the iteration of this pipeline.')
c.argument('recurrence_id', help='Recurrence GUID, unique per activity/script, regardless of iteration. Links different occurrences of the same job together.')
        c.argument('recurrence_name', help='Friendly recurrence name for the correlation between jobs.')
with self.argument_context('dla job wait') as c:
c.argument('max_wait_time_sec', help='The maximum amount of time to wait before erroring out. Default value is to never timeout. Any value <= 0 means never timeout', type=int)
c.argument('wait_interval_sec', help='The polling interval between checks for the job status, in seconds.', type=int)
with self.argument_context('dla job list') as c:
c.argument('submitted_after', help='A filter which returns jobs only submitted after the specified time, in ISO-8601 format.', type=datetime_format)
c.argument('submitted_before', help='A filter which returns jobs only submitted before the specified time, in ISO-8601 format.', type=datetime_format)
c.argument('state', arg_type=get_enum_type(JobState), help='A filter which returns jobs with only the specified state(s).', nargs='*')
c.argument('result', arg_type=get_enum_type(JobResult), help='A filter which returns jobs with only the specified result(s).', nargs='*')
c.argument('submitter', help='A filter which returns jobs only by the specified submitter.')
c.argument('name', help='A filter which returns jobs only by the specified friendly name.')
c.argument('pipeline_id', help='A filter which returns jobs only containing the specified pipeline_id.')
c.argument('recurrence_id', help='A filter which returns jobs only containing the specified recurrence_id.')
# credential
with self.argument_context('dla catalog credential') as c:
c.argument('uri', help='URI of the external data source.')
with self.argument_context('dla catalog credential create') as c:
c.argument('credential_user_password', options_list=['--password', '-p'], help='Password for the credential user. Will prompt if not given.')
c.argument('credential_user_name', options_list=['--user-name'])
with self.argument_context('dla catalog credential update') as c:
c.argument('credential_user_name', options_list=['--user-name'])
c.argument('credential_user_password', options_list=['--password', '-p'], help='Current password for the credential user. Will prompt if not given.')
c.argument('new_credential_user_password', options_list=['--new-password'], help='New password for the credential user. Will prompt if not given.')
# compute policy
with self.argument_context('dla account compute_policy') as c:
c.argument('max_dop_per_job', help='The maximum degree of parallelism allowed per job for this policy. At least one of --min-priority-per-job and --max-dop-per-job must be specified.', type=int)
c.argument('min_priority_per_job', help='The minimum priority allowed per job for this policy. At least one of --min-priority-per-job and --max-dop-per-job must be specified.', type=int)
with self.argument_context('dla account compute_policy create') as c:
c.argument('object_id', help='The Azure Active Directory object ID of the user, group or service principal to apply the policy to.')
c.argument('object_type', arg_type=get_enum_type(AADObjectType), help='The Azure Active Directory object type associated with the supplied object id.')
|
PypiClean
|
/apache-airflow-providers-google-10.7.0.tar.gz/apache-airflow-providers-google-10.7.0/airflow/providers/google/cloud/links/dataform.py
|
"""This module contains Google Dataflow links."""
from __future__ import annotations
from typing import TYPE_CHECKING
from airflow.models import BaseOperator
from airflow.providers.google.cloud.links.base import BaseGoogleLink
if TYPE_CHECKING:
from airflow.utils.context import Context
DATAFORM_BASE_LINK = "/bigquery/dataform"
DATAFORM_WORKFLOW_INVOCATION_LINK = (
DATAFORM_BASE_LINK
+ "/locations/{region}/repositories/{repository_id}/workflows/"
+ "{workflow_invocation_id}?project={project_id}"
)
DATAFORM_REPOSITORY_LINK = (
DATAFORM_BASE_LINK
+ "/locations/{region}/repositories/{repository_id}/"
+ "details/workspaces?project={project_id}"
)
DATAFORM_WORKSPACE_LINK = (
DATAFORM_BASE_LINK
+ "/locations/{region}/repositories/{repository_id}/"
+ "workspaces/{workspace_id}/"
+ "files/?project={project_id}"
)
class DataformWorkflowInvocationLink(BaseGoogleLink):
"""Helper class for constructing Dataflow Job Link."""
name = "Dataform Workflow Invocation"
key = "dataform_workflow_invocation_config"
format_str = DATAFORM_WORKFLOW_INVOCATION_LINK
@staticmethod
def persist(
operator_instance: BaseOperator,
context: Context,
project_id: str,
region: str,
repository_id: str,
workflow_invocation_id: str,
):
operator_instance.xcom_push(
context,
key=DataformWorkflowInvocationLink.key,
value={
"project_id": project_id,
"region": region,
"repository_id": repository_id,
"workflow_invocation_id": workflow_invocation_id,
},
)
class DataformRepositoryLink(BaseGoogleLink):
"""Helper class for constructing Dataflow repository link."""
name = "Dataform Repository"
key = "dataform_repository"
format_str = DATAFORM_REPOSITORY_LINK
@staticmethod
def persist(
operator_instance: BaseOperator,
context: Context,
project_id: str,
region: str,
repository_id: str,
) -> None:
operator_instance.xcom_push(
context=context,
key=DataformRepositoryLink.key,
value={
"project_id": project_id,
"region": region,
"repository_id": repository_id,
},
)
class DataformWorkspaceLink(BaseGoogleLink):
"""Helper class for constructing Dataform workspace link."""
name = "Dataform Workspace"
key = "dataform_workspace"
format_str = DATAFORM_WORKSPACE_LINK
@staticmethod
def persist(
operator_instance: BaseOperator,
context: Context,
project_id: str,
region: str,
repository_id: str,
workspace_id: str,
) -> None:
operator_instance.xcom_push(
context=context,
key=DataformWorkspaceLink.key,
value={
"project_id": project_id,
"region": region,
"repository_id": repository_id,
"workspace_id": workspace_id,
},
)
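# --- Illustration (not part of the original module) ---
# Quick sanity check of the URL templates above (all values are placeholders):
#
#   DATAFORM_WORKFLOW_INVOCATION_LINK.format(
#       region="us-central1", repository_id="my-repo",
#       workflow_invocation_id="abc123", project_id="my-project",
#   )
#   # -> "/bigquery/dataform/locations/us-central1/repositories/my-repo/
#   #     workflows/abc123?project=my-project"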
|
PypiClean
|
/testflows.core-1.9.230705.1122619.tar.gz/testflows.core-1.9.230705.1122619/testflows/_core/contrib/pygments/lexers/ruby.py
|
import re
from testflows._core.contrib.pygments.lexer import Lexer, RegexLexer, ExtendedRegexLexer, include, \
bygroups, default, LexerContext, do_insertions, words
from testflows._core.contrib.pygments.token import Text, Comment, Operator, Keyword, Name, String, \
Number, Punctuation, Error, Generic
from testflows._core.contrib.pygments.util import shebang_matches
__all__ = ['RubyLexer', 'RubyConsoleLexer', 'FancyLexer']
line_re = re.compile('.*?\n')
RUBY_OPERATORS = (
'*', '**', '-', '+', '-@', '+@', '/', '%', '&', '|', '^', '`', '~',
'[]', '[]=', '<<', '>>', '<', '<>', '<=>', '>', '>=', '==', '==='
)
class RubyLexer(ExtendedRegexLexer):
"""
For `Ruby <http://www.ruby-lang.org>`_ source code.
"""
name = 'Ruby'
aliases = ['rb', 'ruby', 'duby']
filenames = ['*.rb', '*.rbw', 'Rakefile', '*.rake', '*.gemspec',
'*.rbx', '*.duby', 'Gemfile']
mimetypes = ['text/x-ruby', 'application/x-ruby']
flags = re.DOTALL | re.MULTILINE
def heredoc_callback(self, match, ctx):
# okay, this is the hardest part of parsing Ruby...
# match: 1 = <<[-~]?, 2 = quote? 3 = name 4 = quote? 5 = rest of line
start = match.start(1)
yield start, Operator, match.group(1) # <<[-~]?
yield match.start(2), String.Heredoc, match.group(2) # quote ", ', `
yield match.start(3), String.Delimiter, match.group(3) # heredoc name
yield match.start(4), String.Heredoc, match.group(4) # quote again
heredocstack = ctx.__dict__.setdefault('heredocstack', [])
outermost = not bool(heredocstack)
heredocstack.append((match.group(1) in ('<<-', '<<~'), match.group(3)))
ctx.pos = match.start(5)
ctx.end = match.end(5)
# this may find other heredocs
for i, t, v in self.get_tokens_unprocessed(context=ctx):
yield i, t, v
ctx.pos = match.end()
if outermost:
# this is the outer heredoc again, now we can process them all
for tolerant, hdname in heredocstack:
lines = []
for match in line_re.finditer(ctx.text, ctx.pos):
if tolerant:
check = match.group().strip()
else:
check = match.group().rstrip()
if check == hdname:
for amatch in lines:
yield amatch.start(), String.Heredoc, amatch.group()
yield match.start(), String.Delimiter, match.group()
ctx.pos = match.end()
break
else:
lines.append(match)
else:
# end of heredoc not found -- error!
for amatch in lines:
yield amatch.start(), Error, amatch.group()
ctx.end = len(ctx.text)
del heredocstack[:]
def gen_rubystrings_rules():
def intp_regex_callback(self, match, ctx):
yield match.start(1), String.Regex, match.group(1) # begin
nctx = LexerContext(match.group(3), 0, ['interpolated-regex'])
for i, t, v in self.get_tokens_unprocessed(context=nctx):
yield match.start(3)+i, t, v
yield match.start(4), String.Regex, match.group(4) # end[mixounse]*
ctx.pos = match.end()
def intp_string_callback(self, match, ctx):
yield match.start(1), String.Other, match.group(1)
nctx = LexerContext(match.group(3), 0, ['interpolated-string'])
for i, t, v in self.get_tokens_unprocessed(context=nctx):
yield match.start(3)+i, t, v
yield match.start(4), String.Other, match.group(4) # end
ctx.pos = match.end()
states = {}
states['strings'] = [
# easy ones
(r'\:@{0,2}[a-zA-Z_]\w*[!?]?', String.Symbol),
(words(RUBY_OPERATORS, prefix=r'\:@{0,2}'), String.Symbol),
(r":'(\\\\|\\'|[^'])*'", String.Symbol),
(r"'(\\\\|\\'|[^'])*'", String.Single),
(r':"', String.Symbol, 'simple-sym'),
(r'([a-zA-Z_]\w*)(:)(?!:)',
bygroups(String.Symbol, Punctuation)), # Since Ruby 1.9
(r'"', String.Double, 'simple-string'),
(r'(?<!\.)`', String.Backtick, 'simple-backtick'),
]
# double-quoted string and symbol
for name, ttype, end in ('string', String.Double, '"'), \
('sym', String.Symbol, '"'), \
('backtick', String.Backtick, '`'):
states['simple-'+name] = [
include('string-intp-escaped'),
(r'[^\\%s#]+' % end, ttype),
(r'[\\#]', ttype),
(end, ttype, '#pop'),
]
# braced quoted strings
for lbrace, rbrace, bracecc, name in \
('\\{', '\\}', '{}', 'cb'), \
('\\[', '\\]', '\\[\\]', 'sb'), \
('\\(', '\\)', '()', 'pa'), \
('<', '>', '<>', 'ab'):
states[name+'-intp-string'] = [
(r'\\[\\' + bracecc + ']', String.Other),
(lbrace, String.Other, '#push'),
(rbrace, String.Other, '#pop'),
include('string-intp-escaped'),
(r'[\\#' + bracecc + ']', String.Other),
(r'[^\\#' + bracecc + ']+', String.Other),
]
states['strings'].append((r'%[QWx]?' + lbrace, String.Other,
name+'-intp-string'))
states[name+'-string'] = [
(r'\\[\\' + bracecc + ']', String.Other),
(lbrace, String.Other, '#push'),
(rbrace, String.Other, '#pop'),
(r'[\\#' + bracecc + ']', String.Other),
(r'[^\\#' + bracecc + ']+', String.Other),
]
states['strings'].append((r'%[qsw]' + lbrace, String.Other,
name+'-string'))
states[name+'-regex'] = [
(r'\\[\\' + bracecc + ']', String.Regex),
(lbrace, String.Regex, '#push'),
(rbrace + '[mixounse]*', String.Regex, '#pop'),
include('string-intp'),
(r'[\\#' + bracecc + ']', String.Regex),
(r'[^\\#' + bracecc + ']+', String.Regex),
]
states['strings'].append((r'%r' + lbrace, String.Regex,
name+'-regex'))
# these must come after %<brace>!
states['strings'] += [
# %r regex
(r'(%r([\W_]))((?:\\\2|(?!\2).)*)(\2[mixounse]*)',
intp_regex_callback),
# regular fancy strings with qsw
(r'%[qsw]([\W_])((?:\\\1|(?!\1).)*)\1', String.Other),
(r'(%[QWx]([\W_]))((?:\\\2|(?!\2).)*)(\2)',
intp_string_callback),
# special forms of fancy strings after operators or
# in method calls with braces
(r'(?<=[-+/*%=<>&!^|~,(])(\s*)(%([\t ])(?:(?:\\\3|(?!\3).)*)\3)',
bygroups(Text, String.Other, None)),
# and because of fixed width lookbehinds the whole thing a
# second time for line startings...
(r'^(\s*)(%([\t ])(?:(?:\\\3|(?!\3).)*)\3)',
bygroups(Text, String.Other, None)),
# all regular fancy strings without qsw
(r'(%([^a-zA-Z0-9\s]))((?:\\\2|(?!\2).)*)(\2)',
intp_string_callback),
]
return states
tokens = {
'root': [
(r'\A#!.+?$', Comment.Hashbang),
(r'#.*?$', Comment.Single),
(r'=begin\s.*?\n=end.*?$', Comment.Multiline),
# keywords
(words((
'BEGIN', 'END', 'alias', 'begin', 'break', 'case', 'defined?',
'do', 'else', 'elsif', 'end', 'ensure', 'for', 'if', 'in', 'next', 'redo',
'rescue', 'raise', 'retry', 'return', 'super', 'then', 'undef',
'unless', 'until', 'when', 'while', 'yield'), suffix=r'\b'),
Keyword),
# start of function, class and module names
(r'(module)(\s+)([a-zA-Z_]\w*'
r'(?:::[a-zA-Z_]\w*)*)',
bygroups(Keyword, Text, Name.Namespace)),
(r'(def)(\s+)', bygroups(Keyword, Text), 'funcname'),
(r'def(?=[*%&^`~+-/\[<>=])', Keyword, 'funcname'),
(r'(class)(\s+)', bygroups(Keyword, Text), 'classname'),
# special methods
(words((
'initialize', 'new', 'loop', 'include', 'extend', 'raise', 'attr_reader',
'attr_writer', 'attr_accessor', 'attr', 'catch', 'throw', 'private',
'module_function', 'public', 'protected', 'true', 'false', 'nil'),
suffix=r'\b'),
Keyword.Pseudo),
(r'(not|and|or)\b', Operator.Word),
(words((
'autoload', 'block_given', 'const_defined', 'eql', 'equal', 'frozen', 'include',
'instance_of', 'is_a', 'iterator', 'kind_of', 'method_defined', 'nil',
'private_method_defined', 'protected_method_defined',
'public_method_defined', 'respond_to', 'tainted'), suffix=r'\?'),
Name.Builtin),
(r'(chomp|chop|exit|gsub|sub)!', Name.Builtin),
(words((
'Array', 'Float', 'Integer', 'String', '__id__', '__send__', 'abort',
'ancestors', 'at_exit', 'autoload', 'binding', 'callcc', 'caller',
'catch', 'chomp', 'chop', 'class_eval', 'class_variables',
'clone', 'const_defined?', 'const_get', 'const_missing', 'const_set',
'constants', 'display', 'dup', 'eval', 'exec', 'exit', 'extend', 'fail', 'fork',
'format', 'freeze', 'getc', 'gets', 'global_variables', 'gsub',
'hash', 'id', 'included_modules', 'inspect', 'instance_eval',
'instance_method', 'instance_methods',
'instance_variable_get', 'instance_variable_set', 'instance_variables',
'lambda', 'load', 'local_variables', 'loop',
'method', 'method_missing', 'methods', 'module_eval', 'name',
'object_id', 'open', 'p', 'print', 'printf', 'private_class_method',
'private_instance_methods',
'private_methods', 'proc', 'protected_instance_methods',
'protected_methods', 'public_class_method',
'public_instance_methods', 'public_methods',
'putc', 'puts', 'raise', 'rand', 'readline', 'readlines', 'require',
'scan', 'select', 'self', 'send', 'set_trace_func', 'singleton_methods', 'sleep',
'split', 'sprintf', 'srand', 'sub', 'syscall', 'system', 'taint',
'test', 'throw', 'to_a', 'to_s', 'trace_var', 'trap', 'untaint',
'untrace_var', 'warn'), prefix=r'(?<!\.)', suffix=r'\b'),
Name.Builtin),
(r'__(FILE|LINE)__\b', Name.Builtin.Pseudo),
# normal heredocs
(r'(?<!\w)(<<[-~]?)(["`\']?)([a-zA-Z_]\w*)(\2)(.*?\n)',
heredoc_callback),
# empty string heredocs
(r'(<<[-~]?)("|\')()(\2)(.*?\n)', heredoc_callback),
(r'__END__', Comment.Preproc, 'end-part'),
# multiline regex (after keywords or assignments)
(r'(?:^|(?<=[=<>~!:])|'
r'(?<=(?:\s|;)when\s)|'
r'(?<=(?:\s|;)or\s)|'
r'(?<=(?:\s|;)and\s)|'
r'(?<=\.index\s)|'
r'(?<=\.scan\s)|'
r'(?<=\.sub\s)|'
r'(?<=\.sub!\s)|'
r'(?<=\.gsub\s)|'
r'(?<=\.gsub!\s)|'
r'(?<=\.match\s)|'
r'(?<=(?:\s|;)if\s)|'
r'(?<=(?:\s|;)elsif\s)|'
r'(?<=^when\s)|'
r'(?<=^index\s)|'
r'(?<=^scan\s)|'
r'(?<=^sub\s)|'
r'(?<=^gsub\s)|'
r'(?<=^sub!\s)|'
r'(?<=^gsub!\s)|'
r'(?<=^match\s)|'
r'(?<=^if\s)|'
r'(?<=^elsif\s)'
r')(\s*)(/)', bygroups(Text, String.Regex), 'multiline-regex'),
# multiline regex (in method calls or subscripts)
(r'(?<=\(|,|\[)/', String.Regex, 'multiline-regex'),
# multiline regex (this time the funny no whitespace rule)
(r'(\s+)(/)(?![\s=])', bygroups(Text, String.Regex),
'multiline-regex'),
# lex numbers and ignore following regular expressions which
# are division operators in fact (grrrr. i hate that. any
# better ideas?)
# since pygments 0.7 we also eat a "?" operator after numbers
# so that the char operator does not work. Chars are not allowed
# there so that you can use the ternary operator.
# stupid example:
# x>=0?n[x]:""
(r'(0_?[0-7]+(?:_[0-7]+)*)(\s*)([/?])?',
bygroups(Number.Oct, Text, Operator)),
(r'(0x[0-9A-Fa-f]+(?:_[0-9A-Fa-f]+)*)(\s*)([/?])?',
bygroups(Number.Hex, Text, Operator)),
(r'(0b[01]+(?:_[01]+)*)(\s*)([/?])?',
bygroups(Number.Bin, Text, Operator)),
(r'([\d]+(?:_\d+)*)(\s*)([/?])?',
bygroups(Number.Integer, Text, Operator)),
# Names
(r'@@[a-zA-Z_]\w*', Name.Variable.Class),
(r'@[a-zA-Z_]\w*', Name.Variable.Instance),
(r'\$\w+', Name.Variable.Global),
(r'\$[!@&`\'+~=/\\,;.<>_*$?:"^-]', Name.Variable.Global),
(r'\$-[0adFiIlpvw]', Name.Variable.Global),
(r'::', Operator),
include('strings'),
# chars
(r'\?(\\[MC]-)*' # modifiers
r'(\\([\\abefnrstv#"\']|x[a-fA-F0-9]{1,2}|[0-7]{1,3})|\S)'
r'(?!\w)',
String.Char),
(r'[A-Z]\w+', Name.Constant),
# this is needed because ruby attributes can look
# like keywords (class) or like this: ` ?!?
(words(RUBY_OPERATORS, prefix=r'(\.|::)'),
bygroups(Operator, Name.Operator)),
(r'(\.|::)([a-zA-Z_]\w*[!?]?|[*%&^`~+\-/\[<>=])',
bygroups(Operator, Name)),
(r'[a-zA-Z_]\w*[!?]?', Name),
(r'(\[|\]|\*\*|<<?|>>?|>=|<=|<=>|=~|={3}|'
r'!~|&&?|\|\||\.{1,3})', Operator),
(r'[-+/*%=<>&!^|~]=?', Operator),
(r'[(){};,/?:\\]', Punctuation),
(r'\s+', Text)
],
'funcname': [
(r'\(', Punctuation, 'defexpr'),
(r'(?:([a-zA-Z_]\w*)(\.))?'
r'([a-zA-Z_]\w*[!?]?|\*\*?|[-+]@?|'
r'[/%&|^`~]|\[\]=?|<<|>>|<=?>|>=?|===?)',
bygroups(Name.Class, Operator, Name.Function), '#pop'),
default('#pop')
],
'classname': [
(r'\(', Punctuation, 'defexpr'),
(r'<<', Operator, '#pop'),
(r'[A-Z_]\w*', Name.Class, '#pop'),
default('#pop')
],
'defexpr': [
(r'(\))(\.|::)?', bygroups(Punctuation, Operator), '#pop'),
(r'\(', Operator, '#push'),
include('root')
],
'in-intp': [
(r'\{', String.Interpol, '#push'),
(r'\}', String.Interpol, '#pop'),
include('root'),
],
'string-intp': [
(r'#\{', String.Interpol, 'in-intp'),
(r'#@@?[a-zA-Z_]\w*', String.Interpol),
(r'#\$[a-zA-Z_]\w*', String.Interpol)
],
'string-intp-escaped': [
include('string-intp'),
(r'\\([\\abefnrstv#"\']|x[a-fA-F0-9]{1,2}|[0-7]{1,3})',
String.Escape)
],
'interpolated-regex': [
include('string-intp'),
(r'[\\#]', String.Regex),
(r'[^\\#]+', String.Regex),
],
'interpolated-string': [
include('string-intp'),
(r'[\\#]', String.Other),
(r'[^\\#]+', String.Other),
],
'multiline-regex': [
include('string-intp'),
(r'\\\\', String.Regex),
(r'\\/', String.Regex),
(r'[\\#]', String.Regex),
(r'[^\\/#]+', String.Regex),
(r'/[mixounse]*', String.Regex, '#pop'),
],
'end-part': [
(r'.+', Comment.Preproc, '#pop')
]
}
tokens.update(gen_rubystrings_rules())
def analyse_text(text):
return shebang_matches(text, r'ruby(1\.\d)?')
class RubyConsoleLexer(Lexer):
"""
For Ruby interactive console (**irb**) output like:
.. sourcecode:: rbcon
irb(main):001:0> a = 1
=> 1
irb(main):002:0> puts a
1
=> nil
"""
name = 'Ruby irb session'
aliases = ['rbcon', 'irb']
mimetypes = ['text/x-ruby-shellsession']
_prompt_re = re.compile(r'irb\([a-zA-Z_]\w*\):\d{3}:\d+[>*"\'] '
r'|>> |\?> ')
def get_tokens_unprocessed(self, text):
rblexer = RubyLexer(**self.options)
curcode = ''
insertions = []
for match in line_re.finditer(text):
line = match.group()
m = self._prompt_re.match(line)
if m is not None:
end = m.end()
insertions.append((len(curcode),
[(0, Generic.Prompt, line[:end])]))
curcode += line[end:]
else:
if curcode:
for item in do_insertions(
insertions, rblexer.get_tokens_unprocessed(curcode)):
yield item
curcode = ''
insertions = []
yield match.start(), Generic.Output, line
if curcode:
for item in do_insertions(
insertions, rblexer.get_tokens_unprocessed(curcode)):
yield item
class FancyLexer(RegexLexer):
"""
Pygments Lexer For `Fancy <http://www.fancy-lang.org/>`_.
Fancy is a self-hosted, pure object-oriented, dynamic,
class-based, concurrent general-purpose programming language
running on Rubinius, the Ruby VM.
.. versionadded:: 1.5
"""
name = 'Fancy'
filenames = ['*.fy', '*.fancypack']
aliases = ['fancy', 'fy']
mimetypes = ['text/x-fancysrc']
tokens = {
# copied from PerlLexer:
'balanced-regex': [
(r'/(\\\\|\\/|[^/])*/[egimosx]*', String.Regex, '#pop'),
(r'!(\\\\|\\!|[^!])*![egimosx]*', String.Regex, '#pop'),
(r'\\(\\\\|[^\\])*\\[egimosx]*', String.Regex, '#pop'),
(r'\{(\\\\|\\\}|[^}])*\}[egimosx]*', String.Regex, '#pop'),
(r'<(\\\\|\\>|[^>])*>[egimosx]*', String.Regex, '#pop'),
(r'\[(\\\\|\\\]|[^\]])*\][egimosx]*', String.Regex, '#pop'),
(r'\((\\\\|\\\)|[^)])*\)[egimosx]*', String.Regex, '#pop'),
(r'@(\\\\|\\@|[^@])*@[egimosx]*', String.Regex, '#pop'),
(r'%(\\\\|\\%|[^%])*%[egimosx]*', String.Regex, '#pop'),
(r'\$(\\\\|\\\$|[^$])*\$[egimosx]*', String.Regex, '#pop'),
],
'root': [
(r'\s+', Text),
# balanced delimiters (copied from PerlLexer):
(r's\{(\\\\|\\\}|[^}])*\}\s*', String.Regex, 'balanced-regex'),
(r's<(\\\\|\\>|[^>])*>\s*', String.Regex, 'balanced-regex'),
(r's\[(\\\\|\\\]|[^\]])*\]\s*', String.Regex, 'balanced-regex'),
(r's\((\\\\|\\\)|[^)])*\)\s*', String.Regex, 'balanced-regex'),
(r'm?/(\\\\|\\/|[^/\n])*/[gcimosx]*', String.Regex),
(r'm(?=[/!\\{<\[(@%$])', String.Regex, 'balanced-regex'),
# Comments
(r'#(.*?)\n', Comment.Single),
# Symbols
(r'\'([^\'\s\[\](){}]+|\[\])', String.Symbol),
# Multi-line DoubleQuotedString
(r'"""(\\\\|\\"|[^"])*"""', String),
# DoubleQuotedString
(r'"(\\\\|\\"|[^"])*"', String),
# keywords
(r'(def|class|try|catch|finally|retry|return|return_local|match|'
r'case|->|=>)\b', Keyword),
# constants
(r'(self|super|nil|false|true)\b', Name.Constant),
(r'[(){};,/?|:\\]', Punctuation),
# names
(words((
'Object', 'Array', 'Hash', 'Directory', 'File', 'Class', 'String',
'Number', 'Enumerable', 'FancyEnumerable', 'Block', 'TrueClass',
'NilClass', 'FalseClass', 'Tuple', 'Symbol', 'Stack', 'Set',
'FancySpec', 'Method', 'Package', 'Range'), suffix=r'\b'),
Name.Builtin),
# functions
(r'[a-zA-Z](\w|[-+?!=*/^><%])*:', Name.Function),
# operators, must be below functions
(r'[-+*/~,<>=&!?%^\[\].$]+', Operator),
(r'[A-Z]\w*', Name.Constant),
(r'@[a-zA-Z_]\w*', Name.Variable.Instance),
(r'@@[a-zA-Z_]\w*', Name.Variable.Class),
('@@?', Operator),
(r'[a-zA-Z_]\w*', Name),
# numbers - / checks are necessary to avoid mismarking regexes,
# see comment in RubyLexer
(r'(0[oO]?[0-7]+(?:_[0-7]+)*)(\s*)([/?])?',
bygroups(Number.Oct, Text, Operator)),
(r'(0[xX][0-9A-Fa-f]+(?:_[0-9A-Fa-f]+)*)(\s*)([/?])?',
bygroups(Number.Hex, Text, Operator)),
(r'(0[bB][01]+(?:_[01]+)*)(\s*)([/?])?',
bygroups(Number.Bin, Text, Operator)),
(r'([\d]+(?:_\d+)*)(\s*)([/?])?',
bygroups(Number.Integer, Text, Operator)),
(r'\d+([eE][+-]?[0-9]+)|\d+\.\d+([eE][+-]?[0-9]+)?', Number.Float),
(r'\d+', Number.Integer)
]
}
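# --- Illustration (not part of the original module) ---
# Minimal usage sketch, assuming the bundled pygments package exposes the
# standard top-level API:
#
#   from testflows._core.contrib.pygments import highlight
#   from testflows._core.contrib.pygments.formatters import TerminalFormatter
#   print(highlight('puts "hello, #{name}"\n', RubyLexer(), TerminalFormatter()))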
|
PypiClean
|
/Sqlcarve-0.4.31-py3-none-any.whl/sqlcarve/validator/commentsValidator.py
|
from sqlcarve.validator.helpers import *
from sqlcarve.validator.errorManager import *
import re
regex = r"(?:\/\/[^\n]*|\/\*(?:(?!\*\/).)*\*\/)|(--.*?\n)"
# regex = r"\/\*.*?\*\/|--.*?\n"
class CommentsValidator:
"""
    Validator for SQL comments.
"""
def __init__(self):
"""
        Initializes the comment validator and its error manager.
"""
self.comment_error = ErrorManager()
def get_comment_element(self, statement):
"""
        Retrieves the list of tokens from a SQL source string and returns a
        list containing only the comment contents.
        :param statement: SQL source to scan
        :return: words_tokens, the extracted comment lines
"""
matches = re.finditer(regex, statement, re.DOTALL)
# pat = regex.join(field_labels) + regex
words_tokens = []
for matchNum, match in enumerate(matches, start=1):
for line in (match.group().split('*')):
line = line.replace('/', ' ')
line = line.replace('--', ' ')
words_tokens.append(line.strip())
log.debug("Group {groupNum} found at {start}-{end}: {group}".format(groupNum=matchNum,
start=match.start(),
end=match.end(),
group=match.group()))
# log.debug(words_tokens)
return words_tokens
# Note: for Python 2.7 compatibility, use ur"" to prefix the regex and u"" to prefix the test string and substitution.
    def validate_comment(self, statement, fichierjson):
        """
        Validates the extracted comments against a JSON reference file that
        lists the fields which must be present.
        :param statement: SQL source to check
        :param fichierjson: path to the JSON reference file
        :return: the filtered list of comment lines
        """
        element_list = self.get_comment_element(statement)
        comment_lines = []
        content_list = []
        for i in element_list:
            if not (i == '' or i == '\n'):
                comment_lines.append(i)
                log.debug(i)
        log.debug(comment_lines)
        for element in comment_lines:
element = element.replace('\n', '')
# if element[0] == ' ':
# element = element.replace(' ', '', 1)
result = re.split(regex, element)
content_list.append(result)
with open(fichierjson) as jf:
data = json.loads(jf.read())
prof_data = data["profcomments"]
student_data = data["studentcomments"]
prof_pattern = zip(prof_data, content_list)
student_pattern = zip(student_data, content_list)
start = 0
for (i, l), (j, k) in zip(prof_pattern, student_pattern):
# print(i, l, ":", j, k)
present_prof = l[0].find(i)
present_student = k[0].find(j)
# print(present_student)
if present_prof == -1 and present_student == -1:
start = start + 1
if start == 3:
log.debug("No header comment found.")
elif present_prof != -1:
log.debug(i + ' found')
elif present_student != -1:
log.debug(j + ' found')
        return comment_lines
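# --- Illustration (not part of the original module) ---
# Sample run of the extraction regex above (names are examples only):
#
#   cv = CommentsValidator()
#   cv.get_comment_element("/* Author: Alice */\nSELECT 1; -- note\n")
#   # -> ['', 'Author: Alice', '', 'note']: comment bodies with '/', '*' and
#   #    '--' markers blanked out and surrounding whitespace stripped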
|
PypiClean
|
/Tensorforce-0.6.5.tar.gz/Tensorforce-0.6.5/tensorforce/agents/a2c.py
|
from collections import OrderedDict
from tensorforce import TensorforceError
from tensorforce.agents import TensorforceAgent
class AdvantageActorCritic(TensorforceAgent):
"""
[Advantage Actor-Critic](https://arxiv.org/abs/1602.01783) agent
(specification key: `a2c`).
Args:
states (specification): States specification
(<span style="color:#C00000"><b>required</b></span>, better implicitly specified via
`environment` argument for `Agent.create(...)`), arbitrarily nested dictionary of state
descriptions (usually taken from `Environment.states()`) with the following attributes:
<ul>
<li><b>type</b> (<i>"bool" | "int" | "float"</i>) – state data type
(<span style="color:#00C000"><b>default</b></span>: "float").</li>
<li><b>shape</b> (<i>int | iter[int]</i>) – state shape
(<span style="color:#C00000"><b>required</b></span>).</li>
<li><b>num_values</b> (<i>int > 0</i>) – number of discrete state values
(<span style="color:#C00000"><b>required</b></span> for type "int").</li>
<li><b>min_value/max_value</b> (<i>float</i>) – minimum/maximum state value
(<span style="color:#00C000"><b>optional</b></span> for type "float").</li>
</ul>
actions (specification): Actions specification
(<span style="color:#C00000"><b>required</b></span>, better implicitly specified via
`environment` argument for `Agent.create(...)`), arbitrarily nested dictionary of
action descriptions (usually taken from `Environment.actions()`) with the following
attributes:
<ul>
<li><b>type</b> (<i>"bool" | "int" | "float"</i>) – action data type
(<span style="color:#C00000"><b>required</b></span>).</li>
<li><b>shape</b> (<i>int > 0 | iter[int > 0]</i>) – action shape
(<span style="color:#00C000"><b>default</b></span>: scalar).</li>
<li><b>num_values</b> (<i>int > 0</i>) – number of discrete action values
(<span style="color:#C00000"><b>required</b></span> for type "int").</li>
<li><b>min_value/max_value</b> (<i>float</i>) – minimum/maximum action value
(<span style="color:#00C000"><b>optional</b></span> for type "float").</li>
</ul>
        max_episode_timesteps (int > 0): Upper bound for number of timesteps per episode
(<span style="color:#00C000"><b>default</b></span>: not given, better implicitly
specified via `environment` argument for `Agent.create(...)`).
batch_size (<a href="../modules/parameters.html">parameter</a>, int > 0): Number of
timesteps per update batch
(<span style="color:#C00000"><b>required</b></span>).
network ("auto" | specification): Policy network configuration, see the
[networks documentation](../modules/networks.html)
(<span style="color:#00C000"><b>default</b></span>: "auto", automatically configured
network).
use_beta_distribution (bool): Whether to use the Beta distribution for bounded continuous
actions by default.
(<span style="color:#00C000"><b>default</b></span>: false).
memory (int > 0): Batch memory capacity, has to fit at least maximum batch_size + maximum
network/estimator horizon + 1 timesteps
(<span style="color:#00C000"><b>default</b></span>: minimum capacity, usually does not
need to be changed).
update_frequency ("never" | <a href="../modules/parameters.html">parameter</a>, int > 0 | 0.0 < float <= 1.0):
Frequency of updates, relative to batch_size if float
(<span style="color:#00C000"><b>default</b></span>: batch_size).
learning_rate (<a href="../modules/parameters.html">parameter</a>, float > 0.0): Optimizer
learning rate
(<span style="color:#00C000"><b>default</b></span>: 1e-3).
horizon ("episode" | <a href="../modules/parameters.html">parameter</a>, int >= 0): Horizon
of discounted-sum reward estimation before critic estimate
(<span style="color:#00C000"><b>default</b></span>: 1).
discount (<a href="../modules/parameters.html">parameter</a>, 0.0 <= float <= 1.0): Discount
factor for future rewards of discounted-sum reward estimation
(<span style="color:#00C000"><b>default</b></span>: 0.99).
return_processing (specification): Return processing as layer or list of layers, see the
[preprocessing documentation](../modules/preprocessing.html)
(<span style="color:#00C000"><b>default</b></span>: no return processing).
advantage_processing (specification): Advantage processing as layer or list of layers, see
the [preprocessing documentation](../modules/preprocessing.html)
(<span style="color:#00C000"><b>default</b></span>: no advantage processing).
predict_terminal_values (bool): Whether to predict the value of terminal states, usually
not required since max_episode_timesteps terminals are handled separately
(<span style="color:#00C000"><b>default</b></span>: false).
reward_processing (specification): Reward preprocessing as layer or list of layers, see the
[preprocessing documentation](../modules/preprocessing.html)
(<span style="color:#00C000"><b>default</b></span>: no reward processing).
critic (specification): Critic network configuration, see the
[networks documentation](../modules/networks.html)
(<span style="color:#00C000"><b>default</b></span>: "auto").
critic_optimizer (float > 0.0 | specification): Critic optimizer configuration, see the
[optimizers documentation](../modules/optimizers.html), a float instead specifies a
custom weight for the critic loss
(<span style="color:#00C000"><b>default</b></span>: 1.0).
l2_regularization (<a href="../modules/parameters.html">parameter</a>, float >= 0.0):
L2 regularization loss weight
(<span style="color:#00C000"><b>default</b></span>: no L2 regularization).
entropy_regularization (<a href="../modules/parameters.html">parameter</a>, float >= 0.0):
Entropy regularization loss weight, to discourage the policy distribution from being
"too certain"
(<span style="color:#00C000"><b>default</b></span>: no entropy regularization).
state_preprocessing (dict[specification]): State preprocessing as layer or list of layers,
see the [preprocessing documentation](../modules/preprocessing.html),
specified per state-type or -name
(<span style="color:#00C000"><b>default</b></span>: linear normalization of bounded
float states to [-2.0, 2.0]).
exploration (<a href="../modules/parameters.html">parameter</a> | dict[<a href="../modules/parameters.html">parameter</a>], float >= 0.0):
Exploration, defined as the probability for uniformly random output in case of `bool`
and `int` actions, and the standard deviation of Gaussian noise added to every output in
case of `float` actions, specified globally or per action-type or -name
(<span style="color:#00C000"><b>default</b></span>: no exploration).
variable_noise (<a href="../modules/parameters.html">parameter</a>, float >= 0.0):
Add Gaussian noise with given standard deviation to all trainable variables, as
alternative exploration mechanism
(<span style="color:#00C000"><b>default</b></span>: no variable noise).<br/><br/>
>>>: For arguments below, see the [Tensorforce agent documentation](tensorforce.html).
parallel_interactions (int > 0)
config (specification)
saver (path | specification)
summarizer (path | specification)
tracking ("all" | iter[string])
recorder (path | specification)
"""
def __init__(
# Required
self, states, actions, batch_size,
# Environment
max_episode_timesteps=None,
# Network
network='auto', use_beta_distribution=False,
# Memory
memory='minimum',
# Optimization
update_frequency=1.0, learning_rate=1e-3,
# Reward estimation
horizon=1, discount=0.99, reward_processing=None, return_processing=None,
advantage_processing=None,
predict_terminal_values=False,
# Critic
critic='auto', critic_optimizer=1.0,
# Preprocessing
state_preprocessing='linear_normalization',
# Exploration
exploration=0.0, variable_noise=0.0,
# Regularization
l2_regularization=0.0, entropy_regularization=0.0,
# Parallel interactions
parallel_interactions=1,
# Config, saver, summarizer, tracking, recorder
config=None, saver=None, summarizer=None, tracking=None, recorder=None,
# Deprecated
**kwargs
):
if 'estimate_terminal' in kwargs:
raise TensorforceError.deprecated(
name='A2C', argument='estimate_terminal', replacement='predict_terminal_values'
)
if 'critic_network' in kwargs:
raise TensorforceError.deprecated(
name='A2C', argument='critic_network', replacement='critic'
)
self.spec = OrderedDict(
agent='a2c',
states=states, actions=actions, batch_size=batch_size,
max_episode_timesteps=max_episode_timesteps,
network=network, use_beta_distribution=use_beta_distribution,
memory=memory,
update_frequency=update_frequency, learning_rate=learning_rate,
horizon=horizon, discount=discount, return_processing=return_processing,
advantage_processing=advantage_processing,
predict_terminal_values=predict_terminal_values,
critic=critic, critic_optimizer=critic_optimizer,
state_preprocessing=state_preprocessing,
exploration=exploration, variable_noise=variable_noise,
l2_regularization=l2_regularization, entropy_regularization=entropy_regularization,
parallel_interactions=parallel_interactions,
config=config, saver=saver, summarizer=summarizer, tracking=tracking, recorder=recorder
)
policy = dict(
type='parametrized_distributions', network=network, temperature=1.0,
use_beta_distribution=use_beta_distribution
)
if memory == 'minimum':
memory = dict(type='recent')
else:
memory = dict(type='recent', capacity=memory)
update = dict(unit='timesteps', batch_size=batch_size, frequency=update_frequency)
optimizer = dict(type='adam', learning_rate=learning_rate)
objective = 'policy_gradient'
reward_estimation = dict(
horizon=horizon, discount=discount, predict_horizon_values='early',
estimate_advantage=True, predict_action_values=False,
reward_processing=reward_processing, return_processing=return_processing,
predict_terminal_values=predict_terminal_values
)
baseline = dict(type='parametrized_state_value', network=critic)
baseline_objective = dict(type='state_value')
super().__init__(
# Agent
states=states, actions=actions, max_episode_timesteps=max_episode_timesteps,
parallel_interactions=parallel_interactions, config=config, recorder=recorder,
# TensorforceModel
policy=policy, memory=memory, update=update, optimizer=optimizer, objective=objective,
reward_estimation=reward_estimation,
baseline=baseline, baseline_optimizer=critic_optimizer,
baseline_objective=baseline_objective,
l2_regularization=l2_regularization, entropy_regularization=entropy_regularization,
state_preprocessing=state_preprocessing,
exploration=exploration, variable_noise=variable_noise,
saver=saver, summarizer=summarizer, tracking=tracking, **kwargs
)
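# --- Illustration (not part of the original module) ---
# Minimal construction sketch (the Gym environment name is illustrative):
#
#   from tensorforce import Agent, Environment
#   environment = Environment.create(environment='gym', level='CartPole-v1')
#   agent = Agent.create(agent='a2c', environment=environment, batch_size=10)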
|
PypiClean
|
/urbanaccess-0.2.2.tar.gz/urbanaccess-0.2.2/CHANGELOG.rst
|
v0.2.2
======
2020/11/09
* allows passing matplotlib axes to urbanaccess.plot.plot_net()
* adds flexibility to calendar/date handling (calendar_dates.txt now supported)
* improves GTFS downloading (solves issue where requests were rejected due to missing user agent header)
* improves text encoding support
v0.2.1
======
2020/08/28
* Support for GeoPy 2.0+
* Support for Pandas 1.0+
v0.2.0
======
2018/11/02
* Python 3 compatibility (preserving compatibility with Python 2)
* Updated demo for cross compatibility
* Added Python 3 to travis
* Support latest version of Pandana v0.4.1 (preserving compatibility with prior versions)
* Removed integration test using Great Britain dataset as data no longer available for download
* Updated readme and docs
v0.1.0
======
2017/08/02
* Initial release
|
PypiClean
|
/Pylouis-0.0.1-py3-none-any.whl/louis/__init__.py
|
from __future__ import unicode_literals
from ctypes import *
import atexit
import sys
# Some general utility functions
def _createTablesString(tablesList):
"""Creates a tables string for liblouis calls"""
return b",".join([x.encode(filesystemencoding) if isinstance(x, str) else bytes(x) for x in tablesList])
def _createTypeformbuf(length, typeform=None):
"""Creates a typeform buffer for liblouis calls"""
return (c_ushort*length)(*typeform) if typeform else (c_ushort*length)()
ENCODING_ERROR_HANDLER = None
createEncodedByteString = None
if sys.version_info.major == 2:
ENCODING_ERROR_HANDLER = "strict"
createEncodedByteString = lambda x: unicode(x).encode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
else: # sys.version_info[0] == 3
ENCODING_ERROR_HANDLER = "surrogatepass"
createEncodedByteString = lambda x: str(x).encode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
try:
# Native win32
_loader = windll
_functype = WINFUNCTYPE
except NameError:
# Unix/Cygwin
_loader = cdll
_functype = CFUNCTYPE
liblouis = _loader["liblouis.dll"]
atexit.register(liblouis.lou_free)
liblouis.lou_version.restype = c_char_p
liblouis.lou_charSize.restype = c_int
liblouis.lou_translateString.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_int), POINTER(c_char),
POINTER(c_int), POINTER(c_ushort), POINTER(c_char), c_int)
liblouis.lou_translate.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_int), POINTER(c_char),
POINTER(c_int), POINTER(c_ushort), POINTER(c_char),
POINTER(c_int), POINTER(c_int), POINTER(c_int), c_int)
liblouis.lou_backTranslateString.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_int), POINTER(c_char),
POINTER(c_int), POINTER(c_ushort), POINTER(c_char), c_int)
liblouis.lou_backTranslate.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_int), POINTER(c_char), POINTER(c_int),
POINTER(c_ushort), POINTER(c_char), POINTER(c_int), POINTER(c_int),
POINTER(c_int), c_int)
liblouis.lou_hyphenate.argtypes = (
c_char_p, POINTER(c_char), c_int, POINTER(c_char), c_int)
liblouis.lou_checkTable.argtypes = (c_char_p,)
liblouis.lou_compileString.argtypes = (c_char_p, c_char_p)
liblouis.lou_getTypeformForEmphClass.argtypes = (c_char_p, c_char_p)
liblouis.lou_dotsToChar.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_char),
c_int, c_int)
liblouis.lou_charToDots.argtypes = (
c_char_p, POINTER(c_char), POINTER(c_char),
c_int, c_int)
LogCallback = _functype(None, c_int, c_char_p)
liblouis.lou_registerLogCallback.restype = None
liblouis.lou_setLogLevel.restype = None
liblouis.lou_setLogLevel.argtypes = (c_int,)
#{ Module Configuration
#: Specifies the charSize (in bytes) used by liblouis.
#: This is fetched once using L{liblouis.lou_charSize}.
#: Call it directly, since L{charSize} is not yet defined.
#: @type: int
wideCharBytes = liblouis.lou_charSize()
#: Specifies the number by which the input length should be multiplied
#: to calculate the maximum output length.
#: @type: int
# This default will handle the case where every input character is
# undefined in the translation table.
outlenMultiplier = 4 + wideCharBytes * 2
#: Specifies the encoding to use when encode/decode filenames
#: @type: str
filesystemencoding = sys.getfilesystemencoding()
#: Specifies the encoding to use when converting from byte strings to unicode strings.
#: @type: str
conversionEncoding = "utf_%d_le" % (wideCharBytes * 8)
#}
def version():
"""Obtain version information for liblouis.
@return: The version of liblouis, plus other information, such as
the release date and perhaps notable changes.
@rtype: str
"""
return liblouis.lou_version().decode("ASCII")
def charSize():
"""Obtain charSize information for liblouis.
@return: The size of the widechar with which liblouis was compiled.
@rtype: int
"""
return liblouis.lou_charSize()
def translate(tableList, inbuf, typeform=None, cursorPos=0, mode=0):
"""Translate a string of characters, providing position information.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: The string to translate.
@type inbuf: str
@param typeform: A list of typeform constants indicating the typeform for each position in inbuf,
C{None} for no typeform information.
@type typeform: list of int
@param cursorPos: The position of the cursor in inbuf.
@type cursorPos: int
@param mode: The translation mode; add multiple values for a combined mode.
@type mode: int
@return: A tuple of: the translated string,
a list of input positions for each position in the output,
a list of output positions for each position in the input, and
the position of the cursor in the output.
@rtype: (str, list of int, list of int, int)
@raise RuntimeError: If a complete translation could not be done.
@see: lou_translate in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
inlen = c_int(len(inbuf) // wideCharBytes)
outlen = c_int(inlen.value*outlenMultiplier)
outbuf = create_string_buffer(outlen.value * wideCharBytes)
typeformbuf = None
if typeform:
typeformbuf = _createTypeformbuf(outlen.value, typeform)
inPos = (c_int*outlen.value)()
outPos = (c_int*inlen.value)()
cursorPos = c_int(cursorPos)
if not liblouis.lou_translate(tablesString, inbuf, byref(inlen),
outbuf, byref(outlen), typeformbuf,
None, outPos, inPos, byref(cursorPos), mode):
raise RuntimeError("Can't translate: tables %s, inbuf %s, typeform %s, cursorPos %s, mode %s"%(tableList, inbuf, typeform, cursorPos, mode))
if isinstance(typeform, list):
typeform[:] = list(typeformbuf)
return outbuf.raw[:outlen.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER), inPos[:outlen.value], outPos[:inlen.value], cursorPos.value
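# Example (hedged; table names depend on the local liblouis installation):
#
#   braille, inPos, outPos, cursor = translate(["en-us-g2.ctb"], "Hello world")
#   # braille is the contracted text; inPos/outPos map positions both ways.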
def translateString(tableList, inbuf, typeform=None, mode=0):
"""Translate a string of characters.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: The string to translate.
@type inbuf: str
@param typeform: A list of typeform constants indicating the typeform for each position in inbuf,
C{None} for no typeform information.
@type typeform: list of int
@param mode: The translation mode; add multiple values for a combined mode.
@type mode: int
@return: The translated string.
@rtype: str
@raise RuntimeError: If a complete translation could not be done.
@see: lou_translateString in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
inlen = c_int(len(inbuf) // wideCharBytes)
outlen = c_int(inlen.value*outlenMultiplier)
outbuf = create_string_buffer(outlen.value * wideCharBytes)
typeformbuf = None
if typeform:
typeformbuf = _createTypeformbuf(outlen.value, typeform)
if not liblouis.lou_translateString(tablesString, inbuf, byref(inlen),
outbuf, byref(outlen), typeformbuf,
None, mode):
raise RuntimeError("Can't translate: tables %s, inbuf %s, typeform %s, mode %s"%(tableList, inbuf, typeform, mode))
if isinstance(typeform, list):
typeform[:] = list(typeformbuf)
return outbuf.raw[:outlen.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
def backTranslate(tableList, inbuf, typeform=None, cursorPos=0, mode=0):
"""Back translates a string of characters, providing position information.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: Braille to back translate.
@type inbuf: str
@param typeform: List where typeform constants will be placed.
@type typeform: list
@param cursorPos: Position of cursor.
@type cursorPos: int
@param mode: Translation mode.
@type mode: int
@return: A tuple: A string of the back translation,
a list of input positions for each position in the output,
a list of the output positions for each position in the input and
the position of the cursor in the output.
@rtype: (str, list of int, list of int, int)
@raise RuntimeError: If a complete back translation could not be done.
@see: lou_backTranslate in the liblouis documentation.
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
inlen = c_int(len(inbuf) // wideCharBytes)
outlen = c_int(inlen.value * outlenMultiplier)
outbuf = create_string_buffer(outlen.value * wideCharBytes)
typeformbuf = None
if isinstance(typeform, list):
typeformbuf = _createTypeformbuf(outlen.value)
inPos = (c_int*outlen.value)()
outPos = (c_int*inlen.value)()
cursorPos = c_int(cursorPos)
if not liblouis.lou_backTranslate(tablesString, inbuf, byref(inlen),
outbuf, byref(outlen), typeformbuf, None,
outPos, inPos, byref(cursorPos), mode):
raise RuntimeError("Can't back translate: tables %s, inbuf %s, typeform %s, cursorPos %d, mode %d"%(tableList, inbuf, typeform, cursorPos, mode))
if isinstance(typeform, list):
typeform[:] = list(typeformbuf)
return outbuf.raw[:outlen.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER), inPos[:outlen.value], outPos[:inlen.value], cursorPos.value
def backTranslateString(tableList, inbuf, typeform=None, mode=0):
"""Back translate from Braille.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: The Braille to back translate.
@type inbuf: str
@param typeform: List for typeform constants to be put in.
If you don't want typeform data then give None
@type typeform: list
@param mode: The translation mode
@type mode: int
@return: The back translation of inbuf.
@rtype: str
@raise RuntimeError: If a complete back translation could not be done.
@see: lou_backTranslateString in the liblouis documentation.
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
inlen = c_int(len(inbuf) // wideCharBytes)
outlen = c_int(inlen.value * outlenMultiplier)
outbuf = create_string_buffer(outlen.value * wideCharBytes)
typeformbuf = None
if isinstance(typeform, list):
typeformbuf = _createTypeformbuf(outlen.value)
if not liblouis.lou_backTranslateString(tablesString, inbuf, byref(inlen), outbuf,
byref(outlen), typeformbuf, None, mode):
raise RuntimeError("Can't back translate: tables %s, inbuf %s, mode %d"%(tableList, inbuf, mode))
if isinstance(typeform, list):
typeform[:] = list(typeformbuf)
return outbuf.raw[:outlen.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
def hyphenate(tableList, inbuf, mode=0):
"""Get information for hyphenation.
@param tableList: A list of translation tables and hyphenation
dictionaries.
@type tableList: list of str
@param inbuf: The text to get hyphenation information about.
This should be a single word and leading/trailing whitespace
and punctuation is ignored.
@type inbuf: str
@param mode: Lets liblouis know if inbuf is plain text or Braille.
    Set to 0 for text and any other value for Braille.
@type mode: int
@return: A string with '1' at the beginning of every syllable
and '0' elsewhere.
@rtype: str
@raise RuntimeError: If hyphenation data could not be produced.
@see: lou_hyphenate in the liblouis documentation.
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
inlen = c_int(len(inbuf) // wideCharBytes)
hyphen_string = create_string_buffer(inlen.value + 1)
if not liblouis.lou_hyphenate(tablesString, inbuf, inlen, hyphen_string, mode):
raise RuntimeError("Can't hyphenate: tables %s, inbuf %s, mode %d"%(tableList, inbuf, mode))
return hyphen_string.value.decode("ASCII")
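# Example (hedged; table names depend on the local liblouis installation):
#
#   hyphenate(["en-us-g1.ctb", "hyph_en_US.dic"], "hyphenation")
#   # -> a string of '0'/'1' flags, one per character, '1' at syllable starts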
def checkTable(tableList):
"""Check if the specified tables can be found and compiled.
This can be used to check if a list of tables contains errors
before sending it to other liblouis functions
that accept a list of tables.
@param tableList: A list of translation tables.
@type tableList: list of str
@raise RuntimeError: If compilation failed.
@see: lou_checkTable in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
if not liblouis.lou_checkTable(tablesString):
raise RuntimeError("Can't compile: tables %s"%tableList)
def compileString(tableList, inString):
"""Compile a table entry on the fly at run-time.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inString: The table entry to be added.
@type inString: str
@raise RuntimeError: If compilation of the entry failed.
@see: lou_compileString in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
inBytes = inString.encode("ASCII") if isinstance(inString, str) else bytes(inString)
    if not liblouis.lou_compileString(tablesString, inBytes):
raise RuntimeError("Can't compile entry: tables %s, inString %s"%(tableList, inString))
def getTypeformForEmphClass(tableList, emphClass):
"""Get the typeform bit for the named emphasis class.
@param tableList: A list of translation tables.
@type tableList: list of str
@param emphClass: An emphasis class name.
@type emphClass: str
@see: lou_getTypeformForEmphClass in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
    return liblouis.lou_getTypeformForEmphClass(tablesString, emphClass.encode("ASCII"))
def dotsToChar(tableList, inbuf):
""""Convert a string of dot patterns to a string of characters according to the specifications in tableList.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: a string of dot patterns, either in liblouis format or Unicode braille.
@type inbuf: str
@raise RuntimeError: If a complete conversion could not be done.
@see: lou_dotsToChar in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
length = c_int(len(inbuf) // wideCharBytes)
outbuf = create_string_buffer(length.value * wideCharBytes)
if not liblouis.lou_dotsToChar(tablesString, inbuf, outbuf, length, 0):
raise RuntimeError("Can't convert dots to char: tables %s, inbuf %s"%(tableList, inbuf))
return outbuf.raw[:length.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
def charToDots(tableList, inbuf, mode=0):
""""Convert a string of characterss to a string of dot patterns according to the specifications in tableList.
@param tableList: A list of translation tables.
@type tableList: list of str
@param inbuf: a string of characters.
@type inbuf: str
@param mode: The translation mode; add multiple values for a combined mode.
@type mode: int
@raise RuntimeError: If a complete conversion could not be done.
    @see: lou_charToDots in the liblouis documentation
"""
tablesString = _createTablesString(tableList)
inbuf = createEncodedByteString(inbuf)
length = c_int(len(inbuf) // wideCharBytes)
outbuf = create_string_buffer(length.value * wideCharBytes)
if not liblouis.lou_charToDots(tablesString, inbuf, outbuf, length, mode):
raise RuntimeError("Can't convert char to dots: tables %s, inbuf %s, mode %d"%(tableList, inbuf, mode))
return outbuf.raw[:length.value * wideCharBytes].decode(conversionEncoding, errors=ENCODING_ERROR_HANDLER)
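# A round-trip sketch (the table name is an assumption and must exist in the
# local liblouis installation):
#   cells = charToDots(["en-us-g1.ctb"], "hello")
#   text = dotsToChar(["en-us-g1.ctb"], cells)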
def registerLogCallback(logCallback):
"""Register logging callbacks.
Set to C{None} for default callback.
@param logCallback: The callback to use.
The callback must take two arguments:
@param level: The log level on which a message is logged.
@type level: int
@param message: The logged message.
Note that the callback should provide its own ASCII decoding routine.
@type message: bytes
Example callback:
@louis.LogCallback
def incomingLouisLog(level, message):
print("Message %s logged at level %d" % (message.decode("ASCII"), level))
@type logCallback: L{LogCallback}
"""
if logCallback is not None and not isinstance(logCallback, LogCallback):
raise TypeError("logCallback should be of type {} or NoneType".format(LogCallback.__name__))
return liblouis.lou_registerLogCallback(logCallback)
def setLogLevel(level):
"""Set the level for logging callback to be called at.
@param level: one of the C{LOG_*} constants.
@type level: int
@raise ValueError: If an invalid log level is provided.
"""
if level not in logLevels:
raise ValueError("Level %d is an invalid log level"%level)
return liblouis.lou_setLogLevel(level)
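# A logging sketch, assuming the LogCallback ctypes prototype defined earlier
# in this module:
#   @LogCallback
#   def _onLouisLog(level, message):
#       print("liblouis %d: %s" % (level, message.decode("ASCII")))
#   registerLogCallback(_onLouisLog)
#   setLogLevel(LOG_WARN)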
#{ Typeforms
plain_text = 0x0000
emph_1 = comp_emph_1 = italic = 0x0001
emph_2 = comp_emph_2 = underline = 0x0002
emph_3 = comp_emph_3 = bold = 0x0004
emph_4 = 0x0008
emph_5 = 0x0010
emph_6 = 0x0020
emph_7 = 0x0040
emph_8 = 0x0080
emph_9 = 0x0100
emph_10 = 0x0200
computer_braille = 0x0400
no_translate = 0x0800
no_contract = 0x1000
#}
#{ Translation modes
noContractions = 1
compbrlAtCursor = 2
dotsIO = 4
compbrlLeftCursor = 32
ucBrl = 64
noUndefined = 128
noUndefinedDots = noUndefined  # alias for backward compatibility
partialTrans = 256
#}
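# Translation modes are bit flags and may be combined by OR-ing (or adding)
# them together, e.g. a sketch: mode = dotsIO | ucBrl requests Unicode
# braille dot patterns as output.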
#{ logLevels
LOG_ALL = 0
LOG_DEBUG = 10000
LOG_INFO = 20000
LOG_WARN = 30000
LOG_ERROR = 40000
LOG_FATAL = 50000
LOG_OFF = 60000
#}
logLevels = (LOG_ALL, LOG_DEBUG, LOG_INFO, LOG_WARN, LOG_ERROR, LOG_FATAL, LOG_OFF)
if __name__ == '__main__':
# Just some common tests.
print(version())
print(translate([b'../tables/en-us-g2.ctb'], 'Hello world!', cursorPos=5))
/garage-2021.3.0.tar.gz/garage-2021.3.0/docs/user/environment_libraries.md
# Environment Libraries
Garage supports a variety of external environment libraries for different RL
training purposes. This section introduces the environment libraries
supported by garage and how to work with them.
## OpenAI `gym`
OpenAI's [gym](https://github.com/openai/gym) comes with the default garage
installation. To use a `gym` environment, you should wrap it with
`garage.envs.GymEnv`.
For example:
```
import gym
from garage.envs import GymEnv
env = GymEnv(gym.make('CarRacing-v0'))
```
The `GymEnv` wrapper is required because it adapts `gym` environments
to garage's `Environment` API.
Find more about the `Environment` API [here](implement_env).
Note that `GymEnv` can also take a string
argument and create the `gym` environment for you. In fact, this is the
**preferred** way to create a `gym` environment in garage.
For example:
```
from garage.envs import GymEnv
env = GymEnv('CarRacing-v0')
```
## DeepMind `dm_control`
DeepMind's [dm_control](https://github.com/deepmind/dm_control) can be
installed via `pip install 'garage[mujoco,dm_control]'`. Check out the
[installation guide](installation) for details about setting up mujoco
dependencies for `dm_control`.
`dm_control` environments are wrapped by `garage.envs.dm_control.DMControlEnv`:
```
from garage.envs.dm_control import DMControlEnv
env = DMControlEnv.from_suite(domain_name, task_name)
```
Note that `DMControlEnv.from_suite()` is a convenience (and preferred) function
that returns a `DMControlEnv` wrapping a `dm_control` environment with the
given domain and task name.
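For instance, a minimal sketch (assuming the standard `dm_control` suite's
`cartpole` domain and `balance` task are available):

```
from garage.envs.dm_control import DMControlEnv

env = DMControlEnv.from_suite('cartpole', 'balance')
```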
## MetaWorld
[MetaWorld](https://github.com/rlworkgroup/metaworld) provides benchmark
environments for meta- and multi-task reinforcement learning. Since MetaWorld
environments implement the `gym` environment interface, they can be wrapped by
`garage.envs.GymEnv` as well.
```
from metaworld.benchmarks import MT10
from garage.envs import GymEnv
task = MT10.get_train_tasks().all_task_names[0]
env = GymEnv(MT10.from_task(task))
```
## PyBullet
[PyBullet](https://github.com/bulletphysics/bullet3/tree/master/examples/pybullet)
provides environments supported by the [Bullet Physics SDK](https://github.com/bulletphysics/bullet3).
`PyBullet` dependencies can be installed via `pip install 'garage[bullet]'`.
`PyBullet` environments are wrapped by `garage.envs.bullet.BulletEnv`.
```
from garage.envs.bullet import BulletEnv
env = BulletEnv('KukaCamBulletEnv-v0')
```
Note that since `PyBullet` environments implement the `gym` environment
interface, they can be wrapped by `garage.envs.GymEnv` as well. In this case,
`GymEnv` will return a `BulletEnv` instance. For example:
```
from garage.envs import GymEnv
env = GymEnv('KukaCamBulletEnv-v0')
# type(env) == BulletEnv
```
## More Environment Libraries?
Check out [Adding a new Environment](implement_env) to find out how to
create your own environment wrapper.
----
*This page was authored by Eric Yihan Chen
([@AiRuiChen](https://github.com/AiRuiChen)).*
/openimis-be-product-1.5.1.tar.gz/openimis-be-product-1.5.1/product/migrations/0001_initial.py
import core.fields
import datetime
from django.db import migrations, models
import django.db.models.deletion
import product.models
import uuid
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Product',
fields=[
('validity_from', core.fields.DateTimeField(db_column='ValidityFrom', default=datetime.datetime.now)),
('validity_to', core.fields.DateTimeField(blank=True, db_column='ValidityTo', null=True)),
('legacy_id', models.IntegerField(blank=True, db_column='LegacyID', null=True)),
('id', models.AutoField(db_column='ProdID', primary_key=True, serialize=False)),
('uuid', models.CharField(db_column='ProdUUID', default=uuid.uuid4, max_length=36, unique=True)),
('code', models.CharField(db_column='ProductCode', max_length=8)),
('name', models.CharField(db_column='ProductName', max_length=100)),
('date_from', models.DateTimeField(db_column='DateFrom')),
('date_to', models.DateTimeField(db_column='DateTo')),
('insurance_period', models.SmallIntegerField(db_column='InsurancePeriod')),
('administration_period', models.IntegerField(blank=True, db_column='AdministrationPeriod', null=True)),
('lump_sum', models.DecimalField(db_column='LumpSum', decimal_places=2, max_digits=18)),
('max_members', models.SmallIntegerField(db_column='MemberCount')),
('max_installments', models.IntegerField(blank=True, db_column='MaxInstallments', null=True)),
('threshold', models.IntegerField(blank=True, db_column='Threshold', null=True)),
('recurrence', models.IntegerField(blank=True, db_column='Recurrence', null=True)),
('premium_adult', models.DecimalField(blank=True, db_column='PremiumAdult', decimal_places=2, max_digits=18, null=True)),
('premium_child', models.DecimalField(blank=True, db_column='PremiumChild', decimal_places=2, max_digits=18, null=True)),
('ded_insuree', models.DecimalField(blank=True, db_column='DedInsuree', decimal_places=2, max_digits=18, null=True)),
('ded_op_insuree', models.DecimalField(blank=True, db_column='DedOPInsuree', decimal_places=2, max_digits=18, null=True)),
('ded_ip_insuree', models.DecimalField(blank=True, db_column='DedIPInsuree', decimal_places=2, max_digits=18, null=True)),
('max_insuree', models.DecimalField(blank=True, db_column='MaxInsuree', decimal_places=2, max_digits=18, null=True)),
('max_op_insuree', models.DecimalField(blank=True, db_column='MaxOPInsuree', decimal_places=2, max_digits=18, null=True)),
('max_ip_insuree', models.DecimalField(blank=True, db_column='MaxIPInsuree', decimal_places=2, max_digits=18, null=True)),
('period_rel_prices', models.CharField(blank=True, db_column='PeriodRelPrices', max_length=1, null=True)),
('period_rel_prices_op', models.CharField(blank=True, db_column='PeriodRelPricesOP', max_length=1, null=True)),
('period_rel_prices_ip', models.CharField(blank=True, db_column='PeriodRelPricesIP', max_length=1, null=True)),
('acc_code_premiums', models.CharField(blank=True, db_column='AccCodePremiums', max_length=25, null=True)),
('acc_code_remuneration', models.CharField(blank=True, db_column='AccCodeRemuneration', max_length=25, null=True)),
('ded_treatment', models.DecimalField(blank=True, db_column='DedTreatment', decimal_places=2, max_digits=18, null=True)),
('ded_op_treatment', models.DecimalField(blank=True, db_column='DedOPTreatment', decimal_places=2, max_digits=18, null=True)),
('ded_ip_treatment', models.DecimalField(blank=True, db_column='DedIPTreatment', decimal_places=2, max_digits=18, null=True)),
('max_treatment', models.DecimalField(blank=True, db_column='MaxTreatment', decimal_places=2, max_digits=18, null=True)),
('max_op_treatment', models.DecimalField(blank=True, db_column='MaxOPTreatment', decimal_places=2, max_digits=18, null=True)),
('max_ip_treatment', models.DecimalField(blank=True, db_column='MaxIPTreatment', decimal_places=2, max_digits=18, null=True)),
('ded_policy', models.DecimalField(blank=True, db_column='DedPolicy', decimal_places=2, max_digits=18, null=True)),
('ded_op_policy', models.DecimalField(blank=True, db_column='DedOPPolicy', decimal_places=2, max_digits=18, null=True)),
('ded_ip_policy', models.DecimalField(blank=True, db_column='DedIPPolicy', decimal_places=2, max_digits=18, null=True)),
('max_policy', models.DecimalField(blank=True, db_column='MaxPolicy', decimal_places=2, max_digits=18, null=True)),
('max_op_policy', models.DecimalField(blank=True, db_column='MaxOPPolicy', decimal_places=2, max_digits=18, null=True)),
('max_ip_policy', models.DecimalField(blank=True, db_column='MaxIPPolicy', decimal_places=2, max_digits=18, null=True)),
('audit_user_id', models.IntegerField(db_column='AuditUserID')),
('grace_period_enrolment', models.IntegerField(db_column='GracePeriod')),
('grace_period_payment', models.IntegerField(blank=True, db_column='WaitingPeriod', null=True)),
('grace_period_renewal', models.IntegerField(blank=True, db_column='GracePeriodRenewal', null=True)),
('registration_lump_sum', models.DecimalField(blank=True, db_column='RegistrationLumpSum', decimal_places=2, max_digits=18, null=True)),
('registration_fee', models.DecimalField(blank=True, db_column='RegistrationFee', decimal_places=2, max_digits=18, null=True)),
('general_assembly_lump_sum', models.DecimalField(blank=True, db_column='GeneralAssemblyLumpSum', decimal_places=2, max_digits=18, null=True)),
('general_assembly_fee', models.DecimalField(blank=True, db_column='GeneralAssemblyFee', decimal_places=2, max_digits=18, null=True)),
('start_cycle_1', models.CharField(blank=True, db_column='StartCycle1', max_length=5, null=True)),
('start_cycle_2', models.CharField(blank=True, db_column='StartCycle2', max_length=5, null=True)),
('start_cycle_3', models.CharField(blank=True, db_column='StartCycle3', max_length=5, null=True)),
('start_cycle_4', models.CharField(blank=True, db_column='StartCycle4', max_length=5, null=True)),
('max_no_consultation', models.IntegerField(blank=True, db_column='MaxNoConsultation', null=True)),
('max_no_surgery', models.IntegerField(blank=True, db_column='MaxNoSurgery', null=True)),
('max_no_delivery', models.IntegerField(blank=True, db_column='MaxNoDelivery', null=True)),
('max_no_hospitalization', models.IntegerField(blank=True, db_column='MaxNoHospitalizaion', null=True)),
('max_no_visits', models.IntegerField(blank=True, db_column='MaxNoVisits', null=True)),
('max_amount_consultation', models.DecimalField(blank=True, db_column='MaxAmountConsultation', decimal_places=2, max_digits=18, null=True)),
('max_amount_surgery', models.DecimalField(blank=True, db_column='MaxAmountSurgery', decimal_places=2, max_digits=18, null=True)),
('max_amount_delivery', models.DecimalField(blank=True, db_column='MaxAmountDelivery', decimal_places=2, max_digits=18, null=True)),
('max_amount_hospitalization', models.DecimalField(blank=True, db_column='MaxAmountHospitalization', decimal_places=2, max_digits=18, null=True)),
('renewal_discount_perc', models.IntegerField(blank=True, db_column='RenewalDiscountPerc', null=True)),
('renewal_discount_period', models.IntegerField(blank=True, db_column='RenewalDiscountPeriod', null=True)),
('enrolment_discount_perc', models.IntegerField(blank=True, db_column='EnrolmentDiscountPerc', null=True)),
('enrolment_discount_period', models.IntegerField(blank=True, db_column='EnrolmentDiscountPeriod', null=True)),
('share_contribution', models.DecimalField(blank=True, db_column='ShareContribution', decimal_places=2, max_digits=5, null=True)),
('max_policy_extra_member', models.DecimalField(blank=True, db_column='MaxPolicyExtraMember', decimal_places=2, max_digits=18, null=True)),
('max_policy_extra_member_ip', models.DecimalField(blank=True, db_column='MaxPolicyExtraMemberIP', decimal_places=2, max_digits=18, null=True)),
('max_policy_extra_member_op', models.DecimalField(blank=True, db_column='MaxPolicyExtraMemberOP', decimal_places=2, max_digits=18, null=True)),
('max_ceiling_policy', models.DecimalField(blank=True, db_column='MaxCeilingPolicy', decimal_places=2, max_digits=18, null=True)),
('max_ceiling_policy_ip', models.DecimalField(blank=True, db_column='MaxCeilingPolicyIP', decimal_places=2, max_digits=18, null=True)),
('max_ceiling_policy_op', models.DecimalField(blank=True, db_column='MaxCeilingPolicyOP', decimal_places=2, max_digits=18, null=True)),
('max_amount_antenatal', models.DecimalField(blank=True, db_column='MaxAmountAntenatal', decimal_places=2, max_digits=18, null=True)),
('max_no_antenatal', models.IntegerField(blank=True, db_column='MaxNoAntenatal', null=True)),
('capitation_level_1', models.CharField(blank=True, db_column='Level1', max_length=1, null=True)),
('capitation_level_2', models.CharField(blank=True, db_column='Level2', max_length=1, null=True)),
('capitation_level_3', models.CharField(blank=True, db_column='Level3', max_length=1, null=True)),
('capitation_level_4', models.CharField(blank=True, db_column='Level4', max_length=1, null=True)),
('weight_population', models.DecimalField(blank=True, db_column='WeightPopulation', decimal_places=2, max_digits=5, null=True)),
('weight_nb_families', models.DecimalField(blank=True, db_column='WeightNumberFamilies', decimal_places=2, max_digits=5, null=True)),
('weight_insured_population', models.DecimalField(blank=True, db_column='WeightInsuredPopulation', decimal_places=2, max_digits=5, null=True)),
('weight_nb_insured_families', models.DecimalField(blank=True, db_column='WeightNumberInsuredFamilies', decimal_places=2, max_digits=5, null=True)),
('weight_nb_visits', models.DecimalField(blank=True, db_column='WeightNumberVisits', decimal_places=2, max_digits=5, null=True)),
('weight_adjusted_amount', models.DecimalField(blank=True, db_column='WeightAdjustedAmount', decimal_places=2, max_digits=5, null=True)),
('capitation_sublevel_1', models.ForeignKey(blank=True, db_column='Sublevel1', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='location.healthfacilitysublevel')),
('capitation_sublevel_2', models.ForeignKey(blank=True, db_column='Sublevel2', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='location.healthfacilitysublevel')),
('capitation_sublevel_3', models.ForeignKey(blank=True, db_column='Sublevel3', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='location.healthfacilitysublevel')),
('capitation_sublevel_4', models.ForeignKey(blank=True, db_column='Sublevel4', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='location.healthfacilitysublevel')),
('conversion_product', models.ForeignKey(blank=True, db_column='ConversionProdID', null=True, on_delete=django.db.models.deletion.DO_NOTHING, to='product.product')),
('location', models.ForeignKey(blank=True, db_column='LocationId', null=True, on_delete=django.db.models.deletion.DO_NOTHING, to='location.location')),
('ceiling_interpretation', models.CharField(blank=True, choices=[('I', 'Claim Type'), ('H', 'Health Facility Type')], db_column='CeilingInterpretation', max_length=1, null=True)),
],
options={
'db_table': 'tblProduct',
'managed': False,
},
),
migrations.CreateModel(
name='ProductItem',
fields=[
('validity_from', core.fields.DateTimeField(db_column='ValidityFrom', default=datetime.datetime.now)),
('validity_to', core.fields.DateTimeField(blank=True, db_column='ValidityTo', null=True)),
('legacy_id', models.IntegerField(blank=True, db_column='LegacyID', null=True)),
('id', models.AutoField(db_column='ProdItemID', primary_key=True, serialize=False)),
('limitation_type', models.CharField(blank=True, db_column='LimitationType', max_length=1, null=True)),
('price_origin', models.CharField(blank=True, db_column='PriceOrigin', max_length=1, null=True)),
('limit_adult', models.DecimalField(blank=True, db_column='LimitAdult', decimal_places=2, max_digits=18, null=True)),
('limit_child', models.DecimalField(blank=True, db_column='LimitChild', decimal_places=2, max_digits=18, null=True)),
('waiting_period_adult', models.IntegerField(blank=True, db_column='WaitingPeriodAdult', null=True)),
('waiting_period_child', models.IntegerField(blank=True, db_column='WaitingPeriodChild', null=True)),
('limit_no_adult', models.IntegerField(blank=True, db_column='LimitNoAdult', null=True)),
('limit_no_child', models.IntegerField(blank=True, db_column='LimitNoChild', null=True)),
('limitation_type_r', models.CharField(blank=True, db_column='LimitationTypeR', max_length=1, null=True)),
('limitation_type_e', models.CharField(blank=True, db_column='LimitationTypeE', max_length=1, null=True)),
('limit_adult_r', models.DecimalField(blank=True, db_column='LimitAdultR', decimal_places=2, max_digits=18, null=True)),
('limit_adult_e', models.DecimalField(blank=True, db_column='LimitAdultE', decimal_places=2, max_digits=18, null=True)),
('limit_child_r', models.DecimalField(blank=True, db_column='LimitChildR', decimal_places=2, max_digits=18, null=True)),
('limit_child_e', models.DecimalField(blank=True, db_column='LimitChildE', decimal_places=2, max_digits=18, null=True)),
('ceiling_exclusion_adult', models.CharField(blank=True, db_column='CeilingExclusionAdult', max_length=1, null=True)),
('ceiling_exclusion_child', models.CharField(blank=True, db_column='CeilingExclusionChild', max_length=1, null=True)),
('audit_user_id', models.IntegerField(db_column='AuditUserID')),
],
options={
'db_table': 'tblProductItems',
'managed': False,
},
bases=(models.Model, product.models.ProductItemOrService),
),
migrations.CreateModel(
name='ProductService',
fields=[
('validity_from', core.fields.DateTimeField(db_column='ValidityFrom', default=datetime.datetime.now)),
('validity_to', core.fields.DateTimeField(blank=True, db_column='ValidityTo', null=True)),
('legacy_id', models.IntegerField(blank=True, db_column='LegacyID', null=True)),
('id', models.AutoField(db_column='ProdServiceID', primary_key=True, serialize=False)),
('limitation_type', models.CharField(db_column='LimitationType', max_length=1)),
('price_origin', models.CharField(db_column='PriceOrigin', max_length=1)),
('limit_adult', models.DecimalField(blank=True, db_column='LimitAdult', decimal_places=2, max_digits=18, null=True)),
('limit_child', models.DecimalField(blank=True, db_column='LimitChild', decimal_places=2, max_digits=18, null=True)),
('waiting_period_adult', models.IntegerField(blank=True, db_column='WaitingPeriodAdult', null=True)),
('waiting_period_child', models.IntegerField(blank=True, db_column='WaitingPeriodChild', null=True)),
('limit_no_adult', models.IntegerField(blank=True, db_column='LimitNoAdult', null=True)),
('limit_no_child', models.IntegerField(blank=True, db_column='LimitNoChild', null=True)),
('limitation_type_r', models.CharField(blank=True, db_column='LimitationTypeR', max_length=1, null=True)),
('limitation_type_e', models.CharField(blank=True, db_column='LimitationTypeE', max_length=1, null=True)),
('limit_adult_r', models.DecimalField(blank=True, db_column='LimitAdultR', decimal_places=2, max_digits=18, null=True)),
('limit_adult_e', models.DecimalField(blank=True, db_column='LimitAdultE', decimal_places=2, max_digits=18, null=True)),
('limit_child_r', models.DecimalField(blank=True, db_column='LimitChildR', decimal_places=2, max_digits=18, null=True)),
('limit_child_e', models.DecimalField(blank=True, db_column='LimitChildE', decimal_places=2, max_digits=18, null=True)),
('ceiling_exclusion_adult', models.CharField(blank=True, db_column='CeilingExclusionAdult', max_length=1, null=True)),
('ceiling_exclusion_child', models.CharField(blank=True, db_column='CeilingExclusionChild', max_length=1, null=True)),
('audit_user_id', models.IntegerField(db_column='AuditUserID')),
],
options={
'db_table': 'tblProductServices',
'managed': False,
},
bases=(models.Model, product.models.ProductItemOrService),
),
]
/mm_sdk-0.1.316-py3-none-any.whl/sdk/analytics.py
import json
from typing import List, Optional
from pydantic import BaseModel, HttpUrl
from sdk import Gender
from .client import Empty, SDKClient, SDKResponse
class Service(BaseModel):
name: str
price_code: str
cost: str
class Client(BaseModel):
uuid: str
fio: str
sex: str
birth: str
phone: Optional[str]
email: Optional[str]
nationality: Optional[str]
city: Optional[str]
okrug: Optional[str]
street: Optional[str]
building: Optional[str]
google_cid: Optional[str]
yandex_cid: Optional[str]
class LabOrderRequest(BaseModel):
lab_number: str
dc: str
cost: str
pre_record_number: Optional[str]
mc_id: str
client: Client
services: List[Service]
class TopFive(BaseModel):
price_code: str
name: str
class PopServiceResponse(BaseModel):
price_code: str
top_five: List[TopFive]
class PriceCodeRequest(BaseModel):
price_code: Optional[str]
class RecommendationRequest(BaseModel):
price_code: List[str]
sex: Optional[Gender]
age: Optional[int]
class Config:
use_enum_values = True
class RecommendationService(BaseModel):
price_code: str
name: str
class RecommendationServiceResponse(BaseModel):
recommendations: List[RecommendationService]
class RecommendationComplex(BaseModel):
name: str
services: List[RecommendationService]
class RecommendationComplexResponse(BaseModel):
recommendations: List[RecommendationComplex]
class AnalyticsService:
def __init__(self, client: SDKClient, url: HttpUrl, token: str):
self._client = client
self._url = url
self._token = token
def send_lab_order(self, query: LabOrderRequest, timeout=3) -> SDKResponse[Empty]:
return self._client.post(
self._url + "api/rfm/edit",
Empty,
data=json.dumps(query.dict()),
timeout=timeout,
headers={"Authorization": f"Bearer {self._token}"},
)
def top_services(
self, query: PriceCodeRequest, timeout=3
) -> SDKResponse[List[PopServiceResponse]]:
return self._client.get(
self._url + f"api/recommend/top_five",
PopServiceResponse,
params=query.dict(),
timeout=timeout,
headers={"Authorization": f"Bearer {self._token}"},
)
def recommendations(
self, query: RecommendationRequest, timeout=3
) -> SDKResponse[List[RecommendationServiceResponse]]:
return self._client.get(
self._url + f"api/recommend/recommendations",
RecommendationServiceResponse,
params=query.dict(exclude_none=True),
timeout=timeout,
headers={"Authorization": f"Bearer {self._token}"},
)
def recommendations_v2(
self, query: RecommendationRequest, timeout=3
) -> SDKResponse[List[RecommendationComplexResponse]]:
return self._client.get(
self._url + f"api/recommend/v2/recommendations",
RecommendationComplexResponse,
params=query.dict(exclude_none=True),
timeout=timeout,
headers={"Authorization": f"Bearer {self._token}"},
)
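# Usage sketch (the URL and token below are placeholder assumptions, not real
# endpoints):
#   service = AnalyticsService(SDKClient(), "https://analytics.example.com/", "token")
#   resp = service.top_services(PriceCodeRequest(price_code="A01"))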
/qmpy-tri-2022.7.21.tar.gz/qmpy-tri-2022.7.21/qmpy/web/static/js/jsmol/j2s/J/render/FontLineShapeRenderer.js
Clazz.declarePackage ("J.render");
Clazz.load (["J.render.ShapeRenderer", "J.util.P3", "$.P3i", "$.V3"], "J.render.FontLineShapeRenderer", ["java.lang.Float", "J.constant.EnumAxesMode", "J.util.TextFormat"], function () {
c$ = Clazz.decorateAsClass (function () {
this.imageFontScaling = 0;
this.atomA = null;
this.atomB = null;
this.atomC = null;
this.atomD = null;
this.font3d = null;
this.pt0i = null;
this.pt1i = null;
this.pt2i = null;
this.s1 = null;
this.s2 = null;
this.pointT = null;
this.pointT2 = null;
this.pointT3 = null;
this.vectorT = null;
this.vectorT2 = null;
this.vectorT3 = null;
this.tickInfo = null;
this.draw000 = true;
this.width = 0;
this.endcap = 3;
this.colixA = 0;
this.colixB = 0;
this.dotsOrDashes = false;
this.dashDots = null;
this.asLineOnly = false;
Clazz.instantialize (this, arguments);
}, J.render, "FontLineShapeRenderer", J.render.ShapeRenderer);
Clazz.prepareFields (c$, function () {
this.pt0i = new J.util.P3i ();
this.pt1i = new J.util.P3i ();
this.pt2i = new J.util.P3i ();
this.s1 = new J.util.P3i ();
this.s2 = new J.util.P3i ();
this.pointT = new J.util.P3 ();
this.pointT2 = new J.util.P3 ();
this.pointT3 = new J.util.P3 ();
this.vectorT = new J.util.V3 ();
this.vectorT2 = new J.util.V3 ();
this.vectorT3 = new J.util.V3 ();
});
$_M(c$, "getDiameter",
function (z, madOrPixels) {
var diameter;
var isMad = (madOrPixels > 20);
switch (this.exportType) {
case 1:
diameter = (isMad ? madOrPixels : Clazz.doubleToInt (Math.floor (this.viewer.unscaleToScreen (z, madOrPixels * 2) * 1000)));
break;
default:
if (isMad) {
diameter = Clazz.floatToInt (this.viewer.scaleToScreen (z, madOrPixels));
} else {
if (this.g3d.isAntialiased ()) madOrPixels += madOrPixels;
diameter = madOrPixels;
}}
return diameter;
}, "~N,~N");
$_M(c$, "renderLine",
function (p0, p1, diameter, pt0, pt1, drawTicks) {
pt0.set (Clazz.doubleToInt (Math.floor (p0.x)), Clazz.doubleToInt (Math.floor (p0.y)), Clazz.doubleToInt (Math.floor (p0.z)));
pt1.set (Clazz.doubleToInt (Math.floor (p1.x)), Clazz.doubleToInt (Math.floor (p1.y)), Clazz.doubleToInt (Math.floor (p1.z)));
if (diameter < 0) this.g3d.drawDottedLine (pt0, pt1);
else this.g3d.fillCylinder (this.endcap, diameter, pt0, pt1);
if (!drawTicks || this.tickInfo == null) return;
this.atomA.screenX = pt0.x;
this.atomA.screenY = pt0.y;
this.atomA.screenZ = pt0.z;
this.atomB.screenX = pt1.x;
this.atomB.screenY = pt1.y;
this.atomB.screenZ = pt1.z;
this.drawTicks (this.atomA, this.atomB, diameter, true);
}, "J.util.P3,J.util.P3,~N,J.util.P3i,J.util.P3i,~B");
$_M(c$, "drawTicks",
function (pt1, pt2, diameter, withLabels) {
if (Float.isNaN (this.tickInfo.first)) this.tickInfo.first = 0;
this.drawTicks2 (pt1, pt2, this.tickInfo.ticks.x, 8, diameter, (!withLabels ? null : this.tickInfo.tickLabelFormats == null ? ["%0.2f"] : this.tickInfo.tickLabelFormats));
this.drawTicks2 (pt1, pt2, this.tickInfo.ticks.y, 4, diameter, null);
this.drawTicks2 (pt1, pt2, this.tickInfo.ticks.z, 2, diameter, null);
}, "J.util.Point3fi,J.util.Point3fi,~N,~B");
$_M(c$, "drawTicks2",
($fz = function (ptA, ptB, dx, length, diameter, formats) {
if (dx == 0) return;
if (this.g3d.isAntialiased ()) length *= 2;
this.vectorT2.set (ptB.screenX, ptB.screenY, 0);
this.vectorT.set (ptA.screenX, ptA.screenY, 0);
this.vectorT2.sub (this.vectorT);
if (this.vectorT2.length () < 50) return;
var signFactor = this.tickInfo.signFactor;
this.vectorT.setT (ptB);
this.vectorT.sub (ptA);
var d0 = this.vectorT.length ();
if (this.tickInfo.scale != null) {
if (Float.isNaN (this.tickInfo.scale.x)) {
var a = this.viewer.getUnitCellInfo (0);
if (!Float.isNaN (a)) this.vectorT.set (this.vectorT.x / a, this.vectorT.y / this.viewer.getUnitCellInfo (1), this.vectorT.z / this.viewer.getUnitCellInfo (2));
} else {
this.vectorT.set (this.vectorT.x * this.tickInfo.scale.x, this.vectorT.y * this.tickInfo.scale.y, this.vectorT.z * this.tickInfo.scale.z);
}}var d = this.vectorT.length () + 0.0001 * dx;
if (d < dx) return;
var f = dx / d * d0 / d;
this.vectorT.scale (f);
var dz = (ptB.screenZ - ptA.screenZ) / (d / dx);
d += this.tickInfo.first;
var p = (Clazz.doubleToInt (Math.floor (this.tickInfo.first / dx))) * dx - this.tickInfo.first;
this.pointT.scaleAdd2 (p / dx, this.vectorT, ptA);
p += this.tickInfo.first;
var z = ptA.screenZ;
if (diameter < 0) diameter = 1;
this.vectorT2.set (-this.vectorT2.y, this.vectorT2.x, 0);
this.vectorT2.scale (length / this.vectorT2.length ());
var ptRef = this.tickInfo.reference;
if (ptRef == null) {
this.pointT3.setT (this.viewer.getBoundBoxCenter ());
if (this.viewer.getAxesMode () === J.constant.EnumAxesMode.BOUNDBOX) {
this.pointT3.x += 1.0;
this.pointT3.y += 1.0;
this.pointT3.z += 1.0;
}} else {
this.pointT3.setT (ptRef);
}this.viewer.transformPtScr (this.pointT3, this.pt2i);
var horizontal = (Math.abs (this.vectorT2.x / this.vectorT2.y) < 0.2);
var centerX = horizontal;
var centerY = !horizontal;
var rightJustify = !centerX && (this.vectorT2.x < 0);
var drawLabel = (formats != null && formats.length > 0);
var x;
var y;
var val = new Array (1);
var i = (this.draw000 ? 0 : -1);
while (p < d) {
if (p >= this.tickInfo.first) {
this.pointT2.setT (this.pointT);
this.viewer.transformPt3f (this.pointT2, this.pointT2);
this.drawLine (Clazz.doubleToInt (Math.floor (this.pointT2.x)), Clazz.doubleToInt (Math.floor (this.pointT2.y)), Clazz.floatToInt (z), (x = Clazz.doubleToInt (Math.floor (this.pointT2.x + this.vectorT2.x))), (y = Clazz.doubleToInt (Math.floor (this.pointT2.y + this.vectorT2.y))), Clazz.floatToInt (z), diameter);
if (drawLabel && (this.draw000 || p != 0)) {
val[0] = Float.$valueOf ((p == 0 ? 0 : p * signFactor));
var s = J.util.TextFormat.sprintf (formats[i % formats.length], "f", val);
this.drawString (x, y, Clazz.floatToInt (z), 4, rightJustify, centerX, centerY, Clazz.doubleToInt (Math.floor (this.pointT2.y)), s);
}}this.pointT.add (this.vectorT);
p += dx;
z += dz;
i++;
}
}, $fz.isPrivate = true, $fz), "J.util.Point3fi,J.util.Point3fi,~N,~N,~N,~A");
$_M(c$, "drawLine",
function (x1, y1, z1, x2, y2, z2, diameter) {
return this.drawLine2 (x1, y1, z1, x2, y2, z2, diameter);
}, "~N,~N,~N,~N,~N,~N,~N");
$_M(c$, "drawLine2",
function (x1, y1, z1, x2, y2, z2, diameter) {
this.pt0i.set (x1, y1, z1);
this.pt1i.set (x2, y2, z2);
if (this.dotsOrDashes) {
if (this.dashDots != null) this.drawDashed (x1, y1, z1, x2, y2, z2, this.dashDots);
} else {
if (diameter < 0) {
this.g3d.drawDashedLine (4, 2, this.pt0i, this.pt1i);
return 1;
}this.g3d.fillCylinder (2, diameter, this.pt0i, this.pt1i);
}return Clazz.doubleToInt ((diameter + 1) / 2);
}, "~N,~N,~N,~N,~N,~N,~N");
$_M(c$, "drawString",
function (x, y, z, radius, rightJustify, centerX, centerY, yRef, sVal) {
if (sVal == null) return;
var width = this.font3d.stringWidth (sVal);
var height = this.font3d.getAscent ();
var xT = x;
if (rightJustify) xT -= Clazz.doubleToInt (radius / 2) + 2 + width;
else if (centerX) xT -= Clazz.doubleToInt (radius / 2) + 2 + Clazz.doubleToInt (width / 2);
else xT += Clazz.doubleToInt (radius / 2) + 2;
var yT = y;
if (centerY) yT += Clazz.doubleToInt (height / 2);
else if (yRef == 0 || yRef < y) yT += height;
else yT -= Clazz.doubleToInt (radius / 2);
var zT = z - radius - 2;
if (zT < 1) zT = 1;
this.g3d.drawString (sVal, this.font3d, xT, yT, zT, zT, 0);
}, "~N,~N,~N,~N,~B,~B,~B,~N,~S");
$_M(c$, "drawDashed",
function (xA, yA, zA, xB, yB, zB, array) {
if (array == null || this.width < 0) return;
var f = array[0];
var dx = xB - xA;
var dy = yB - yA;
var dz = zB - zA;
var n = 0;
var isNdots = (array === J.render.FontLineShapeRenderer.ndots);
var isDots = (isNdots || array === J.render.FontLineShapeRenderer.sixdots);
if (isDots) {
var d2 = (dx * dx + dy * dy) / (this.width * this.width);
if (isNdots) {
f = (Math.sqrt (d2) / 1.5);
n = Clazz.floatToInt (f) + 3;
} else if (d2 < 8) {
array = J.render.FontLineShapeRenderer.twodots;
} else if (d2 < 32) {
array = J.render.FontLineShapeRenderer.fourdots;
}}var ptS = array[1];
var ptE = array[2];
var colixS = this.colixA;
var colixE = (ptE == 0 ? this.colixB : this.colixA);
if (n == 0) n = array.length;
for (var i = 0, pt = 3; pt < n; pt++) {
i = (isNdots ? i + 1 : array[pt]);
var xS = Clazz.doubleToInt (Math.floor (xA + dx * i / f));
var yS = Clazz.doubleToInt (Math.floor (yA + dy * i / f));
var zS = Clazz.doubleToInt (Math.floor (zA + dz * i / f));
if (isDots) {
this.s1.set (xS, yS, zS);
if (pt == ptS) this.g3d.setColix (this.colixA);
else if (pt == ptE) this.g3d.setColix (this.colixB);
this.g3d.fillSphereI (this.width, this.s1);
continue;
}if (pt == ptS) colixS = this.colixB;
i = array[++pt];
if (pt == ptE) colixE = this.colixB;
var xE = Clazz.doubleToInt (Math.floor (xA + dx * i / f));
var yE = Clazz.doubleToInt (Math.floor (yA + dy * i / f));
var zE = Clazz.doubleToInt (Math.floor (zA + dz * i / f));
this.fillCylinder (colixS, colixE, 2, this.width, xS, yS, zS, xE, yE, zE);
}
}, "~N,~N,~N,~N,~N,~N,~A");
$_M(c$, "fillCylinder",
function (colixA, colixB, endcaps, diameter, xA, yA, zA, xB, yB, zB) {
if (this.asLineOnly) this.g3d.drawLine (colixA, colixB, xA, yA, zA, xB, yB, zB);
else this.g3d.fillCylinderXYZ (colixA, colixB, endcaps, (!this.isExport || this.mad == 1 ? diameter : this.mad), xA, yA, zA, xB, yB, zB);
}, "~N,~N,~N,~N,~N,~N,~N,~N,~N,~N");
Clazz.defineStatics (c$,
"dashes", [12, 0, 0, 2, 5, 7, 10],
"hDashes", [10, 7, 6, 1, 3, 4, 6, 7, 9],
"ndots", [0, 3, 1000],
"sixdots", [12, 3, 6, 1, 3, 5, 7, 9, 11],
"fourdots", [13, 3, 5, 2, 5, 8, 11],
"twodots", [12, 3, 4, 3, 9]);
});
/flask-admin-markdown-2020.3.27.1.tar.gz/flask-admin-markdown-2020.3.27.1/flask_admin_markdown/static/editormd/lib/codemirror/mode/tiddlywiki/tiddlywiki.js
//{{{
(function(mod) {
if (typeof exports == "object" && typeof module == "object") // CommonJS
mod(require("../../lib/codemirror"));
else if (typeof define == "function" && define.amd) // AMD
define(["../../lib/codemirror"], mod);
else // Plain browser env
mod(CodeMirror);
})(function(CodeMirror) {
"use strict";
CodeMirror.defineMode("tiddlywiki", function () {
// Tokenizer
var textwords = {};
var keywords = function () {
function kw(type) {
return { type: type, style: "macro"};
}
return {
"allTags": kw('allTags'), "closeAll": kw('closeAll'), "list": kw('list'),
"newJournal": kw('newJournal'), "newTiddler": kw('newTiddler'),
"permaview": kw('permaview'), "saveChanges": kw('saveChanges'),
"search": kw('search'), "slider": kw('slider'), "tabs": kw('tabs'),
"tag": kw('tag'), "tagging": kw('tagging'), "tags": kw('tags'),
"tiddler": kw('tiddler'), "timeline": kw('timeline'),
"today": kw('today'), "version": kw('version'), "option": kw('option'),
"with": kw('with'),
"filter": kw('filter')
};
}();
var isSpaceName = /[\w_\-]/i,
reHR = /^\-\-\-\-+$/, // <hr>
reWikiCommentStart = /^\/\*\*\*$/, // /***
reWikiCommentStop = /^\*\*\*\/$/, // ***/
reBlockQuote = /^<<<$/,
reJsCodeStart = /^\/\/\{\{\{$/, // //{{{ js block start
reJsCodeStop = /^\/\/\}\}\}$/, // //}}} js stop
reXmlCodeStart = /^<!--\{\{\{-->$/, // xml block start
reXmlCodeStop = /^<!--\}\}\}-->$/, // xml stop
reCodeBlockStart = /^\{\{\{$/, // {{{ TW text div block start
reCodeBlockStop = /^\}\}\}$/, // }}} TW text stop
reUntilCodeStop = /.*?\}\}\}/;
function chain(stream, state, f) {
state.tokenize = f;
return f(stream, state);
}
// Used as scratch variables to communicate multiple values without
// consing up tons of objects.
var type, content;
function ret(tp, style, cont) {
type = tp;
content = cont;
return style;
}
function jsTokenBase(stream, state) {
var sol = stream.sol(), ch;
state.block = false; // indicates the start of a code block.
ch = stream.peek(); // don't eat, to make matching simpler
// check start of blocks
if (sol && /[<\/\*{}\-]/.test(ch)) {
if (stream.match(reCodeBlockStart)) {
state.block = true;
return chain(stream, state, twTokenCode);
}
if (stream.match(reBlockQuote)) {
return ret('quote', 'quote');
}
if (stream.match(reWikiCommentStart) || stream.match(reWikiCommentStop)) {
return ret('code', 'comment');
}
if (stream.match(reJsCodeStart) || stream.match(reJsCodeStop) || stream.match(reXmlCodeStart) || stream.match(reXmlCodeStop)) {
return ret('code', 'comment');
}
if (stream.match(reHR)) {
return ret('hr', 'hr');
}
} // sol
ch = stream.next();
if (sol && /[\/\*!#;:>|]/.test(ch)) {
if (ch == "!") { // tw header
stream.skipToEnd();
return ret("header", "header");
}
if (ch == "*") { // tw list
stream.eatWhile('*');
return ret("list", "comment");
}
if (ch == "#") { // tw numbered list
stream.eatWhile('#');
return ret("list", "comment");
}
if (ch == ";") { // definition list, term
stream.eatWhile(';');
return ret("list", "comment");
}
if (ch == ":") { // definition list, description
stream.eatWhile(':');
return ret("list", "comment");
}
if (ch == ">") { // single line quote
stream.eatWhile(">");
return ret("quote", "quote");
}
if (ch == '|') {
return ret('table', 'header');
}
}
if (ch == '{' && stream.match(/\{\{/)) {
return chain(stream, state, twTokenCode);
}
// rudimentary html:// file:// link matching. TW knows much more ...
if (/[hf]/i.test(ch)) {
if (/[ti]/i.test(stream.peek()) && stream.match(/\b(ttps?|tp|ile):\/\/[\-A-Z0-9+&@#\/%?=~_|$!:,.;]*[A-Z0-9+&@#\/%=~_|$]/i)) {
return ret("link", "link");
}
}
// just a little string indicator, don't want to have the whole string covered
if (ch == '"') {
return ret('string', 'string');
}
if (ch == '~') { // _no_ CamelCase indicator should be bold
return ret('text', 'brace');
}
if (/[\[\]]/.test(ch)) { // check for [[..]]
if (stream.peek() == ch) {
stream.next();
return ret('brace', 'brace');
}
}
if (ch == "@") { // check for space link. TODO fix @@...@@ highlighting
stream.eatWhile(isSpaceName);
return ret("link", "link");
}
if (/\d/.test(ch)) { // numbers
stream.eatWhile(/\d/);
return ret("number", "number");
}
if (ch == "/") { // tw invisible comment
if (stream.eat("%")) {
return chain(stream, state, twTokenComment);
}
else if (stream.eat("/")) { //
return chain(stream, state, twTokenEm);
}
}
if (ch == "_") { // tw underline
if (stream.eat("_")) {
return chain(stream, state, twTokenUnderline);
}
}
// strikethrough and mdash handling
if (ch == "-") {
if (stream.eat("-")) {
// if strikethrough looks ugly, change CSS.
if (stream.peek() != ' ')
return chain(stream, state, twTokenStrike);
// mdash
if (stream.peek() == ' ')
return ret('text', 'brace');
}
}
if (ch == "'") { // tw bold
if (stream.eat("'")) {
return chain(stream, state, twTokenStrong);
}
}
if (ch == "<") { // tw macro
if (stream.eat("<")) {
return chain(stream, state, twTokenMacro);
}
}
else {
return ret(ch);
}
// core macro handling
stream.eatWhile(/[\w\$_]/);
var word = stream.current(),
known = textwords.propertyIsEnumerable(word) && textwords[word];
return known ? ret(known.type, known.style, word) : ret("text", null, word);
} // jsTokenBase()
// tw invisible comment
function twTokenComment(stream, state) {
var maybeEnd = false,
ch;
while (ch = stream.next()) {
if (ch == "/" && maybeEnd) {
state.tokenize = jsTokenBase;
break;
}
maybeEnd = (ch == "%");
}
return ret("comment", "comment");
}
// tw strong / bold
function twTokenStrong(stream, state) {
var maybeEnd = false,
ch;
while (ch = stream.next()) {
if (ch == "'" && maybeEnd) {
state.tokenize = jsTokenBase;
break;
}
maybeEnd = (ch == "'");
}
return ret("text", "strong");
}
// tw code
function twTokenCode(stream, state) {
var ch, sb = state.block;
if (sb && stream.current()) {
return ret("code", "comment");
}
if (!sb && stream.match(reUntilCodeStop)) {
state.tokenize = jsTokenBase;
return ret("code", "comment");
}
if (sb && stream.sol() && stream.match(reCodeBlockStop)) {
state.tokenize = jsTokenBase;
return ret("code", "comment");
}
ch = stream.next();
    return ret("code", "comment");
}
// tw em / italic
function twTokenEm(stream, state) {
var maybeEnd = false,
ch;
while (ch = stream.next()) {
if (ch == "/" && maybeEnd) {
state.tokenize = jsTokenBase;
break;
}
maybeEnd = (ch == "/");
}
return ret("text", "em");
}
// tw underlined text
function twTokenUnderline(stream, state) {
var maybeEnd = false,
ch;
while (ch = stream.next()) {
if (ch == "_" && maybeEnd) {
state.tokenize = jsTokenBase;
break;
}
maybeEnd = (ch == "_");
}
return ret("text", "underlined");
}
// tw strike through text looks ugly
// change CSS if needed
function twTokenStrike(stream, state) {
var maybeEnd = false, ch;
while (ch = stream.next()) {
if (ch == "-" && maybeEnd) {
state.tokenize = jsTokenBase;
break;
}
maybeEnd = (ch == "-");
}
return ret("text", "strikethrough");
}
// macro
function twTokenMacro(stream, state) {
var ch, word, known;
if (stream.current() == '<<') {
return ret('brace', 'macro');
}
ch = stream.next();
if (!ch) {
state.tokenize = jsTokenBase;
return ret(ch);
}
if (ch == ">") {
if (stream.peek() == '>') {
stream.next();
state.tokenize = jsTokenBase;
return ret("brace", "macro");
}
}
stream.eatWhile(/[\w\$_]/);
word = stream.current();
known = keywords.propertyIsEnumerable(word) && keywords[word];
if (known) {
return ret(known.type, known.style, word);
}
else {
return ret("macro", null, word);
}
}
// Interface
return {
startState: function () {
return {
tokenize: jsTokenBase,
indented: 0,
level: 0
};
},
token: function (stream, state) {
if (stream.eatSpace()) return null;
var style = state.tokenize(stream, state);
return style;
},
electricChars: ""
};
});
CodeMirror.defineMIME("text/x-tiddlywiki", "tiddlywiki");
});
//}}}
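// Usage sketch (assumes CodeMirror is loaded on the page and a
// <textarea id="tw"> exists):
//   var editor = CodeMirror.fromTextArea(document.getElementById("tw"),
//       {mode: "text/x-tiddlywiki", lineNumbers: true});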
/uvm-python-0.3.0.tar.gz/uvm-python-0.3.0/test/examples/simple/registers/models/user-defined/regmodel.py
from uvm.macros import uvm_object_utils
from uvm.reg import *
from uvm.base import *
#//
#// This example demonstrates how to include a user-defined register
#// in a register model.
#//
#//
#// The user_acp_reg has a user-defined behavior
#//
#// It increments by 1 after every write
#//
class user_acp_incr_on_write_cbs(UVMRegCbs):
def __init__(self, name="write_cbs"):
UVMRegCbs.__init__(self, name)
self.num_called = 0
def post_predict(self, fld, previous, value, kind, path, _map):
if (kind != UVM_PREDICT_WRITE):
return
if (path != UVM_FRONTDOOR):
return
value.push(previous + 1)
self.num_called += 1
class user_acp_reg(UVMReg):
    # SV original: local UVMRegField value;
def __init__(self, name="user_acp_reg"):
UVMReg.__init__(self, name,16,UVM_NO_COVERAGE)
def build(self):
self.value = UVMRegField.type_id.create("value", None, self.get_full_name())
        self.value.configure(self, 16, 0, "RW", 0, 0x0000, 1, 0, 0)
        self.value.set_compare(UVM_NO_CHECK)
        UVMResourceDb.set("REG::" + self.get_full_name(),
                "NO_REG_BIT_BASH_TEST", 1)
        UVMResourceDb.set("REG::" + self.get_full_name(),
                "NO_REG_ACCESS_TEST", 1)
self.cb = user_acp_incr_on_write_cbs()
UVMRegFieldCb.add(self.value, self.cb)
# endfunction: build
async def pre_write(self, rw):
m_data = 0
rg = None
#assert($cast(rg,rw.element));
rg = rw.element
# Predict the value that will be in the register
m_data = rg.get() + 1
# If a backdoor write is used, replace the value written
# with the incremented value to emulate the front-door
if (rw.path == UVM_BACKDOOR):
rw.value[0] = m_data
await Timer(5, "NS")
# endtask: pre_write
#
#endclass : user_acp_reg
uvm_object_utils(user_acp_reg)
class block_B(UVMRegBlock):
# user_acp_reg user_acp;
def __init__(self, name="B"):
UVMRegBlock.__init__(self, name, UVM_NO_COVERAGE)
def build(self):
self.default_map = self.create_map("", 0, 1, UVM_BIG_ENDIAN)
self.user_acp = user_acp_reg.type_id.create("user_acp", None, self.get_full_name())
self.user_acp.configure(self, None, "acp")
self.user_acp.build()
self.default_map.add_reg(self.user_acp, 0x0000, "RW")
self.lock_model()
#endclass : block_B
uvm_object_utils(block_B)
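# Usage sketch (illustrative only; assumes a UVM environment with a register
# adapter and sequencer hooked up to the default map):
#   blk = block_B.type_id.create("B", None)
#   blk.build()
#   # a frontdoor write of N is mirrored as N+1 by the post_predict callback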
/uos_statphys-0.1.12.tar.gz/uos_statphys-0.1.12/uos_statphys/ImportManager.py
__all__ = ['Import_Manager']
class Import_Manager:
    '''Lazy import manager that caches imported modules and functions.'''
def __init__(self, globs= None):
self._modules = {}
self._funcs = {}
self._alls = {}
def copy(self, targets, globs):
if isinstance(targets, dict):
self.requireAs(globs = globs, **targets)
else:
self.requireAs(globs = globs, **targets._modules)
def check(self, module_name):
if module_name in self._modules:
return True
else:
ch = False
try:
self.load(module_name)
ch = True
            except Exception:
                ch = False
return ch
def import_all(self, module_name):
if module_name in self._alls:
return self._alls[module_name]
module_name_split = module_name.split(".")
if len(module_name_split) == 1:
if module_name in self._alls:
return self._alls[module_name]
else:
                target_mod_code = f"from {module_name} import *"
                names_before = set(locals().keys()).copy()
                exec(target_mod_code)
                # exclude bookkeeping names from the before/after locals diff
                new_names = set(locals().keys()) - names_before - {"names_before"}
                self._alls[module_name] = {}
                for attr_name in new_names:
                    self._alls[module_name][attr_name] = locals()[attr_name]
return self._alls[module_name]
def load(self, module_name, func_name = None):
if func_name is None:
var = self._modules
mns = module_name.split(".")
if len(mns) == 1:
if module_name in var:
return var[module_name]
else:
code = f"import {module_name}"
exec(code)
var[module_name] = locals()[module_name]
return var[module_name]
else:
module = var.get(mns[0],False)
if module:
for i in range(len(mns)-1):
module = vars(module).get(mns[i+1], lambda x:x)
if module:
return module
code = f"import {module_name}"
exec(code)
target = locals()[mns[0]]
for i in range(len(mns)-1):
target = vars(target)[mns[i+1]]
var[module_name] = target
return var[module_name]
else:
var = self._funcs
if func_name in var:
return var[func_name]
else:
code = f"from {module_name} import {func_name}"
exec(code)
var[func_name] = locals()[func_name]
return var[func_name]
def require(self, *args, globs = None, namespace = None):
        '''Equivalent to Python's `import module_name`.
Parameters
------------
globs : `dict`
imported module will be assigned into `globs`, default is None.
args
--------
module_name : `string`
The exact name of module, which is used in python code.
Return
---------
        `list` of `module`
            Python modules imported by Python, in the same order as `args`.
            If `globs` is not `None`, the return value is `None`.
examples
----------
        These two snippets are equivalent
```
import numpy
import ctypes
import os, sys
```
and
        `Import_Manager.require('numpy', 'ctypes', 'os', 'sys', globs=globals())`
'''
info = None
if namespace is not None:
info = vars(namespace).get('_modules', {})
targets = {}
if globs is not None:
targets = globs
if len(args) == 1:
if namespace is not None:
info[args[0]] = args[0]
targets[args[0]] = self.load(args[0])
return targets[args[0]]
for mn in args:
targets[mn] = self.load(mn)
if namespace is not None:
vars(namespace)['_modules'] = info
if (globs is None):
return [targets[mod] for mod in args]
def requireAs(self, *args, globs = None,namespace = None, **kwargs):
        '''Equivalent to Python's `import module_name as assigned_name`.
Parameters
------------
globs : `dict`
imported module will be assigned into `globs`, default is None.
args
--------
module_name : `string`
The exact name of module, which is used in python code.
kwargs
--------
assign_name = module_name : `string`
`key` is variable which you want to use, and value is the exact name of module, which is used in python code.
Return
---------
        `dict` of `module`
            Python modules imported by Python, keyed by the assigned names.
            If `globs` is not `None`, the return value is `None`.
examples
----------
        These two snippets are equivalent
```
import numpy as np
import matplotlib.pyplot as plt
import ctypes
import os, sys
```
and
        `Import_Manager.requireAs('ctypes', 'os', 'sys', np='numpy', plt='matplotlib.pyplot', globs=globals())`
'''
targets = {}
if globs is not None:
targets = globs
if args:
self.require(*args,globs = targets, namespace = namespace)
if namespace is not None:
info = vars(namespace).get('_modules', {})
for an in kwargs:
targets[an] = self.load(kwargs[an])
if namespace is not None:
info[an] = kwargs[an]
if namespace is not None:
vars(namespace)['_modules'] = info
if (globs is None):
return targets
def require_func(self, globs, **kwargs):
        '''Equivalent to Python's `from module_name import func_name`.
Parameters
------------
globs : `dict`
imported module will be assigned into `globs`, default is None.
kwargs
--------
func_name = module_name : `string`
`key` is function which you want to import, and value is the exact name of its module, which is used in python code.
Return
---------
        `dict` of `function`
            Python functions imported by Python, keyed by the requested names.
examples
----------
        These two snippets are equivalent
        ```
        from tqdm import tqdm
        from matplotlib.pyplot import plot
        ```
        and
        `Import_Manager.require_func(globals(), tqdm='tqdm', plot='matplotlib.pyplot')`
'''
targets = globs
_all = False
for fn in kwargs:
if fn[:3]=='all':
_all = True
temp = self.import_all(kwargs[fn])
for key in temp:
targets[key] = temp[key]
else:
targets[fn] = self.load(kwargs[fn], fn)
if not _all and len(kwargs) == 1:
return targets[fn]
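# Usage sketch (assumes numpy and matplotlib are installed):
#   manager = Import_Manager()
#   manager.requireAs("os", "sys", np="numpy", globs=globals())
#   plot = manager.load("matplotlib.pyplot", "plot")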
/gaussianfft-1.1.1.post1.tar.gz/gaussianfft-1.1.1.post1/tools/build/test/builtin_glob_archive.py
# Copyright 2014 Steven Watanabe
# Copyright 2015 Artur Shepilko
# Distributed under the Boost Software License, Version 1.0.
# (See accompanying file LICENSE_1_0.txt or http://www.boost.org/LICENSE_1_0.txt)
# This tests the GLOB_ARCHIVE rule.
import os
import sys
try:
from StringIO import StringIO
except:
from io import StringIO
import BoostBuild
vms = ( os.name == 'posix' and sys.platform == 'OpenVMS')
t = BoostBuild.Tester()
## Set up test archive sources and the symbols they contain.
sources = {
"a.cpp" : ["a"],
"b.cpp" : ["b"],
"b_match.cpp" : ["b_match"],
"c/nopath_check.cpp" : ["nopath_check"],
"CaseCheck.cpp" : ["CaseCheck"],
"seq_check1.cpp" : ["seq_check1"],
"seq_check2.cpp" : ["seq_check2"],
"seq_check3.cpp" : ["seq_check3"],
"symbols_check.c" : ["symbol", "symbol_match"],
"members_and_symbols_check.c" : ["member_and_symbol_match"],
"symbol_case_check.c" : ["SymbolCaseCheck"],
"main_check.cpp" : ["main"]
}
def create_sources(path, sources):
for s in sources :
f = os.path.join(path, s)
t.write(f, "")
output = StringIO()
for sym in sources[s] :
output.write("int %s() { return 0; }\n" % sym)
t.write(f, output.getvalue())
def setup_archive(name, sources):
global archive
global obj_suffix
archive = t.adjust_names(name)[0]
obj_suffix = t.adjust_names(".obj")[0]
output = StringIO()
t.write("jamroot.jam","")
output.write("""\
static-lib %s :
""" % name.split(".")[0])
## sort the sources, so we can test order of the globbed members
for s in sorted(sources) :
output.write("""\
%s
""" % s)
output.write("""\
;
""")
t.write("lib/jamfile.jam", output.getvalue())
create_sources("lib", sources)
t.run_build_system(subdir="lib")
built_archive = "lib/bin/$toolset/debug*/%s" % name
t.expect_addition(built_archive)
t.copy(built_archive, name)
t.rm("lib")
def test_glob_archive(archives, glob, expected, sort_results = False):
output = StringIO()
## replace placeholders
glob = glob.replace("$archive1", archives[0]).replace("$obj", obj_suffix)
expected = [ m.replace("$archive1",
archives[0]).replace("$obj", obj_suffix) for m in expected ]
if len(archives) > 1 :
glob = glob.replace("$archive2", archives[1]).replace("$obj", obj_suffix)
expected = [ m.replace("$archive2",
archives[1]).replace("$obj", obj_suffix) for m in expected ]
## create test jamfile
if sort_results : glob = "[ SORT %s ]" % glob
output.write("""\
for local p in %s
{
ECHO $(p) ;
}
UPDATE ;
""" % glob)
t.write("file.jam", output.getvalue())
## run test jamfile and match against expected results
if sort_results : expected.sort()
t.run_build_system(["-ffile.jam"], stdout="\n".join(expected + [""]))
t.rm("file.jam")
## RUN TESTS
setup_archive("auxilliary1.lib", sources)
archive1 = archive
setup_archive("auxilliary2.lib", sources)
archive2 = archive
## all arguments empty
test_glob_archive([archive1], "[ GLOB_ARCHIVE ]", [])
## empty query
test_glob_archive([archive1], "[ GLOB_ARCHIVE $archive1 : ]", [])
## no-match
test_glob_archive([archive1], "[ GLOB_ARCHIVE $archive1 : a ]", [])
## match exact
test_glob_archive([archive1], "[ GLOB_ARCHIVE $archive1 : a$obj ]",
["$archive1(a$obj)"])
## glob wildcards:1
test_glob_archive([archive1], "[ GLOB_ARCHIVE $archive1 : b.* ]",
["$archive1(b$obj)"])
## glob wildcards:2
test_glob_archive([archive1], '[ GLOB_ARCHIVE $archive1 : "\\b?match[\.]*" ]',
["$archive1(b_match$obj)"])
## glob wildcards:3
test_glob_archive([archive1], "[ SORT [ GLOB_ARCHIVE $archive1 : b* ] ]",
["$archive1(b$obj)", "$archive1(b_match$obj)"])
## glob multiple patterns with multiple results.
test_glob_archive([archive1], "[ SORT [ GLOB_ARCHIVE $archive1 : b.* b_* ] ]",
["$archive1(b$obj)", "$archive1(b_match$obj)"])
## glob multiple archives and patterns.
test_glob_archive([archive1, archive2],
"[ SORT [ GLOB_ARCHIVE $archive1 $archive2 : b.* b_* ] ]",
["$archive1(b$obj)", "$archive1(b_match$obj)",
"$archive2(b$obj)", "$archive2(b_match$obj)"])
## glob same archive multiple times.
test_glob_archive([archive1, archive1],
"[ GLOB_ARCHIVE $archive1 $archive2 $archive1 : b.* ]",
["$archive1(b$obj)", "$archive2(b$obj)", "$archive1(b$obj)"])
## returned archive member has no path, even though its source object-file did.
## this is rather NT-specific, where members also store their object-file's path.
test_glob_archive([archive1], "[ GLOB_ARCHIVE $archive1 : nopath_check$obj ]",
["$archive1(nopath_check$obj)"])
## case insensitive matching, when archives support case sensitive member names.
## VMS implementation forces case-insensitive matching and downcased member names.
case_sensitive_members = ( not vms )
if case_sensitive_members:
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : casecheck$obj : true ]",
["$archive1(CaseCheck$obj)"])
elif vms:
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : CaseCheck$obj : false ]",
["$archive1(casecheck$obj)"])
## test the order of matched members, in general it should match the
## insertion sequence.
test_glob_archive([archive1], "[ SORT [ GLOB_ARCHIVE $archive1 : seq_check*$obj ] ]",
["$archive1(seq_check1$obj)", "$archive1(seq_check2$obj)",
"$archive1(seq_check3$obj)"])
## glob members by symbols they contain.
## Currently supported only on VMS.
symbol_glob_supported = ( vms )
if symbol_glob_supported :
## NOTE: generated symbols are compiler-dependent and may be specifically
## mangled (as in C++ case), so globbing by exact symbol is non-trivial.
## However, C-generated symbols are likely to have more portable names,
## so for the glob-by-symbol tests we glob C-generated archive members.
## glob members by exact symbol.
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : : : symbol ]",
["$archive1(symbols_check$obj)"])
## glob members by symbol wildcard.
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : : : symbol_* ]",
["$archive1(symbols_check$obj)"])
## glob members by member pattern AND symbol pattern.
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : *symbol* : : *member* ]",
["$archive1(members_and_symbols_check$obj)"])
## case insensitive symbol glob.
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : : true : symbolcasecheck ]",
["$archive1(symbol_case_check$obj)"])
## glob member that contains main symbol.
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : : : main _main ]",
["$archive1(main_check$obj)"])
else:
test_glob_archive([archive1],
"[ GLOB_ARCHIVE $archive1 : : : symbol ]",
[])
t.cleanup()
|
PypiClean
|
/django-rte-0.4.0.tar.gz/django-rte-0.4.0/rte/static/rte/tiny_mce/themes/advanced/langs/sq.js
|
tinyMCE.addI18n('sq.advanced',{"underline_desc":"I N\u00ebnvizuar (Ctrl+U)","italic_desc":"I Pjerr\u00ebt (Ctrl+I)","bold_desc":"I Trash\u00eb (Ctrl+B)",dd:"P\u00ebrshkrimi i p\u00ebrcaktimit",dt:"Terma e p\u00ebrcaktimit ",samp:"Shembull kodi",code:"Kod",blockquote:"Bllok",h6:"Kok\u00eb 6",h5:"Kok\u00eb 5",h4:"Kok\u00eb 4",h3:"Kok\u00eb 3",h2:"Kok\u00eb 2",h1:"Kok\u00eb 1",pre:"Para formatuar",address:"Adres\u00eb",div:"Div",paragraph:"Paragraf",block:"Formati",fontdefault:"Familja e tekstit","font_size":"Madh\u00ebsia e tekstit","style_select":"Stilet","more_colors":"M\u00eb shum\u00eb ngjyra","toolbar_focus":"Shko tek butonat - Alt+Q, Shko tek editori - Alt+Z, Shko tek rruga e elementit - Alt+X",newdocument:"Jeni t\u00eb sigurt q\u00eb doni t\'a fshini p\u00ebrmbajtjen?",path:"Rruga","clipboard_msg":"Kopja/Prerja/Ngjitja nuk suportohen n\u00eb Mozilla dhe Firefox.\\nD\u00ebshironi m\u00eb shum\u00eb informacione p\u00ebr k\u00ebt\u00eb \u00e7\u00ebshtje?","blockquote_desc":"Bllok","help_desc":"Ndihm\u00eb","newdocument_desc":"Dokument i Ri","image_props_desc":"Opsionet e fotos","paste_desc":"Ngjit","copy_desc":"Kopjo","cut_desc":"Prit","anchor_desc":"Fut/edito lidhje","visualaid_desc":"Shfaq/Fshih vijat ndihm\u00ebse dhe element\u00ebt e paduksh\u00ebm","charmap_desc":"Fut karakter t\u00eb personalizuar","backcolor_desc":"Zgjidh ngjyr\u00ebn e fush\u00ebs","forecolor_desc":"Zgjidh ngjyr\u00ebn e tekstit","custom1_desc":"P\u00ebshkrimi i personalizuar k\u00ebtu","removeformat_desc":"Fshi formatimin","hr_desc":"Fut linj\u00eb horizontale","sup_desc":"Mbi shkrim","sub_desc":"N\u00ebn shkrim","code_desc":"Edito kodin HTML","cleanup_desc":"Pastro kodin","image_desc":"Fut/edito foto","unlink_desc":"Hiq lidhje","link_desc":"Fut/edito lidhje","redo_desc":"Rib\u00ebj (Ctrl+Y)","undo_desc":"\u00c7b\u00ebj (Ctrl+Z)","indent_desc":"Vendos kryerradh\u00eb","outdent_desc":"Hiq kryerradh\u00eb","numlist_desc":"List\u00eb e rregullt","bullist_desc":"List\u00eb e parregullt","justifyfull_desc":"Drejtim i plot\u00eb","justifyright_desc":"Drejtimi djathtas","justifycenter_desc":"Drejtimi qend\u00ebr","justifyleft_desc":"Drejtimi majtas","striketrough_desc":"Vij\u00eb n\u00eb mes","anchor_delta_height":"","anchor_delta_width":"","charmap_delta_height":"","charmap_delta_width":"","colorpicker_delta_height":"","colorpicker_delta_width":"","link_delta_height":"","link_delta_width":"","image_delta_height":"","image_delta_width":"","help_shortcut":"Press ALT-F10 for toolbar. Press ALT-0 for help","rich_text_area":"Rich Text Area","shortcuts_desc":"Accessability Help",toolbar:"Toolbar"});
|
PypiClean
|
/ibmJupyterNotebookStyles-0.0.7.tar.gz/ibmJupyterNotebookStyles-0.0.7/README.md
|
# IbmJupyterNotebookStyles
## Install
```bash
$ pip install ibmJupyterNotebookStyles
```
## Use
```python
import ibmJupyterNotebookStyles
ibmJupyterNotebookStyles.apply_ibm_styles()
```
## Documentation
### The package does the following:
- Changes the default font to IBM Plex and IBM Plex monospace.
- Changes the headings styles to match IBM Carbon Design guidelines.
- Adds the divergent, monochromatic_pos and monochromatic_neg matplotlib color maps that implement the IBM Carbon Design color palette (see the usage sketch after this list).
- Replaces the default matplotlib style with a custom one that contains IBM Carbon Design color palette.
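A minimal sketch of using one of the added color maps (assuming `apply_ibm_styles()` registers them with matplotlib under the names listed above):
```python
import matplotlib.pyplot as plt
import numpy as np
import ibmJupyterNotebookStyles
ibmJupyterNotebookStyles.apply_ibm_styles()
# 'divergent' is assumed to be one of the color maps registered above
data = np.random.rand(10, 10)
plt.imshow(data, cmap='divergent')
plt.colorbar()
plt.show()
```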
|
PypiClean
|
/zimports-0.6.0.tar.gz/zimports-0.6.0/test_files/type_checking3.no_unused_types.py
|
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from sqlalchemy import alias
from sqlalchemy import all_
from sqlalchemy import and_
from sqlalchemy import any_
from sqlalchemy import ARRAY
from sqlalchemy import asc
from sqlalchemy import between
from sqlalchemy import BIGINT
from sqlalchemy import BigInteger
from sqlalchemy import BINARY
from sqlalchemy import Binary
from sqlalchemy import bindparam
from sqlalchemy import BLANK_SCHEMA
from sqlalchemy import BLOB
from sqlalchemy import BOOLEAN
from sqlalchemy import Boolean
from sqlalchemy import case
from sqlalchemy import cast
from sqlalchemy import CHAR
from sqlalchemy import CheckConstraint
from sqlalchemy import CLOB
from sqlalchemy import collate
from sqlalchemy import Column
from sqlalchemy import column
from sqlalchemy import ColumnDefault
from sqlalchemy import Constraint
from sqlalchemy import create_engine
from sqlalchemy import DATE
from sqlalchemy import Date
from sqlalchemy import DATETIME
from sqlalchemy import DateTime
from sqlalchemy import DDL
from sqlalchemy import DECIMAL
from sqlalchemy import DefaultClause
from sqlalchemy import delete
from sqlalchemy import desc
from sqlalchemy import distinct
from sqlalchemy import engine_from_config
from sqlalchemy import Enum
from sqlalchemy import exc as sa_exc
from sqlalchemy import except_
from sqlalchemy import except_all
from sqlalchemy import exists
from sqlalchemy import extract
from sqlalchemy import false
from sqlalchemy import FetchedValue
from sqlalchemy import FLOAT
from sqlalchemy import Float
from sqlalchemy import ForeignKey
from sqlalchemy import ForeignKeyConstraint
from sqlalchemy import func
from sqlalchemy import funcfilter
from sqlalchemy import Index
from sqlalchemy import insert
from sqlalchemy import inspect
from sqlalchemy import INT
from sqlalchemy import INTEGER
from sqlalchemy import Integer
from sqlalchemy import intersect
from sqlalchemy import intersect_all
from sqlalchemy import Interval
from sqlalchemy import join
from sqlalchemy import JSON
from sqlalchemy import LargeBinary
from sqlalchemy import lateral
from sqlalchemy import literal
from sqlalchemy import literal_column
from sqlalchemy import MetaData
from sqlalchemy import modifier
from sqlalchemy import NCHAR
from sqlalchemy import not_
from sqlalchemy import null
from sqlalchemy import nullsfirst
from sqlalchemy import nullslast
from sqlalchemy import NUMERIC
from sqlalchemy import Numeric
from sqlalchemy import NVARCHAR
from sqlalchemy import or_
from sqlalchemy import outerjoin
from sqlalchemy import outparam
from sqlalchemy import over
from sqlalchemy import PassiveDefault
from sqlalchemy import PickleType
from sqlalchemy import PrimaryKeyConstraint
from sqlalchemy import REAL
from sqlalchemy import select
from sqlalchemy import Sequence
from sqlalchemy import SMALLINT
from sqlalchemy import SmallInteger
from sqlalchemy import String
from sqlalchemy import subquery
from sqlalchemy import Table
from sqlalchemy import table
from sqlalchemy import tablesample
from sqlalchemy import TEXT
from sqlalchemy import Text
from sqlalchemy import text
from sqlalchemy import ThreadLocalMetaData
from sqlalchemy import TIME
from sqlalchemy import Time
from sqlalchemy import TIMESTAMP
from sqlalchemy import true
from sqlalchemy import tuple_
from sqlalchemy import type_coerce
from sqlalchemy import TypeDecorator
from sqlalchemy import Unicode
from sqlalchemy import UnicodeText
from sqlalchemy import union
from sqlalchemy import union_all
from sqlalchemy import UniqueConstraint
from sqlalchemy import update
from sqlalchemy import VARBINARY
from sqlalchemy import VARCHAR
from sqlalchemy import within_group
from sqlalchemy.orm import aliased
from sqlalchemy.orm import AliasOption
from sqlalchemy.orm import AttributeExtension
from sqlalchemy.orm import backref
from sqlalchemy.orm import Bundle
from sqlalchemy.orm import class_mapper
from sqlalchemy.orm import clear_mappers
from sqlalchemy.orm import column_property
from sqlalchemy.orm import ColumnProperty
from sqlalchemy.orm import comparable_property
from sqlalchemy.orm import ComparableProperty
from sqlalchemy.orm import compile_mappers
from sqlalchemy.orm import composite
from sqlalchemy.orm import CompositeProperty
from sqlalchemy.orm import configure_mappers
from sqlalchemy.orm import contains_alias
from sqlalchemy.orm import contains_eager
from sqlalchemy.orm import create_session
from sqlalchemy.orm import defaultload
from sqlalchemy.orm import defer
from sqlalchemy.orm import deferred
from sqlalchemy.orm import dynamic_loader
from sqlalchemy.orm import eagerload
from sqlalchemy.orm import eagerload_all
from sqlalchemy.orm import EXT_CONTINUE
from sqlalchemy.orm import EXT_SKIP
from sqlalchemy.orm import EXT_STOP
from sqlalchemy.orm import foreign
from sqlalchemy.orm import immediateload
from sqlalchemy.orm import join
from sqlalchemy.orm import joinedload
from sqlalchemy.orm import joinedload_all
from sqlalchemy.orm import lazyload
from sqlalchemy.orm import lazyload_all
from sqlalchemy.orm import Load
from sqlalchemy.orm import load_only
from sqlalchemy.orm import make_transient
from sqlalchemy.orm import make_transient_to_detached
from sqlalchemy.orm import Mapper
from sqlalchemy.orm import mapper
from sqlalchemy.orm import MapperExtension
from sqlalchemy.orm import noload
from sqlalchemy.orm import object_mapper
from sqlalchemy.orm import object_session
from sqlalchemy.orm import outerjoin
from sqlalchemy.orm import polymorphic_union
from sqlalchemy.orm import PropComparator
from sqlalchemy.orm import public_factory
from sqlalchemy.orm import Query
from sqlalchemy.orm import query_expression
from sqlalchemy.orm import raiseload
from sqlalchemy.orm import reconstructor
from sqlalchemy.orm import relation
from sqlalchemy.orm import relationship
from sqlalchemy.orm import RelationshipProperty
from sqlalchemy.orm import remote
from sqlalchemy.orm import scoped_session
from sqlalchemy.orm import selectin_polymorphic
from sqlalchemy.orm import selectinload
from sqlalchemy.orm import selectinload_all
from sqlalchemy.orm import Session
from sqlalchemy.orm import SessionExtension
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import subqueryload
from sqlalchemy.orm import subqueryload_all
from sqlalchemy.orm import synonym
from sqlalchemy.orm import SynonymProperty
from sqlalchemy.orm import undefer
from sqlalchemy.orm import undefer_group
from sqlalchemy.orm import validates
from sqlalchemy.orm import was_deleted
from sqlalchemy.orm import with_expression
from sqlalchemy.orm import with_parent
from sqlalchemy.orm import with_polymorphic
from sqlalchemy.testing import assert_raises_message
from sqlalchemy.testing import fixtures
|
PypiClean
|
/klaviyo-api-beta-2.0.2.tar.gz/klaviyo-api-beta-2.0.2/src/openapi_client/model/campaign_partial_update_query.py
|
import re # noqa: F401
import sys # noqa: F401
from openapi_client.model_utils import ( # noqa: F401
ApiTypeError,
ModelComposed,
ModelNormal,
ModelSimple,
cached_property,
change_keys_js_to_python,
convert_js_args_to_python_args,
date,
datetime,
file_type,
none_type,
validate_get_composed_info,
OpenApiModel
)
from openapi_client.exceptions import ApiAttributeError
def lazy_import():
from openapi_client.model.campaign_partial_update_query_as_sub_resource import CampaignPartialUpdateQueryAsSubResource
globals()['CampaignPartialUpdateQueryAsSubResource'] = CampaignPartialUpdateQueryAsSubResource
class CampaignPartialUpdateQuery(ModelNormal):
"""NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually.
Attributes:
allowed_values (dict): The key is the tuple path to the attribute
and for var_name this is (var_name,). The value is a dict
with a capitalized key describing the allowed value and an allowed
value. These dicts store the allowed enum values.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
discriminator_value_class_map (dict): A dict to go from the discriminator
variable value to the discriminator class name.
validations (dict): The key is the tuple path to the attribute
and for var_name this is (var_name,). The value is a dict
that stores validations for max_length, min_length, max_items,
min_items, exclusive_maximum, inclusive_maximum, exclusive_minimum,
inclusive_minimum, and regex.
additional_properties_type (tuple): A tuple of classes accepted
as additional properties values.
"""
allowed_values = {
}
validations = {
}
@cached_property
def additional_properties_type():
"""
This must be a method because a model may have properties that are
of type self, this must run after the class is loaded
"""
lazy_import()
return (bool, date, datetime, dict, float, int, list, str, none_type,) # noqa: E501
_nullable = False
@cached_property
def openapi_types():
"""
This must be a method because a model may have properties that are
of type self, this must run after the class is loaded
Returns
openapi_types (dict): The key is attribute name
and the value is attribute type.
"""
lazy_import()
return {
'data': (CampaignPartialUpdateQueryAsSubResource,), # noqa: E501
}
@cached_property
def discriminator():
return None
attribute_map = {
'data': 'data', # noqa: E501
}
read_only_vars = {
}
_composed_schemas = {}
@classmethod
@convert_js_args_to_python_args
def _from_openapi_data(cls, data, *args, **kwargs): # noqa: E501
"""CampaignPartialUpdateQuery - a model defined in OpenAPI
Args:
data (CampaignPartialUpdateQueryAsSubResource):
Keyword Args:
_check_type (bool): if True, values for parameters in openapi_types
will be type checked and a TypeError will be
raised if the wrong type is input.
Defaults to True
_path_to_item (tuple/list): This is a list of keys or values to
drill down to the model in received_data
when deserializing a response
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_configuration (Configuration): the instance to use when
deserializing a file_type parameter.
If passed, type conversion is attempted
If omitted no type conversion is done.
_visited_composed_classes (tuple): This stores a tuple of
classes that we have traveled through so that
if we see that class again we will not use its
discriminator again.
When traveling through a discriminator, the
composed schema that is traveled through is added to this set.
For example if Animal has a discriminator
petType and we pass in "Dog", and the class Dog
allOf includes Animal, we move through Animal
once using the discriminator, and pick Dog.
Then in Dog, we will make an instance of the
Animal class but this time we won't travel
through its discriminator because we passed in
_visited_composed_classes = (Animal,)
"""
_check_type = kwargs.pop('_check_type', True)
_spec_property_naming = kwargs.pop('_spec_property_naming', True)
_path_to_item = kwargs.pop('_path_to_item', ())
_configuration = kwargs.pop('_configuration', None)
_visited_composed_classes = kwargs.pop('_visited_composed_classes', ())
self = super(OpenApiModel, cls).__new__(cls)
if args:
for arg in args:
if isinstance(arg, dict):
kwargs.update(arg)
else:
raise ApiTypeError(
"Invalid positional arguments=%s passed to %s. Remove those invalid positional arguments." % (
args,
self.__class__.__name__,
),
path_to_item=_path_to_item,
valid_classes=(self.__class__,),
)
self._data_store = {}
self._check_type = _check_type
self._spec_property_naming = _spec_property_naming
self._path_to_item = _path_to_item
self._configuration = _configuration
self._visited_composed_classes = _visited_composed_classes + (self.__class__,)
self.data = data
for var_name, var_value in kwargs.items():
if var_name not in self.attribute_map and \
self._configuration is not None and \
self._configuration.discard_unknown_keys and \
self.additional_properties_type is None:
# discard variable.
continue
setattr(self, var_name, var_value)
return self
required_properties = set([
'_data_store',
'_check_type',
'_spec_property_naming',
'_path_to_item',
'_configuration',
'_visited_composed_classes',
])
@convert_js_args_to_python_args
def __init__(self, data, *args, **kwargs): # noqa: E501
"""CampaignPartialUpdateQuery - a model defined in OpenAPI
Args:
data (CampaignPartialUpdateQueryAsSubResource):
Keyword Args:
_check_type (bool): if True, values for parameters in openapi_types
will be type checked and a TypeError will be
raised if the wrong type is input.
Defaults to True
_path_to_item (tuple/list): This is a list of keys or values to
drill down to the model in received_data
when deserializing a response
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_configuration (Configuration): the instance to use when
deserializing a file_type parameter.
If passed, type conversion is attempted
If omitted no type conversion is done.
_visited_composed_classes (tuple): This stores a tuple of
classes that we have traveled through so that
if we see that class again we will not use its
discriminator again.
When traveling through a discriminator, the
composed schema that is traveled through is added to this set.
For example if Animal has a discriminator
petType and we pass in "Dog", and the class Dog
allOf includes Animal, we move through Animal
once using the discriminator, and pick Dog.
Then in Dog, we will make an instance of the
Animal class but this time we won't travel
through its discriminator because we passed in
_visited_composed_classes = (Animal,)
"""
_check_type = kwargs.pop('_check_type', True)
_spec_property_naming = kwargs.pop('_spec_property_naming', False)
_path_to_item = kwargs.pop('_path_to_item', ())
_configuration = kwargs.pop('_configuration', None)
_visited_composed_classes = kwargs.pop('_visited_composed_classes', ())
if args:
for arg in args:
if isinstance(arg, dict):
kwargs.update(arg)
else:
raise ApiTypeError(
"Invalid positional arguments=%s passed to %s. Remove those invalid positional arguments." % (
args,
self.__class__.__name__,
),
path_to_item=_path_to_item,
valid_classes=(self.__class__,),
)
self._data_store = {}
self._check_type = _check_type
self._spec_property_naming = _spec_property_naming
self._path_to_item = _path_to_item
self._configuration = _configuration
self._visited_composed_classes = _visited_composed_classes + (self.__class__,)
self.data = data
for var_name, var_value in kwargs.items():
if var_name not in self.attribute_map and \
self._configuration is not None and \
self._configuration.discard_unknown_keys and \
self.additional_properties_type is None:
# discard variable.
continue
setattr(self, var_name, var_value)
if var_name in self.read_only_vars:
raise ApiAttributeError(f"`{var_name}` is a read-only attribute. Use `from_openapi_data` to instantiate "
f"class with read only attributes.")
|
PypiClean
|
/alipay_sdk_python-3.6.740-py3-none-any.whl/alipay/aop/api/request/AlipayOpenMiniWidgetGoodsQueryRequest.py
|
import json
from alipay.aop.api.FileItem import FileItem
from alipay.aop.api.constant.ParamConstants import *
from alipay.aop.api.domain.AlipayOpenMiniWidgetGoodsQueryModel import AlipayOpenMiniWidgetGoodsQueryModel
class AlipayOpenMiniWidgetGoodsQueryRequest(object):
def __init__(self, biz_model=None):
self._biz_model = biz_model
self._biz_content = None
self._version = "1.0"
self._terminal_type = None
self._terminal_info = None
self._prod_code = None
self._notify_url = None
self._return_url = None
self._udf_params = None
self._need_encrypt = False
@property
def biz_model(self):
return self._biz_model
@biz_model.setter
def biz_model(self, value):
self._biz_model = value
@property
def biz_content(self):
return self._biz_content
@biz_content.setter
def biz_content(self, value):
if isinstance(value, AlipayOpenMiniWidgetGoodsQueryModel):
self._biz_content = value
else:
self._biz_content = AlipayOpenMiniWidgetGoodsQueryModel.from_alipay_dict(value)
@property
def version(self):
return self._version
@version.setter
def version(self, value):
self._version = value
@property
def terminal_type(self):
return self._terminal_type
@terminal_type.setter
def terminal_type(self, value):
self._terminal_type = value
@property
def terminal_info(self):
return self._terminal_info
@terminal_info.setter
def terminal_info(self, value):
self._terminal_info = value
@property
def prod_code(self):
return self._prod_code
@prod_code.setter
def prod_code(self, value):
self._prod_code = value
@property
def notify_url(self):
return self._notify_url
@notify_url.setter
def notify_url(self, value):
self._notify_url = value
@property
def return_url(self):
return self._return_url
@return_url.setter
def return_url(self, value):
self._return_url = value
@property
def udf_params(self):
return self._udf_params
@udf_params.setter
def udf_params(self, value):
if not isinstance(value, dict):
return
self._udf_params = value
@property
def need_encrypt(self):
return self._need_encrypt
@need_encrypt.setter
def need_encrypt(self, value):
self._need_encrypt = value
def add_other_text_param(self, key, value):
if not self.udf_params:
self.udf_params = dict()
self.udf_params[key] = value
def get_params(self):
params = dict()
params[P_METHOD] = 'alipay.open.mini.widget.goods.query'
params[P_VERSION] = self.version
if self.biz_model:
params[P_BIZ_CONTENT] = json.dumps(obj=self.biz_model.to_alipay_dict(), ensure_ascii=False, sort_keys=True, separators=(',', ':'))
if self.biz_content:
if hasattr(self.biz_content, 'to_alipay_dict'):
params['biz_content'] = json.dumps(obj=self.biz_content.to_alipay_dict(), ensure_ascii=False, sort_keys=True, separators=(',', ':'))
else:
params['biz_content'] = self.biz_content
if self.terminal_type:
params['terminal_type'] = self.terminal_type
if self.terminal_info:
params['terminal_info'] = self.terminal_info
if self.prod_code:
params['prod_code'] = self.prod_code
if self.notify_url:
params['notify_url'] = self.notify_url
if self.return_url:
params['return_url'] = self.return_url
if self.udf_params:
params.update(self.udf_params)
return params
def get_multipart_params(self):
multipart_params = dict()
return multipart_params
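# Hedged usage sketch (not part of the generated SDK): shows how get_params()
# assembles the final request dictionary; the empty model below is
# illustrative only.
if __name__ == '__main__':
    _demo = AlipayOpenMiniWidgetGoodsQueryRequest(
        biz_model=AlipayOpenMiniWidgetGoodsQueryModel())
    # The dict includes P_METHOD ('alipay.open.mini.widget.goods.query'),
    # P_VERSION ('1.0') and the serialized biz_model.
    print(_demo.get_params())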
|
PypiClean
|
/dash_daq-0.5.0.tar.gz/dash_daq-0.5.0/dash_daq/__init__.py
|
import os as _os
import dash as _dash
import sys as _sys
import json
_basepath = _os.path.dirname(__file__)
_filepath = _os.path.abspath(_os.path.join(_basepath, 'package-info.json'))
with open(_filepath) as f:
__version__ = json.loads(f.read())['version']
_current_path = _os.path.dirname(_os.path.abspath(__file__))
_components = _dash.development.component_loader.load_components(
_os.path.join(_current_path, 'metadata.json'),
'dash_daq'
)
_this_module = _sys.modules[__name__]
async_resources = [
'colorpicker',
'slider'
]
_js_dist = []
_js_dist.extend([{
'relative_package_path': 'async-{}.js'.format(async_resource),
'external_url': (
'https://unpkg.com/dash-daq@{}'
'/dash_daq/async-{}.js'
).format(__version__, async_resource),
'namespace': 'dash_daq',
'async': True
} for async_resource in async_resources])
_js_dist.extend([{
'relative_package_path': 'async-{}.js.map'.format(async_resource),
'external_url': (
'https://unpkg.com/dash-daq@{}'
'/dash_daq/async-{}.js.map'
).format(__version__, async_resource),
'namespace': 'dash_daq',
'dynamic': True
} for async_resource in async_resources])
_js_dist.extend([
{
"relative_package_path": "dash_daq.min.js",
"external_url": (
"https://unpkg.com/dash-daq@{}"
"/dash_daq/dash_daq.min.js"
).format(__version__),
"namespace": "dash_daq"
}
])
_js_dist.extend([
{
"relative_package_path": "dash_daq.min.js.map",
"external_url": (
"https://unpkg.com/dash-daq@{}"
"/dash_daq/dash_daq.min.js.map"
).format(__version__),
"namespace": "dash_daq",
'dynamic': True
}
])
_css_dist = []
for _component in _components:
setattr(_this_module, _component.__name__, _component)
setattr(_component, '_js_dist', _js_dist)
setattr(_component, '_css_dist', _css_dist)
|
PypiClean
|
/tensorflow-probability-0.2.0.tar.gz/tensorflow-probability-0.2.0/tensorflow_probability/python/math/custom_gradient.py
|
"""Functions for specifying custom gradients."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
__all__ = [
'custom_gradient',
]
def is_list_like(x):
return isinstance(x, (tuple, list))
def identity(x, dtype=None, name=None):
return tf.identity(tf.convert_to_tensor(
x, dtype=dtype, name=name), name=name)
def custom_gradient(fx, gx, x, fx_gx_manually_stopped=False, name=None):
"""Embeds a custom gradient into a `Tensor`.
This function works by clever application of `stop_gradient`. I.e., observe
that:
```none
h(x) = stop_gradient(f(x)) + stop_gradient(g(x)) * (x - stop_gradient(x))
```
is such that `h(x) == stop_gradient(f(x))` and
`grad[h(x), x] == stop_gradient(g(x)).`
In addition to scalar-domain/scalar-range functions, this function also
supports tensor-domain/scalar-range functions.
Partial Custom Gradient:
Suppose `h(x) = htilde(x, y)`. Note that `dh/dx = stop(g(x))` but `dh/dy =
None`. This is because a `Tensor` cannot have only a portion of its gradient
stopped. To circumvent this issue, one must manually `stop_gradient` the
relevant portions of `f`, `g`. For example see the unit-test,
`test_works_correctly_fx_gx_manually_stopped`.
Args:
fx: `Tensor`. Output of function evaluated at `x`.
gx: `Tensor` or list of `Tensor`s. Gradient of function at (each) `x`.
x: `Tensor` or list of `Tensor`s. Args of evaluation for `f`.
fx_gx_manually_stopped: Python `bool` indicating that `fx`, `gx` manually
have `stop_gradient` applied.
name: Python `str` name prefixed to Ops created by this function.
Returns:
fx: Floating-type `Tensor` equal to `f(x)` but which has gradient
`stop_gradient(g(x))`.
"""
def maybe_stop(x):
if fx_gx_manually_stopped:
return x
return tf.stop_gradient(x)
with tf.name_scope(name, 'custom_gradient', [fx, gx, x]):
fx = tf.convert_to_tensor(fx, name='fx')
# We don't want to bother eagerly computing `gx` since we may not even need
# it.
with tf.control_dependencies([fx]):
if is_list_like(x):
x = [identity(x_, name='x') for x_ in x]
else:
x = [identity(x, name='x')]
if is_list_like(gx):
gx = [identity(gx_, dtype=fx.dtype, name='gx')
for gx_ in gx]
else:
gx = [identity(gx, dtype=fx.dtype, name='gx')]
override_grad = []
for x_, gx_ in zip(x, gx):
# Observe: tf.gradients(f(x), x)[i].shape == x[i].shape
# thus we check that the user is supplying correct shapes.
equal_shape = tf.assert_equal(
tf.shape(x_),
tf.shape(gx_),
message='Each `x` must have the same shape as each `gx`.')
with tf.control_dependencies([equal_shape]):
# IEEE754 ensures `(x-x)==0.` and that `0.*x==0.` so we make sure to
# write the code this way, rather than, e.g.,
# `sum_x * stop(gx) + stop(fx - sum_x * gx)`.
# For more discussion regarding the relevant portions of the IEEE754
# standard, see the StackOverflow question,
# "Is there a floating point value of x, for which x-x == 0 is false?"
# http://stackoverflow.com/q/2686644
zeros_like_x_ = x_ - tf.stop_gradient(x_)
override_grad.append(tf.reduce_sum(
maybe_stop(gx_) * zeros_like_x_))
override_grad = sum(override_grad)
override_grad /= tf.cast(tf.size(fx), dtype=fx.dtype.base_dtype)
# Proof of correctness:
#
# f(x) = x * stop[gx] + stop[fx - x * gx]
# = stop[fx]
#
# g(x) = grad[fx]
# = stop[gx] + grad[stop[fx - x * gx]]
# = stop[gx] + 0
#
# Notice that when x is zero it still works:
# grad[x * stop(gx) + stop(fx - x * gx)] = 1 * stop[gx] + 0 = stop[gx]
#
# The proof is similar for the tensor-domain case, except that we
# `reduce_sum` the `stop[gx] * (x - stop[x])` then rescale by
# `tf.size(fx)` since this reduced version is broadcast to `fx`.
return maybe_stop(fx) + override_grad
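# Hedged usage sketch (not part of the library): embeds the analytic gradient
# d/dx log(x) = 1/x via `custom_gradient` above. Uses TF1-style graph
# execution, matching the rest of this module.
if __name__ == '__main__':
    x_demo = tf.constant([1., 2., 4.])
    y_demo = custom_gradient(fx=tf.log(x_demo), gx=1. / x_demo, x=x_demo)
    # The gradient of y_demo w.r.t. x_demo evaluates to stop_gradient(1 / x).
    grad_demo = tf.gradients(y_demo, x_demo)[0]
    with tf.Session() as sess:
        print(sess.run([y_demo, grad_demo]))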
|
PypiClean
|
/hybrid-vocal-classifier-0.3.1.tar.gz/hybrid-vocal-classifier-0.3.1/src/hvc/parse/predict.py
|
import os
import copy
import csv
import yaml
import joblib
from .utils import check_for_missing_keys, flatten
path = os.path.abspath(__file__)
dir_path = os.path.dirname(path)
with open(os.path.join(dir_path, "validation.yml")) as val_yaml:
validate_dict = yaml.load(val_yaml, Loader=yaml.FullLoader)
REQUIRED_TODO_LIST_KEYS = set(validate_dict["required_predict_todo_list_keys"])
REQUIRED_TODO_LIST_KEYS_FLATTENED = set(
flatten(validate_dict["required_predict_todo_list_keys"])
)
OPTIONAL_TODO_LIST_KEYS = set(validate_dict["optional_predict_todo_list_keys"])
VALID_MODELS = validate_dict["valid_models"]
VALID_CONVERT_TYPES = validate_dict["valid_convert_types"]
MUST_TRAIN_WITH_PROB_TRUE = validate_dict["must_train_with_prob_true"]
def _validate_todo_list_dict(todo_list_dict, index, config_path):
"""
validates to-do lists
Parameters
----------
todo_list_dict : dict
from "to-do" list
index : int
index of element (i.e., dictionary) in list of dictionaries
config_path : str
absolute path to YAML config file from which dict was taken.
Used to validate directory names.
Returns
-------
todo_list_dict : dict
after validation, may have new keys added if necessary
"""
# if required_todo_list_keys is not a subset of todo_list_dict,
# i.e., if not all required keys are in todo_list_dict
missing_keys = check_for_missing_keys(todo_list_dict, REQUIRED_TODO_LIST_KEYS)
if missing_keys:
raise KeyError(
"The following required keys "
"were not found in todo_list item #{}: {}".format(index, missing_keys)
)
else:
additional_keys = set(todo_list_dict.keys()) - REQUIRED_TODO_LIST_KEYS_FLATTENED
for extra_key in additional_keys:
if extra_key not in OPTIONAL_TODO_LIST_KEYS:
raise KeyError(
"key {} in todo_list item #{} is not recognized".format(
extra_key, index
)
)
validated_todo_list_dict = copy.deepcopy(todo_list_dict)
for key, val in todo_list_dict.items():
# valid todo_list_dict keys in alphabetical order
if key == "annotation_file":
with open(val, newline="") as f:
reader = csv.reader(f, delimiter=",")
first_row = next(reader)
if first_row != "filename,index,onset,offset,label".split(","):
raise ValueError("annotation_file did not have correct header")
elif key == "bird_ID":
if type(val) != str:
raise ValueError(
"Value {} for key 'bird_ID' is type {} but it"
" should be a string".format(val, type(val))
)
elif key == "convert":
if type(val) != str:
raise TypeError(
"Specifier for `convert` in to-do list should be "
"a string, but parsed as a {}".format(type(val))
)
elif val not in VALID_CONVERT_TYPES:
raise ValueError(
"{} is not a valid format that predict output "
"can be converted to".format(val)
)
elif key == "data_dirs":
if type(val) != list:
raise TypeError("data_dirs should be a list")
else:
validated_data_dirs = []
for item in val:
if not os.path.isdir(item):
# if item is not absolute path to dir
# try adding item to absolute path to config_file
# i.e. assume it is written relative to config file
item = os.path.join(
os.path.dirname(config_path), os.path.normpath(item)
)
if not os.path.isdir(item):
raise ValueError(
"directory {} in {} is not a valid directory.".format(
item, key
)
)
validated_data_dirs.append(item)
validated_todo_list_dict["data_dirs"] = validated_data_dirs
elif key == "file_format":
if type(val) != str:
raise ValueError(
"Value {} for key 'file_format' is type {} but it"
" should be a string".format(val, type(val))
)
else:
if val not in validate_dict["valid_file_formats"]:
raise ValueError("{} is not a known audio file format".format(val))
elif key == "model_meta_file":
if type(val) != str:
raise ValueError(
"Value {} for key 'feature_file' is type {} but it"
" should be a string".format(val, type(val))
)
if not os.path.isfile(os.path.normpath(val)):
# if val is not absolute path to meta_file
# try adding item to absolute path to config_file
# i.e. assume path to file is written relative to config file
val = os.path.join(os.path.dirname(config_path), os.path.normpath(val))
if not os.path.isfile(val):
raise FileNotFoundError("{} is not found as a file".format(val))
# check that model file can be opened
model_meta_file = joblib.load(val)
model_filename = model_meta_file["model_filename"]
model_name = model_meta_file["model_name"]
if model_name in VALID_MODELS["sklearn"]:
try:
joblib.load(model_filename)
except OSError:
raise OSError(
"Unable to open model file: {}".format(model_filename)
)
elif model_name in VALID_MODELS["keras"]:
try:
import keras.models
keras.models.load_model(model_filename)
except OSError:
raise OSError(
"Unable to open model file: {}".format(model_filename)
)
elif key == "output_dir":
if type(val) != str:
raise ValueError(
"output_dirs should be a string but it parsed as a {}".format(
type(val)
)
)
elif key == "predict_proba":
if type(val) != bool:
raise ValueError(
"predict_proba should be a Boolean but it parsed as {}".format(
type(val)
)
)
else: # if key is not found in list
raise KeyError(
"key {} found in todo_list_dict but not validated".format(key)
)
return validated_todo_list_dict
def validate_yaml(config_path, predict_config_yaml):
"""
validates config from YAML file
Parameters
----------
config_path : str
absolute path to YAML config file. Used to validate directory names
in YAML files, which are assumed to be written relative to the
location of the file itself.
predict_config_yaml : dict
dict should be config from YAML file as loaded with pyyaml.
Returns
-------
predict_config_dict : dict
after validation of all keys
"""
validated_predict_config = copy.deepcopy(predict_config_yaml)
for key, val in predict_config_yaml.items():
if key == "todo_list":
if type(val) != list:
raise TypeError(
"todo_list did not parse as a list, instead it parsed as {}. "
"Please check config file formatting.".format(type(val))
)
else:
for index, item in enumerate(val):
if type(item) != dict:
raise TypeError(
"item {} in todo_list did not parse as a dictionary, "
"instead it parsed as a {}. Please check config file"
" formatting".format(index, type(item))
)
else:
val[index] = _validate_todo_list_dict(item, index, config_path)
# make sure that if predict_proba is True, that the model
# was trained with predict_proba set to True.
# Need to do this *after* already validating all model_meta_file keys
for item in val: # where each item is a todo_list_dict
if "predict_proba" in item:
if item["predict_proba"]: # if it is True, then
# make sure model was trained with predict_proba set to True
model_meta_file = item["model_meta_file"]
if not os.path.isfile(os.path.normpath(model_meta_file)):
# if val is not absolute path to meta_file
# try adding item to absolute path to config_file
# i.e. assume path to file is written relative to config file
model_meta_file = os.path.join(
os.path.dirname(config_path),
os.path.normpath(model_meta_file),
)
model_meta_file = joblib.load(model_meta_file)
model_name = model_meta_file["model_name"]
if model_name in MUST_TRAIN_WITH_PROB_TRUE:
# if model not in MUST_TRAIN_WITH_PROB_TRUE
# then we get probabilities for free with the model
# as implemented, e.g. kNeighborsClassifier
# from scikit-learn, and any neural net that
# has a softmax layer as the output
model_filename = model_meta_file["model_filename"]
model = joblib.load(model_filename)
if not model.probability:
raise AttributeError(
"predict_proba in config file is set to True, "
"but model was not trained with predict_proba "
"set to True.\n"
"config file is: {}\n"
"model meta file is: {}\n"
"model file is: {}".format(
config_path,
item["model_meta_file"],
model_filename,
)
)
validated_predict_config["todo_list"] = val
else: # if key is not found in list
raise KeyError("key {} in 'predict' is an invalid key".format(key))
return validated_predict_config
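# Hedged illustration (values are hypothetical, not shipped with the library):
# the 'predict' config validated above has roughly the following YAML shape,
# using the keys handled in _validate_todo_list_dict. Which keys are required
# vs. optional is defined in validation.yml, which is not reproduced here.
#
# todo_list:
#   - bird_ID: bl26lb16
#     file_format: cbin
#     data_dirs:
#       - ./032212
#     model_meta_file: ./knn.meta
#     output_dir: ./output
#     predict_proba: true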
|
PypiClean
|
/taskcc-alipay-sdk-python-3.3.398.tar.gz/taskcc-alipay-sdk-python-3.3.398/alipay/aop/api/domain/AlipayEcoEduKtBillingModifyModel.py
|
import json
from alipay.aop.api.constant.ParamConstants import *
class AlipayEcoEduKtBillingModifyModel(object):
def __init__(self):
self._buyer_logon_id = None
self._buyer_user_id = None
self._fund_change = None
self._gmt_refund = None
self._out_request_no = None
self._out_trade_no = None
self._refund_amount = None
self._refund_detail_item_list = None
self._refund_reason = None
self._status = None
self._trade_no = None
@property
def buyer_logon_id(self):
return self._buyer_logon_id
@buyer_logon_id.setter
def buyer_logon_id(self, value):
self._buyer_logon_id = value
@property
def buyer_user_id(self):
return self._buyer_user_id
@buyer_user_id.setter
def buyer_user_id(self, value):
self._buyer_user_id = value
@property
def fund_change(self):
return self._fund_change
@fund_change.setter
def fund_change(self, value):
self._fund_change = value
@property
def gmt_refund(self):
return self._gmt_refund
@gmt_refund.setter
def gmt_refund(self, value):
self._gmt_refund = value
@property
def out_request_no(self):
return self._out_request_no
@out_request_no.setter
def out_request_no(self, value):
self._out_request_no = value
@property
def out_trade_no(self):
return self._out_trade_no
@out_trade_no.setter
def out_trade_no(self, value):
self._out_trade_no = value
@property
def refund_amount(self):
return self._refund_amount
@refund_amount.setter
def refund_amount(self, value):
self._refund_amount = value
@property
def refund_detail_item_list(self):
return self._refund_detail_item_list
@refund_detail_item_list.setter
def refund_detail_item_list(self, value):
self._refund_detail_item_list = value
@property
def refund_reason(self):
return self._refund_reason
@refund_reason.setter
def refund_reason(self, value):
self._refund_reason = value
@property
def status(self):
return self._status
@status.setter
def status(self, value):
self._status = value
@property
def trade_no(self):
return self._trade_no
@trade_no.setter
def trade_no(self, value):
self._trade_no = value
def to_alipay_dict(self):
params = dict()
if self.buyer_logon_id:
if hasattr(self.buyer_logon_id, 'to_alipay_dict'):
params['buyer_logon_id'] = self.buyer_logon_id.to_alipay_dict()
else:
params['buyer_logon_id'] = self.buyer_logon_id
if self.buyer_user_id:
if hasattr(self.buyer_user_id, 'to_alipay_dict'):
params['buyer_user_id'] = self.buyer_user_id.to_alipay_dict()
else:
params['buyer_user_id'] = self.buyer_user_id
if self.fund_change:
if hasattr(self.fund_change, 'to_alipay_dict'):
params['fund_change'] = self.fund_change.to_alipay_dict()
else:
params['fund_change'] = self.fund_change
if self.gmt_refund:
if hasattr(self.gmt_refund, 'to_alipay_dict'):
params['gmt_refund'] = self.gmt_refund.to_alipay_dict()
else:
params['gmt_refund'] = self.gmt_refund
if self.out_request_no:
if hasattr(self.out_request_no, 'to_alipay_dict'):
params['out_request_no'] = self.out_request_no.to_alipay_dict()
else:
params['out_request_no'] = self.out_request_no
if self.out_trade_no:
if hasattr(self.out_trade_no, 'to_alipay_dict'):
params['out_trade_no'] = self.out_trade_no.to_alipay_dict()
else:
params['out_trade_no'] = self.out_trade_no
if self.refund_amount:
if hasattr(self.refund_amount, 'to_alipay_dict'):
params['refund_amount'] = self.refund_amount.to_alipay_dict()
else:
params['refund_amount'] = self.refund_amount
if self.refund_detail_item_list:
if hasattr(self.refund_detail_item_list, 'to_alipay_dict'):
params['refund_detail_item_list'] = self.refund_detail_item_list.to_alipay_dict()
else:
params['refund_detail_item_list'] = self.refund_detail_item_list
if self.refund_reason:
if hasattr(self.refund_reason, 'to_alipay_dict'):
params['refund_reason'] = self.refund_reason.to_alipay_dict()
else:
params['refund_reason'] = self.refund_reason
if self.status:
if hasattr(self.status, 'to_alipay_dict'):
params['status'] = self.status.to_alipay_dict()
else:
params['status'] = self.status
if self.trade_no:
if hasattr(self.trade_no, 'to_alipay_dict'):
params['trade_no'] = self.trade_no.to_alipay_dict()
else:
params['trade_no'] = self.trade_no
return params
@staticmethod
def from_alipay_dict(d):
if not d:
return None
o = AlipayEcoEduKtBillingModifyModel()
if 'buyer_logon_id' in d:
o.buyer_logon_id = d['buyer_logon_id']
if 'buyer_user_id' in d:
o.buyer_user_id = d['buyer_user_id']
if 'fund_change' in d:
o.fund_change = d['fund_change']
if 'gmt_refund' in d:
o.gmt_refund = d['gmt_refund']
if 'out_request_no' in d:
o.out_request_no = d['out_request_no']
if 'out_trade_no' in d:
o.out_trade_no = d['out_trade_no']
if 'refund_amount' in d:
o.refund_amount = d['refund_amount']
if 'refund_detail_item_list' in d:
o.refund_detail_item_list = d['refund_detail_item_list']
if 'refund_reason' in d:
o.refund_reason = d['refund_reason']
if 'status' in d:
o.status = d['status']
if 'trade_no' in d:
o.trade_no = d['trade_no']
return o
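# Hedged usage sketch (not part of the SDK): round-trips the model through its
# dict form. Field values below are illustrative only.
if __name__ == '__main__':
    _m = AlipayEcoEduKtBillingModifyModel()
    _m.out_trade_no = '20150320010101001'
    _m.refund_amount = '10.00'
    _d = _m.to_alipay_dict()
    _m2 = AlipayEcoEduKtBillingModifyModel.from_alipay_dict(_d)
    print(_d == _m2.to_alipay_dict())  # True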
|
PypiClean
|
/wasi-0.4.1.tar.gz/wasi-0.4.1/README.md
|
<p align="center">
<a href="https://github.com/wasienv/wasienv" target="_blank" rel="noopener noreferrer">
<img height="180" src="https://raw.githubusercontent.com/wasienv/wasienv/master/logo.png" alt="Wasienv logo">
</a>
</p>
<p align="center">
<a href="https://github.com/wasienv/wasienv/actions?workflow=CI">
<img src="https://github.com/wasienv/wasienv/workflows/CI/badge.svg?style=flat-square" alt="Tests">
</a>
<a href="https://github.com/wasmerio/wasmer/blob/master/LICENSE">
<img src="https://img.shields.io/github/license/wasienv/wasienv.svg?style=flat-square" alt="License">
</a>
</p>
# Wasienv: WASI Development Toolchain for C/C++
Wasienv is a tool that aims to bring all projects to [WebAssembly WASI](https://github.com/WebAssembly/WASI). With `wasienv` you can compile C/C++ projects easily to WASI, so you can run them anywhere (with any Standalone WASI WebAssembly runtime, or [in the Browser](https://webassembly.sh)).
> Note: If you aim to use the WebAssembly files in the web directly (using graphics, audio or other tools that are not supported in WASI) then [Emscripten](https://emscripten.org/) is probably a much better choice.
## Install
You can install `wasienv` with:
```sh
curl https://raw.githubusercontent.com/wasienv/wasienv/master/install.sh | sh
```
> Note: we also ship `wasienv` in a Docker image. You can check [how to use the Wasienv Docker image here](https://github.com/wasienv/wasienv/blob/master/docker/).
## Using wasienv
If you want to compile a C file to a WebAssembly WASI:
```sh
# To compile to a WebAssembly WASI file
# This command will generate:
# • An executable: ./example
# • A WebAssembly file: ./example.wasm
wasicc examples/example.c -o example
# If you are using configure
wasiconfigure ./configure
# If you are using cmake (or make)
wasimake cmake .
```
If you want to compile a C file to plain WebAssembly:
```sh
# To compile to a WebAssembly file
# This command will generate:
# • An executable: ./example
# • A WebAssembly file: ./example.wasm
wasmcc examples/example.c -o example
```
## Commands
When installing `wasienv`, the following commands will be automatically available:
### `wasienv`
This is the compiler toolchain. You have two commands available:
For installing a SDK (`wasienv install-sdk`):
```sh
wasienv install-sdk 7
```
For setting a SDK as the default (`wasienv default-sdk`):
```sh
wasienv default-sdk 7
```
### `wasicc`
It's a wrapper on top of `clang`, with additions for the stubs, sysroot and target.
It also automatically detects executables in the output and wraps them so they run with a WebAssembly WASI runtime via `wasirun`.
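For instance, compiling and then running the produced module (a sketch combining the commands shown elsewhere in this README):
```sh
wasicc examples/example.c -o example
wasirun example.wasm
```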
### `wasic++`
It's a wrapper on top of `clang++`, with additions for the stubs, sysroot and target.
It also automatically detects executables in the output and wraps them so they run with a WebAssembly WASI runtime via `wasirun`.
### `wasmcc`
It's a wrapper on top of `clang`, with additions for preconfiguring the wasm linker, target, etc...
### `wasmc++`
It's a wrapper on top of `clang++`, with additions for preconfiguring the wasm linker, target, etc...
### `wasiconfigure`
It's a helper that adds the wasienv environment vars (`CC`, `CXX`, `RANLIB`, ...) to the following command (`./configure`).
Example:
```sh
wasiconfigure ./configure
```
### `wasimake`
It's a helper that adds the wasienv environment vars (`CC`, `CXX`, `RANLIB`, ...) for the build step (`make` or `cmake`).
Example:
```sh
# With CMake
wasimake cmake .
# With Make
wasimake make
```
### `wasirun`
It executes a given WebAssembly file with a standalone WebAssembly runtime.
```sh
wasirun myfile.wasm
```
## Contributing
After cloning this repo, ensure dependencies are installed by running:
```sh
make install-dev
```
After that, all the commands will be available on your shell and you should be able to start seeing the changes directly without re-installing wasienv.
## Testing
After running `make install-dev` you can run directly:
```sh
make test
```
## How wasienv compares to …?
### Emscripten
[Emscripten](https://emscripten.org/) is a great toolchain that lets you compile your C/C++ projects to WebAssembly so you can use them on the web easily.
However, Emscripten has a **non-stable ABI** (because constant and fast iteration is very useful for their use case).
This makes it a bit challenging for standalone runtimes to continually adapt.
Because of that, adopting the WASI ABI is a much easier path for standalone server-side WebAssembly runtimes.
Right now Emscripten is [moving towards WASI adoption](https://github.com/emscripten-core/emscripten/issues/9479).
However, Emscripten can only emit WASI WebAssembly files for some programs as Emscripten's filesystem layer supports POSIX features not yet present in WASI.
Emscripten has also some tools that are not needed (nor supported) in the case of server-side Standalone WebAssembly runtimes, such as [`EM_JS` and `EM_ASM`](https://emscripten.org/docs/porting/connecting_cpp_and_javascript/Interacting-with-code.html#calling-javascript-from-c-c).
Wasienv learns a lot from Emscripten, since they figured out the perfect ergonomics for helping C/C++ projects adopt WebAssembly. Alon, the creator of Emscripten, is without any doubt one of the brilliant minds behind WebAssembly, and he inspired us with his work to keep improving the ergonomics of WASI.
### WASI-libc
WASI-libc is the "frontend ABI" for WASI. By itself, it only provides header files and implementations that make C compilers adopt WASI very easily via the `--sysroot` flag.
### WASI-SDK
We can see WASI-SDK as the union between `WASI-libc` and the compiler binaries `clang`, `wasm-ld`, ...
Wasienv is using WASI-SDK under the hood to compile to WebAssembly, however it differs from it in two major ways:
1. `wasienv` is designed to work with **multiple SDK versions**
2. `wasienv` is completely focused on **ergonomics**, exposing very simple-to-use CLI tools so that projects can adopt it easily.
We can think of `wasienv` as applying the ergonomic ideas from Emscripten to the WASI-SDK.
|
PypiClean
|
/aspose-tasks-cloud-22.12.0.tar.gz/aspose-tasks-cloud-22.12.0/asposetaskscloud/models/task.py
|
import pprint
import re # noqa: F401
import six
class Task(object):
"""Represents project task.
"""
"""
Attributes:
swagger_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
swagger_types = {
'uid': 'int',
'id': 'int',
'name': 'str',
'duration_text': 'str',
'duration': 'str',
'start': 'datetime',
'finish': 'datetime',
'start_text': 'str',
'finish_text': 'str',
'percent_complete': 'int',
'percent_work_complete': 'int',
'is_active': 'bool',
'actual_cost': 'float',
'actual_duration': 'str',
'actual_finish': 'datetime',
'actual_overtime_cost': 'float',
'actual_overtime_work': 'str',
'actual_work_protected': 'str',
'actual_overtime_work_protected': 'str',
'actual_start': 'datetime',
'budget_work': 'str',
'budget_cost': 'float',
'constraint_date': 'datetime',
'constraint_type': 'ConstraintType',
'contact': 'str',
'cost': 'float',
'cv': 'float',
'deadline': 'datetime',
'duration_variance': 'str',
'early_finish': 'datetime',
'early_start': 'datetime',
'is_effort_driven': 'bool',
'is_external_task': 'bool',
'external_task_project': 'str',
'external_id': 'int',
'finish_slack': 'int',
'finish_variance': 'int',
'fixed_cost': 'float',
'fixed_cost_accrual': 'CostAccrualType',
'free_slack': 'int',
'guid': 'str',
'has_overallocated_resource': 'bool',
'hide_bar': 'bool',
'ignore_resource_calendar': 'bool',
'late_finish': 'datetime',
'late_start': 'datetime',
'is_level_assignments': 'bool',
'can_leveling_split': 'bool',
'leveling_delay': 'int',
'is_marked': 'bool',
'is_milestone': 'bool',
'is_critical': 'bool',
'is_subproject': 'bool',
'is_subproject_read_only': 'bool',
'subproject_name': 'str',
'is_summary': 'bool',
'subtasks_uids': 'list[int]',
'outline_level': 'int',
'is_over_allocated': 'bool',
'is_estimated': 'bool',
'overtime_cost': 'float',
'overtime_work': 'str',
'physical_percent_complete': 'int',
'pre_leveled_finish': 'datetime',
'pre_leveled_start': 'datetime',
'is_recurring': 'bool',
'regular_work': 'str',
'remaining_cost': 'float',
'remaining_duration': 'str',
'remaining_overtime_cost': 'float',
'remaining_overtime_work': 'str',
'remaining_work': 'str',
'resume': 'datetime',
'is_resume_valid': 'bool',
'stop': 'datetime',
'is_rollup': 'bool',
'start_slack': 'int',
'start_variance': 'int',
'calendar_uid': 'int',
'is_manual': 'bool',
'manual_start': 'datetime',
'manual_finish': 'datetime',
'manual_duration': 'str',
'total_slack': 'int',
'type': 'TaskType',
'wbs': 'str',
'priority': 'int',
'work': 'str',
'work_variance': 'float',
'notes_text': 'str',
'notes_rtf': 'str',
'acwp': 'float',
'bcws': 'float',
'bcwp': 'float',
'leveling_delay_format': 'TimeUnitType',
'predecessors': 'str',
'successors': 'str',
'ignore_warnings': 'bool',
'is_expanded': 'bool',
'display_on_timeline': 'bool',
'display_as_summary': 'bool',
'hyperlink': 'str',
'hyperlink_address': 'str',
'hyperlink_sub_address': 'str',
'earned_value_method': 'EarnedValueMethodType',
'is_published': 'bool',
'status_manager': 'str',
'commitment_start': 'datetime',
'commitment_finish': 'datetime',
'commitment_type': 'int',
'baselines': 'list[TaskBaseline]',
'extended_attributes': 'list[ExtendedAttribute]',
'outline_codes': 'list[OutlineCode]',
'warning': 'bool',
'activity_id': 'str'
}
attribute_map = {
'uid': 'uid',
'id': 'id',
'name': 'name',
'duration_text': 'durationText',
'duration': 'duration',
'start': 'start',
'finish': 'finish',
'start_text': 'startText',
'finish_text': 'finishText',
'percent_complete': 'percentComplete',
'percent_work_complete': 'percentWorkComplete',
'is_active': 'isActive',
'actual_cost': 'actualCost',
'actual_duration': 'actualDuration',
'actual_finish': 'actualFinish',
'actual_overtime_cost': 'actualOvertimeCost',
'actual_overtime_work': 'actualOvertimeWork',
'actual_work_protected': 'actualWorkProtected',
'actual_overtime_work_protected': 'actualOvertimeWorkProtected',
'actual_start': 'actualStart',
'budget_work': 'budgetWork',
'budget_cost': 'budgetCost',
'constraint_date': 'constraintDate',
'constraint_type': 'constraintType',
'contact': 'contact',
'cost': 'cost',
'cv': 'cv',
'deadline': 'deadline',
'duration_variance': 'durationVariance',
'early_finish': 'earlyFinish',
'early_start': 'earlyStart',
'is_effort_driven': 'isEffortDriven',
'is_external_task': 'isExternalTask',
'external_task_project': 'externalTaskProject',
'external_id': 'externalId',
'finish_slack': 'finishSlack',
'finish_variance': 'finishVariance',
'fixed_cost': 'fixedCost',
'fixed_cost_accrual': 'fixedCostAccrual',
'free_slack': 'freeSlack',
'guid': 'guid',
'has_overallocated_resource': 'hasOverallocatedResource',
'hide_bar': 'hideBar',
'ignore_resource_calendar': 'ignoreResourceCalendar',
'late_finish': 'lateFinish',
'late_start': 'lateStart',
'is_level_assignments': 'isLevelAssignments',
'can_leveling_split': 'canLevelingSplit',
'leveling_delay': 'levelingDelay',
'is_marked': 'isMarked',
'is_milestone': 'isMilestone',
'is_critical': 'isCritical',
'is_subproject': 'isSubproject',
'is_subproject_read_only': 'isSubprojectReadOnly',
'subproject_name': 'subprojectName',
'is_summary': 'isSummary',
'subtasks_uids': 'subtasksUids',
'outline_level': 'outlineLevel',
'is_over_allocated': 'isOverAllocated',
'is_estimated': 'isEstimated',
'overtime_cost': 'overtimeCost',
'overtime_work': 'overtimeWork',
'physical_percent_complete': 'physicalPercentComplete',
'pre_leveled_finish': 'preLeveledFinish',
'pre_leveled_start': 'preLeveledStart',
'is_recurring': 'isRecurring',
'regular_work': 'regularWork',
'remaining_cost': 'remainingCost',
'remaining_duration': 'remainingDuration',
'remaining_overtime_cost': 'remainingOvertimeCost',
'remaining_overtime_work': 'remainingOvertimeWork',
'remaining_work': 'remainingWork',
'resume': 'resume',
'is_resume_valid': 'isResumeValid',
'stop': 'stop',
'is_rollup': 'isRollup',
'start_slack': 'startSlack',
'start_variance': 'startVariance',
'calendar_uid': 'calendarUid',
'is_manual': 'isManual',
'manual_start': 'manualStart',
'manual_finish': 'manualFinish',
'manual_duration': 'manualDuration',
'total_slack': 'totalSlack',
'type': 'type',
'wbs': 'wbs',
'priority': 'priority',
'work': 'work',
'work_variance': 'workVariance',
'notes_text': 'notesText',
'notes_rtf': 'notesRTF',
'acwp': 'acwp',
'bcws': 'bcws',
'bcwp': 'bcwp',
'leveling_delay_format': 'levelingDelayFormat',
'predecessors': 'predecessors',
'successors': 'successors',
'ignore_warnings': 'ignoreWarnings',
'is_expanded': 'isExpanded',
'display_on_timeline': 'displayOnTimeline',
'display_as_summary': 'displayAsSummary',
'hyperlink': 'hyperlink',
'hyperlink_address': 'hyperlinkAddress',
'hyperlink_sub_address': 'hyperlinkSubAddress',
'earned_value_method': 'earnedValueMethod',
'is_published': 'isPublished',
'status_manager': 'statusManager',
'commitment_start': 'commitmentStart',
'commitment_finish': 'commitmentFinish',
'commitment_type': 'commitmentType',
'baselines': 'baselines',
'extended_attributes': 'extendedAttributes',
'outline_codes': 'outlineCodes',
'warning': 'warning',
'activity_id': 'activityId'
}
def __init__(self, uid=None, id=None, name=None, duration_text=None, duration=None, start=None, finish=None, start_text=None, finish_text=None, percent_complete=None, percent_work_complete=None, is_active=True, actual_cost=None, actual_duration=None, actual_finish=None, actual_overtime_cost=None, actual_overtime_work=None, actual_work_protected=None, actual_overtime_work_protected=None, actual_start=None, budget_work=None, budget_cost=None, constraint_date=None, constraint_type=None, contact=None, cost=None, cv=None, deadline=None, duration_variance=None, early_finish=None, early_start=None, is_effort_driven=None, is_external_task=None, external_task_project=None, external_id=None, finish_slack=None, finish_variance=None, fixed_cost=None, fixed_cost_accrual=None, free_slack=None, guid=None, has_overallocated_resource=None, hide_bar=None, ignore_resource_calendar=None, late_finish=None, late_start=None, is_level_assignments=True, can_leveling_split=True, leveling_delay=None, is_marked=None, is_milestone=None, is_critical=None, is_subproject=None, is_subproject_read_only=None, subproject_name=None, is_summary=None, subtasks_uids=None, outline_level=None, is_over_allocated=None, is_estimated=None, overtime_cost=None, overtime_work=None, physical_percent_complete=None, pre_leveled_finish=None, pre_leveled_start=None, is_recurring=None, regular_work=None, remaining_cost=None, remaining_duration=None, remaining_overtime_cost=None, remaining_overtime_work=None, remaining_work=None, resume=None, is_resume_valid=None, stop=None, is_rollup=None, start_slack=None, start_variance=None, calendar_uid=-1, is_manual=None, manual_start=None, manual_finish=None, manual_duration=None, total_slack=None, type=None, wbs=None, priority=None, work=None, work_variance=None, notes_text=None, notes_rtf=None, acwp=None, bcws=None, bcwp=None, leveling_delay_format=None, predecessors=None, successors=None, ignore_warnings=False, is_expanded=None, display_on_timeline=None, display_as_summary=None, hyperlink=None, hyperlink_address=None, hyperlink_sub_address=None, earned_value_method=None, is_published=True, status_manager=None, commitment_start=None, commitment_finish=None, commitment_type=None, baselines=None, extended_attributes=None, outline_codes=None, warning=False, activity_id=None): # noqa: E501
"""Task - a model defined in Swagger""" # noqa: E501
self._uid = None
self._id = None
self._name = None
self._duration_text = None
self._duration = None
self._start = None
self._finish = None
self._start_text = None
self._finish_text = None
self._percent_complete = None
self._percent_work_complete = None
self._is_active = None
self._actual_cost = None
self._actual_duration = None
self._actual_finish = None
self._actual_overtime_cost = None
self._actual_overtime_work = None
self._actual_work_protected = None
self._actual_overtime_work_protected = None
self._actual_start = None
self._budget_work = None
self._budget_cost = None
self._constraint_date = None
self._constraint_type = None
self._contact = None
self._cost = None
self._cv = None
self._deadline = None
self._duration_variance = None
self._early_finish = None
self._early_start = None
self._is_effort_driven = None
self._is_external_task = None
self._external_task_project = None
self._external_id = None
self._finish_slack = None
self._finish_variance = None
self._fixed_cost = None
self._fixed_cost_accrual = None
self._free_slack = None
self._guid = None
self._has_overallocated_resource = None
self._hide_bar = None
self._ignore_resource_calendar = None
self._late_finish = None
self._late_start = None
self._is_level_assignments = None
self._can_leveling_split = None
self._leveling_delay = None
self._is_marked = None
self._is_milestone = None
self._is_critical = None
self._is_subproject = None
self._is_subproject_read_only = None
self._subproject_name = None
self._is_summary = None
self._subtasks_uids = None
self._outline_level = None
self._is_over_allocated = None
self._is_estimated = None
self._overtime_cost = None
self._overtime_work = None
self._physical_percent_complete = None
self._pre_leveled_finish = None
self._pre_leveled_start = None
self._is_recurring = None
self._regular_work = None
self._remaining_cost = None
self._remaining_duration = None
self._remaining_overtime_cost = None
self._remaining_overtime_work = None
self._remaining_work = None
self._resume = None
self._is_resume_valid = None
self._stop = None
self._is_rollup = None
self._start_slack = None
self._start_variance = None
self._calendar_uid = None
self._is_manual = None
self._manual_start = None
self._manual_finish = None
self._manual_duration = None
self._total_slack = None
self._type = None
self._wbs = None
self._priority = None
self._work = None
self._work_variance = None
self._notes_text = None
self._notes_rtf = None
self._acwp = None
self._bcws = None
self._bcwp = None
self._leveling_delay_format = None
self._predecessors = None
self._successors = None
self._ignore_warnings = None
self._is_expanded = None
self._display_on_timeline = None
self._display_as_summary = None
self._hyperlink = None
self._hyperlink_address = None
self._hyperlink_sub_address = None
self._earned_value_method = None
self._is_published = None
self._status_manager = None
self._commitment_start = None
self._commitment_finish = None
self._commitment_type = None
self._baselines = None
self._extended_attributes = None
self._outline_codes = None
self._warning = None
self._activity_id = None
self.discriminator = None
if uid is not None:
self.uid = uid
if id is not None:
self.id = id
if name is not None:
self.name = name
if duration_text is not None:
self.duration_text = duration_text
if duration is not None:
self.duration = duration
if start is not None:
self.start = start
if finish is not None:
self.finish = finish
if start_text is not None:
self.start_text = start_text
if finish_text is not None:
self.finish_text = finish_text
if percent_complete is not None:
self.percent_complete = percent_complete
if percent_work_complete is not None:
self.percent_work_complete = percent_work_complete
if is_active is not None:
self.is_active = is_active
if actual_cost is not None:
self.actual_cost = actual_cost
if actual_duration is not None:
self.actual_duration = actual_duration
if actual_finish is not None:
self.actual_finish = actual_finish
if actual_overtime_cost is not None:
self.actual_overtime_cost = actual_overtime_cost
if actual_overtime_work is not None:
self.actual_overtime_work = actual_overtime_work
if actual_work_protected is not None:
self.actual_work_protected = actual_work_protected
if actual_overtime_work_protected is not None:
self.actual_overtime_work_protected = actual_overtime_work_protected
if actual_start is not None:
self.actual_start = actual_start
if budget_work is not None:
self.budget_work = budget_work
if budget_cost is not None:
self.budget_cost = budget_cost
if constraint_date is not None:
self.constraint_date = constraint_date
if constraint_type is not None:
self.constraint_type = constraint_type
if contact is not None:
self.contact = contact
if cost is not None:
self.cost = cost
if cv is not None:
self.cv = cv
if deadline is not None:
self.deadline = deadline
if duration_variance is not None:
self.duration_variance = duration_variance
if early_finish is not None:
self.early_finish = early_finish
if early_start is not None:
self.early_start = early_start
if is_effort_driven is not None:
self.is_effort_driven = is_effort_driven
if is_external_task is not None:
self.is_external_task = is_external_task
if external_task_project is not None:
self.external_task_project = external_task_project
if external_id is not None:
self.external_id = external_id
if finish_slack is not None:
self.finish_slack = finish_slack
if finish_variance is not None:
self.finish_variance = finish_variance
if fixed_cost is not None:
self.fixed_cost = fixed_cost
if fixed_cost_accrual is not None:
self.fixed_cost_accrual = fixed_cost_accrual
if free_slack is not None:
self.free_slack = free_slack
if guid is not None:
self.guid = guid
if has_overallocated_resource is not None:
self.has_overallocated_resource = has_overallocated_resource
if hide_bar is not None:
self.hide_bar = hide_bar
if ignore_resource_calendar is not None:
self.ignore_resource_calendar = ignore_resource_calendar
if late_finish is not None:
self.late_finish = late_finish
if late_start is not None:
self.late_start = late_start
if is_level_assignments is not None:
self.is_level_assignments = is_level_assignments
if can_leveling_split is not None:
self.can_leveling_split = can_leveling_split
if leveling_delay is not None:
self.leveling_delay = leveling_delay
if is_marked is not None:
self.is_marked = is_marked
if is_milestone is not None:
self.is_milestone = is_milestone
if is_critical is not None:
self.is_critical = is_critical
if is_subproject is not None:
self.is_subproject = is_subproject
if is_subproject_read_only is not None:
self.is_subproject_read_only = is_subproject_read_only
if subproject_name is not None:
self.subproject_name = subproject_name
if is_summary is not None:
self.is_summary = is_summary
if subtasks_uids is not None:
self.subtasks_uids = subtasks_uids
if outline_level is not None:
self.outline_level = outline_level
if is_over_allocated is not None:
self.is_over_allocated = is_over_allocated
if is_estimated is not None:
self.is_estimated = is_estimated
if overtime_cost is not None:
self.overtime_cost = overtime_cost
if overtime_work is not None:
self.overtime_work = overtime_work
if physical_percent_complete is not None:
self.physical_percent_complete = physical_percent_complete
if pre_leveled_finish is not None:
self.pre_leveled_finish = pre_leveled_finish
if pre_leveled_start is not None:
self.pre_leveled_start = pre_leveled_start
if is_recurring is not None:
self.is_recurring = is_recurring
if regular_work is not None:
self.regular_work = regular_work
if remaining_cost is not None:
self.remaining_cost = remaining_cost
if remaining_duration is not None:
self.remaining_duration = remaining_duration
if remaining_overtime_cost is not None:
self.remaining_overtime_cost = remaining_overtime_cost
if remaining_overtime_work is not None:
self.remaining_overtime_work = remaining_overtime_work
if remaining_work is not None:
self.remaining_work = remaining_work
if resume is not None:
self.resume = resume
if is_resume_valid is not None:
self.is_resume_valid = is_resume_valid
if stop is not None:
self.stop = stop
if is_rollup is not None:
self.is_rollup = is_rollup
if start_slack is not None:
self.start_slack = start_slack
if start_variance is not None:
self.start_variance = start_variance
if calendar_uid is not None:
self.calendar_uid = calendar_uid
if is_manual is not None:
self.is_manual = is_manual
if manual_start is not None:
self.manual_start = manual_start
if manual_finish is not None:
self.manual_finish = manual_finish
if manual_duration is not None:
self.manual_duration = manual_duration
if total_slack is not None:
self.total_slack = total_slack
if type is not None:
self.type = type
if wbs is not None:
self.wbs = wbs
if priority is not None:
self.priority = priority
if work is not None:
self.work = work
if work_variance is not None:
self.work_variance = work_variance
if notes_text is not None:
self.notes_text = notes_text
if notes_rtf is not None:
self.notes_rtf = notes_rtf
if acwp is not None:
self.acwp = acwp
if bcws is not None:
self.bcws = bcws
if bcwp is not None:
self.bcwp = bcwp
if leveling_delay_format is not None:
self.leveling_delay_format = leveling_delay_format
if predecessors is not None:
self.predecessors = predecessors
if successors is not None:
self.successors = successors
if ignore_warnings is not None:
self.ignore_warnings = ignore_warnings
if is_expanded is not None:
self.is_expanded = is_expanded
if display_on_timeline is not None:
self.display_on_timeline = display_on_timeline
if display_as_summary is not None:
self.display_as_summary = display_as_summary
if hyperlink is not None:
self.hyperlink = hyperlink
if hyperlink_address is not None:
self.hyperlink_address = hyperlink_address
if hyperlink_sub_address is not None:
self.hyperlink_sub_address = hyperlink_sub_address
if earned_value_method is not None:
self.earned_value_method = earned_value_method
if is_published is not None:
self.is_published = is_published
if status_manager is not None:
self.status_manager = status_manager
if commitment_start is not None:
self.commitment_start = commitment_start
if commitment_finish is not None:
self.commitment_finish = commitment_finish
if commitment_type is not None:
self.commitment_type = commitment_type
if baselines is not None:
self.baselines = baselines
if extended_attributes is not None:
self.extended_attributes = extended_attributes
if outline_codes is not None:
self.outline_codes = outline_codes
if warning is not None:
self.warning = warning
if activity_id is not None:
self.activity_id = activity_id
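# Illustrative construction (a sketch under assumptions, not generated code):
# only keyword arguments that are not None are routed through the property
# setters above, so required-field validation runs at construction time.
#
#     from datetime import datetime
#     task = Task(uid=1, id=1, name="Write spec",
#                 start=datetime(2024, 1, 1),
#                 duration="PT8H0M0S",  # duration fields are plain strings in
#                                       # this model; the ISO 8601 duration
#                                       # format shown here is an assumption
#                 is_milestone=False)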
@property
def uid(self):
"""Gets the uid of this Task. # noqa: E501
The unique id of a task. # noqa: E501
:return: The uid of this Task. # noqa: E501
:rtype: int
"""
return self._uid
@uid.setter
def uid(self, uid):
"""Sets the uid of this Task.
The unique id of a task. # noqa: E501
:param uid: The uid of this Task. # noqa: E501
:type: int
"""
if uid is None:
raise ValueError("Invalid value for `uid`, must not be `None`") # noqa: E501
self._uid = uid
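# This setter pattern repeats for every required (non-nullable) field below:
# assigning None raises ValueError, whereas optional fields such as 'name'
# and 'contact' accept any value, including None, unchecked.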
@property
def id(self):
"""Gets the id of this Task. # noqa: E501
The position of a task in the collection. # noqa: E501
:return: The id of this Task. # noqa: E501
:rtype: int
"""
return self._id
@id.setter
def id(self, id):
"""Sets the id of this Task.
The position of a task in the collection. # noqa: E501
:param id: The id of this Task. # noqa: E501
:type: int
"""
if id is None:
raise ValueError("Invalid value for `id`, must not be `None`") # noqa: E501
self._id = id
@property
def name(self):
"""Gets the name of this Task. # noqa: E501
The name of a task. # noqa: E501
:return: The name of this Task. # noqa: E501
:rtype: str
"""
return self._name
@name.setter
def name(self, name):
"""Sets the name of this Task.
The name of a task. # noqa: E501
:param name: The name of this Task. # noqa: E501
:type: str
"""
self._name = name
@property
def duration_text(self):
"""Gets the duration_text of this Task. # noqa: E501
The duration of a task entered by the user as text. # noqa: E501
:return: The duration_text of this Task. # noqa: E501
:rtype: str
"""
return self._duration_text
@duration_text.setter
def duration_text(self, duration_text):
"""Sets the duration_text of this Task.
The duration of a task entered by the user as text. # noqa: E501
:param duration_text: The duration_text of this Task. # noqa: E501
:type: str
"""
self._duration_text = duration_text
@property
def duration(self):
"""Gets the duration of this Task. # noqa: E501
The duration of a task. # noqa: E501
:return: The duration of this Task. # noqa: E501
:rtype: str
"""
return self._duration
@duration.setter
def duration(self, duration):
"""Sets the duration of this Task.
The duration of a task. # noqa: E501
:param duration: The duration of this Task. # noqa: E501
:type: str
"""
if duration is None:
raise ValueError("Invalid value for `duration`, must not be `None`") # noqa: E501
self._duration = duration
@property
def start(self):
"""Gets the start of this Task. # noqa: E501
The start date of a task. # noqa: E501
:return: The start of this Task. # noqa: E501
:rtype: datetime
"""
return self._start
@start.setter
def start(self, start):
"""Sets the start of this Task.
The start date of a task. # noqa: E501
:param start: The start of this Task. # noqa: E501
:type: datetime
"""
if start is None:
raise ValueError("Invalid value for `start`, must not be `None`") # noqa: E501
self._start = start
@property
def finish(self):
"""Gets the finish of this Task. # noqa: E501
The finish date of a task. # noqa: E501
:return: The finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._finish
@finish.setter
def finish(self, finish):
"""Sets the finish of this Task.
The finish date of a task. # noqa: E501
:param finish: The finish of this Task. # noqa: E501
:type: datetime
"""
if finish is None:
raise ValueError("Invalid value for `finish`, must not be `None`") # noqa: E501
self._finish = finish
@property
def start_text(self):
"""Gets the start_text of this Task. # noqa: E501
Returns the task's start text. # noqa: E501
:return: The start_text of this Task. # noqa: E501
:rtype: str
"""
return self._start_text
@start_text.setter
def start_text(self, start_text):
"""Sets the start_text of this Task.
Returns the task's start text. # noqa: E501
:param start_text: The start_text of this Task. # noqa: E501
:type: str
"""
self._start_text = start_text
@property
def finish_text(self):
"""Gets the finish_text of this Task. # noqa: E501
Returns the task's finish text. # noqa: E501
:return: The finish_text of this Task. # noqa: E501
:rtype: str
"""
return self._finish_text
@finish_text.setter
def finish_text(self, finish_text):
"""Sets the finish_text of this Task.
Returns the task's finish text. # noqa: E501
:param finish_text: The finish_text of this Task. # noqa: E501
:type: str
"""
self._finish_text = finish_text
@property
def percent_complete(self):
"""Gets the percent_complete of this Task. # noqa: E501
The percent complete of a task. # noqa: E501
:return: The percent_complete of this Task. # noqa: E501
:rtype: int
"""
return self._percent_complete
@percent_complete.setter
def percent_complete(self, percent_complete):
"""Sets the percent_complete of this Task.
The percent complete of a task. # noqa: E501
:param percent_complete: The percent_complete of this Task. # noqa: E501
:type: int
"""
if percent_complete is None:
raise ValueError("Invalid value for `percent_complete`, must not be `None`") # noqa: E501
self._percent_complete = percent_complete
@property
def percent_work_complete(self):
"""Gets the percent_work_complete of this Task. # noqa: E501
The percent work complete of a task. # noqa: E501
:return: The percent_work_complete of this Task. # noqa: E501
:rtype: int
"""
return self._percent_work_complete
@percent_work_complete.setter
def percent_work_complete(self, percent_work_complete):
"""Sets the percent_work_complete of this Task.
The percent work complete of a task. # noqa: E501
:param percent_work_complete: The percent_work_complete of this Task. # noqa: E501
:type: int
"""
if percent_work_complete is None:
raise ValueError("Invalid value for `percent_work_complete`, must not be `None`") # noqa: E501
self._percent_work_complete = percent_work_complete
@property
def is_active(self):
"""Gets the is_active of this Task. # noqa: E501
Determines if a task is active. # noqa: E501
:return: The is_active of this Task. # noqa: E501
:rtype: bool
"""
return self._is_active
@is_active.setter
def is_active(self, is_active):
"""Sets the is_active of this Task.
Determines if a task is active. # noqa: E501
:param is_active: The is_active of this Task. # noqa: E501
:type: bool
"""
if is_active is None:
raise ValueError("Invalid value for `is_active`, must not be `None`") # noqa: E501
self._is_active = is_active
@property
def actual_cost(self):
"""Gets the actual_cost of this Task. # noqa: E501
The actual cost of a task. # noqa: E501
:return: The actual_cost of this Task. # noqa: E501
:rtype: float
"""
return self._actual_cost
@actual_cost.setter
def actual_cost(self, actual_cost):
"""Sets the actual_cost of this Task.
The actual cost of a task. # noqa: E501
:param actual_cost: The actual_cost of this Task. # noqa: E501
:type: float
"""
if actual_cost is None:
raise ValueError("Invalid value for `actual_cost`, must not be `None`") # noqa: E501
self._actual_cost = actual_cost
@property
def actual_duration(self):
"""Gets the actual_duration of this Task. # noqa: E501
The actual duration of a task. # noqa: E501
:return: The actual_duration of this Task. # noqa: E501
:rtype: str
"""
return self._actual_duration
@actual_duration.setter
def actual_duration(self, actual_duration):
"""Sets the actual_duration of this Task.
The actual duration of a task. # noqa: E501
:param actual_duration: The actual_duration of this Task. # noqa: E501
:type: str
"""
if actual_duration is None:
raise ValueError("Invalid value for `actual_duration`, must not be `None`") # noqa: E501
self._actual_duration = actual_duration
@property
def actual_finish(self):
"""Gets the actual_finish of this Task. # noqa: E501
The actual finish date of a task. # noqa: E501
:return: The actual_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._actual_finish
@actual_finish.setter
def actual_finish(self, actual_finish):
"""Sets the actual_finish of this Task.
The actual finish date of a task. # noqa: E501
:param actual_finish: The actual_finish of this Task. # noqa: E501
:type: datetime
"""
if actual_finish is None:
raise ValueError("Invalid value for `actual_finish`, must not be `None`") # noqa: E501
self._actual_finish = actual_finish
@property
def actual_overtime_cost(self):
"""Gets the actual_overtime_cost of this Task. # noqa: E501
The actual overtime cost of a task. # noqa: E501
:return: The actual_overtime_cost of this Task. # noqa: E501
:rtype: float
"""
return self._actual_overtime_cost
@actual_overtime_cost.setter
def actual_overtime_cost(self, actual_overtime_cost):
"""Sets the actual_overtime_cost of this Task.
The actual overtime cost of a task. # noqa: E501
:param actual_overtime_cost: The actual_overtime_cost of this Task. # noqa: E501
:type: float
"""
if actual_overtime_cost is None:
raise ValueError("Invalid value for `actual_overtime_cost`, must not be `None`") # noqa: E501
self._actual_overtime_cost = actual_overtime_cost
@property
def actual_overtime_work(self):
"""Gets the actual_overtime_work of this Task. # noqa: E501
The actual overtime work of a task. # noqa: E501
:return: The actual_overtime_work of this Task. # noqa: E501
:rtype: str
"""
return self._actual_overtime_work
@actual_overtime_work.setter
def actual_overtime_work(self, actual_overtime_work):
"""Sets the actual_overtime_work of this Task.
The actual overtime work of a task. # noqa: E501
:param actual_overtime_work: The actual_overtime_work of this Task. # noqa: E501
:type: str
"""
if actual_overtime_work is None:
raise ValueError("Invalid value for `actual_overtime_work`, must not be `None`") # noqa: E501
self._actual_overtime_work = actual_overtime_work
@property
def actual_work_protected(self):
"""Gets the actual_work_protected of this Task. # noqa: E501
The duration through which actual work is protected. Reading supported for XML format only. # noqa: E501
:return: The actual_work_protected of this Task. # noqa: E501
:rtype: str
"""
return self._actual_work_protected
@actual_work_protected.setter
def actual_work_protected(self, actual_work_protected):
"""Sets the actual_work_protected of this Task.
The duration through which actual work is protected. Reading supported for XML format only. # noqa: E501
:param actual_work_protected: The actual_work_protected of this Task. # noqa: E501
:type: str
"""
if actual_work_protected is None:
raise ValueError("Invalid value for `actual_work_protected`, must not be `None`") # noqa: E501
self._actual_work_protected = actual_work_protected
@property
def actual_overtime_work_protected(self):
"""Gets the actual_overtime_work_protected of this Task. # noqa: E501
The duration through which actual overtime work is protected. Reading supported for XML format only. # noqa: E501
:return: The actual_overtime_work_protected of this Task. # noqa: E501
:rtype: str
"""
return self._actual_overtime_work_protected
@actual_overtime_work_protected.setter
def actual_overtime_work_protected(self, actual_overtime_work_protected):
"""Sets the actual_overtime_work_protected of this Task.
The duration through which actual overtime work is protected. Reading supported for XML format only. # noqa: E501
:param actual_overtime_work_protected: The actual_overtime_work_protected of this Task. # noqa: E501
:type: str
"""
if actual_overtime_work_protected is None:
raise ValueError("Invalid value for `actual_overtime_work_protected`, must not be `None`") # noqa: E501
self._actual_overtime_work_protected = actual_overtime_work_protected
@property
def actual_start(self):
"""Gets the actual_start of this Task. # noqa: E501
The actual start date of a task. # noqa: E501
:return: The actual_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._actual_start
@actual_start.setter
def actual_start(self, actual_start):
"""Sets the actual_start of this Task.
The actual start date of a task. # noqa: E501
:param actual_start: The actual_start of this Task. # noqa: E501
:type: datetime
"""
if actual_start is None:
raise ValueError("Invalid value for `actual_start`, must not be `None`") # noqa: E501
self._actual_start = actual_start
@property
def budget_work(self):
"""Gets the budget_work of this Task. # noqa: E501
The amount of budgeted work for a project root task. # noqa: E501
:return: The budget_work of this Task. # noqa: E501
:rtype: str
"""
return self._budget_work
@budget_work.setter
def budget_work(self, budget_work):
"""Sets the budget_work of this Task.
The amount of budgeted work for a project root task. # noqa: E501
:param budget_work: The budget_work of this Task. # noqa: E501
:type: str
"""
if budget_work is None:
raise ValueError("Invalid value for `budget_work`, must not be `None`") # noqa: E501
self._budget_work = budget_work
@property
def budget_cost(self):
"""Gets the budget_cost of this Task. # noqa: E501
The amount of budgeted cost for a project root task. # noqa: E501
:return: The budget_cost of this Task. # noqa: E501
:rtype: float
"""
return self._budget_cost
@budget_cost.setter
def budget_cost(self, budget_cost):
"""Sets the budget_cost of this Task.
The amount of budgeted cost for a project root task. # noqa: E501
:param budget_cost: The budget_cost of this Task. # noqa: E501
:type: float
"""
if budget_cost is None:
raise ValueError("Invalid value for `budget_cost`, must not be `None`") # noqa: E501
self._budget_cost = budget_cost
@property
def constraint_date(self):
"""Gets the constraint_date of this Task. # noqa: E501
Shows the specific date associated with certain constraint types, such as Must Start On, Must Finish On, Start No Earlier Than, Start No Later Than, Finish No Earlier Than, and Finish No Later Than. # noqa: E501
:return: The constraint_date of this Task. # noqa: E501
:rtype: datetime
"""
return self._constraint_date
@constraint_date.setter
def constraint_date(self, constraint_date):
"""Sets the constraint_date of this Task.
Shows the specific date associated with certain constraint types, such as Must Start On, Must Finish On, Start No Earlier Than, Start No Later Than, Finish No Earlier Than, and Finish No Later Than. # noqa: E501
:param constraint_date: The constraint_date of this Task. # noqa: E501
:type: datetime
"""
if constraint_date is None:
raise ValueError("Invalid value for `constraint_date`, must not be `None`") # noqa: E501
self._constraint_date = constraint_date
@property
def constraint_type(self):
"""Gets the constraint_type of this Task. # noqa: E501
Provides choices for the type of constraint that can be applied for scheduling a task. # noqa: E501
:return: The constraint_type of this Task. # noqa: E501
:rtype: ConstraintType
"""
return self._constraint_type
@constraint_type.setter
def constraint_type(self, constraint_type):
"""Sets the constraint_type of this Task.
Provides choices for the type of constraint that can be applied for scheduling a task. # noqa: E501
:param constraint_type: The constraint_type of this Task. # noqa: E501
:type: ConstraintType
"""
if constraint_type is None:
raise ValueError("Invalid value for `constraint_type`, must not be `None`") # noqa: E501
self._constraint_type = constraint_type
@property
def contact(self):
"""Gets the contact of this Task. # noqa: E501
The contact person for a task. # noqa: E501
:return: The contact of this Task. # noqa: E501
:rtype: str
"""
return self._contact
@contact.setter
def contact(self, contact):
"""Sets the contact of this Task.
The contact person for a task. # noqa: E501
:param contact: The contact of this Task. # noqa: E501
:type: str
"""
self._contact = contact
@property
def cost(self):
"""Gets the cost of this Task. # noqa: E501
The projected or scheduled cost of a task. # noqa: E501
:return: The cost of this Task. # noqa: E501
:rtype: float
"""
return self._cost
@cost.setter
def cost(self, cost):
"""Sets the cost of this Task.
The projected or scheduled cost of a task. # noqa: E501
:param cost: The cost of this Task. # noqa: E501
:type: float
"""
if cost is None:
raise ValueError("Invalid value for `cost`, must not be `None`") # noqa: E501
self._cost = cost
@property
def cv(self):
"""Gets the cv of this Task. # noqa: E501
The difference between the baseline cost and total cost for a task. # noqa: E501
:return: The cv of this Task. # noqa: E501
:rtype: float
"""
return self._cv
@cv.setter
def cv(self, cv):
"""Sets the cv of this Task.
The difference between the baseline cost and total cost for a task. # noqa: E501
:param cv: The cv of this Task. # noqa: E501
:type: float
"""
if cv is None:
raise ValueError("Invalid value for `cv`, must not be `None`") # noqa: E501
self._cv = cv
@property
def deadline(self):
"""Gets the deadline of this Task. # noqa: E501
The deadline for a task to be completed. # noqa: E501
:return: The deadline of this Task. # noqa: E501
:rtype: datetime
"""
return self._deadline
@deadline.setter
def deadline(self, deadline):
"""Sets the deadline of this Task.
The deadline for a task to be completed. # noqa: E501
:param deadline: The deadline of this Task. # noqa: E501
:type: datetime
"""
if deadline is None:
raise ValueError("Invalid value for `deadline`, must not be `None`") # noqa: E501
self._deadline = deadline
@property
def duration_variance(self):
"""Gets the duration_variance of this Task. # noqa: E501
Contains the difference between the total duration of a task and the baseline duration of a task. # noqa: E501
:return: The duration_variance of this Task. # noqa: E501
:rtype: str
"""
return self._duration_variance
@duration_variance.setter
def duration_variance(self, duration_variance):
"""Sets the duration_variance of this Task.
Contains the difference between the total duration of a task and the baseline duration of a task. # noqa: E501
:param duration_variance: The duration_variance of this Task. # noqa: E501
:type: str
"""
if duration_variance is None:
raise ValueError("Invalid value for `duration_variance`, must not be `None`") # noqa: E501
self._duration_variance = duration_variance
@property
def early_finish(self):
"""Gets the early_finish of this Task. # noqa: E501
The early finish date of a task. # noqa: E501
:return: The early_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._early_finish
@early_finish.setter
def early_finish(self, early_finish):
"""Sets the early_finish of this Task.
The early finish date of a task. # noqa: E501
:param early_finish: The early_finish of this Task. # noqa: E501
:type: datetime
"""
if early_finish is None:
raise ValueError("Invalid value for `early_finish`, must not be `None`") # noqa: E501
self._early_finish = early_finish
@property
def early_start(self):
"""Gets the early_start of this Task. # noqa: E501
The early start date of a task. # noqa: E501
:return: The early_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._early_start
@early_start.setter
def early_start(self, early_start):
"""Sets the early_start of this Task.
The early start date of a task. # noqa: E501
:param early_start: The early_start of this Task. # noqa: E501
:type: datetime
"""
if early_start is None:
raise ValueError("Invalid value for `early_start`, must not be `None`") # noqa: E501
self._early_start = early_start
@property
def is_effort_driven(self):
"""Gets the is_effort_driven of this Task. # noqa: E501
Determines whether a task is effort-driven. # noqa: E501
:return: The is_effort_driven of this Task. # noqa: E501
:rtype: bool
"""
return self._is_effort_driven
@is_effort_driven.setter
def is_effort_driven(self, is_effort_driven):
"""Sets the is_effort_driven of this Task.
Determines whether a task is effort-driven. # noqa: E501
:param is_effort_driven: The is_effort_driven of this Task. # noqa: E501
:type: bool
"""
if is_effort_driven is None:
raise ValueError("Invalid value for `is_effort_driven`, must not be `None`") # noqa: E501
self._is_effort_driven = is_effort_driven
@property
def is_external_task(self):
"""Gets the is_external_task of this Task. # noqa: E501
Determines whether a task is external. # noqa: E501
:return: The is_external_task of this Task. # noqa: E501
:rtype: bool
"""
return self._is_external_task
@is_external_task.setter
def is_external_task(self, is_external_task):
"""Sets the is_external_task of this Task.
Determines whether a task is external. # noqa: E501
:param is_external_task: The is_external_task of this Task. # noqa: E501
:type: bool
"""
if is_external_task is None:
raise ValueError("Invalid value for `is_external_task`, must not be `None`") # noqa: E501
self._is_external_task = is_external_task
@property
def external_task_project(self):
"""Gets the external_task_project of this Task. # noqa: E501
The source location and task identifier of an external task. # noqa: E501
:return: The external_task_project of this Task. # noqa: E501
:rtype: str
"""
return self._external_task_project
@external_task_project.setter
def external_task_project(self, external_task_project):
"""Sets the external_task_project of this Task.
The source location and task identifier of an external task. # noqa: E501
:param external_task_project: The external_task_project of this Task. # noqa: E501
:type: str
"""
self._external_task_project = external_task_project
@property
def external_id(self):
"""Gets the external_id of this Task. # noqa: E501
If a task is an external task, the property contains the task's external Id. # noqa: E501
:return: The external_id of this Task. # noqa: E501
:rtype: int
"""
return self._external_id
@external_id.setter
def external_id(self, external_id):
"""Sets the external_id of this Task.
If a task is an external task, the property contains the task's external Id. # noqa: E501
:param external_id: The external_id of this Task. # noqa: E501
:type: int
"""
if external_id is None:
raise ValueError("Invalid value for `external_id`, must not be `None`") # noqa: E501
self._external_id = external_id
@property
def finish_slack(self):
"""Gets the finish_slack of this Task. # noqa: E501
Contains the duration between the Early Finish and Late Finish dates. # noqa: E501
:return: The finish_slack of this Task. # noqa: E501
:rtype: int
"""
return self._finish_slack
@finish_slack.setter
def finish_slack(self, finish_slack):
"""Sets the finish_slack of this Task.
Contains the duration between the Early Finish and Late Finish dates. # noqa: E501
:param finish_slack: The finish_slack of this Task. # noqa: E501
:type: int
"""
if finish_slack is None:
raise ValueError("Invalid value for `finish_slack`, must not be `None`") # noqa: E501
self._finish_slack = finish_slack
@property
def finish_variance(self):
"""Gets the finish_variance of this Task. # noqa: E501
The variance of the task finish date from the baseline finish date, in minutes. # noqa: E501
:return: The finish_variance of this Task. # noqa: E501
:rtype: int
"""
return self._finish_variance
@finish_variance.setter
def finish_variance(self, finish_variance):
"""Sets the finish_variance of this Task.
The variance of the task finish date from the baseline finish date, in minutes. # noqa: E501
:param finish_variance: The finish_variance of this Task. # noqa: E501
:type: int
"""
if finish_variance is None:
raise ValueError("Invalid value for `finish_variance`, must not be `None`") # noqa: E501
self._finish_variance = finish_variance
@property
def fixed_cost(self):
"""Gets the fixed_cost of this Task. # noqa: E501
The fixed cost of a task. # noqa: E501
:return: The fixed_cost of this Task. # noqa: E501
:rtype: float
"""
return self._fixed_cost
@fixed_cost.setter
def fixed_cost(self, fixed_cost):
"""Sets the fixed_cost of this Task.
The fixed cost of a task. # noqa: E501
:param fixed_cost: The fixed_cost of this Task. # noqa: E501
:type: float
"""
if fixed_cost is None:
raise ValueError("Invalid value for `fixed_cost`, must not be `None`") # noqa: E501
self._fixed_cost = fixed_cost
@property
def fixed_cost_accrual(self):
"""Gets the fixed_cost_accrual of this Task. # noqa: E501
Determines how the fixed cost is accrued against a task. # noqa: E501
:return: The fixed_cost_accrual of this Task. # noqa: E501
:rtype: CostAccrualType
"""
return self._fixed_cost_accrual
@fixed_cost_accrual.setter
def fixed_cost_accrual(self, fixed_cost_accrual):
"""Sets the fixed_cost_accrual of this Task.
Determines how the fixed cost is accrued against a task. # noqa: E501
:param fixed_cost_accrual: The fixed_cost_accrual of this Task. # noqa: E501
:type: CostAccrualType
"""
if fixed_cost_accrual is None:
raise ValueError("Invalid value for `fixed_cost_accrual`, must not be `None`") # noqa: E501
self._fixed_cost_accrual = fixed_cost_accrual
@property
def free_slack(self):
"""Gets the free_slack of this Task. # noqa: E501
The amount of free slack. # noqa: E501
:return: The free_slack of this Task. # noqa: E501
:rtype: int
"""
return self._free_slack
@free_slack.setter
def free_slack(self, free_slack):
"""Sets the free_slack of this Task.
The amount of free slack. # noqa: E501
:param free_slack: The free_slack of this Task. # noqa: E501
:type: int
"""
if free_slack is None:
raise ValueError("Invalid value for `free_slack`, must not be `None`") # noqa: E501
self._free_slack = free_slack
@property
def guid(self):
"""Gets the guid of this Task. # noqa: E501
:return: The guid of this Task. # noqa: E501
:rtype: str
"""
return self._guid
@guid.setter
def guid(self, guid):
"""Sets the guid of this Task.
:param guid: The guid of this Task. # noqa: E501
:type: str
"""
self._guid = guid
@property
def has_overallocated_resource(self):
"""Gets the has_overallocated_resource of this Task. # noqa: E501
Indicates whether the task has a resource assigned which has more work on assigned tasks than can be completed within normal working capacity. # noqa: E501
:return: The has_overallocated_resource of this Task. # noqa: E501
:rtype: bool
"""
return self._has_overallocated_resource
@has_overallocated_resource.setter
def has_overallocated_resource(self, has_overallocated_resource):
"""Sets the has_overallocated_resource of this Task.
Indicates whether the task has a resource assigned which has more work on assigned tasks than can be completed within normal working capacity. # noqa: E501
:param has_overallocated_resource: The has_overallocated_resource of this Task. # noqa: E501
:type: bool
"""
if has_overallocated_resource is None:
raise ValueError("Invalid value for `has_overallocated_resource`, must not be `None`") # noqa: E501
self._has_overallocated_resource = has_overallocated_resource
@property
def hide_bar(self):
"""Gets the hide_bar of this Task. # noqa: E501
Determines whether the Gantt bar of a task is hidden when displayed in Microsoft Project. # noqa: E501
:return: The hide_bar of this Task. # noqa: E501
:rtype: bool
"""
return self._hide_bar
@hide_bar.setter
def hide_bar(self, hide_bar):
"""Sets the hide_bar of this Task.
Determines whether the Gantt bar of a task is hidden when displayed in Microsoft Project. # noqa: E501
:param hide_bar: The hide_bar of this Task. # noqa: E501
:type: bool
"""
if hide_bar is None:
raise ValueError("Invalid value for `hide_bar`, must not be `None`") # noqa: E501
self._hide_bar = hide_bar
@property
def ignore_resource_calendar(self):
"""Gets the ignore_resource_calendar of this Task. # noqa: E501
Determines whether a task ignores the resource calendar. # noqa: E501
:return: The ignore_resource_calendar of this Task. # noqa: E501
:rtype: bool
"""
return self._ignore_resource_calendar
@ignore_resource_calendar.setter
def ignore_resource_calendar(self, ignore_resource_calendar):
"""Sets the ignore_resource_calendar of this Task.
Determines whether a task ignores the resource calendar. # noqa: E501
:param ignore_resource_calendar: The ignore_resource_calendar of this Task. # noqa: E501
:type: bool
"""
if ignore_resource_calendar is None:
raise ValueError("Invalid value for `ignore_resource_calendar`, must not be `None`") # noqa: E501
self._ignore_resource_calendar = ignore_resource_calendar
@property
def late_finish(self):
"""Gets the late_finish of this Task. # noqa: E501
The late finish date of a task. # noqa: E501
:return: The late_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._late_finish
@late_finish.setter
def late_finish(self, late_finish):
"""Sets the late_finish of this Task.
The late finish date of a task. # noqa: E501
:param late_finish: The late_finish of this Task. # noqa: E501
:type: datetime
"""
if late_finish is None:
raise ValueError("Invalid value for `late_finish`, must not be `None`") # noqa: E501
self._late_finish = late_finish
@property
def late_start(self):
"""Gets the late_start of this Task. # noqa: E501
The late start date of a task. # noqa: E501
:return: The late_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._late_start
@late_start.setter
def late_start(self, late_start):
"""Sets the late_start of this Task.
The late start date of a task. # noqa: E501
:param late_start: The late_start of this Task. # noqa: E501
:type: datetime
"""
if late_start is None:
raise ValueError("Invalid value for `late_start`, must not be `None`") # noqa: E501
self._late_start = late_start
@property
def is_level_assignments(self):
"""Gets the is_level_assignments of this Task. # noqa: E501
:return: The is_level_assignments of this Task. # noqa: E501
:rtype: bool
"""
return self._is_level_assignments
@is_level_assignments.setter
def is_level_assignments(self, is_level_assignments):
"""Sets the is_level_assignments of this Task.
:param is_level_assignments: The is_level_assignments of this Task. # noqa: E501
:type: bool
"""
if is_level_assignments is None:
raise ValueError("Invalid value for `is_level_assignments`, must not be `None`") # noqa: E501
self._is_level_assignments = is_level_assignments
@property
def can_leveling_split(self):
"""Gets the can_leveling_split of this Task. # noqa: E501
:return: The can_leveling_split of this Task. # noqa: E501
:rtype: bool
"""
return self._can_leveling_split
@can_leveling_split.setter
def can_leveling_split(self, can_leveling_split):
"""Sets the can_leveling_split of this Task.
:param can_leveling_split: The can_leveling_split of this Task. # noqa: E501
:type: bool
"""
if can_leveling_split is None:
raise ValueError("Invalid value for `can_leveling_split`, must not be `None`") # noqa: E501
self._can_leveling_split = can_leveling_split
@property
def leveling_delay(self):
"""Gets the leveling_delay of this Task. # noqa: E501
The delay caused by leveling a task. # noqa: E501
:return: The leveling_delay of this Task. # noqa: E501
:rtype: int
"""
return self._leveling_delay
@leveling_delay.setter
def leveling_delay(self, leveling_delay):
"""Sets the leveling_delay of this Task.
The delay caused by leveling a task. # noqa: E501
:param leveling_delay: The leveling_delay of this Task. # noqa: E501
:type: int
"""
if leveling_delay is None:
raise ValueError("Invalid value for `leveling_delay`, must not be `None`") # noqa: E501
self._leveling_delay = leveling_delay
@property
def is_marked(self):
"""Gets the is_marked of this Task. # noqa: E501
Shows whether a task is marked for further action or identification of some kind. # noqa: E501
:return: The is_marked of this Task. # noqa: E501
:rtype: bool
"""
return self._is_marked
@is_marked.setter
def is_marked(self, is_marked):
"""Sets the is_marked of this Task.
Shows whether a task is marked for further action or identification of some kind. # noqa: E501
:param is_marked: The is_marked of this Task. # noqa: E501
:type: bool
"""
if is_marked is None:
raise ValueError("Invalid value for `is_marked`, must not be `None`") # noqa: E501
self._is_marked = is_marked
@property
def is_milestone(self):
"""Gets the is_milestone of this Task. # noqa: E501
Determines whether a task is a milestone. # noqa: E501
:return: The is_milestone of this Task. # noqa: E501
:rtype: bool
"""
return self._is_milestone
@is_milestone.setter
def is_milestone(self, is_milestone):
"""Sets the is_milestone of this Task.
Determines whether a task is a milestone. # noqa: E501
:param is_milestone: The is_milestone of this Task. # noqa: E501
:type: bool
"""
if is_milestone is None:
raise ValueError("Invalid value for `is_milestone`, must not be `None`") # noqa: E501
self._is_milestone = is_milestone
@property
def is_critical(self):
"""Gets the is_critical of this Task. # noqa: E501
Determines whether a task is in the critical chain. # noqa: E501
:return: The is_critical of this Task. # noqa: E501
:rtype: bool
"""
return self._is_critical
@is_critical.setter
def is_critical(self, is_critical):
"""Sets the is_critical of this Task.
Determines whether a task is in the critical chain. # noqa: E501
:param is_critical: The is_critical of this Task. # noqa: E501
:type: bool
"""
if is_critical is None:
raise ValueError("Invalid value for `is_critical`, must not be `None`") # noqa: E501
self._is_critical = is_critical
@property
def is_subproject(self):
"""Gets the is_subproject of this Task. # noqa: E501
Determines whether a task is an inserted project. # noqa: E501
:return: The is_subproject of this Task. # noqa: E501
:rtype: bool
"""
return self._is_subproject
@is_subproject.setter
def is_subproject(self, is_subproject):
"""Sets the is_subproject of this Task.
Determines whether a task is an inserted project. # noqa: E501
:param is_subproject: The is_subproject of this Task. # noqa: E501
:type: bool
"""
if is_subproject is None:
raise ValueError("Invalid value for `is_subproject`, must not be `None`") # noqa: E501
self._is_subproject = is_subproject
@property
def is_subproject_read_only(self):
"""Gets the is_subproject_read_only of this Task. # noqa: E501
Determines whether a subproject is read-only. # noqa: E501
:return: The is_subproject_read_only of this Task. # noqa: E501
:rtype: bool
"""
return self._is_subproject_read_only
@is_subproject_read_only.setter
def is_subproject_read_only(self, is_subproject_read_only):
"""Sets the is_subproject_read_only of this Task.
Determines whether a subproject is read-only. # noqa: E501
:param is_subproject_read_only: The is_subproject_read_only of this Task. # noqa: E501
:type: bool
"""
if is_subproject_read_only is None:
raise ValueError("Invalid value for `is_subproject_read_only`, must not be `None`") # noqa: E501
self._is_subproject_read_only = is_subproject_read_only
@property
def subproject_name(self):
"""Gets the subproject_name of this Task. # noqa: E501
The source location of a subproject. # noqa: E501
:return: The subproject_name of this Task. # noqa: E501
:rtype: str
"""
return self._subproject_name
@subproject_name.setter
def subproject_name(self, subproject_name):
"""Sets the subproject_name of this Task.
The source location of a subproject. # noqa: E501
:param subproject_name: The subproject_name of this Task. # noqa: E501
:type: str
"""
self._subproject_name = subproject_name
@property
def is_summary(self):
"""Gets the is_summary of this Task. # noqa: E501
Determines whether a task is a summary task. # noqa: E501
:return: The is_summary of this Task. # noqa: E501
:rtype: bool
"""
return self._is_summary
@is_summary.setter
def is_summary(self, is_summary):
"""Sets the is_summary of this Task.
Determines whether a task is a summary task. # noqa: E501
:param is_summary: The is_summary of this Task. # noqa: E501
:type: bool
"""
if is_summary is None:
raise ValueError("Invalid value for `is_summary`, must not be `None`") # noqa: E501
self._is_summary = is_summary
@property
def subtasks_uids(self):
"""Gets the subtasks_uids of this Task. # noqa: E501
Unique ids of all subtasks. # noqa: E501
:return: The subtasks_uids of this Task. # noqa: E501
:rtype: list[int]
"""
return self._subtasks_uids
@subtasks_uids.setter
def subtasks_uids(self, subtasks_uids):
"""Sets the subtasks_uids of this Task.
Unique ids of all subtasks. # noqa: E501
:param subtasks_uids: The subtasks_uids of this Task. # noqa: E501
:type: list[int]
"""
self._subtasks_uids = subtasks_uids
@property
def outline_level(self):
"""Gets the outline_level of this Task. # noqa: E501
The outline level of a task. # noqa: E501
:return: The outline_level of this Task. # noqa: E501
:rtype: int
"""
return self._outline_level
@outline_level.setter
def outline_level(self, outline_level):
"""Sets the outline_level of this Task.
The outline level of a task. # noqa: E501
:param outline_level: The outline_level of this Task. # noqa: E501
:type: int
"""
if outline_level is None:
raise ValueError("Invalid value for `outline_level`, must not be `None`") # noqa: E501
self._outline_level = outline_level
@property
def is_over_allocated(self):
"""Gets the is_over_allocated of this Task. # noqa: E501
:return: The is_over_allocated of this Task. # noqa: E501
:rtype: bool
"""
return self._is_over_allocated
@is_over_allocated.setter
def is_over_allocated(self, is_over_allocated):
"""Sets the is_over_allocated of this Task.
:param is_over_allocated: The is_over_allocated of this Task. # noqa: E501
:type: bool
"""
if is_over_allocated is None:
raise ValueError("Invalid value for `is_over_allocated`, must not be `None`") # noqa: E501
self._is_over_allocated = is_over_allocated
@property
def is_estimated(self):
"""Gets the is_estimated of this Task. # noqa: E501
Determines whether a task is estimated. # noqa: E501
:return: The is_estimated of this Task. # noqa: E501
:rtype: bool
"""
return self._is_estimated
@is_estimated.setter
def is_estimated(self, is_estimated):
"""Sets the is_estimated of this Task.
Determines whether a task is estimated. # noqa: E501
:param is_estimated: The is_estimated of this Task. # noqa: E501
:type: bool
"""
if is_estimated is None:
raise ValueError("Invalid value for `is_estimated`, must not be `None`") # noqa: E501
self._is_estimated = is_estimated
@property
def overtime_cost(self):
"""Gets the overtime_cost of this Task. # noqa: E501
The sum of the actual and remaining overtime costs of a task. # noqa: E501
:return: The overtime_cost of this Task. # noqa: E501
:rtype: float
"""
return self._overtime_cost
@overtime_cost.setter
def overtime_cost(self, overtime_cost):
"""Sets the overtime_cost of this Task.
The sum of the actual and remaining overtime costs of a task. # noqa: E501
:param overtime_cost: The overtime_cost of this Task. # noqa: E501
:type: float
"""
if overtime_cost is None:
raise ValueError("Invalid value for `overtime_cost`, must not be `None`") # noqa: E501
self._overtime_cost = overtime_cost
@property
def overtime_work(self):
"""Gets the overtime_work of this Task. # noqa: E501
The amount of overtime work scheduled for a task. # noqa: E501
:return: The overtime_work of this Task. # noqa: E501
:rtype: str
"""
return self._overtime_work
@overtime_work.setter
def overtime_work(self, overtime_work):
"""Sets the overtime_work of this Task.
The amount of overtime work scheduled for a task. # noqa: E501
:param overtime_work: The overtime_work of this Task. # noqa: E501
:type: str
"""
if overtime_work is None:
raise ValueError("Invalid value for `overtime_work`, must not be `None`") # noqa: E501
self._overtime_work = overtime_work
@property
def physical_percent_complete(self):
"""Gets the physical_percent_complete of this Task. # noqa: E501
The percentage complete value entered by the Project Manager. # noqa: E501
:return: The physical_percent_complete of this Task. # noqa: E501
:rtype: int
"""
return self._physical_percent_complete
@physical_percent_complete.setter
def physical_percent_complete(self, physical_percent_complete):
"""Sets the physical_percent_complete of this Task.
The percentage complete value entered by the Project Manager. # noqa: E501
:param physical_percent_complete: The physical_percent_complete of this Task. # noqa: E501
:type: int
"""
if physical_percent_complete is None:
raise ValueError("Invalid value for `physical_percent_complete`, must not be `None`") # noqa: E501
self._physical_percent_complete = physical_percent_complete
@property
def pre_leveled_finish(self):
"""Gets the pre_leveled_finish of this Task. # noqa: E501
:return: The pre_leveled_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._pre_leveled_finish
@pre_leveled_finish.setter
def pre_leveled_finish(self, pre_leveled_finish):
"""Sets the pre_leveled_finish of this Task.
:param pre_leveled_finish: The pre_leveled_finish of this Task. # noqa: E501
:type: datetime
"""
if pre_leveled_finish is None:
raise ValueError("Invalid value for `pre_leveled_finish`, must not be `None`") # noqa: E501
self._pre_leveled_finish = pre_leveled_finish
@property
def pre_leveled_start(self):
"""Gets the pre_leveled_start of this Task. # noqa: E501
:return: The pre_leveled_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._pre_leveled_start
@pre_leveled_start.setter
def pre_leveled_start(self, pre_leveled_start):
"""Sets the pre_leveled_start of this Task.
:param pre_leveled_start: The pre_leveled_start of this Task. # noqa: E501
:type: datetime
"""
if pre_leveled_start is None:
raise ValueError("Invalid value for `pre_leveled_start`, must not be `None`") # noqa: E501
self._pre_leveled_start = pre_leveled_start
@property
def is_recurring(self):
"""Gets the is_recurring of this Task. # noqa: E501
Determines whether a task is a recurring task. # noqa: E501
:return: The is_recurring of this Task. # noqa: E501
:rtype: bool
"""
return self._is_recurring
@is_recurring.setter
def is_recurring(self, is_recurring):
"""Sets the is_recurring of this Task.
Determines whether a task is a recurring task. # noqa: E501
:param is_recurring: The is_recurring of this Task. # noqa: E501
:type: bool
"""
if is_recurring is None:
raise ValueError("Invalid value for `is_recurring`, must not be `None`") # noqa: E501
self._is_recurring = is_recurring
@property
def regular_work(self):
"""Gets the regular_work of this Task. # noqa: E501
The amount of non-overtime work scheduled for a task. # noqa: E501
:return: The regular_work of this Task. # noqa: E501
:rtype: str
"""
return self._regular_work
@regular_work.setter
def regular_work(self, regular_work):
"""Sets the regular_work of this Task.
The amount of non-overtime work scheduled for a task. # noqa: E501
:param regular_work: The regular_work of this Task. # noqa: E501
:type: str
"""
if regular_work is None:
raise ValueError("Invalid value for `regular_work`, must not be `None`") # noqa: E501
self._regular_work = regular_work
@property
def remaining_cost(self):
"""Gets the remaining_cost of this Task. # noqa: E501
The remaining projected cost of completing a task. # noqa: E501
:return: The remaining_cost of this Task. # noqa: E501
:rtype: float
"""
return self._remaining_cost
@remaining_cost.setter
def remaining_cost(self, remaining_cost):
"""Sets the remaining_cost of this Task.
The remaining projected cost of completing a task. # noqa: E501
:param remaining_cost: The remaining_cost of this Task. # noqa: E501
:type: float
"""
if remaining_cost is None:
raise ValueError("Invalid value for `remaining_cost`, must not be `None`") # noqa: E501
self._remaining_cost = remaining_cost
@property
def remaining_duration(self):
"""Gets the remaining_duration of this Task. # noqa: E501
The amount of time required to complete the unfinished portion of a task. # noqa: E501
:return: The remaining_duration of this Task. # noqa: E501
:rtype: str
"""
return self._remaining_duration
@remaining_duration.setter
def remaining_duration(self, remaining_duration):
"""Sets the remaining_duration of this Task.
The amount of time required to complete the unfinished portion of a task. # noqa: E501
:param remaining_duration: The remaining_duration of this Task. # noqa: E501
:type: str
"""
if remaining_duration is None:
raise ValueError("Invalid value for `remaining_duration`, must not be `None`") # noqa: E501
self._remaining_duration = remaining_duration
@property
def remaining_overtime_cost(self):
"""Gets the remaining_overtime_cost of this Task. # noqa: E501
The remaining overtime cost projected to finish a task. # noqa: E501
:return: The remaining_overtime_cost of this Task. # noqa: E501
:rtype: float
"""
return self._remaining_overtime_cost
@remaining_overtime_cost.setter
def remaining_overtime_cost(self, remaining_overtime_cost):
"""Sets the remaining_overtime_cost of this Task.
The remaining overtime cost projected to finish a task. # noqa: E501
:param remaining_overtime_cost: The remaining_overtime_cost of this Task. # noqa: E501
:type: float
"""
if remaining_overtime_cost is None:
raise ValueError("Invalid value for `remaining_overtime_cost`, must not be `None`") # noqa: E501
self._remaining_overtime_cost = remaining_overtime_cost
@property
def remaining_overtime_work(self):
"""Gets the remaining_overtime_work of this Task. # noqa: E501
The remaining overtime work scheduled to finish a task. # noqa: E501
:return: The remaining_overtime_work of this Task. # noqa: E501
:rtype: str
"""
return self._remaining_overtime_work
@remaining_overtime_work.setter
def remaining_overtime_work(self, remaining_overtime_work):
"""Sets the remaining_overtime_work of this Task.
The remaining overtime work scheduled to finish a task. # noqa: E501
:param remaining_overtime_work: The remaining_overtime_work of this Task. # noqa: E501
:type: str
"""
if remaining_overtime_work is None:
raise ValueError("Invalid value for `remaining_overtime_work`, must not be `None`") # noqa: E501
self._remaining_overtime_work = remaining_overtime_work
@property
def remaining_work(self):
"""Gets the remaining_work of this Task. # noqa: E501
The remaining work scheduled to complete a task. # noqa: E501
:return: The remaining_work of this Task. # noqa: E501
:rtype: str
"""
return self._remaining_work
@remaining_work.setter
def remaining_work(self, remaining_work):
"""Sets the remaining_work of this Task.
The remaining work scheduled to complete a task. # noqa: E501
:param remaining_work: The remaining_work of this Task. # noqa: E501
:type: str
"""
if remaining_work is None:
raise ValueError("Invalid value for `remaining_work`, must not be `None`") # noqa: E501
self._remaining_work = remaining_work
@property
def resume(self):
"""Gets the resume of this Task. # noqa: E501
The date when a task is resumed. # noqa: E501
:return: The resume of this Task. # noqa: E501
:rtype: datetime
"""
return self._resume
@resume.setter
def resume(self, resume):
"""Sets the resume of this Task.
The date when a task is resumed. # noqa: E501
:param resume: The resume of this Task. # noqa: E501
:type: datetime
"""
if resume is None:
raise ValueError("Invalid value for `resume`, must not be `None`") # noqa: E501
self._resume = resume
@property
def is_resume_valid(self):
"""Gets the is_resume_valid of this Task. # noqa: E501
Determines whether a task can be resumed. # noqa: E501
:return: The is_resume_valid of this Task. # noqa: E501
:rtype: bool
"""
return self._is_resume_valid
@is_resume_valid.setter
def is_resume_valid(self, is_resume_valid):
"""Sets the is_resume_valid of this Task.
Determines whether a task can be resumed. # noqa: E501
:param is_resume_valid: The is_resume_valid of this Task. # noqa: E501
:type: bool
"""
self._is_resume_valid = is_resume_valid
@property
def stop(self):
"""Gets the stop of this Task. # noqa: E501
The date that represents the end of the actual portion of a task. # noqa: E501
:return: The stop of this Task. # noqa: E501
:rtype: datetime
"""
return self._stop
@stop.setter
def stop(self, stop):
"""Sets the stop of this Task.
The date that represents the end of the actual portion of a task. # noqa: E501
:param stop: The stop of this Task. # noqa: E501
:type: datetime
"""
if stop is None:
raise ValueError("Invalid value for `stop`, must not be `None`") # noqa: E501
self._stop = stop
@property
def is_rollup(self):
"""Gets the is_rollup of this Task. # noqa: E501
Determines whether a task is rolled up. # noqa: E501
:return: The is_rollup of this Task. # noqa: E501
:rtype: bool
"""
return self._is_rollup
@is_rollup.setter
def is_rollup(self, is_rollup):
"""Sets the is_rollup of this Task.
Determines whether a task is rolled up. # noqa: E501
:param is_rollup: The is_rollup of this Task. # noqa: E501
:type: bool
"""
if is_rollup is None:
raise ValueError("Invalid value for `is_rollup`, must not be `None`") # noqa: E501
self._is_rollup = is_rollup
@property
def start_slack(self):
"""Gets the start_slack of this Task. # noqa: E501
Returns the task's start slack. # noqa: E501
:return: The start_slack of this Task. # noqa: E501
:rtype: int
"""
return self._start_slack
@start_slack.setter
def start_slack(self, start_slack):
"""Sets the start_slack of this Task.
Returns the task's start slack. # noqa: E501
:param start_slack: The start_slack of this Task. # noqa: E501
:type: int
"""
if start_slack is None:
raise ValueError("Invalid value for `start_slack`, must not be `None`") # noqa: E501
self._start_slack = start_slack
@property
def start_variance(self):
"""Gets the start_variance of this Task. # noqa: E501
The variance of the task start date from the baseline start date in minutes. # noqa: E501
:return: The start_variance of this Task. # noqa: E501
:rtype: int
"""
return self._start_variance
@start_variance.setter
def start_variance(self, start_variance):
"""Sets the start_variance of this Task.
The variance of the task start date from the baseline start date in minutes. # noqa: E501
:param start_variance: The start_variance of this Task. # noqa: E501
:type: int
"""
if start_variance is None:
raise ValueError("Invalid value for `start_variance`, must not be `None`") # noqa: E501
self._start_variance = start_variance
@property
def calendar_uid(self):
"""Gets the calendar_uid of this Task. # noqa: E501
The unique id of the task calendar. # noqa: E501
:return: The calendar_uid of this Task. # noqa: E501
:rtype: int
"""
return self._calendar_uid
@calendar_uid.setter
def calendar_uid(self, calendar_uid):
"""Sets the calendar_uid of this Task.
The unique id of the task calendar. # noqa: E501
:param calendar_uid: The calendar_uid of this Task. # noqa: E501
:type: int
"""
if calendar_uid is None:
raise ValueError("Invalid value for `calendar_uid`, must not be `None`") # noqa: E501
self._calendar_uid = calendar_uid
@property
def is_manual(self):
"""Gets the is_manual of this Task. # noqa: E501
Determines whether a task is manually scheduled. # noqa: E501
:return: The is_manual of this Task. # noqa: E501
:rtype: bool
"""
return self._is_manual
@is_manual.setter
def is_manual(self, is_manual):
"""Sets the is_manual of this Task.
Determines whether a task is manually scheduled. # noqa: E501
:param is_manual: The is_manual of this Task. # noqa: E501
:type: bool
"""
if is_manual is None:
raise ValueError("Invalid value for `is_manual`, must not be `None`") # noqa: E501
self._is_manual = is_manual
@property
def manual_start(self):
"""Gets the manual_start of this Task. # noqa: E501
Defines manually scheduled start of a task. # noqa: E501
:return: The manual_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._manual_start
@manual_start.setter
def manual_start(self, manual_start):
"""Sets the manual_start of this Task.
Defines manually scheduled start of a task. # noqa: E501
:param manual_start: The manual_start of this Task. # noqa: E501
:type: datetime
"""
if manual_start is None:
raise ValueError("Invalid value for `manual_start`, must not be `None`") # noqa: E501
self._manual_start = manual_start
@property
def manual_finish(self):
"""Gets the manual_finish of this Task. # noqa: E501
Defines manually scheduled finish of a task. # noqa: E501
:return: The manual_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._manual_finish
@manual_finish.setter
def manual_finish(self, manual_finish):
"""Sets the manual_finish of this Task.
Defines manually scheduled finish of a task. # noqa: E501
:param manual_finish: The manual_finish of this Task. # noqa: E501
:type: datetime
"""
if manual_finish is None:
raise ValueError("Invalid value for `manual_finish`, must not be `None`") # noqa: E501
self._manual_finish = manual_finish
@property
def manual_duration(self):
"""Gets the manual_duration of this Task. # noqa: E501
Defines manually scheduled duration of a task. # noqa: E501
:return: The manual_duration of this Task. # noqa: E501
:rtype: str
"""
return self._manual_duration
@manual_duration.setter
def manual_duration(self, manual_duration):
"""Sets the manual_duration of this Task.
Defines manually scheduled duration of a task. # noqa: E501
:param manual_duration: The manual_duration of this Task. # noqa: E501
:type: str
"""
if manual_duration is None:
raise ValueError("Invalid value for `manual_duration`, must not be `None`") # noqa: E501
self._manual_duration = manual_duration
@property
def total_slack(self):
"""Gets the total_slack of this Task. # noqa: E501
The amount of total slack. # noqa: E501
:return: The total_slack of this Task. # noqa: E501
:rtype: int
"""
return self._total_slack
@total_slack.setter
def total_slack(self, total_slack):
"""Sets the total_slack of this Task.
The amount of total slack. # noqa: E501
:param total_slack: The total_slack of this Task. # noqa: E501
:type: int
"""
if total_slack is None:
raise ValueError("Invalid value for `total_slack`, must not be `None`") # noqa: E501
self._total_slack = total_slack
@property
def type(self):
"""Gets the type of this Task. # noqa: E501
The type of a task. # noqa: E501
:return: The type of this Task. # noqa: E501
:rtype: TaskType
"""
return self._type
@type.setter
def type(self, type):
"""Sets the type of this Task.
The type of a task. # noqa: E501
:param type: The type of this Task. # noqa: E501
:type: TaskType
"""
if type is None:
raise ValueError("Invalid value for `type`, must not be `None`") # noqa: E501
self._type = type
@property
def wbs(self):
"""Gets the wbs of this Task. # noqa: E501
The work breakdown structure code of a task. # noqa: E501
:return: The wbs of this Task. # noqa: E501
:rtype: str
"""
return self._wbs
@wbs.setter
def wbs(self, wbs):
"""Sets the wbs of this Task.
The work breakdown structure code of a task. # noqa: E501
:param wbs: The wbs of this Task. # noqa: E501
:type: str
"""
self._wbs = wbs
@property
def priority(self):
"""Gets the priority of this Task. # noqa: E501
The priority of a task from 0 to 1000. # noqa: E501
:return: The priority of this Task. # noqa: E501
:rtype: int
"""
return self._priority
@priority.setter
def priority(self, priority):
"""Sets the priority of this Task.
The priority of a task from 0 to 1000. # noqa: E501
:param priority: The priority of this Task. # noqa: E501
:type: int
"""
if priority is None:
raise ValueError("Invalid value for `priority`, must not be `None`") # noqa: E501
self._priority = priority
@property
def work(self):
"""Gets the work of this Task. # noqa: E501
The amount of the scheduled work for a task. # noqa: E501
:return: The work of this Task. # noqa: E501
:rtype: str
"""
return self._work
@work.setter
def work(self, work):
"""Sets the work of this Task.
The amount of the scheduled work for a task. # noqa: E501
:param work: The work of this Task. # noqa: E501
:type: str
"""
if work is None:
raise ValueError("Invalid value for `work`, must not be `None`") # noqa: E501
self._work = work
@property
def work_variance(self):
"""Gets the work_variance of this Task. # noqa: E501
The variance of the task work from the baseline task work in minutes. # noqa: E501
:return: The work_variance of this Task. # noqa: E501
:rtype: float
"""
return self._work_variance
@work_variance.setter
def work_variance(self, work_variance):
"""Sets the work_variance of this Task.
The variance of the task work from the baseline task work in minutes. # noqa: E501
:param work_variance: The work_variance of this Task. # noqa: E501
:type: float
"""
if work_variance is None:
raise ValueError("Invalid value for `work_variance`, must not be `None`") # noqa: E501
self._work_variance = work_variance
@property
def notes_text(self):
"""Gets the notes_text of this Task. # noqa: E501
Notes' plain text extracted from RTF data. # noqa: E501
:return: The notes_text of this Task. # noqa: E501
:rtype: str
"""
return self._notes_text
@notes_text.setter
def notes_text(self, notes_text):
"""Sets the notes_text of this Task.
Notes' plain text extracted from RTF data. # noqa: E501
:param notes_text: The notes_text of this Task. # noqa: E501
:type: str
"""
self._notes_text = notes_text
@property
def notes_rtf(self):
"""Gets the notes_rtf of this Task. # noqa: E501
The text notes in RTF format. # noqa: E501
:return: The notes_rtf of this Task. # noqa: E501
:rtype: str
"""
return self._notes_rtf
@notes_rtf.setter
def notes_rtf(self, notes_rtf):
"""Sets the notes_rtf of this Task.
The text notes in RTF format. # noqa: E501
:param notes_rtf: The notes_rtf of this Task. # noqa: E501
:type: str
"""
self._notes_rtf = notes_rtf
@property
def acwp(self):
"""Gets the acwp of this Task. # noqa: E501
:return: The acwp of this Task. # noqa: E501
:rtype: float
"""
return self._acwp
@acwp.setter
def acwp(self, acwp):
"""Sets the acwp of this Task.
:param acwp: The acwp of this Task. # noqa: E501
:type: float
"""
if acwp is None:
raise ValueError("Invalid value for `acwp`, must not be `None`") # noqa: E501
self._acwp = acwp
@property
def bcws(self):
"""Gets the bcws of this Task. # noqa: E501
:return: The bcws of this Task. # noqa: E501
:rtype: float
"""
return self._bcws
@bcws.setter
def bcws(self, bcws):
"""Sets the bcws of this Task.
:param bcws: The bcws of this Task. # noqa: E501
:type: float
"""
if bcws is None:
raise ValueError("Invalid value for `bcws`, must not be `None`") # noqa: E501
self._bcws = bcws
@property
def bcwp(self):
"""Gets the bcwp of this Task. # noqa: E501
:return: The bcwp of this Task. # noqa: E501
:rtype: float
"""
return self._bcwp
@bcwp.setter
def bcwp(self, bcwp):
"""Sets the bcwp of this Task.
:param bcwp: The bcwp of this Task. # noqa: E501
:type: float
"""
if bcwp is None:
raise ValueError("Invalid value for `bcwp`, must not be `None`") # noqa: E501
self._bcwp = bcwp
@property
def leveling_delay_format(self):
"""Gets the leveling_delay_format of this Task. # noqa: E501
LevelingDelayFormat # noqa: E501
:return: The leveling_delay_format of this Task. # noqa: E501
:rtype: TimeUnitType
"""
return self._leveling_delay_format
@leveling_delay_format.setter
def leveling_delay_format(self, leveling_delay_format):
"""Sets the leveling_delay_format of this Task.
LevelingDelayFormat # noqa: E501
:param leveling_delay_format: The leveling_delay_format of this Task. # noqa: E501
:type: TimeUnitType
"""
if leveling_delay_format is None:
raise ValueError("Invalid value for `leveling_delay_format`, must not be `None`") # noqa: E501
self._leveling_delay_format = leveling_delay_format
@property
def predecessors(self):
"""Gets the predecessors of this Task. # noqa: E501
The task Uid numbers for the predecessor tasks on which the task depends before it can be started or finished. # noqa: E501
:return: The predecessors of this Task. # noqa: E501
:rtype: str
"""
return self._predecessors
@predecessors.setter
def predecessors(self, predecessors):
"""Sets the predecessors of this Task.
The task Uid numbers for the predecessor tasks on which the task depends before it can be started or finished. # noqa: E501
:param predecessors: The predecessors of this Task. # noqa: E501
:type: str
"""
self._predecessors = predecessors
@property
def successors(self):
"""Gets the successors of this Task. # noqa: E501
The task Uid numbers for the successor tasks to a task. # noqa: E501
:return: The successors of this Task. # noqa: E501
:rtype: str
"""
return self._successors
@successors.setter
def successors(self, successors):
"""Sets the successors of this Task.
The task Uid numbers for the successor tasks to a task. # noqa: E501
:param successors: The successors of this Task. # noqa: E501
:type: str
"""
self._successors = successors
@property
def ignore_warnings(self):
"""Gets the ignore_warnings of this Task. # noqa: E501
Indicates whether to hide the schedule conflict warning indicator in Microsoft Project. # noqa: E501
:return: The ignore_warnings of this Task. # noqa: E501
:rtype: bool
"""
return self._ignore_warnings
@ignore_warnings.setter
def ignore_warnings(self, ignore_warnings):
"""Sets the ignore_warnings of this Task.
Indicates whether to hide the schedule conflict warning indicator in Microsoft Project. # noqa: E501
:param ignore_warnings: The ignore_warnings of this Task. # noqa: E501
:type: bool
"""
if ignore_warnings is None:
raise ValueError("Invalid value for `ignore_warnings`, must not be `None`") # noqa: E501
self._ignore_warnings = ignore_warnings
@property
def is_expanded(self):
"""Gets the is_expanded of this Task. # noqa: E501
Determines whether a summary task is expanded or not in GanttChart view. # noqa: E501
:return: The is_expanded of this Task. # noqa: E501
:rtype: bool
"""
return self._is_expanded
@is_expanded.setter
def is_expanded(self, is_expanded):
"""Sets the is_expanded of this Task.
Determines whether a summary task is expanded or not in GanttChart view. # noqa: E501
:param is_expanded: The is_expanded of this Task. # noqa: E501
:type: bool
"""
if is_expanded is None:
raise ValueError("Invalid value for `is_expanded`, must not be `None`") # noqa: E501
self._is_expanded = is_expanded
@property
def display_on_timeline(self):
"""Gets the display_on_timeline of this Task. # noqa: E501
Specifies whether a task should be displayed on a timeline view. # noqa: E501
:return: The display_on_timeline of this Task. # noqa: E501
:rtype: bool
"""
return self._display_on_timeline
@display_on_timeline.setter
def display_on_timeline(self, display_on_timeline):
"""Sets the display_on_timeline of this Task.
Specifies whether a task should be displayed on a timeline view. # noqa: E501
:param display_on_timeline: The display_on_timeline of this Task. # noqa: E501
:type: bool
"""
if display_on_timeline is None:
raise ValueError("Invalid value for `display_on_timeline`, must not be `None`") # noqa: E501
self._display_on_timeline = display_on_timeline
@property
def display_as_summary(self):
"""Gets the display_as_summary of this Task. # noqa: E501
Determines whether the task should be displayed as a summary task. Reading supported for XML format only. # noqa: E501
:return: The display_as_summary of this Task. # noqa: E501
:rtype: bool
"""
return self._display_as_summary
@display_as_summary.setter
def display_as_summary(self, display_as_summary):
"""Sets the display_as_summary of this Task.
Determines whether the task should be displayed as a summary task. Reading supported for XML format only. # noqa: E501
:param display_as_summary: The display_as_summary of this Task. # noqa: E501
:type: bool
"""
if display_as_summary is None:
raise ValueError("Invalid value for `display_as_summary`, must not be `None`") # noqa: E501
self._display_as_summary = display_as_summary
@property
def hyperlink(self):
"""Gets the hyperlink of this Task. # noqa: E501
The title or explanatory text for a hyperlink associated with a task. # noqa: E501
:return: The hyperlink of this Task. # noqa: E501
:rtype: str
"""
return self._hyperlink
@hyperlink.setter
def hyperlink(self, hyperlink):
"""Sets the hyperlink of this Task.
The title or explanatory text for a hyperlink associated with a task. # noqa: E501
:param hyperlink: The hyperlink of this Task. # noqa: E501
:type: str
"""
self._hyperlink = hyperlink
@property
def hyperlink_address(self):
"""Gets the hyperlink_address of this Task. # noqa: E501
The address for a hyperlink associated with a task. # noqa: E501
:return: The hyperlink_address of this Task. # noqa: E501
:rtype: str
"""
return self._hyperlink_address
@hyperlink_address.setter
def hyperlink_address(self, hyperlink_address):
"""Sets the hyperlink_address of this Task.
The address for a hyperlink associated with a task. # noqa: E501
:param hyperlink_address: The hyperlink_address of this Task. # noqa: E501
:type: str
"""
self._hyperlink_address = hyperlink_address
@property
def hyperlink_sub_address(self):
"""Gets the hyperlink_sub_address of this Task. # noqa: E501
The specific location in a document in a hyperlink associated with a task. # noqa: E501
:return: The hyperlink_sub_address of this Task. # noqa: E501
:rtype: str
"""
return self._hyperlink_sub_address
@hyperlink_sub_address.setter
def hyperlink_sub_address(self, hyperlink_sub_address):
"""Sets the hyperlink_sub_address of this Task.
The specific location in a document in a hyperlink associated with a task. # noqa: E501
:param hyperlink_sub_address: The hyperlink_sub_address of this Task. # noqa: E501
:type: str
"""
self._hyperlink_sub_address = hyperlink_sub_address
@property
def earned_value_method(self):
"""Gets the earned_value_method of this Task. # noqa: E501
Determines whether the % Complete or Physical % Complete field should be used to calculate budgeted cost of work performed (BCWP). # noqa: E501
:return: The earned_value_method of this Task. # noqa: E501
:rtype: EarnedValueMethodType
"""
return self._earned_value_method
@earned_value_method.setter
def earned_value_method(self, earned_value_method):
"""Sets the earned_value_method of this Task.
Determines whether the % Complete or Physical % Complete field should be used to calculate budgeted cost of work performed (BCWP). # noqa: E501
:param earned_value_method: The earned_value_method of this Task. # noqa: E501
:type: EarnedValueMethodType
"""
if earned_value_method is None:
raise ValueError("Invalid value for `earned_value_method`, must not be `None`") # noqa: E501
self._earned_value_method = earned_value_method
@property
def is_published(self):
"""Gets the is_published of this Task. # noqa: E501
Determines whether the current task should be published to Project Server with the rest of the project. # noqa: E501
:return: The is_published of this Task. # noqa: E501
:rtype: bool
"""
return self._is_published
@is_published.setter
def is_published(self, is_published):
"""Sets the is_published of this Task.
Determines whether the current task should be published to Project Server with the rest of the project. # noqa: E501
:param is_published: The is_published of this Task. # noqa: E501
:type: bool
"""
if is_published is None:
raise ValueError("Invalid value for `is_published`, must not be `None`") # noqa: E501
self._is_published = is_published
@property
def status_manager(self):
"""Gets the status_manager of this Task. # noqa: E501
The name of the enterprise resource who is to receive status updates for the current task from resources. # noqa: E501
:return: The status_manager of this Task. # noqa: E501
:rtype: str
"""
return self._status_manager
@status_manager.setter
def status_manager(self, status_manager):
"""Sets the status_manager of this Task.
The name of the enterprise resource who is to receive status updates for the current task from resources. # noqa: E501
:param status_manager: The status_manager of this Task. # noqa: E501
:type: str
"""
self._status_manager = status_manager
@property
def commitment_start(self):
"""Gets the commitment_start of this Task. # noqa: E501
The start date of a delivery. Reading supported for XML format only. # noqa: E501
:return: The commitment_start of this Task. # noqa: E501
:rtype: datetime
"""
return self._commitment_start
@commitment_start.setter
def commitment_start(self, commitment_start):
"""Sets the commitment_start of this Task.
The start date of a delivery. Reading supported for XML format only. # noqa: E501
:param commitment_start: The commitment_start of this Task. # noqa: E501
:type: datetime
"""
if commitment_start is None:
raise ValueError("Invalid value for `commitment_start`, must not be `None`") # noqa: E501
self._commitment_start = commitment_start
@property
def commitment_finish(self):
"""Gets the commitment_finish of this Task. # noqa: E501
The finish date of a delivery. Reading supported for XML format only. # noqa: E501
:return: The commitment_finish of this Task. # noqa: E501
:rtype: datetime
"""
return self._commitment_finish
@commitment_finish.setter
def commitment_finish(self, commitment_finish):
"""Sets the commitment_finish of this Task.
The finish date of a delivery. Reading supported for XML format only. # noqa: E501
:param commitment_finish: The commitment_finish of this Task. # noqa: E501
:type: datetime
"""
if commitment_finish is None:
raise ValueError("Invalid value for `commitment_finish`, must not be `None`") # noqa: E501
self._commitment_finish = commitment_finish
@property
def commitment_type(self):
"""Gets the commitment_type of this Task. # noqa: E501
Determines whether a task has an associated delivery or a dependency on an associated delivery. Reading supported for XML format only. # noqa: E501
:return: The commitment_type of this Task. # noqa: E501
:rtype: int
"""
return self._commitment_type
@commitment_type.setter
def commitment_type(self, commitment_type):
"""Sets the commitment_type of this Task.
Determines whether a task has an associated delivery or a dependency on an associated delivery. Reading supported for XML format only. # noqa: E501
:param commitment_type: The commitment_type of this Task. # noqa: E501
:type: int
"""
if commitment_type is None:
raise ValueError("Invalid value for `commitment_type`, must not be `None`") # noqa: E501
self._commitment_type = commitment_type
@property
def baselines(self):
"""Gets the baselines of this Task. # noqa: E501
Gets or sets the collection of baseline values of the task. # noqa: E501
:return: The baselines of this Task. # noqa: E501
:rtype: list[TaskBaseline]
"""
return self._baselines
@baselines.setter
def baselines(self, baselines):
"""Sets the baselines of this Task.
Gets or sets the collection of baseline values of the task. # noqa: E501
:param baselines: The baselines of this Task. # noqa: E501
:type: list[TaskBaseline]
"""
self._baselines = baselines
@property
def extended_attributes(self):
"""Gets the extended_attributes of this Task. # noqa: E501
Task extended attributes. # noqa: E501
:return: The extended_attributes of this Task. # noqa: E501
:rtype: list[ExtendedAttribute]
"""
return self._extended_attributes
@extended_attributes.setter
def extended_attributes(self, extended_attributes):
"""Sets the extended_attributes of this Task.
Task extended attributes. # noqa: E501
:param extended_attributes: The extended_attributes of this Task. # noqa: E501
:type: list[ExtendedAttribute]
"""
self._extended_attributes = extended_attributes
@property
def outline_codes(self):
"""Gets the outline_codes of this Task. # noqa: E501
Task outline codes. # noqa: E501
:return: The outline_codes of this Task. # noqa: E501
:rtype: list[OutlineCode]
"""
return self._outline_codes
@outline_codes.setter
def outline_codes(self, outline_codes):
"""Sets the outline_codes of this Task.
Task outline codes. # noqa: E501
:param outline_codes: The outline_codes of this Task. # noqa: E501
:type: list[OutlineCode]
"""
self._outline_codes = outline_codes
@property
def warning(self):
"""Gets the warning of this Task. # noqa: E501
Represents the flag which indicates that a task has schedule discrepancies. # noqa: E501
:return: The warning of this Task. # noqa: E501
:rtype: bool
"""
return self._warning
@warning.setter
def warning(self, warning):
"""Sets the warning of this Task.
Represents the flag which indicates that a task has schedule discrepancies. # noqa: E501
:param warning: The warning of this Task. # noqa: E501
:type: bool
"""
if warning is None:
raise ValueError("Invalid value for `warning`, must not be `None`") # noqa: E501
self._warning = warning
@property
def activity_id(self):
"""Gets the activity_id of this Task. # noqa: E501
Represents activity id field - a task's unique identifier used by Primavera (only applicable to Primavera projects). # noqa: E501
:return: The activity_id of this Task. # noqa: E501
:rtype: str
"""
return self._activity_id
@activity_id.setter
def activity_id(self, activity_id):
"""Sets the activity_id of this Task.
Represents activity id field - a task's unique identifier used by Primavera (only applicable to Primavera projects). # noqa: E501
:param activity_id: The activity_id of this Task. # noqa: E501
:type: str
"""
self._activity_id = activity_id
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, Task):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""Returns true if both objects are not equal"""
return not self == other
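# --- Usage sketch (illustrative; not part of the generated model) ---
# The empty-constructor call below assumes the standard swagger-codegen
# __init__, which initializes every backing field to None and only invokes
# a setter when the corresponding argument is supplied.
if __name__ == "__main__":
    task = Task()
    task.priority = 500        # int in the 0..1000 range per the docstring
    task.wbs = "1.2.3"         # this setter has no None-guard
    print(task.to_dict())      # nested models are converted recursively
    print(task == Task())      # __eq__ compares the full __dict__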
/odmantic-0.9.2.tar.gz/odmantic-0.9.2/.mongodb-cluster-action/README.md
# MongoDB Cluster Action
[](https://github.com/art049/mongodb-cluster-action/actions/workflows/ci.yml)
## GitHub Action Usage
#### Basic
```yaml
steps:
# Create the MongoDB cluster
- uses: art049/mongodb-cluster-action@v0
id: mongodb-cluster-action
# Run a CI job and pass the cluster address
# in the MONGO_URI env variable
- run: ./script/test
env:
MONGO_URI: ${{ steps.mongodb-cluster-action.outputs.connection-string }}
```
#### Specify a MongoDB server version
```yaml
steps:
...
- uses: art049/mongodb-cluster-action@v0
id: mongodb-cluster-action
with:
version: "3.6"
...
```
#### Run a replicaSet cluster
```yaml
steps:
...
- uses: art049/mongodb-cluster-action@v0
id: mongodb-cluster-action
with:
mode: replicaSet
...
```
#### Run a sharded cluster
```yaml
steps:
...
- uses: art049/mongodb-cluster-action@v0
id: mongodb-cluster-action
with:
mode: sharded
...
```
### Action details
#### Inputs
| Input | Description | Default |
| --------- | ------------------------------------------------------------------------------------------------------------------ | ------------ |
| `version` | Specifies the MongoDB version to use. Available versions can be found [here](https://hub.docker.com/_/mongo/tags). | `latest` |
| `mode` | Specifies the type of cluster to create: either `standalone`, `replicaSet` or `sharded`. | `standalone` |
#### Outputs
| Output | Description |
| ------------------- | -------------------------------------------------------------- |
| `connection-string` | The connection string to use to connect to the MongoDB cluster |
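A test script typically consumes this output through an environment variable, as in the basic example above. A minimal sketch of doing so (pymongo is an assumed test dependency, not something this action installs):
```python
import os

from pymongo import MongoClient  # assumed dependency of the test suite

# MONGO_URI is set from `steps.mongodb-cluster-action.outputs.connection-string`.
client = MongoClient(os.environ["MONGO_URI"])
client.admin.command("ping")  # fails fast if the cluster is unreachable
print(client.server_info()["version"])  # should match the `version` input
```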
## Taskfile Usage
This action can also be used with [taskfile](https://taskfile.dev/).
Here are the available tasks:
- `standalone-docker`: Start a standalone MongoDB instance using a docker container
- `standalone-docker:down`: Stop the standalone instance
- `replica-compose`: Start a replica set MongoDB cluster using docker-compose
- `replica-compose:down`: Stop the replica set cluster
- `sharded-compose`: Start a sharded MongoDB cluster using docker-compose
- `sharded-compose:down`: Stop the sharded MongoDB cluster
### Integration in an existing taskfile
First add this repository as a submodule in your project:
```bash
git submodule add https://github.com/art049/mongodb-cluster-action.git .mongodb-cluster-action
```
Then you can include the taskfile in an existing one by adding the following lines:
```yaml
includes:
mongodb:
taskfile: ./.mongodb-cluster-action/Taskfile.yml
dir: .mongodb-cluster-action
optional: true
```
You can then use the MongoDB cluster tasks by adding the `mongodb` prefix. For example, to start a standalone MongoDB instance:
```bash
task mongodb:standalone-docker
```
## Generated clusters details
### Standalone
Spawn a standalone MongoDB instance.
Server: `localhost:27017`
Connection string: `mongodb://localhost:27017/`
### Replica Set
Spawn a 3-member replica set cluster (1 primary, 2 secondaries).
Servers:
- `172.16.17.11:27017`
- `172.16.17.12:27017`
- `172.16.17.13:27017`
Connection string: `mongodb://172.16.17.11:27017,172.16.17.12:27017,172.16.17.13:27017/?replicaSet=mongodb-action-replica-set`
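To confirm that the members actually formed a replica set, a short check (again assuming pymongo) could look like this:
```python
from pymongo import MongoClient

uri = ("mongodb://172.16.17.11:27017,172.16.17.12:27017,"
       "172.16.17.13:27017/?replicaSet=mongodb-action-replica-set")
client = MongoClient(uri)
hello = client.admin.command("hello")  # use "isMaster" on pre-5.0 servers
print(hello["setName"], hello["primary"], sorted(hello["hosts"]))
```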
### Sharded Cluster
Spawn the simplest possible sharded cluster, with 3 replicated shards.
[](https://mermaid-js.github.io/mermaid-live-editor/edit#pako:eNp1ksFqwzAQRH9F7MmG2HiVUDem9JL01pN99UWO5NgQW0GRAiXk36vGK2jsdg9CDDP7BqQbHLRUUMDRiHPHPst6ZH520U6PbX-sroa9NeadYc5TfEkxTxELnme4jVmSJKyMSu2s-s-Vx9O-8mGusujSCSMzUquMZEG6mK_h2WPNa8xoZsGGgs0iiH8GqQdOMQw9kGTqgYse69BjFiA-LvhrfAoQl092HtZwkonLF9zNjBsCxOUL7uaZ-_uEFQzKDKKX_rlvP1oNtlODqqHwV6la4U62hnq8e6s7S2HVh-ytNlC04nRRKxDO6uprPEBhjVPBtO-F_z0Due7fFniYnw)
Servers:
- Router: `172.16.17.11:27017`
- Configuration server: `172.16.17.11:27019`
- Shard0 servers:
- `172.16.17.20:27018`
- `172.16.17.21:27018`
- Shard1 servers:
- `172.16.17.30:27018`
- `172.16.17.31:27018`
- Shard2 servers:
- `172.16.17.40:27018`
- `172.16.17.41:27018`
Connection string: `mongodb://172.16.17.10:27017/?retryWrites=false`
[Source](https://docs.mongodb.com/manual/core/sharded-cluster-components/#development-configuration)
**Note**: Does not work with Mongo 4.4.2 ([issue](https://jira.mongodb.org/browse/SERVER-53259))
## License
The scripts and documentation in this project are released under the [MIT License](./LICENSE)
/nodeconductor-digitalocean-0.10.0.tar.gz/nodeconductor-digitalocean-0.10.0/src/nodeconductor_digitalocean/migrations/0001_initial.py
from __future__ import unicode_literals
from django.db import migrations, models
import nodeconductor.logging.loggers
import model_utils.fields
import nodeconductor.core.fields
import nodeconductor.structure.models
import nodeconductor.core.models
import django.db.models.deletion
import django.utils.timezone
import taggit.managers
import django_fsm
import nodeconductor.core.validators
class Migration(migrations.Migration):
dependencies = [
('taggit', '0002_auto_20150616_2121'),
('structure', '0037_remove_customer_billing_backend_id'),
]
operations = [
migrations.CreateModel(
name='DigitalOceanService',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=150, verbose_name='name', validators=[nodeconductor.core.validators.validate_name])),
('uuid', nodeconductor.core.fields.UUIDField()),
('available_for_all', models.BooleanField(default=False, help_text='Service will be automatically added to all customers projects if it is available for all')),
('customer', models.ForeignKey(verbose_name='organization', to='structure.Customer')),
],
options={
'abstract': False,
'verbose_name': 'DigitalOcean provider',
'verbose_name_plural': 'DigitalOcean providers',
},
bases=(nodeconductor.core.models.DescendantMixin, nodeconductor.logging.loggers.LoggableMixin, models.Model),
),
migrations.CreateModel(
name='DigitalOceanServiceProjectLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('project', models.ForeignKey(to='structure.Project')),
('service', models.ForeignKey(to='nodeconductor_digitalocean.DigitalOceanService')),
],
options={
'abstract': False,
'verbose_name': 'DigitalOcean provider project link',
'verbose_name_plural': 'DigitalOcean provider project links',
},
bases=(nodeconductor.core.models.DescendantMixin, nodeconductor.logging.loggers.LoggableMixin, models.Model),
),
migrations.CreateModel(
name='Droplet',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', model_utils.fields.AutoCreatedField(default=django.utils.timezone.now, verbose_name='created', editable=False)),
('modified', model_utils.fields.AutoLastModifiedField(default=django.utils.timezone.now, verbose_name='modified', editable=False)),
('description', models.CharField(max_length=500, verbose_name='description', blank=True)),
('name', models.CharField(max_length=150, verbose_name='name', validators=[nodeconductor.core.validators.validate_name])),
('uuid', nodeconductor.core.fields.UUIDField()),
('error_message', models.TextField(blank=True)),
('latitude', models.FloatField(null=True, blank=True)),
('longitude', models.FloatField(null=True, blank=True)),
('runtime_state', models.CharField(max_length=150, verbose_name='runtime state', blank=True)),
('state', django_fsm.FSMIntegerField(default=5, choices=[(5, 'Creation Scheduled'), (6, 'Creating'), (1, 'Update Scheduled'), (2, 'Updating'), (7, 'Deletion Scheduled'), (8, 'Deleting'), (3, 'OK'), (4, 'Erred')])),
('cores', models.PositiveSmallIntegerField(default=0, help_text='Number of cores in a VM')),
('ram', models.PositiveIntegerField(default=0, help_text='Memory size in MiB')),
('disk', models.PositiveIntegerField(default=0, help_text='Disk size in MiB')),
('min_ram', models.PositiveIntegerField(default=0, help_text='Minimum memory size in MiB')),
('min_disk', models.PositiveIntegerField(default=0, help_text='Minimum disk size in MiB')),
('external_ips', models.GenericIPAddressField(null=True, protocol='IPv4', blank=True)),
('internal_ips', models.GenericIPAddressField(null=True, protocol='IPv4', blank=True)),
('image_name', models.CharField(max_length=150, blank=True)),
('key_name', models.CharField(max_length=50, blank=True)),
('key_fingerprint', models.CharField(max_length=47, blank=True)),
('user_data', models.TextField(help_text='Additional data that will be added to instance on provisioning', blank=True)),
('backend_id', models.CharField(max_length=255, blank=True)),
('start_time', models.DateTimeField(null=True, blank=True)),
('transfer', models.PositiveIntegerField(default=0, help_text='Amount of transfer bandwidth in MiB')),
('service_project_link', models.ForeignKey(related_name='droplets', on_delete=django.db.models.deletion.PROTECT, to='nodeconductor_digitalocean.DigitalOceanServiceProjectLink')),
('tags', taggit.managers.TaggableManager(to='taggit.Tag', through='taggit.TaggedItem', blank=True, help_text='A comma-separated list of tags.', verbose_name='Tags')),
],
options={
'abstract': False,
},
bases=(nodeconductor.core.models.DescendantMixin, nodeconductor.logging.loggers.LoggableMixin, models.Model),
),
migrations.CreateModel(
name='Image',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=150, verbose_name='name', validators=[nodeconductor.core.validators.validate_name])),
('uuid', nodeconductor.core.fields.UUIDField()),
('backend_id', models.CharField(unique=True, max_length=255)),
('distribution', models.CharField(max_length=100)),
('type', models.CharField(max_length=100)),
('is_official', models.BooleanField(default=False, help_text='Is image provided by DigitalOcean')),
('min_disk_size', models.PositiveIntegerField(help_text='Minimum disk required for a size to use this image', null=True)),
('created_at', models.DateTimeField(null=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Region',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=150, verbose_name='name', validators=[nodeconductor.core.validators.validate_name])),
('uuid', nodeconductor.core.fields.UUIDField()),
('backend_id', models.CharField(unique=True, max_length=255)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Size',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=150, verbose_name='name', validators=[nodeconductor.core.validators.validate_name])),
('uuid', nodeconductor.core.fields.UUIDField()),
('backend_id', models.CharField(unique=True, max_length=255)),
('cores', models.PositiveSmallIntegerField(help_text='Number of cores in a VM')),
('ram', models.PositiveIntegerField(help_text='Memory size in MiB')),
('disk', models.PositiveIntegerField(help_text='Disk size in MiB')),
('transfer', models.PositiveIntegerField(help_text='Amount of transfer bandwidth in MiB')),
('price', models.DecimalField(default=0, verbose_name='Hourly price rate', max_digits=11, decimal_places=5)),
('regions', models.ManyToManyField(to='nodeconductor_digitalocean.Region')),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='image',
name='regions',
field=models.ManyToManyField(to='nodeconductor_digitalocean.Region'),
),
migrations.AddField(
model_name='digitaloceanservice',
name='projects',
field=models.ManyToManyField(related_name='digitalocean_services', through='nodeconductor_digitalocean.DigitalOceanServiceProjectLink', to='structure.Project'),
),
migrations.AddField(
model_name='digitaloceanservice',
name='settings',
field=models.ForeignKey(to='structure.ServiceSettings'),
),
migrations.AlterUniqueTogether(
name='digitaloceanserviceprojectlink',
unique_together=set([('service', 'project')]),
),
migrations.AlterUniqueTogether(
name='digitaloceanservice',
unique_together=set([('customer', 'settings')]),
),
]
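# --- Applying this migration (illustrative sketch; not part of the file) ---
# The usual route is `python manage.py migrate nodeconductor_digitalocean`.
# Programmatically, assuming a configured Django settings module (the module
# name below is an assumption for illustration):
if __name__ == "__main__":  # never executed when Django imports migrations
    import os

    import django
    from django.core.management import call_command

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings")
    django.setup()
    # Applies 0001_initial together with its 'structure' and 'taggit' deps.
    call_command("migrate", "nodeconductor_digitalocean")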