Change PATH for the "ip addr list" command so it works with cloud-user
This is needed for some custom images, like RHEL7, where /usr/sbin/ is not
on the default PATH for the users under test.
Closes-bug: | @@ -239,7 +239,8 @@ class TrunkTest(base.BaseTempestTestCase):
# Configure VLAN interfaces on server
command = CONFIGURE_VLAN_INTERFACE_COMMANDS % {'tag': vlan_tag}
server['ssh_client'].exec_command(command)
- out = server['ssh_client'].exec_command('ip addr list')
+ out = server['ssh_client'].exec_command(
+ 'PATH=$PATH:/usr/sbin;ip addr list')
LOG.debug("Interfaces on server %s: %s", server, out)
# Ping from server1 to server2 via VLAN interface should fail because
|
add simplification for Ravel._add
This patch promotes ravel through add unconditionally, rather than only
when both terms have matching structure. | @@ -3146,8 +3146,7 @@ class Ravel(Array):
return Ravel(Multiply([self.func, Unravel(other, *self.func.shape[-2:])]))
def _add(self, other):
- if isinstance(other, Ravel) and equalshape(other.func.shape[-2:], self.func.shape[-2:]):
- return Ravel(Add([self.func, other.func]))
+ return Ravel(self.func + Unravel(other, *self.func.shape[-2:]))
def _sum(self, axis):
if axis == self.ndim-1:
|
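The unconditional form is justified by a reshape identity that can be checked in plain numpy; a minimal sketch with illustrative shapes (not nutils code):

```python
# A plain-numpy check of the identity behind the simplification; shapes are
# illustrative, and this is not nutils code.
import numpy as np

A = np.arange(12.0).reshape(2, 3, 2)   # the last two axes get raveled
other = np.arange(6.0)                 # flat operand of length 3*2

lhs = A.reshape(2, 6) + other                    # Ravel(A) + other
rhs = (A + other.reshape(3, 2)).reshape(2, 6)    # Ravel(A + Unravel(other))
assert np.allclose(lhs, rhs)
```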
fix urllib.reverse example
Responding to | @@ -76,7 +76,7 @@ in js
```
var urllib = hqImport('hqwebapp/js/urllib.js');
var widgetId = 'xxxx';
-$.get(urllib.reverse('more_widget_info'), widgetId).done(function () {...});
+$.get(urllib.reverse('more_widget_info', widgetId)).done(function () {...});
```
|
Disable native wayland for snap
The snap is using Qt4 (due to the lack of pyside2 in core18), which is not compatible with Wayland.
Hopefully this should prevent the issue from occurring. | @@ -29,6 +29,8 @@ apps:
syncplay:
command: bin/desktop-launch $SNAP/usr/bin/python3 $SNAP/bin/syncplay
desktop: lib/python3.5/site-packages/syncplay/resources/syncplay.desktop
+ environment:
+ DISABLE_WAYLAND: 1
syncplay-server:
command: bin/syncplay-server
|
dcos-integration-test:test task streaming endpoint
Make sure the master adminrouter can proxy requests to the agent adminrouter to read
the dcos-log streaming endpoint. | @@ -95,6 +95,15 @@ def test_task_logs(dcos_api_session):
check_log_entry('STDOUT_LOG', url + '?filter=STREAM:STDOUT', dcos_api_session)
check_log_entry('STDERR_LOG', url + '?filter=STREAM:STDERR', dcos_api_session)
+ stream_url = get_task_url(dcos_api_session, task_id, stream=True)
+ response = dcos_api_session.get(stream_url, stream=True, headers={'Accept': 'text/event-stream'})
+ check_response_ok(response, {'Content-Type': 'text/event-stream', 'Cache-Control': 'no-cache'})
+ lines = response.iter_lines()
+ sse_id = next(lines)
+ assert sse_id, 'First line must be id. Got {}'.format(sse_id)
+ data = next(lines).decode('utf-8', 'ignore')
+ validate_sse_entry(data)
+
def test_pod_logs(dcos_api_session):
test_uuid = uuid.uuid4().hex
|
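The assertions in this test follow the server-sent-events wire format, where each event is a block of `field: value` lines ending in a blank line. A minimal parsing sketch with made-up payloads, not real dcos-log output:

```python
# A minimal sketch of parsing an SSE stream like the one the test reads;
# the payload lines here are made up, not real dcos-log output.
def parse_sse_event(lines):
    """Collect field/value pairs until the blank line that ends an event."""
    event = {}
    for line in lines:
        if not line:          # blank line terminates the event
            break
        field, _, value = line.partition(": ")
        event[field] = value
    return event

stream = ["id: 42", "data: STDOUT_LOG hello", ""]
event = parse_sse_event(iter(stream))
assert event["id"] == "42" and event["data"].startswith("STDOUT_LOG")
```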
Add Archive.org API + Internet Archive category
Add the Internet Archive API (link to docs) and give it its own category, since it covers many areas. | @@ -34,6 +34,7 @@ Please note a passing build status indicates all listed APIs are available since
* [Geocoding](#geocoding)
* [Government](#government)
* [Health](#health)
+* [Internet Archive](#internet-archive)
* [Jobs](#jobs)
* [Machine Learning](#machine-learning)
* [Music](#music)
@@ -407,6 +408,11 @@ API | Description | Auth | HTTPS | CORS | Link |
| openFDA | Public FDA data about drugs, devices, and foods | No | Yes | Unknown | [Go!](https://open.fda.gov/api/) |
| USDA Nutrients | National Nutrient Database for Standard Reference | No | Yes | Unknown | [Go!](https://ndb.nal.usda.gov/ndb/doc/index) |
+### Internet Archive
+API | Description | Auth | HTTPS | CORS | Link |
+|---|---|---|---|---|---|
+| Archive.org | The Internet Archive | No | Yes | Unknown | [Go!](https://archive.readme.io/docs) |
+
### Jobs
API | Description | Auth | HTTPS | CORS | Link |
|---|---|---|---|---|---|
|
Add Sanic-Plugins-Framework library to Extensions doc
I made a new tool that lets devs easily and quickly create Sanic plugins (extensions), and lets application builders easily use those plugins in their apps. | # Extensions
A list of Sanic extensions created by the community.
-
+- [Sanic-Plugins-Framework](https://github.com/ashleysommer/sanicpluginsframework): Library for easily creating and using Sanic plugins.
- [Sessions](https://github.com/subyraman/sanic_session): Support for sessions.
Allows using redis, memcache or an in memory store.
- [CORS](https://github.com/ashleysommer/sanic-cors): A port of flask-cors.
|
Update API docs link and remove Travis CI mention
Fixes | @@ -72,10 +72,7 @@ your code. You can run only the linting checks by using this command:
The project's configuration instructs tox to test against many different
versions of Python. A tox test will use as many of those as it can find on your
-local computer. Rather than installing all those versions, we recommend that
-you point the `Travis <https://travis-ci.org>`_ continuous integration tool at
-your GitHub fork. Travis will run the test against the full suite of Python
-versions every time you push new code.
+local computer.
Using tox to run tests in multiple environments can be very time
consuming. If you wish to quickly run the tests in your own environment, you
@@ -178,7 +175,7 @@ Developer Resources
-------------------
.. toctree::
- SoftLayer API Documentation <http://developer.softlayer.com/reference/softlayerapi>
+ SoftLayer API Documentation <https://sldn.softlayer.com/reference/softlayerapi/>
Source on GitHub <https://github.com/softlayer/softlayer-python>
Issues <https://github.com/softlayer/softlayer-python/issues>
Pull Requests <https://github.com/softlayer/softlayer-python/pulls>
|
Added support for PCIe TPU, as well as USB
Also added a message showing which one was found | @@ -31,7 +31,12 @@ class ObjectDetector():
def __init__(self):
edge_tpu_delegate = None
try:
- edge_tpu_delegate = load_delegate('libedgetpu.so.1.0')
+ edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', {"device": "usb"})
+ print("USB TPU found")
+ except ValueError:
+ try:
+ edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', {"device": "pci:0"})
+ print("PCIe TPU found")
except ValueError:
print("No EdgeTPU detected. Falling back to CPU.")
|
utils.types.not_implemented_error: fix properties handling
TN: | @@ -182,6 +182,8 @@ def not_implemented_error(self_or_cls, method): # no-code-coverage
:rtype: NotImplementedError
"""
cls = self_or_cls if inspect.isclass(self_or_cls) else type(self_or_cls)
+ if isinstance(method, property):
+ method = method.fget
return NotImplementedError('{} must override method {}'.format(
cls.__name__, method.__name__
))
|
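The fix works because looking a property up on the class returns the property object itself, and the name needed for the error message lives on its getter. A small standalone illustration:

```python
# A standalone illustration of why the fix reaches for property.fget:
# class-level lookup yields the property object, whose getter carries the name.
class Node:
    @property
    def kind(self):
        raise NotImplementedError

method = vars(Node)["kind"]              # class-level lookup returns the property
assert isinstance(method, property)
assert method.fget.__name__ == "kind"    # the name the error message needs
```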
Fix broken unit test test_network_absent
This started failing following a commit which relied on the
'Name' key being present in the return value of docker.networks -
as the mock didn't have this set, the test started failing. | @@ -69,10 +69,14 @@ class DockerNetworkTestCase(TestCase, LoaderModuleMockMixin):
'''
docker_remove_network = Mock(return_value='removed')
docker_disconnect_container_from_network = Mock(return_value='disconnected')
+ docker_networks = Mock(return_value=[{
+ 'Name': 'network_foo',
+ 'Containers': {'container': {}}
+ }])
__salt__ = {
'docker.remove_network': docker_remove_network,
'docker.disconnect_container_from_network': docker_disconnect_container_from_network,
- 'docker.networks': Mock(return_value=[{'Containers': {'container': {}}}]),
+ 'docker.networks': docker_networks,
}
with patch.dict(docker_state.__dict__,
{'__salt__': __salt__}):
|
Fix PIL augs failing on ndarray as batch.images
Some augmenters in `imgaug.augmenters.pillike` failed when
`batch.images` was a single ndarray instead of a list of
arrays. This patch fixes the underlying issues. | @@ -66,6 +66,10 @@ from . import size as sizelib
from .. import parameters as iap
+# TODO some of the augmenters in this module broke on numpy arrays as
+# image inputs (as opposed to lists of arrays) without any test failing
+# add appropriate tests for that
+
_EQUALIZE_USE_PIL_BELOW = 64*64 # H*W
@@ -1380,7 +1384,7 @@ class Equalize(meta.Augmenter):
def _augment_batch_(self, batch, random_state, parents, hooks):
# pylint: disable=no-self-use
- if batch.images:
+ if batch.images is not None:
for image in batch.images:
image[...] = equalize_(image)
return batch
@@ -1483,7 +1487,6 @@ class _EnhanceBase(meta.Augmenter):
return batch
factors = self._draw_samples(len(batch.images), random_state)
- if batch.images:
for image, factor in zip(batch.images, factors):
image[...] = self.func(image, factor)
return batch
@@ -1719,10 +1722,7 @@ class _FilterBase(meta.Augmenter):
self.func = func
def _augment_batch_(self, batch, random_state, parents, hooks):
- if batch.images is None:
- return batch
-
- if batch.images:
+ if batch.images is not None:
for image in batch.images:
image[...] = self.func(image)
return batch
|
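The switch from `if batch.images:` to `if batch.images is not None:` matters because lists and numpy arrays truth-test differently; a quick demonstration:

```python
# Why `if batch.images:` breaks when images is a single ndarray rather than
# a list: truth-testing a multi-element array raises instead of answering.
import numpy as np

images_list = [np.zeros((4, 4, 3), np.uint8)]
if images_list:                      # fine: a non-empty list is truthy
    pass

images_array = np.zeros((1, 4, 4, 3), np.uint8)
try:
    if images_array:                 # ValueError: truth value is ambiguous
        pass
except ValueError:
    pass

if images_array is not None:        # the fixed check works for both inputs
    pass
```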
Connected components check in pageseg.segment()
Re-add the old ocropy connected-components check to skip processing empty pages with lots of
noise. Resolves and | @@ -377,6 +377,11 @@ def segment(im, text_direction: str = 'horizontal-lr',
binary = np.array(a > 0.5*(np.amin(a) + np.amax(a)), 'i')
binary = 1 - binary
+ _, ccs = morph.label(binary)
+ if ccs > np.dot(*im.size)/(30*30):
+ logger.warning(f'Too many connected components for a page image: {ccs}')
+ return {'text_direction': text_direction, 'boxes': []}
+
if not scale:
scale = estimate_scale(binary)
|
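The added guard compares the connected-component count against the page area divided by a 30x30 cell. A sketch of the same heuristic using scipy, assuming `morph.label` behaves like `scipy.ndimage.label`:

```python
# A sketch of the noise heuristic above with scipy, assuming morph.label
# behaves like scipy.ndimage.label; the threshold mirrors the diff.
import numpy as np
from scipy import ndimage

binary = (np.random.rand(900, 600) > 0.99).astype("i")  # salt noise "page"
_, ccs = ndimage.label(binary)           # (labeled_array, component count)

width, height = 600, 900
if ccs > (width * height) / (30 * 30):
    print(f"Too many connected components for a page image: {ccs}")
```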
panels: Fix incorrect frb positioning.
The previous `top_offset` calculation didn't include the height
of the panels which led the calculations to be performed as if
a portion which was hidden behind the searchbox, was visible to
the user.
The new formula is correct, as the frb_top calculation in
panel.resize_app also uses panels_height. | @@ -9,7 +9,7 @@ import * as timerender from "./timerender";
let is_floating_recipient_bar_showing = false;
function top_offset(elem) {
- return elem.offset().top - $("#message_view_header").safeOuterHeight();
+ return elem.offset().top - $("#message_view_header").safeOuterHeight() - $("#panels").height();
}
export function first_visible_message(bar) {
|
Return newly created SignedTransaction when `sign` is invoked.
This is useful for tearing down/rebuilding and passing transactions
around. | @@ -345,6 +345,7 @@ class TransactionBuilder(dict):
signedtx.sign(self.wifs, chain=self.blockchain.rpc.chain_params)
self["signatures"].extend(signedtx.json().get("signatures"))
+ return signedtx
def verify_authority(self):
""" Verify the authority of the signed transaction
|
Removed an except clause which was only there to support Python 2.4 on Linux.
Since we don't support versions of Python before 2.7 anymore, this was not needed. | @@ -234,9 +234,10 @@ pastebufferr = """Redirecting to or from paste buffer requires %s
to be installed on operating system.
%s"""
+# Can we access the clipboard?
+can_clip = False
if sys.platform == "win32":
# Running on Windows
- can_clip = False
try:
import win32clipboard
@@ -265,7 +266,6 @@ if sys.platform == "win32":
write_to_paste_buffer = get_paste_buffer
elif sys.platform == 'darwin':
# Running on Mac OS X
- can_clip = False
try:
# Warning: subprocess.call() and subprocess.check_call() should never be called with stdout=PIPE or stderr=PIPE
# because the child process will block if it generates enough output to a pipe to fill up the OS pipe buffer.
@@ -298,23 +298,10 @@ elif sys.platform == 'darwin':
write_to_paste_buffer = get_paste_buffer
else:
# Running on Linux
- can_clip = False
try:
with open(os.devnull, 'w') as DEVNULL:
subprocess.check_call('xclip -o -sel clip', shell=True, stdin=subprocess.PIPE, stdout=DEVNULL, stderr=DEVNULL)
can_clip = True
- except AttributeError: # check_call not defined, Python < 2.5
- try:
- teststring = 'Testing for presence of xclip.'
- xclipproc = subprocess.Popen('xclip -sel clip', shell=True, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
- xclipproc.stdin.write(teststring)
- xclipproc.stdin.close()
- xclipproc = subprocess.Popen('xclip -o -sel clip', shell=True, stdout=subprocess.PIPE,
- stdin=subprocess.PIPE)
- if xclipproc.stdout.read() == teststring:
- can_clip = True
- except Exception: # hate a bare Exception call, but exception classes vary too much b/t stdlib versions
- pass
except Exception:
pass # something went wrong with xclip and we cannot use it
if can_clip:
@@ -336,7 +323,6 @@ else:
def get_paste_buffer(*args):
raise OSError(pastebufferr % ('xclip', 'On Debian/Ubuntu, install with "sudo apt-get install xclip"'))
-
write_to_paste_buffer = get_paste_buffer
pyparsing.ParserElement.setDefaultWhitespaceChars(' \t')
|
Using Windows line breaks by default.
Resolves | return ClipboardJS.isSupported();
},
textFileLink() {
- const errorBlob = new Blob([this.text], { type: 'text/plain' });
+ const windowsFormattedText = this.text.replace('\n', '\r\n');
+ const errorBlob = new Blob([windowsFormattedText], { type: 'text/plain', endings: 'native' });
if (navigator.msSaveBlob) {
return navigator.msSaveBlob(errorBlob, this.downloadFileName);
}
|
feat: pass additional params to ansible-playbook
Allow the user to pass additional parameters through to the underlying
`ansible-playbook` command, except -h, -l, -p and -n (should they exist).
E.g.: bluebanquise-playbook -p computes -n c[001-002] --tags time | @@ -5,24 +5,32 @@ export ANSIBLE_CONFIG=/etc/bluebanquise
usage() {
echo "Usage:
$(basename $0) -l
- $(basename $0) -p <playbook> [-n <nodeset>]
+ $(basename $0) -p <playbook> [-n <nodeset>] [ansible-playbook parameters]
+
+Runs Ansible playbooks, executing the defined playbook on the targeted nodes.
Options:
+ -h print this help and exit
-l list available playbooks
-p <playbook> run the playbook
-n <nodeset> list of target nodes, accept nodeset(1) patterns
"
}
-while getopts 'n:p:lh' p
+PARAMS=("$@")
+while getopts ':n:p:lh-:' p "${PARAMS[@]}"
do
- case $p in
- n) nodeset=$OPTARG ;;
- p) playbook=$OPTARG ;;
+ case "$p" in
+ n) nodeset="$OPTARG" ;;
+ p) playbook="$OPTARG" ;;
l) list=1 ;;
h) usage ; exit 0 ;;
+ # Handle additional parameters to pass to ansible-playbook
+ -|*) ((OPTIND--)) ; break ;;
esac
done
+# Remaining parameters not parsed by getopts
+PARAMS=( "${PARAMS[@]:$OPTIND-1}" )
if [[ -n $list ]]; then
echo "List of playbooks available:"
@@ -56,5 +64,5 @@ if [[ -n "$nodeset" ]]; then
target="--extra-vars target=$(${nodeset_cmd} -e -S, ${nodeset})"
fi
-echo $ansbook_cmd ${ANSIBLE_CONFIG}/playbooks/${playbook}.yml ${target}
-$ansbook_cmd ${ANSIBLE_CONFIG}/playbooks/${playbook}.yml ${target}
+echo "Execute: $ansbook_cmd ${ANSIBLE_CONFIG}/playbooks/${playbook}.yml ${target} ${PARAMS[@]}"
+$ansbook_cmd ${ANSIBLE_CONFIG}/playbooks/${playbook}.yml ${target} ${PARAMS[@]}
|
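The getopts pattern above (parse the wrapper's own flags, forward the remainder) has a close Python analogue in argparse's `parse_known_args`, sketched here with the flags from the example:

```python
# A Python analogue of the bash passthrough above: parse the wrapper's own
# flags and forward everything else to the underlying command.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-p", dest="playbook")
parser.add_argument("-n", dest="nodeset")

args, extra = parser.parse_known_args(
    ["-p", "computes", "-n", "c[001-002]", "--tags", "time"]
)
assert args.playbook == "computes"
assert extra == ["--tags", "time"]   # would be handed to ansible-playbook
```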
use cast_withscale in function.replace_arguments
This patch generalizes function.replace_arguments to the broader class of
array-like arguments that support Numpy's dispatch protocol. | @@ -2850,7 +2850,8 @@ def replace_arguments(__array: IntoArray, __arguments: Mapping[str, IntoArray])
:class:`Array`
'''
- return _Replace(Array.cast(__array), {k: Array.cast(v) for k, v in __arguments.items()})
+ array, scale = Array.cast_withscale(__array)
+ return _Replace(array, {k: Array.cast(v) for k, v in __arguments.items()}) * scale
def broadcast_arrays(*arrays: IntoArray) -> Tuple[Array, ...]:
|
Fix extension for compiled python files for start.c (.pyo/.pyc)
This is because, as of Python 3.5, the .pyo filename extension is no longer used.
See also: `PEP 488 -- Elimination of PYO files` (https://www.python.org/dev/peps/pep-0488/) | @@ -305,6 +305,11 @@ int main(int argc, char *argv[]) {
/* Get the entrypoint, search the .pyo then .py
*/
char *dot = strrchr(env_entrypoint, '.');
+#if PY_MAJOR_VERSION > 2
+ char *ext = ".pyc";
+#else
+ char *ext = ".pyo";
+#endif
if (dot <= 0) {
LOGP("Invalid entrypoint, abort.");
return -1;
@@ -313,14 +318,14 @@ int main(int argc, char *argv[]) {
LOGP("Entrypoint path is too long, try increasing ENTRYPOINT_MAXLEN.");
return -1;
}
- if (!strcmp(dot, ".pyo")) {
+ if (!strcmp(dot, ext)) {
if (!file_exists(env_entrypoint)) {
/* fallback on .py */
strcpy(entrypoint, env_entrypoint);
entrypoint[strlen(env_entrypoint) - 1] = '\0';
LOGP(entrypoint);
if (!file_exists(entrypoint)) {
- LOGP("Entrypoint not found (.pyo, fallback on .py), abort");
+ LOGP("Entrypoint not found (.pyc/.pyo, fallback on .py), abort");
return -1;
}
} else {
@@ -330,7 +335,11 @@ int main(int argc, char *argv[]) {
/* if .py is passed, check the pyo version first */
strcpy(entrypoint, env_entrypoint);
entrypoint[strlen(env_entrypoint) + 1] = '\0';
+#if PY_MAJOR_VERSION > 2
+ entrypoint[strlen(env_entrypoint)] = 'c';
+#else
entrypoint[strlen(env_entrypoint)] = 'o';
+#endif
if (!file_exists(entrypoint)) {
/* fallback on pure python version */
if (!file_exists(env_entrypoint)) {
@@ -340,7 +349,7 @@ int main(int argc, char *argv[]) {
strcpy(entrypoint, env_entrypoint);
}
} else {
- LOGP("Entrypoint have an invalid extension (must be .py or .pyo), abort.");
+ LOGP("Entrypoint have an invalid extension (must be .py or .pyc/.pyo), abort.");
return -1;
}
// LOGP("Entrypoint is:");
|
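The PEP 488 naming that this patch adapts to can be inspected from Python itself; a short sketch (the exact tag in the output depends on the interpreter running it):

```python
# A quick look at the PEP 488 naming the patch adapts to; the exact tag in
# the output depends on the interpreter running this.
import importlib.util

print(importlib.util.cache_from_source("main.py"))
# e.g. __pycache__/main.cpython-311.pyc  (no .pyo since Python 3.5)
print(importlib.util.cache_from_source("main.py", optimization=2))
# e.g. __pycache__/main.cpython-311.opt-2.pyc
```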
api/pupdevices/UltrasonicSensor: drop silent
Like its EV3 counterpart, this does not work reliably enough, so remove it. | @@ -244,15 +244,10 @@ class UltrasonicSensor:
"""
pass
- def distance(self, silent=False):
+ def distance(self):
"""Measures the distance between the sensor and an object using
ultrasonic sound waves.
- Arguments:
- silent (bool): Choose ``True`` to turn the sensor off after
- measuring the distance. This reduces interference
- with other ultrasonic sensors.
-
Returns:
:ref:`distance`: Distance.
@@ -263,10 +258,6 @@ class UltrasonicSensor:
"""Checks for the presence of other ultrasonic sensors by detecting
ultrasonic sounds.
- If the other ultrasonic sensor is operating in silent mode, you can
- only detect the presence of that sensor while it is taking a
- measurement.
-
Returns:
bool: ``True`` if ultrasonic sounds are detected,
``False`` if not.
|
EmptyArray: replace use of LiteralExpr with CallExpr
TN: | @@ -1616,7 +1616,7 @@ class EmptyArray(AbstractExpression):
@staticmethod
def construct_static(array_type, abstract_expr=None):
- return LiteralExpr('Create (Items_Count => 0)', array_type,
+ return CallExpr('Create', array_type, ['Items_Count => 0'],
result_var_name='Empty_Array',
abstract_expr=abstract_expr)
|
Update README.rst
Update broken banners for Build_Status, Coverage and Version | @@ -212,13 +212,9 @@ Staff at
[email protected] and any specific CLA-related questions
to [email protected].
-.. |Build_Status| image:: https://travis-ci.org/oasis-open/cti-python-
-stix2.svg?branch=master
+.. |Build_Status| image:: https://travis-ci.org/oasis-open/cti-python-stix2.svg?branch=master
:target: https://travis-ci.org/oasis-open/cti-python-stix2
-.. |Coverage| image:: https://codecov.io/gh/oasis-open/cti-python-
-stix2/branch/master/graph/badge.svg
+.. |Coverage| image:: https://codecov.io/gh/oasis-open/cti-python-stix2/branch/master/graph/badge.svg
:target: https://codecov.io/gh/oasis-open/cti-python-stix2
-.. |Version| image:: https://img.shields.io/pypi/v/stix2.svg?maxAge=
-3600
+.. |Version| image:: https://img.shields.io/pypi/v/stix2.svg?maxAge=3600
:target: https://pypi.python.org/pypi/stix2/
-
|
changelog: Make references to "Recent topics" consistent.
Updates the current 6.0 release notes to include information about
the rename to "Recent conversations", and updates past references
to "recent topics" to be consistently formatted as "Recent topics". | @@ -64,9 +64,10 @@ log][commit-log] for an up-to-date list of raw changes.
clearer and link to the Zulip server troubleshooting guide.
- Redesigned the interface for configuring message editing and
deletion permissions to be easier to understand.
-- Improved Recent Topics. The timestamp links now go to the latest
- message in the topic, arrow key navigation was improved, and many
- other bug fixes or subtle improvements.
+- Improved "Recent topics" and renamed to "Recent conversations" with
+ the addition of including private messages in the view. The timestamp
+ links now go to the latest message in the topic, arrow key navigation
+ was improved, and many other bug fixes or subtle improvements.
- Added support for emoji added in unicode versions since 2017, which
had previously been unavailable in Zulip. Users using the deprecated
"Google blobs" emoji set are automatically migrated to the modern
@@ -423,7 +424,7 @@ log][commit-log] for an up-to-date list of raw changes.
sending email notifications after a mention or PM.
- Improved integrations: BigBlueButton, GitHub, Grafana, PagerDuty,
and many more.
-- Improved various interaction and performance details in Recent Topics.
+- Improved various interaction and performance details in "Recent topics".
- Improved styling for poll and todo list widgets.
- Zulip now supports configuring the database name and username when
using a remote Postgres server. Previously, these were hardcoded to "zulip".
@@ -768,7 +769,7 @@ log][commit-log] for an up-to-date list of raw changes.
allowing moderators and above to use the feature.
- Added a native Giphy integration for sending animated GIFs.
- Added support for muting another user.
-- Recent topics is no longer beta, no longer an overlay, supports
+- "Recent topics" is no longer beta, no longer an overlay, supports
composing messages, and is now the default view. The previous
default view, "All messages", is still available, and the default
view can now be configured via "Display settings".
@@ -1049,7 +1050,7 @@ log][commit-log] for an up-to-date list of raw changes.
- Redesigned the top navbar/search area to be much cleaner and show
useful data like subscriber counts and stream descriptions in
default views.
-- Added a new "recent topics" widget, which lets one browse recent
+- Added a new "Recent topics" widget, which lets one browse recent
and ongoing conversations at a glance. We expect this widget to
replace "All messages" as the default view in Zulip in the
next major release.
@@ -2833,7 +2834,7 @@ running a version from before 1.7 should upgrade directly to 1.7.1.
- Added easy configuration support for a remote PostgreSQL database.
- Added extensive documentation on scalability, backups, and security.
- Recent private message threads are now displayed expanded similar to
- the pre-existing recent topics feature.
+ the pre-existing "Recent topics" feature.
- Made it possible to set LDAP and EMAIL_HOST passwords in
/etc/zulip/secrets.conf.
- Improved the styling for the Administration page and added tabs.
|
typing: don't accidentally use typing.Self
We can switch to it once all type checkers support it | import collections  # Needed by aliases like DefaultDict, see mypy issue 2986
import sys
-from _typeshed import Self, SupportsKeysAndGetItem
+from _typeshed import Self as TypeshedSelf, SupportsKeysAndGetItem
from abc import ABCMeta, abstractmethod
from types import BuiltinFunctionType, CodeType, FrameType, FunctionType, MethodType, ModuleType, TracebackType
from typing_extensions import Literal as _Literal, ParamSpec as _ParamSpec, final as _final
@@ -372,7 +372,7 @@ class MutableSequence(Sequence[_T], Generic[_T]):
def reverse(self) -> None: ...
def pop(self, index: int = ...) -> _T: ...
def remove(self, value: _T) -> None: ...
- def __iadd__(self: Self, x: Iterable[_T]) -> Self: ...
+ def __iadd__(self: TypeshedSelf, x: Iterable[_T]) -> TypeshedSelf: ...
class AbstractSet(Collection[_T_co], Generic[_T_co]):
@abstractmethod
@@ -398,10 +398,10 @@ class MutableSet(AbstractSet[_T], Generic[_T]):
def clear(self) -> None: ...
def pop(self) -> _T: ...
def remove(self, value: _T) -> None: ...
- def __ior__(self: Self, s: AbstractSet[_T]) -> Self: ... # type: ignore[override,misc]
- def __iand__(self: Self, s: AbstractSet[Any]) -> Self: ...
- def __ixor__(self: Self, s: AbstractSet[_T]) -> Self: ... # type: ignore[override,misc]
- def __isub__(self: Self, s: AbstractSet[Any]) -> Self: ...
+ def __ior__(self: TypeshedSelf, s: AbstractSet[_T]) -> TypeshedSelf: ... # type: ignore[override,misc]
+ def __iand__(self: TypeshedSelf, s: AbstractSet[Any]) -> TypeshedSelf: ...
+ def __ixor__(self: TypeshedSelf, s: AbstractSet[_T]) -> TypeshedSelf: ... # type: ignore[override,misc]
+ def __isub__(self: TypeshedSelf, s: AbstractSet[Any]) -> TypeshedSelf: ...
class MappingView(Sized):
def __init__(self, mapping: Mapping[Any, Any]) -> None: ... # undocumented
@@ -733,11 +733,11 @@ class NamedTuple(tuple[Any, ...]):
else:
def _asdict(self) -> collections.OrderedDict[str, Any]: ...
- def _replace(self: Self, **kwargs: Any) -> Self: ...
+ def _replace(self: TypeshedSelf, **kwargs: Any) -> TypeshedSelf: ...
# Internal mypy fallback type for all typed dicts (does not exist at runtime)
class _TypedDict(Mapping[str, object], metaclass=ABCMeta):
- def copy(self: Self) -> Self: ...
+ def copy(self: TypeshedSelf) -> TypeshedSelf: ...
# Using NoReturn so that only calls using mypy plugin hook that specialize the signature
# can go through.
def setdefault(self, k: NoReturn, default: object) -> object: ...
@@ -748,8 +748,8 @@ class _TypedDict(Mapping[str, object], metaclass=ABCMeta):
def items(self) -> ItemsView[str, object]: ...
def keys(self) -> KeysView[str]: ...
def values(self) -> ValuesView[object]: ...
- def __or__(self: Self, __value: Self) -> Self: ...
- def __ior__(self: Self, __value: Self) -> Self: ...
+ def __or__(self: TypeshedSelf, __value: TypeshedSelf) -> TypeshedSelf: ...
+ def __ior__(self: TypeshedSelf, __value: TypeshedSelf) -> TypeshedSelf: ...
# This itself is only available during type checking
def type_check_only(func_or_cls: _F) -> _F: ...
|
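Both `_typeshed.Self` and `typing.Self` spell the same self-type pattern, which plain typing code expresses with a bound TypeVar; a minimal sketch:

```python
# A minimal sketch of the self-type pattern both names express: a TypeVar
# bound to the class, so subclasses keep their own type through the method.
from typing import TypeVar

_T = TypeVar("_T", bound="Box")

class Box:
    def tag(self: _T, label: str) -> _T:
        self.label = label
        return self

class ColorBox(Box):
    pass

box: ColorBox = ColorBox().tag("red")   # stays ColorBox, not Box
```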
fix lsr bug
Fix a label smoothing regularization (LSR) bug in image classification: the code read `args.epsilon` instead of `args.label_smoothing_epsilon`. | @@ -41,7 +41,7 @@ def _basic_model(data, model, args, is_train):
if is_train and args.use_label_smoothing:
cost = _calc_label_smoothing_loss(softmax_out, label, args.class_dim,
- args.epsilon)
+ args.label_smoothing_epsilon)
else:
cost = fluid.layers.cross_entropy(input=softmax_out, label=label)
@@ -93,9 +93,9 @@ def _mixup_model(data, model, args, is_train):
loss_b = fluid.layers.cross_entropy(input=softmax_out, label=y_b)
else:
loss_a = _calc_label_smoothing_loss(softmax_out, y_a, args.class_dim,
- args.epsilon)
+ args.label_smoothing_epsilon)
loss_b = _calc_label_smoothing_loss(softmax_out, y_b, args.class_dim,
- args.epsilon)
+ args.label_smoothing_epsilon)
loss_a_mean = fluid.layers.mean(x=loss_a)
loss_b_mean = fluid.layers.mean(x=loss_b)
|
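The `label_smoothing_epsilon` value feeds the standard label-smoothing construction; a plain-numpy sketch of the formula, not the fluid implementation:

```python
# The standard label-smoothing construction behind the epsilon parameter,
# in plain numpy rather than the fluid implementation.
import numpy as np

def smooth_labels(label: int, class_dim: int, epsilon: float) -> np.ndarray:
    one_hot = np.eye(class_dim)[label]
    return one_hot * (1.0 - epsilon) + epsilon / class_dim

smoothed = smooth_labels(label=2, class_dim=4, epsilon=0.1)
assert np.isclose(smoothed.sum(), 1.0)
assert np.isclose(smoothed[2], 0.925)   # 0.9 + 0.1/4
```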
Expose Dec_Ref and Inc_Ref for entities to $.Analysis clients
TN: | @@ -346,6 +346,9 @@ package ${ada_lib_name}.Analysis is
Empty_Env : Lexical_Env renames AST_Envs.Empty_Env;
No_Entity_Info : Entity_Info renames AST_Envs.No_Entity_Info;
+ procedure Inc_Ref (E : Entity) renames AST_Envs.Inc_Ref;
+ procedure Dec_Ref (E : in out Entity) renames AST_Envs.Dec_Ref;
+
## Declare arrays of lexical environments here because we need them for the
## Group operation below.
${array_types.public_incomplete_decl(LexicalEnvType.array_type())}
|
Mock out station_api.ApiServer in test.TestCase
Test.__init__ calls station_api.start_server(), which starts an ApiServer(). | @@ -127,6 +127,7 @@ from openhtf import plugs
from openhtf import util
from openhtf.core import measurements
from openhtf.core import phase_executor
+from openhtf.core import station_api
from openhtf.core import test_record
from openhtf.core import test_state
from openhtf.util import conf
@@ -322,6 +323,11 @@ class TestCase(unittest.TestCase):
raise ValueError(
"%s yields without @openhtf.util.test.yields_phases" % methodName)
+ # Mock the station api server.
+ station_api_server_patcher = mock.patch.object(station_api, 'ApiServer')
+ self.mock_api_server = station_api_server_patcher.start()
+ self.addCleanup(self.mock_api_server.stop)
+
def _AssertPhaseOrTestRecord(func): # pylint: disable=no-self-argument,invalid-name
"""Decorator for automatically invoking self.assertTestPhases when needed.
|
pkg_analysis_body_ada.mako: Rename Child_Number to Child_Index
TN: | @@ -1175,7 +1175,7 @@ package body ${ada_lib_name}.Analysis is
${array_types.body(LexicalEnvType.array)}
${array_types.body(T.root_node.entity.array)}
- function Child_Number
+ function Child_Index
(Node : access ${root_node_value_type}'Class)
return Positive
with Pre => Node.Parent /= null;
@@ -2742,11 +2742,11 @@ package body ${ada_lib_name}.Analysis is
(Node : access ${root_node_value_type}'Class) return Boolean
is (Node = null);
- ------------------
- -- Child_Number --
- ------------------
+ -----------------
+ -- Child_Index --
+ -----------------
- function Child_Number
+ function Child_Index
(Node : access ${root_node_value_type}'Class)
return Positive
is
@@ -2763,7 +2763,7 @@ package body ${ada_lib_name}.Analysis is
-- If we reach this point, then Node isn't a Child of Node.Parent. This
-- is not supposed to happen.
raise Program_Error;
- end Child_Number;
+ end Child_Index;
----------------------
-- Previous_Sibling --
@@ -2773,7 +2773,7 @@ package body ${ada_lib_name}.Analysis is
(Node : access ${root_node_value_type}'Class)
return ${root_node_type_name}
is
- N : constant Positive := Child_Number (Node);
+ N : constant Positive := Child_Index (Node);
begin
return (if N = 1
then null
@@ -2790,7 +2790,7 @@ package body ${ada_lib_name}.Analysis is
is
begin
-- If Node is the last sibling, then Child will return null
- return Node.Parent.Child (Child_Number (Node) + 1);
+ return Node.Parent.Child (Child_Index (Node) + 1);
end Next_Sibling;
## Env metadata's body
|
Register stats from request_success and request_failure
The old update was causing breaking issues. | @@ -71,12 +71,15 @@ class Runner:
self.target_user_count = None
# set up event listeners for recording requests
- def on_request(request_type, name, response_time, response_length, exception, context, **kwargs):
+ def on_request_success(request_type, name, response_time, response_length, **_kwargs):
+ self.stats.log_request(request_type, name, response_time, response_length)
+
+ def on_request_failure(request_type, name, response_time, response_length, exception, **_kwargs):
self.stats.log_request(request_type, name, response_time, response_length)
- if exception:
self.stats.log_error(request_type, name, exception)
- self.environment.events.request.add_listener(on_request)
+ self.environment.events.request_success.add_listener(on_request_success)
+ self.environment.events.request_failure.add_listener(on_request_failure)
self.connection_broken = False
# register listener that resets stats when spawning is complete
|
Adding a unit test with the empty list case in `KeyRange.to_pb()`.
Also reworked the `to_pb()` tests to just create an expected protobuf and
use a single assertion. | @@ -93,31 +93,58 @@ class TestKeyRange(unittest.TestCase):
self.assertEqual(krange.end_closed, None)
def test_to_pb_w_start_closed_and_end_open(self):
+ from google.protobuf.struct_pb2 import ListValue
+ from google.protobuf.struct_pb2 import Value
from google.cloud.spanner_v1.proto.keys_pb2 import KeyRange
- KEY_1 = [u'key_1']
- KEY_2 = [u'key_2']
- krange = self._make_one(start_closed=KEY_1, end_open=KEY_2)
- krange_pb = krange.to_pb()
- self.assertIsInstance(krange_pb, KeyRange)
- self.assertEqual(len(krange_pb.start_closed), 1)
- self.assertEqual(krange_pb.start_closed.values[0].string_value,
- KEY_1[0])
- self.assertEqual(len(krange_pb.end_open), 1)
- self.assertEqual(krange_pb.end_open.values[0].string_value, KEY_2[0])
+ key1 = u'key_1'
+ key2 = u'key_2'
+ key_range = self._make_one(start_closed=[key1], end_open=[key2])
+ key_range_pb = key_range.to_pb()
+ expected = KeyRange(
+ start_closed=ListValue(values=[
+ Value(string_value=key1)
+ ]),
+ end_open=ListValue(values=[
+ Value(string_value=key2)
+ ]),
+ )
+ self.assertEqual(key_range_pb, expected)
def test_to_pb_w_start_open_and_end_closed(self):
+ from google.protobuf.struct_pb2 import ListValue
+ from google.protobuf.struct_pb2 import Value
from google.cloud.spanner_v1.proto.keys_pb2 import KeyRange
- KEY_1 = [u'key_1']
- KEY_2 = [u'key_2']
- krange = self._make_one(start_open=KEY_1, end_closed=KEY_2)
- krange_pb = krange.to_pb()
- self.assertIsInstance(krange_pb, KeyRange)
- self.assertEqual(len(krange_pb.start_open), 1)
- self.assertEqual(krange_pb.start_open.values[0].string_value, KEY_1[0])
- self.assertEqual(len(krange_pb.end_closed), 1)
- self.assertEqual(krange_pb.end_closed.values[0].string_value, KEY_2[0])
+ key1 = u'key_1'
+ key2 = u'key_2'
+ key_range = self._make_one(start_open=[key1], end_closed=[key2])
+ key_range_pb = key_range.to_pb()
+ expected = KeyRange(
+ start_open=ListValue(values=[
+ Value(string_value=key1)
+ ]),
+ end_closed=ListValue(values=[
+ Value(string_value=key2)
+ ]),
+ )
+ self.assertEqual(key_range_pb, expected)
+
+ def test_to_pb_w_empty_list(self):
+ from google.protobuf.struct_pb2 import ListValue
+ from google.protobuf.struct_pb2 import Value
+ from google.cloud.spanner_v1.proto.keys_pb2 import KeyRange
+
+ key = u'key'
+ key_range = self._make_one(start_closed=[], end_closed=[key])
+ key_range_pb = key_range.to_pb()
+ expected = KeyRange(
+ start_closed=ListValue(values=[]),
+ end_closed=ListValue(values=[
+ Value(string_value=key)
+ ]),
+ )
+ self.assertEqual(key_range_pb, expected)
class TestKeySet(unittest.TestCase):
|
Fix file skipping
files_wanted and files_unwanted need to provide indices. Previously, these were providing the File objects themselves from the file_list, which Transmission can't recognize. | @@ -579,7 +579,7 @@ class PluginTransmission(TransmissionBase):
if options['post'].get('main_file_only') and main_id is not None:
# Set Unwanted Files
options['change']['files_unwanted'] = [
- x for x in file_list if x not in dl_list
+ x for x in range(len(file_list)) if x not in dl_list
]
options['change']['files_wanted'] = dl_list
logger.debug(
@@ -600,7 +600,7 @@ class PluginTransmission(TransmissionBase):
else:
options['change']['files_unwanted'] = skip_list
options['change']['files_wanted'] = [
- x for x in file_list if x not in skip_list
+ x for x in range(len(file_list)) if x not in skip_list
]
logger.debug(
'Downloading {} of {} files in torrent.',
|
Avoid DB requests when making health checks
This allows us to do more custom things when the DB is unavailable, such as querying the cache.
This also reduces pressure on the DB. | @@ -91,11 +91,7 @@ def base(request):
def health(request):
- c = Channel.objects.first()
- if c:
- return HttpResponse(c.name)
- else:
- return HttpResponse("No channels created yet!")
+ return HttpResponse("Healthy!")
def stealth(request):
|
faq: update fio command
Via: | @@ -271,14 +271,14 @@ The Direct mode wraps the Write request into the I/O command and sends this comm
- Random Read test:
- ```
- ./fio -ioengine=libaio -bs=32k -direct=1 -thread -rw=randread -size=10G -filename=fio_randread_test.txt -name='PingCAP' -iodepth=4 -runtime=60
+ ```bash
+ ./fio -ioengine=psync -bs=32k -fdatasync=1 -thread -rw=randread -size=10G -filename=fio_randread_test.txt -name='fio randread test' -iodepth=4 -runtime=60 -numjobs=4 -group_reporting --output-format=json --output=fio_randread_result.json"
```
- The mix test of sequential Write and random Read:
- ```
- ./fio -ioengine=libaio -bs=32k -direct=1 -thread -rw=randrw -percentage_random=100,0 -size=10G -filename=fio_randr_write_test.txt -name='PingCAP' -iodepth=4 -runtime=60
+ ```bash
+ ./fio -ioengine=psync -bs=32k -fdatasync=1 -thread -rw=randrw -percentage_random=100,0 -size=10G -filename=fio_randread_write_test.txt -name='fio mixed randread and sequential write test' -iodepth=4 -runtime=60 -numjobs=4 -group_reporting --output-format=json --output=fio_randread_write_test.json"
```
#### Error `UNREACHABLE! "msg": "Failed to connect to the host via ssh: " ` when deploying TiDB using TiDB-Ansible
|
2.0b15-release-notes
Mostly block slugs | # Prefect Release Notes
+## 2.0b15
+
+### Uniquely refer to blocks with slugs
+Blocks are a convenient way to securely store and retrieve configuration. Now, retrieving configuration stored with blocks is even easier with slugs: human- and machine-readable unique identifiers. By default, slugs are a concatenation of [block-type-name]/[block-document-name], but they are editable. Slugs and block document names may only include alphanumeric characters and dashes.
+
+**Warning**: This breaking change makes this release incompatible with previous versions of the Orion server and Prefect Cloud 2.0
+
+### Other improvements and bug fixes
+- The new GCS FileSystem Block enables you to read and write data as a file on Google Cloud Storage
+
## 2.0b14
### Retrieve the state of your tasks or flows with the `return_state` kwarg
|
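A hypothetical sketch of the slug scheme described in these notes, a `[block-type-name]/[block-document-name]` concatenation with names restricted to alphanumerics and dashes; this is illustrative code, not the actual Prefect implementation:

```python
# Illustrative sketch of the slug scheme from the notes above; this is not
# the actual Prefect implementation.
import re

VALID_NAME = re.compile(r"^[a-zA-Z0-9-]+$")

def block_slug(block_type_name: str, block_document_name: str) -> str:
    for name in (block_type_name, block_document_name):
        if not VALID_NAME.match(name):
            raise ValueError(f"invalid block name: {name!r}")
    return f"{block_type_name}/{block_document_name}"

assert block_slug("gcs", "prod-bucket") == "gcs/prod-bucket"
```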
Add test case for calling c10 ops from pytorch
Summary: Pull Request resolved: | @@ -9968,6 +9968,16 @@ tensor([[[1., 1., 1., ..., 1., 1., 1.],
do_test(torch.tensor([[1, 2]]).data)
do_test(torch.tensor([[1, 2]]).detach())
+ def test_c10_layer_norm(self):
+ # test that we can call c10 ops and they return a reasonable result
+ X = torch.rand(5, 5, dtype=torch.float)
+ epsilon = 1e-4
+
+ expected_norm = torch.nn.functional.layer_norm(X, X.size()[1:], eps=epsilon)
+ actual_norm, actual_mean, actual_stdev = \
+ torch.ops.caffe2.layer_norm_dont_use_this_op_yet(torch.tensor(X), 1, epsilon)
+ torch.testing.assert_allclose(expected_norm, actual_norm)
+
# Functions to test negative dimension wrapping
METHOD = 1
INPLACE_METHOD = 2
|
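The comparison in this test works because both implementations compute per-row standardization with an epsilon inside the square root; a numpy sketch of that computation:

```python
# What both implementations in the test compute: per-row standardization
# with an epsilon inside the square root (numpy sketch).
import numpy as np

def layer_norm(x: np.ndarray, eps: float = 1e-4) -> np.ndarray:
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.rand(5, 5).astype(np.float32)
out = layer_norm(x)
assert np.allclose(out.mean(axis=-1), 0.0, atol=1e-5)
```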
TST: updated param tests
Updated the param tests by using the new test function and improving docstrings. | @@ -12,9 +12,10 @@ import pytest
import shutil
import tempfile
-import pysat # required for reimporting pysat
-from pysat._params import Parameters # required for eval statements
+import pysat # Required for reimporting pysat
+from pysat._params import Parameters # Required for eval statements
from pysat.tests.classes.cls_ci import CICleanSetup
+from pysat.utils import testing
class TestBasics(object):
@@ -65,14 +66,20 @@ class TestBasics(object):
assert pysat.params['data_dirs'] == check
return
- @pytest.mark.parametrize("path",
- ['no_path',
- 'not_a_directory'])
+ @pytest.mark.parametrize("path", ['no_path', 'not_a_directory'])
def test_set_data_dir_bad_directory(self, path):
- """Ensure you can't set data_dirs to a bad path."""
- with pytest.raises(ValueError) as excinfo:
+ """Ensure you can't set data_dirs to a bad path.
+
+ Parameters
+ ----------
+ path : str
+ Bad path to a directory
+
+ """
+ with pytest.raises(ValueError) as verr:
pysat.params['data_dirs'] = path
- assert str(excinfo.value).find("Invalid path") >= 0
+
+ assert str(verr).find("Invalid path") >= 0
return
def test_repr(self):
@@ -175,9 +182,10 @@ class TestBasics(object):
def test_bad_path_instantiation(self):
"""Ensure you can't use bad path when loading Parameters."""
- with pytest.raises(OSError) as excinfo:
- Parameters(path='./made_up_name')
- assert str(excinfo.value).find("Supplied path does not exist") >= 0
+ testing.eval_bad_input(Parameters, OSError,
+ "Supplied path does not exist",
+ input_kwargs={"path": './made_up_name'})
+
return
@@ -204,9 +212,8 @@ class TestCIonly(CICleanSetup):
os.path.join(self.root, 'pysat_settings_moved.json'))
# Ensure we can't create a parameters file without valid .json
- with pytest.raises(OSError) as err:
- Parameters()
- assert str(err).find('pysat is unable to locate a user settings') >= 0
+ testing.eval_bad_input(Parameters, OSError,
+ 'pysat is unable to locate a user settings')
shutil.move(os.path.join(self.root, 'pysat_settings_moved.json'),
os.path.join(self.root, 'pysat_settings.json'))
|
CI: let pip handle attrs implicitly from databroker
The conda version is too old. | @@ -54,7 +54,7 @@ before_install:
install:
- export GIT_FULL_HASH=`git rev-parse HEAD`
- conda create -n testenv python=$TRAVIS_PYTHON_VERSION scipy matplotlib numpy h5py -c conda-forge -c defaults --override-channels
- - conda install -n testenv nose jsonschema traitlets pytest coverage pip databroker ophyd historydict boltons doct pyepics super_state_machine xray-vision lmfit jinja2 icu pyzmq mongoquery dill attrs -c lightsource2 -c conda-forge -c soft-matter
+ - conda install -n testenv nose jsonschema traitlets pytest coverage pip databroker ophyd historydict boltons doct pyepics super_state_machine xray-vision lmfit jinja2 icu pyzmq mongoquery dill -c lightsource2 -c conda-forge -c soft-matter
- source activate testenv
- 'pip install https://github.com/NSLS-II/event-model/zipball/master#egg=event_model'
- conda remove metadatastore databroker filestore
|
Update __main__.py
Added back checking for an `--age` argument for `sos purge`. Was dropped in | @@ -1738,7 +1738,7 @@ def cmd_purge(args, workflow_args):
# from .monitor import summarizeExecution
env.verbosity = args.verbosity
try:
- if not (args.tasks or args.all or args.status or args.tags):
+ if not (args.tasks or args.all or args.status or args.tags or args.age):
raise ValueError(
"Please specify either IDs of tasks or one or more of options --all, --age, --status, or --tags."
)
|
Update README.md
Updating the CORS value for restcountries.com | @@ -669,7 +669,7 @@ API | Description | Auth | HTTPS | CORS |
| [positionstack](https://positionstack.com/) | Forward & Reverse Batch Geocoding REST API | `apiKey` | Yes | Unknown |
| [PostcodeData.nl](http://api.postcodedata.nl/v1/postcode/?postcode=1211EP&streetnumber=60&ref=domeinnaam.nl&type=json) | Provide geolocation data based on postcode for Dutch addresses | No | No | Unknown |
| [Postcodes.io](https://postcodes.io) | Postcode lookup & Geolocation for the UK | No | Yes | Yes |
-| [REST Countries](https://restcountries.com) | Get information about countries via a RESTful API | No | Yes | Unknown |
+| [REST Countries](https://restcountries.com) | Get information about countries via a RESTful API | No | Yes | Yes |
| [RoadGoat Cities](https://www.roadgoat.com/business/cities-api) | Cities content & photos API | `apiKey` | Yes | No |
| [SpotSense](https://www.spotsense.io) | Add location based interactions to your mobile app | `apiKey` | Yes | Unknown |
| [Uebermaps](https://uebermaps.com/api/v2) | Discover and share maps with friends | `apiKey` | Yes | Unknown |
|
Fix typo in Linux installation docs
wihsh -> wish | @@ -40,7 +40,7 @@ Preparation
this by invoking at least one of :bash:`pdflatex --version`, :bash:`xelatex --version`, and
:bash:`lualatex --version` in a terminal.
-3. Optional: If you whish to have syntax highlighting and some other :ref:`nice features <usage-gui-config>`
+3. Optional: If you wish to have syntax highlighting and some other :ref:`nice features <usage-gui-config>`
enabled in the |TexText|-Gui install GTKSourceView:
.. code-block:: bash
|
Fixed tutorial 11 bug
Fixed the accidental deletion of "}," in the code that had broken the tutorial | "cell_type": "markdown",
"metadata": {
"collapsed": true
+ },
"source": [
"This tutorial walks through how to add traffic lights to experiments. This tutorial will use the following files:\n",
"\n",
|
Apply suggestions from code review
Thanks, David! | @@ -204,7 +204,7 @@ PageObjects Library
===================
The **PageObjects** library provides support for page objects,
-robotframework-style. Even though robot is a keyword driven framework,
+Robot Framework-style. Even though robot is a keyword-driven framework,
we've implemented a way to dynamically load in keywords that are
unique to a page or an object on the page.
@@ -213,11 +213,11 @@ objects. Each class has keywords that are unique to a page or a
component. These classes can be imported on demand only for tests
which use these pages or components.
-The pageobject Decorator
+The ``pageobject`` Decorator
------------------------
-Page objects are normal python classes which use the :code:`pageobject`
-decorator provided by cumulusci. Unlike traditional robot framework
+Page objects are normal Python classes which use the :code:`pageobject`
+decorator provided by CumulusCI. Unlike traditional Robot Framework
keyword libraries, you may define multiple sets of keywords in a
single file.
@@ -249,14 +249,14 @@ Importing the library
The **PageObjects** library is somewhat unique in that it is not only a
keyword library, but also the mechanism by which you can import files
which contain page object classes. This is done by providing the paths
-to one or more python files which implement page objects. You may also
+to one or more Python files which implement page objects. You may also
import **PageObjects** without passing any files to it in order to take
advantage of some general purpose page objects.
For example, consider the case where you've created two files that
each have one or more page object definitions. For example, lets say
-in robot/MyProject/resources you have the files PageObjects.py and
-MorePageObjects.py. You can import these page objects into a test
+in ``robot/MyProject/resources`` you have the files ``PageObjects.py`` and
+``MorePageObjects.py``. You can import these page objects into a test
suite like so:
.. code-block:: robotframework
@@ -336,7 +336,7 @@ Example: :code:`Load page object Listing Contact`
This will load the page object for the given **page_type** and
**object_name_**. It is useful when you want to use the keywords from a
-page object without first navigating to that page (ie: when you are
+page object without first navigating to that page (i.e. when you are
already on the page and don't want to navigate away).
@@ -350,9 +350,9 @@ keywords for a page that does not have its own page object, the
For example, if you use :code:`Current page should be Home Event` and
there is no page object by that name, a generic :code:`Home` page object
-will be loaded, and it's object name will be set to :code:`Event`.
+will be loaded, and its object name will be set to :code:`Event`.
-For example, lets say your project has created a custom object named
+For example, let's say your project has created a custom object named
**Island**. You don't have a home page, but the object does have a
standard listing page. Without creating any page objects, this test
should work by using generic implementations of the Home and Listing
|
CTypes: Make all operations target C type aware.
* The real gains, of course, will be in the source type awareness; this is
only for using it in conditions. | @@ -192,9 +192,14 @@ def getOperationCode(to_name, operator, arg_names, in_place, emit, context):
context.addCleanupTempName(to_name)
else:
+ if to_name.c_type != "PyObject *":
+ value_name = context.allocateTempName("op_%s_res" % operator.lower())
+ else:
+ value_name = to_name
+
emit(
"%s = %s( %s );" % (
- to_name,
+ value_name,
helper,
", ".join(
str(arg_name)
@@ -205,11 +210,19 @@ def getOperationCode(to_name, operator, arg_names, in_place, emit, context):
)
getErrorExitCode(
- check_name = to_name,
+ check_name = value_name,
release_names = arg_names,
emit = emit,
context = context
)
+ if value_name is not to_name:
+ to_name.getCType().emitAssignConversionCode(
+ to_name = to_name,
+ value_name = value_name,
+ emit = emit,
+ context = context
+ )
+ else:
if ref_count:
context.addCleanupTempName(to_name)
|
chore: remove numpy related deprecations
NumPy recently released a minor version, v1.24.0, which broke our tests, primarily
due to the usage of `np.bool`, which has been removed in favour of the Python builtin `bool`. | @@ -192,7 +192,7 @@ def structurewise_uncertainty(fname_lst, fname_hard, fname_unc_vox, fname_out):
if i_mc_label > 0:
data_tmp[mc_dict["mc_labeled"][i_mc][i_class] == i_mc_label] = 1.
- data_class_obj_mc.append(data_tmp.astype(np.bool))
+ data_class_obj_mc.append(data_tmp.astype(bool))
# COMPUTE IoU
# Init intersection and union
|
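For reference, the failure mode this commit fixes and its replacement, sketched under the assumption of NumPy >= 1.24:

```python
# The failure the commit fixes: on NumPy >= 1.24 the np.bool alias is gone,
# so casting must use the builtin bool (or np.bool_) instead.
import numpy as np

data = np.array([0.0, 1.0, 2.0])
mask = data.astype(bool)          # works on every NumPy version
# data.astype(np.bool)            # AttributeError on NumPy >= 1.24
assert mask.tolist() == [False, True, True]
```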
Pendulum trust region parameter tuning
reduced network size
quiet=True for trpo | @@ -51,7 +51,7 @@ def experiment(alg, env_id, horizon, gamma, n_epochs, n_steps, n_steps_per_fit,
'params': {'lr': 3e-4}},
loss=F.mse_loss,
n_features=64,
- batch_size=64,
+ batch_size=32,
input_shape=mdp.info.observation_space.shape,
output_shape=(1,))
@@ -118,7 +118,7 @@ if __name__ == '__main__':
n_epochs_cg=100,
cg_damping=1e-2,
cg_residual_tol=1e-10,
- quiet=False)
+ quiet=True)
algs_params = [
(TRPO, 'trpo', trpo_params),
|
I'm a dumbass.
This typo breaks the installer | @@ -186,7 +186,7 @@ function setup_config_files() {
cp --no-clobber "/opt/arm/setup/.abcde.conf" "/etc/.abcde.conf"
chown arm:arm "/etc/.abcde.conf"
# link to the new install location so runui.py doesn't break
- sudo -u arm ln -sf /etc/.abdce.conf /etc/arm/config/abcde.conf
+ sudo -u arm ln -sf /etc/.abcde.conf /etc/arm/config/abcde.conf
if [[ $port_flag ]]; then
echo -e "${RED}Non-default port specified, updating arm config...${NC}"
|
Added feature occurrence and avg score vs feature occurrence
Removed a print statement | @@ -762,7 +762,6 @@ class BaseSplitter(ms.BaseCrossValidator):
# if i.__class__.__name__ == 'EnsembleModelFeatureSelector':
# plot_ensemble_feature_graphs = True
dirs = [d for d in os.listdir(savepath) if 'EnsembleModelFeatureSelector' in d]
- print(dirs)
# Plot feature_occurence curve and average score against occurence if selector is EnsembleModelFeatureSelector
# if plot_ensemble_feature_graphs and dirs != None:
if dirs != None:
|
adapt to spyder-3.1.3+
spyder.exe is now spyder3.exe | @@ -939,18 +939,30 @@ if exist "%WINPYDIR%\scripts\idlex.pyw" (
self.create_batch_script('spyder.bat',r"""@echo off
call "%~dp0env_for_icons.bat"
cd/D "%WINPYWORKDIR%"
+if exist "%WINPYDIR%\scripts\spyder3.exe" (
+ "%WINPYDIR%\scripts\spyder3.exe" %*
+) else (
"%WINPYDIR%\scripts\spyder.exe" %*
+)
""")
self.create_batch_script('winspyder.bat',r"""@echo off
call "%~dp0env_for_icons.bat"
cd/D "%WINPYWORKDIR%"
+if exist "%WINPYDIR%\scripts\spyder3.exe" (
+ "%WINPYDIR%\scripts\spyder3.exe" %*
+) else (
"%WINPYDIR%\scripts\spyder.exe" %*
+)
""")
self.create_batch_script('spyder_reset.bat',r"""@echo off
call "%~dp0env_for_icons.bat"
cd/D "%WINPYWORKDIR%"
+if exist "%WINPYDIR%\scripts\spyder3.exe" (
+ "%WINPYDIR%\scripts\spyder3.exe" --reset %*
+) else (
"%WINPYDIR%\scripts\spyder.exe" --reset %*
+)
""")
self.create_batch_script('ipython_notebook.bat',r"""@echo off
|
[Fix] Fix bug in the installation of `mmsegmentation` in Dockerfile
* pip install mmsegmentation
Change from mmseg to mmsegmentation.
* Update Dockerfile | @@ -15,7 +15,7 @@ RUN apt-get update && apt-get install -y ffmpeg libsm6 libxext6 git ninja-build
# Install MMCV, MMDetection and MMSegmentation
RUN pip install mmcv-full==latest+torch1.6.0+cu101 -f https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html
RUN pip install mmdet==2.11.0
-RUN pip install mmseg
+RUN pip install mmsegmentation==0.13.0
# Install MMDetection3D
RUN conda clean --all
|
Fix incorrect mock argument ordering in certain test functions in test_platformops.py
SIM:
CR: | @@ -280,7 +280,12 @@ class TestPlatformOperations(unittest.TestCase):
@mock.patch('ebcli.operations.platformops.io')
@mock.patch('ebcli.operations.platformops.elasticbeanstalk')
@mock.patch('ebcli.operations.platformops.commonops')
- def test_delete_no_environments(self, mock_io, mock_elasticbeanstalk, mock_commonops):
+ def test_delete_no_environments(
+ self,
+ mock_commonops,
+ mock_elasticbeanstalk,
+ mock_io
+ ):
platformops._version_to_arn = Mock(return_value=self.platform_arn)
mock_elasticbeanstalk.get_environments.return_value = []
mock_elasticbeanstalk.delete_platform.return_value = { 'ResponseMetadata': { 'RequestId': 'request-id' } }
@@ -293,7 +298,12 @@ class TestPlatformOperations(unittest.TestCase):
@mock.patch('ebcli.operations.platformops.io')
@mock.patch('ebcli.operations.platformops.elasticbeanstalk')
@mock.patch('ebcli.operations.platformops.commonops')
- def test_delete_with_environments(self, mock_io, mock_elasticbeanstalk, mock_commonops):
+ def test_delete_with_environments(
+ self,
+ mock_commonops,
+ mock_elasticbeanstalk,
+ mock_io
+ ):
platformops._version_to_arn = Mock(return_value=self.platform_arn)
environments = [
Environment(name='env1', platform=PlatformVersion(self.platform_arn)),
@@ -308,8 +318,6 @@ class TestPlatformOperations(unittest.TestCase):
mock_elasticbeanstalk.get_environments.assert_called_with()
-
-
@mock.patch('ebcli.lib.elasticbeanstalk.list_platform_versions')
def test_list_custom_platform_versions__filtered_by_owner_name(self, mock_list_platform_versions):
mock_list_platform_versions.return_value = self.custom_platforms_list
@@ -352,7 +360,6 @@ class TestPlatformOperations(unittest.TestCase):
custom_platforms
)
-
@mock.patch('ebcli.lib.elasticbeanstalk.list_platform_versions')
def test_list_eb_managed_platform_versions(self, mock_list_platform_versions):
mock_list_platform_versions.return_value = self.eb_managed_platforms_list
|
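The reordering follows from how stacked `@mock.patch` decorators are applied: bottom-up, so the decorator nearest the function supplies the first mock argument. A self-contained demonstration:

```python
# Why the argument order had to change: stacked @mock.patch decorators apply
# bottom-up, so the innermost patch supplies the first mock argument.
from unittest import mock

@mock.patch("os.getcwd")    # outermost  -> last argument
@mock.patch("os.listdir")   # innermost  -> first argument
def check(mock_listdir, mock_getcwd):
    import os
    os.listdir(".")
    mock_listdir.assert_called_once_with(".")
    assert not mock_getcwd.called

check()
```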
[basePen] Add addVarComponent() to DecomposingPen
Also change AbstractPen's. | @@ -143,7 +143,8 @@ class AbstractPen:
and the 'location' argument must be a dictionary mapping axis tags
to their locations.
"""
- raise NotImplementedError
+ # GlyphSet decomposes for us
+ raise AttributeError
class NullPen(AbstractPen):
@@ -222,6 +223,10 @@ class DecomposingPen(LoggingPen):
tPen = TransformPen(self, transformation)
glyph.draw(tPen)
+ def addVarComponent(self, glyphName, transformation, location):
+ # GlyphSet decomposes for us
+ raise AttributeError
+
class BasePen(DecomposingPen):
|
Add SymbolTableFactory to symtable stub
This is not a documented API, but AFAIR the typeshed policy is now that stubs should preferably reflect reality rather than only the documented API.
See | @@ -47,3 +47,8 @@ class Symbol(object):
def is_namespace(self) -> bool: ...
def get_namespaces(self) -> Sequence[SymbolTable]: ...
def get_namespace(self) -> SymbolTable: ...
+
+class SymbolTableFactory(object):
+ def __init__(self) -> None: ...
+ def new(self, table: Any, filename: str) -> SymbolTable: ...
+ def __call__(self, table: Any, filename: str) -> SymbolTable: ...
|
Update README.md
Fix typo issue.
I'm not sure about this one, but I don't understand the code if it's not like this. Please check before merging. | @@ -233,7 +233,7 @@ An enhanced example of the previous bot just puts two of the three things into a
for entrez_id, ensembl in raw_data.items():
# data type object
entrez_gene_id = wdi_core.WDString(value=entrez_id, prop_nr='P351')
- ensembl_transcript_id = wdi_core.WDString(value='entrez_id_string', prop_nr='P704')
+ ensembl_transcript_id = wdi_core.WDString(value=ensembl, prop_nr='P704')
# data goes into a list, because many data objects can be provided to
data = [entrez_gene_id, ensembl_transcript_id]
@@ -279,7 +279,7 @@ The full example:
for entrez_id, ensembl in raw_data.items():
# data type object
entrez_gene_id = wdi_core.WDString(value=entrez_id, prop_nr='P351')
- ensembl_transcript_id = wdi_core.WDString(value='entrez_id_string', prop_nr='P704')
+ ensembl_transcript_id = wdi_core.WDString(value=ensembl, prop_nr='P704')
# data goes into a list, because many data objects can be provided to
data = [entrez_gene_id, ensembl_transcript_id]
|
fix: fix test_fl_sms in test_notebooks.py
Currently it could fail with os.chdir, since changing the directory doesn't alter import paths; it only changes the directory used for opening files.
See original PR: | @@ -124,9 +124,10 @@ def test_fl_with_trainconfig(isolated_filesystem, start_remote_server_worker_onl
@pytest.mark.skip
def test_fl_sms(isolated_filesystem): # pragma: no cover
sys.path.append("advanced/Federated SMS Spam prediction/")
- os.chdir("advanced/Federated SMS Spam prediction/")
import preprocess
+ os.chdir("advanced/Federated SMS Spam prediction/")
+
notebook = "Federated SMS Spam prediction.ipynb"
p_name = Path("examples/tutorials/advanced/Federated SMS Spam prediction/")
not_excluded_notebooks.remove(p_name / notebook)
|
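A compact sketch of the ordering the fix relies on: run imports before `os.chdir()`, since an imported module stays cached in `sys.modules`, while relative file paths always follow the current directory (the `preprocess` import is the repo-specific module from the diff, shown commented out):

```python
# The ordering the fix relies on: perform imports before os.chdir(), because
# an imported module is cached in sys.modules, while relative file paths
# always resolve against the current directory.
import os
import sys
import tempfile

sys.path.append("advanced/Federated SMS Spam prediction/")  # path from the diff
# import preprocess   # must run before chdir, while the relative entry resolves

os.chdir(tempfile.gettempdir())   # relative open() calls now land here
assert "sys" in sys.modules       # already-imported modules stay cached
```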
Fix PGPKey.decrypt misusing message.issuers instead of encrypters.
- We only want to see if our key/subkey fingerprint is in some of
the PKESKs; we don't care if it's in the signatures. | @@ -2219,9 +2219,9 @@ class PGPKey(Armorable, ParentRef, PGPObject):
warnings.warn("This message is not encrypted", stacklevel=2)
return message
- if self.fingerprint.keyid not in message.issuers:
+ if self.fingerprint.keyid not in message.encrypters:
sks = set(self.subkeys)
- mis = set(message.issuers)
+ mis = set(message.encrypters)
if sks & mis:
skid = list(sks & mis)[0]
warnings.warn("Message was encrypted with this key's subkey: {:s}. "
|
Plugins: Fix, the webengine process wasn't found on Windows
* It seems that the code was never ported to PySide and it's
unclear which PyQt ever worked with it. | @@ -177,7 +177,6 @@ import %(binding_name)s.QtCore
),
),
(
- # TODO: Expose this as an option to add it.
"translations_path",
applyBindingName(
"""\
@@ -200,6 +199,10 @@ import %(binding_name)s.QtCore
"""Does it include the Nuitka patch, i.e. is a self-built one with it applied."""
return self._getQtInformation().nuitka_patch_level
+ def _getTranslationsPath(self):
+ """Get the path to the Qt translations."""
+ return self._getQtInformation().translations_path
+
def getQtPluginDirs(self):
if self.qt_plugins_dirs is not None:
return self.qt_plugins_dirs
@@ -676,7 +679,7 @@ if not path.startswith(__nuitka_binary_dir):
plugin_parent = os.path.dirname(self.getQtPluginDirs()[0])
if isWin32Windows():
- bin_dir = os.path.join(plugin_parent, "bin")
+ bin_dir = plugin_parent
else: # TODO verify this for non-Windows!
bin_dir = os.path.join(plugin_parent, "libexec")
target_bin_dir = os.path.join(dist_dir)
@@ -689,16 +692,16 @@ if not path.startswith(__nuitka_binary_dir):
for f in os.listdir(resources_dir):
shutil.copy(os.path.join(resources_dir, f), target_resources_dir)
- translations_dir = os.path.join(plugin_parent, "translations")
- pos = len(translations_dir) + 1
- target_translations_dir = os.path.join(
+ translations_path = self._getTranslationsPath()
+ pos = len(translations_path) + 1
+ translations_path = os.path.join(
dist_dir,
full_name.getTopLevelPackageName().asPath(),
"Qt",
"translations",
)
- for f in getFileList(translations_dir):
- tar_f = os.path.join(target_translations_dir, f[pos:])
+ for f in getFileList(translations_path):
+ tar_f = os.path.join(translations_path, f[pos:])
makePath(os.path.dirname(tar_f))
shutil.copyfile(f, tar_f)
|
vray device aspect ratio fix
Vray has a different attribute name for the device aspect ratio than other renderers.
(.aspectRatio instead of .deviceAspectRatio)
Testing
open/create a maya scene with vray renderer set
run OpenPype -> Set Resolution
resolution should be set without an error message about nonexistent vray attribute | @@ -2141,7 +2141,7 @@ def set_scene_resolution(width, height, pixelAspect):
cmds.setAttr("%s.height" % control_node, height)
deviceAspectRatio = ((float(width) / float(height)) * float(pixelAspect))
- cmds.setAttr("%s.deviceAspectRatio" % control_node, deviceAspectRatio)
+ cmds.setAttr("%s.aspectRatio" % control_node, deviceAspectRatio)
cmds.setAttr("%s.pixelAspect" % control_node, pixelAspect)
|
Make it harder to do potentially foolish things during release
Summary: Switch from y/N, Y/n, to N/! -- make em hit shift.
Test Plan: N/A
Reviewers: nate, alangenfeld | @@ -378,11 +378,11 @@ def check_new_version(new_version):
should_continue = input(
'You appear to be releasing a new version, {new_version}, without having '
'previously run a prerelease.\n(Last version found was {previous_version})\n'
- 'Are you sure you know what you\'re doing? (Y/n)'.format(
+ 'Are you sure you know what you\'re doing? (N/!)'.format(
new_version=new_version, previous_version=last_version['__version__']
)
)
- if not should_continue == 'Y':
+ if not should_continue == '!':
raise Exception('Bailing! Run a pre-release before continuing.')
return True
@@ -424,11 +424,11 @@ def check_for_cruft(autoclean):
'Found potentially crufty directories:\n'
' {found_cruft}\n'
'***We strongly recommend releasing from a fresh git clone!***\n'
- 'Automatically remove these directories and continue? (y/N)'.format(
+ 'Automatically remove these directories and continue? (N/!)'.format(
found_cruft='\n '.join(found_cruft)
)
)
- if wipeout == 'y' or wipeout == 'Y':
+ if wipeout == '!':
for cruft_dir in found_cruft:
subprocess.check_output(['rm', '-rfv', cruft_dir])
else:
@@ -451,11 +451,11 @@ def check_for_cruft(autoclean):
wipeout = input(
'Found {n_files} .pyc files.\n'
'We strongly recommend releasing from a fresh git clone!\n'
- 'Automatically remove these files and continue? (y/N)'.format(
+ 'Automatically remove these files and continue? (N/!)'.format(
n_files=len(found_pyc_files)
)
)
- if wipeout == 'y' or wipeout == 'Y':
+ if wipeout == '!':
for file_ in found_pyc_files:
os.unlink(file_)
else:
|
emoji_picker: Move click handler out from global scope.
In this commit we are moving the .emoji-popover-emoji.reaction
click handler to register_click_handlers() so as to have parity
with the rest of the code design. | @@ -301,26 +301,6 @@ function maybe_select_emoji(e) {
}
}
-$(document).on('click', '.emoji-popover-emoji.reaction', function () {
- // When an emoji is clicked in the popover,
- // if the user has reacted to this message with this emoji
- // the reaction is removed
- // otherwise, the reaction is added
- var emoji_name = this.title;
- var message_id = $(this).parent().parent().attr('data-message-id');
-
- var message = message_store.get(message_id);
- if (!message) {
- blueslip.error('reactions: Bad message id: ' + message_id);
- return;
- }
-
- if (reactions.current_user_has_reacted_to_emoji(message, emoji_name)) {
- $(this).removeClass('reacted');
- }
- reactions.toggle_emoji_reaction(message_id, emoji_name);
-});
-
exports.toggle_selected_emoji = function () {
// Toggle the currently selected emoji.
var message_id = current_msg_list.selected_id();
@@ -501,6 +481,26 @@ exports.emoji_select_tab = function (elt) {
exports.register_click_handlers = function () {
+ $(document).on('click', '.emoji-popover-emoji.reaction', function () {
+ // When an emoji is clicked in the popover,
+ // if the user has reacted to this message with this emoji
+ // the reaction is removed
+ // otherwise, the reaction is added
+ var emoji_name = this.title;
+ var message_id = $(this).parent().parent().attr('data-message-id');
+
+ var message = message_store.get(message_id);
+ if (!message) {
+ blueslip.error('reactions: Bad message id: ' + message_id);
+ return;
+ }
+
+ if (reactions.current_user_has_reacted_to_emoji(message, emoji_name)) {
+ $(this).removeClass('reacted');
+ }
+ reactions.toggle_emoji_reaction(message_id, emoji_name);
+ });
+
$(document).on('click', '.emoji-popover-emoji.composition', function (e) {
var emoji_text = ':' + this.title + ':';
var textarea = $("#new_message_content");
|
validation: remove extraneous check for ClassDefaults.variable
ClassDefaults.variable is always set by default | @@ -891,18 +891,6 @@ class Component(object):
# Used by run to store return value of execute
self.results = []
- # ENFORCE REQUIRED CLASS DEFAULTS
-
- # All subclasses must implement self.ClassDefaults.variable
- # Do this here, as _validate_variable might be overridden by subclass
- try:
- if self.ClassDefaults.variable is NotImplemented:
- raise ComponentError("self.ClassDefaults.variable for {} must be assigned a value or \'None\'".
- format(self.componentName))
- except AttributeError:
- raise ComponentError("self.ClassDefaults.variable must be defined for {} or its base class".
- format(self.componentName))
-
# CHECK FOR REQUIRED PARAMS
# All subclasses must implement, in their paramClassDefaults, params of types specified in
|
Update README for pgAdmin
Add instructions for developers to use pgAdmin | @@ -143,6 +143,16 @@ To lint the code base ::
tox -e lint
+pgAdmin
+-------------------
+
+If you want to interact with the Postgres database from a GUI:
+
+ 1. Copy the `pgadmin_servers.json.example` into a `pgadmin_servers.json` file.
+ 2. `docker-compose up` causes pgAdmin to run on http://localhost:8432
+
+Side note: The `pgadmin_servers.json` file uses [pgadmin servers.json syntax](https://www.pgadmin.org/docs/pgadmin4/development/import_export_servers.html#json-format)
+
Contributing
=============
|
Problem: older backends are no longer supported
Solution: when running the command `bigchaindb configure`, configure for
`localmongodb` only. | @@ -271,7 +271,10 @@ def create_parser():
help='Prepare the config file '
'and create the node keypair')
config_parser.add_argument('backend',
- choices=['rethinkdb', 'mongodb', 'localmongodb'],
+ choices=['localmongodb'],
+ default='localmongodb',
+ const='localmongodb',
+ nargs='?',
help='The backend to use. It can be either '
'rethinkdb or mongodb.')
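The mechanism here is plain `argparse`: a positional argument with `nargs='?'` becomes optional and falls back to `default` when omitted (`const` is only consulted for optional flags, so `default` does the work for this positional). A standalone sketch:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('backend',
                    choices=['localmongodb'],
                    default='localmongodb',
                    nargs='?',   # makes the positional optional
                    help='The backend to use.')

print(parser.parse_args([]))                # Namespace(backend='localmongodb')
print(parser.parse_args(['localmongodb']))  # an explicit value is still accepted
```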
|
Added `feedback_required`
Added `feedback_required` handling in the `comment` and `reply_to_comment` functions. | @@ -17,7 +17,11 @@ def comment(self, media_id, comment_text):
return True
if not self.reached_limit('comments'):
self.delay('comment')
- if self.api.comment(media_id, comment_text):
+ _r = self.api.comment(media_id, comment_text)
+ if _r == 'feedback_required':
+ self.logger.error("`Comment` action has been BLOCKED...!!!")
+ return False
+ if _r:
self.total['comments'] += 1
return True
else:
@@ -37,7 +41,11 @@ def reply_to_comment(self, media_id, comment_text, parent_comment_id):
if comment_text.split(' ')[0][1:] == self.get_username_from_user_id(self.user_id):
self.logger.error("You can't reply to yourself")
return False
- if self.api.reply_to_comment(media_id, comment_text, parent_comment_id):
+ _r = self.api.reply_to_comment(media_id, comment_text, parent_comment_id)
+ if _r == 'feedback_required':
+ self.logger.error("`Comment` action has been BLOCKED...!!!")
+ return False
+ if _r:
self.total['comments'] += 1
return True
else:
|
Adjust botorch css to match Ax css for Sphinx docs
Summary: Pull Request resolved: | @@ -220,6 +220,46 @@ div.body {
max-width: 900px;
}
+table {
+ overflow: hidden;
+}
+
+dl {
+ margin-bottom: 15px;
+ }
+
+dl.class > dt {
+ background-color: #f8f8f8;
+ border-left: 3px solid #F15A24;
+ padding: 2px 0px 2px 5px;
+}
+
+dl.class > dt > code {
+ background: none;
+}
+
+dl.class > dt > em.property {
+ color: #F15A24;
+ font-style: normal;
+ font-variant: small-caps;
+}
+
+dl.class em.property {
+ color: #F15A24;
+ font-style: normal;
+ font-variant: small-caps;
+}
+
+dl.function > dt {
+ background-color: #f8f8f8;
+ border-left: 3px solid #F15A24;
+ padding: 2px 0px 2px 5px;
+}
+
+dl.function > dt > code {
+ background: none;
+}
+
dd {
padding-top: 10px;
padding-bottom: 5px;
|
Allow the wrapper and environment to be optional.
This gives the option of using the environment provided at the worker. | @@ -255,17 +255,20 @@ def wqex_create_task(itemid, item, wrapper, env_file, command_path, infile_funct
infile_item = os.path.join(tmpdir, 'item_{}.p'.format(itemid))
outfile = os.path.join(tmpdir, 'output_{}.p'.format(itemid))
- coffea_command = 'python {} {} {} {}'.format(basename(command_path), basename(infile_function), basename(infile_item), basename(outfile))
- wrapped_command = './{}'.format(basename(wrapper))
- wrapped_command += ' --environment {}'.format(basename(env_file))
- wrapped_command += ' --unpack-to "$WORK_QUEUE_SANDBOX"/{}-env {}'.format(basename(env_file), coffea_command)
+ # Base command just invokes python on the function and data.
+ command = 'python {} {} {} {}'.format(basename(command_path), basename(infile_function), basename(infile_item), basename(outfile))
- task = wq.Task(wrapped_command)
- task.specify_category('default')
+ # If wrapper and env provided, add that.
+ if wrapper and env_file:
+ command = './{} --environment {} --unpack-to "$WORK_QUEUE_SANDBOX"/{}-env {}'.format(basename(wrapper),basename(env_file),basename(env_file),command)
+ task = wq.Task(command)
+ task.specify_category('default')
task.specify_input_file(command_path, cache=True)
task.specify_input_file(infile_function, cache=False)
task.specify_input_file(infile_item, cache=False)
+
+ if wrapper and env_file:
task.specify_input_file(env_file, cache=True)
task.specify_input_file(wrapper, cache=True)
@@ -426,12 +429,7 @@ def work_queue_executor(items, function, accumulator, **kwargs):
if _wq_queue is None or queue_mode == 'one-per-stage':
_wq_queue = wq.WorkQueue(port, name=master_name, debug_log=debug_log, stats_log=stats_log, transactions_log=trans_log)
- if not env_file:
- raise TypeError("environment-file argument missing. It should name a conda environment as a tar file.")
- elif not os.path.exists(env_file):
- raise ValueError("environment-file does not name an existing conda environment as a tar file.")
-
- if not wrapper:
+ if env_file and not wrapper:
raise ValueError("Location of python_package_run could not be determined automatically.\nUse 'wrapper' argument to the work_queue_executor.")
# If explicit resources are given, collect them into default_resources
|
Update README.txt
fix link to tensorboard tutorial | @@ -3,7 +3,7 @@ Intermediate tutorials
1. tensorboard_tutorial.py
Classifying Names with a Character-Level RNN
- https://pytorch.org/tutorials/beginner/tensorboard_tutorial.html
+ https://pytorch.org/tutorials/intermediate/tensorboard_tutorial.html
2. char_rnn_classification_tutorial.py
Classifying Names with a Character-Level RNN
|
fix(automl): fix typo in code example for AutoML Tables
Typo in AutoML Tables code example at | @@ -56,7 +56,7 @@ class TablesClient(object):
>>> from google.oauth2 import service_account
>>>
>>> client = automl_v1beta1.TablesClient(
- ... credentials=service_account.Credentials.from_service_account_file('~/.gcp/account.json')
+ ... credentials=service_account.Credentials.from_service_account_file('~/.gcp/account.json'),
... project='my-project', region='us-central1')
...
|
Charinfo: up char limit and reduce line limit
Pagination means more characters can be supported without cluttering
anything. It also means infinite lines, so there's no longer a need to
squeeze out the most from a single page. Reducing the line limit leads
to a smaller, tidier presentation. | @@ -119,7 +119,7 @@ class Utils(Cog):
@command()
@in_whitelist(channels=(Channels.bot_commands,), roles=STAFF_ROLES)
async def charinfo(self, ctx: Context, *, characters: str) -> None:
- """Shows you information on up to 25 unicode characters."""
+ """Shows you information on up to 50 unicode characters."""
match = re.match(r"<(a?):(\w+):(\d+)>", characters)
if match:
return await messages.send_denial(
@@ -129,7 +129,7 @@ class Utils(Cog):
"was found. Please remove it and try again."
)
- if len(characters) > 25:
+ if len(characters) > 50:
return await messages.send_denial(ctx, f"Too many characters ({len(characters)}/25)")
def get_info(char: str) -> Tuple[str, str]:
@@ -147,10 +147,10 @@ class Utils(Cog):
embed = Embed().set_author(name="Character Info")
if len(characters) > 1:
- # Maximum length possible is 252 so no need to truncate.
+ # Maximum length possible is 502 out of 1024, so there's no need to truncate.
embed.add_field(name='Full Raw Text', value=f"`{''.join(raw_list)}`", inline=False)
- await LinePaginator.paginate(char_list, ctx, embed, max_size=2000, empty=False)
+ await LinePaginator.paginate(char_list, ctx, embed, max_lines=10, max_size=2000, empty=False)
@command()
async def zen(self, ctx: Context, *, search_value: Union[int, str, None] = None) -> None:
|
fix xfails involving literals
Summary:
I missed these in
cc apaszke jamesr66a zdevito
Pull Request resolved: | @@ -2289,21 +2289,6 @@ a")
y2 = torch.sum(x, dim=0)
self.assertEqual(y, y2)
- # TODO: renable when we support passing literals to script fns
- @unittest.expectedFailure
- def test_literal_xfail(self):
- def func4(a, b):
- c = 0, (0, 0)
- x = True
- while x:
- x = False
- c = a, (a, b)
- d, e = c
- f, g = e
- return d + f + g
-
- self.checkScript(func4, (a, b), optimize=True)
-
def test_literal(self):
def func1(a, b):
c = a, b
@@ -2316,10 +2301,22 @@ a")
f, g = e
return d + f + g
+ def func3(a, b):
+ # type: (float, float) -> float
+ c = 0., (0., 0.)
+ x = True
+ while x:
+ x = False
+ c = a, (a, b)
+ d, e = c
+ f, g = e
+ return d + f + g
+
a = torch.rand(1, requires_grad=True)
b = torch.rand(1, requires_grad=True)
self.checkScript(func1, (a, b), optimize=True)
self.checkScript(func2, (a, b), optimize=True)
+ self.checkScript(func3, (a.item(), b.item()), optimize=True)
def test_expand(self):
@torch.jit.script
@@ -4246,8 +4243,6 @@ a")
with self.assertRaisesRegex(RuntimeError, 'called recursively involving'):
M()
- # TODO: Use this when we support passing literals to script fns
- @unittest.expectedFailure
def test_script_kwargs_fn_call(self):
class M(torch.jit.ScriptModule):
def __init__(self):
@@ -4259,6 +4254,7 @@ a")
@torch.jit.script_method
def foo(self, bar, input):
+ # type: (int, Tensor) -> Tensor
return input + bar
m = M()
self.assertEqual(2, m.call_foo(torch.ones((), dtype=torch.int64)))
|
Fix compiling of MultipleSubstFormat1 with zero out glyphs
Splitting an empty string — `''.split(',')` — returns `['']`, whereas we expect `[]`.
Fix that. | @@ -356,7 +356,7 @@ class MultipleSubst(FormatSwitchingBaseTable):
return
# TTX v3.1 and later.
- outGlyphs = attrs["out"].split(",")
+ outGlyphs = attrs["out"].split(",") if attrs["out"] else []
mapping[attrs["in"]] = [g.strip() for g in outGlyphs]
@staticmethod
|
Update Minnesota.md
Closes
Closes | @@ -848,3 +848,34 @@ geolocation: 44.9342249, -93.2624022
* https://twitter.com/929_julian/status/1337531637026971649
+
+### Police shove woman carrying pizza | 2021-06-04
+
+In a protest in response to the killing of Winston Smith, police violently shove a woman carrying pizza. One of the people who recorded the incident said "they [police] cracked her head open" and others at the scene allege that the womans head was bleeding.
+
+tags: push, shove, protester
+
+id: mn-minneapolis-40
+
+geolocation: 44.948353, -93.298301
+
+**Links**
+
+* https://twitter.com/IguanaEatMePls/status/1400985943369191425
+* https://twitter.com/Isa_teric/status/1400964064961040387
+* https://twitter.com/coloring_book/status/1401001240130207744
+
+
+### Police tackle woman carrying water | 2021-06-15
+
+Following a protest near the site of Deona Marie's death, a protester is seen grabbing a plastic water vessel and walking away, when a police officer charges her and tackles her.
+
+tags: push, shove, tackle, protester
+
+id: mn-minneapolis-41
+
+geolocation: 44.9480255,-93.2957499
+
+**Links**
+
+* https://www.reddit.com/r/2020PoliceBrutality/comments/o0xkdq/minneapolis_cop_pushes_woman_to_the_ground/
|
Implementation of allclose
allclose tunneled from torch.allclose
implementations in operations and tensor.py
implemented unit-tests | @@ -143,6 +143,18 @@ class TestOperations(unittest.TestCase):
with self.assertRaises(TypeError):
ht.ones(array_len).all(axis='bad_axis_type')
+ def test_allclose(self):
+ a = ht.float32([[2, 2], [2, 2]])
+ b = ht.float32([[2.00005, 2.00005], [2.00005, 2.00005]])
+
+ self.assertFalse(ht.allclose(a, b))
+ self.assertTrue(ht.allclose(a, b, atol = 1e-04))
+ self.assertTrue(ht.allclose(a,b, rtol = 1e-04))
+
+ with self.assertRaises(TypeError):
+ ht.allclose(a, (2,2,2,2))
+
+
def test_argmin(self):
data = ht.float32([
[1, 2, 3],
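For reference, `torch.allclose` — which the new `ht.allclose` defers to — treats tensors as close when `|a - b| <= atol + rtol * |b|` holds elementwise, which is why loosening either tolerance makes the test pairs above pass. A sketch of the same check in plain torch:

```python
import torch

a = torch.tensor([2.0, 2.0])
b = torch.tensor([2.00005, 2.00005])

assert not torch.allclose(a, b)          # defaults rtol=1e-05, atol=1e-08 reject
assert torch.allclose(a, b, atol=1e-04)  # looser absolute tolerance accepts
assert torch.allclose(a, b, rtol=1e-04)  # looser relative tolerance accepts
```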
|
Update doc about output_differentiability keyword in derivatives.yaml
Summary: Pull Request resolved:
Test Plan: Imported from OSS | # same length as the number of outputs from the forward function. The list
# should contain only booleans, specifying whether each of the output Tensor
# is differentiable.
+# If it is not specified for a function that returns multiple elements but
+# uses `grad` instead of `grads[idx]`, then all but the first output will
+# be marked as non-differentiable.
# If None of the output is differentiable, you can also add the function
# name to `gen_variable_type.py`'s `DONT_REQUIRE_DERIVATIVE` list.
#
|
improve prng compile times with loop rolling
cf. | @@ -123,34 +123,29 @@ def threefry_2x32(keypair, count):
else:
x = list(np.split(count.ravel(), 2))
- rotations = onp.uint32([13, 15, 26, 6, 17, 29, 16, 24])
+ rotations = asarray([13, 15, 26, 6, 17, 29, 16, 24], dtype="uint32")
ks = [key1, key2, key1 ^ key2 ^ onp.uint32(0x1BD11BDA)]
x[0] = x[0] + ks[0]
x[1] = x[1] + ks[1]
- for r in rotations[:4]:
- x = apply_round(x, r)
+ x = lax.fori_loop(0, 4, lambda i, x: apply_round(x, rotations[i]), x)
x[0] = x[0] + ks[1]
x[1] = x[1] + ks[2] + onp.uint32(1)
- for r in rotations[4:]:
- x = apply_round(x, r)
+ x = lax.fori_loop(4, 8, lambda i, x: apply_round(x, rotations[i]), x)
x[0] = x[0] + ks[2]
x[1] = x[1] + ks[0] + onp.uint32(2)
- for r in rotations[:4]:
- x = apply_round(x, r)
+ x = lax.fori_loop(0, 4, lambda i, x: apply_round(x, rotations[i]), x)
x[0] = x[0] + ks[0]
x[1] = x[1] + ks[1] + onp.uint32(3)
- for r in rotations[4:]:
- x = apply_round(x, r)
+ x = lax.fori_loop(4, 8, lambda i, x: apply_round(x, rotations[i]), x)
x[0] = x[0] + ks[1]
x[1] = x[1] + ks[2] + onp.uint32(4)
- for r in rotations[:4]:
- x = apply_round(x, r)
+ x = lax.fori_loop(0, 4, lambda i, x: apply_round(x, rotations[i]), x)
x[0] = x[0] + ks[2]
x[1] = x[1] + ks[0] + onp.uint32(5)
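The compile-time win comes from tracing one loop body instead of four inlined copies: `lax.fori_loop` hands XLA an actual loop. The same transformation in isolation (a toy round function, not the real threefry one):

```python
import jax.numpy as jnp
from jax import lax

def apply_round(x, r):
    # stand-in for the real mixing round
    return [x[0] + r, x[1] * 2]

rotations = jnp.array([13, 15, 26, 6], dtype=jnp.uint32)
x = [jnp.array(1, dtype=jnp.uint32), jnp.array(1, dtype=jnp.uint32)]

# Unrolled: the tracer emits four copies of the body.
for r in [13, 15, 26, 6]:
    x = apply_round(x, r)

# Rolled: one body is traced, so compilation is faster.
x = lax.fori_loop(0, 4, lambda i, x: apply_round(x, rotations[i]), x)
```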
|
Clarify part of the docs
The comment on Content-Length is unclear and better removed (it meant
the content-length would be calculated by Quart and emitted). | @@ -19,8 +19,7 @@ str
return render_template("index.html")
A solitary string return indicates that you intend to return a string
-mimetype ``text/html`` and a specified Content-Length header. The
-string will be encoded using the default
+mimetype ``text/html``. The string will be encoded using the default
:attr:`~quart.wrappers._BaseRequestResponse.charset`.
dict
|
Bump HDF5 used for MPI CI
It appears 1.10.4 and lower use deprecated MPI APIs, which have now been
removed in the latest releases. | @@ -117,7 +117,7 @@ jobs:
py37-deps-hdf51103-mpi:
python.version: '3.7'
TOXENV: py37-test-mindeps-mpi4py
- HDF5_VERSION: 1.10.3
+ HDF5_VERSION: 1.10.5
HDF5_DIR: $(HDF5_CACHE_DIR)/$(HDF5_VERSION)
HDF5_MPI: ON
CC: mpicc
|
purge-iscsi-gateways: don't run all ceph-facts
We only need to have the container_binary fact. Because we're not
gathering the facts from all nodes, the purge fails trying to get
one of the grafana facts.
Closes: | block:
- import_role:
name: ceph-facts
+ tasks_from: container_binary
+
+ - name: set_fact container_exec_cmd
+ set_fact:
+ container_exec_cmd: "{{ container_binary }} exec ceph-mon-{{ ansible_hostname }}"
+ when: containerized_deployment | bool
- name: get iscsi gateway list
command: "{{ container_exec_cmd | default('') }} ceph --cluster {{ cluster }} dashboard iscsi-gateway-list -f json"
|
Fix empty strings crashing Namespace for float options
This feels like a Discord bug to me but it's causing issues | @@ -142,7 +142,8 @@ class Namespace:
self.__dict__[name] = value
elif opt_type == 10: # number
value = option['value'] # type: ignore # Key is there
- if value is None:
+ # This condition is written this way because 0 can be a valid float
+ if value is None or value == '':
self.__dict__[name] = float('nan')
else:
self.__dict__[name] = float(value)
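The crash being guarded against is easy to reproduce: `float()` accepts `'0'` but raises on an empty string, and a plain truthiness test would be wrong because `0` is a valid value — hence the explicit `value == ''` comparison:

```python
assert float('0') == 0.0    # numeric strings parse fine

try:
    value = float('')       # the payload Discord occasionally sends
except ValueError:
    value = float('nan')    # the fallback the Namespace code uses

# Note both '' and 0 are falsy, but only '' should trigger the fallback,
# so `if value is None or value == '':` is used instead of `if not value:`.
```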
|
jenkins.bash: Don't delete test results on the second loop run.
Also, the spec repo clone and docker pull need to run only once. | @@ -20,12 +20,13 @@ venv() {
source "$1"/bin/activate
}
-# Test for Python 2.7 and Python 3
-for PYTHON_VERSION in 2 3
-do
git clean --force -d -x || /bin/true
cloneorpull common-workflow-language https://github.com/common-workflow-language/common-workflow-language.git
docker pull node:slim
+
+# Test for Python 2.7 and Python 3
+for PYTHON_VERSION in 2 3
+do
venv cwltool-venv
export PIP_DOWNLOAD_CACHE=/var/lib/jenkins/pypi-cache/
pip install -U setuptools wheel pip
|
MAINT: Remove useless custom tp_alloc and tp_free on ndarray
array_alloc is equivalent to the default object allocator
PyType_GenericAlloc. array_free was added "just in case" in but
doesn't seem to serve any actual purpose. | @@ -1705,22 +1705,6 @@ array_iter(PyArrayObject *arr)
return PySeqIter_New((PyObject *)arr);
}
-static PyObject *
-array_alloc(PyTypeObject *type, Py_ssize_t NPY_UNUSED(nitems))
-{
- /* nitems will always be 0 */
- PyObject *obj = PyObject_Malloc(type->tp_basicsize);
- PyObject_Init(obj, type);
- return obj;
-}
-
-static void
-array_free(PyObject * v)
-{
- /* avoid same deallocator as PyBaseObject, see gentype_free */
- PyObject_Free(v);
-}
-
NPY_NO_EXPORT PyTypeObject PyArray_Type = {
PyVarObject_HEAD_INIT(NULL, 0)
@@ -1741,7 +1725,5 @@ NPY_NO_EXPORT PyTypeObject PyArray_Type = {
.tp_iter = (getiterfunc)array_iter,
.tp_methods = array_methods,
.tp_getset = array_getsetlist,
- .tp_alloc = (allocfunc)array_alloc,
.tp_new = (newfunc)array_new,
- .tp_free = (freefunc)array_free,
};
|
Correct Shutdown stop.
Should properly finalize in a way that will kill the thread. | @@ -1282,7 +1282,8 @@ class LhystudioController(Module):
context.register("control/Resume", resume_k40)
def finalize(self, *args, **kwargs):
- pass
+ if self._thread is not None:
+ self.write(b'\x18\n')
def __repr__(self):
return "LhystudioController()"
@@ -1467,7 +1468,6 @@ class LhystudioController(Module):
self.context._buffer_size = len(self._realtime_buffer) + len(self._buffer) + len(self._queue)
self.context.signal('pipe;buffer', self.context._buffer_size)
-
def update_packet(self, packet):
if self.context is not None:
self.context.signal('pipe;packet', convert_to_list_bytes(packet))
@@ -1482,6 +1482,7 @@ class LhystudioController(Module):
self._main_lock.acquire(True)
self.count = 0
self.pre_ok = False
+ self.is_shutdown = False
while self.state != STATE_END and self.state != STATE_TERMINATE:
if self.state == STATE_INITIALIZE:
# If we are initialized. Change that to active since we're running.
@@ -1531,11 +1532,11 @@ class LhystudioController(Module):
time.sleep(0.02 * self.count)
# will tick up to 1 second waits if there's never a queue.
self.count += 1
- self._main_lock.release()
self._thread = None
- self.update_state(STATE_END)
self.is_shutdown = False
+ self.update_state(STATE_END)
self.pre_ok = False
+ self._main_lock.release()
def process_queue(self):
"""
|
Simplify .travis.yml conda install
Use conda-forge channel. | @@ -30,16 +30,13 @@ before_install:
fi
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- - hash -r
- - conda config --set always_yes yes --set changeps1 no
- - conda update -q conda
- # Useful for debugging any issues with conda
- - conda info -a
+ - conda update --yes conda
+ - conda config --add channels conda-forge
# Install packages
install:
# Use openblas to fix a theano error (hacky); see http://stackoverflow.com/questions/11987325/theano-fails-due-to-numpy-fortran-mixup-under-ubuntu
- - conda create --name conda-env-python$TRAVIS_PYTHON_VERSION --yes python=$TRAVIS_PYTHON_VERSION openblas numpy scipy nose libgfortran six wheel
+ - conda create --name conda-env-python$TRAVIS_PYTHON_VERSION --yes python=$TRAVIS_PYTHON_VERSION numpy scipy nose libgfortran six wheel
- source activate conda-env-python$TRAVIS_PYTHON_VERSION
- pip install pep8
- pip install coveralls
|
Take use_inherit_rotation and inherit_scale into account.
This implementation should also fix an issue where static poses don't use Bone Constraints. | @@ -43,7 +43,14 @@ def gather_joint(blender_object, blender_bone, export_settings):
else:
correction_matrix_local = gltf2_blender_math.multiply(
blender_bone.parent.bone.matrix_local.inverted(), blender_bone.bone.matrix_local)
- matrix_basis = blender_bone.matrix_basis
+
+ if (blender_bone.bone.use_inherit_rotation == False or blender_bone.bone.inherit_scale != "FULL") and blender_bone.parent != None:
+ rest_mat = (blender_bone.parent.bone.matrix_local.inverted_safe() @ blender_bone.bone.matrix_local)
+ matrix_basis = (rest_mat.inverted_safe() @ blender_bone.parent.matrix.inverted_safe() @ blender_bone.matrix)
+ else:
+ matrix_basis = blender_bone.matrix
+ matrix_basis = blender_object.convert_space(pose_bone=blender_bone, matrix=matrix_basis, from_space='POSE', to_space='LOCAL')
+
trans, rot, sca = gltf2_blender_extract.decompose_transition(
gltf2_blender_math.multiply(correction_matrix_local, matrix_basis), export_settings)
translation, rotation, scale = (None, None, None)
|
Update `MAINTAINERS.md` for new maintainers.
While we have this file, we should keep it updated. If we were to remove it though, we'd want to link directly to
[ci skip-rust]
[ci skip-build-wheels] | Active Maintainers
==================
+* Alexey Tereshenkov
* Andreas Stenius
* Benjy Weinberger
+* Carina C. Zona
+* Christopher Neugebauer
* Daniel McClanahan
* Daniel Wagner-Hall
* Eric Arellano
@@ -11,6 +14,7 @@ Active Maintainers
* Ity Kaul
* Patrick Lawson
* John Sirois
+* Joshua Cannon
* Kris Wilson
* Nora Howard
* Stu Hood
|
Retry getting credentials from Boto3
It may make network requests that can fail.
Also, we want a good error message if it fails, and not a complaint about operating on None.
Fixes hopefully | @@ -538,9 +538,19 @@ def _monkey_patch_boto():
# We get a Credentials object
# <https://github.com/boto/botocore/blob/8d3ea0e61473fba43774eb3c74e1b22995ee7370/botocore/credentials.py#L227>
- # or a RefreshableCredentials
+ # or a RefreshableCredentials, or None on failure.
+ creds = None
+ for attempt in retry(timeout=10, predicate=true):
+ with attempt:
creds = self._boto3_resolver.load_credentials()
+ if creds is None:
+ try:
+ resolvers = str(self._boto3_resolver.providers)
+ except:
+ resolvers = "(Resolvers unavailable)"
+ raise RuntimeError("Could not obtain AWS credentials from Boto3. Resolvers tried: " + resolvers)
+
# Make sure the credentials actually has some credentials if it is lazy
creds.get_frozen_credentials()
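The diff uses toil's own `retry` helper; the general shape of the pattern — retry a flaky call for a bounded time, then fail with a descriptive error instead of letting `None` propagate — looks roughly like this (a generic sketch, not toil's actual API):

```python
import time

def load_credentials_with_retry(resolver, timeout=10.0, delay=1.0):
    deadline = time.monotonic() + timeout
    creds = resolver.load_credentials()      # may make failing network requests
    while creds is None and time.monotonic() < deadline:
        time.sleep(delay)
        creds = resolver.load_credentials()
    if creds is None:
        # Fail loudly with context instead of crashing later on None.
        raise RuntimeError("Could not obtain AWS credentials from Boto3. "
                           "Resolvers tried: %s" % resolver.providers)
    return creds
```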
|
NodeSetEditor : Add `floating` argument to `acquire()` method.
This provides control over whether or not the acquired editor will be embedded in the main layout or in a floating window. | @@ -103,15 +103,14 @@ class NodeSetEditor( GafferUI.EditorWidget ) :
return result
## Ensures that the specified node has a visible editor of this class type editing
- # it, creating one if necessary.
- ## \todo User preferences for whether these are made floating, embedded, whether
- # they are reused etc. This class should provide the behaviour, but the code for
- # linking it to preferences should be in a startup file.
+ # it, creating one if necessary. The `floating` argument may be passed a value of
+ # `True` or `False`, to force the acquisition of a panel that is either floating or
+ # docked respectively.
## \todo Consider how this relates to draggable editor tabs and editor floating
# once we implement that in CompoundEditor - perhaps acquire will become a method
# on CompoundEditor instead at this point.
@classmethod
- def acquire( cls, node ) :
+ def acquire( cls, node, floating = None ) :
if isinstance( node, Gaffer.ScriptNode ) :
script = node
@@ -120,11 +119,13 @@ class NodeSetEditor( GafferUI.EditorWidget ) :
scriptWindow = GafferUI.ScriptWindow.acquire( script )
+ if floating in ( None, False ) :
for editor in scriptWindow.getLayout().editors( type = cls ) :
if node.isSame( editor._lastAddedNode() ) :
editor.reveal()
return editor
+ if floating in ( None, True ) :
childWindows = scriptWindow.childWindows()
for window in childWindows :
if isinstance( window, _EditorWindow ) :
@@ -135,8 +136,16 @@ class NodeSetEditor( GafferUI.EditorWidget ) :
editor = cls( script )
editor.setNodeSet( Gaffer.StandardSet( [ node ] ) )
+ if floating is False :
+ scriptWindow.getLayout().addEditor( editor )
+ else :
window = _EditorWindow( scriptWindow, editor )
window.setVisible( True )
+ ## \todo Can we do better using `window.resizeToFitChild()`?
+ # Our problem is that some NodeEditors (for GafferImage.Text for instance)
+ # are very large, whereas some (GafferScene.Shader) don't have
+ # a valid size until the UI has been built lazily.
+ window._qtWidget().resize( 400, 400 )
return editor
|
Install package for fast string matching
But really, mostly to suppress a warning! | @@ -15,3 +15,4 @@ flake8==3.3.0 # PEP checking
coverage>=4.5.3 # Unit test coverage
python-coveralls==2.9.1 # Coveralls linking (for Travis)
fuzzywuzzy>=0.17.0 # Fuzzy string matching
+python-Levenshtein>=0.12.0 # Required for fuzzywuzzy
\ No newline at end of file
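The warning in question appears because fuzzywuzzy falls back to the stdlib `SequenceMatcher` when python-Levenshtein is missing; with it installed, the same calls use the C implementation and the warning goes away:

```python
from fuzzywuzzy import fuzz

# Without python-Levenshtein, importing fuzz warns about the slow
# pure-Python SequenceMatcher; with it, this runs on the C extension.
print(fuzz.ratio("fuzzy wuzzy", "fuzzy was a bear"))  # similarity score, 0-100
```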
|
SystemMetricsTest: fix type conversion issue
Summary:
The tests were using `uint64_t` rather than the `size_t` that `getRssMemBytes`
returns. This is fine on 64-bit architectures but fails on 32-bit ones. | @@ -16,7 +16,7 @@ namespace fbzmq {
TEST(SystemMetricsTest, MemoryStats) {
SystemMetrics systemMetrics_{};
- folly::Optional<uint64_t> rssMem1 = systemMetrics_.getRSSMemBytes();
+ folly::Optional<size_t> rssMem1 = systemMetrics_.getRSSMemBytes();
EXPECT_TRUE(rssMem1.hasValue());
// check sanity of return value, check for > 1MB and < 100MB
@@ -27,7 +27,7 @@ TEST(SystemMetricsTest, MemoryStats) {
std::vector<int64_t> v(13 * 0x100000);
fill(v.begin(), v.end(), 1);
- folly::Optional<uint64_t> rssMem2 = systemMetrics_.getRSSMemBytes();
+ folly::Optional<size_t> rssMem2 = systemMetrics_.getRSSMemBytes();
EXPECT_TRUE(rssMem2.hasValue());
EXPECT_GT(rssMem2.value(), rssMem1.value() + 100);
}
|
Ignore mypy errors
Sadly the latest version doesn't pick up that Quart matches the
ASGI3Framework type in Hypercorn. | @@ -1388,9 +1388,9 @@ class Quart(PackageStatic):
if loop is not None:
loop.set_debug(debug or False)
- loop.run_until_complete(serve(self, config))
+ loop.run_until_complete(serve(self, config)) # type: ignore
else:
- asyncio.run(serve(self, config), debug=config.debug)
+ asyncio.run(serve(self, config), debug=config.debug) # type: ignore
def test_client(self) -> QuartClient:
"""Creates and returns a test client."""
|
[CI] fixing build script
If the package is installed, `pip uninstall` will wait for y/n user input. | #!/bin/bash
-python3 -m pip uninstall scipy
+python3 -m pip uninstall -y scipy
python3 -m pip install git+https://github.com/zhanghang1989/d2l-book
python3 -m pip install --force-reinstall ipython==7.16
|
[query] Increase test parallelism
I changed this from 2=>1 in April of last year unintentionally while debugging
(it's easy to get interleaved prints/logs with 2 concurrent worker threads). | @@ -17,7 +17,7 @@ def startTestHailContext():
if not _initialized:
backend_name = os.environ.get('HAIL_QUERY_BACKEND', 'spark')
if backend_name == 'spark':
- hl.init(master='local[1]', min_block_size=0, quiet=True)
+ hl.init(master='local[2]', min_block_size=0, quiet=True)
else:
Env.hc() # force initialization
_initialized = True
|
[IMPR] simplify code in treat_disamb_only
move code out of try statement in treat_disamb_only
remove include = False statement; include is False by default | @@ -747,13 +747,6 @@ class DisambiguationRobot(SingleSiteBot):
new_targets = []
try:
text = ref_page.get()
- ignore_reason = self.checkContents(text)
- if ignore_reason:
- pywikibot.output(
- '\n\nSkipping {0} because it contains {1}.\n\n'
- .format(ref_page.title(), ignore_reason))
- else:
- include = True
except pywikibot.IsRedirectPage:
pywikibot.output('{0} is a redirect to {1}'
.format(ref_page.title(), disamb_page.title()))
@@ -792,8 +785,16 @@ class DisambiguationRobot(SingleSiteBot):
pywikibot.output(
'Page [[{0}]] does not seem to exist?! Skipping.'
.format(ref_page.title()))
- include = False
- if include in (True, 'redirect'):
+ else:
+ ignore_reason = self.checkContents(text)
+ if ignore_reason:
+ pywikibot.output(
+ '\n\nSkipping {0} because it contains {1}.\n\n'
+ .format(ref_page.title(), ignore_reason))
+ else:
+ include = True
+
+ if include:
# save the original text so we can show the changes later
original_text = text
n = 0
@@ -817,9 +818,8 @@ class DisambiguationRobot(SingleSiteBot):
foundlink = pywikibot.Link(m.group('title'),
disamb_page.site)
foundlink.parse()
- except pywikibot.Error:
- continue
- except ValueError: # T111513
+ except (pywikibot.Error,
+ ValueError): # T111513
continue
# ignore interwiki links
|
Some inconsistencies fixed
"this repository repository" fixed and some punctuation. | @@ -10,7 +10,7 @@ Jarvis is a simple personal assistant for Linux, MacOS and Windows which works o
## Getting Started
-In order to start Jarvis just clone [this repository](https://github.com/sukeesh/Jarvis.git) repository and run `python installer`.
+In order to start Jarvis just clone [this repository](https://github.com/sukeesh/Jarvis.git) and run `python installer`.
Run **Jarvis** from anywhere by command `jarvis`
@@ -29,8 +29,8 @@ You can start by typing `help` within the Jarvis command line to check what Jarv
- PRs are accepted!!
- We follow [PEP 8](https://www.python.org/dev/peps/pep-0008/) guidelines. Before making a PR, make sure that your code is according to PEP 8 standards.
-- If you have some ideas for new features and you don't have time to implement them please open an issue with the tag new_feature
-- Please don't forget to comment (document) your code
+- If you have some ideas for new features and you don't have time to implement them please open an issue with the tag new_feature.
+- Please don't forget to comment (document) your code!
@@ -100,4 +100,4 @@ See also the list of [contributors](https://github.com/sukeesh/Jarvis/graphs/con
## License
-This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details
+This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
|
Fix minor documentation error
This method does not throw any error or take any input. | @@ -847,7 +847,6 @@ class Customer(StripeModel):
Checks to see if this customer has an active subscription to any plan.
:returns: True if there exists an active subscription, False otherwise.
- :throws: TypeError if ``plan`` is None and more than one active subscription exists for this customer.
"""
return len(self._get_valid_subscriptions()) != 0
|
Add collectionMode data and modeUpdate
Added collectionMode to collection data.
modeUpdate acts like the web UI, e.g. `collection.modeUpdate(mode="default")`. | @@ -994,6 +994,7 @@ class Collections(PlexObject):
self.childCount = utils.cast(int, data.attrib.get('childCount'))
self.minYear = utils.cast(int, data.attrib.get('minYear'))
self.maxYear = utils.cast(int, data.attrib.get('maxYear'))
+ self.collectionMode = data.attrib.get('collectionMode')
@property
def children(self):
@@ -1006,5 +1007,13 @@ class Collections(PlexObject):
part = '/library/metadata/%s' % self.ratingKey
return self._server.query(part, method=self._server._session.delete)
+ def modeUpdate(self, mode=['default', 'hide', 'hideItems', 'showItems']):
+ mode_dict = {'default': '-2',
+ 'hide': '0',
+ 'hideItems': '1',
+ 'showItems': '2'}
+ part = '/library/metadata/%s/prefs?collectionMode=%s' % (self.ratingKey, mode_dict[mode])
+ return self._server.query(part, method=self._server._session.put)
+
# def edit(self, **kwargs):
# TODO
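Usage of the new method, assuming `collection` is one of the `Collections` objects returned by the library:

```python
# Hide the collection's items while keeping the collection itself visible;
# the other accepted values are 'default', 'hide' and 'showItems'.
collection.modeUpdate(mode='hideItems')

# Restore the library default.
collection.modeUpdate(mode='default')
```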
|
If the field cannot be found (usually where read/write operations have different serializers), assume the error was in 'attributes' by default.
Used when building custom error messages. | @@ -7,8 +7,12 @@ from rest_framework.exceptions import APIException, AuthenticationFailed
def get_resource_object_member(error_key, context):
from api.base.serializers import RelationshipField
- field = context['view'].serializer_class._declared_fields[error_key]
+ field = context['view'].serializer_class._declared_fields.get(error_key, None)
+ if field:
return 'relationships' if isinstance(field, RelationshipField) else 'attributes'
+ # If field cannot be found (where read/write operations have different serializers,
+ # assume error was in 'attributes' by default
+ return 'attributes'
def dict_error_formatting(errors, context, index=None):
"""
|
Bump timeout
The original timeout seemed OK for local tests but would result in
spurious failures under Browserstack. | @@ -136,7 +136,7 @@ class SeleniumTestCase(StaticLiveServerTestCase):
super(SeleniumTestCase, cls).tearDownClass()
def _find_and_wait(self, locator_type, locator, waiter):
- wait = 5
+ wait = 15
try:
element = WebDriverWait(self.browser, wait).until(
waiter((locator_type, locator))
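For reference, the explicit-wait pattern being tuned: Selenium polls the condition and returns as soon as it holds, so a larger timeout only slows down tests that were going to fail anyway. A standalone sketch (`browser` and the locator are assumed/hypothetical):

```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Poll for up to 15 seconds; returns the element as soon as it appears.
element = WebDriverWait(browser, 15).until(
    EC.presence_of_element_located((By.ID, 'analyse-options'))
)
```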
|
Fix css bug in custom.css.
White color was applied to all elements with class .nav-link instead of only the ones inside a .navbar-nav element.
This caused the links in the right column to turn white as well, making them invisible on the white background. | background-color: #2F4858 !important;
}
-/* This is kept for reference, in case the logo needs to be adjusted in the css. */
-/*.navbar-brand>.logo {*/
-/* filter: drop-shadow(1px 1px 0px #ffffff88);*/
-/*}*/
-
-.nav-link {
- color: #ffffffff!important;
+.navbar-nav > .nav-item > .nav-link {
+ color: #ffffffff;
}
.navbar-nav > .active > .nav-link {
|
[CONTRIBUTING.md] Added more instructions to the Incident Report Format
Added more detailed instructions in **Incident Report Format** for people who want to contribute. | @@ -205,7 +205,7 @@ Use the following format for all incident reports.
```
// State.md
-## City
+## City Name (Note: Exclude this, if city section is already available)
### Brief description of a thing that happened | Date
@@ -219,7 +219,7 @@ created a github repository on June 1st 2020 to compile evidence of police bruta
and concerned citizens. The repository was initialized with a README document explaining how other people
could get involved in the project, providing a meta example of what a good incident report looks like.
-tags: vehicle, celebrity, death (ex: check below)
+tags: vehicle, celebrity, death (ex: check the link to tags below)
id: state_abbreviation-city-number (ex: ca-losangeles-1)
@@ -227,6 +227,7 @@ id: state_abbreviation-city-number (ex: ca-losangeles-1)
* https://github.com/2020PB/police-brutality
* https://www.reddit.com/r/2020PoliceBrutality
+* [Link description can also be provided inside square brackets. ex: 'Video of the incident taken from building'](https://link-to-video-on-web/)
```
Check the current list of [tags](docs/possible_tags.md)
|