problem_id (stringlengths 18-22) | source (stringclasses 1 value) | task_type (stringclasses 1 value) | in_source_id (stringlengths 13-58) | prompt (stringlengths 1.1k-10.2k) | golden_diff (stringlengths 151-4.94k) | verification_info (stringlengths 582-21k) | num_tokens (int64 271-2.05k) | num_tokens_diff (int64 47-1.02k) |
---|---|---|---|---|---|---|---|---|
gh_patches_debug_2249 | rasdani/github-patches | git_diff | rasterio__rasterio-598 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
rio warp null transformer error with bad proj4
Currently, if you pass a bad projection, you get the following behavior:
```
$ rio warp --dst-crs "+proj=foobar" tests/data/warp_test.tif /tmp/foo.tif
ERROR:GDAL:CPLE_NotSupported in Failed to initialize PROJ.4 with `+proj=foobar +wktext'.
Traceback (most recent call last):
...
File "/Users/mperry/work/rasterio/rasterio/rio/warp.py", line 198, in warp
resolution=res)
File "/Users/mperry/work/rasterio/rasterio/warp.py", line 296, in calculate_default_transform
left, bottom, right, top)
File "rasterio/_warp.pyx", line 535, in rasterio._warp._calculate_default_transform (rasterio/_warp.cpp:9551)
with InMemoryRaster(
File "rasterio/_warp.pyx", line 542, in rasterio._warp._calculate_default_transform (rasterio/_warp.cpp:9261)
raise ValueError("NULL transformer")
ValueError: NULL transformer
```
The transformer fails to initialize, which is reasonable considering the invalid proj string. Is there any way to catch that error and report back something more meaningful than "NULL transformer"?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rasterio/errors.py`
Content:
```
1 """A module of errors."""
2
3 from click import FileError
4
5
6 class RasterioIOError(IOError):
7 """A failure to open a dataset using the presently registered drivers."""
8
9
10 class RasterioDriverRegistrationError(ValueError):
11 """To be raised when, eg, _gdal.GDALGetDriverByName("MEM") returns NULL."""
12
13
14 class FileOverwriteError(FileError):
15 """Rasterio's CLI refuses to implicitly clobber output files."""
16
17 def __init__(self, message):
18 super(FileOverwriteError, self).__init__('', hint=message)
19
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/rasterio/errors.py b/rasterio/errors.py
--- a/rasterio/errors.py
+++ b/rasterio/errors.py
@@ -7,7 +7,7 @@
"""A failure to open a dataset using the presently registered drivers."""
-class RasterioDriverRegistrationError(ValueError):
+class DriverRegistrationError(ValueError):
"""To be raised when, eg, _gdal.GDALGetDriverByName("MEM") returns NULL."""
|
{"golden_diff": "diff --git a/rasterio/errors.py b/rasterio/errors.py\n--- a/rasterio/errors.py\n+++ b/rasterio/errors.py\n@@ -7,7 +7,7 @@\n \"\"\"A failure to open a dataset using the presently registered drivers.\"\"\"\n \n \n-class RasterioDriverRegistrationError(ValueError):\n+class DriverRegistrationError(ValueError):\n \"\"\"To be raised when, eg, _gdal.GDALGetDriverByName(\"MEM\") returns NULL.\"\"\"\n", "issue": "rio warp null transformer error with bad proj4\nCurrently, if you pass a bad projection, you get the following behavior:\n\n```\n$ rio warp --dst-crs \"+proj=foobar\" tests/data/warp_test.tif /tmp/foo.tif\nERROR:GDAL:CPLE_NotSupported in Failed to initialize PROJ.4 with `+proj=foobar +wktext'.\nTraceback (most recent call last):\n...\n File \"/Users/mperry/work/rasterio/rasterio/rio/warp.py\", line 198, in warp\n resolution=res)\n File \"/Users/mperry/work/rasterio/rasterio/warp.py\", line 296, in calculate_default_transform\n left, bottom, right, top)\n File \"rasterio/_warp.pyx\", line 535, in rasterio._warp._calculate_default_transform (rasterio/_warp.cpp:9551)\n with InMemoryRaster(\n File \"rasterio/_warp.pyx\", line 542, in rasterio._warp._calculate_default_transform (rasterio/_warp.cpp:9261)\n raise ValueError(\"NULL transformer\")\nValueError: NULL transformer\n```\n\nThe transformer fails to initialize, which is reasonable considering the invalid proj string. Is there any way to catch that error and report back something more meaningful than \"NULL transformer\"?\n\n", "before_files": [{"content": "\"\"\"A module of errors.\"\"\"\n\nfrom click import FileError\n\n\nclass RasterioIOError(IOError):\n \"\"\"A failure to open a dataset using the presently registered drivers.\"\"\"\n\n\nclass RasterioDriverRegistrationError(ValueError):\n \"\"\"To be raised when, eg, _gdal.GDALGetDriverByName(\"MEM\") returns NULL.\"\"\"\n\n\nclass FileOverwriteError(FileError):\n \"\"\"Rasterio's CLI refuses to implicitly clobber output files.\"\"\"\n\n def __init__(self, message):\n super(FileOverwriteError, self).__init__('', hint=message)\n", "path": "rasterio/errors.py"}], "after_files": [{"content": "\"\"\"A module of errors.\"\"\"\n\nfrom click import FileError\n\n\nclass RasterioIOError(IOError):\n \"\"\"A failure to open a dataset using the presently registered drivers.\"\"\"\n\n\nclass DriverRegistrationError(ValueError):\n \"\"\"To be raised when, eg, _gdal.GDALGetDriverByName(\"MEM\") returns NULL.\"\"\"\n\n\nclass FileOverwriteError(FileError):\n \"\"\"Rasterio's CLI refuses to implicitly clobber output files.\"\"\"\n\n def __init__(self, message):\n super(FileOverwriteError, self).__init__('', hint=message)\n", "path": "rasterio/errors.py"}]}
| 704 | 96 |
gh_patches_debug_8090 | rasdani/github-patches | git_diff | RedHatInsights__insights-core-2486 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
InsightsEvaluator ignores its stream argument during construction
The [InsightsEvaluator](https://github.com/RedHatInsights/insights-core/blob/master/insights/core/evaluators.py#L121) ignores its stream argument and passes `sys.stdout` to its superclass constructor.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `insights/core/evaluators.py`
Content:
```
1 import logging
2 import six
3 import sys
4
5 from collections import defaultdict
6
7 from ..formats import Formatter
8 from ..specs import Specs
9 from ..combiners.hostname import hostname as combiner_hostname
10 from ..parsers.branch_info import BranchInfo
11 from . import dr, plugins
12
13 log = logging.getLogger(__name__)
14
15
16 def get_simple_module_name(obj):
17 return dr.BASE_MODULE_NAMES.get(obj, None)
18
19
20 class Evaluator(Formatter):
21 def __init__(self, broker=None, stream=sys.stdout, incremental=False):
22 super(Evaluator, self).__init__(broker or dr.Broker(), stream)
23 self.results = defaultdict(list)
24 self.rule_skips = []
25 self.hostname = None
26 self.metadata = {}
27 self.metadata_keys = {}
28 self.incremental = incremental
29
30 def observer(self, comp, broker):
31 if comp is combiner_hostname and comp in broker:
32 self.hostname = broker[comp].fqdn
33
34 if plugins.is_rule(comp) and comp in broker:
35 self.handle_result(comp, broker[comp])
36
37 def preprocess(self):
38 self.broker.add_observer(self.observer)
39
40 def run_serial(self, graph=None):
41 dr.run(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker)
42
43 def run_incremental(self, graph=None):
44 for _ in dr.run_incremental(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker):
45 pass
46
47 def format_response(self, response):
48 """
49 To be overridden by subclasses to format the response sent back to the
50 client.
51 """
52 return response
53
54 def format_result(self, result):
55 """
56 To be overridden by subclasses to format individual rule results.
57 """
58 return result
59
60 def process(self, graph=None):
61 with self:
62 if self.incremental:
63 self.run_incremental(graph)
64 else:
65 self.run_serial(graph)
66 return self.get_response()
67
68
69 class SingleEvaluator(Evaluator):
70 def append_metadata(self, r):
71 for k, v in r.items():
72 if k != "type":
73 self.metadata[k] = v
74
75 def format_response(self, response):
76 return response
77
78 def get_response(self):
79 r = dict(self.metadata_keys)
80 r.update({
81 "system": {
82 "metadata": self.metadata,
83 "hostname": self.hostname
84 },
85 "reports": self.results["rule"],
86 "fingerprints": self.results["fingerprint"],
87 "skips": self.rule_skips,
88 })
89
90 for k, v in six.iteritems(self.results):
91 if k not in ("rule", "fingerprint"):
92 r[k] = v
93
94 return self.format_response(r)
95
96 def handle_result(self, plugin, r):
97 type_ = r["type"]
98
99 if type_ == "skip":
100 self.rule_skips.append(r)
101 elif type_ == "metadata":
102 self.append_metadata(r)
103 elif type_ == "metadata_key":
104 self.metadata_keys[r.get_key()] = r["value"]
105 else:
106 response_id = "%s_id" % r.response_type
107 key = r.get_key()
108 self.results[type_].append(self.format_result({
109 response_id: "{0}|{1}".format(get_simple_module_name(plugin), key),
110 "component": dr.get_name(plugin),
111 "type": type_,
112 "key": key,
113 "details": r,
114 "tags": list(dr.get_tags(plugin)),
115 "links": dr.get_delegate(plugin).links or {}
116 }))
117
118
119 class InsightsEvaluator(SingleEvaluator):
120 def __init__(self, broker=None, system_id=None, stream=sys.stdout, incremental=False):
121 super(InsightsEvaluator, self).__init__(broker, stream=sys.stdout, incremental=incremental)
122 self.system_id = system_id
123 self.branch_info = {}
124 self.product = "rhel"
125 self.type = "host"
126 self.release = None
127
128 def observer(self, comp, broker):
129 super(InsightsEvaluator, self).observer(comp, broker)
130 if comp is Specs.machine_id and comp in broker:
131 self.system_id = broker[Specs.machine_id].content[0].strip()
132
133 if comp is Specs.redhat_release and comp in broker:
134 self.release = broker[comp].content[0].strip()
135
136 if comp is BranchInfo and BranchInfo in broker:
137 self.branch_info = broker[comp].data
138
139 if comp is Specs.metadata_json and comp in broker:
140 md = broker[comp]
141 self.product = md.get("product_code")
142 self.type = md.get("role")
143
144 def format_result(self, result):
145 result["system_id"] = self.system_id
146 return result
147
148 def format_response(self, response):
149 system = response["system"]
150 system["remote_branch"] = self.branch_info.get("remote_branch")
151 system["remote_leaf"] = self.branch_info.get("remote_leaf")
152 system["system_id"] = self.system_id
153 system["product"] = self.product
154 system["type"] = self.type
155 if self.release:
156 system["metadata"]["release"] = self.release
157
158 return response
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/insights/core/evaluators.py b/insights/core/evaluators.py
--- a/insights/core/evaluators.py
+++ b/insights/core/evaluators.py
@@ -118,7 +118,7 @@
class InsightsEvaluator(SingleEvaluator):
def __init__(self, broker=None, system_id=None, stream=sys.stdout, incremental=False):
- super(InsightsEvaluator, self).__init__(broker, stream=sys.stdout, incremental=incremental)
+ super(InsightsEvaluator, self).__init__(broker, stream=stream, incremental=incremental)
self.system_id = system_id
self.branch_info = {}
self.product = "rhel"
|
{"golden_diff": "diff --git a/insights/core/evaluators.py b/insights/core/evaluators.py\n--- a/insights/core/evaluators.py\n+++ b/insights/core/evaluators.py\n@@ -118,7 +118,7 @@\n \n class InsightsEvaluator(SingleEvaluator):\n def __init__(self, broker=None, system_id=None, stream=sys.stdout, incremental=False):\n- super(InsightsEvaluator, self).__init__(broker, stream=sys.stdout, incremental=incremental)\n+ super(InsightsEvaluator, self).__init__(broker, stream=stream, incremental=incremental)\n self.system_id = system_id\n self.branch_info = {}\n self.product = \"rhel\"\n", "issue": "InsightsEvaluator ignores its stream argument during construction\nThe [InsightsEvaluator](https://github.com/RedHatInsights/insights-core/blob/master/insights/core/evaluators.py#L121) ignores its stream argument and passes `sys.stdout` to its superclass constructor.\n", "before_files": [{"content": "import logging\nimport six\nimport sys\n\nfrom collections import defaultdict\n\nfrom ..formats import Formatter\nfrom ..specs import Specs\nfrom ..combiners.hostname import hostname as combiner_hostname\nfrom ..parsers.branch_info import BranchInfo\nfrom . import dr, plugins\n\nlog = logging.getLogger(__name__)\n\n\ndef get_simple_module_name(obj):\n return dr.BASE_MODULE_NAMES.get(obj, None)\n\n\nclass Evaluator(Formatter):\n def __init__(self, broker=None, stream=sys.stdout, incremental=False):\n super(Evaluator, self).__init__(broker or dr.Broker(), stream)\n self.results = defaultdict(list)\n self.rule_skips = []\n self.hostname = None\n self.metadata = {}\n self.metadata_keys = {}\n self.incremental = incremental\n\n def observer(self, comp, broker):\n if comp is combiner_hostname and comp in broker:\n self.hostname = broker[comp].fqdn\n\n if plugins.is_rule(comp) and comp in broker:\n self.handle_result(comp, broker[comp])\n\n def preprocess(self):\n self.broker.add_observer(self.observer)\n\n def run_serial(self, graph=None):\n dr.run(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker)\n\n def run_incremental(self, graph=None):\n for _ in dr.run_incremental(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker):\n pass\n\n def format_response(self, response):\n \"\"\"\n To be overridden by subclasses to format the response sent back to the\n client.\n \"\"\"\n return response\n\n def format_result(self, result):\n \"\"\"\n To be overridden by subclasses to format individual rule results.\n \"\"\"\n return result\n\n def process(self, graph=None):\n with self:\n if self.incremental:\n self.run_incremental(graph)\n else:\n self.run_serial(graph)\n return self.get_response()\n\n\nclass SingleEvaluator(Evaluator):\n def append_metadata(self, r):\n for k, v in r.items():\n if k != \"type\":\n self.metadata[k] = v\n\n def format_response(self, response):\n return response\n\n def get_response(self):\n r = dict(self.metadata_keys)\n r.update({\n \"system\": {\n \"metadata\": self.metadata,\n \"hostname\": self.hostname\n },\n \"reports\": self.results[\"rule\"],\n \"fingerprints\": self.results[\"fingerprint\"],\n \"skips\": self.rule_skips,\n })\n\n for k, v in six.iteritems(self.results):\n if k not in (\"rule\", \"fingerprint\"):\n r[k] = v\n\n return self.format_response(r)\n\n def handle_result(self, plugin, r):\n type_ = r[\"type\"]\n\n if type_ == \"skip\":\n self.rule_skips.append(r)\n elif type_ == \"metadata\":\n self.append_metadata(r)\n elif type_ == \"metadata_key\":\n self.metadata_keys[r.get_key()] = r[\"value\"]\n else:\n response_id = \"%s_id\" % r.response_type\n key = 
r.get_key()\n self.results[type_].append(self.format_result({\n response_id: \"{0}|{1}\".format(get_simple_module_name(plugin), key),\n \"component\": dr.get_name(plugin),\n \"type\": type_,\n \"key\": key,\n \"details\": r,\n \"tags\": list(dr.get_tags(plugin)),\n \"links\": dr.get_delegate(plugin).links or {}\n }))\n\n\nclass InsightsEvaluator(SingleEvaluator):\n def __init__(self, broker=None, system_id=None, stream=sys.stdout, incremental=False):\n super(InsightsEvaluator, self).__init__(broker, stream=sys.stdout, incremental=incremental)\n self.system_id = system_id\n self.branch_info = {}\n self.product = \"rhel\"\n self.type = \"host\"\n self.release = None\n\n def observer(self, comp, broker):\n super(InsightsEvaluator, self).observer(comp, broker)\n if comp is Specs.machine_id and comp in broker:\n self.system_id = broker[Specs.machine_id].content[0].strip()\n\n if comp is Specs.redhat_release and comp in broker:\n self.release = broker[comp].content[0].strip()\n\n if comp is BranchInfo and BranchInfo in broker:\n self.branch_info = broker[comp].data\n\n if comp is Specs.metadata_json and comp in broker:\n md = broker[comp]\n self.product = md.get(\"product_code\")\n self.type = md.get(\"role\")\n\n def format_result(self, result):\n result[\"system_id\"] = self.system_id\n return result\n\n def format_response(self, response):\n system = response[\"system\"]\n system[\"remote_branch\"] = self.branch_info.get(\"remote_branch\")\n system[\"remote_leaf\"] = self.branch_info.get(\"remote_leaf\")\n system[\"system_id\"] = self.system_id\n system[\"product\"] = self.product\n system[\"type\"] = self.type\n if self.release:\n system[\"metadata\"][\"release\"] = self.release\n\n return response\n", "path": "insights/core/evaluators.py"}], "after_files": [{"content": "import logging\nimport six\nimport sys\n\nfrom collections import defaultdict\n\nfrom ..formats import Formatter\nfrom ..specs import Specs\nfrom ..combiners.hostname import hostname as combiner_hostname\nfrom ..parsers.branch_info import BranchInfo\nfrom . 
import dr, plugins\n\nlog = logging.getLogger(__name__)\n\n\ndef get_simple_module_name(obj):\n return dr.BASE_MODULE_NAMES.get(obj, None)\n\n\nclass Evaluator(Formatter):\n def __init__(self, broker=None, stream=sys.stdout, incremental=False):\n super(Evaluator, self).__init__(broker or dr.Broker(), stream)\n self.results = defaultdict(list)\n self.rule_skips = []\n self.hostname = None\n self.metadata = {}\n self.metadata_keys = {}\n self.incremental = incremental\n\n def observer(self, comp, broker):\n if comp is combiner_hostname and comp in broker:\n self.hostname = broker[comp].fqdn\n\n if plugins.is_rule(comp) and comp in broker:\n self.handle_result(comp, broker[comp])\n\n def preprocess(self):\n self.broker.add_observer(self.observer)\n\n def run_serial(self, graph=None):\n dr.run(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker)\n\n def run_incremental(self, graph=None):\n for _ in dr.run_incremental(graph or dr.COMPONENTS[dr.GROUPS.single], broker=self.broker):\n pass\n\n def format_response(self, response):\n \"\"\"\n To be overridden by subclasses to format the response sent back to the\n client.\n \"\"\"\n return response\n\n def format_result(self, result):\n \"\"\"\n To be overridden by subclasses to format individual rule results.\n \"\"\"\n return result\n\n def process(self, graph=None):\n with self:\n if self.incremental:\n self.run_incremental(graph)\n else:\n self.run_serial(graph)\n return self.get_response()\n\n\nclass SingleEvaluator(Evaluator):\n def append_metadata(self, r):\n for k, v in r.items():\n if k != \"type\":\n self.metadata[k] = v\n\n def format_response(self, response):\n return response\n\n def get_response(self):\n r = dict(self.metadata_keys)\n r.update({\n \"system\": {\n \"metadata\": self.metadata,\n \"hostname\": self.hostname\n },\n \"reports\": self.results[\"rule\"],\n \"fingerprints\": self.results[\"fingerprint\"],\n \"skips\": self.rule_skips,\n })\n\n for k, v in six.iteritems(self.results):\n if k not in (\"rule\", \"fingerprint\"):\n r[k] = v\n\n return self.format_response(r)\n\n def handle_result(self, plugin, r):\n type_ = r[\"type\"]\n\n if type_ == \"skip\":\n self.rule_skips.append(r)\n elif type_ == \"metadata\":\n self.append_metadata(r)\n elif type_ == \"metadata_key\":\n self.metadata_keys[r.get_key()] = r[\"value\"]\n else:\n response_id = \"%s_id\" % r.response_type\n key = r.get_key()\n self.results[type_].append(self.format_result({\n response_id: \"{0}|{1}\".format(get_simple_module_name(plugin), key),\n \"component\": dr.get_name(plugin),\n \"type\": type_,\n \"key\": key,\n \"details\": r,\n \"tags\": list(dr.get_tags(plugin)),\n \"links\": dr.get_delegate(plugin).links or {}\n }))\n\n\nclass InsightsEvaluator(SingleEvaluator):\n def __init__(self, broker=None, system_id=None, stream=sys.stdout, incremental=False):\n super(InsightsEvaluator, self).__init__(broker, stream=stream, incremental=incremental)\n self.system_id = system_id\n self.branch_info = {}\n self.product = \"rhel\"\n self.type = \"host\"\n self.release = None\n\n def observer(self, comp, broker):\n super(InsightsEvaluator, self).observer(comp, broker)\n if comp is Specs.machine_id and comp in broker:\n self.system_id = broker[Specs.machine_id].content[0].strip()\n\n if comp is Specs.redhat_release and comp in broker:\n self.release = broker[comp].content[0].strip()\n\n if comp is BranchInfo and BranchInfo in broker:\n self.branch_info = broker[comp].data\n\n if comp is Specs.metadata_json and comp in broker:\n md = broker[comp]\n 
self.product = md.get(\"product_code\")\n self.type = md.get(\"role\")\n\n def format_result(self, result):\n result[\"system_id\"] = self.system_id\n return result\n\n def format_response(self, response):\n system = response[\"system\"]\n system[\"remote_branch\"] = self.branch_info.get(\"remote_branch\")\n system[\"remote_leaf\"] = self.branch_info.get(\"remote_leaf\")\n system[\"system_id\"] = self.system_id\n system[\"product\"] = self.product\n system[\"type\"] = self.type\n if self.release:\n system[\"metadata\"][\"release\"] = self.release\n\n return response\n", "path": "insights/core/evaluators.py"}]}
| 1,797 | 159 |
gh_patches_debug_15334 | rasdani/github-patches | git_diff | aws__aws-cli-2972 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AWS CLI configuration does not respect profile variable
I tried to use the command aws ec2 describe-spot-price-history in the CLI and received the error 'You must specify a region. You can also configure your region by running "aws configure".' When I first installed the CLI, I configured my credentials and region.
Running 'aws configure', I try to re-enter my details (they are listed as blank), but they do not save. I assume this is because I am using a profile. I have the environment variable AWS_DEFAULT_PROFILE set to 'internal', and have attached my configuration files (with .txt extension added and sensitive data redacted).
[config.txt](https://github.com/aws/aws-cli/files/1486174/config.txt)
[credentials.txt](https://github.com/aws/aws-cli/files/1486175/credentials.txt)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awscli/customizations/configure/configure.py`
Content:
```
1 # Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License"). You
4 # may not use this file except in compliance with the License. A copy of
5 # the License is located at
6 #
7 # http://aws.amazon.com/apache2.0/
8 #
9 # or in the "license" file accompanying this file. This file is
10 # distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
11 # ANY KIND, either express or implied. See the License for the specific
12 # language governing permissions and limitations under the License.
13 import os
14 import logging
15
16 from botocore.exceptions import ProfileNotFound
17
18 from awscli.compat import compat_input
19 from awscli.customizations.commands import BasicCommand
20 from awscli.customizations.configure.addmodel import AddModelCommand
21 from awscli.customizations.configure.set import ConfigureSetCommand
22 from awscli.customizations.configure.get import ConfigureGetCommand
23 from awscli.customizations.configure.list import ConfigureListCommand
24 from awscli.customizations.configure.writer import ConfigFileWriter
25
26 from . import mask_value, profile_to_section
27
28
29 logger = logging.getLogger(__name__)
30
31
32 def register_configure_cmd(cli):
33 cli.register('building-command-table.main',
34 ConfigureCommand.add_command)
35
36
37 class InteractivePrompter(object):
38
39 def get_value(self, current_value, config_name, prompt_text=''):
40 if config_name in ('aws_access_key_id', 'aws_secret_access_key'):
41 current_value = mask_value(current_value)
42 response = compat_input("%s [%s]: " % (prompt_text, current_value))
43 if not response:
44 # If the user hits enter, we return a value of None
45 # instead of an empty string. That way we can determine
46 # whether or not a value has changed.
47 response = None
48 return response
49
50
51 class ConfigureCommand(BasicCommand):
52 NAME = 'configure'
53 DESCRIPTION = BasicCommand.FROM_FILE()
54 SYNOPSIS = ('aws configure [--profile profile-name]')
55 EXAMPLES = (
56 'To create a new configuration::\n'
57 '\n'
58 ' $ aws configure\n'
59 ' AWS Access Key ID [None]: accesskey\n'
60 ' AWS Secret Access Key [None]: secretkey\n'
61 ' Default region name [None]: us-west-2\n'
62 ' Default output format [None]:\n'
63 '\n'
64 'To update just the region name::\n'
65 '\n'
66 ' $ aws configure\n'
67 ' AWS Access Key ID [****]:\n'
68 ' AWS Secret Access Key [****]:\n'
69 ' Default region name [us-west-1]: us-west-2\n'
70 ' Default output format [None]:\n'
71 )
72 SUBCOMMANDS = [
73 {'name': 'list', 'command_class': ConfigureListCommand},
74 {'name': 'get', 'command_class': ConfigureGetCommand},
75 {'name': 'set', 'command_class': ConfigureSetCommand},
76 {'name': 'add-model', 'command_class': AddModelCommand}
77 ]
78
79 # If you want to add new values to prompt, update this list here.
80 VALUES_TO_PROMPT = [
81 # (logical_name, config_name, prompt_text)
82 ('aws_access_key_id', "AWS Access Key ID"),
83 ('aws_secret_access_key', "AWS Secret Access Key"),
84 ('region', "Default region name"),
85 ('output', "Default output format"),
86 ]
87
88 def __init__(self, session, prompter=None, config_writer=None):
89 super(ConfigureCommand, self).__init__(session)
90 if prompter is None:
91 prompter = InteractivePrompter()
92 self._prompter = prompter
93 if config_writer is None:
94 config_writer = ConfigFileWriter()
95 self._config_writer = config_writer
96
97 def _run_main(self, parsed_args, parsed_globals):
98 # Called when invoked with no args "aws configure"
99 new_values = {}
100 # This is the config from the config file scoped to a specific
101 # profile.
102 try:
103 config = self._session.get_scoped_config()
104 except ProfileNotFound:
105 config = {}
106 for config_name, prompt_text in self.VALUES_TO_PROMPT:
107 current_value = config.get(config_name)
108 new_value = self._prompter.get_value(current_value, config_name,
109 prompt_text)
110 if new_value is not None and new_value != current_value:
111 new_values[config_name] = new_value
112 config_filename = os.path.expanduser(
113 self._session.get_config_variable('config_file'))
114 if new_values:
115 self._write_out_creds_file_values(new_values,
116 parsed_globals.profile)
117 if parsed_globals.profile is not None:
118 section = profile_to_section(parsed_globals.profile)
119 new_values['__section__'] = section
120 self._config_writer.update_config(new_values, config_filename)
121
122 def _write_out_creds_file_values(self, new_values, profile_name):
123 # The access_key/secret_key are now *always* written to the shared
124 # credentials file (~/.aws/credentials), see aws/aws-cli#847.
125 # post-conditions: ~/.aws/credentials will have the updated credential
126 # file values and new_values will have the cred vars removed.
127 credential_file_values = {}
128 if 'aws_access_key_id' in new_values:
129 credential_file_values['aws_access_key_id'] = new_values.pop(
130 'aws_access_key_id')
131 if 'aws_secret_access_key' in new_values:
132 credential_file_values['aws_secret_access_key'] = new_values.pop(
133 'aws_secret_access_key')
134 if credential_file_values:
135 if profile_name is not None:
136 credential_file_values['__section__'] = profile_name
137 shared_credentials_filename = os.path.expanduser(
138 self._session.get_config_variable('credentials_file'))
139 self._config_writer.update_config(
140 credential_file_values,
141 shared_credentials_filename)
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/awscli/customizations/configure/configure.py b/awscli/customizations/configure/configure.py
--- a/awscli/customizations/configure/configure.py
+++ b/awscli/customizations/configure/configure.py
@@ -112,10 +112,10 @@
config_filename = os.path.expanduser(
self._session.get_config_variable('config_file'))
if new_values:
- self._write_out_creds_file_values(new_values,
- parsed_globals.profile)
- if parsed_globals.profile is not None:
- section = profile_to_section(parsed_globals.profile)
+ profile = self._session.profile
+ self._write_out_creds_file_values(new_values, profile)
+ if profile is not None:
+ section = profile_to_section(profile)
new_values['__section__'] = section
self._config_writer.update_config(new_values, config_filename)
|
{"golden_diff": "diff --git a/awscli/customizations/configure/configure.py b/awscli/customizations/configure/configure.py\n--- a/awscli/customizations/configure/configure.py\n+++ b/awscli/customizations/configure/configure.py\n@@ -112,10 +112,10 @@\n config_filename = os.path.expanduser(\n self._session.get_config_variable('config_file'))\n if new_values:\n- self._write_out_creds_file_values(new_values,\n- parsed_globals.profile)\n- if parsed_globals.profile is not None:\n- section = profile_to_section(parsed_globals.profile)\n+ profile = self._session.profile\n+ self._write_out_creds_file_values(new_values, profile)\n+ if profile is not None:\n+ section = profile_to_section(profile)\n new_values['__section__'] = section\n self._config_writer.update_config(new_values, config_filename)\n", "issue": "AWS CLI configuration does not respect profile variable\nI tried to use the command aws ec2 describe-spot-price-history in the CLI and received the error 'You must specify a region. You can also configure your region by running \"aws configure\".' When I first installed the CLI, I configured my credentials and region.\r\n\r\nRunning 'aws configure', I try to re-enter my details (they are listed as blank), but they do not save. I assume this is because I am using a profile. I have the environment variable AWS_DEFAULT_PROFILE set to 'internal', and have attached my configuration files (with .txt extension added and sensitive data redacted).\r\n\r\n[config.txt](https://github.com/aws/aws-cli/files/1486174/config.txt)\r\n[credentials.txt](https://github.com/aws/aws-cli/files/1486175/credentials.txt)\n", "before_files": [{"content": "# Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. See the License for the specific\n# language governing permissions and limitations under the License.\nimport os\nimport logging\n\nfrom botocore.exceptions import ProfileNotFound\n\nfrom awscli.compat import compat_input\nfrom awscli.customizations.commands import BasicCommand\nfrom awscli.customizations.configure.addmodel import AddModelCommand\nfrom awscli.customizations.configure.set import ConfigureSetCommand\nfrom awscli.customizations.configure.get import ConfigureGetCommand\nfrom awscli.customizations.configure.list import ConfigureListCommand\nfrom awscli.customizations.configure.writer import ConfigFileWriter\n\nfrom . import mask_value, profile_to_section\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef register_configure_cmd(cli):\n cli.register('building-command-table.main',\n ConfigureCommand.add_command)\n\n\nclass InteractivePrompter(object):\n\n def get_value(self, current_value, config_name, prompt_text=''):\n if config_name in ('aws_access_key_id', 'aws_secret_access_key'):\n current_value = mask_value(current_value)\n response = compat_input(\"%s [%s]: \" % (prompt_text, current_value))\n if not response:\n # If the user hits enter, we return a value of None\n # instead of an empty string. 
That way we can determine\n # whether or not a value has changed.\n response = None\n return response\n\n\nclass ConfigureCommand(BasicCommand):\n NAME = 'configure'\n DESCRIPTION = BasicCommand.FROM_FILE()\n SYNOPSIS = ('aws configure [--profile profile-name]')\n EXAMPLES = (\n 'To create a new configuration::\\n'\n '\\n'\n ' $ aws configure\\n'\n ' AWS Access Key ID [None]: accesskey\\n'\n ' AWS Secret Access Key [None]: secretkey\\n'\n ' Default region name [None]: us-west-2\\n'\n ' Default output format [None]:\\n'\n '\\n'\n 'To update just the region name::\\n'\n '\\n'\n ' $ aws configure\\n'\n ' AWS Access Key ID [****]:\\n'\n ' AWS Secret Access Key [****]:\\n'\n ' Default region name [us-west-1]: us-west-2\\n'\n ' Default output format [None]:\\n'\n )\n SUBCOMMANDS = [\n {'name': 'list', 'command_class': ConfigureListCommand},\n {'name': 'get', 'command_class': ConfigureGetCommand},\n {'name': 'set', 'command_class': ConfigureSetCommand},\n {'name': 'add-model', 'command_class': AddModelCommand}\n ]\n\n # If you want to add new values to prompt, update this list here.\n VALUES_TO_PROMPT = [\n # (logical_name, config_name, prompt_text)\n ('aws_access_key_id', \"AWS Access Key ID\"),\n ('aws_secret_access_key', \"AWS Secret Access Key\"),\n ('region', \"Default region name\"),\n ('output', \"Default output format\"),\n ]\n\n def __init__(self, session, prompter=None, config_writer=None):\n super(ConfigureCommand, self).__init__(session)\n if prompter is None:\n prompter = InteractivePrompter()\n self._prompter = prompter\n if config_writer is None:\n config_writer = ConfigFileWriter()\n self._config_writer = config_writer\n\n def _run_main(self, parsed_args, parsed_globals):\n # Called when invoked with no args \"aws configure\"\n new_values = {}\n # This is the config from the config file scoped to a specific\n # profile.\n try:\n config = self._session.get_scoped_config()\n except ProfileNotFound:\n config = {}\n for config_name, prompt_text in self.VALUES_TO_PROMPT:\n current_value = config.get(config_name)\n new_value = self._prompter.get_value(current_value, config_name,\n prompt_text)\n if new_value is not None and new_value != current_value:\n new_values[config_name] = new_value\n config_filename = os.path.expanduser(\n self._session.get_config_variable('config_file'))\n if new_values:\n self._write_out_creds_file_values(new_values,\n parsed_globals.profile)\n if parsed_globals.profile is not None:\n section = profile_to_section(parsed_globals.profile)\n new_values['__section__'] = section\n self._config_writer.update_config(new_values, config_filename)\n\n def _write_out_creds_file_values(self, new_values, profile_name):\n # The access_key/secret_key are now *always* written to the shared\n # credentials file (~/.aws/credentials), see aws/aws-cli#847.\n # post-conditions: ~/.aws/credentials will have the updated credential\n # file values and new_values will have the cred vars removed.\n credential_file_values = {}\n if 'aws_access_key_id' in new_values:\n credential_file_values['aws_access_key_id'] = new_values.pop(\n 'aws_access_key_id')\n if 'aws_secret_access_key' in new_values:\n credential_file_values['aws_secret_access_key'] = new_values.pop(\n 'aws_secret_access_key')\n if credential_file_values:\n if profile_name is not None:\n credential_file_values['__section__'] = profile_name\n shared_credentials_filename = os.path.expanduser(\n self._session.get_config_variable('credentials_file'))\n self._config_writer.update_config(\n credential_file_values,\n 
shared_credentials_filename)\n", "path": "awscli/customizations/configure/configure.py"}], "after_files": [{"content": "# Copyright 2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"). You\n# may not use this file except in compliance with the License. A copy of\n# the License is located at\n#\n# http://aws.amazon.com/apache2.0/\n#\n# or in the \"license\" file accompanying this file. This file is\n# distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF\n# ANY KIND, either express or implied. See the License for the specific\n# language governing permissions and limitations under the License.\nimport os\nimport logging\n\nfrom botocore.exceptions import ProfileNotFound\n\nfrom awscli.compat import compat_input\nfrom awscli.customizations.commands import BasicCommand\nfrom awscli.customizations.configure.addmodel import AddModelCommand\nfrom awscli.customizations.configure.set import ConfigureSetCommand\nfrom awscli.customizations.configure.get import ConfigureGetCommand\nfrom awscli.customizations.configure.list import ConfigureListCommand\nfrom awscli.customizations.configure.writer import ConfigFileWriter\n\nfrom . import mask_value, profile_to_section\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef register_configure_cmd(cli):\n cli.register('building-command-table.main',\n ConfigureCommand.add_command)\n\n\nclass InteractivePrompter(object):\n\n def get_value(self, current_value, config_name, prompt_text=''):\n if config_name in ('aws_access_key_id', 'aws_secret_access_key'):\n current_value = mask_value(current_value)\n response = compat_input(\"%s [%s]: \" % (prompt_text, current_value))\n if not response:\n # If the user hits enter, we return a value of None\n # instead of an empty string. 
That way we can determine\n # whether or not a value has changed.\n response = None\n return response\n\n\nclass ConfigureCommand(BasicCommand):\n NAME = 'configure'\n DESCRIPTION = BasicCommand.FROM_FILE()\n SYNOPSIS = ('aws configure [--profile profile-name]')\n EXAMPLES = (\n 'To create a new configuration::\\n'\n '\\n'\n ' $ aws configure\\n'\n ' AWS Access Key ID [None]: accesskey\\n'\n ' AWS Secret Access Key [None]: secretkey\\n'\n ' Default region name [None]: us-west-2\\n'\n ' Default output format [None]:\\n'\n '\\n'\n 'To update just the region name::\\n'\n '\\n'\n ' $ aws configure\\n'\n ' AWS Access Key ID [****]:\\n'\n ' AWS Secret Access Key [****]:\\n'\n ' Default region name [us-west-1]: us-west-2\\n'\n ' Default output format [None]:\\n'\n )\n SUBCOMMANDS = [\n {'name': 'list', 'command_class': ConfigureListCommand},\n {'name': 'get', 'command_class': ConfigureGetCommand},\n {'name': 'set', 'command_class': ConfigureSetCommand},\n {'name': 'add-model', 'command_class': AddModelCommand}\n ]\n\n # If you want to add new values to prompt, update this list here.\n VALUES_TO_PROMPT = [\n # (logical_name, config_name, prompt_text)\n ('aws_access_key_id', \"AWS Access Key ID\"),\n ('aws_secret_access_key', \"AWS Secret Access Key\"),\n ('region', \"Default region name\"),\n ('output', \"Default output format\"),\n ]\n\n def __init__(self, session, prompter=None, config_writer=None):\n super(ConfigureCommand, self).__init__(session)\n if prompter is None:\n prompter = InteractivePrompter()\n self._prompter = prompter\n if config_writer is None:\n config_writer = ConfigFileWriter()\n self._config_writer = config_writer\n\n def _run_main(self, parsed_args, parsed_globals):\n # Called when invoked with no args \"aws configure\"\n new_values = {}\n # This is the config from the config file scoped to a specific\n # profile.\n try:\n config = self._session.get_scoped_config()\n except ProfileNotFound:\n config = {}\n for config_name, prompt_text in self.VALUES_TO_PROMPT:\n current_value = config.get(config_name)\n new_value = self._prompter.get_value(current_value, config_name,\n prompt_text)\n if new_value is not None and new_value != current_value:\n new_values[config_name] = new_value\n config_filename = os.path.expanduser(\n self._session.get_config_variable('config_file'))\n if new_values:\n profile = self._session.profile\n self._write_out_creds_file_values(new_values, profile)\n if profile is not None:\n section = profile_to_section(profile)\n new_values['__section__'] = section\n self._config_writer.update_config(new_values, config_filename)\n\n def _write_out_creds_file_values(self, new_values, profile_name):\n # The access_key/secret_key are now *always* written to the shared\n # credentials file (~/.aws/credentials), see aws/aws-cli#847.\n # post-conditions: ~/.aws/credentials will have the updated credential\n # file values and new_values will have the cred vars removed.\n credential_file_values = {}\n if 'aws_access_key_id' in new_values:\n credential_file_values['aws_access_key_id'] = new_values.pop(\n 'aws_access_key_id')\n if 'aws_secret_access_key' in new_values:\n credential_file_values['aws_secret_access_key'] = new_values.pop(\n 'aws_secret_access_key')\n if credential_file_values:\n if profile_name is not None:\n credential_file_values['__section__'] = profile_name\n shared_credentials_filename = os.path.expanduser(\n self._session.get_config_variable('credentials_file'))\n self._config_writer.update_config(\n credential_file_values,\n 
shared_credentials_filename)\n", "path": "awscli/customizations/configure/configure.py"}]}
| 2,042 | 193 |
gh_patches_debug_1376 | rasdani/github-patches | git_diff | flairNLP__flair-419 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Logging overwrite less sweeping
To be removed, once it is done: Please add the appropriate label to this ticket, e.g. feature or enhancement.
**Is your feature/enhancement request related to a problem? Please describe.**
When using flair in other applications, the fact that it disables existing logs in `__init__.py` can be detrimental. For instance when wrapping it up as a component in a tool like rasa_nlu, importing flair overrides all logging except its own, breaking functionality in rasa_nlu.
This is the [line that does so ](https://github.com/zalandoresearch/flair/blob/c2bb0d8776f25493a5b994dcd89a96f71ac175b8/flair/__init__.py#L13) and it was done on purpose to disable BERT logging in #282 .
**Describe the solution you'd like**
Ideally, the problem of disabling logging from certain known dependencies should be much more limited in scope. Importing flair as a package shouldn't disable all the other loggers. At a minimum, perhaps the logging could only *optionally* disable all existing logs
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flair/__init__.py`
Content:
```
1 import torch
2
3 from . import data
4 from . import models
5 from . import visual
6 from . import trainers
7
8 import logging.config
9
10
11 logging.config.dictConfig({
12 'version': 1,
13 'disable_existing_loggers': True,
14 'formatters': {
15 'standard': {
16 'format': '%(asctime)-15s %(message)s'
17 },
18 },
19 'handlers': {
20 'console': {
21 'level': 'INFO',
22 'class': 'logging.StreamHandler',
23 'formatter': 'standard',
24 'stream': 'ext://sys.stdout'
25 },
26 },
27 'loggers': {
28 'flair': {
29 'handlers': ['console'],
30 'level': 'INFO',
31 'propagate': False
32 }
33 },
34 'root': {
35 'handlers': ['console'],
36 'level': 'WARNING'
37 }
38 })
39
40 logger = logging.getLogger('flair')
41
42
43 device = None
44 if torch.cuda.is_available():
45 device = torch.device('cuda:0')
46 else:
47 device = torch.device('cpu')
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/flair/__init__.py b/flair/__init__.py
--- a/flair/__init__.py
+++ b/flair/__init__.py
@@ -10,7 +10,7 @@
logging.config.dictConfig({
'version': 1,
- 'disable_existing_loggers': True,
+ 'disable_existing_loggers': False,
'formatters': {
'standard': {
'format': '%(asctime)-15s %(message)s'
|
{"golden_diff": "diff --git a/flair/__init__.py b/flair/__init__.py\n--- a/flair/__init__.py\n+++ b/flair/__init__.py\n@@ -10,7 +10,7 @@\n \n logging.config.dictConfig({\n 'version': 1,\n- 'disable_existing_loggers': True,\n+ 'disable_existing_loggers': False,\n 'formatters': {\n 'standard': {\n 'format': '%(asctime)-15s %(message)s'\n", "issue": "Logging overwrite less sweeping\nTo be removed, once it is done: Please add the appropriate label to this ticket, e.g. feature or enhancement.\r\n\r\n**Is your feature/enhancement request related to a problem? Please describe.**\r\nWhen using flair in other applications, the fact that it disables existing logs in `__init__.py` can be detrimental. For instance when wrapping it up as a component in a tool like rasa_nlu, importing flair overrides all logging except its own, breaking functionality in rasa_nlu.\r\n\r\nThis is the [line that does so ](https://github.com/zalandoresearch/flair/blob/c2bb0d8776f25493a5b994dcd89a96f71ac175b8/flair/__init__.py#L13) and it was done on purpose to disable BERT logging in #282 .\r\n\r\n**Describe the solution you'd like**\r\nIdeally, the problem of disabling logging from certain known dependencies should be much more limited in scope. Importing flair as a package shouldn't disable all the other loggers. At a minimum, perhaps the logging could only *optionally* disable all existing logs\r\n\n", "before_files": [{"content": "import torch\n\nfrom . import data\nfrom . import models\nfrom . import visual\nfrom . import trainers\n\nimport logging.config\n\n\nlogging.config.dictConfig({\n 'version': 1,\n 'disable_existing_loggers': True,\n 'formatters': {\n 'standard': {\n 'format': '%(asctime)-15s %(message)s'\n },\n },\n 'handlers': {\n 'console': {\n 'level': 'INFO',\n 'class': 'logging.StreamHandler',\n 'formatter': 'standard',\n 'stream': 'ext://sys.stdout'\n },\n },\n 'loggers': {\n 'flair': {\n 'handlers': ['console'],\n 'level': 'INFO',\n 'propagate': False\n }\n },\n 'root': {\n 'handlers': ['console'],\n 'level': 'WARNING'\n }\n})\n\nlogger = logging.getLogger('flair')\n\n\ndevice = None\nif torch.cuda.is_available():\n device = torch.device('cuda:0')\nelse:\n device = torch.device('cpu')\n", "path": "flair/__init__.py"}], "after_files": [{"content": "import torch\n\nfrom . import data\nfrom . import models\nfrom . import visual\nfrom . import trainers\n\nimport logging.config\n\n\nlogging.config.dictConfig({\n 'version': 1,\n 'disable_existing_loggers': False,\n 'formatters': {\n 'standard': {\n 'format': '%(asctime)-15s %(message)s'\n },\n },\n 'handlers': {\n 'console': {\n 'level': 'INFO',\n 'class': 'logging.StreamHandler',\n 'formatter': 'standard',\n 'stream': 'ext://sys.stdout'\n },\n },\n 'loggers': {\n 'flair': {\n 'handlers': ['console'],\n 'level': 'INFO',\n 'propagate': False\n }\n },\n 'root': {\n 'handlers': ['console'],\n 'level': 'WARNING'\n }\n})\n\nlogger = logging.getLogger('flair')\n\n\ndevice = None\nif torch.cuda.is_available():\n device = torch.device('cuda:0')\nelse:\n device = torch.device('cpu')\n", "path": "flair/__init__.py"}]}
| 832 | 107 |
gh_patches_debug_32968 | rasdani/github-patches | git_diff | ipython__ipython-11979 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Incompatibility with prompt_toolkit v3.0.0
I noticed the following error today:
```
def pt_init(self):
def get_prompt_tokens():
return [(Token.Prompt, self.prompt)]
if self._ptcomp is None:
compl = IPCompleter(shell=self.shell,
namespace={},
global_namespace={},
parent=self.shell,
)
self._ptcomp = IPythonPTCompleter(compl)
kb = KeyBindings()
supports_suspend = Condition(lambda: hasattr(signal, 'SIGTSTP'))
kb.add('c-z', filter=supports_suspend)(suspend_to_bg)
if self.shell.display_completions == 'readlinelike':
kb.add('tab', filter=(has_focus(DEFAULT_BUFFER)
& ~has_selection
& vi_insert_mode | emacs_insert_mode
& ~cursor_in_leading_ws
))(display_completions_like_readline)
self.pt_app = PromptSession(
message=(lambda: PygmentsTokens(get_prompt_tokens())),
editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),
key_bindings=kb,
history=self.shell.debugger_history,
completer=self._ptcomp,
enable_history_search=True,
mouse_support=self.shell.mouse_support,
complete_style=self.shell.pt_complete_style,
style=self.shell.style,
inputhook=self.shell.inputhook,
> color_depth=self.shell.color_depth,
)
E TypeError: __init__() got an unexpected keyword argument 'inputhook'
```
`PromptSession` in `prompt_toolkit` v3 does not have an `inputhook` parameter - https://github.com/prompt-toolkit/python-prompt-toolkit/blob/4cbbf8b9db1cb11caa1d72f4200c5cbc48bfd384/prompt_toolkit/shortcuts/prompt.py#L340-L383
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/terminal/debugger.py`
Content:
```
1 import signal
2 import sys
3
4 from IPython.core.debugger import Pdb
5
6 from IPython.core.completer import IPCompleter
7 from .ptutils import IPythonPTCompleter
8 from .shortcuts import suspend_to_bg, cursor_in_leading_ws
9
10 from prompt_toolkit.enums import DEFAULT_BUFFER
11 from prompt_toolkit.filters import (Condition, has_focus, has_selection,
12 vi_insert_mode, emacs_insert_mode)
13 from prompt_toolkit.key_binding import KeyBindings
14 from prompt_toolkit.key_binding.bindings.completion import display_completions_like_readline
15 from pygments.token import Token
16 from prompt_toolkit.shortcuts.prompt import PromptSession
17 from prompt_toolkit.enums import EditingMode
18 from prompt_toolkit.formatted_text import PygmentsTokens
19
20
21 class TerminalPdb(Pdb):
22 """Standalone IPython debugger."""
23
24 def __init__(self, *args, **kwargs):
25 Pdb.__init__(self, *args, **kwargs)
26 self._ptcomp = None
27 self.pt_init()
28
29 def pt_init(self):
30 def get_prompt_tokens():
31 return [(Token.Prompt, self.prompt)]
32
33 if self._ptcomp is None:
34 compl = IPCompleter(shell=self.shell,
35 namespace={},
36 global_namespace={},
37 parent=self.shell,
38 )
39 self._ptcomp = IPythonPTCompleter(compl)
40
41 kb = KeyBindings()
42 supports_suspend = Condition(lambda: hasattr(signal, 'SIGTSTP'))
43 kb.add('c-z', filter=supports_suspend)(suspend_to_bg)
44
45 if self.shell.display_completions == 'readlinelike':
46 kb.add('tab', filter=(has_focus(DEFAULT_BUFFER)
47 & ~has_selection
48 & vi_insert_mode | emacs_insert_mode
49 & ~cursor_in_leading_ws
50 ))(display_completions_like_readline)
51
52 self.pt_app = PromptSession(
53 message=(lambda: PygmentsTokens(get_prompt_tokens())),
54 editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),
55 key_bindings=kb,
56 history=self.shell.debugger_history,
57 completer=self._ptcomp,
58 enable_history_search=True,
59 mouse_support=self.shell.mouse_support,
60 complete_style=self.shell.pt_complete_style,
61 style=self.shell.style,
62 inputhook=self.shell.inputhook,
63 color_depth=self.shell.color_depth,
64 )
65
66 def cmdloop(self, intro=None):
67 """Repeatedly issue a prompt, accept input, parse an initial prefix
68 off the received input, and dispatch to action methods, passing them
69 the remainder of the line as argument.
70
71 override the same methods from cmd.Cmd to provide prompt toolkit replacement.
72 """
73 if not self.use_rawinput:
74 raise ValueError('Sorry ipdb does not support use_rawinput=False')
75
76 self.preloop()
77
78 try:
79 if intro is not None:
80 self.intro = intro
81 if self.intro:
82 self.stdout.write(str(self.intro)+"\n")
83 stop = None
84 while not stop:
85 if self.cmdqueue:
86 line = self.cmdqueue.pop(0)
87 else:
88 self._ptcomp.ipy_completer.namespace = self.curframe_locals
89 self._ptcomp.ipy_completer.global_namespace = self.curframe.f_globals
90 try:
91 line = self.pt_app.prompt() # reset_current_buffer=True)
92 except EOFError:
93 line = 'EOF'
94 line = self.precmd(line)
95 stop = self.onecmd(line)
96 stop = self.postcmd(stop, line)
97 self.postloop()
98 except Exception:
99 raise
100
101
102 def set_trace(frame=None):
103 """
104 Start debugging from `frame`.
105
106 If frame is not specified, debugging starts from caller's frame.
107 """
108 TerminalPdb().set_trace(frame or sys._getframe().f_back)
109
110
111 if __name__ == '__main__':
112 import pdb
113 # IPython.core.debugger.Pdb.trace_dispatch shall not catch
114 # bdb.BdbQuit. When started through __main__ and an exception
115 # happened after hitting "c", this is needed in order to
116 # be able to quit the debugging session (see #9950).
117 old_trace_dispatch = pdb.Pdb.trace_dispatch
118 pdb.Pdb = TerminalPdb
119 pdb.Pdb.trace_dispatch = old_trace_dispatch
120 pdb.main()
121
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/IPython/terminal/debugger.py b/IPython/terminal/debugger.py
--- a/IPython/terminal/debugger.py
+++ b/IPython/terminal/debugger.py
@@ -17,6 +17,9 @@
from prompt_toolkit.enums import EditingMode
from prompt_toolkit.formatted_text import PygmentsTokens
+from prompt_toolkit import __version__ as ptk_version
+PTK3 = ptk_version.startswith('3.')
+
class TerminalPdb(Pdb):
"""Standalone IPython debugger."""
@@ -49,20 +52,23 @@
& ~cursor_in_leading_ws
))(display_completions_like_readline)
- self.pt_app = PromptSession(
- message=(lambda: PygmentsTokens(get_prompt_tokens())),
- editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),
- key_bindings=kb,
- history=self.shell.debugger_history,
- completer=self._ptcomp,
- enable_history_search=True,
- mouse_support=self.shell.mouse_support,
- complete_style=self.shell.pt_complete_style,
- style=self.shell.style,
- inputhook=self.shell.inputhook,
- color_depth=self.shell.color_depth,
+ options = dict(
+ message=(lambda: PygmentsTokens(get_prompt_tokens())),
+ editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),
+ key_bindings=kb,
+ history=self.shell.debugger_history,
+ completer=self._ptcomp,
+ enable_history_search=True,
+ mouse_support=self.shell.mouse_support,
+ complete_style=self.shell.pt_complete_style,
+ style=self.shell.style,
+ color_depth=self.shell.color_depth,
)
+ if not PTK3:
+ options['inputhook'] = self.inputhook
+ self.pt_app = PromptSession(**options)
+
def cmdloop(self, intro=None):
"""Repeatedly issue a prompt, accept input, parse an initial prefix
off the received input, and dispatch to action methods, passing them
|
{"golden_diff": "diff --git a/IPython/terminal/debugger.py b/IPython/terminal/debugger.py\n--- a/IPython/terminal/debugger.py\n+++ b/IPython/terminal/debugger.py\n@@ -17,6 +17,9 @@\n from prompt_toolkit.enums import EditingMode\n from prompt_toolkit.formatted_text import PygmentsTokens\n \n+from prompt_toolkit import __version__ as ptk_version\n+PTK3 = ptk_version.startswith('3.')\n+\n \n class TerminalPdb(Pdb):\n \"\"\"Standalone IPython debugger.\"\"\"\n@@ -49,20 +52,23 @@\n & ~cursor_in_leading_ws\n ))(display_completions_like_readline)\n \n- self.pt_app = PromptSession(\n- message=(lambda: PygmentsTokens(get_prompt_tokens())),\n- editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),\n- key_bindings=kb,\n- history=self.shell.debugger_history,\n- completer=self._ptcomp,\n- enable_history_search=True,\n- mouse_support=self.shell.mouse_support,\n- complete_style=self.shell.pt_complete_style,\n- style=self.shell.style,\n- inputhook=self.shell.inputhook,\n- color_depth=self.shell.color_depth,\n+ options = dict(\n+ message=(lambda: PygmentsTokens(get_prompt_tokens())),\n+ editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),\n+ key_bindings=kb,\n+ history=self.shell.debugger_history,\n+ completer=self._ptcomp,\n+ enable_history_search=True,\n+ mouse_support=self.shell.mouse_support,\n+ complete_style=self.shell.pt_complete_style,\n+ style=self.shell.style,\n+ color_depth=self.shell.color_depth,\n )\n \n+ if not PTK3:\n+ options['inputhook'] = self.inputhook\n+ self.pt_app = PromptSession(**options)\n+\n def cmdloop(self, intro=None):\n \"\"\"Repeatedly issue a prompt, accept input, parse an initial prefix\n off the received input, and dispatch to action methods, passing them\n", "issue": "Incompatibility with prompt_toolkit v3.0.0\nI noticed the following error today:\r\n\r\n```\r\n def pt_init(self):\r\n def get_prompt_tokens():\r\n return [(Token.Prompt, self.prompt)]\r\n\r\n if self._ptcomp is None:\r\n compl = IPCompleter(shell=self.shell,\r\n namespace={},\r\n global_namespace={},\r\n parent=self.shell,\r\n )\r\n self._ptcomp = IPythonPTCompleter(compl)\r\n\r\n kb = KeyBindings()\r\n supports_suspend = Condition(lambda: hasattr(signal, 'SIGTSTP'))\r\n kb.add('c-z', filter=supports_suspend)(suspend_to_bg)\r\n\r\n if self.shell.display_completions == 'readlinelike':\r\n kb.add('tab', filter=(has_focus(DEFAULT_BUFFER)\r\n & ~has_selection\r\n & vi_insert_mode | emacs_insert_mode\r\n & ~cursor_in_leading_ws\r\n ))(display_completions_like_readline)\r\n\r\n self.pt_app = PromptSession(\r\n message=(lambda: PygmentsTokens(get_prompt_tokens())),\r\n editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),\r\n key_bindings=kb,\r\n history=self.shell.debugger_history,\r\n completer=self._ptcomp,\r\n enable_history_search=True,\r\n mouse_support=self.shell.mouse_support,\r\n complete_style=self.shell.pt_complete_style,\r\n style=self.shell.style,\r\n inputhook=self.shell.inputhook,\r\n> color_depth=self.shell.color_depth,\r\n )\r\nE TypeError: __init__() got an unexpected keyword argument 'inputhook'\r\n```\r\n`PromptSession` in `prompt_toolkit` v3 does not have an `inputhook` parameter - https://github.com/prompt-toolkit/python-prompt-toolkit/blob/4cbbf8b9db1cb11caa1d72f4200c5cbc48bfd384/prompt_toolkit/shortcuts/prompt.py#L340-L383\n", "before_files": [{"content": "import signal\nimport sys\n\nfrom IPython.core.debugger import Pdb\n\nfrom IPython.core.completer import IPCompleter\nfrom .ptutils import IPythonPTCompleter\nfrom .shortcuts import suspend_to_bg, 
cursor_in_leading_ws\n\nfrom prompt_toolkit.enums import DEFAULT_BUFFER\nfrom prompt_toolkit.filters import (Condition, has_focus, has_selection,\n vi_insert_mode, emacs_insert_mode)\nfrom prompt_toolkit.key_binding import KeyBindings\nfrom prompt_toolkit.key_binding.bindings.completion import display_completions_like_readline\nfrom pygments.token import Token\nfrom prompt_toolkit.shortcuts.prompt import PromptSession\nfrom prompt_toolkit.enums import EditingMode\nfrom prompt_toolkit.formatted_text import PygmentsTokens\n\n\nclass TerminalPdb(Pdb):\n \"\"\"Standalone IPython debugger.\"\"\"\n\n def __init__(self, *args, **kwargs):\n Pdb.__init__(self, *args, **kwargs)\n self._ptcomp = None\n self.pt_init()\n\n def pt_init(self):\n def get_prompt_tokens():\n return [(Token.Prompt, self.prompt)]\n\n if self._ptcomp is None:\n compl = IPCompleter(shell=self.shell,\n namespace={},\n global_namespace={},\n parent=self.shell,\n )\n self._ptcomp = IPythonPTCompleter(compl)\n\n kb = KeyBindings()\n supports_suspend = Condition(lambda: hasattr(signal, 'SIGTSTP'))\n kb.add('c-z', filter=supports_suspend)(suspend_to_bg)\n\n if self.shell.display_completions == 'readlinelike':\n kb.add('tab', filter=(has_focus(DEFAULT_BUFFER)\n & ~has_selection\n & vi_insert_mode | emacs_insert_mode\n & ~cursor_in_leading_ws\n ))(display_completions_like_readline)\n\n self.pt_app = PromptSession(\n message=(lambda: PygmentsTokens(get_prompt_tokens())),\n editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),\n key_bindings=kb,\n history=self.shell.debugger_history,\n completer=self._ptcomp,\n enable_history_search=True,\n mouse_support=self.shell.mouse_support,\n complete_style=self.shell.pt_complete_style,\n style=self.shell.style,\n inputhook=self.shell.inputhook,\n color_depth=self.shell.color_depth,\n )\n\n def cmdloop(self, intro=None):\n \"\"\"Repeatedly issue a prompt, accept input, parse an initial prefix\n off the received input, and dispatch to action methods, passing them\n the remainder of the line as argument.\n\n override the same methods from cmd.Cmd to provide prompt toolkit replacement.\n \"\"\"\n if not self.use_rawinput:\n raise ValueError('Sorry ipdb does not support use_rawinput=False')\n\n self.preloop()\n\n try:\n if intro is not None:\n self.intro = intro\n if self.intro:\n self.stdout.write(str(self.intro)+\"\\n\")\n stop = None\n while not stop:\n if self.cmdqueue:\n line = self.cmdqueue.pop(0)\n else:\n self._ptcomp.ipy_completer.namespace = self.curframe_locals\n self._ptcomp.ipy_completer.global_namespace = self.curframe.f_globals\n try:\n line = self.pt_app.prompt() # reset_current_buffer=True)\n except EOFError:\n line = 'EOF'\n line = self.precmd(line)\n stop = self.onecmd(line)\n stop = self.postcmd(stop, line)\n self.postloop()\n except Exception:\n raise\n\n\ndef set_trace(frame=None):\n \"\"\"\n Start debugging from `frame`.\n\n If frame is not specified, debugging starts from caller's frame.\n \"\"\"\n TerminalPdb().set_trace(frame or sys._getframe().f_back)\n\n\nif __name__ == '__main__':\n import pdb\n # IPython.core.debugger.Pdb.trace_dispatch shall not catch\n # bdb.BdbQuit. 
When started through __main__ and an exception\n # happened after hitting \"c\", this is needed in order to\n # be able to quit the debugging session (see #9950).\n old_trace_dispatch = pdb.Pdb.trace_dispatch\n pdb.Pdb = TerminalPdb\n pdb.Pdb.trace_dispatch = old_trace_dispatch\n pdb.main()\n", "path": "IPython/terminal/debugger.py"}], "after_files": [{"content": "import signal\nimport sys\n\nfrom IPython.core.debugger import Pdb\n\nfrom IPython.core.completer import IPCompleter\nfrom .ptutils import IPythonPTCompleter\nfrom .shortcuts import suspend_to_bg, cursor_in_leading_ws\n\nfrom prompt_toolkit.enums import DEFAULT_BUFFER\nfrom prompt_toolkit.filters import (Condition, has_focus, has_selection,\n vi_insert_mode, emacs_insert_mode)\nfrom prompt_toolkit.key_binding import KeyBindings\nfrom prompt_toolkit.key_binding.bindings.completion import display_completions_like_readline\nfrom pygments.token import Token\nfrom prompt_toolkit.shortcuts.prompt import PromptSession\nfrom prompt_toolkit.enums import EditingMode\nfrom prompt_toolkit.formatted_text import PygmentsTokens\n\nfrom prompt_toolkit import __version__ as ptk_version\nPTK3 = ptk_version.startswith('3.')\n\n\nclass TerminalPdb(Pdb):\n \"\"\"Standalone IPython debugger.\"\"\"\n\n def __init__(self, *args, **kwargs):\n Pdb.__init__(self, *args, **kwargs)\n self._ptcomp = None\n self.pt_init()\n\n def pt_init(self):\n def get_prompt_tokens():\n return [(Token.Prompt, self.prompt)]\n\n if self._ptcomp is None:\n compl = IPCompleter(shell=self.shell,\n namespace={},\n global_namespace={},\n parent=self.shell,\n )\n self._ptcomp = IPythonPTCompleter(compl)\n\n kb = KeyBindings()\n supports_suspend = Condition(lambda: hasattr(signal, 'SIGTSTP'))\n kb.add('c-z', filter=supports_suspend)(suspend_to_bg)\n\n if self.shell.display_completions == 'readlinelike':\n kb.add('tab', filter=(has_focus(DEFAULT_BUFFER)\n & ~has_selection\n & vi_insert_mode | emacs_insert_mode\n & ~cursor_in_leading_ws\n ))(display_completions_like_readline)\n\n options = dict(\n message=(lambda: PygmentsTokens(get_prompt_tokens())),\n editing_mode=getattr(EditingMode, self.shell.editing_mode.upper()),\n key_bindings=kb,\n history=self.shell.debugger_history,\n completer=self._ptcomp,\n enable_history_search=True,\n mouse_support=self.shell.mouse_support,\n complete_style=self.shell.pt_complete_style,\n style=self.shell.style,\n color_depth=self.shell.color_depth,\n )\n\n if not PTK3:\n options['inputhook'] = self.inputhook\n self.pt_app = PromptSession(**options)\n\n def cmdloop(self, intro=None):\n \"\"\"Repeatedly issue a prompt, accept input, parse an initial prefix\n off the received input, and dispatch to action methods, passing them\n the remainder of the line as argument.\n\n override the same methods from cmd.Cmd to provide prompt toolkit replacement.\n \"\"\"\n if not self.use_rawinput:\n raise ValueError('Sorry ipdb does not support use_rawinput=False')\n\n self.preloop()\n\n try:\n if intro is not None:\n self.intro = intro\n if self.intro:\n self.stdout.write(str(self.intro)+\"\\n\")\n stop = None\n while not stop:\n if self.cmdqueue:\n line = self.cmdqueue.pop(0)\n else:\n self._ptcomp.ipy_completer.namespace = self.curframe_locals\n self._ptcomp.ipy_completer.global_namespace = self.curframe.f_globals\n try:\n line = self.pt_app.prompt() # reset_current_buffer=True)\n except EOFError:\n line = 'EOF'\n line = self.precmd(line)\n stop = self.onecmd(line)\n stop = self.postcmd(stop, line)\n self.postloop()\n except Exception:\n raise\n\n\ndef 
set_trace(frame=None):\n \"\"\"\n Start debugging from `frame`.\n\n If frame is not specified, debugging starts from caller's frame.\n \"\"\"\n TerminalPdb().set_trace(frame or sys._getframe().f_back)\n\n\nif __name__ == '__main__':\n import pdb\n # IPython.core.debugger.Pdb.trace_dispatch shall not catch\n # bdb.BdbQuit. When started through __main__ and an exception\n # happened after hitting \"c\", this is needed in order to\n # be able to quit the debugging session (see #9950).\n old_trace_dispatch = pdb.Pdb.trace_dispatch\n pdb.Pdb = TerminalPdb\n pdb.Pdb.trace_dispatch = old_trace_dispatch\n pdb.main()\n", "path": "IPython/terminal/debugger.py"}]}
| 1,853 | 444 |
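
A side note on the prompt_toolkit record above: the golden diff builds the keyword arguments in a plain dict and only adds `inputhook` when a 2.x prompt_toolkit is detected, because the 3.x `PromptSession` constructor no longer accepts that argument. The sketch below shows just that version-gating pattern; `make_prompt_kwargs` and its option set are hypothetical, and the code deliberately avoids importing prompt_toolkit or IPython so it stays runnable anywhere.

```python
# Minimal sketch of gating a keyword argument on the prompt_toolkit major version.
# `make_prompt_kwargs` is illustrative only; it is not part of IPython or prompt_toolkit.

def make_prompt_kwargs(ptk_version: str, inputhook=None) -> dict:
    """Build PromptSession-style kwargs, forwarding `inputhook` only for 2.x."""
    ptk3 = ptk_version.startswith("3.")
    options = {
        "enable_history_search": True,  # an option accepted by both major versions
    }
    if not ptk3 and inputhook is not None:
        # prompt_toolkit 3.x removed the `inputhook` constructor argument,
        # so it is only forwarded when a 2.x version string is seen.
        options["inputhook"] = inputhook
    return options


if __name__ == "__main__":
    def noop_hook(context):
        pass

    print(make_prompt_kwargs("2.0.10", inputhook=noop_hook))  # includes 'inputhook'
    print(make_prompt_kwargs("3.0.0", inputhook=noop_hook))   # omits 'inputhook'
```

In the actual fix the assembled dict is unpacked with `PromptSession(**options)`, so 3.x installations never see the removed argument.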
gh_patches_debug_29729
|
rasdani/github-patches
|
git_diff
|
open-mmlab__mmdetection-10568
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
TypeError: list indices must be integers or slices, not str
When I run the demo code **video_gpuaccel_demo.py**, it has the following error. How to solve it, thanks.
Traceback (most recent call last):
File "demo/video_gpuaccel_demo.py", line 147, in <module>
main()
File "demo/video_gpuaccel_demo.py", line 102, in main
batch_input_shape = prefetch_batch_input_shape(
File "demo/video_gpuaccel_demo.py", line 60, in prefetch_batch_input_shape
_, data_sample = model.data_preprocessor([data], False)
File "C:\Anaconda\Anaconda\envs\mmdetection\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
return forward_call(*input, **kwargs)
File "c:\research\programmar\deeplearning\vehicle_classification\mmdet\models\data_preprocessors\data_preprocessor.py", line 121, in forward
batch_pad_shape = self._get_pad_shape(data)
File "c:\research\programmar\deeplearning\vehicle_classification\mmdet\models\data_preprocessors\data_preprocessor.py", line 154, in _get_pad_shape
_batch_inputs = data['inputs']
TypeError: list indices must be integers or slices, not str
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `demo/video_gpuaccel_demo.py`
Content:
```
1 # Copyright (c) OpenMMLab. All rights reserved.
2 import argparse
3 from typing import Tuple
4
5 import cv2
6 import mmcv
7 import numpy as np
8 import torch
9 import torch.nn as nn
10 from mmcv.transforms import Compose
11 from mmengine.utils import track_iter_progress
12
13 from mmdet.apis import init_detector
14 from mmdet.registry import VISUALIZERS
15 from mmdet.structures import DetDataSample
16
17 try:
18 import ffmpegcv
19 except ImportError:
20 raise ImportError(
21 'Please install ffmpegcv with:\n\n pip install ffmpegcv')
22
23
24 def parse_args():
25 parser = argparse.ArgumentParser(
26 description='MMDetection video demo with GPU acceleration')
27 parser.add_argument('video', help='Video file')
28 parser.add_argument('config', help='Config file')
29 parser.add_argument('checkpoint', help='Checkpoint file')
30 parser.add_argument(
31 '--device', default='cuda:0', help='Device used for inference')
32 parser.add_argument(
33 '--score-thr', type=float, default=0.3, help='Bbox score threshold')
34 parser.add_argument('--out', type=str, help='Output video file')
35 parser.add_argument('--show', action='store_true', help='Show video')
36 parser.add_argument(
37 '--nvdecode', action='store_true', help='Use NVIDIA decoder')
38 parser.add_argument(
39 '--wait-time',
40 type=float,
41 default=1,
42 help='The interval of show (s), 0 is block')
43 args = parser.parse_args()
44 return args
45
46
47 def prefetch_batch_input_shape(model: nn.Module, ori_wh: Tuple[int,
48 int]) -> dict:
49 cfg = model.cfg
50 w, h = ori_wh
51 cfg.test_dataloader.dataset.pipeline[0].type = 'LoadImageFromNDArray'
52 test_pipeline = Compose(cfg.test_dataloader.dataset.pipeline)
53 data = {'img': np.zeros((h, w, 3), dtype=np.uint8), 'img_id': 0}
54 data = test_pipeline(data)
55 _, data_sample = model.data_preprocessor([data], False)
56 batch_input_shape = data_sample[0].batch_input_shape
57 return batch_input_shape
58
59
60 def pack_data(frame_resize: np.ndarray, batch_input_shape: Tuple[int, int],
61 ori_shape: Tuple[int, int]) -> dict:
62 assert frame_resize.shape[:2] == batch_input_shape
63 data_sample = DetDataSample()
64 data_sample.set_metainfo({
65 'img_shape':
66 batch_input_shape,
67 'ori_shape':
68 ori_shape,
69 'scale_factor': (batch_input_shape[0] / ori_shape[0],
70 batch_input_shape[1] / ori_shape[1])
71 })
72 frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1))
73 data = {'inputs': frame_resize, 'data_sample': data_sample}
74 return data
75
76
77 def main():
78 args = parse_args()
79 assert args.out or args.show, \
80 ('Please specify at least one operation (save/show the '
81 'video) with the argument "--out" or "--show"')
82
83 model = init_detector(args.config, args.checkpoint, device=args.device)
84
85 # init visualizer
86 visualizer = VISUALIZERS.build(model.cfg.visualizer)
87 # the dataset_meta is loaded from the checkpoint and
88 # then pass to the model in init_detector
89 visualizer.dataset_meta = model.dataset_meta
90
91 if args.nvdecode:
92 VideoCapture = ffmpegcv.VideoCaptureNV
93 else:
94 VideoCapture = ffmpegcv.VideoCapture
95 video_origin = VideoCapture(args.video)
96
97 batch_input_shape = prefetch_batch_input_shape(
98 model, (video_origin.width, video_origin.height))
99 ori_shape = (video_origin.height, video_origin.width)
100 resize_wh = batch_input_shape[::-1]
101 video_resize = VideoCapture(
102 args.video,
103 resize=resize_wh,
104 resize_keepratio=True,
105 resize_keepratioalign='topleft')
106
107 video_writer = None
108 if args.out:
109 video_writer = ffmpegcv.VideoWriter(args.out, fps=video_origin.fps)
110
111 with torch.no_grad():
112 for i, (frame_resize, frame_origin) in enumerate(
113 zip(track_iter_progress(video_resize), video_origin)):
114 data = pack_data(frame_resize, batch_input_shape, ori_shape)
115 result = model.test_step([data])[0]
116
117 visualizer.add_datasample(
118 name='video',
119 image=frame_origin,
120 data_sample=result,
121 draw_gt=False,
122 show=False,
123 pred_score_thr=args.score_thr)
124
125 frame_mask = visualizer.get_image()
126
127 if args.show:
128 cv2.namedWindow('video', 0)
129 mmcv.imshow(frame_mask, 'video', args.wait_time)
130 if args.out:
131 video_writer.write(frame_mask)
132
133 if video_writer:
134 video_writer.release()
135 video_origin.release()
136 video_resize.release()
137
138 cv2.destroyAllWindows()
139
140
141 if __name__ == '__main__':
142 main()
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/demo/video_gpuaccel_demo.py b/demo/video_gpuaccel_demo.py
--- a/demo/video_gpuaccel_demo.py
+++ b/demo/video_gpuaccel_demo.py
@@ -52,7 +52,9 @@
test_pipeline = Compose(cfg.test_dataloader.dataset.pipeline)
data = {'img': np.zeros((h, w, 3), dtype=np.uint8), 'img_id': 0}
data = test_pipeline(data)
- _, data_sample = model.data_preprocessor([data], False)
+ data['inputs'] = [data['inputs']]
+ data['data_samples'] = [data['data_samples']]
+ data_sample = model.data_preprocessor(data, False)['data_samples']
batch_input_shape = data_sample[0].batch_input_shape
return batch_input_shape
@@ -69,8 +71,8 @@
'scale_factor': (batch_input_shape[0] / ori_shape[0],
batch_input_shape[1] / ori_shape[1])
})
- frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1))
- data = {'inputs': frame_resize, 'data_sample': data_sample}
+ frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1)).cuda()
+ data = {'inputs': [frame_resize], 'data_samples': [data_sample]}
return data
@@ -112,7 +114,7 @@
for i, (frame_resize, frame_origin) in enumerate(
zip(track_iter_progress(video_resize), video_origin)):
data = pack_data(frame_resize, batch_input_shape, ori_shape)
- result = model.test_step([data])[0]
+ result = model.test_step(data)[0]
visualizer.add_datasample(
name='video',
|
{"golden_diff": "diff --git a/demo/video_gpuaccel_demo.py b/demo/video_gpuaccel_demo.py\n--- a/demo/video_gpuaccel_demo.py\n+++ b/demo/video_gpuaccel_demo.py\n@@ -52,7 +52,9 @@\n test_pipeline = Compose(cfg.test_dataloader.dataset.pipeline)\n data = {'img': np.zeros((h, w, 3), dtype=np.uint8), 'img_id': 0}\n data = test_pipeline(data)\n- _, data_sample = model.data_preprocessor([data], False)\n+ data['inputs'] = [data['inputs']]\n+ data['data_samples'] = [data['data_samples']]\n+ data_sample = model.data_preprocessor(data, False)['data_samples']\n batch_input_shape = data_sample[0].batch_input_shape\n return batch_input_shape\n \n@@ -69,8 +71,8 @@\n 'scale_factor': (batch_input_shape[0] / ori_shape[0],\n batch_input_shape[1] / ori_shape[1])\n })\n- frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1))\n- data = {'inputs': frame_resize, 'data_sample': data_sample}\n+ frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1)).cuda()\n+ data = {'inputs': [frame_resize], 'data_samples': [data_sample]}\n return data\n \n \n@@ -112,7 +114,7 @@\n for i, (frame_resize, frame_origin) in enumerate(\n zip(track_iter_progress(video_resize), video_origin)):\n data = pack_data(frame_resize, batch_input_shape, ori_shape)\n- result = model.test_step([data])[0]\n+ result = model.test_step(data)[0]\n \n visualizer.add_datasample(\n name='video',\n", "issue": "TypeError: list indices must be integers or slices, not str\nWhen I run the demo code **video_gpuaccel_demo.py**, it has the following error. How to solve it, thanks.\r\n\r\nTraceback (most recent call last):\r\n File \"demo/video_gpuaccel_demo.py\", line 147, in <module>\r\n main()\r\n File \"demo/video_gpuaccel_demo.py\", line 102, in main\r\n batch_input_shape = prefetch_batch_input_shape(\r\n File \"demo/video_gpuaccel_demo.py\", line 60, in prefetch_batch_input_shape\r\n _, data_sample = model.data_preprocessor([data], False)\r\n File \"C:\\Anaconda\\Anaconda\\envs\\mmdetection\\lib\\site-packages\\torch\\nn\\modules\\module.py\", line 1130, in _call_impl\r\n return forward_call(*input, **kwargs)\r\n File \"c:\\research\\programmar\\deeplearning\\vehicle_classification\\mmdet\\models\\data_preprocessors\\data_preprocessor.py\", line 121, in forward\r\n batch_pad_shape = self._get_pad_shape(data)\r\n File \"c:\\research\\programmar\\deeplearning\\vehicle_classification\\mmdet\\models\\data_preprocessors\\data_preprocessor.py\", line 154, in _get_pad_shape\r\n _batch_inputs = data['inputs']\r\nTypeError: list indices must be integers or slices, not str\r\n\n", "before_files": [{"content": "# Copyright (c) OpenMMLab. 
All rights reserved.\nimport argparse\nfrom typing import Tuple\n\nimport cv2\nimport mmcv\nimport numpy as np\nimport torch\nimport torch.nn as nn\nfrom mmcv.transforms import Compose\nfrom mmengine.utils import track_iter_progress\n\nfrom mmdet.apis import init_detector\nfrom mmdet.registry import VISUALIZERS\nfrom mmdet.structures import DetDataSample\n\ntry:\n import ffmpegcv\nexcept ImportError:\n raise ImportError(\n 'Please install ffmpegcv with:\\n\\n pip install ffmpegcv')\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(\n description='MMDetection video demo with GPU acceleration')\n parser.add_argument('video', help='Video file')\n parser.add_argument('config', help='Config file')\n parser.add_argument('checkpoint', help='Checkpoint file')\n parser.add_argument(\n '--device', default='cuda:0', help='Device used for inference')\n parser.add_argument(\n '--score-thr', type=float, default=0.3, help='Bbox score threshold')\n parser.add_argument('--out', type=str, help='Output video file')\n parser.add_argument('--show', action='store_true', help='Show video')\n parser.add_argument(\n '--nvdecode', action='store_true', help='Use NVIDIA decoder')\n parser.add_argument(\n '--wait-time',\n type=float,\n default=1,\n help='The interval of show (s), 0 is block')\n args = parser.parse_args()\n return args\n\n\ndef prefetch_batch_input_shape(model: nn.Module, ori_wh: Tuple[int,\n int]) -> dict:\n cfg = model.cfg\n w, h = ori_wh\n cfg.test_dataloader.dataset.pipeline[0].type = 'LoadImageFromNDArray'\n test_pipeline = Compose(cfg.test_dataloader.dataset.pipeline)\n data = {'img': np.zeros((h, w, 3), dtype=np.uint8), 'img_id': 0}\n data = test_pipeline(data)\n _, data_sample = model.data_preprocessor([data], False)\n batch_input_shape = data_sample[0].batch_input_shape\n return batch_input_shape\n\n\ndef pack_data(frame_resize: np.ndarray, batch_input_shape: Tuple[int, int],\n ori_shape: Tuple[int, int]) -> dict:\n assert frame_resize.shape[:2] == batch_input_shape\n data_sample = DetDataSample()\n data_sample.set_metainfo({\n 'img_shape':\n batch_input_shape,\n 'ori_shape':\n ori_shape,\n 'scale_factor': (batch_input_shape[0] / ori_shape[0],\n batch_input_shape[1] / ori_shape[1])\n })\n frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1))\n data = {'inputs': frame_resize, 'data_sample': data_sample}\n return data\n\n\ndef main():\n args = parse_args()\n assert args.out or args.show, \\\n ('Please specify at least one operation (save/show the '\n 'video) with the argument \"--out\" or \"--show\"')\n\n model = init_detector(args.config, args.checkpoint, device=args.device)\n\n # init visualizer\n visualizer = VISUALIZERS.build(model.cfg.visualizer)\n # the dataset_meta is loaded from the checkpoint and\n # then pass to the model in init_detector\n visualizer.dataset_meta = model.dataset_meta\n\n if args.nvdecode:\n VideoCapture = ffmpegcv.VideoCaptureNV\n else:\n VideoCapture = ffmpegcv.VideoCapture\n video_origin = VideoCapture(args.video)\n\n batch_input_shape = prefetch_batch_input_shape(\n model, (video_origin.width, video_origin.height))\n ori_shape = (video_origin.height, video_origin.width)\n resize_wh = batch_input_shape[::-1]\n video_resize = VideoCapture(\n args.video,\n resize=resize_wh,\n resize_keepratio=True,\n resize_keepratioalign='topleft')\n\n video_writer = None\n if args.out:\n video_writer = ffmpegcv.VideoWriter(args.out, fps=video_origin.fps)\n\n with torch.no_grad():\n for i, (frame_resize, frame_origin) in enumerate(\n 
zip(track_iter_progress(video_resize), video_origin)):\n data = pack_data(frame_resize, batch_input_shape, ori_shape)\n result = model.test_step([data])[0]\n\n visualizer.add_datasample(\n name='video',\n image=frame_origin,\n data_sample=result,\n draw_gt=False,\n show=False,\n pred_score_thr=args.score_thr)\n\n frame_mask = visualizer.get_image()\n\n if args.show:\n cv2.namedWindow('video', 0)\n mmcv.imshow(frame_mask, 'video', args.wait_time)\n if args.out:\n video_writer.write(frame_mask)\n\n if video_writer:\n video_writer.release()\n video_origin.release()\n video_resize.release()\n\n cv2.destroyAllWindows()\n\n\nif __name__ == '__main__':\n main()\n", "path": "demo/video_gpuaccel_demo.py"}], "after_files": [{"content": "# Copyright (c) OpenMMLab. All rights reserved.\nimport argparse\nfrom typing import Tuple\n\nimport cv2\nimport mmcv\nimport numpy as np\nimport torch\nimport torch.nn as nn\nfrom mmcv.transforms import Compose\nfrom mmengine.utils import track_iter_progress\n\nfrom mmdet.apis import init_detector\nfrom mmdet.registry import VISUALIZERS\nfrom mmdet.structures import DetDataSample\n\ntry:\n import ffmpegcv\nexcept ImportError:\n raise ImportError(\n 'Please install ffmpegcv with:\\n\\n pip install ffmpegcv')\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(\n description='MMDetection video demo with GPU acceleration')\n parser.add_argument('video', help='Video file')\n parser.add_argument('config', help='Config file')\n parser.add_argument('checkpoint', help='Checkpoint file')\n parser.add_argument(\n '--device', default='cuda:0', help='Device used for inference')\n parser.add_argument(\n '--score-thr', type=float, default=0.3, help='Bbox score threshold')\n parser.add_argument('--out', type=str, help='Output video file')\n parser.add_argument('--show', action='store_true', help='Show video')\n parser.add_argument(\n '--nvdecode', action='store_true', help='Use NVIDIA decoder')\n parser.add_argument(\n '--wait-time',\n type=float,\n default=1,\n help='The interval of show (s), 0 is block')\n args = parser.parse_args()\n return args\n\n\ndef prefetch_batch_input_shape(model: nn.Module, ori_wh: Tuple[int,\n int]) -> dict:\n cfg = model.cfg\n w, h = ori_wh\n cfg.test_dataloader.dataset.pipeline[0].type = 'LoadImageFromNDArray'\n test_pipeline = Compose(cfg.test_dataloader.dataset.pipeline)\n data = {'img': np.zeros((h, w, 3), dtype=np.uint8), 'img_id': 0}\n data = test_pipeline(data)\n data['inputs'] = [data['inputs']]\n data['data_samples'] = [data['data_samples']]\n data_sample = model.data_preprocessor(data, False)['data_samples']\n batch_input_shape = data_sample[0].batch_input_shape\n return batch_input_shape\n\n\ndef pack_data(frame_resize: np.ndarray, batch_input_shape: Tuple[int, int],\n ori_shape: Tuple[int, int]) -> dict:\n assert frame_resize.shape[:2] == batch_input_shape\n data_sample = DetDataSample()\n data_sample.set_metainfo({\n 'img_shape':\n batch_input_shape,\n 'ori_shape':\n ori_shape,\n 'scale_factor': (batch_input_shape[0] / ori_shape[0],\n batch_input_shape[1] / ori_shape[1])\n })\n frame_resize = torch.from_numpy(frame_resize).permute((2, 0, 1)).cuda()\n data = {'inputs': [frame_resize], 'data_samples': [data_sample]}\n return data\n\n\ndef main():\n args = parse_args()\n assert args.out or args.show, \\\n ('Please specify at least one operation (save/show the '\n 'video) with the argument \"--out\" or \"--show\"')\n\n model = init_detector(args.config, args.checkpoint, device=args.device)\n\n # init visualizer\n visualizer = 
VISUALIZERS.build(model.cfg.visualizer)\n # the dataset_meta is loaded from the checkpoint and\n # then pass to the model in init_detector\n visualizer.dataset_meta = model.dataset_meta\n\n if args.nvdecode:\n VideoCapture = ffmpegcv.VideoCaptureNV\n else:\n VideoCapture = ffmpegcv.VideoCapture\n video_origin = VideoCapture(args.video)\n\n batch_input_shape = prefetch_batch_input_shape(\n model, (video_origin.width, video_origin.height))\n ori_shape = (video_origin.height, video_origin.width)\n resize_wh = batch_input_shape[::-1]\n video_resize = VideoCapture(\n args.video,\n resize=resize_wh,\n resize_keepratio=True,\n resize_keepratioalign='topleft')\n\n video_writer = None\n if args.out:\n video_writer = ffmpegcv.VideoWriter(args.out, fps=video_origin.fps)\n\n with torch.no_grad():\n for i, (frame_resize, frame_origin) in enumerate(\n zip(track_iter_progress(video_resize), video_origin)):\n data = pack_data(frame_resize, batch_input_shape, ori_shape)\n result = model.test_step(data)[0]\n\n visualizer.add_datasample(\n name='video',\n image=frame_origin,\n data_sample=result,\n draw_gt=False,\n show=False,\n pred_score_thr=args.score_thr)\n\n frame_mask = visualizer.get_image()\n\n if args.show:\n cv2.namedWindow('video', 0)\n mmcv.imshow(frame_mask, 'video', args.wait_time)\n if args.out:\n video_writer.write(frame_mask)\n\n if video_writer:\n video_writer.release()\n video_origin.release()\n video_resize.release()\n\n cv2.destroyAllWindows()\n\n\nif __name__ == '__main__':\n main()\n", "path": "demo/video_gpuaccel_demo.py"}]}
| 1,969 | 395 |
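
A side note on the mmdetection record above: the traceback comes from a batch-format mismatch, since `_get_pad_shape` indexes `data['inputs']` and therefore needs one dict of batched lists rather than a list of per-sample dicts. The sketch below reproduces both call styles against a stub; `fake_get_pad_shape` is a hypothetical stand-in, not mmdetection code, and it only needs NumPy.

```python
import numpy as np


def fake_get_pad_shape(data):
    """Stand-in for the preprocessor step: expects a dict whose 'inputs' is a list of CHW arrays."""
    batch_inputs = data["inputs"]  # indexing a *list* with the string 'inputs' raises TypeError
    return [tuple(img.shape[-2:]) for img in batch_inputs]


frame = np.zeros((3, 608, 608), dtype=np.uint8)

# Old demo call style: a list of per-sample dicts, which reproduces the reported error.
try:
    fake_get_pad_shape([{"inputs": frame}])
except TypeError as err:
    print("list-of-dicts fails:", err)

# Call style matching the golden diff: a single dict holding batched lists.
print("dict-of-lists works:", fake_get_pad_shape({"inputs": [frame]}))
```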
gh_patches_debug_21996
|
rasdani/github-patches
|
git_diff
|
qtile__qtile-2490
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
widget.CheckUpdates on Fedora: show the correct number of updates
widget.CheckUpdates on Fedora shows -2 updates when no updates found.
Excerpt from my config.py:
```
widget.CheckUpdates(
distro='Fedora',
display_format=' {updates} updates',
colour_have_updates=colors[3],
no_update_string=' no update',
update_interval=1800,
colour_no_updates=colors[5],
background=colors[8],
),
```
# Qtile version
0.17.1dev
# distro
Fedora 34
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/check_updates.py`
Content:
```
1 # Copyright (c) 2015 Ali Mousavi
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a copy
4 # of this software and associated documentation files (the "Software"), to deal
5 # in the Software without restriction, including without limitation the rights
6 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
7 # copies of the Software, and to permit persons to whom the Software is
8 # furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
18 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
19 # SOFTWARE.
20
21 import os
22 from subprocess import CalledProcessError, Popen
23
24 from libqtile.log_utils import logger
25 from libqtile.widget import base
26
27
28 class CheckUpdates(base.ThreadPoolText):
29 """Shows number of pending updates in different unix systems"""
30 orientations = base.ORIENTATION_HORIZONTAL
31 defaults = [
32 ("distro", "Arch", "Name of your distribution"),
33 ("custom_command", None, "Custom shell command for checking updates (counts the lines of the output)"),
34 ("custom_command_modify", (lambda x: x), "Lambda function to modify line count from custom_command"),
35 ("update_interval", 60, "Update interval in seconds."),
36 ('execute', None, 'Command to execute on click'),
37 ("display_format", "Updates: {updates}", "Display format if updates available"),
38 ("colour_no_updates", "ffffff", "Colour when there's no updates."),
39 ("colour_have_updates", "ffffff", "Colour when there are updates."),
40 ("restart_indicator", "", "Indicator to represent reboot is required. (Ubuntu only)"),
41 ("no_update_string", "", "String to display if no updates available")
42 ]
43
44 def __init__(self, **config):
45 base.ThreadPoolText.__init__(self, "", **config)
46 self.add_defaults(CheckUpdates.defaults)
47
48 # Helpful to have this as a variable as we can shorten it for testing
49 self.execute_polling_interval = 1
50
51 # format: "Distro": ("cmd", "number of lines to subtract from output")
52 self.cmd_dict = {"Arch": ("pacman -Qu", 0),
53 "Arch_checkupdates": ("checkupdates", 0),
54 "Arch_Sup": ("pacman -Sup", 1),
55 "Arch_yay": ("yay -Qu", 0),
56 "Debian": ("apt-show-versions -u -b", 0),
57 "Ubuntu": ("aptitude search ~U", 0),
58 "Fedora": ("dnf list updates", 3),
59 "FreeBSD": ("pkg_version -I -l '<'", 0),
60 "Mandriva": ("urpmq --auto-select", 0)
61 }
62
63 if self.custom_command:
64 # Use custom_command
65 self.cmd = self.custom_command
66
67 else:
68 # Check if distro name is valid.
69 try:
70 self.cmd = self.cmd_dict[self.distro][0]
71 self.custom_command_modify = (lambda x: x - self.cmd_dict[self.distro][1])
72 except KeyError:
73 distros = sorted(self.cmd_dict.keys())
74 logger.error(self.distro + ' is not a valid distro name. ' +
75 'Use one of the list: ' + str(distros) + '.')
76 self.cmd = None
77
78 if self.execute:
79 self.add_callbacks({'Button1': self.do_execute})
80
81 def _check_updates(self):
82 # type: () -> str
83 try:
84 updates = self.call_process(self.cmd, shell=True)
85 except CalledProcessError:
86 updates = ""
87 num_updates = self.custom_command_modify(len(updates.splitlines()))
88
89 if num_updates == 0:
90 self.layout.colour = self.colour_no_updates
91 return self.no_update_string
92 num_updates = str(num_updates)
93
94 if self.restart_indicator and os.path.exists('/var/run/reboot-required'):
95 num_updates += self.restart_indicator
96
97 self.layout.colour = self.colour_have_updates
98 return self.display_format.format(**{"updates": num_updates})
99
100 def poll(self):
101 # type: () -> str
102 if not self.cmd:
103 return "N/A"
104 return self._check_updates()
105
106 def do_execute(self):
107 self._process = Popen(self.execute, shell=True)
108 self.timeout_add(self.execute_polling_interval, self._refresh_count)
109
110 def _refresh_count(self):
111 if self._process.poll() is None:
112 self.timeout_add(self.execute_polling_interval, self._refresh_count)
113
114 else:
115 self.timer_setup()
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/libqtile/widget/check_updates.py b/libqtile/widget/check_updates.py
--- a/libqtile/widget/check_updates.py
+++ b/libqtile/widget/check_updates.py
@@ -55,7 +55,7 @@
"Arch_yay": ("yay -Qu", 0),
"Debian": ("apt-show-versions -u -b", 0),
"Ubuntu": ("aptitude search ~U", 0),
- "Fedora": ("dnf list updates", 3),
+ "Fedora": ("dnf list updates -q", 1),
"FreeBSD": ("pkg_version -I -l '<'", 0),
"Mandriva": ("urpmq --auto-select", 0)
}
@@ -86,6 +86,8 @@
updates = ""
num_updates = self.custom_command_modify(len(updates.splitlines()))
+ if num_updates < 0:
+ num_updates = 0
if num_updates == 0:
self.layout.colour = self.colour_no_updates
return self.no_update_string
|
{"golden_diff": "diff --git a/libqtile/widget/check_updates.py b/libqtile/widget/check_updates.py\n--- a/libqtile/widget/check_updates.py\n+++ b/libqtile/widget/check_updates.py\n@@ -55,7 +55,7 @@\n \"Arch_yay\": (\"yay -Qu\", 0),\n \"Debian\": (\"apt-show-versions -u -b\", 0),\n \"Ubuntu\": (\"aptitude search ~U\", 0),\n- \"Fedora\": (\"dnf list updates\", 3),\n+ \"Fedora\": (\"dnf list updates -q\", 1),\n \"FreeBSD\": (\"pkg_version -I -l '<'\", 0),\n \"Mandriva\": (\"urpmq --auto-select\", 0)\n }\n@@ -86,6 +86,8 @@\n updates = \"\"\n num_updates = self.custom_command_modify(len(updates.splitlines()))\n \n+ if num_updates < 0:\n+ num_updates = 0\n if num_updates == 0:\n self.layout.colour = self.colour_no_updates\n return self.no_update_string\n", "issue": "widget.CheckUpdates on Fedora: show the correct number of updates\nwidget.CheckUpdates on Fedora shows -2 updates when no updates found.\r\n\r\nExcerpt from my config.py:\r\n```\r\nwidget.CheckUpdates(\r\n distro='Fedora',\r\n display_format=' {updates} updates',\r\n colour_have_updates=colors[3],\r\n no_update_string=' no update',\r\n update_interval=1800,\r\n colour_no_updates=colors[5],\r\n background=colors[8],\r\n ),\r\n```\r\n# Qtile version\r\n0.17.1dev\r\n# distro\r\nFedora 34\n", "before_files": [{"content": "# Copyright (c) 2015 Ali Mousavi\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nimport os\nfrom subprocess import CalledProcessError, Popen\n\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\n\nclass CheckUpdates(base.ThreadPoolText):\n \"\"\"Shows number of pending updates in different unix systems\"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"distro\", \"Arch\", \"Name of your distribution\"),\n (\"custom_command\", None, \"Custom shell command for checking updates (counts the lines of the output)\"),\n (\"custom_command_modify\", (lambda x: x), \"Lambda function to modify line count from custom_command\"),\n (\"update_interval\", 60, \"Update interval in seconds.\"),\n ('execute', None, 'Command to execute on click'),\n (\"display_format\", \"Updates: {updates}\", \"Display format if updates available\"),\n (\"colour_no_updates\", \"ffffff\", \"Colour when there's no updates.\"),\n (\"colour_have_updates\", \"ffffff\", \"Colour when there are updates.\"),\n (\"restart_indicator\", \"\", \"Indicator to represent reboot is required. 
(Ubuntu only)\"),\n (\"no_update_string\", \"\", \"String to display if no updates available\")\n ]\n\n def __init__(self, **config):\n base.ThreadPoolText.__init__(self, \"\", **config)\n self.add_defaults(CheckUpdates.defaults)\n\n # Helpful to have this as a variable as we can shorten it for testing\n self.execute_polling_interval = 1\n\n # format: \"Distro\": (\"cmd\", \"number of lines to subtract from output\")\n self.cmd_dict = {\"Arch\": (\"pacman -Qu\", 0),\n \"Arch_checkupdates\": (\"checkupdates\", 0),\n \"Arch_Sup\": (\"pacman -Sup\", 1),\n \"Arch_yay\": (\"yay -Qu\", 0),\n \"Debian\": (\"apt-show-versions -u -b\", 0),\n \"Ubuntu\": (\"aptitude search ~U\", 0),\n \"Fedora\": (\"dnf list updates\", 3),\n \"FreeBSD\": (\"pkg_version -I -l '<'\", 0),\n \"Mandriva\": (\"urpmq --auto-select\", 0)\n }\n\n if self.custom_command:\n # Use custom_command\n self.cmd = self.custom_command\n\n else:\n # Check if distro name is valid.\n try:\n self.cmd = self.cmd_dict[self.distro][0]\n self.custom_command_modify = (lambda x: x - self.cmd_dict[self.distro][1])\n except KeyError:\n distros = sorted(self.cmd_dict.keys())\n logger.error(self.distro + ' is not a valid distro name. ' +\n 'Use one of the list: ' + str(distros) + '.')\n self.cmd = None\n\n if self.execute:\n self.add_callbacks({'Button1': self.do_execute})\n\n def _check_updates(self):\n # type: () -> str\n try:\n updates = self.call_process(self.cmd, shell=True)\n except CalledProcessError:\n updates = \"\"\n num_updates = self.custom_command_modify(len(updates.splitlines()))\n\n if num_updates == 0:\n self.layout.colour = self.colour_no_updates\n return self.no_update_string\n num_updates = str(num_updates)\n\n if self.restart_indicator and os.path.exists('/var/run/reboot-required'):\n num_updates += self.restart_indicator\n\n self.layout.colour = self.colour_have_updates\n return self.display_format.format(**{\"updates\": num_updates})\n\n def poll(self):\n # type: () -> str\n if not self.cmd:\n return \"N/A\"\n return self._check_updates()\n\n def do_execute(self):\n self._process = Popen(self.execute, shell=True)\n self.timeout_add(self.execute_polling_interval, self._refresh_count)\n\n def _refresh_count(self):\n if self._process.poll() is None:\n self.timeout_add(self.execute_polling_interval, self._refresh_count)\n\n else:\n self.timer_setup()\n", "path": "libqtile/widget/check_updates.py"}], "after_files": [{"content": "# Copyright (c) 2015 Ali Mousavi\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nimport os\nfrom subprocess import CalledProcessError, Popen\n\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\n\nclass CheckUpdates(base.ThreadPoolText):\n \"\"\"Shows number of pending updates in different unix systems\"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"distro\", \"Arch\", \"Name of your distribution\"),\n (\"custom_command\", None, \"Custom shell command for checking updates (counts the lines of the output)\"),\n (\"custom_command_modify\", (lambda x: x), \"Lambda function to modify line count from custom_command\"),\n (\"update_interval\", 60, \"Update interval in seconds.\"),\n ('execute', None, 'Command to execute on click'),\n (\"display_format\", \"Updates: {updates}\", \"Display format if updates available\"),\n (\"colour_no_updates\", \"ffffff\", \"Colour when there's no updates.\"),\n (\"colour_have_updates\", \"ffffff\", \"Colour when there are updates.\"),\n (\"restart_indicator\", \"\", \"Indicator to represent reboot is required. (Ubuntu only)\"),\n (\"no_update_string\", \"\", \"String to display if no updates available\")\n ]\n\n def __init__(self, **config):\n base.ThreadPoolText.__init__(self, \"\", **config)\n self.add_defaults(CheckUpdates.defaults)\n\n # Helpful to have this as a variable as we can shorten it for testing\n self.execute_polling_interval = 1\n\n # format: \"Distro\": (\"cmd\", \"number of lines to subtract from output\")\n self.cmd_dict = {\"Arch\": (\"pacman -Qu\", 0),\n \"Arch_checkupdates\": (\"checkupdates\", 0),\n \"Arch_Sup\": (\"pacman -Sup\", 1),\n \"Arch_yay\": (\"yay -Qu\", 0),\n \"Debian\": (\"apt-show-versions -u -b\", 0),\n \"Ubuntu\": (\"aptitude search ~U\", 0),\n \"Fedora\": (\"dnf list updates -q\", 1),\n \"FreeBSD\": (\"pkg_version -I -l '<'\", 0),\n \"Mandriva\": (\"urpmq --auto-select\", 0)\n }\n\n if self.custom_command:\n # Use custom_command\n self.cmd = self.custom_command\n\n else:\n # Check if distro name is valid.\n try:\n self.cmd = self.cmd_dict[self.distro][0]\n self.custom_command_modify = (lambda x: x - self.cmd_dict[self.distro][1])\n except KeyError:\n distros = sorted(self.cmd_dict.keys())\n logger.error(self.distro + ' is not a valid distro name. 
' +\n 'Use one of the list: ' + str(distros) + '.')\n self.cmd = None\n\n if self.execute:\n self.add_callbacks({'Button1': self.do_execute})\n\n def _check_updates(self):\n # type: () -> str\n try:\n updates = self.call_process(self.cmd, shell=True)\n except CalledProcessError:\n updates = \"\"\n num_updates = self.custom_command_modify(len(updates.splitlines()))\n\n if num_updates < 0:\n num_updates = 0\n if num_updates == 0:\n self.layout.colour = self.colour_no_updates\n return self.no_update_string\n num_updates = str(num_updates)\n\n if self.restart_indicator and os.path.exists('/var/run/reboot-required'):\n num_updates += self.restart_indicator\n\n self.layout.colour = self.colour_have_updates\n return self.display_format.format(**{\"updates\": num_updates})\n\n def poll(self):\n # type: () -> str\n if not self.cmd:\n return \"N/A\"\n return self._check_updates()\n\n def do_execute(self):\n self._process = Popen(self.execute, shell=True)\n self.timeout_add(self.execute_polling_interval, self._refresh_count)\n\n def _refresh_count(self):\n if self._process.poll() is None:\n self.timeout_add(self.execute_polling_interval, self._refresh_count)\n\n else:\n self.timer_setup()\n", "path": "libqtile/widget/check_updates.py"}]}
| 1,727 | 242 |
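
As an aside to the qtile record above: the widget derives the update count by subtracting a fixed number of expected header lines from the command output, so when `dnf list updates` emitted fewer preamble lines than the hard-coded 3, the result went negative (consistent with the -2 in the report). The sketch below isolates that arithmetic; `count_updates` is hypothetical, not qtile code, and simply pairs a smaller header allowance with a clamp to zero, mirroring the two changes in the golden diff.

```python
def count_updates(output: str, header_lines: int) -> int:
    """Count pending updates in command output, ignoring `header_lines` lines of preamble."""
    num = len(output.splitlines()) - header_lines
    return max(num, 0)  # an empty or short output can no longer produce a negative count


if __name__ == "__main__":
    print(count_updates("", 3))  # 0 rather than -3 when the command prints nothing
    print(count_updates("Last metadata expiration check: ...\npkg.x86_64 1.0-2 updates\n", 1))  # 1
```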
gh_patches_debug_12045
|
rasdani/github-patches
|
git_diff
|
carpentries__amy-2639
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Require email reschedule date/time to be in the future
Currently past date/time values are allowed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `amy/emails/forms.py`
Content:
```
1 from django import forms
2 from markdownx.fields import MarkdownxFormField
3
4 from emails.models import EmailTemplate, ScheduledEmail
5 from emails.signals import SignalNameEnum
6 from workshops.forms import BootstrapHelper
7
8
9 class EmailTemplateCreateForm(forms.ModelForm):
10 body = MarkdownxFormField(
11 label=EmailTemplate._meta.get_field("body").verbose_name,
12 help_text=EmailTemplate._meta.get_field("body").help_text,
13 widget=forms.Textarea,
14 )
15 signal = forms.CharField(
16 help_text=EmailTemplate._meta.get_field("signal").help_text,
17 widget=forms.Select(choices=SignalNameEnum.choices()),
18 )
19
20 class Meta:
21 model = EmailTemplate
22 fields = [
23 "name",
24 "active",
25 "signal",
26 "from_header",
27 "reply_to_header",
28 "cc_header",
29 "bcc_header",
30 "subject",
31 "body",
32 ]
33
34 def __init__(self, *args, **kwargs):
35 super().__init__(*args, **kwargs)
36
37 array_email_field_help_text = "Separate email addresses with a comma"
38 self.fields["cc_header"].help_text = array_email_field_help_text
39 self.fields["bcc_header"].help_text = array_email_field_help_text
40
41
42 class EmailTemplateUpdateForm(EmailTemplateCreateForm):
43 signal = forms.CharField(
44 required=False,
45 disabled=True,
46 help_text=EmailTemplate._meta.get_field("signal").help_text,
47 widget=forms.Select(choices=SignalNameEnum.choices()),
48 )
49
50 class Meta(EmailTemplateCreateForm.Meta):
51 pass
52
53
54 class ScheduledEmailUpdateForm(forms.ModelForm):
55 body = MarkdownxFormField(
56 label=ScheduledEmail._meta.get_field("body").verbose_name,
57 help_text=ScheduledEmail._meta.get_field("body").help_text,
58 widget=forms.Textarea,
59 )
60
61 class Meta:
62 model = ScheduledEmail
63 fields = [
64 "to_header",
65 "from_header",
66 "reply_to_header",
67 "cc_header",
68 "bcc_header",
69 "subject",
70 "body",
71 ]
72
73 def __init__(self, *args, **kwargs):
74 super().__init__(*args, **kwargs)
75
76 array_email_field_help_text = "Separate email addresses with a comma"
77 self.fields["to_header"].help_text = array_email_field_help_text
78 self.fields["cc_header"].help_text = array_email_field_help_text
79 self.fields["bcc_header"].help_text = array_email_field_help_text
80
81
82 class ScheduledEmailRescheduleForm(forms.Form):
83 scheduled_at = forms.SplitDateTimeField(
84 label=ScheduledEmail._meta.get_field("scheduled_at").verbose_name,
85 help_text="Time in UTC",
86 )
87
88 helper = BootstrapHelper(submit_label="Update")
89
90
91 class ScheduledEmailCancelForm(forms.Form):
92 confirm = forms.CharField(required=False)
93 decline = forms.CharField(required=False)
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/amy/emails/forms.py b/amy/emails/forms.py
--- a/amy/emails/forms.py
+++ b/amy/emails/forms.py
@@ -1,3 +1,5 @@
+from datetime import UTC, datetime
+
from django import forms
from markdownx.fields import MarkdownxFormField
@@ -87,6 +89,14 @@
helper = BootstrapHelper(submit_label="Update")
+ def clean_scheduled_at(self):
+ scheduled_at = self.cleaned_data["scheduled_at"]
+
+ if scheduled_at < datetime.now(tz=UTC):
+ raise forms.ValidationError("Scheduled time cannot be in the past.")
+
+ return scheduled_at
+
class ScheduledEmailCancelForm(forms.Form):
confirm = forms.CharField(required=False)
|
{"golden_diff": "diff --git a/amy/emails/forms.py b/amy/emails/forms.py\n--- a/amy/emails/forms.py\n+++ b/amy/emails/forms.py\n@@ -1,3 +1,5 @@\n+from datetime import UTC, datetime\n+\n from django import forms\n from markdownx.fields import MarkdownxFormField\n \n@@ -87,6 +89,14 @@\n \n helper = BootstrapHelper(submit_label=\"Update\")\n \n+ def clean_scheduled_at(self):\n+ scheduled_at = self.cleaned_data[\"scheduled_at\"]\n+\n+ if scheduled_at < datetime.now(tz=UTC):\n+ raise forms.ValidationError(\"Scheduled time cannot be in the past.\")\n+\n+ return scheduled_at\n+\n \n class ScheduledEmailCancelForm(forms.Form):\n confirm = forms.CharField(required=False)\n", "issue": "Require email reschedule date/time to be in the future\nCurrently past date/time values are allowed. \n", "before_files": [{"content": "from django import forms\nfrom markdownx.fields import MarkdownxFormField\n\nfrom emails.models import EmailTemplate, ScheduledEmail\nfrom emails.signals import SignalNameEnum\nfrom workshops.forms import BootstrapHelper\n\n\nclass EmailTemplateCreateForm(forms.ModelForm):\n body = MarkdownxFormField(\n label=EmailTemplate._meta.get_field(\"body\").verbose_name,\n help_text=EmailTemplate._meta.get_field(\"body\").help_text,\n widget=forms.Textarea,\n )\n signal = forms.CharField(\n help_text=EmailTemplate._meta.get_field(\"signal\").help_text,\n widget=forms.Select(choices=SignalNameEnum.choices()),\n )\n\n class Meta:\n model = EmailTemplate\n fields = [\n \"name\",\n \"active\",\n \"signal\",\n \"from_header\",\n \"reply_to_header\",\n \"cc_header\",\n \"bcc_header\",\n \"subject\",\n \"body\",\n ]\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n array_email_field_help_text = \"Separate email addresses with a comma\"\n self.fields[\"cc_header\"].help_text = array_email_field_help_text\n self.fields[\"bcc_header\"].help_text = array_email_field_help_text\n\n\nclass EmailTemplateUpdateForm(EmailTemplateCreateForm):\n signal = forms.CharField(\n required=False,\n disabled=True,\n help_text=EmailTemplate._meta.get_field(\"signal\").help_text,\n widget=forms.Select(choices=SignalNameEnum.choices()),\n )\n\n class Meta(EmailTemplateCreateForm.Meta):\n pass\n\n\nclass ScheduledEmailUpdateForm(forms.ModelForm):\n body = MarkdownxFormField(\n label=ScheduledEmail._meta.get_field(\"body\").verbose_name,\n help_text=ScheduledEmail._meta.get_field(\"body\").help_text,\n widget=forms.Textarea,\n )\n\n class Meta:\n model = ScheduledEmail\n fields = [\n \"to_header\",\n \"from_header\",\n \"reply_to_header\",\n \"cc_header\",\n \"bcc_header\",\n \"subject\",\n \"body\",\n ]\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n array_email_field_help_text = \"Separate email addresses with a comma\"\n self.fields[\"to_header\"].help_text = array_email_field_help_text\n self.fields[\"cc_header\"].help_text = array_email_field_help_text\n self.fields[\"bcc_header\"].help_text = array_email_field_help_text\n\n\nclass ScheduledEmailRescheduleForm(forms.Form):\n scheduled_at = forms.SplitDateTimeField(\n label=ScheduledEmail._meta.get_field(\"scheduled_at\").verbose_name,\n help_text=\"Time in UTC\",\n )\n\n helper = BootstrapHelper(submit_label=\"Update\")\n\n\nclass ScheduledEmailCancelForm(forms.Form):\n confirm = forms.CharField(required=False)\n decline = forms.CharField(required=False)\n", "path": "amy/emails/forms.py"}], "after_files": [{"content": "from datetime import UTC, datetime\n\nfrom django import forms\nfrom markdownx.fields 
import MarkdownxFormField\n\nfrom emails.models import EmailTemplate, ScheduledEmail\nfrom emails.signals import SignalNameEnum\nfrom workshops.forms import BootstrapHelper\n\n\nclass EmailTemplateCreateForm(forms.ModelForm):\n body = MarkdownxFormField(\n label=EmailTemplate._meta.get_field(\"body\").verbose_name,\n help_text=EmailTemplate._meta.get_field(\"body\").help_text,\n widget=forms.Textarea,\n )\n signal = forms.CharField(\n help_text=EmailTemplate._meta.get_field(\"signal\").help_text,\n widget=forms.Select(choices=SignalNameEnum.choices()),\n )\n\n class Meta:\n model = EmailTemplate\n fields = [\n \"name\",\n \"active\",\n \"signal\",\n \"from_header\",\n \"reply_to_header\",\n \"cc_header\",\n \"bcc_header\",\n \"subject\",\n \"body\",\n ]\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n array_email_field_help_text = \"Separate email addresses with a comma\"\n self.fields[\"cc_header\"].help_text = array_email_field_help_text\n self.fields[\"bcc_header\"].help_text = array_email_field_help_text\n\n\nclass EmailTemplateUpdateForm(EmailTemplateCreateForm):\n signal = forms.CharField(\n required=False,\n disabled=True,\n help_text=EmailTemplate._meta.get_field(\"signal\").help_text,\n widget=forms.Select(choices=SignalNameEnum.choices()),\n )\n\n class Meta(EmailTemplateCreateForm.Meta):\n pass\n\n\nclass ScheduledEmailUpdateForm(forms.ModelForm):\n body = MarkdownxFormField(\n label=ScheduledEmail._meta.get_field(\"body\").verbose_name,\n help_text=ScheduledEmail._meta.get_field(\"body\").help_text,\n widget=forms.Textarea,\n )\n\n class Meta:\n model = ScheduledEmail\n fields = [\n \"to_header\",\n \"from_header\",\n \"reply_to_header\",\n \"cc_header\",\n \"bcc_header\",\n \"subject\",\n \"body\",\n ]\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n array_email_field_help_text = \"Separate email addresses with a comma\"\n self.fields[\"to_header\"].help_text = array_email_field_help_text\n self.fields[\"cc_header\"].help_text = array_email_field_help_text\n self.fields[\"bcc_header\"].help_text = array_email_field_help_text\n\n\nclass ScheduledEmailRescheduleForm(forms.Form):\n scheduled_at = forms.SplitDateTimeField(\n label=ScheduledEmail._meta.get_field(\"scheduled_at\").verbose_name,\n help_text=\"Time in UTC\",\n )\n\n helper = BootstrapHelper(submit_label=\"Update\")\n\n def clean_scheduled_at(self):\n scheduled_at = self.cleaned_data[\"scheduled_at\"]\n\n if scheduled_at < datetime.now(tz=UTC):\n raise forms.ValidationError(\"Scheduled time cannot be in the past.\")\n\n return scheduled_at\n\n\nclass ScheduledEmailCancelForm(forms.Form):\n confirm = forms.CharField(required=False)\n decline = forms.CharField(required=False)\n", "path": "amy/emails/forms.py"}]}
| 1,071 | 169 |
gh_patches_debug_15604
|
rasdani/github-patches
|
git_diff
|
microsoft__botbuilder-python-1487
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
OpenIdMetadata signing keys should refresh every 24 hours, and once per hour if a key is missing
## Describe the bug
Every 24 hours, and when a cert is cycled (a key is not found in the cache), the OpenIdMetadata cache within the sdk should gracefully refresh.
javascript implementation reference:
https://github.com/microsoft/botbuilder-js/pull/2466
python code to change:
https://github.com/microsoft/botbuilder-python/blob/master/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py#L129
[bug]
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 import json
5 from datetime import datetime, timedelta
6 from typing import List
7 import requests
8 from jwt.algorithms import RSAAlgorithm
9 import jwt
10 from .claims_identity import ClaimsIdentity
11 from .verify_options import VerifyOptions
12 from .endorsements_validator import EndorsementsValidator
13
14
15 class JwtTokenExtractor:
16 metadataCache = {}
17
18 def __init__(
19 self,
20 validation_params: VerifyOptions,
21 metadata_url: str,
22 allowed_algorithms: list,
23 ):
24 self.validation_parameters = validation_params
25 self.validation_parameters.algorithms = allowed_algorithms
26 self.open_id_metadata = JwtTokenExtractor.get_open_id_metadata(metadata_url)
27
28 @staticmethod
29 def get_open_id_metadata(metadata_url: str):
30 metadata = JwtTokenExtractor.metadataCache.get(metadata_url, None)
31 if metadata is None:
32 metadata = _OpenIdMetadata(metadata_url)
33 JwtTokenExtractor.metadataCache.setdefault(metadata_url, metadata)
34 return metadata
35
36 async def get_identity_from_auth_header(
37 self, auth_header: str, channel_id: str, required_endorsements: List[str] = None
38 ) -> ClaimsIdentity:
39 if not auth_header:
40 return None
41 parts = auth_header.split(" ")
42 if len(parts) == 2:
43 return await self.get_identity(
44 parts[0], parts[1], channel_id, required_endorsements
45 )
46 return None
47
48 async def get_identity(
49 self,
50 schema: str,
51 parameter: str,
52 channel_id: str,
53 required_endorsements: List[str] = None,
54 ) -> ClaimsIdentity:
55 # No header in correct scheme or no token
56 if schema != "Bearer" or not parameter:
57 return None
58
59 # Issuer isn't allowed? No need to check signature
60 if not self._has_allowed_issuer(parameter):
61 return None
62
63 try:
64 return await self._validate_token(
65 parameter, channel_id, required_endorsements
66 )
67 except Exception as error:
68 raise error
69
70 def _has_allowed_issuer(self, jwt_token: str) -> bool:
71 decoded = jwt.decode(jwt_token, verify=False)
72 issuer = decoded.get("iss", None)
73 if issuer in self.validation_parameters.issuer:
74 return True
75
76 return issuer == self.validation_parameters.issuer
77
78 async def _validate_token(
79 self, jwt_token: str, channel_id: str, required_endorsements: List[str] = None
80 ) -> ClaimsIdentity:
81 required_endorsements = required_endorsements or []
82 headers = jwt.get_unverified_header(jwt_token)
83
84 # Update the signing tokens from the last refresh
85 key_id = headers.get("kid", None)
86 metadata = await self.open_id_metadata.get(key_id)
87
88 if key_id and metadata.endorsements:
89 # Verify that channelId is included in endorsements
90 if not EndorsementsValidator.validate(channel_id, metadata.endorsements):
91 raise Exception("Could not validate endorsement key")
92
93 # Verify that additional endorsements are satisfied.
94 # If no additional endorsements are expected, the requirement is satisfied as well
95 for endorsement in required_endorsements:
96 if not EndorsementsValidator.validate(
97 endorsement, metadata.endorsements
98 ):
99 raise Exception("Could not validate endorsement key")
100
101 if headers.get("alg", None) not in self.validation_parameters.algorithms:
102 raise Exception("Token signing algorithm not in allowed list")
103
104 options = {
105 "verify_aud": False,
106 "verify_exp": not self.validation_parameters.ignore_expiration,
107 }
108
109 decoded_payload = jwt.decode(
110 jwt_token,
111 metadata.public_key,
112 leeway=self.validation_parameters.clock_tolerance,
113 options=options,
114 )
115
116 claims = ClaimsIdentity(decoded_payload, True)
117
118 return claims
119
120
121 class _OpenIdMetadata:
122 def __init__(self, url):
123 self.url = url
124 self.keys = []
125 self.last_updated = datetime.min
126
127 async def get(self, key_id: str):
128 # If keys are more than 5 days old, refresh them
129 if self.last_updated < (datetime.now() - timedelta(days=5)):
130 await self._refresh()
131 return self._find(key_id)
132
133 async def _refresh(self):
134 response = requests.get(self.url)
135 response.raise_for_status()
136 keys_url = response.json()["jwks_uri"]
137 response_keys = requests.get(keys_url)
138 response_keys.raise_for_status()
139 self.last_updated = datetime.now()
140 self.keys = response_keys.json()["keys"]
141
142 def _find(self, key_id: str):
143 if not self.keys:
144 return None
145 key = [x for x in self.keys if x["kid"] == key_id][0]
146 public_key = RSAAlgorithm.from_jwk(json.dumps(key))
147 endorsements = key.get("endorsements", [])
148 return _OpenIdConfig(public_key, endorsements)
149
150
151 class _OpenIdConfig:
152 def __init__(self, public_key, endorsements):
153 self.public_key = public_key
154 self.endorsements = endorsements
155
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py b/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py
--- a/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py
+++ b/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py
@@ -125,10 +125,16 @@
self.last_updated = datetime.min
async def get(self, key_id: str):
- # If keys are more than 5 days old, refresh them
- if self.last_updated < (datetime.now() - timedelta(days=5)):
+ # If keys are more than 1 day old, refresh them
+ if self.last_updated < (datetime.now() - timedelta(days=1)):
await self._refresh()
- return self._find(key_id)
+
+ key = self._find(key_id)
+ if not key and self.last_updated < (datetime.now() - timedelta(hours=1)):
+ # Refresh the cache if a key is not found (max once per hour)
+ await self._refresh()
+ key = self._find(key_id)
+ return key
async def _refresh(self):
response = requests.get(self.url)
|
{"golden_diff": "diff --git a/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py b/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py\n--- a/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py\n+++ b/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py\n@@ -125,10 +125,16 @@\n self.last_updated = datetime.min\n \n async def get(self, key_id: str):\n- # If keys are more than 5 days old, refresh them\n- if self.last_updated < (datetime.now() - timedelta(days=5)):\n+ # If keys are more than 1 day old, refresh them\n+ if self.last_updated < (datetime.now() - timedelta(days=1)):\n await self._refresh()\n- return self._find(key_id)\n+\n+ key = self._find(key_id)\n+ if not key and self.last_updated < (datetime.now() - timedelta(hours=1)):\n+ # Refresh the cache if a key is not found (max once per hour)\n+ await self._refresh()\n+ key = self._find(key_id)\n+ return key\n \n async def _refresh(self):\n response = requests.get(self.url)\n", "issue": "OpenIdMetadata signing keys should refresh every 24 hours, and once per hour if a key is missing\n## Describe the bug\r\nEvery 24 hours, and when a cert is cycled (a key is not found in the cache), the OpenIdMetadata cache within the sdk should gracefully refresh. \r\n\r\njavascript implementation reference:\r\nhttps://github.com/microsoft/botbuilder-js/pull/2466\r\n\r\npython code to change:\r\nhttps://github.com/microsoft/botbuilder-python/blob/master/libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py#L129\r\n\r\n[bug]\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport json\nfrom datetime import datetime, timedelta\nfrom typing import List\nimport requests\nfrom jwt.algorithms import RSAAlgorithm\nimport jwt\nfrom .claims_identity import ClaimsIdentity\nfrom .verify_options import VerifyOptions\nfrom .endorsements_validator import EndorsementsValidator\n\n\nclass JwtTokenExtractor:\n metadataCache = {}\n\n def __init__(\n self,\n validation_params: VerifyOptions,\n metadata_url: str,\n allowed_algorithms: list,\n ):\n self.validation_parameters = validation_params\n self.validation_parameters.algorithms = allowed_algorithms\n self.open_id_metadata = JwtTokenExtractor.get_open_id_metadata(metadata_url)\n\n @staticmethod\n def get_open_id_metadata(metadata_url: str):\n metadata = JwtTokenExtractor.metadataCache.get(metadata_url, None)\n if metadata is None:\n metadata = _OpenIdMetadata(metadata_url)\n JwtTokenExtractor.metadataCache.setdefault(metadata_url, metadata)\n return metadata\n\n async def get_identity_from_auth_header(\n self, auth_header: str, channel_id: str, required_endorsements: List[str] = None\n ) -> ClaimsIdentity:\n if not auth_header:\n return None\n parts = auth_header.split(\" \")\n if len(parts) == 2:\n return await self.get_identity(\n parts[0], parts[1], channel_id, required_endorsements\n )\n return None\n\n async def get_identity(\n self,\n schema: str,\n parameter: str,\n channel_id: str,\n required_endorsements: List[str] = None,\n ) -> ClaimsIdentity:\n # No header in correct scheme or no token\n if schema != \"Bearer\" or not parameter:\n return None\n\n # Issuer isn't allowed? 
No need to check signature\n if not self._has_allowed_issuer(parameter):\n return None\n\n try:\n return await self._validate_token(\n parameter, channel_id, required_endorsements\n )\n except Exception as error:\n raise error\n\n def _has_allowed_issuer(self, jwt_token: str) -> bool:\n decoded = jwt.decode(jwt_token, verify=False)\n issuer = decoded.get(\"iss\", None)\n if issuer in self.validation_parameters.issuer:\n return True\n\n return issuer == self.validation_parameters.issuer\n\n async def _validate_token(\n self, jwt_token: str, channel_id: str, required_endorsements: List[str] = None\n ) -> ClaimsIdentity:\n required_endorsements = required_endorsements or []\n headers = jwt.get_unverified_header(jwt_token)\n\n # Update the signing tokens from the last refresh\n key_id = headers.get(\"kid\", None)\n metadata = await self.open_id_metadata.get(key_id)\n\n if key_id and metadata.endorsements:\n # Verify that channelId is included in endorsements\n if not EndorsementsValidator.validate(channel_id, metadata.endorsements):\n raise Exception(\"Could not validate endorsement key\")\n\n # Verify that additional endorsements are satisfied.\n # If no additional endorsements are expected, the requirement is satisfied as well\n for endorsement in required_endorsements:\n if not EndorsementsValidator.validate(\n endorsement, metadata.endorsements\n ):\n raise Exception(\"Could not validate endorsement key\")\n\n if headers.get(\"alg\", None) not in self.validation_parameters.algorithms:\n raise Exception(\"Token signing algorithm not in allowed list\")\n\n options = {\n \"verify_aud\": False,\n \"verify_exp\": not self.validation_parameters.ignore_expiration,\n }\n\n decoded_payload = jwt.decode(\n jwt_token,\n metadata.public_key,\n leeway=self.validation_parameters.clock_tolerance,\n options=options,\n )\n\n claims = ClaimsIdentity(decoded_payload, True)\n\n return claims\n\n\nclass _OpenIdMetadata:\n def __init__(self, url):\n self.url = url\n self.keys = []\n self.last_updated = datetime.min\n\n async def get(self, key_id: str):\n # If keys are more than 5 days old, refresh them\n if self.last_updated < (datetime.now() - timedelta(days=5)):\n await self._refresh()\n return self._find(key_id)\n\n async def _refresh(self):\n response = requests.get(self.url)\n response.raise_for_status()\n keys_url = response.json()[\"jwks_uri\"]\n response_keys = requests.get(keys_url)\n response_keys.raise_for_status()\n self.last_updated = datetime.now()\n self.keys = response_keys.json()[\"keys\"]\n\n def _find(self, key_id: str):\n if not self.keys:\n return None\n key = [x for x in self.keys if x[\"kid\"] == key_id][0]\n public_key = RSAAlgorithm.from_jwk(json.dumps(key))\n endorsements = key.get(\"endorsements\", [])\n return _OpenIdConfig(public_key, endorsements)\n\n\nclass _OpenIdConfig:\n def __init__(self, public_key, endorsements):\n self.public_key = public_key\n self.endorsements = endorsements\n", "path": "libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the MIT License.\n\nimport json\nfrom datetime import datetime, timedelta\nfrom typing import List\nimport requests\nfrom jwt.algorithms import RSAAlgorithm\nimport jwt\nfrom .claims_identity import ClaimsIdentity\nfrom .verify_options import VerifyOptions\nfrom .endorsements_validator import EndorsementsValidator\n\n\nclass JwtTokenExtractor:\n metadataCache = {}\n\n def __init__(\n self,\n validation_params: VerifyOptions,\n metadata_url: str,\n allowed_algorithms: list,\n ):\n self.validation_parameters = validation_params\n self.validation_parameters.algorithms = allowed_algorithms\n self.open_id_metadata = JwtTokenExtractor.get_open_id_metadata(metadata_url)\n\n @staticmethod\n def get_open_id_metadata(metadata_url: str):\n metadata = JwtTokenExtractor.metadataCache.get(metadata_url, None)\n if metadata is None:\n metadata = _OpenIdMetadata(metadata_url)\n JwtTokenExtractor.metadataCache.setdefault(metadata_url, metadata)\n return metadata\n\n async def get_identity_from_auth_header(\n self, auth_header: str, channel_id: str, required_endorsements: List[str] = None\n ) -> ClaimsIdentity:\n if not auth_header:\n return None\n parts = auth_header.split(\" \")\n if len(parts) == 2:\n return await self.get_identity(\n parts[0], parts[1], channel_id, required_endorsements\n )\n return None\n\n async def get_identity(\n self,\n schema: str,\n parameter: str,\n channel_id: str,\n required_endorsements: List[str] = None,\n ) -> ClaimsIdentity:\n # No header in correct scheme or no token\n if schema != \"Bearer\" or not parameter:\n return None\n\n # Issuer isn't allowed? No need to check signature\n if not self._has_allowed_issuer(parameter):\n return None\n\n try:\n return await self._validate_token(\n parameter, channel_id, required_endorsements\n )\n except Exception as error:\n raise error\n\n def _has_allowed_issuer(self, jwt_token: str) -> bool:\n decoded = jwt.decode(jwt_token, verify=False)\n issuer = decoded.get(\"iss\", None)\n if issuer in self.validation_parameters.issuer:\n return True\n\n return issuer == self.validation_parameters.issuer\n\n async def _validate_token(\n self, jwt_token: str, channel_id: str, required_endorsements: List[str] = None\n ) -> ClaimsIdentity:\n required_endorsements = required_endorsements or []\n headers = jwt.get_unverified_header(jwt_token)\n\n # Update the signing tokens from the last refresh\n key_id = headers.get(\"kid\", None)\n metadata = await self.open_id_metadata.get(key_id)\n\n if key_id and metadata.endorsements:\n # Verify that channelId is included in endorsements\n if not EndorsementsValidator.validate(channel_id, metadata.endorsements):\n raise Exception(\"Could not validate endorsement key\")\n\n # Verify that additional endorsements are satisfied.\n # If no additional endorsements are expected, the requirement is satisfied as well\n for endorsement in required_endorsements:\n if not EndorsementsValidator.validate(\n endorsement, metadata.endorsements\n ):\n raise Exception(\"Could not validate endorsement key\")\n\n if headers.get(\"alg\", None) not in self.validation_parameters.algorithms:\n raise Exception(\"Token signing algorithm not in allowed list\")\n\n options = {\n \"verify_aud\": False,\n \"verify_exp\": not self.validation_parameters.ignore_expiration,\n }\n\n decoded_payload = jwt.decode(\n jwt_token,\n metadata.public_key,\n leeway=self.validation_parameters.clock_tolerance,\n options=options,\n )\n\n claims = ClaimsIdentity(decoded_payload, True)\n\n return claims\n\n\nclass 
_OpenIdMetadata:\n def __init__(self, url):\n self.url = url\n self.keys = []\n self.last_updated = datetime.min\n\n async def get(self, key_id: str):\n # If keys are more than 1 day old, refresh them\n if self.last_updated < (datetime.now() - timedelta(days=1)):\n await self._refresh()\n\n key = self._find(key_id)\n if not key and self.last_updated < (datetime.now() - timedelta(hours=1)):\n # Refresh the cache if a key is not found (max once per hour)\n await self._refresh()\n key = self._find(key_id)\n return key\n\n async def _refresh(self):\n response = requests.get(self.url)\n response.raise_for_status()\n keys_url = response.json()[\"jwks_uri\"]\n response_keys = requests.get(keys_url)\n response_keys.raise_for_status()\n self.last_updated = datetime.now()\n self.keys = response_keys.json()[\"keys\"]\n\n def _find(self, key_id: str):\n if not self.keys:\n return None\n key = [x for x in self.keys if x[\"kid\"] == key_id][0]\n public_key = RSAAlgorithm.from_jwk(json.dumps(key))\n endorsements = key.get(\"endorsements\", [])\n return _OpenIdConfig(public_key, endorsements)\n\n\nclass _OpenIdConfig:\n def __init__(self, public_key, endorsements):\n self.public_key = public_key\n self.endorsements = endorsements\n", "path": "libraries/botframework-connector/botframework/connector/auth/jwt_token_extractor.py"}]}
| 1,863 | 286 |
gh_patches_debug_28753
|
rasdani/github-patches
|
git_diff
|
optuna__optuna-979
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Axis labels of parallel coordinate plots overlap if the parameter names are long.
As you can see in the following image, axis labels overlap when parameter names are too long. I think we can solve this issue if we limit the maximum length of the labels and/or tilt the labels.

**Conditions**
- Optuna version: 0.17.1
- Python version: 3.7.2
- OS: macOS 10.13
**Code to reproduce**
```python
import optuna
def objective(trial):
x = trial.suggest_uniform('x' * 100, -10, 10)
y = trial.suggest_uniform('y' * 100, -10, 10)
return x - y
study = optuna.create_study()
study.optimize(objective, n_trials=20)
optuna.visualization.plot_parallel_coordinate(study)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `optuna/visualization/parallel_coordinate.py`
Content:
```
1 from collections import defaultdict
2
3 from optuna.logging import get_logger
4 from optuna.structs import StudyDirection
5 from optuna.structs import TrialState
6 from optuna import type_checking
7 from optuna.visualization.utils import _check_plotly_availability
8 from optuna.visualization.utils import is_available
9
10 if type_checking.TYPE_CHECKING:
11 from typing import Any # NOQA
12 from typing import DefaultDict # NOQA
13 from typing import Dict # NOQA
14 from typing import List # NOQA
15 from typing import Optional # NOQA
16
17 from optuna.study import Study # NOQA
18
19 if is_available():
20 from optuna.visualization.plotly_imports import go
21
22 logger = get_logger(__name__)
23
24
25 def plot_parallel_coordinate(study, params=None):
26 # type: (Study, Optional[List[str]]) -> go.Figure
27 """Plot the high-dimentional parameter relationships in a study.
28
29 Note that, If a parameter contains missing values, a trial with missing values is not plotted.
30
31 Example:
32
33 The following code snippet shows how to plot the high-dimentional parameter relationships.
34
35 .. testcode::
36
37 import optuna
38
39 def objective(trial):
40 x = trial.suggest_uniform('x', -100, 100)
41 y = trial.suggest_categorical('y', [-1, 0, 1])
42 return x ** 2 + y
43
44 study = optuna.create_study()
45 study.optimize(objective, n_trials=10)
46
47 optuna.visualization.plot_parallel_coordinate(study, params=['x', 'y'])
48
49 .. raw:: html
50
51 <iframe src="../_static/plot_parallel_coordinate.html"
52 width="100%" height="500px" frameborder="0">
53 </iframe>
54
55 Args:
56 study:
57 A :class:`~optuna.study.Study` object whose trials are plotted for their objective
58 values.
59 params:
60 Parameter list to visualize. The default is all parameters.
61
62 Returns:
63 A :class:`plotly.graph_objs.Figure` object.
64 """
65
66 _check_plotly_availability()
67 return _get_parallel_coordinate_plot(study, params)
68
69
70 def _get_parallel_coordinate_plot(study, params=None):
71 # type: (Study, Optional[List[str]]) -> go.Figure
72
73 layout = go.Layout(title="Parallel Coordinate Plot",)
74
75 trials = [trial for trial in study.trials if trial.state == TrialState.COMPLETE]
76
77 if len(trials) == 0:
78 logger.warning("Your study does not have any completed trials.")
79 return go.Figure(data=[], layout=layout)
80
81 all_params = {p_name for t in trials for p_name in t.params.keys()}
82 if params is not None:
83 for input_p_name in params:
84 if input_p_name not in all_params:
85 ValueError("Parameter {} does not exist in your study.".format(input_p_name))
86 all_params = set(params)
87 sorted_params = sorted(list(all_params))
88
89 dims = [
90 {
91 "label": "Objective Value",
92 "values": tuple([t.value for t in trials]),
93 "range": (min([t.value for t in trials]), max([t.value for t in trials])),
94 }
95 ] # type: List[Dict[str, Any]]
96 for p_name in sorted_params:
97 values = []
98 for t in trials:
99 if p_name in t.params:
100 values.append(t.params[p_name])
101 is_categorical = False
102 try:
103 tuple(map(float, values))
104 except (TypeError, ValueError):
105 vocab = defaultdict(lambda: len(vocab)) # type: DefaultDict[str, int]
106 values = [vocab[v] for v in values]
107 is_categorical = True
108 dim = {"label": p_name, "values": tuple(values), "range": (min(values), max(values))}
109 if is_categorical:
110 dim["tickvals"] = list(range(len(vocab)))
111 dim["ticktext"] = list(sorted(vocab.items(), key=lambda x: x[1]))
112 dims.append(dim)
113
114 traces = [
115 go.Parcoords(
116 dimensions=dims,
117 line={
118 "color": dims[0]["values"],
119 "colorscale": "blues",
120 "colorbar": {"title": "Objective Value"},
121 "showscale": True,
122 "reversescale": study.direction == StudyDirection.MINIMIZE,
123 },
124 )
125 ]
126
127 figure = go.Figure(data=traces, layout=layout)
128
129 return figure
130
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/optuna/visualization/parallel_coordinate.py b/optuna/visualization/parallel_coordinate.py
--- a/optuna/visualization/parallel_coordinate.py
+++ b/optuna/visualization/parallel_coordinate.py
@@ -93,6 +93,7 @@
"range": (min([t.value for t in trials]), max([t.value for t in trials])),
}
] # type: List[Dict[str, Any]]
+
for p_name in sorted_params:
values = []
for t in trials:
@@ -105,7 +106,11 @@
vocab = defaultdict(lambda: len(vocab)) # type: DefaultDict[str, int]
values = [vocab[v] for v in values]
is_categorical = True
- dim = {"label": p_name, "values": tuple(values), "range": (min(values), max(values))}
+ dim = {
+ "label": p_name if len(p_name) < 20 else "{}...".format(p_name[:17]),
+ "values": tuple(values),
+ "range": (min(values), max(values)),
+ } # type: Dict[str, object]
if is_categorical:
dim["tickvals"] = list(range(len(vocab)))
dim["ticktext"] = list(sorted(vocab.items(), key=lambda x: x[1]))
@@ -114,6 +119,8 @@
traces = [
go.Parcoords(
dimensions=dims,
+ labelangle=30,
+ labelside="bottom",
line={
"color": dims[0]["values"],
"colorscale": "blues",
|
{"golden_diff": "diff --git a/optuna/visualization/parallel_coordinate.py b/optuna/visualization/parallel_coordinate.py\n--- a/optuna/visualization/parallel_coordinate.py\n+++ b/optuna/visualization/parallel_coordinate.py\n@@ -93,6 +93,7 @@\n \"range\": (min([t.value for t in trials]), max([t.value for t in trials])),\n }\n ] # type: List[Dict[str, Any]]\n+\n for p_name in sorted_params:\n values = []\n for t in trials:\n@@ -105,7 +106,11 @@\n vocab = defaultdict(lambda: len(vocab)) # type: DefaultDict[str, int]\n values = [vocab[v] for v in values]\n is_categorical = True\n- dim = {\"label\": p_name, \"values\": tuple(values), \"range\": (min(values), max(values))}\n+ dim = {\n+ \"label\": p_name if len(p_name) < 20 else \"{}...\".format(p_name[:17]),\n+ \"values\": tuple(values),\n+ \"range\": (min(values), max(values)),\n+ } # type: Dict[str, object]\n if is_categorical:\n dim[\"tickvals\"] = list(range(len(vocab)))\n dim[\"ticktext\"] = list(sorted(vocab.items(), key=lambda x: x[1]))\n@@ -114,6 +119,8 @@\n traces = [\n go.Parcoords(\n dimensions=dims,\n+ labelangle=30,\n+ labelside=\"bottom\",\n line={\n \"color\": dims[0][\"values\"],\n \"colorscale\": \"blues\",\n", "issue": "Axis labels of parallel coordinate plots overlap if the parameter names are long.\nAs you can see in the following image, axis labels overlap when parameter names are too long. I think we can solve this issue if we limit the maximum length of the labels and/or tilt the labels.\r\n\r\n\r\n\r\n**Conditions**\r\n- Optuna version: 0.17.1\r\n- Python version: 3.7.2\r\n- OS: macOS 10.13\r\n\r\n**Code to reproduce**\r\n\r\n```python\r\nimport optuna\r\n\r\ndef objective(trial):\r\n x = trial.suggest_uniform('x' * 100, -10, 10)\r\n y = trial.suggest_uniform('y' * 100, -10, 10)\r\n return x - y\r\n\r\nstudy = optuna.create_study()\r\nstudy.optimize(objective, n_trials=20)\r\n\r\noptuna.visualization.plot_parallel_coordinate(study)\r\n```\n", "before_files": [{"content": "from collections import defaultdict\n\nfrom optuna.logging import get_logger\nfrom optuna.structs import StudyDirection\nfrom optuna.structs import TrialState\nfrom optuna import type_checking\nfrom optuna.visualization.utils import _check_plotly_availability\nfrom optuna.visualization.utils import is_available\n\nif type_checking.TYPE_CHECKING:\n from typing import Any # NOQA\n from typing import DefaultDict # NOQA\n from typing import Dict # NOQA\n from typing import List # NOQA\n from typing import Optional # NOQA\n\n from optuna.study import Study # NOQA\n\nif is_available():\n from optuna.visualization.plotly_imports import go\n\nlogger = get_logger(__name__)\n\n\ndef plot_parallel_coordinate(study, params=None):\n # type: (Study, Optional[List[str]]) -> go.Figure\n \"\"\"Plot the high-dimentional parameter relationships in a study.\n\n Note that, If a parameter contains missing values, a trial with missing values is not plotted.\n\n Example:\n\n The following code snippet shows how to plot the high-dimentional parameter relationships.\n\n .. testcode::\n\n import optuna\n\n def objective(trial):\n x = trial.suggest_uniform('x', -100, 100)\n y = trial.suggest_categorical('y', [-1, 0, 1])\n return x ** 2 + y\n\n study = optuna.create_study()\n study.optimize(objective, n_trials=10)\n\n optuna.visualization.plot_parallel_coordinate(study, params=['x', 'y'])\n\n .. 
raw:: html\n\n <iframe src=\"../_static/plot_parallel_coordinate.html\"\n width=\"100%\" height=\"500px\" frameborder=\"0\">\n </iframe>\n\n Args:\n study:\n A :class:`~optuna.study.Study` object whose trials are plotted for their objective\n values.\n params:\n Parameter list to visualize. The default is all parameters.\n\n Returns:\n A :class:`plotly.graph_objs.Figure` object.\n \"\"\"\n\n _check_plotly_availability()\n return _get_parallel_coordinate_plot(study, params)\n\n\ndef _get_parallel_coordinate_plot(study, params=None):\n # type: (Study, Optional[List[str]]) -> go.Figure\n\n layout = go.Layout(title=\"Parallel Coordinate Plot\",)\n\n trials = [trial for trial in study.trials if trial.state == TrialState.COMPLETE]\n\n if len(trials) == 0:\n logger.warning(\"Your study does not have any completed trials.\")\n return go.Figure(data=[], layout=layout)\n\n all_params = {p_name for t in trials for p_name in t.params.keys()}\n if params is not None:\n for input_p_name in params:\n if input_p_name not in all_params:\n ValueError(\"Parameter {} does not exist in your study.\".format(input_p_name))\n all_params = set(params)\n sorted_params = sorted(list(all_params))\n\n dims = [\n {\n \"label\": \"Objective Value\",\n \"values\": tuple([t.value for t in trials]),\n \"range\": (min([t.value for t in trials]), max([t.value for t in trials])),\n }\n ] # type: List[Dict[str, Any]]\n for p_name in sorted_params:\n values = []\n for t in trials:\n if p_name in t.params:\n values.append(t.params[p_name])\n is_categorical = False\n try:\n tuple(map(float, values))\n except (TypeError, ValueError):\n vocab = defaultdict(lambda: len(vocab)) # type: DefaultDict[str, int]\n values = [vocab[v] for v in values]\n is_categorical = True\n dim = {\"label\": p_name, \"values\": tuple(values), \"range\": (min(values), max(values))}\n if is_categorical:\n dim[\"tickvals\"] = list(range(len(vocab)))\n dim[\"ticktext\"] = list(sorted(vocab.items(), key=lambda x: x[1]))\n dims.append(dim)\n\n traces = [\n go.Parcoords(\n dimensions=dims,\n line={\n \"color\": dims[0][\"values\"],\n \"colorscale\": \"blues\",\n \"colorbar\": {\"title\": \"Objective Value\"},\n \"showscale\": True,\n \"reversescale\": study.direction == StudyDirection.MINIMIZE,\n },\n )\n ]\n\n figure = go.Figure(data=traces, layout=layout)\n\n return figure\n", "path": "optuna/visualization/parallel_coordinate.py"}], "after_files": [{"content": "from collections import defaultdict\n\nfrom optuna.logging import get_logger\nfrom optuna.structs import StudyDirection\nfrom optuna.structs import TrialState\nfrom optuna import type_checking\nfrom optuna.visualization.utils import _check_plotly_availability\nfrom optuna.visualization.utils import is_available\n\nif type_checking.TYPE_CHECKING:\n from typing import Any # NOQA\n from typing import DefaultDict # NOQA\n from typing import Dict # NOQA\n from typing import List # NOQA\n from typing import Optional # NOQA\n\n from optuna.study import Study # NOQA\n\nif is_available():\n from optuna.visualization.plotly_imports import go\n\nlogger = get_logger(__name__)\n\n\ndef plot_parallel_coordinate(study, params=None):\n # type: (Study, Optional[List[str]]) -> go.Figure\n \"\"\"Plot the high-dimentional parameter relationships in a study.\n\n Note that, If a parameter contains missing values, a trial with missing values is not plotted.\n\n Example:\n\n The following code snippet shows how to plot the high-dimentional parameter relationships.\n\n .. 
testcode::\n\n import optuna\n\n def objective(trial):\n x = trial.suggest_uniform('x', -100, 100)\n y = trial.suggest_categorical('y', [-1, 0, 1])\n return x ** 2 + y\n\n study = optuna.create_study()\n study.optimize(objective, n_trials=10)\n\n optuna.visualization.plot_parallel_coordinate(study, params=['x', 'y'])\n\n .. raw:: html\n\n <iframe src=\"../_static/plot_parallel_coordinate.html\"\n width=\"100%\" height=\"500px\" frameborder=\"0\">\n </iframe>\n\n Args:\n study:\n A :class:`~optuna.study.Study` object whose trials are plotted for their objective\n values.\n params:\n Parameter list to visualize. The default is all parameters.\n\n Returns:\n A :class:`plotly.graph_objs.Figure` object.\n \"\"\"\n\n _check_plotly_availability()\n return _get_parallel_coordinate_plot(study, params)\n\n\ndef _get_parallel_coordinate_plot(study, params=None):\n # type: (Study, Optional[List[str]]) -> go.Figure\n\n layout = go.Layout(title=\"Parallel Coordinate Plot\",)\n\n trials = [trial for trial in study.trials if trial.state == TrialState.COMPLETE]\n\n if len(trials) == 0:\n logger.warning(\"Your study does not have any completed trials.\")\n return go.Figure(data=[], layout=layout)\n\n all_params = {p_name for t in trials for p_name in t.params.keys()}\n if params is not None:\n for input_p_name in params:\n if input_p_name not in all_params:\n ValueError(\"Parameter {} does not exist in your study.\".format(input_p_name))\n all_params = set(params)\n sorted_params = sorted(list(all_params))\n\n dims = [\n {\n \"label\": \"Objective Value\",\n \"values\": tuple([t.value for t in trials]),\n \"range\": (min([t.value for t in trials]), max([t.value for t in trials])),\n }\n ] # type: List[Dict[str, Any]]\n\n for p_name in sorted_params:\n values = []\n for t in trials:\n if p_name in t.params:\n values.append(t.params[p_name])\n is_categorical = False\n try:\n tuple(map(float, values))\n except (TypeError, ValueError):\n vocab = defaultdict(lambda: len(vocab)) # type: DefaultDict[str, int]\n values = [vocab[v] for v in values]\n is_categorical = True\n dim = {\n \"label\": p_name if len(p_name) < 20 else \"{}...\".format(p_name[:17]),\n \"values\": tuple(values),\n \"range\": (min(values), max(values)),\n } # type: Dict[str, object]\n if is_categorical:\n dim[\"tickvals\"] = list(range(len(vocab)))\n dim[\"ticktext\"] = list(sorted(vocab.items(), key=lambda x: x[1]))\n dims.append(dim)\n\n traces = [\n go.Parcoords(\n dimensions=dims,\n labelangle=30,\n labelside=\"bottom\",\n line={\n \"color\": dims[0][\"values\"],\n \"colorscale\": \"blues\",\n \"colorbar\": {\"title\": \"Objective Value\"},\n \"showscale\": True,\n \"reversescale\": study.direction == StudyDirection.MINIMIZE,\n },\n )\n ]\n\n figure = go.Figure(data=traces, layout=layout)\n\n return figure\n", "path": "optuna/visualization/parallel_coordinate.py"}]}
| 1,789 | 360 |
gh_patches_debug_60690
|
rasdani/github-patches
|
git_diff
|
biolab__orange3-text-356
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bag of Words: crashes if < 11 tokens on the input
<!--
This is an issue template. Please fill in the relevant details in the
sections below.
-->
##### Text version
<!-- From menu _Options→Add-ons→Orange3-Text_ or code `orangecontrib.text.version.full_version` -->
0.3.0
##### Orange version
<!-- From menu _Help→About→Version_ or code `Orange.version.full_version` -->
3.14.dev
##### Expected behavior
Bag of Words doesn't crash on few tokens
##### Actual behavior
BoW crashes if less then 11 tokens on the input and Binary option selected.
##### Steps to reproduce the behavior
Corpus - Preprocess Text (have it output less than 11 types) - Bag of Words (Binary)
##### Additional info (worksheets, data, screenshots, ...)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `orangecontrib/text/vectorization/bagofwords.py`
Content:
```
1 """ This module constructs a new corpus with tokens as features.
2
3 First create a corpus::
4
5 >>> from orangecontrib.text import Corpus
6 >>> corpus = Corpus.from_file('deerwester')
7 >>> corpus.domain
8 [ | Category] {Text}
9
10 Then create :class:`BowVectorizer` object and call transform:
11
12 >>> from orangecontrib.text.vectorization.bagofwords import BowVectorizer
13 >>> bow = BowVectorizer()
14 >>> new_corpus = bow.transform(corpus)
15 >>> new_corpus.domain
16 [a, abc, and, applications, binary, computer, engineering, eps, error, for,
17 generation, graph, human, in, interface, intersection, iv, lab, machine,
18 management, measurement, minors, of, opinion, ordering, paths, perceived,
19 quasi, random, relation, response, survey, system, testing, the, time, to,
20 trees, unordered, user, well, widths | Category] {Text}
21
22 """
23
24 from collections import OrderedDict
25 from functools import partial
26
27 import numpy as np
28 from gensim import corpora, models, matutils
29 from sklearn.preprocessing import normalize
30
31 from orangecontrib.text.vectorization.base import BaseVectorizer,\
32 SharedTransform, VectorizationComputeValue
33
34
35 class BowVectorizer(BaseVectorizer):
36 name = 'BoW Vectorizer'
37
38 COUNT = 'Count'
39 BINARY = 'Binary'
40 SUBLINEAR = 'Sublinear'
41 NONE = '(None)'
42 IDF = 'IDF'
43 SMOOTH = 'Smooth IDF'
44 L1 = 'L1 (Sum of elements)'
45 L2 = 'L2 (Euclidean)'
46
47 wlocals = OrderedDict((
48 (COUNT, lambda tf: tf),
49 (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int)),
50 (SUBLINEAR, lambda tf: 1 + np.log(tf)),
51 ))
52
53 wglobals = OrderedDict((
54 (NONE, lambda df, N: 1),
55 (IDF, lambda df, N: np.log(N/df)),
56 (SMOOTH, lambda df, N: np.log(1 + N/df)),
57 ))
58
59 norms = OrderedDict((
60 (NONE, None),
61 (L1, partial(normalize, norm='l1')),
62 (L2, partial(normalize, norm='l2')),
63 ))
64
65 def __init__(self, norm=NONE, wlocal=COUNT, wglobal=NONE):
66 self.norm = norm
67 self.wlocal = wlocal
68 self.wglobal = wglobal
69
70 def _transform(self, corpus, source_dict=None):
71 temp_corpus = list(corpus.ngrams_iterator(' ', include_postags=True))
72 dic = corpora.Dictionary(temp_corpus, prune_at=None) if not source_dict else source_dict
73 temp_corpus = [dic.doc2bow(doc) for doc in temp_corpus]
74 model = models.TfidfModel(temp_corpus, normalize=False,
75 wlocal=self.wlocals[self.wlocal],
76 wglobal=self.wglobals[self.wglobal])
77
78 X = matutils.corpus2csc(model[temp_corpus], dtype=np.float, num_terms=len(dic)).T
79 norm = self.norms[self.norm]
80 if norm:
81 X = norm(X)
82
83 # set compute values
84 shared_cv = SharedTransform(self, corpus.used_preprocessor,
85 source_dict=dic)
86 cv = [VectorizationComputeValue(shared_cv, dic[i])
87 for i in range(len(dic))]
88
89 self.add_features(corpus, X, dic, cv, var_attrs={'bow-feature': True})
90 return corpus
91
92 def report(self):
93 return (('Term Frequency', self.wlocal),
94 ('Document Frequency', self.wglobal),
95 ('Regularization', self.norm),)
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/orangecontrib/text/vectorization/bagofwords.py b/orangecontrib/text/vectorization/bagofwords.py
--- a/orangecontrib/text/vectorization/bagofwords.py
+++ b/orangecontrib/text/vectorization/bagofwords.py
@@ -46,7 +46,8 @@
wlocals = OrderedDict((
(COUNT, lambda tf: tf),
- (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int)),
+ (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int) if tf.size
+ else np.array([], dtype=np.int)),
(SUBLINEAR, lambda tf: 1 + np.log(tf)),
))
|
{"golden_diff": "diff --git a/orangecontrib/text/vectorization/bagofwords.py b/orangecontrib/text/vectorization/bagofwords.py\n--- a/orangecontrib/text/vectorization/bagofwords.py\n+++ b/orangecontrib/text/vectorization/bagofwords.py\n@@ -46,7 +46,8 @@\n \n wlocals = OrderedDict((\n (COUNT, lambda tf: tf),\n- (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int)),\n+ (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int) if tf.size\n+ else np.array([], dtype=np.int)),\n (SUBLINEAR, lambda tf: 1 + np.log(tf)),\n ))\n", "issue": "Bag of Words: crashes if < 11 tokens on the input\n<!--\r\nThis is an issue template. Please fill in the relevant details in the\r\nsections below.\r\n-->\r\n\r\n##### Text version\r\n<!-- From menu _Options\u2192Add-ons\u2192Orange3-Text_ or code `orangecontrib.text.version.full_version` -->\r\n0.3.0\r\n\r\n##### Orange version\r\n<!-- From menu _Help\u2192About\u2192Version_ or code `Orange.version.full_version` -->\r\n3.14.dev\r\n\r\n##### Expected behavior\r\nBag of Words doesn't crash on few tokens\r\n\r\n\r\n##### Actual behavior\r\nBoW crashes if less then 11 tokens on the input and Binary option selected.\r\n\r\n\r\n##### Steps to reproduce the behavior\r\nCorpus - Preprocess Text (have it output less than 11 types) - Bag of Words (Binary)\r\n\r\n\r\n##### Additional info (worksheets, data, screenshots, ...)\r\n\r\n\r\n\n", "before_files": [{"content": "\"\"\" This module constructs a new corpus with tokens as features.\n\nFirst create a corpus::\n\n >>> from orangecontrib.text import Corpus\n >>> corpus = Corpus.from_file('deerwester')\n >>> corpus.domain\n [ | Category] {Text}\n\nThen create :class:`BowVectorizer` object and call transform:\n\n >>> from orangecontrib.text.vectorization.bagofwords import BowVectorizer\n >>> bow = BowVectorizer()\n >>> new_corpus = bow.transform(corpus)\n >>> new_corpus.domain\n [a, abc, and, applications, binary, computer, engineering, eps, error, for,\n generation, graph, human, in, interface, intersection, iv, lab, machine,\n management, measurement, minors, of, opinion, ordering, paths, perceived,\n quasi, random, relation, response, survey, system, testing, the, time, to,\n trees, unordered, user, well, widths | Category] {Text}\n\n\"\"\"\n\nfrom collections import OrderedDict\nfrom functools import partial\n\nimport numpy as np\nfrom gensim import corpora, models, matutils\nfrom sklearn.preprocessing import normalize\n\nfrom orangecontrib.text.vectorization.base import BaseVectorizer,\\\n SharedTransform, VectorizationComputeValue\n\n\nclass BowVectorizer(BaseVectorizer):\n name = 'BoW Vectorizer'\n\n COUNT = 'Count'\n BINARY = 'Binary'\n SUBLINEAR = 'Sublinear'\n NONE = '(None)'\n IDF = 'IDF'\n SMOOTH = 'Smooth IDF'\n L1 = 'L1 (Sum of elements)'\n L2 = 'L2 (Euclidean)'\n\n wlocals = OrderedDict((\n (COUNT, lambda tf: tf),\n (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int)),\n (SUBLINEAR, lambda tf: 1 + np.log(tf)),\n ))\n\n wglobals = OrderedDict((\n (NONE, lambda df, N: 1),\n (IDF, lambda df, N: np.log(N/df)),\n (SMOOTH, lambda df, N: np.log(1 + N/df)),\n ))\n\n norms = OrderedDict((\n (NONE, None),\n (L1, partial(normalize, norm='l1')),\n (L2, partial(normalize, norm='l2')),\n ))\n\n def __init__(self, norm=NONE, wlocal=COUNT, wglobal=NONE):\n self.norm = norm\n self.wlocal = wlocal\n self.wglobal = wglobal\n\n def _transform(self, corpus, source_dict=None):\n temp_corpus = list(corpus.ngrams_iterator(' ', include_postags=True))\n dic = corpora.Dictionary(temp_corpus, prune_at=None) if not source_dict else 
source_dict\n temp_corpus = [dic.doc2bow(doc) for doc in temp_corpus]\n model = models.TfidfModel(temp_corpus, normalize=False,\n wlocal=self.wlocals[self.wlocal],\n wglobal=self.wglobals[self.wglobal])\n\n X = matutils.corpus2csc(model[temp_corpus], dtype=np.float, num_terms=len(dic)).T\n norm = self.norms[self.norm]\n if norm:\n X = norm(X)\n\n # set compute values\n shared_cv = SharedTransform(self, corpus.used_preprocessor,\n source_dict=dic)\n cv = [VectorizationComputeValue(shared_cv, dic[i])\n for i in range(len(dic))]\n\n self.add_features(corpus, X, dic, cv, var_attrs={'bow-feature': True})\n return corpus\n\n def report(self):\n return (('Term Frequency', self.wlocal),\n ('Document Frequency', self.wglobal),\n ('Regularization', self.norm),)\n", "path": "orangecontrib/text/vectorization/bagofwords.py"}], "after_files": [{"content": "\"\"\" This module constructs a new corpus with tokens as features.\n\nFirst create a corpus::\n\n >>> from orangecontrib.text import Corpus\n >>> corpus = Corpus.from_file('deerwester')\n >>> corpus.domain\n [ | Category] {Text}\n\nThen create :class:`BowVectorizer` object and call transform:\n\n >>> from orangecontrib.text.vectorization.bagofwords import BowVectorizer\n >>> bow = BowVectorizer()\n >>> new_corpus = bow.transform(corpus)\n >>> new_corpus.domain\n [a, abc, and, applications, binary, computer, engineering, eps, error, for,\n generation, graph, human, in, interface, intersection, iv, lab, machine,\n management, measurement, minors, of, opinion, ordering, paths, perceived,\n quasi, random, relation, response, survey, system, testing, the, time, to,\n trees, unordered, user, well, widths | Category] {Text}\n\n\"\"\"\n\nfrom collections import OrderedDict\nfrom functools import partial\n\nimport numpy as np\nfrom gensim import corpora, models, matutils\nfrom sklearn.preprocessing import normalize\n\nfrom orangecontrib.text.vectorization.base import BaseVectorizer,\\\n SharedTransform, VectorizationComputeValue\n\n\nclass BowVectorizer(BaseVectorizer):\n name = 'BoW Vectorizer'\n\n COUNT = 'Count'\n BINARY = 'Binary'\n SUBLINEAR = 'Sublinear'\n NONE = '(None)'\n IDF = 'IDF'\n SMOOTH = 'Smooth IDF'\n L1 = 'L1 (Sum of elements)'\n L2 = 'L2 (Euclidean)'\n\n wlocals = OrderedDict((\n (COUNT, lambda tf: tf),\n (BINARY, lambda tf: np.greater(tf, 0, dtype=np.int) if tf.size\n else np.array([], dtype=np.int)),\n (SUBLINEAR, lambda tf: 1 + np.log(tf)),\n ))\n\n wglobals = OrderedDict((\n (NONE, lambda df, N: 1),\n (IDF, lambda df, N: np.log(N/df)),\n (SMOOTH, lambda df, N: np.log(1 + N/df)),\n ))\n\n norms = OrderedDict((\n (NONE, None),\n (L1, partial(normalize, norm='l1')),\n (L2, partial(normalize, norm='l2')),\n ))\n\n def __init__(self, norm=NONE, wlocal=COUNT, wglobal=NONE):\n self.norm = norm\n self.wlocal = wlocal\n self.wglobal = wglobal\n\n def _transform(self, corpus, source_dict=None):\n temp_corpus = list(corpus.ngrams_iterator(' ', include_postags=True))\n dic = corpora.Dictionary(temp_corpus, prune_at=None) if not source_dict else source_dict\n temp_corpus = [dic.doc2bow(doc) for doc in temp_corpus]\n model = models.TfidfModel(temp_corpus, normalize=False,\n wlocal=self.wlocals[self.wlocal],\n wglobal=self.wglobals[self.wglobal])\n\n X = matutils.corpus2csc(model[temp_corpus], dtype=np.float, num_terms=len(dic)).T\n norm = self.norms[self.norm]\n if norm:\n X = norm(X)\n\n # set compute values\n shared_cv = SharedTransform(self, corpus.used_preprocessor,\n source_dict=dic)\n cv = [VectorizationComputeValue(shared_cv, dic[i])\n for i in 
range(len(dic))]\n\n self.add_features(corpus, X, dic, cv, var_attrs={'bow-feature': True})\n return corpus\n\n def report(self):\n return (('Term Frequency', self.wlocal),\n ('Document Frequency', self.wglobal),\n ('Regularization', self.norm),)\n", "path": "orangecontrib/text/vectorization/bagofwords.py"}]}
| 1,427 | 158 |
gh_patches_debug_30772
|
rasdani/github-patches
|
git_diff
|
3cn-ecn__nantralPlatform-446
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Slow pages
Some pages are a bit slow to load:
- the club list
This may be related to the large number of images; it would be worth looking into caching these images.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `server/apps/club/views.py`
Content:
```
1 from django.views.generic import ListView, TemplateView
2 from django.contrib.auth.mixins import LoginRequiredMixin
3 from django.urls import resolve
4
5 from apps.club.models import Club, BDX
6 from apps.group.models import Group
7 from apps.group.views import BaseDetailGroupView
8
9 from apps.utils.slug import *
10
11 class ListClubView(TemplateView):
12 template_name = 'club/list.html'
13
14 def get_context_data(self, **kwargs):
15 context = {'club_list': [] }
16 try:
17 context['club_list'].append({
18 'grouper': "Mes Clubs et Assos",
19 'list': Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type'),
20 })
21 except Exception:
22 pass
23 club_list = Club.objects.all().select_related('bdx_type').only('name', 'slug', 'logo', 'bdx_type')
24 context['club_list'].append({
25 'grouper': "Associations",
26 'list': club_list.filter(bdx_type__isnull=True)
27 })
28 for bdx in BDX.objects.all():
29 context['club_list'].append({
30 'grouper': f'Clubs {bdx.name}',
31 'list': club_list.filter(bdx_type=bdx),
32 })
33 return context
34
35
36
37 class DetailClubView(BaseDetailGroupView):
38 '''Vue de détails d'un club.'''
39
40 template_name='club/detail.html'
41
42
43 class DetailGroupMembersView(LoginRequiredMixin, ListView):
44 template_name = 'club/members.html'
45
46 def get_object(self, **kwargs):
47 app = resolve(self.request.path).app_name
48 slug = self.kwargs.get("slug")
49 return get_object_from_slug(app, slug)
50
51 def get_queryset(self, **kwargs):
52 object = self.get_object()
53 members = object.members.through.objects.filter(group=object)
54 return members.order_by('year', 'order')
55
56 def get_context_data(self, **kwargs):
57 context = super().get_context_data(**kwargs)
58 context['object'] = self.get_object()
59 return context
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/server/apps/club/views.py b/server/apps/club/views.py
--- a/server/apps/club/views.py
+++ b/server/apps/club/views.py
@@ -2,8 +2,7 @@
from django.contrib.auth.mixins import LoginRequiredMixin
from django.urls import resolve
-from apps.club.models import Club, BDX
-from apps.group.models import Group
+from apps.club.models import Club
from apps.group.views import BaseDetailGroupView
from apps.utils.slug import *
@@ -12,24 +11,18 @@
template_name = 'club/list.html'
def get_context_data(self, **kwargs):
- context = {'club_list': [] }
- try:
- context['club_list'].append({
- 'grouper': "Mes Clubs et Assos",
- 'list': Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type'),
- })
- except Exception:
- pass
- club_list = Club.objects.all().select_related('bdx_type').only('name', 'slug', 'logo', 'bdx_type')
- context['club_list'].append({
- 'grouper': "Associations",
- 'list': club_list.filter(bdx_type__isnull=True)
- })
- for bdx in BDX.objects.all():
- context['club_list'].append({
- 'grouper': f'Clubs {bdx.name}',
- 'list': club_list.filter(bdx_type=bdx),
- })
+ context = {'club_list': {} }
+ clubList = {}
+ allMembersClub = Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type')
+ for club in allMembersClub:
+ clubList.setdefault("Mes Clubs et Assos", []).append(club)
+ allClubs = Club.objects.all().select_related("bdx_type").only('name', 'slug', 'logo', 'bdx_type')
+ for club in allClubs:
+ if(club.bdx_type is None):
+ clubList.setdefault("Associations", []).append(club)
+ else:
+ clubList.setdefault(f'Clubs {club.bdx_type.name}', []).append(club)
+ context['club_list']=clubList
return context
|
{"golden_diff": "diff --git a/server/apps/club/views.py b/server/apps/club/views.py\n--- a/server/apps/club/views.py\n+++ b/server/apps/club/views.py\n@@ -2,8 +2,7 @@\n from django.contrib.auth.mixins import LoginRequiredMixin\n from django.urls import resolve\n \n-from apps.club.models import Club, BDX\n-from apps.group.models import Group\n+from apps.club.models import Club\n from apps.group.views import BaseDetailGroupView\n \n from apps.utils.slug import *\n@@ -12,24 +11,18 @@\n template_name = 'club/list.html'\n \n def get_context_data(self, **kwargs):\n- context = {'club_list': [] }\n- try:\n- context['club_list'].append({\n- 'grouper': \"Mes Clubs et Assos\",\n- 'list': Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type'),\n- })\n- except Exception:\n- pass\n- club_list = Club.objects.all().select_related('bdx_type').only('name', 'slug', 'logo', 'bdx_type')\n- context['club_list'].append({\n- 'grouper': \"Associations\",\n- 'list': club_list.filter(bdx_type__isnull=True)\n- })\n- for bdx in BDX.objects.all():\n- context['club_list'].append({\n- 'grouper': f'Clubs {bdx.name}',\n- 'list': club_list.filter(bdx_type=bdx),\n- })\n+ context = {'club_list': {} }\n+ clubList = {}\n+ allMembersClub = Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type')\n+ for club in allMembersClub:\n+ clubList.setdefault(\"Mes Clubs et Assos\", []).append(club)\n+ allClubs = Club.objects.all().select_related(\"bdx_type\").only('name', 'slug', 'logo', 'bdx_type')\n+ for club in allClubs:\n+ if(club.bdx_type is None):\n+ clubList.setdefault(\"Associations\", []).append(club)\n+ else:\n+ clubList.setdefault(f'Clubs {club.bdx_type.name}', []).append(club)\n+ context['club_list']=clubList\n return context\n", "issue": "Pages lentes\nCertaines pages sont un peu lentes \u00e0 charger:\r\n- liste des clubs\r\n\r\nC'est peut-\u00eatre li\u00e9 au grand nombre d'images, il faudrait \u00e9tudier la possibilit\u00e9 de cacher ces images.\n", "before_files": [{"content": "from django.views.generic import ListView, TemplateView\nfrom django.contrib.auth.mixins import LoginRequiredMixin\nfrom django.urls import resolve\n\nfrom apps.club.models import Club, BDX\nfrom apps.group.models import Group\nfrom apps.group.views import BaseDetailGroupView\n\nfrom apps.utils.slug import *\n\nclass ListClubView(TemplateView):\n template_name = 'club/list.html'\n\n def get_context_data(self, **kwargs):\n context = {'club_list': [] }\n try:\n context['club_list'].append({\n 'grouper': \"Mes Clubs et Assos\",\n 'list': Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type'),\n })\n except Exception:\n pass\n club_list = Club.objects.all().select_related('bdx_type').only('name', 'slug', 'logo', 'bdx_type')\n context['club_list'].append({\n 'grouper': \"Associations\",\n 'list': club_list.filter(bdx_type__isnull=True)\n })\n for bdx in BDX.objects.all():\n context['club_list'].append({\n 'grouper': f'Clubs {bdx.name}',\n 'list': club_list.filter(bdx_type=bdx),\n })\n return context\n\n\n\nclass DetailClubView(BaseDetailGroupView):\n '''Vue de d\u00e9tails d'un club.'''\n \n template_name='club/detail.html'\n\n\nclass DetailGroupMembersView(LoginRequiredMixin, ListView):\n template_name = 'club/members.html'\n \n def get_object(self, **kwargs):\n app = resolve(self.request.path).app_name\n slug = self.kwargs.get(\"slug\")\n return get_object_from_slug(app, slug)\n \n def get_queryset(self, **kwargs):\n object = 
self.get_object()\n members = object.members.through.objects.filter(group=object)\n return members.order_by('year', 'order')\n \n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['object'] = self.get_object()\n return context\n", "path": "server/apps/club/views.py"}], "after_files": [{"content": "from django.views.generic import ListView, TemplateView\nfrom django.contrib.auth.mixins import LoginRequiredMixin\nfrom django.urls import resolve\n\nfrom apps.club.models import Club\nfrom apps.group.views import BaseDetailGroupView\n\nfrom apps.utils.slug import *\n\nclass ListClubView(TemplateView):\n template_name = 'club/list.html'\n\n def get_context_data(self, **kwargs):\n context = {'club_list': {} }\n clubList = {}\n allMembersClub = Club.objects.filter(members__user=self.request.user).only('name', 'slug', 'logo', 'bdx_type')\n for club in allMembersClub:\n clubList.setdefault(\"Mes Clubs et Assos\", []).append(club)\n allClubs = Club.objects.all().select_related(\"bdx_type\").only('name', 'slug', 'logo', 'bdx_type')\n for club in allClubs:\n if(club.bdx_type is None):\n clubList.setdefault(\"Associations\", []).append(club)\n else:\n clubList.setdefault(f'Clubs {club.bdx_type.name}', []).append(club)\n context['club_list']=clubList\n return context\n\n\n\nclass DetailClubView(BaseDetailGroupView):\n '''Vue de d\u00e9tails d'un club.'''\n \n template_name='club/detail.html'\n\n\nclass DetailGroupMembersView(LoginRequiredMixin, ListView):\n template_name = 'club/members.html'\n \n def get_object(self, **kwargs):\n app = resolve(self.request.path).app_name\n slug = self.kwargs.get(\"slug\")\n return get_object_from_slug(app, slug)\n \n def get_queryset(self, **kwargs):\n object = self.get_object()\n members = object.members.through.objects.filter(group=object)\n return members.order_by('year', 'order')\n \n def get_context_data(self, **kwargs):\n context = super().get_context_data(**kwargs)\n context['object'] = self.get_object()\n return context\n", "path": "server/apps/club/views.py"}]}
| 868 | 520 |
gh_patches_debug_23889
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-center-index-2500
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
GLOG 0.4.0 : unresolved external symbol google::InitGoogleLogging
Package Name/Version: **glog/0.4.0**
Operating System+version: **Win 10**
Compiler+version: **MSVS 2015 & MSVS 2019**
Conan version: **1.28.0**
cmake version:**3.18.0**
Ninja version:**1.10.0**
This may be similar to: https://github.com/conan-io/conan-center-index/issues/1691
Using conan-cmake:
```cmake
conan_cmake_run(
REQUIRES
glog/0.4.0
IMPORTS
${CONANIMPORTS}
BASIC_SETUP
CMAKE_TARGETS
)
add_executable( ${PROJECT_NAME} ${SOURCES} )
target_link_libraries(${PROJECT_NAME} CONAN_PKG::glog)
```
main.cpp is simple enough:
```Bash
#include <glog/logging.h>
int main(int argc, char* argv[]) {
// Initialize Google's logging library.
google::InitGoogleLogging(argv[0]);
LOG(INFO) << "This is an info message";
LOG(WARNING) << "This is a warning message";
LOG(ERROR) << "This is an error message";
LOG(FATAL) << "This is a fatal message";
return 0;
}
```
Log attached below.
[Conan_GLOG_Fail.log](https://github.com/conan-io/conan-center-index/files/5062714/Conan_GLOG_Fail.log)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/glog/all/conanfile.py`
Content:
```
1 from conans import ConanFile, CMake, tools
2 import os
3
4
5 class GlogConan(ConanFile):
6 name = "glog"
7 url = "https://github.com/conan-io/conan-center-index"
8 homepage = "https://github.com/google/glog/"
9 description = "Google logging library"
10 topics = ("conan", "glog", "logging")
11 license = "BSD 3-Clause"
12 exports_sources = ["CMakeLists.txt", "patches/**"]
13 generators = "cmake", "cmake_find_package"
14 settings = "os", "arch", "compiler", "build_type"
15 options = {"shared": [True, False], "fPIC": [True, False], "with_gflags": [True, False], "with_threads": [True, False]}
16 default_options = {"shared": False, "fPIC": True, "with_gflags": True, "with_threads": True}
17
18 _cmake = None
19
20 @property
21 def _source_subfolder(self):
22 return "source_subfolder"
23
24 def config_options(self):
25 if self.settings.os == "Windows":
26 del self.options.fPIC
27
28 def configure(self):
29 if self.options.shared:
30 del self.options.fPIC
31 if self.options.with_gflags:
32 self.options["gflags"].shared = self.options.shared
33
34 def requirements(self):
35 if self.options.with_gflags:
36 self.requires("gflags/2.2.2")
37
38 def source(self):
39 tools.get(**self.conan_data["sources"][self.version])
40 extracted_dir = self.name + "-" + self.version
41 os.rename(extracted_dir, self._source_subfolder)
42
43 def _configure_cmake(self):
44 if self._cmake:
45 return self._cmake
46 self._cmake = CMake(self)
47 self._cmake.definitions["WITH_GFLAGS"] = self.options.with_gflags
48 self._cmake.definitions["WITH_THREADS"] = self.options.with_threads
49 self._cmake.definitions["BUILD_TESTING"] = False
50 self._cmake.configure()
51 return self._cmake
52
53 def build(self):
54 for patch in self.conan_data.get("patches", {}).get(self.version, []):
55 tools.patch(**patch)
56 cmake = self._configure_cmake()
57 cmake.build()
58
59 def package(self):
60 self.copy("COPYING", dst="licenses", src=self._source_subfolder)
61 cmake = self._configure_cmake()
62 cmake.install()
63 tools.rmdir(os.path.join(self.package_folder, "lib", "cmake"))
64
65 def package_info(self):
66 self.cpp_info.libs = tools.collect_libs(self)
67 if self.settings.os == "Linux":
68 self.cpp_info.system_libs.append("pthread")
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/recipes/glog/all/conanfile.py b/recipes/glog/all/conanfile.py
--- a/recipes/glog/all/conanfile.py
+++ b/recipes/glog/all/conanfile.py
@@ -8,7 +8,7 @@
homepage = "https://github.com/google/glog/"
description = "Google logging library"
topics = ("conan", "glog", "logging")
- license = "BSD 3-Clause"
+ license = "BSD-3-Clause"
exports_sources = ["CMakeLists.txt", "patches/**"]
generators = "cmake", "cmake_find_package"
settings = "os", "arch", "compiler", "build_type"
@@ -64,5 +64,12 @@
def package_info(self):
self.cpp_info.libs = tools.collect_libs(self)
+ self.cpp_info.names["pkgconfig"] = ["libglog"]
if self.settings.os == "Linux":
- self.cpp_info.system_libs.append("pthread")
+ self.cpp_info.system_libs = ["pthread"]
+ elif self.settings.os == "Windows":
+ self.cpp_info.defines = ["GLOG_NO_ABBREVIATED_SEVERITIES"]
+ decl = "__declspec(dllimport)" if self.options.shared else ""
+ self.cpp_info.defines.append("GOOGLE_GLOG_DLL_DECL={}".format(decl))
+ if self.options.with_gflags and not self.options.shared:
+ self.cpp_info.defines.extend(["GFLAGS_DLL_DECLARE_FLAG=", "GFLAGS_DLL_DEFINE_FLAG="])
|
{"golden_diff": "diff --git a/recipes/glog/all/conanfile.py b/recipes/glog/all/conanfile.py\n--- a/recipes/glog/all/conanfile.py\n+++ b/recipes/glog/all/conanfile.py\n@@ -8,7 +8,7 @@\n homepage = \"https://github.com/google/glog/\"\n description = \"Google logging library\"\n topics = (\"conan\", \"glog\", \"logging\")\n- license = \"BSD 3-Clause\"\n+ license = \"BSD-3-Clause\"\n exports_sources = [\"CMakeLists.txt\", \"patches/**\"]\n generators = \"cmake\", \"cmake_find_package\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n@@ -64,5 +64,12 @@\n \n def package_info(self):\n self.cpp_info.libs = tools.collect_libs(self)\n+ self.cpp_info.names[\"pkgconfig\"] = [\"libglog\"]\n if self.settings.os == \"Linux\":\n- self.cpp_info.system_libs.append(\"pthread\")\n+ self.cpp_info.system_libs = [\"pthread\"]\n+ elif self.settings.os == \"Windows\":\n+ self.cpp_info.defines = [\"GLOG_NO_ABBREVIATED_SEVERITIES\"]\n+ decl = \"__declspec(dllimport)\" if self.options.shared else \"\"\n+ self.cpp_info.defines.append(\"GOOGLE_GLOG_DLL_DECL={}\".format(decl))\n+ if self.options.with_gflags and not self.options.shared:\n+ self.cpp_info.defines.extend([\"GFLAGS_DLL_DECLARE_FLAG=\", \"GFLAGS_DLL_DEFINE_FLAG=\"])\n", "issue": "GLOG 0.4.0 : unresolved external symbol google::InitGoogleLogging\n\r\nPackage Name/Version: **glog/0.4.0**\r\nOperating System+version: **Win 10**\r\nCompiler+version: **MSVS 2015 & MSVS 2019**\r\nConan version: **1.28.0**\r\ncmake version:**3.18.0**\r\nNinja version:**1.10.0**\r\n\r\nThis may be similar to: https://github.com/conan-io/conan-center-index/issues/1691\r\n\r\nUsing conan-cmake:\r\n``` Bash\r\nconan_cmake_run(\r\n REQUIRES\r\n glog/0.4.0\r\n IMPORTS\r\n ${CONANIMPORTS}\r\n BASIC_SETUP\r\n CMAKE_TARGETS\r\n )\r\n\r\nadd_executable( ${PROJECT_NAME} ${SOURCES} )\r\ntarget_link_libraries(${PROJECT_NAME} CONAN_PKG::glog)\r\n\r\n```\r\n\r\nmain.cpp is simple enough:\r\n\r\n```Bash\r\n\r\n#include <glog/logging.h>\r\n\r\nint main(int argc, char* argv[]) {\r\n\r\n\r\n // Initialize Google's logging library.\r\n google::InitGoogleLogging(argv[0]);\r\n\r\n LOG(INFO) << \"This is an info message\";\r\n LOG(WARNING) << \"This is a warning message\";\r\n LOG(ERROR) << \"This is an error message\";\r\n LOG(FATAL) << \"This is a fatal message\";\r\n\r\n return 0;\r\n}\r\n```\r\n\r\nLog attached below.\r\n[Conan_GLOG_Fail.log](https://github.com/conan-io/conan-center-index/files/5062714/Conan_GLOG_Fail.log)\n", "before_files": [{"content": "from conans import ConanFile, CMake, tools\nimport os\n\n\nclass GlogConan(ConanFile):\n name = \"glog\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/google/glog/\"\n description = \"Google logging library\"\n topics = (\"conan\", \"glog\", \"logging\")\n license = \"BSD 3-Clause\"\n exports_sources = [\"CMakeLists.txt\", \"patches/**\"]\n generators = \"cmake\", \"cmake_find_package\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\"shared\": [True, False], \"fPIC\": [True, False], \"with_gflags\": [True, False], \"with_threads\": [True, False]}\n default_options = {\"shared\": False, \"fPIC\": True, \"with_gflags\": True, \"with_threads\": True}\n\n _cmake = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n if self.options.with_gflags:\n 
self.options[\"gflags\"].shared = self.options.shared\n\n def requirements(self):\n if self.options.with_gflags:\n self.requires(\"gflags/2.2.2\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def _configure_cmake(self):\n if self._cmake:\n return self._cmake\n self._cmake = CMake(self)\n self._cmake.definitions[\"WITH_GFLAGS\"] = self.options.with_gflags\n self._cmake.definitions[\"WITH_THREADS\"] = self.options.with_threads\n self._cmake.definitions[\"BUILD_TESTING\"] = False\n self._cmake.configure()\n return self._cmake\n\n def build(self):\n for patch in self.conan_data.get(\"patches\", {}).get(self.version, []):\n tools.patch(**patch)\n cmake = self._configure_cmake()\n cmake.build()\n\n def package(self):\n self.copy(\"COPYING\", dst=\"licenses\", src=self._source_subfolder)\n cmake = self._configure_cmake()\n cmake.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"cmake\"))\n\n def package_info(self):\n self.cpp_info.libs = tools.collect_libs(self)\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs.append(\"pthread\")\n", "path": "recipes/glog/all/conanfile.py"}], "after_files": [{"content": "from conans import ConanFile, CMake, tools\nimport os\n\n\nclass GlogConan(ConanFile):\n name = \"glog\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/google/glog/\"\n description = \"Google logging library\"\n topics = (\"conan\", \"glog\", \"logging\")\n license = \"BSD-3-Clause\"\n exports_sources = [\"CMakeLists.txt\", \"patches/**\"]\n generators = \"cmake\", \"cmake_find_package\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\"shared\": [True, False], \"fPIC\": [True, False], \"with_gflags\": [True, False], \"with_threads\": [True, False]}\n default_options = {\"shared\": False, \"fPIC\": True, \"with_gflags\": True, \"with_threads\": True}\n\n _cmake = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n if self.options.with_gflags:\n self.options[\"gflags\"].shared = self.options.shared\n\n def requirements(self):\n if self.options.with_gflags:\n self.requires(\"gflags/2.2.2\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def _configure_cmake(self):\n if self._cmake:\n return self._cmake\n self._cmake = CMake(self)\n self._cmake.definitions[\"WITH_GFLAGS\"] = self.options.with_gflags\n self._cmake.definitions[\"WITH_THREADS\"] = self.options.with_threads\n self._cmake.definitions[\"BUILD_TESTING\"] = False\n self._cmake.configure()\n return self._cmake\n\n def build(self):\n for patch in self.conan_data.get(\"patches\", {}).get(self.version, []):\n tools.patch(**patch)\n cmake = self._configure_cmake()\n cmake.build()\n\n def package(self):\n self.copy(\"COPYING\", dst=\"licenses\", src=self._source_subfolder)\n cmake = self._configure_cmake()\n cmake.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"cmake\"))\n\n def package_info(self):\n self.cpp_info.libs = tools.collect_libs(self)\n self.cpp_info.names[\"pkgconfig\"] = [\"libglog\"]\n if self.settings.os == \"Linux\":\n 
self.cpp_info.system_libs = [\"pthread\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.defines = [\"GLOG_NO_ABBREVIATED_SEVERITIES\"]\n decl = \"__declspec(dllimport)\" if self.options.shared else \"\"\n self.cpp_info.defines.append(\"GOOGLE_GLOG_DLL_DECL={}\".format(decl))\n if self.options.with_gflags and not self.options.shared:\n self.cpp_info.defines.extend([\"GFLAGS_DLL_DECLARE_FLAG=\", \"GFLAGS_DLL_DEFINE_FLAG=\"])\n", "path": "recipes/glog/all/conanfile.py"}]}
| 1,347 | 341 |
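
To make the Windows fix above easier to follow, here is a minimal, runnable sketch of the define-selection logic from the golden diff, pulled out as a standalone function. The function name and its parameters (`os_name`, `shared`, `with_gflags`) are hypothetical stand-ins for the Conan settings and options objects; this is an illustration, not code from the recipe.

```python
# Sketch of the patched package_info() define selection for Windows consumers.
def glog_consumer_defines(os_name, shared, with_gflags):
    defines = []
    if os_name == "Windows":
        defines.append("GLOG_NO_ABBREVIATED_SEVERITIES")
        decl = "__declspec(dllimport)" if shared else ""
        defines.append("GOOGLE_GLOG_DLL_DECL={}".format(decl))
        if with_gflags and not shared:
            defines.extend(["GFLAGS_DLL_DECLARE_FLAG=", "GFLAGS_DLL_DEFINE_FLAG="])
    return defines

# A static build, as in the issue report, gets an empty GOOGLE_GLOG_DLL_DECL,
# which is what lets google::InitGoogleLogging resolve at link time.
assert "GOOGLE_GLOG_DLL_DECL=" in glog_consumer_defines("Windows", shared=False, with_gflags=True)
```
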
gh_patches_debug_10921
|
rasdani/github-patches
|
git_diff
|
PokemonGoF__PokemonGo-Bot-4230
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Request: Please add "log_stats" to colored_logging handler
### Short Description
Please add "log_stats" to colored_logging handler
### Possible solution
The following line can be added to colored_logging_handler.py:

`'log_stats': 'magenta',`

### How it would help others
It will make it easier to track progress when many other log lines are printed on the console between two status logs.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pokemongo_bot/event_handlers/colored_logging_handler.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import unicode_literals
3
4 import logging
5
6 from pokemongo_bot.event_manager import EventHandler
7
8
9 class ColoredLoggingHandler(EventHandler):
10 EVENT_COLOR_MAP = {
11 'api_error': 'red',
12 'bot_exit': 'red',
13 'bot_start': 'green',
14 'catch_limit': 'red',
15 'config_error': 'red',
16 'egg_already_incubating': 'yellow',
17 'egg_hatched': 'green',
18 'future_pokemon_release': 'yellow',
19 'incubate': 'green',
20 'incubator_already_used': 'yellow',
21 'inventory_full': 'yellow',
22 'item_discard_fail': 'red',
23 'item_discarded': 'green',
24 'keep_best_release': 'green',
25 'level_up': 'green',
26 'level_up_reward': 'green',
27 'location_cache_error': 'yellow',
28 'location_cache_ignored': 'yellow',
29 'login_failed': 'red',
30 'login_successful': 'green',
31 'lucky_egg_error': 'red',
32 'move_to_map_pokemon_encounter': 'green',
33 'move_to_map_pokemon_fail': 'red',
34 'next_egg_incubates': 'yellow',
35 'next_sleep': 'green',
36 'next_random_pause': 'green',
37 'no_pokeballs': 'red',
38 'pokemon_appeared': 'yellow',
39 'pokemon_capture_failed': 'red',
40 'pokemon_caught': 'blue',
41 'pokemon_evolved': 'green',
42 'pokemon_fled': 'red',
43 'pokemon_inventory_full': 'red',
44 'pokemon_nickname_invalid': 'red',
45 'pokemon_not_in_range': 'yellow',
46 'pokemon_release': 'green',
47 'pokemon_vanished': 'red',
48 'pokestop_empty': 'yellow',
49 'pokestop_searching_too_often': 'yellow',
50 'rename_pokemon': 'green',
51 'skip_evolve': 'yellow',
52 'softban': 'red',
53 'spun_pokestop': 'cyan',
54 'threw_berry_failed': 'red',
55 'unknown_spin_result': 'red',
56 'unset_pokemon_nickname': 'red',
57 'vip_pokemon': 'red',
58
59 # event names for 'white' still here to remember that these events are already determined its color.
60 'arrived_at_cluster': 'white',
61 'arrived_at_fort': 'white',
62 'bot_sleep': 'white',
63 'bot_random_pause': 'white',
64 'catchable_pokemon': 'white',
65 'found_cluster': 'white',
66 'incubate_try': 'white',
67 'load_cached_location': 'white',
68 'location_found': 'white',
69 'login_started': 'white',
70 'lured_pokemon_found': 'white',
71 'move_to_map_pokemon_move_towards': 'white',
72 'move_to_map_pokemon_teleport_back': 'white',
73 'move_to_map_pokemon_updated_map': 'white',
74 'moving_to_fort': 'white',
75 'moving_to_lured_fort': 'white',
76 'pokemon_catch_rate': 'white',
77 'pokemon_evolve_fail': 'white',
78 'pokestop_on_cooldown': 'white',
79 'pokestop_out_of_range': 'white',
80 'polyline_request': 'white',
81 'position_update': 'white',
82 'set_start_location': 'white',
83 'softban_fix': 'white',
84 'softban_fix_done': 'white',
85 'spun_fort': 'white',
86 'threw_berry': 'white',
87 'threw_pokeball': 'white',
88 'used_lucky_egg': 'white'
89 }
90 CONTINUOUS_EVENT_NAMES = [
91 'catchable_pokemon',
92 'moving_to_lured_fort',
93 'spun_fort'
94 ]
95 COLOR_CODE = {
96 'gray': '\033[90m',
97 'red': '\033[91m',
98 'green': '\033[92m',
99 'yellow': '\033[93m',
100 'blue': '\033[94m',
101 'magenta': '\033[95m',
102 'cyan': '\033[96m',
103 'white': '\033[97m',
104 'reset': '\033[0m'
105 }
106
107 def handle_event(self, event, sender, level, formatted_msg, data):
108 logger = logging.getLogger(type(sender).__name__)
109
110 color = self.COLOR_CODE['white']
111 if event in self.EVENT_COLOR_MAP:
112 color = self.COLOR_CODE[self.EVENT_COLOR_MAP[event]]
113 if event == 'egg_hatched' and data.get('pokemon', 'error') == 'error':
114 color = self.COLOR_CODE['red']
115 formatted_msg = '{}{}{}'.format(color, formatted_msg, self.COLOR_CODE['reset'])
116
117 if formatted_msg:
118 message = "[{}] {}".format(event, formatted_msg)
119 else:
120 message = '{}: {}'.format(event, str(data))
121 getattr(logger, level)(message)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pokemongo_bot/event_handlers/colored_logging_handler.py b/pokemongo_bot/event_handlers/colored_logging_handler.py
--- a/pokemongo_bot/event_handlers/colored_logging_handler.py
+++ b/pokemongo_bot/event_handlers/colored_logging_handler.py
@@ -55,6 +55,8 @@
'unknown_spin_result': 'red',
'unset_pokemon_nickname': 'red',
'vip_pokemon': 'red',
+ 'log_stats': 'magenta',
+ 'show_inventory': 'magenta',
# event names for 'white' still here to remember that these events are already determined its color.
'arrived_at_cluster': 'white',
|
{"golden_diff": "diff --git a/pokemongo_bot/event_handlers/colored_logging_handler.py b/pokemongo_bot/event_handlers/colored_logging_handler.py\n--- a/pokemongo_bot/event_handlers/colored_logging_handler.py\n+++ b/pokemongo_bot/event_handlers/colored_logging_handler.py\n@@ -55,6 +55,8 @@\n 'unknown_spin_result': 'red',\n 'unset_pokemon_nickname': 'red',\n 'vip_pokemon': 'red',\n+ 'log_stats': 'magenta',\n+ 'show_inventory': 'magenta',\n \n # event names for 'white' still here to remember that these events are already determined its color.\n 'arrived_at_cluster': 'white',\n", "issue": "Request: Please add \"log_stats\" to colored_logging handler\n### Short Description\n\nPlease add \"log_stats\" to colored_logging handler\n### Possible solution\n\nThe following line can be added to colored_logging_handler.py\n### _'log_stats': 'magenta',_\n### How it would help others\n\nIt will be easy to track the progress with so many lines printed on the console between 2 status logs.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import unicode_literals\n\nimport logging\n\nfrom pokemongo_bot.event_manager import EventHandler\n\n\nclass ColoredLoggingHandler(EventHandler):\n EVENT_COLOR_MAP = {\n 'api_error': 'red',\n 'bot_exit': 'red',\n 'bot_start': 'green',\n 'catch_limit': 'red',\n 'config_error': 'red',\n 'egg_already_incubating': 'yellow',\n 'egg_hatched': 'green',\n 'future_pokemon_release': 'yellow',\n 'incubate': 'green',\n 'incubator_already_used': 'yellow',\n 'inventory_full': 'yellow',\n 'item_discard_fail': 'red',\n 'item_discarded': 'green',\n 'keep_best_release': 'green',\n 'level_up': 'green',\n 'level_up_reward': 'green',\n 'location_cache_error': 'yellow',\n 'location_cache_ignored': 'yellow',\n 'login_failed': 'red',\n 'login_successful': 'green',\n 'lucky_egg_error': 'red',\n 'move_to_map_pokemon_encounter': 'green',\n 'move_to_map_pokemon_fail': 'red',\n 'next_egg_incubates': 'yellow',\n 'next_sleep': 'green',\n 'next_random_pause': 'green',\n 'no_pokeballs': 'red',\n 'pokemon_appeared': 'yellow',\n 'pokemon_capture_failed': 'red',\n 'pokemon_caught': 'blue',\n 'pokemon_evolved': 'green',\n 'pokemon_fled': 'red',\n 'pokemon_inventory_full': 'red',\n 'pokemon_nickname_invalid': 'red',\n 'pokemon_not_in_range': 'yellow',\n 'pokemon_release': 'green',\n 'pokemon_vanished': 'red',\n 'pokestop_empty': 'yellow',\n 'pokestop_searching_too_often': 'yellow',\n 'rename_pokemon': 'green',\n 'skip_evolve': 'yellow',\n 'softban': 'red',\n 'spun_pokestop': 'cyan',\n 'threw_berry_failed': 'red',\n 'unknown_spin_result': 'red',\n 'unset_pokemon_nickname': 'red',\n 'vip_pokemon': 'red',\n\n # event names for 'white' still here to remember that these events are already determined its color.\n 'arrived_at_cluster': 'white',\n 'arrived_at_fort': 'white',\n 'bot_sleep': 'white',\n 'bot_random_pause': 'white',\n 'catchable_pokemon': 'white',\n 'found_cluster': 'white',\n 'incubate_try': 'white',\n 'load_cached_location': 'white',\n 'location_found': 'white',\n 'login_started': 'white',\n 'lured_pokemon_found': 'white',\n 'move_to_map_pokemon_move_towards': 'white',\n 'move_to_map_pokemon_teleport_back': 'white',\n 'move_to_map_pokemon_updated_map': 'white',\n 'moving_to_fort': 'white',\n 'moving_to_lured_fort': 'white',\n 'pokemon_catch_rate': 'white',\n 'pokemon_evolve_fail': 'white',\n 'pokestop_on_cooldown': 'white',\n 'pokestop_out_of_range': 'white',\n 'polyline_request': 'white',\n 'position_update': 'white',\n 'set_start_location': 'white',\n 'softban_fix': 
'white',\n 'softban_fix_done': 'white',\n 'spun_fort': 'white',\n 'threw_berry': 'white',\n 'threw_pokeball': 'white',\n 'used_lucky_egg': 'white'\n }\n CONTINUOUS_EVENT_NAMES = [\n 'catchable_pokemon',\n 'moving_to_lured_fort',\n 'spun_fort'\n ]\n COLOR_CODE = {\n 'gray': '\\033[90m',\n 'red': '\\033[91m',\n 'green': '\\033[92m',\n 'yellow': '\\033[93m',\n 'blue': '\\033[94m',\n 'magenta': '\\033[95m',\n 'cyan': '\\033[96m',\n 'white': '\\033[97m',\n 'reset': '\\033[0m'\n }\n\n def handle_event(self, event, sender, level, formatted_msg, data):\n logger = logging.getLogger(type(sender).__name__)\n\n color = self.COLOR_CODE['white']\n if event in self.EVENT_COLOR_MAP:\n color = self.COLOR_CODE[self.EVENT_COLOR_MAP[event]]\n if event == 'egg_hatched' and data.get('pokemon', 'error') == 'error':\n color = self.COLOR_CODE['red']\n formatted_msg = '{}{}{}'.format(color, formatted_msg, self.COLOR_CODE['reset'])\n\n if formatted_msg:\n message = \"[{}] {}\".format(event, formatted_msg)\n else:\n message = '{}: {}'.format(event, str(data))\n getattr(logger, level)(message)\n", "path": "pokemongo_bot/event_handlers/colored_logging_handler.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import unicode_literals\n\nimport logging\n\nfrom pokemongo_bot.event_manager import EventHandler\n\n\nclass ColoredLoggingHandler(EventHandler):\n EVENT_COLOR_MAP = {\n 'api_error': 'red',\n 'bot_exit': 'red',\n 'bot_start': 'green',\n 'catch_limit': 'red',\n 'config_error': 'red',\n 'egg_already_incubating': 'yellow',\n 'egg_hatched': 'green',\n 'future_pokemon_release': 'yellow',\n 'incubate': 'green',\n 'incubator_already_used': 'yellow',\n 'inventory_full': 'yellow',\n 'item_discard_fail': 'red',\n 'item_discarded': 'green',\n 'keep_best_release': 'green',\n 'level_up': 'green',\n 'level_up_reward': 'green',\n 'location_cache_error': 'yellow',\n 'location_cache_ignored': 'yellow',\n 'login_failed': 'red',\n 'login_successful': 'green',\n 'lucky_egg_error': 'red',\n 'move_to_map_pokemon_encounter': 'green',\n 'move_to_map_pokemon_fail': 'red',\n 'next_egg_incubates': 'yellow',\n 'next_sleep': 'green',\n 'next_random_pause': 'green',\n 'no_pokeballs': 'red',\n 'pokemon_appeared': 'yellow',\n 'pokemon_capture_failed': 'red',\n 'pokemon_caught': 'blue',\n 'pokemon_evolved': 'green',\n 'pokemon_fled': 'red',\n 'pokemon_inventory_full': 'red',\n 'pokemon_nickname_invalid': 'red',\n 'pokemon_not_in_range': 'yellow',\n 'pokemon_release': 'green',\n 'pokemon_vanished': 'red',\n 'pokestop_empty': 'yellow',\n 'pokestop_searching_too_often': 'yellow',\n 'rename_pokemon': 'green',\n 'skip_evolve': 'yellow',\n 'softban': 'red',\n 'spun_pokestop': 'cyan',\n 'threw_berry_failed': 'red',\n 'unknown_spin_result': 'red',\n 'unset_pokemon_nickname': 'red',\n 'vip_pokemon': 'red',\n 'log_stats': 'magenta',\n 'show_inventory': 'magenta',\n\n # event names for 'white' still here to remember that these events are already determined its color.\n 'arrived_at_cluster': 'white',\n 'arrived_at_fort': 'white',\n 'bot_sleep': 'white',\n 'bot_random_pause': 'white',\n 'catchable_pokemon': 'white',\n 'found_cluster': 'white',\n 'incubate_try': 'white',\n 'load_cached_location': 'white',\n 'location_found': 'white',\n 'login_started': 'white',\n 'lured_pokemon_found': 'white',\n 'move_to_map_pokemon_move_towards': 'white',\n 'move_to_map_pokemon_teleport_back': 'white',\n 'move_to_map_pokemon_updated_map': 'white',\n 'moving_to_fort': 'white',\n 'moving_to_lured_fort': 'white',\n 'pokemon_catch_rate': 'white',\n 
'pokemon_evolve_fail': 'white',\n 'pokestop_on_cooldown': 'white',\n 'pokestop_out_of_range': 'white',\n 'polyline_request': 'white',\n 'position_update': 'white',\n 'set_start_location': 'white',\n 'softban_fix': 'white',\n 'softban_fix_done': 'white',\n 'spun_fort': 'white',\n 'threw_berry': 'white',\n 'threw_pokeball': 'white',\n 'used_lucky_egg': 'white'\n }\n CONTINUOUS_EVENT_NAMES = [\n 'catchable_pokemon',\n 'moving_to_lured_fort',\n 'spun_fort'\n ]\n COLOR_CODE = {\n 'gray': '\\033[90m',\n 'red': '\\033[91m',\n 'green': '\\033[92m',\n 'yellow': '\\033[93m',\n 'blue': '\\033[94m',\n 'magenta': '\\033[95m',\n 'cyan': '\\033[96m',\n 'white': '\\033[97m',\n 'reset': '\\033[0m'\n }\n\n def handle_event(self, event, sender, level, formatted_msg, data):\n logger = logging.getLogger(type(sender).__name__)\n\n color = self.COLOR_CODE['white']\n if event in self.EVENT_COLOR_MAP:\n color = self.COLOR_CODE[self.EVENT_COLOR_MAP[event]]\n if event == 'egg_hatched' and data.get('pokemon', 'error') == 'error':\n color = self.COLOR_CODE['red']\n formatted_msg = '{}{}{}'.format(color, formatted_msg, self.COLOR_CODE['reset'])\n\n if formatted_msg:\n message = \"[{}] {}\".format(event, formatted_msg)\n else:\n message = '{}: {}'.format(event, str(data))\n getattr(logger, level)(message)\n", "path": "pokemongo_bot/event_handlers/colored_logging_handler.py"}]}
| 1,810 | 163 |
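
A minimal sketch of what the requested mapping does: once `'log_stats'` has an entry in `EVENT_COLOR_MAP`, the handler's lookup resolves to the magenta ANSI code instead of the white fallback. The dictionaries below are trimmed copies of the ones in `colored_logging_handler.py`; the `colorize` helper is illustrative only.

```python
# Trimmed copies of the handler's tables, including the added 'log_stats' entry.
COLOR_CODE = {'magenta': '\033[95m', 'white': '\033[97m', 'reset': '\033[0m'}
EVENT_COLOR_MAP = {'log_stats': 'magenta', 'show_inventory': 'magenta'}

def colorize(event, formatted_msg):
    # Same fallback-to-white behaviour as handle_event().
    color = COLOR_CODE[EVENT_COLOR_MAP.get(event, 'white')]
    return '{}[{}] {}{}'.format(color, event, formatted_msg, COLOR_CODE['reset'])

print(colorize('log_stats', 'Level 12 | XP 3,450/5,000'))    # rendered in magenta
print(colorize('position_update', 'Now at (47.6, -122.3)'))  # falls back to white
```
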
gh_patches_debug_9801
|
rasdani/github-patches
|
git_diff
|
joke2k__faker-836
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add method to generate a cell phone number to pt-BR
Faker doesn't have a function to generate a Brazilian cell phone number.

Steps to reproduce
1. Create a fake instance using the "pt_BR" localization
2. Call fake.msisdn() or fake.phone_number()

Expected behavior
It should generate a cell phone number.

Actual behavior
Sometimes these methods return "residential" (landline) numbers.

Reference on the difference between cell phone and residential numbers:
http://www.teleco.com.br/num.asp
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/phone_number/pt_BR/__init__.py`
Content:
```
1 from __future__ import unicode_literals
2 from .. import Provider as PhoneNumberProvider
3
4
5 class Provider(PhoneNumberProvider):
6 formats = (
7 '+55 (011) #### ####',
8 '+55 (021) #### ####',
9 '+55 (031) #### ####',
10 '+55 (041) #### ####',
11 '+55 (051) #### ####',
12 '+55 (061) #### ####',
13 '+55 (071) #### ####',
14 '+55 (081) #### ####',
15 '+55 11 #### ####',
16 '+55 21 #### ####',
17 '+55 31 #### ####',
18 '+55 41 #### ####',
19 '+55 51 ### ####',
20 '+55 61 #### ####',
21 '+55 71 #### ####',
22 '+55 81 #### ####',
23 '+55 (011) ####-####',
24 '+55 (021) ####-####',
25 '+55 (031) ####-####',
26 '+55 (041) ####-####',
27 '+55 (051) ####-####',
28 '+55 (061) ####-####',
29 '+55 (071) ####-####',
30 '+55 (081) ####-####',
31 '+55 11 ####-####',
32 '+55 21 ####-####',
33 '+55 31 ####-####',
34 '+55 41 ####-####',
35 '+55 51 ### ####',
36 '+55 61 ####-####',
37 '+55 71 ####-####',
38 '+55 81 ####-####',
39 '(011) #### ####',
40 '(021) #### ####',
41 '(031) #### ####',
42 '(041) #### ####',
43 '(051) #### ####',
44 '(061) #### ####',
45 '(071) #### ####',
46 '(081) #### ####',
47 '11 #### ####',
48 '21 #### ####',
49 '31 #### ####',
50 '41 #### ####',
51 '51 ### ####',
52 '61 #### ####',
53 '71 #### ####',
54 '81 #### ####',
55 '(011) ####-####',
56 '(021) ####-####',
57 '(031) ####-####',
58 '(041) ####-####',
59 '(051) ####-####',
60 '(061) ####-####',
61 '(071) ####-####',
62 '(081) ####-####',
63 '11 ####-####',
64 '21 ####-####',
65 '31 ####-####',
66 '41 ####-####',
67 '51 ### ####',
68 '61 ####-####',
69 '71 ####-####',
70 '81 ####-####',
71 '#### ####',
72 '####-####',
73 )
74 msisdn_formats = (
75 '5511#########',
76 '5521#########',
77 '5531#########',
78 '5541#########',
79 '5551#########',
80 '5561#########',
81 '5571#########',
82 '5581#########',
83 )
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/faker/providers/phone_number/pt_BR/__init__.py b/faker/providers/phone_number/pt_BR/__init__.py
--- a/faker/providers/phone_number/pt_BR/__init__.py
+++ b/faker/providers/phone_number/pt_BR/__init__.py
@@ -71,6 +71,7 @@
'#### ####',
'####-####',
)
+
msisdn_formats = (
'5511#########',
'5521#########',
@@ -81,3 +82,11 @@
'5571#########',
'5581#########',
)
+
+ cellphone_formats = (
+ '+55 9#### ####',
+ )
+
+ def cellphone_number(self):
+ pattern = self.random_element(self.cellphone_formats)
+ return self.numerify(self.generator.parse(pattern))
|
{"golden_diff": "diff --git a/faker/providers/phone_number/pt_BR/__init__.py b/faker/providers/phone_number/pt_BR/__init__.py\n--- a/faker/providers/phone_number/pt_BR/__init__.py\n+++ b/faker/providers/phone_number/pt_BR/__init__.py\n@@ -71,6 +71,7 @@\n '#### ####',\n '####-####',\n )\n+\n msisdn_formats = (\n '5511#########',\n '5521#########',\n@@ -81,3 +82,11 @@\n '5571#########',\n '5581#########',\n )\n+\n+ cellphone_formats = (\n+ '+55 9#### ####',\n+ )\n+\n+ def cellphone_number(self):\n+ pattern = self.random_element(self.cellphone_formats)\n+ return self.numerify(self.generator.parse(pattern))\n", "issue": "Add method to generate a cell phone number to pt-BR\nFaker doesn't have a function to generate a cellphone to Brazilian.\r\n\r\nSteps to reproduce\r\nCreate fake instance using localization \"pt_BR\"\r\nCall fake.msisdn() or fake.phone_number()\r\nExpected behavior\r\nIt should generate a cell phone number.\r\n\r\nActual behavior\r\nSometimes these methods return a \"residential\" numbers.\r\n\r\nReference difference between cell phones and residential numbers:\r\n\r\nhttp://www.teleco.com.br/num.asp\n", "before_files": [{"content": "from __future__ import unicode_literals\nfrom .. import Provider as PhoneNumberProvider\n\n\nclass Provider(PhoneNumberProvider):\n formats = (\n '+55 (011) #### ####',\n '+55 (021) #### ####',\n '+55 (031) #### ####',\n '+55 (041) #### ####',\n '+55 (051) #### ####',\n '+55 (061) #### ####',\n '+55 (071) #### ####',\n '+55 (081) #### ####',\n '+55 11 #### ####',\n '+55 21 #### ####',\n '+55 31 #### ####',\n '+55 41 #### ####',\n '+55 51 ### ####',\n '+55 61 #### ####',\n '+55 71 #### ####',\n '+55 81 #### ####',\n '+55 (011) ####-####',\n '+55 (021) ####-####',\n '+55 (031) ####-####',\n '+55 (041) ####-####',\n '+55 (051) ####-####',\n '+55 (061) ####-####',\n '+55 (071) ####-####',\n '+55 (081) ####-####',\n '+55 11 ####-####',\n '+55 21 ####-####',\n '+55 31 ####-####',\n '+55 41 ####-####',\n '+55 51 ### ####',\n '+55 61 ####-####',\n '+55 71 ####-####',\n '+55 81 ####-####',\n '(011) #### ####',\n '(021) #### ####',\n '(031) #### ####',\n '(041) #### ####',\n '(051) #### ####',\n '(061) #### ####',\n '(071) #### ####',\n '(081) #### ####',\n '11 #### ####',\n '21 #### ####',\n '31 #### ####',\n '41 #### ####',\n '51 ### ####',\n '61 #### ####',\n '71 #### ####',\n '81 #### ####',\n '(011) ####-####',\n '(021) ####-####',\n '(031) ####-####',\n '(041) ####-####',\n '(051) ####-####',\n '(061) ####-####',\n '(071) ####-####',\n '(081) ####-####',\n '11 ####-####',\n '21 ####-####',\n '31 ####-####',\n '41 ####-####',\n '51 ### ####',\n '61 ####-####',\n '71 ####-####',\n '81 ####-####',\n '#### ####',\n '####-####',\n )\n msisdn_formats = (\n '5511#########',\n '5521#########',\n '5531#########',\n '5541#########',\n '5551#########',\n '5561#########',\n '5571#########',\n '5581#########',\n )\n", "path": "faker/providers/phone_number/pt_BR/__init__.py"}], "after_files": [{"content": "from __future__ import unicode_literals\nfrom .. 
import Provider as PhoneNumberProvider\n\n\nclass Provider(PhoneNumberProvider):\n formats = (\n '+55 (011) #### ####',\n '+55 (021) #### ####',\n '+55 (031) #### ####',\n '+55 (041) #### ####',\n '+55 (051) #### ####',\n '+55 (061) #### ####',\n '+55 (071) #### ####',\n '+55 (081) #### ####',\n '+55 11 #### ####',\n '+55 21 #### ####',\n '+55 31 #### ####',\n '+55 41 #### ####',\n '+55 51 ### ####',\n '+55 61 #### ####',\n '+55 71 #### ####',\n '+55 81 #### ####',\n '+55 (011) ####-####',\n '+55 (021) ####-####',\n '+55 (031) ####-####',\n '+55 (041) ####-####',\n '+55 (051) ####-####',\n '+55 (061) ####-####',\n '+55 (071) ####-####',\n '+55 (081) ####-####',\n '+55 11 ####-####',\n '+55 21 ####-####',\n '+55 31 ####-####',\n '+55 41 ####-####',\n '+55 51 ### ####',\n '+55 61 ####-####',\n '+55 71 ####-####',\n '+55 81 ####-####',\n '(011) #### ####',\n '(021) #### ####',\n '(031) #### ####',\n '(041) #### ####',\n '(051) #### ####',\n '(061) #### ####',\n '(071) #### ####',\n '(081) #### ####',\n '11 #### ####',\n '21 #### ####',\n '31 #### ####',\n '41 #### ####',\n '51 ### ####',\n '61 #### ####',\n '71 #### ####',\n '81 #### ####',\n '(011) ####-####',\n '(021) ####-####',\n '(031) ####-####',\n '(041) ####-####',\n '(051) ####-####',\n '(061) ####-####',\n '(071) ####-####',\n '(081) ####-####',\n '11 ####-####',\n '21 ####-####',\n '31 ####-####',\n '41 ####-####',\n '51 ### ####',\n '61 ####-####',\n '71 ####-####',\n '81 ####-####',\n '#### ####',\n '####-####',\n )\n\n msisdn_formats = (\n '5511#########',\n '5521#########',\n '5531#########',\n '5541#########',\n '5551#########',\n '5561#########',\n '5571#########',\n '5581#########',\n )\n\n cellphone_formats = (\n '+55 9#### ####',\n )\n\n def cellphone_number(self):\n pattern = self.random_element(self.cellphone_formats)\n return self.numerify(self.generator.parse(pattern))\n", "path": "faker/providers/phone_number/pt_BR/__init__.py"}]}
| 1,281 | 200 |
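
Assuming the patch above is applied to a local Faker checkout, usage would look roughly like the sketch below. `cellphone_number()` only exists once the new provider method is in place, and the printed digits are random; the sample outputs in the comments simply follow the `'+55 9#### ####'` pattern added by the diff.

```python
# Hypothetical usage against a Faker checkout that includes the patch above.
from faker import Faker

fake = Faker("pt_BR")
print(fake.cellphone_number())  # e.g. '+55 91234 5678', from the '+55 9#### ####' format
print(fake.msisdn())            # unchanged 13-digit MSISDN, e.g. '5511987654321'
```
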
gh_patches_debug_11240
|
rasdani/github-patches
|
git_diff
|
acl-org__acl-anthology-1025
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Name parser
At ingestion time, we are often given data that is not split into BibTeX's "surname, given name" format. We therefore split it ourselves, heuristically, which often fails. Python has a [name parser](https://pypi.org/project/nameparser/) module, but it doesn't work on all Anthology names, either, e.g.,:
- José Alejandro Lopez Gonzalez
- Philippe Boula de Mareüil
It would be cool to implement our own name parser and train it on the Anthology data. (I imagine that applying the trained model would turn up some mistakes in our data).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bin/likely_name_split.py`
Content:
```
1 #!/usr/bin/env python3
2 # Daniel Gildea, 2020
3
4 """Usage: likely_name_split.py [--importdir=DIR]
5
6 Counts first and last names in anthology.
7 Predicts best split into first and last.
8 Checks whether current names match our predictions.
9
10 Options:
11 --importdir=DIR Directory to import XML files from. [default: {scriptdir}/../data/]
12 -h, --help Display this helpful text.
13 """
14
15 from collections import defaultdict
16 from docopt import docopt
17 import re
18 import os
19 from math import *
20
21 from anthology import Anthology
22 from anthology.people import PersonName
23
24
25 class NameSplitter:
26 def __init__(self, anthology):
27 # counts of how often each name appears
28 self.first_count = defaultdict(lambda: 0) # "Maria" "Victoria"
29 self.first_full_count = defaultdict(lambda: 0) # "Maria Victoria"
30 self.last_count = defaultdict(lambda: 0) # "van" "den" "Bosch"
31 self.last_full_count = defaultdict(lambda: 0) # "van den Bosch"
32 self.first_total = 0
33 self.last_total = 0
34
35 self.count_names(anthology)
36
37 # counts names in anthology database into global vars
38 # first_count last_count (dicts)
39 # first_full_count last_full_count (dicts)
40 # first_total last_total (floats)
41 def count_names(self, anthology):
42 for person in anthology.people.personids():
43 name = anthology.people.get_canonical_name(person)
44 num_papers = len(anthology.people.get_papers(person)) + 0.0
45 # print(name.last, ", ", name.first, num_papers)
46 for w in name.first.split(" "):
47 self.first_count[w] += num_papers
48 self.first_full_count[name.first] += num_papers
49 self.first_total += num_papers
50
51 for w in name.last.split(" "):
52 self.last_count[w] += num_papers
53 self.last_full_count[name.last] += num_papers
54 self.last_total += num_papers
55
56 # takes "Maria Victoria Lopez Gonzalez"
57 # returns ("Lopez Gonzalez", "Maria Victoria")
58 # uses counts of words in first and last names in current database
59 def best_split(self, name):
60 if "," in name:
61 # Short-circuit names that are already split
62 surname, given_names = name.split(",")
63 return (surname.strip(), given_names.strip())
64
65 words = name.split(" ")
66 best_score = -inf
67 best = ("", "")
68 # loop over possible split points between first/last
69 for i in range(1, len(words)): # at least one word in each part
70 first = " ".join(words[0:i])
71 last = " ".join(words[i:])
72 # max of log prob of "Maria Victoria" and
73 # log prob of "Maria" + log prob of "Victoria"
74 first_probs = [
75 log((self.first_count[x] + 0.01) / self.first_total) for x in words[0:i]
76 ]
77 first_score = max(
78 log((self.first_full_count[first] + 0.000001) / self.first_total),
79 sum(first_probs),
80 )
81 last_probs = [
82 log((self.last_count[x] + 0.01) / self.last_total) for x in words[i:]
83 ]
84 last_score = max(
85 log((self.last_full_count[last] + 0.000001) / self.last_total),
86 sum(last_probs),
87 )
88
89 if first_score + last_score > best_score:
90 best_score = first_score + last_score
91 best = (last, first)
92 # end of loop over split points
93 return best
94
95
96 if __name__ == "__main__":
97 args = docopt(__doc__)
98 scriptdir = os.path.dirname(os.path.abspath(__file__))
99 if "{scriptdir}" in args["--importdir"]:
100 args["--importdir"] = os.path.abspath(
101 args["--importdir"].format(scriptdir=scriptdir)
102 )
103
104 anthology = Anthology(importdir=args["--importdir"])
105 splitter = NameSplitter(anthology)
106
107 # for all names currently in anthology,
108 # see if they match what we predict
109 for person in anthology.people.personids():
110 name = anthology.people.get_canonical_name(person)
111
112 # find our prediction of split
113 best = splitter.best_split(name.first + " " + name.last)
114
115 # if current split does not match our prediction
116 if not (best[0] == name.last and best[1] == name.first):
117 # print suggested replacement
118 print(name.last, ",", name.first, " ==> ", best[0], ",", best[1])
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bin/likely_name_split.py b/bin/likely_name_split.py
--- a/bin/likely_name_split.py
+++ b/bin/likely_name_split.py
@@ -57,8 +57,9 @@
# returns ("Lopez Gonzalez", "Maria Victoria")
# uses counts of words in first and last names in current database
def best_split(self, name):
- if "," in name:
+ if "," in name and not "Jr." in name:
# Short-circuit names that are already split
+ # comma in "William Baumgartner, Jr." does not count as a split
surname, given_names = name.split(",")
return (surname.strip(), given_names.strip())
|
{"golden_diff": "diff --git a/bin/likely_name_split.py b/bin/likely_name_split.py\n--- a/bin/likely_name_split.py\n+++ b/bin/likely_name_split.py\n@@ -57,8 +57,9 @@\n # returns (\"Lopez Gonzalez\", \"Maria Victoria\")\n # uses counts of words in first and last names in current database\n def best_split(self, name):\n- if \",\" in name:\n+ if \",\" in name and not \"Jr.\" in name:\n # Short-circuit names that are already split\n+ # comma in \"William Baumgartner, Jr.\" does not count as a split\n surname, given_names = name.split(\",\")\n return (surname.strip(), given_names.strip())\n", "issue": "Name parser\nAt ingestion time, we are often given data that is not split into BibTeX's \"surname, given name\" format. We therefore split it ourselves, heuristically, which often fails. Python has a [name parser](https://pypi.org/project/nameparser/) module, but it doesn't work on all Anthology names, either, e.g.,:\r\n\r\n- Jos\u00e9 Alejandro Lopez Gonzalez\r\n- Philippe Boula de Mare\u00fcil\r\n\r\nIt would be cool to implement our own name parser and train it on the Anthology data. (I imagine that applying the trained model would turn up some mistakes in our data).\n", "before_files": [{"content": "#!/usr/bin/env python3\n# Daniel Gildea, 2020\n\n\"\"\"Usage: likely_name_split.py [--importdir=DIR]\n\nCounts first and last names in anthology.\nPredicts best split into first and last.\nChecks whether current names match our predictions.\n\nOptions:\n --importdir=DIR Directory to import XML files from. [default: {scriptdir}/../data/]\n -h, --help Display this helpful text.\n\"\"\"\n\nfrom collections import defaultdict\nfrom docopt import docopt\nimport re\nimport os\nfrom math import *\n\nfrom anthology import Anthology\nfrom anthology.people import PersonName\n\n\nclass NameSplitter:\n def __init__(self, anthology):\n # counts of how often each name appears\n self.first_count = defaultdict(lambda: 0) # \"Maria\" \"Victoria\"\n self.first_full_count = defaultdict(lambda: 0) # \"Maria Victoria\"\n self.last_count = defaultdict(lambda: 0) # \"van\" \"den\" \"Bosch\"\n self.last_full_count = defaultdict(lambda: 0) # \"van den Bosch\"\n self.first_total = 0\n self.last_total = 0\n\n self.count_names(anthology)\n\n # counts names in anthology database into global vars\n # first_count last_count (dicts)\n # first_full_count last_full_count (dicts)\n # first_total last_total (floats)\n def count_names(self, anthology):\n for person in anthology.people.personids():\n name = anthology.people.get_canonical_name(person)\n num_papers = len(anthology.people.get_papers(person)) + 0.0\n # print(name.last, \", \", name.first, num_papers)\n for w in name.first.split(\" \"):\n self.first_count[w] += num_papers\n self.first_full_count[name.first] += num_papers\n self.first_total += num_papers\n\n for w in name.last.split(\" \"):\n self.last_count[w] += num_papers\n self.last_full_count[name.last] += num_papers\n self.last_total += num_papers\n\n # takes \"Maria Victoria Lopez Gonzalez\"\n # returns (\"Lopez Gonzalez\", \"Maria Victoria\")\n # uses counts of words in first and last names in current database\n def best_split(self, name):\n if \",\" in name:\n # Short-circuit names that are already split\n surname, given_names = name.split(\",\")\n return (surname.strip(), given_names.strip())\n\n words = name.split(\" \")\n best_score = -inf\n best = (\"\", \"\")\n # loop over possible split points between first/last\n for i in range(1, len(words)): # at least one word in each part\n first = \" 
\".join(words[0:i])\n last = \" \".join(words[i:])\n # max of log prob of \"Maria Victoria\" and\n # log prob of \"Maria\" + log prob of \"Victoria\"\n first_probs = [\n log((self.first_count[x] + 0.01) / self.first_total) for x in words[0:i]\n ]\n first_score = max(\n log((self.first_full_count[first] + 0.000001) / self.first_total),\n sum(first_probs),\n )\n last_probs = [\n log((self.last_count[x] + 0.01) / self.last_total) for x in words[i:]\n ]\n last_score = max(\n log((self.last_full_count[last] + 0.000001) / self.last_total),\n sum(last_probs),\n )\n\n if first_score + last_score > best_score:\n best_score = first_score + last_score\n best = (last, first)\n # end of loop over split points\n return best\n\n\nif __name__ == \"__main__\":\n args = docopt(__doc__)\n scriptdir = os.path.dirname(os.path.abspath(__file__))\n if \"{scriptdir}\" in args[\"--importdir\"]:\n args[\"--importdir\"] = os.path.abspath(\n args[\"--importdir\"].format(scriptdir=scriptdir)\n )\n\n anthology = Anthology(importdir=args[\"--importdir\"])\n splitter = NameSplitter(anthology)\n\n # for all names currently in anthology,\n # see if they match what we predict\n for person in anthology.people.personids():\n name = anthology.people.get_canonical_name(person)\n\n # find our prediction of split\n best = splitter.best_split(name.first + \" \" + name.last)\n\n # if current split does not match our prediction\n if not (best[0] == name.last and best[1] == name.first):\n # print suggested replacement\n print(name.last, \",\", name.first, \" ==> \", best[0], \",\", best[1])\n", "path": "bin/likely_name_split.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# Daniel Gildea, 2020\n\n\"\"\"Usage: likely_name_split.py [--importdir=DIR]\n\nCounts first and last names in anthology.\nPredicts best split into first and last.\nChecks whether current names match our predictions.\n\nOptions:\n --importdir=DIR Directory to import XML files from. 
[default: {scriptdir}/../data/]\n -h, --help Display this helpful text.\n\"\"\"\n\nfrom collections import defaultdict\nfrom docopt import docopt\nimport re\nimport os\nfrom math import *\n\nfrom anthology import Anthology\nfrom anthology.people import PersonName\n\n\nclass NameSplitter:\n def __init__(self, anthology):\n # counts of how often each name appears\n self.first_count = defaultdict(lambda: 0) # \"Maria\" \"Victoria\"\n self.first_full_count = defaultdict(lambda: 0) # \"Maria Victoria\"\n self.last_count = defaultdict(lambda: 0) # \"van\" \"den\" \"Bosch\"\n self.last_full_count = defaultdict(lambda: 0) # \"van den Bosch\"\n self.first_total = 0\n self.last_total = 0\n\n self.count_names(anthology)\n\n # counts names in anthology database into global vars\n # first_count last_count (dicts)\n # first_full_count last_full_count (dicts)\n # first_total last_total (floats)\n def count_names(self, anthology):\n for person in anthology.people.personids():\n name = anthology.people.get_canonical_name(person)\n num_papers = len(anthology.people.get_papers(person)) + 0.0\n # print(name.last, \", \", name.first, num_papers)\n for w in name.first.split(\" \"):\n self.first_count[w] += num_papers\n self.first_full_count[name.first] += num_papers\n self.first_total += num_papers\n\n for w in name.last.split(\" \"):\n self.last_count[w] += num_papers\n self.last_full_count[name.last] += num_papers\n self.last_total += num_papers\n\n # takes \"Maria Victoria Lopez Gonzalez\"\n # returns (\"Lopez Gonzalez\", \"Maria Victoria\")\n # uses counts of words in first and last names in current database\n def best_split(self, name):\n if \",\" in name and not \"Jr.\" in name:\n # Short-circuit names that are already split\n # comma in \"William Baumgartner, Jr.\" does not count as a split\n surname, given_names = name.split(\",\")\n return (surname.strip(), given_names.strip())\n\n words = name.split(\" \")\n best_score = -inf\n best = (\"\", \"\")\n # loop over possible split points between first/last\n for i in range(1, len(words)): # at least one word in each part\n first = \" \".join(words[0:i])\n last = \" \".join(words[i:])\n # max of log prob of \"Maria Victoria\" and\n # log prob of \"Maria\" + log prob of \"Victoria\"\n first_probs = [\n log((self.first_count[x] + 0.01) / self.first_total) for x in words[0:i]\n ]\n first_score = max(\n log((self.first_full_count[first] + 0.000001) / self.first_total),\n sum(first_probs),\n )\n last_probs = [\n log((self.last_count[x] + 0.01) / self.last_total) for x in words[i:]\n ]\n last_score = max(\n log((self.last_full_count[last] + 0.000001) / self.last_total),\n sum(last_probs),\n )\n\n if first_score + last_score > best_score:\n best_score = first_score + last_score\n best = (last, first)\n # end of loop over split points\n return best\n\n\nif __name__ == \"__main__\":\n args = docopt(__doc__)\n scriptdir = os.path.dirname(os.path.abspath(__file__))\n if \"{scriptdir}\" in args[\"--importdir\"]:\n args[\"--importdir\"] = os.path.abspath(\n args[\"--importdir\"].format(scriptdir=scriptdir)\n )\n\n anthology = Anthology(importdir=args[\"--importdir\"])\n splitter = NameSplitter(anthology)\n\n # for all names currently in anthology,\n # see if they match what we predict\n for person in anthology.people.personids():\n name = anthology.people.get_canonical_name(person)\n\n # find our prediction of split\n best = splitter.best_split(name.first + \" \" + name.last)\n\n # if current split does not match our prediction\n if not (best[0] == name.last and best[1] 
== name.first):\n # print suggested replacement\n print(name.last, \",\", name.first, \" ==> \", best[0], \",\", best[1])\n", "path": "bin/likely_name_split.py"}]}
| 1,691 | 155 |
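
The behavioural change in this patch is small enough to restate as a toy predicate: a comma only short-circuits the probabilistic split when it is not part of a "Jr." suffix. The helper below is illustrative and not code from the repository.

```python
# Toy restatement of the patched short-circuit condition in best_split().
def is_presplit(name):
    return "," in name and "Jr." not in name

assert is_presplit("Lopez Gonzalez, Maria Victoria")   # already "surname, given names"
assert not is_presplit("William Baumgartner, Jr.")     # falls through to the scoring loop
```
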
gh_patches_debug_150
|
rasdani/github-patches
|
git_diff
|
ManimCommunity__manim-70
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
A small Bug in setup.py
In `install_requires` of `setup.py` the library `colour` is listed twice. This needs to be changed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_namespace_packages
2 setup(
3 name="manimlib",
4 version="0.2.0",
5 description="Animation engine for explanatory math videos",
6 license="MIT",
7 packages=find_namespace_packages(),
8 package_data={ "manim": ["*.tex"] },
9 entry_points={
10 "console_scripts": [
11 "manim=manim:main",
12 "manimcm=manim:main",
13 ]
14 },
15 install_requires=[
16 "colour",
17 "argparse",
18 "colour",
19 "numpy",
20 "Pillow",
21 "progressbar",
22 "scipy",
23 "tqdm",
24 "opencv-python",
25 "pycairo",
26 "pydub",
27 "pygments",
28 "pyreadline; sys_platform == 'win32'",
29 "rich"
30 ],
31 )
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -13,7 +13,6 @@
]
},
install_requires=[
- "colour",
"argparse",
"colour",
"numpy",
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -13,7 +13,6 @@\n ]\n },\n install_requires=[\n- \"colour\",\n \"argparse\",\n \"colour\",\n \"numpy\",\n", "issue": "A small Bug in setup.py\nIn `install_requires` of `setup.py` the library `colour` is mentioned twice. This needed to be changed.\n", "before_files": [{"content": "from setuptools import setup, find_namespace_packages\nsetup(\n name=\"manimlib\",\n version=\"0.2.0\",\n description=\"Animation engine for explanatory math videos\",\n license=\"MIT\",\n packages=find_namespace_packages(),\n package_data={ \"manim\": [\"*.tex\"] },\n entry_points={\n \"console_scripts\": [\n \"manim=manim:main\",\n \"manimcm=manim:main\",\n ]\n },\n install_requires=[\n \"colour\",\n \"argparse\",\n \"colour\",\n \"numpy\",\n \"Pillow\",\n \"progressbar\",\n \"scipy\",\n \"tqdm\",\n \"opencv-python\",\n \"pycairo\",\n \"pydub\",\n \"pygments\",\n \"pyreadline; sys_platform == 'win32'\",\n \"rich\"\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_namespace_packages\nsetup(\n name=\"manimlib\",\n version=\"0.2.0\",\n description=\"Animation engine for explanatory math videos\",\n license=\"MIT\",\n packages=find_namespace_packages(),\n package_data={ \"manim\": [\"*.tex\"] },\n entry_points={\n \"console_scripts\": [\n \"manim=manim:main\",\n \"manimcm=manim:main\",\n ]\n },\n install_requires=[\n \"argparse\",\n \"colour\",\n \"numpy\",\n \"Pillow\",\n \"progressbar\",\n \"scipy\",\n \"tqdm\",\n \"opencv-python\",\n \"pycairo\",\n \"pydub\",\n \"pygments\",\n \"pyreadline; sys_platform == 'win32'\",\n \"rich\"\n ],\n)\n", "path": "setup.py"}]}
| 522 | 58 |
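
A duplicate dependency like this is easy to guard against with a one-off check. The list below is an abridged copy of the patched `install_requires` (the `pyreadline` environment marker is omitted for brevity); the check itself is just an illustration.

```python
# One-off duplicate check over an abridged copy of the patched install_requires.
install_requires = [
    "argparse", "colour", "numpy", "Pillow", "progressbar", "scipy",
    "tqdm", "opencv-python", "pycairo", "pydub", "pygments", "rich",
]
dupes = sorted({d for d in install_requires if install_requires.count(d) > 1})
assert not dupes, "duplicate dependencies: {}".format(dupes)
```
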
gh_patches_debug_15754
|
rasdani/github-patches
|
git_diff
|
secdev__scapy-2317
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
KRACK module requires python-cryptography
When `python-cryptography` or `python3-cryptography` packages are not installed, the KRACK module cannot be loaded.
```
>>> load_module("krack")
ERROR: Loading module scapy.modules.krack
Traceback (most recent call last):
File "/home/ria/scapy/scapy/main.py", line 150, in _load
mod = importlib.import_module(module)
File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
File "<frozen importlib._bootstrap>", line 983, in _find_and_load
File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/ria/scapy/scapy/modules/krack/__init__.py", line 28, in <module>
from scapy.modules.krack.automaton import KrackAP # noqa: F401
File "/home/ria/scapy/scapy/modules/krack/automaton.py", line 7, in <module>
from cryptography.hazmat.primitives import hashes
ModuleNotFoundError: No module named 'cryptography'
```
Calling @commial to assess whether the module could still offer functionalities with a `crypto_valid` around some blocks. But at first glance I believe a missing `cryptography` should completely prevent importing the module.
https://github.com/secdev/scapy/blob/a58e1b90a704c394216a0b5a864a50931754bdf7/scapy/modules/krack/automaton.py#L6-L10
https://github.com/secdev/scapy/blob/a58e1b90a704c394216a0b5a864a50931754bdf7/scapy/modules/krack/crypto.py#L6-L9
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scapy/modules/krack/__init__.py`
Content:
```
1 """Module implementing Krack Attack on client, as a custom WPA Access Point
2
3 More details on the attack can be found on https://www.krackattacks.com/
4
5 Example of use (from the scapy shell):
6 >>> load_module("krack")
7 >>> KrackAP(
8 iface="mon0", # A monitor interface
9 ap_mac='11:22:33:44:55:66', # MAC (BSSID) to use
10 ssid="TEST_KRACK", # SSID
11 passphrase="testtest", # Associated passphrase
12 ).run()
13
14 Then, on the target device, connect to "TEST_KRACK" using "testtest" as the
15 passphrase.
16 The output logs will indicate if one of the vulnerability have been triggered.
17
18 Outputs for vulnerable devices:
19 - IV re-use!! Client seems to be vulnerable to handshake 3/4 replay
20 (CVE-2017-13077)
21 - Broadcast packet accepted twice!! (CVE-2017-13080)
22 - Client has installed an all zero encryption key (TK)!!
23
24 For patched devices:
25 - Client is likely not vulnerable to CVE-2017-13080
26 """
27
28 from scapy.modules.krack.automaton import KrackAP # noqa: F401
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/scapy/modules/krack/__init__.py b/scapy/modules/krack/__init__.py
--- a/scapy/modules/krack/__init__.py
+++ b/scapy/modules/krack/__init__.py
@@ -1,5 +1,7 @@
"""Module implementing Krack Attack on client, as a custom WPA Access Point
+Requires the python cryptography package v1.7+. See https://cryptography.io/
+
More details on the attack can be found on https://www.krackattacks.com/
Example of use (from the scapy shell):
@@ -25,4 +27,10 @@
- Client is likely not vulnerable to CVE-2017-13080
"""
-from scapy.modules.krack.automaton import KrackAP # noqa: F401
+from scapy.config import conf
+
+if conf.crypto_valid:
+ from scapy.modules.krack.automaton import KrackAP # noqa: F401
+else:
+ raise ImportError("Cannot import Krack module due to missing dependency. "
+ "Please install python{3}-cryptography v1.7+.")
|
{"golden_diff": "diff --git a/scapy/modules/krack/__init__.py b/scapy/modules/krack/__init__.py\n--- a/scapy/modules/krack/__init__.py\n+++ b/scapy/modules/krack/__init__.py\n@@ -1,5 +1,7 @@\n \"\"\"Module implementing Krack Attack on client, as a custom WPA Access Point\n \n+Requires the python cryptography package v1.7+. See https://cryptography.io/\n+\n More details on the attack can be found on https://www.krackattacks.com/\n \n Example of use (from the scapy shell):\n@@ -25,4 +27,10 @@\n - Client is likely not vulnerable to CVE-2017-13080\n \"\"\"\n \n-from scapy.modules.krack.automaton import KrackAP # noqa: F401\n+from scapy.config import conf\n+\n+if conf.crypto_valid:\n+ from scapy.modules.krack.automaton import KrackAP # noqa: F401\n+else:\n+ raise ImportError(\"Cannot import Krack module due to missing dependency. \"\n+ \"Please install python{3}-cryptography v1.7+.\")\n", "issue": "KRACK module requires python-cryptography\nWhen `python-cryptography` or `python3-cryptography` packages are not installed, the KRACK module cannot be loaded.\r\n\r\n```\r\n>>> load_module(\"krack\")\r\nERROR: Loading module scapy.modules.krack\r\nTraceback (most recent call last):\r\n File \"/home/ria/scapy/scapy/main.py\", line 150, in _load\r\n mod = importlib.import_module(module)\r\n File \"/usr/lib/python3.7/importlib/__init__.py\", line 127, in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n File \"<frozen importlib._bootstrap>\", line 1006, in _gcd_import\r\n File \"<frozen importlib._bootstrap>\", line 983, in _find_and_load\r\n File \"<frozen importlib._bootstrap>\", line 967, in _find_and_load_unlocked\r\n File \"<frozen importlib._bootstrap>\", line 677, in _load_unlocked\r\n File \"<frozen importlib._bootstrap_external>\", line 728, in exec_module\r\n File \"<frozen importlib._bootstrap>\", line 219, in _call_with_frames_removed\r\n File \"/home/ria/scapy/scapy/modules/krack/__init__.py\", line 28, in <module>\r\n from scapy.modules.krack.automaton import KrackAP # noqa: F401\r\n File \"/home/ria/scapy/scapy/modules/krack/automaton.py\", line 7, in <module>\r\n from cryptography.hazmat.primitives import hashes\r\nModuleNotFoundError: No module named 'cryptography'\r\n```\r\n\r\nCalling @commial to assess whether the module could still offer functionalities with a `crypto_valid` around some blocks. But at first glance I believe a missing `cryptography` should completely prevent importing the module.\r\n\r\nhttps://github.com/secdev/scapy/blob/a58e1b90a704c394216a0b5a864a50931754bdf7/scapy/modules/krack/automaton.py#L6-L10\r\n\r\nhttps://github.com/secdev/scapy/blob/a58e1b90a704c394216a0b5a864a50931754bdf7/scapy/modules/krack/crypto.py#L6-L9\n", "before_files": [{"content": "\"\"\"Module implementing Krack Attack on client, as a custom WPA Access Point\n\nMore details on the attack can be found on https://www.krackattacks.com/\n\nExample of use (from the scapy shell):\n>>> load_module(\"krack\")\n>>> KrackAP(\n iface=\"mon0\", # A monitor interface\n ap_mac='11:22:33:44:55:66', # MAC (BSSID) to use\n ssid=\"TEST_KRACK\", # SSID\n passphrase=\"testtest\", # Associated passphrase\n).run()\n\nThen, on the target device, connect to \"TEST_KRACK\" using \"testtest\" as the\npassphrase.\nThe output logs will indicate if one of the vulnerability have been triggered.\n\nOutputs for vulnerable devices:\n- IV re-use!! Client seems to be vulnerable to handshake 3/4 replay\n (CVE-2017-13077)\n- Broadcast packet accepted twice!! 
(CVE-2017-13080)\n- Client has installed an all zero encryption key (TK)!!\n\nFor patched devices:\n- Client is likely not vulnerable to CVE-2017-13080\n\"\"\"\n\nfrom scapy.modules.krack.automaton import KrackAP # noqa: F401\n", "path": "scapy/modules/krack/__init__.py"}], "after_files": [{"content": "\"\"\"Module implementing Krack Attack on client, as a custom WPA Access Point\n\nRequires the python cryptography package v1.7+. See https://cryptography.io/\n\nMore details on the attack can be found on https://www.krackattacks.com/\n\nExample of use (from the scapy shell):\n>>> load_module(\"krack\")\n>>> KrackAP(\n iface=\"mon0\", # A monitor interface\n ap_mac='11:22:33:44:55:66', # MAC (BSSID) to use\n ssid=\"TEST_KRACK\", # SSID\n passphrase=\"testtest\", # Associated passphrase\n).run()\n\nThen, on the target device, connect to \"TEST_KRACK\" using \"testtest\" as the\npassphrase.\nThe output logs will indicate if one of the vulnerability have been triggered.\n\nOutputs for vulnerable devices:\n- IV re-use!! Client seems to be vulnerable to handshake 3/4 replay\n (CVE-2017-13077)\n- Broadcast packet accepted twice!! (CVE-2017-13080)\n- Client has installed an all zero encryption key (TK)!!\n\nFor patched devices:\n- Client is likely not vulnerable to CVE-2017-13080\n\"\"\"\n\nfrom scapy.config import conf\n\nif conf.crypto_valid:\n from scapy.modules.krack.automaton import KrackAP # noqa: F401\nelse:\n raise ImportError(\"Cannot import Krack module due to missing dependency. \"\n \"Please install python{3}-cryptography v1.7+.\")\n", "path": "scapy/modules/krack/__init__.py"}]}
| 1,139 | 258 |
gh_patches_debug_5192
|
rasdani/github-patches
|
git_diff
|
cowrie__cowrie-1421
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
$UID not spitting out UID
**Describe the bug**
A hacker found a way to get inside the cowrie box and executed "echo $UID". The reply that came back was empty, and the hacker disconnected. My normal box returned the UID.
**To Reproduce**
Steps to reproduce the behavior:
1. Connect to linux machine
2. Type in "Echo $UID" (response should be something like 1001 or 0)
3. Connect to cowrie
4. Type in "echo $UID"
5. See nothing replied on screen.
**Expected behavior**
When someone uses the variable $UID, get the variable from the honeyfs/etc/passwd file or return 0.
**Server (please complete the following information):**
- OS: Linux nanopineo2 5.8.6-sunxi64 #20.08.2 SMP Fri Sep 4 08:52:31 CEST 2020 aarch64 GNU/Linux
- Python: Python 3.7.3
**Additional context**
My hackers seem to get smarter each day...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cowrie/shell/session.py`
Content:
```
1 # Copyright (c) 2009-2014 Upi Tamminen <[email protected]>
2 # See the COPYRIGHT file for more information
3
4 from __future__ import absolute_import, division
5
6 from twisted.conch.interfaces import ISession
7 from twisted.conch.ssh import session
8 from twisted.python import log
9
10 from zope.interface import implementer
11
12 from cowrie.insults import insults
13 from cowrie.shell import protocol
14
15
16 @implementer(ISession)
17 class SSHSessionForCowrieUser(object):
18
19 def __init__(self, avatar, reactor=None):
20 """
21 Construct an C{SSHSessionForCowrieUser}.
22
23 @param avatar: The L{CowrieUser} for whom this is an SSH session.
24 @param reactor: An L{IReactorProcess} used to handle shell and exec
25 requests. Uses the default reactor if None.
26 """
27 self.protocol = None
28 self.avatar = avatar
29 self.server = avatar.server
30 self.uid = avatar.uid
31 self.gid = avatar.gid
32 self.username = avatar.username
33 self.environ = {
34 'LOGNAME': self.username,
35 'SHELL': '/bin/bash',
36 'USER': self.username,
37 'HOME': self.avatar.home,
38 'TMOUT': '1800',
39 'UID': self.uid}
40 if self.uid == 0:
41 self.environ['PATH'] = '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
42 else:
43 self.environ['PATH'] = '/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games'
44
45 self.server.initFileSystem()
46
47 if self.avatar.temporary:
48 self.server.fs.mkdir(self.avatar.home, self.uid, self.gid, 4096, 755)
49
50 def openShell(self, processprotocol):
51 self.protocol = insults.LoggingServerProtocol(
52 protocol.HoneyPotInteractiveProtocol, self)
53 self.protocol.makeConnection(processprotocol)
54 processprotocol.makeConnection(session.wrapProtocol(self.protocol))
55
56 def getPty(self, terminal, windowSize, attrs):
57 self.environ['TERM'] = terminal.decode("utf-8")
58 log.msg(
59 eventid='cowrie.client.size',
60 width=windowSize[1],
61 height=windowSize[0],
62 format='Terminal Size: %(width)s %(height)s'
63 )
64 self.windowSize = windowSize
65 return None
66
67 def execCommand(self, processprotocol, cmd):
68 self.protocol = insults.LoggingServerProtocol(
69 protocol.HoneyPotExecProtocol, self, cmd)
70 self.protocol.makeConnection(processprotocol)
71 processprotocol.makeConnection(session.wrapProtocol(self.protocol))
72
73 def closed(self):
74 """
75 this is reliably called on both logout and disconnect
76 we notify the protocol here we lost the connection
77 """
78 if self.protocol:
79 self.protocol.connectionLost("disconnected")
80 self.protocol = None
81
82 def eofReceived(self):
83 if self.protocol:
84 self.protocol.eofReceived()
85
86 def windowChanged(self, windowSize):
87 self.windowSize = windowSize
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cowrie/shell/session.py b/src/cowrie/shell/session.py
--- a/src/cowrie/shell/session.py
+++ b/src/cowrie/shell/session.py
@@ -36,7 +36,7 @@
'USER': self.username,
'HOME': self.avatar.home,
'TMOUT': '1800',
- 'UID': self.uid}
+ 'UID': str(self.uid)}
if self.uid == 0:
self.environ['PATH'] = '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
else:
|
{"golden_diff": "diff --git a/src/cowrie/shell/session.py b/src/cowrie/shell/session.py\n--- a/src/cowrie/shell/session.py\n+++ b/src/cowrie/shell/session.py\n@@ -36,7 +36,7 @@\n 'USER': self.username,\n 'HOME': self.avatar.home,\n 'TMOUT': '1800',\n- 'UID': self.uid}\n+ 'UID': str(self.uid)}\n if self.uid == 0:\n self.environ['PATH'] = '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'\n else:\n", "issue": "$UID not spitting out UID\n**Describe the bug**\r\nA hacker found a way to get inside the cowrie box and executed \"echo $UID\". The reply that came back was empty, and the hacker disconnected. My normal box returned the UID.\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Connect to linux machine\r\n2. Type in \"Echo $UID\" (response should be something like 1001 or 0)\r\n3. Connect to cowrie\r\n4. Type in \"echo $UID\"\r\n5. See nothing replied on screen.\r\n\r\n**Expected behavior**\r\nWhen someone uses the variable $UID, get the variable from the honeyfs/etc/passwd file or return 0.\r\n\r\n**Server (please complete the following information):**\r\n - OS: Linux nanopineo2 5.8.6-sunxi64 #20.08.2 SMP Fri Sep 4 08:52:31 CEST 2020 aarch64 GNU/Linux\r\n - Python: Python 3.7.3\r\n\r\n**Additional context**\r\nMy hackers seem to get smarter each day...\r\n\n", "before_files": [{"content": "# Copyright (c) 2009-2014 Upi Tamminen <[email protected]>\n# See the COPYRIGHT file for more information\n\nfrom __future__ import absolute_import, division\n\nfrom twisted.conch.interfaces import ISession\nfrom twisted.conch.ssh import session\nfrom twisted.python import log\n\nfrom zope.interface import implementer\n\nfrom cowrie.insults import insults\nfrom cowrie.shell import protocol\n\n\n@implementer(ISession)\nclass SSHSessionForCowrieUser(object):\n\n def __init__(self, avatar, reactor=None):\n \"\"\"\n Construct an C{SSHSessionForCowrieUser}.\n\n @param avatar: The L{CowrieUser} for whom this is an SSH session.\n @param reactor: An L{IReactorProcess} used to handle shell and exec\n requests. 
Uses the default reactor if None.\n \"\"\"\n self.protocol = None\n self.avatar = avatar\n self.server = avatar.server\n self.uid = avatar.uid\n self.gid = avatar.gid\n self.username = avatar.username\n self.environ = {\n 'LOGNAME': self.username,\n 'SHELL': '/bin/bash',\n 'USER': self.username,\n 'HOME': self.avatar.home,\n 'TMOUT': '1800',\n 'UID': self.uid}\n if self.uid == 0:\n self.environ['PATH'] = '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'\n else:\n self.environ['PATH'] = '/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games'\n\n self.server.initFileSystem()\n\n if self.avatar.temporary:\n self.server.fs.mkdir(self.avatar.home, self.uid, self.gid, 4096, 755)\n\n def openShell(self, processprotocol):\n self.protocol = insults.LoggingServerProtocol(\n protocol.HoneyPotInteractiveProtocol, self)\n self.protocol.makeConnection(processprotocol)\n processprotocol.makeConnection(session.wrapProtocol(self.protocol))\n\n def getPty(self, terminal, windowSize, attrs):\n self.environ['TERM'] = terminal.decode(\"utf-8\")\n log.msg(\n eventid='cowrie.client.size',\n width=windowSize[1],\n height=windowSize[0],\n format='Terminal Size: %(width)s %(height)s'\n )\n self.windowSize = windowSize\n return None\n\n def execCommand(self, processprotocol, cmd):\n self.protocol = insults.LoggingServerProtocol(\n protocol.HoneyPotExecProtocol, self, cmd)\n self.protocol.makeConnection(processprotocol)\n processprotocol.makeConnection(session.wrapProtocol(self.protocol))\n\n def closed(self):\n \"\"\"\n this is reliably called on both logout and disconnect\n we notify the protocol here we lost the connection\n \"\"\"\n if self.protocol:\n self.protocol.connectionLost(\"disconnected\")\n self.protocol = None\n\n def eofReceived(self):\n if self.protocol:\n self.protocol.eofReceived()\n\n def windowChanged(self, windowSize):\n self.windowSize = windowSize\n", "path": "src/cowrie/shell/session.py"}], "after_files": [{"content": "# Copyright (c) 2009-2014 Upi Tamminen <[email protected]>\n# See the COPYRIGHT file for more information\n\nfrom __future__ import absolute_import, division\n\nfrom twisted.conch.interfaces import ISession\nfrom twisted.conch.ssh import session\nfrom twisted.python import log\n\nfrom zope.interface import implementer\n\nfrom cowrie.insults import insults\nfrom cowrie.shell import protocol\n\n\n@implementer(ISession)\nclass SSHSessionForCowrieUser(object):\n\n def __init__(self, avatar, reactor=None):\n \"\"\"\n Construct an C{SSHSessionForCowrieUser}.\n\n @param avatar: The L{CowrieUser} for whom this is an SSH session.\n @param reactor: An L{IReactorProcess} used to handle shell and exec\n requests. 
Uses the default reactor if None.\n \"\"\"\n self.protocol = None\n self.avatar = avatar\n self.server = avatar.server\n self.uid = avatar.uid\n self.gid = avatar.gid\n self.username = avatar.username\n self.environ = {\n 'LOGNAME': self.username,\n 'SHELL': '/bin/bash',\n 'USER': self.username,\n 'HOME': self.avatar.home,\n 'TMOUT': '1800',\n 'UID': str(self.uid)}\n if self.uid == 0:\n self.environ['PATH'] = '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'\n else:\n self.environ['PATH'] = '/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games'\n\n self.server.initFileSystem()\n\n if self.avatar.temporary:\n self.server.fs.mkdir(self.avatar.home, self.uid, self.gid, 4096, 755)\n\n def openShell(self, processprotocol):\n self.protocol = insults.LoggingServerProtocol(\n protocol.HoneyPotInteractiveProtocol, self)\n self.protocol.makeConnection(processprotocol)\n processprotocol.makeConnection(session.wrapProtocol(self.protocol))\n\n def getPty(self, terminal, windowSize, attrs):\n self.environ['TERM'] = terminal.decode(\"utf-8\")\n log.msg(\n eventid='cowrie.client.size',\n width=windowSize[1],\n height=windowSize[0],\n format='Terminal Size: %(width)s %(height)s'\n )\n self.windowSize = windowSize\n return None\n\n def execCommand(self, processprotocol, cmd):\n self.protocol = insults.LoggingServerProtocol(\n protocol.HoneyPotExecProtocol, self, cmd)\n self.protocol.makeConnection(processprotocol)\n processprotocol.makeConnection(session.wrapProtocol(self.protocol))\n\n def closed(self):\n \"\"\"\n this is reliably called on both logout and disconnect\n we notify the protocol here we lost the connection\n \"\"\"\n if self.protocol:\n self.protocol.connectionLost(\"disconnected\")\n self.protocol = None\n\n def eofReceived(self):\n if self.protocol:\n self.protocol.eofReceived()\n\n def windowChanged(self, windowSize):\n self.windowSize = windowSize\n", "path": "src/cowrie/shell/session.py"}]}
| 1,332 | 136 |
gh_patches_debug_21925
|
rasdani/github-patches
|
git_diff
|
Parsl__parsl-2753
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Link to or provide instructions for example Perlmutter config
Since Cori is now retired, it can be removed from the [Configuration](https://parsl.readthedocs.io/en/stable/userguide/configuring.html?highlight=nersc#cori-nersc) section of the docs. In its place, it would be worthwhile to add (or link to) an example config for Perlmutter at NERSC, the details of which can be found [here](https://docs.nersc.gov/jobs/workflow/parsl/).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `parsl/configs/cori.py`
Content:
```
1 from parsl.config import Config
2 from parsl.providers import SlurmProvider
3 from parsl.launchers import SrunLauncher
4 from parsl.executors import HighThroughputExecutor
5 from parsl.addresses import address_by_interface
6
7
8 config = Config(
9 executors=[
10 HighThroughputExecutor(
11 label='Cori_HTEX_multinode',
12 # This is the network interface on the login node to
13 # which compute nodes can communicate
14 address=address_by_interface('bond0.144'),
15 cores_per_worker=2,
16 provider=SlurmProvider(
17 'regular', # Partition / QOS
18 nodes_per_block=2,
19 init_blocks=1,
20 # string to prepend to #SBATCH blocks in the submit
21 # script to the scheduler eg: '#SBATCH --constraint=knl,quad,cache'
22 scheduler_options='',
23 # Command to be run before starting a worker, such as:
24 # 'module load Anaconda; source activate parsl_env'.
25 worker_init='',
26 # We request all hyperthreads on a node.
27 launcher=SrunLauncher(overrides='-c 272'),
28 walltime='00:10:00',
29 # Slurm scheduler on Cori can be slow at times,
30 # increase the command timeouts
31 cmd_timeout=120,
32 ),
33 )
34 ]
35 )
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/parsl/configs/cori.py b/parsl/configs/cori.py
deleted file mode 100644
--- a/parsl/configs/cori.py
+++ /dev/null
@@ -1,35 +0,0 @@
-from parsl.config import Config
-from parsl.providers import SlurmProvider
-from parsl.launchers import SrunLauncher
-from parsl.executors import HighThroughputExecutor
-from parsl.addresses import address_by_interface
-
-
-config = Config(
- executors=[
- HighThroughputExecutor(
- label='Cori_HTEX_multinode',
- # This is the network interface on the login node to
- # which compute nodes can communicate
- address=address_by_interface('bond0.144'),
- cores_per_worker=2,
- provider=SlurmProvider(
- 'regular', # Partition / QOS
- nodes_per_block=2,
- init_blocks=1,
- # string to prepend to #SBATCH blocks in the submit
- # script to the scheduler eg: '#SBATCH --constraint=knl,quad,cache'
- scheduler_options='',
- # Command to be run before starting a worker, such as:
- # 'module load Anaconda; source activate parsl_env'.
- worker_init='',
- # We request all hyperthreads on a node.
- launcher=SrunLauncher(overrides='-c 272'),
- walltime='00:10:00',
- # Slurm scheduler on Cori can be slow at times,
- # increase the command timeouts
- cmd_timeout=120,
- ),
- )
- ]
-)
|
{"golden_diff": "diff --git a/parsl/configs/cori.py b/parsl/configs/cori.py\ndeleted file mode 100644\n--- a/parsl/configs/cori.py\n+++ /dev/null\n@@ -1,35 +0,0 @@\n-from parsl.config import Config\n-from parsl.providers import SlurmProvider\n-from parsl.launchers import SrunLauncher\n-from parsl.executors import HighThroughputExecutor\n-from parsl.addresses import address_by_interface\n-\n-\n-config = Config(\n- executors=[\n- HighThroughputExecutor(\n- label='Cori_HTEX_multinode',\n- # This is the network interface on the login node to\n- # which compute nodes can communicate\n- address=address_by_interface('bond0.144'),\n- cores_per_worker=2,\n- provider=SlurmProvider(\n- 'regular', # Partition / QOS\n- nodes_per_block=2,\n- init_blocks=1,\n- # string to prepend to #SBATCH blocks in the submit\n- # script to the scheduler eg: '#SBATCH --constraint=knl,quad,cache'\n- scheduler_options='',\n- # Command to be run before starting a worker, such as:\n- # 'module load Anaconda; source activate parsl_env'.\n- worker_init='',\n- # We request all hyperthreads on a node.\n- launcher=SrunLauncher(overrides='-c 272'),\n- walltime='00:10:00',\n- # Slurm scheduler on Cori can be slow at times,\n- # increase the command timeouts\n- cmd_timeout=120,\n- ),\n- )\n- ]\n-)\n", "issue": "Link to or provide instructions for example Perlmutter config\nSince Cori is now retired, it can be removed from the [Configuration](https://parsl.readthedocs.io/en/stable/userguide/configuring.html?highlight=nersc#cori-nersc) section of the docs. In its place, it would be worthwhile to add (or link to) an example config for Perlmutter at NERSC, the details of which can be found [here](https://docs.nersc.gov/jobs/workflow/parsl/).\n", "before_files": [{"content": "from parsl.config import Config\nfrom parsl.providers import SlurmProvider\nfrom parsl.launchers import SrunLauncher\nfrom parsl.executors import HighThroughputExecutor\nfrom parsl.addresses import address_by_interface\n\n\nconfig = Config(\n executors=[\n HighThroughputExecutor(\n label='Cori_HTEX_multinode',\n # This is the network interface on the login node to\n # which compute nodes can communicate\n address=address_by_interface('bond0.144'),\n cores_per_worker=2,\n provider=SlurmProvider(\n 'regular', # Partition / QOS\n nodes_per_block=2,\n init_blocks=1,\n # string to prepend to #SBATCH blocks in the submit\n # script to the scheduler eg: '#SBATCH --constraint=knl,quad,cache'\n scheduler_options='',\n # Command to be run before starting a worker, such as:\n # 'module load Anaconda; source activate parsl_env'.\n worker_init='',\n # We request all hyperthreads on a node.\n launcher=SrunLauncher(overrides='-c 272'),\n walltime='00:10:00',\n # Slurm scheduler on Cori can be slow at times,\n # increase the command timeouts\n cmd_timeout=120,\n ),\n )\n ]\n)\n", "path": "parsl/configs/cori.py"}], "after_files": [{"content": null, "path": "parsl/configs/cori.py"}]}
| 726 | 375 |
gh_patches_debug_127
|
rasdani/github-patches
|
git_diff
|
googleapis__google-cloud-python-6232
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Re-generate library using tasks/synth.py
This PR was created by autosynth.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tasks/synth.py`
Content:
```
1 # Copyright 2018 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """This script is used to synthesize generated parts of this library."""
16
17 import synthtool as s
18 import synthtool.gcp as gcp
19 import logging
20
21 logging.basicConfig(level=logging.DEBUG)
22
23 gapic = gcp.GAPICGenerator()
24 common = gcp.CommonTemplates()
25 excludes = [
26 'README.rst',
27 'setup.py',
28 'docs/conf.py',
29 'docs/index.rst',
30 ]
31
32 for version in ['v2beta2', 'v2beta3']:
33 library = gapic.py_library(
34 'tasks', version,
35 config_path=f'artman_cloudtasks_{version}.yaml')
36
37 s.copy(library, excludes=excludes)
38
39 # Fix unindentation of bullet list second line
40 s.replace(
41 f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',
42 '( \* .*\n )([^\s*])',
43 '\g<1> \g<2>')
44
45 s.replace(
46 f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',
47 '(Google IAM .*?_) ',
48 '\g<1>_ ')
49
50 # Issues with Anonymous ('__') links. Change to named.
51 s.replace(
52 f"google/cloud/tasks_{version}/proto/*.py",
53 ">`__",
54 ">`_")
55
56 # Issue in v2beta2
57 s.replace(
58 f'google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py',
59 r'(Sample filter \\"app_engine_http_target: )\*\\".',
60 '\g<1>\\*\\".')
61
62 # Wrapped link fails due to space in link (v2beta2)
63 s.replace(
64 f"google/cloud/tasks_v2beta2/proto/queue_pb2.py",
65 '(uests in queue.yaml/xml) <\n\s+',
66 '\g<1>\n <')
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/tasks/synth.py b/tasks/synth.py
--- a/tasks/synth.py
+++ b/tasks/synth.py
@@ -25,6 +25,7 @@
excludes = [
'README.rst',
'setup.py',
+ 'nox*.py',
'docs/conf.py',
'docs/index.rst',
]
|
{"golden_diff": "diff --git a/tasks/synth.py b/tasks/synth.py\n--- a/tasks/synth.py\n+++ b/tasks/synth.py\n@@ -25,6 +25,7 @@\n excludes = [\n 'README.rst',\n 'setup.py',\n+ 'nox*.py',\n 'docs/conf.py',\n 'docs/index.rst',\n ]\n", "issue": "Re-generate library using tasks/synth.py\nThis PR was created by autosynth.\n", "before_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This script is used to synthesize generated parts of this library.\"\"\"\n\nimport synthtool as s\nimport synthtool.gcp as gcp\nimport logging\n\nlogging.basicConfig(level=logging.DEBUG)\n\ngapic = gcp.GAPICGenerator()\ncommon = gcp.CommonTemplates()\nexcludes = [\n 'README.rst',\n 'setup.py',\n 'docs/conf.py',\n 'docs/index.rst',\n]\n\nfor version in ['v2beta2', 'v2beta3']:\n library = gapic.py_library(\n 'tasks', version,\n config_path=f'artman_cloudtasks_{version}.yaml')\n\n s.copy(library, excludes=excludes)\n\n # Fix unindentation of bullet list second line\n s.replace(\n f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',\n '( \\* .*\\n )([^\\s*])',\n '\\g<1> \\g<2>')\n\n s.replace(\n f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',\n '(Google IAM .*?_) ',\n '\\g<1>_ ')\n\n # Issues with Anonymous ('__') links. 
Change to named.\n s.replace(\n f\"google/cloud/tasks_{version}/proto/*.py\",\n \">`__\",\n \">`_\")\n\n# Issue in v2beta2\ns.replace(\n f'google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py',\n r'(Sample filter \\\\\"app_engine_http_target: )\\*\\\\\".',\n '\\g<1>\\\\*\\\\\".')\n\n# Wrapped link fails due to space in link (v2beta2)\ns.replace(\n f\"google/cloud/tasks_v2beta2/proto/queue_pb2.py\",\n '(uests in queue.yaml/xml) <\\n\\s+',\n '\\g<1>\\n <')\n", "path": "tasks/synth.py"}], "after_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This script is used to synthesize generated parts of this library.\"\"\"\n\nimport synthtool as s\nimport synthtool.gcp as gcp\nimport logging\n\nlogging.basicConfig(level=logging.DEBUG)\n\ngapic = gcp.GAPICGenerator()\ncommon = gcp.CommonTemplates()\nexcludes = [\n 'README.rst',\n 'setup.py',\n 'nox*.py',\n 'docs/conf.py',\n 'docs/index.rst',\n]\n\nfor version in ['v2beta2', 'v2beta3']:\n library = gapic.py_library(\n 'tasks', version,\n config_path=f'artman_cloudtasks_{version}.yaml')\n\n s.copy(library, excludes=excludes)\n\n # Fix unindentation of bullet list second line\n s.replace(\n f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',\n '( \\* .*\\n )([^\\s*])',\n '\\g<1> \\g<2>')\n\n s.replace(\n f'google/cloud/tasks_{version}/gapic/cloud_tasks_client.py',\n '(Google IAM .*?_) ',\n '\\g<1>_ ')\n\n # Issues with Anonymous ('__') links. Change to named.\n s.replace(\n f\"google/cloud/tasks_{version}/proto/*.py\",\n \">`__\",\n \">`_\")\n\n# Issue in v2beta2\ns.replace(\n f'google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py',\n r'(Sample filter \\\\\"app_engine_http_target: )\\*\\\\\".',\n '\\g<1>\\\\*\\\\\".')\n\n# Wrapped link fails due to space in link (v2beta2)\ns.replace(\n f\"google/cloud/tasks_v2beta2/proto/queue_pb2.py\",\n '(uests in queue.yaml/xml) <\\n\\s+',\n '\\g<1>\\n <')\n", "path": "tasks/synth.py"}]}
| 930 | 75 |
gh_patches_debug_21647
|
rasdani/github-patches
|
git_diff
|
microsoft__ptvsd-1148
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Provide a public API to register threads not created by Python for debugging
This can be done by simply wrapping `pydevd.settrace(suspend=False)`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/ptvsd/attach_server.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License. See LICENSE in the project root
3 # for license information.
4
5 from ptvsd._remote import (
6 attach as ptvsd_attach,
7 enable_attach as ptvsd_enable_attach,
8 _pydevd_settrace,
9 )
10 from ptvsd.wrapper import debugger_attached
11 import sys
12 from _pydevd_bundle.pydevd_constants import get_global_debugger
13 from pydevd_file_utils import get_abs_path_real_path_and_base_from_frame
14
15 WAIT_TIMEOUT = 1.0
16
17 DEFAULT_HOST = '0.0.0.0'
18 DEFAULT_PORT = 5678
19
20 _pending_threads = set()
21
22
23 def wait_for_attach(timeout=None):
24 """If a remote debugger is attached, returns immediately. Otherwise,
25 blocks until a remote debugger attaches to this process, or until the
26 optional timeout occurs.
27
28 Parameters
29 ----------
30 timeout : float, optional
31 The timeout for the operation in seconds (or fractions thereof).
32 """
33 debugger_attached.wait(timeout)
34
35
36 def enable_attach(address=(DEFAULT_HOST, DEFAULT_PORT), redirect_output=True):
37 """Enables a client to attach to this process remotely to debug Python code.
38
39 Parameters
40 ----------
41 address : (str, int), optional
42 Specifies the interface and port on which the debugging server should
43 listen for TCP connections. It is in the same format as used for
44 regular sockets of the `socket.AF_INET` family, i.e. a tuple of
45 ``(hostname, port)``. On client side, the server is identified by the
46 Qualifier string in the usual ``'hostname:port'`` format, e.g.:
47 ``'myhost.cloudapp.net:5678'``. Default is ``('0.0.0.0', 5678)``.
48 redirect_output : bool, optional
49 Specifies whether any output (on both `stdout` and `stderr`) produced
50 by this program should be sent to the debugger. Default is ``True``.
51
52 Notes
53 -----
54 This function returns immediately after setting up the debugging server,
55 and does not block program execution. If you need to block until debugger
56 is attached, call `ptvsd.wait_for_attach`. The debugger can be detached
57 and re-attached multiple times after `enable_attach` is called.
58
59 Only the thread on which this function is called, and any threads that are
60 created after it returns, will be visible in the debugger once it is
61 attached. Any threads that are already running before this function is
62 called will not be visible.
63 """
64 if is_attached():
65 return
66 debugger_attached.clear()
67
68 # Ensure port is int
69 port = address[1]
70 address = (address[0], port if type(port) is int else int(port))
71
72 ptvsd_enable_attach(
73 address,
74 redirect_output=redirect_output,
75 )
76
77
78 def attach(address, redirect_output=True):
79 """Attaches this process to the debugger listening on a given address.
80
81 Parameters
82 ----------
83 address : (str, int), optional
84 Specifies the interface and port on which the debugger is listening
85 for TCP connections. It is in the same format as used for
86 regular sockets of the `socket.AF_INET` family, i.e. a tuple of
87 ``(hostname, port)``.
88 redirect_output : bool, optional
89 Specifies whether any output (on both `stdout` and `stderr`) produced
90 by this program should be sent to the debugger. Default is ``True``.
91 """
92 if is_attached():
93 return
94 debugger_attached.clear()
95
96 # Ensure port is int
97 port = address[1]
98 address = (address[0], port if type(port) is int else int(port))
99
100 ptvsd_attach(address, redirect_output=redirect_output)
101
102 # TODO: Add disable_attach()?
103
104
105 def is_attached():
106 """Returns ``True`` if debugger is attached, ``False`` otherwise."""
107 return debugger_attached.isSet()
108
109
110 def break_into_debugger():
111 """If a remote debugger is attached, pauses execution of all threads,
112 and breaks into the debugger with current thread as active.
113 """
114 if not is_attached():
115 return
116
117 # Get the first frame in the stack that's not an internal frame.
118 global_debugger = get_global_debugger()
119 stop_at_frame = sys._getframe().f_back
120 while stop_at_frame is not None and global_debugger.get_file_type(
121 get_abs_path_real_path_and_base_from_frame(stop_at_frame)) == global_debugger.PYDEV_FILE:
122 stop_at_frame = stop_at_frame.f_back
123
124 _pydevd_settrace(
125 suspend=True,
126 trace_only_current_thread=True,
127 patch_multiprocessing=False,
128 stop_at_frame=stop_at_frame,
129 )
130 stop_at_frame = None
131
```
Path: `src/ptvsd/__init__.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License. See LICENSE in the project root
3 # for license information.
4
5 __all__ = [
6 '__version__', '__author__',
7 'enable_attach', 'wait_for_attach', 'break_into_debugger', 'is_attached',
8 ]
9
10
11 # "force_pydevd" must be imported first to ensure (via side effects)
12 # that the ptvsd-vendored copy of pydevd gets used.
13 from ._vendored import force_pydevd
14 from ptvsd.version import __version__, __author__
15 from ptvsd.attach_server import (
16 enable_attach, wait_for_attach, break_into_debugger, is_attached,
17 )
18 del force_pydevd
19
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/ptvsd/__init__.py b/src/ptvsd/__init__.py
--- a/src/ptvsd/__init__.py
+++ b/src/ptvsd/__init__.py
@@ -12,7 +12,12 @@
# that the ptvsd-vendored copy of pydevd gets used.
from ._vendored import force_pydevd
from ptvsd.version import __version__, __author__
-from ptvsd.attach_server import (
- enable_attach, wait_for_attach, break_into_debugger, is_attached,
+from ptvsd.attach_server import ( # noqa
+ attach,
+ break_into_debugger,
+ debug_this_thread,
+ enable_attach,
+ is_attached,
+ wait_for_attach,
)
del force_pydevd
diff --git a/src/ptvsd/attach_server.py b/src/ptvsd/attach_server.py
--- a/src/ptvsd/attach_server.py
+++ b/src/ptvsd/attach_server.py
@@ -9,6 +9,8 @@
)
from ptvsd.wrapper import debugger_attached
import sys
+
+import pydevd
from _pydevd_bundle.pydevd_constants import get_global_debugger
from pydevd_file_utils import get_abs_path_real_path_and_base_from_frame
@@ -128,3 +130,7 @@
stop_at_frame=stop_at_frame,
)
stop_at_frame = None
+
+
+def debug_this_thread():
+ pydevd.settrace(suspend=False)
|
{"golden_diff": "diff --git a/src/ptvsd/__init__.py b/src/ptvsd/__init__.py\n--- a/src/ptvsd/__init__.py\n+++ b/src/ptvsd/__init__.py\n@@ -12,7 +12,12 @@\n # that the ptvsd-vendored copy of pydevd gets used.\n from ._vendored import force_pydevd\n from ptvsd.version import __version__, __author__\n-from ptvsd.attach_server import (\n- enable_attach, wait_for_attach, break_into_debugger, is_attached,\n+from ptvsd.attach_server import ( # noqa\n+ attach,\n+ break_into_debugger,\n+ debug_this_thread,\n+ enable_attach,\n+ is_attached,\n+ wait_for_attach,\n )\n del force_pydevd\ndiff --git a/src/ptvsd/attach_server.py b/src/ptvsd/attach_server.py\n--- a/src/ptvsd/attach_server.py\n+++ b/src/ptvsd/attach_server.py\n@@ -9,6 +9,8 @@\n )\n from ptvsd.wrapper import debugger_attached\n import sys\n+\n+import pydevd\n from _pydevd_bundle.pydevd_constants import get_global_debugger\n from pydevd_file_utils import get_abs_path_real_path_and_base_from_frame\n \n@@ -128,3 +130,7 @@\n stop_at_frame=stop_at_frame,\n )\n stop_at_frame = None\n+\n+\n+def debug_this_thread():\n+ pydevd.settrace(suspend=False)\n", "issue": "Provide a public API to register threads not created by Python for debugging\nThis can be done by simply wrapping `pydevd.settrace(suspend=False)`\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\nfrom ptvsd._remote import (\n attach as ptvsd_attach,\n enable_attach as ptvsd_enable_attach,\n _pydevd_settrace,\n)\nfrom ptvsd.wrapper import debugger_attached\nimport sys\nfrom _pydevd_bundle.pydevd_constants import get_global_debugger\nfrom pydevd_file_utils import get_abs_path_real_path_and_base_from_frame\n\nWAIT_TIMEOUT = 1.0\n\nDEFAULT_HOST = '0.0.0.0'\nDEFAULT_PORT = 5678\n\n_pending_threads = set()\n\n\ndef wait_for_attach(timeout=None):\n \"\"\"If a remote debugger is attached, returns immediately. Otherwise,\n blocks until a remote debugger attaches to this process, or until the\n optional timeout occurs.\n\n Parameters\n ----------\n timeout : float, optional\n The timeout for the operation in seconds (or fractions thereof).\n \"\"\"\n debugger_attached.wait(timeout)\n\n\ndef enable_attach(address=(DEFAULT_HOST, DEFAULT_PORT), redirect_output=True):\n \"\"\"Enables a client to attach to this process remotely to debug Python code.\n\n Parameters\n ----------\n address : (str, int), optional\n Specifies the interface and port on which the debugging server should\n listen for TCP connections. It is in the same format as used for\n regular sockets of the `socket.AF_INET` family, i.e. a tuple of\n ``(hostname, port)``. On client side, the server is identified by the\n Qualifier string in the usual ``'hostname:port'`` format, e.g.:\n ``'myhost.cloudapp.net:5678'``. Default is ``('0.0.0.0', 5678)``.\n redirect_output : bool, optional\n Specifies whether any output (on both `stdout` and `stderr`) produced\n by this program should be sent to the debugger. Default is ``True``.\n\n Notes\n -----\n This function returns immediately after setting up the debugging server,\n and does not block program execution. If you need to block until debugger\n is attached, call `ptvsd.wait_for_attach`. The debugger can be detached\n and re-attached multiple times after `enable_attach` is called.\n\n Only the thread on which this function is called, and any threads that are\n created after it returns, will be visible in the debugger once it is\n attached. 
Any threads that are already running before this function is\n called will not be visible.\n \"\"\"\n if is_attached():\n return\n debugger_attached.clear()\n\n # Ensure port is int\n port = address[1]\n address = (address[0], port if type(port) is int else int(port))\n\n ptvsd_enable_attach(\n address,\n redirect_output=redirect_output,\n )\n\n\ndef attach(address, redirect_output=True):\n \"\"\"Attaches this process to the debugger listening on a given address.\n\n Parameters\n ----------\n address : (str, int), optional\n Specifies the interface and port on which the debugger is listening\n for TCP connections. It is in the same format as used for\n regular sockets of the `socket.AF_INET` family, i.e. a tuple of\n ``(hostname, port)``.\n redirect_output : bool, optional\n Specifies whether any output (on both `stdout` and `stderr`) produced\n by this program should be sent to the debugger. Default is ``True``.\n \"\"\"\n if is_attached():\n return\n debugger_attached.clear()\n\n # Ensure port is int\n port = address[1]\n address = (address[0], port if type(port) is int else int(port))\n\n ptvsd_attach(address, redirect_output=redirect_output)\n\n# TODO: Add disable_attach()?\n\n\ndef is_attached():\n \"\"\"Returns ``True`` if debugger is attached, ``False`` otherwise.\"\"\"\n return debugger_attached.isSet()\n\n\ndef break_into_debugger():\n \"\"\"If a remote debugger is attached, pauses execution of all threads,\n and breaks into the debugger with current thread as active.\n \"\"\"\n if not is_attached():\n return\n\n # Get the first frame in the stack that's not an internal frame.\n global_debugger = get_global_debugger()\n stop_at_frame = sys._getframe().f_back\n while stop_at_frame is not None and global_debugger.get_file_type(\n get_abs_path_real_path_and_base_from_frame(stop_at_frame)) == global_debugger.PYDEV_FILE:\n stop_at_frame = stop_at_frame.f_back\n\n _pydevd_settrace(\n suspend=True,\n trace_only_current_thread=True,\n patch_multiprocessing=False,\n stop_at_frame=stop_at_frame,\n )\n stop_at_frame = None\n", "path": "src/ptvsd/attach_server.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\n__all__ = [\n '__version__', '__author__',\n 'enable_attach', 'wait_for_attach', 'break_into_debugger', 'is_attached',\n]\n\n\n# \"force_pydevd\" must be imported first to ensure (via side effects)\n# that the ptvsd-vendored copy of pydevd gets used.\nfrom ._vendored import force_pydevd\nfrom ptvsd.version import __version__, __author__\nfrom ptvsd.attach_server import (\n enable_attach, wait_for_attach, break_into_debugger, is_attached,\n)\ndel force_pydevd\n", "path": "src/ptvsd/__init__.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\nfrom ptvsd._remote import (\n attach as ptvsd_attach,\n enable_attach as ptvsd_enable_attach,\n _pydevd_settrace,\n)\nfrom ptvsd.wrapper import debugger_attached\nimport sys\n\nimport pydevd\nfrom _pydevd_bundle.pydevd_constants import get_global_debugger\nfrom pydevd_file_utils import get_abs_path_real_path_and_base_from_frame\n\nWAIT_TIMEOUT = 1.0\n\nDEFAULT_HOST = '0.0.0.0'\nDEFAULT_PORT = 5678\n\n_pending_threads = set()\n\n\ndef wait_for_attach(timeout=None):\n \"\"\"If a remote debugger is attached, returns immediately. 
Otherwise,\n blocks until a remote debugger attaches to this process, or until the\n optional timeout occurs.\n\n Parameters\n ----------\n timeout : float, optional\n The timeout for the operation in seconds (or fractions thereof).\n \"\"\"\n debugger_attached.wait(timeout)\n\n\ndef enable_attach(address=(DEFAULT_HOST, DEFAULT_PORT), redirect_output=True):\n \"\"\"Enables a client to attach to this process remotely to debug Python code.\n\n Parameters\n ----------\n address : (str, int), optional\n Specifies the interface and port on which the debugging server should\n listen for TCP connections. It is in the same format as used for\n regular sockets of the `socket.AF_INET` family, i.e. a tuple of\n ``(hostname, port)``. On client side, the server is identified by the\n Qualifier string in the usual ``'hostname:port'`` format, e.g.:\n ``'myhost.cloudapp.net:5678'``. Default is ``('0.0.0.0', 5678)``.\n redirect_output : bool, optional\n Specifies whether any output (on both `stdout` and `stderr`) produced\n by this program should be sent to the debugger. Default is ``True``.\n\n Notes\n -----\n This function returns immediately after setting up the debugging server,\n and does not block program execution. If you need to block until debugger\n is attached, call `ptvsd.wait_for_attach`. The debugger can be detached\n and re-attached multiple times after `enable_attach` is called.\n\n Only the thread on which this function is called, and any threads that are\n created after it returns, will be visible in the debugger once it is\n attached. Any threads that are already running before this function is\n called will not be visible.\n \"\"\"\n if is_attached():\n return\n debugger_attached.clear()\n\n # Ensure port is int\n port = address[1]\n address = (address[0], port if type(port) is int else int(port))\n\n ptvsd_enable_attach(\n address,\n redirect_output=redirect_output,\n )\n\n\ndef attach(address, redirect_output=True):\n \"\"\"Attaches this process to the debugger listening on a given address.\n\n Parameters\n ----------\n address : (str, int), optional\n Specifies the interface and port on which the debugger is listening\n for TCP connections. It is in the same format as used for\n regular sockets of the `socket.AF_INET` family, i.e. a tuple of\n ``(hostname, port)``.\n redirect_output : bool, optional\n Specifies whether any output (on both `stdout` and `stderr`) produced\n by this program should be sent to the debugger. 
Default is ``True``.\n \"\"\"\n if is_attached():\n return\n debugger_attached.clear()\n\n # Ensure port is int\n port = address[1]\n address = (address[0], port if type(port) is int else int(port))\n\n ptvsd_attach(address, redirect_output=redirect_output)\n\n# TODO: Add disable_attach()?\n\n\ndef is_attached():\n \"\"\"Returns ``True`` if debugger is attached, ``False`` otherwise.\"\"\"\n return debugger_attached.isSet()\n\n\ndef break_into_debugger():\n \"\"\"If a remote debugger is attached, pauses execution of all threads,\n and breaks into the debugger with current thread as active.\n \"\"\"\n if not is_attached():\n return\n\n # Get the first frame in the stack that's not an internal frame.\n global_debugger = get_global_debugger()\n stop_at_frame = sys._getframe().f_back\n while stop_at_frame is not None and global_debugger.get_file_type(\n get_abs_path_real_path_and_base_from_frame(stop_at_frame)) == global_debugger.PYDEV_FILE:\n stop_at_frame = stop_at_frame.f_back\n\n _pydevd_settrace(\n suspend=True,\n trace_only_current_thread=True,\n patch_multiprocessing=False,\n stop_at_frame=stop_at_frame,\n )\n stop_at_frame = None\n\n\ndef debug_this_thread():\n pydevd.settrace(suspend=False)\n", "path": "src/ptvsd/attach_server.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\n__all__ = [\n '__version__', '__author__',\n 'enable_attach', 'wait_for_attach', 'break_into_debugger', 'is_attached',\n]\n\n\n# \"force_pydevd\" must be imported first to ensure (via side effects)\n# that the ptvsd-vendored copy of pydevd gets used.\nfrom ._vendored import force_pydevd\nfrom ptvsd.version import __version__, __author__\nfrom ptvsd.attach_server import ( # noqa\n attach,\n break_into_debugger,\n debug_this_thread,\n enable_attach,\n is_attached,\n wait_for_attach,\n)\ndel force_pydevd\n", "path": "src/ptvsd/__init__.py"}]}
| 1,839 | 341 |
gh_patches_debug_35367
|
rasdani/github-patches
|
git_diff
|
mlflow__mlflow-9378
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enable `PT011`
### Summary
Enable https://beta.ruff.rs/docs/rules/pytest-raises-too-broad.
```diff
diff --git a/pyproject.toml b/pyproject.toml
index c373b48ca..8b7810c04 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -25,6 +25,7 @@ select = [
"PT009",
"PT010",
"PT012",
+ "PT011",
"PT013",
"PT018",
"PT022",
@@ -72,6 +73,7 @@ extend-exclude = [
[tool.ruff.flake8-pytest-style]
mark-parentheses = false
fixture-parentheses = false
+raises-require-match-for = ["*"]
[tool.ruff.flake8-tidy-imports]
ban-relative-imports = "all"
```
- `raises-require-match-for = ["*"]` means all errors require `match`.
### Notes
- Make sure to open a PR from a **non-master** branch.
- Sign off the commit using the `-s` flag when making a commit:
```sh
git commit -s -m "..."
# ^^ make sure to use this
```
- Include `#{issue_number}` (e.g. `#123`) in the PR description when opening a PR.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pylint_plugins/__init__.py`
Content:
```
1 from pylint_plugins.pytest_raises_checker import PytestRaisesChecker
2 from pylint_plugins.unittest_assert_raises import UnittestAssertRaises
3 from pylint_plugins.import_checker import ImportChecker
4 from pylint_plugins.assign_checker import AssignChecker
5
6
7 def register(linter):
8 linter.register_checker(PytestRaisesChecker(linter))
9 linter.register_checker(UnittestAssertRaises(linter))
10 linter.register_checker(ImportChecker(linter))
11 linter.register_checker(AssignChecker(linter))
12
```
Path: `pylint_plugins/errors.py`
Content:
```
1 from typing import NamedTuple, Dict, Tuple
2 from functools import reduce
3
4
5 class Message(NamedTuple):
6 id: str
7 name: str
8 message: str
9 reason: str
10
11 def to_dict(self) -> Dict[str, Tuple[str, str, str]]:
12 return {self.id: (self.message, self.name, self.reason)}
13
14
15 def to_msgs(*messages: Message) -> Dict[str, Tuple[str, str, str]]:
16 return reduce(lambda x, y: {**x, **y.to_dict()}, messages, {})
17
18
19 PYTEST_RAISES_WITHOUT_MATCH = Message(
20 id="W0001",
21 name="pytest-raises-without-match",
22 message="`pytest.raises` must be called with `match` argument`.",
23 reason="`pytest.raises` without `match` argument can lead to false positives.",
24 )
25
26
27 UNITTEST_PYTEST_RAISES = Message(
28 id="W0003",
29 name="unittest-assert-raises",
30 message="Use `pytest.raises` instead of `unittest.TestCase.assertRaises`.",
31 reason="To enforce 'pytest-raises-multiple-statements' Message.",
32 )
33
34
35 LAZY_BUILTIN_IMPORT = Message(
36 id="W0007",
37 name="lazy-builtin-import",
38 message="Import built-in module(s) (%s) at the top of the file.",
39 reason="There is no reason they should be imported inside a function.",
40 )
41
42 USELESS_ASSIGNMENT = Message(
43 id="W0008",
44 name="useless-assignment",
45 message="Useless assignment. Use immediate return instead.",
46 reason="For simplicity and readability",
47 )
48
```
Path: `pylint_plugins/pytest_raises_checker/__init__.py`
Content:
```
1 import astroid
2 from pylint.interfaces import IAstroidChecker
3 from pylint.checkers import BaseChecker
4
5 from pylint_plugins.errors import PYTEST_RAISES_WITHOUT_MATCH, to_msgs
6
7
8 def _is_pytest_raises_call(node: astroid.NodeNG):
9 if not isinstance(node, astroid.Call):
10 return False
11 if not isinstance(node.func, astroid.Attribute) or not isinstance(node.func.expr, astroid.Name):
12 return False
13 return node.func.expr.name == "pytest" and node.func.attrname == "raises"
14
15
16 def _called_with_match(node: astroid.Call):
17 # Note `match` is a keyword-only argument:
18 # https://docs.pytest.org/en/latest/reference/reference.html#pytest.raises
19 return any(k.arg == "match" for k in node.keywords)
20
21
22 def _contains_multiple_statements(raises_with: astroid.With):
23 return len(raises_with.body) > 1
24
25
26 class PytestRaisesChecker(BaseChecker):
27 __implements__ = IAstroidChecker
28
29 name = "pytest-raises-checker"
30 msgs = to_msgs(PYTEST_RAISES_WITHOUT_MATCH)
31 priority = -1
32
33 def visit_call(self, node: astroid.Call):
34 if not _is_pytest_raises_call(node):
35 return
36
37 if not _called_with_match(node):
38 self.add_message(PYTEST_RAISES_WITHOUT_MATCH.name, node=node)
39
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pylint_plugins/__init__.py b/pylint_plugins/__init__.py
--- a/pylint_plugins/__init__.py
+++ b/pylint_plugins/__init__.py
@@ -1,11 +1,9 @@
-from pylint_plugins.pytest_raises_checker import PytestRaisesChecker
from pylint_plugins.unittest_assert_raises import UnittestAssertRaises
from pylint_plugins.import_checker import ImportChecker
from pylint_plugins.assign_checker import AssignChecker
def register(linter):
- linter.register_checker(PytestRaisesChecker(linter))
linter.register_checker(UnittestAssertRaises(linter))
linter.register_checker(ImportChecker(linter))
linter.register_checker(AssignChecker(linter))
diff --git a/pylint_plugins/errors.py b/pylint_plugins/errors.py
--- a/pylint_plugins/errors.py
+++ b/pylint_plugins/errors.py
@@ -16,14 +16,6 @@
return reduce(lambda x, y: {**x, **y.to_dict()}, messages, {})
-PYTEST_RAISES_WITHOUT_MATCH = Message(
- id="W0001",
- name="pytest-raises-without-match",
- message="`pytest.raises` must be called with `match` argument`.",
- reason="`pytest.raises` without `match` argument can lead to false positives.",
-)
-
-
UNITTEST_PYTEST_RAISES = Message(
id="W0003",
name="unittest-assert-raises",
diff --git a/pylint_plugins/pytest_raises_checker/__init__.py b/pylint_plugins/pytest_raises_checker/__init__.py
deleted file mode 100644
--- a/pylint_plugins/pytest_raises_checker/__init__.py
+++ /dev/null
@@ -1,38 +0,0 @@
-import astroid
-from pylint.interfaces import IAstroidChecker
-from pylint.checkers import BaseChecker
-
-from pylint_plugins.errors import PYTEST_RAISES_WITHOUT_MATCH, to_msgs
-
-
-def _is_pytest_raises_call(node: astroid.NodeNG):
- if not isinstance(node, astroid.Call):
- return False
- if not isinstance(node.func, astroid.Attribute) or not isinstance(node.func.expr, astroid.Name):
- return False
- return node.func.expr.name == "pytest" and node.func.attrname == "raises"
-
-
-def _called_with_match(node: astroid.Call):
- # Note `match` is a keyword-only argument:
- # https://docs.pytest.org/en/latest/reference/reference.html#pytest.raises
- return any(k.arg == "match" for k in node.keywords)
-
-
-def _contains_multiple_statements(raises_with: astroid.With):
- return len(raises_with.body) > 1
-
-
-class PytestRaisesChecker(BaseChecker):
- __implements__ = IAstroidChecker
-
- name = "pytest-raises-checker"
- msgs = to_msgs(PYTEST_RAISES_WITHOUT_MATCH)
- priority = -1
-
- def visit_call(self, node: astroid.Call):
- if not _is_pytest_raises_call(node):
- return
-
- if not _called_with_match(node):
- self.add_message(PYTEST_RAISES_WITHOUT_MATCH.name, node=node)
|
{"golden_diff": "diff --git a/pylint_plugins/__init__.py b/pylint_plugins/__init__.py\n--- a/pylint_plugins/__init__.py\n+++ b/pylint_plugins/__init__.py\n@@ -1,11 +1,9 @@\n-from pylint_plugins.pytest_raises_checker import PytestRaisesChecker\n from pylint_plugins.unittest_assert_raises import UnittestAssertRaises\n from pylint_plugins.import_checker import ImportChecker\n from pylint_plugins.assign_checker import AssignChecker\n \n \n def register(linter):\n- linter.register_checker(PytestRaisesChecker(linter))\n linter.register_checker(UnittestAssertRaises(linter))\n linter.register_checker(ImportChecker(linter))\n linter.register_checker(AssignChecker(linter))\ndiff --git a/pylint_plugins/errors.py b/pylint_plugins/errors.py\n--- a/pylint_plugins/errors.py\n+++ b/pylint_plugins/errors.py\n@@ -16,14 +16,6 @@\n return reduce(lambda x, y: {**x, **y.to_dict()}, messages, {})\n \n \n-PYTEST_RAISES_WITHOUT_MATCH = Message(\n- id=\"W0001\",\n- name=\"pytest-raises-without-match\",\n- message=\"`pytest.raises` must be called with `match` argument`.\",\n- reason=\"`pytest.raises` without `match` argument can lead to false positives.\",\n-)\n-\n-\n UNITTEST_PYTEST_RAISES = Message(\n id=\"W0003\",\n name=\"unittest-assert-raises\",\ndiff --git a/pylint_plugins/pytest_raises_checker/__init__.py b/pylint_plugins/pytest_raises_checker/__init__.py\ndeleted file mode 100644\n--- a/pylint_plugins/pytest_raises_checker/__init__.py\n+++ /dev/null\n@@ -1,38 +0,0 @@\n-import astroid\n-from pylint.interfaces import IAstroidChecker\n-from pylint.checkers import BaseChecker\n-\n-from pylint_plugins.errors import PYTEST_RAISES_WITHOUT_MATCH, to_msgs\n-\n-\n-def _is_pytest_raises_call(node: astroid.NodeNG):\n- if not isinstance(node, astroid.Call):\n- return False\n- if not isinstance(node.func, astroid.Attribute) or not isinstance(node.func.expr, astroid.Name):\n- return False\n- return node.func.expr.name == \"pytest\" and node.func.attrname == \"raises\"\n-\n-\n-def _called_with_match(node: astroid.Call):\n- # Note `match` is a keyword-only argument:\n- # https://docs.pytest.org/en/latest/reference/reference.html#pytest.raises\n- return any(k.arg == \"match\" for k in node.keywords)\n-\n-\n-def _contains_multiple_statements(raises_with: astroid.With):\n- return len(raises_with.body) > 1\n-\n-\n-class PytestRaisesChecker(BaseChecker):\n- __implements__ = IAstroidChecker\n-\n- name = \"pytest-raises-checker\"\n- msgs = to_msgs(PYTEST_RAISES_WITHOUT_MATCH)\n- priority = -1\n-\n- def visit_call(self, node: astroid.Call):\n- if not _is_pytest_raises_call(node):\n- return\n-\n- if not _called_with_match(node):\n- self.add_message(PYTEST_RAISES_WITHOUT_MATCH.name, node=node)\n", "issue": "Enable `PT011`\n### Summary\r\n\r\nEnable https://beta.ruff.rs/docs/rules/pytest-raises-too-broad.\r\n\r\n```diff\r\ndiff --git a/pyproject.toml b/pyproject.toml\r\nindex c373b48ca..8b7810c04 100644\r\n--- a/pyproject.toml\r\n+++ b/pyproject.toml\r\n@@ -25,6 +25,7 @@ select = [\r\n \"PT009\",\r\n \"PT010\",\r\n \"PT012\",\r\n+ \"PT011\",\r\n \"PT013\",\r\n \"PT018\",\r\n \"PT022\",\r\n@@ -72,6 +73,7 @@ extend-exclude = [\r\n [tool.ruff.flake8-pytest-style]\r\n mark-parentheses = false\r\n fixture-parentheses = false\r\n+raises-require-match-for = [\"*\"]\r\n \r\n [tool.ruff.flake8-tidy-imports]\r\n ban-relative-imports = \"all\"\r\n```\r\n\r\n- `raises-require-match-for = [\"*\"]` means all errors require `match`.\r\n\r\n### Notes\r\n\r\n- Make sure to open a PR from a **non-master** branch.\r\n- Sign off the commit using the `-s` 
flag when making a commit:\r\n\r\n ```sh\r\n git commit -s -m \"...\"\r\n # ^^ make sure to use this\r\n ```\r\n\r\n- Include `#{issue_number}` (e.g. `#123`) in the PR description when opening a PR.\r\n\n", "before_files": [{"content": "from pylint_plugins.pytest_raises_checker import PytestRaisesChecker\nfrom pylint_plugins.unittest_assert_raises import UnittestAssertRaises\nfrom pylint_plugins.import_checker import ImportChecker\nfrom pylint_plugins.assign_checker import AssignChecker\n\n\ndef register(linter):\n linter.register_checker(PytestRaisesChecker(linter))\n linter.register_checker(UnittestAssertRaises(linter))\n linter.register_checker(ImportChecker(linter))\n linter.register_checker(AssignChecker(linter))\n", "path": "pylint_plugins/__init__.py"}, {"content": "from typing import NamedTuple, Dict, Tuple\nfrom functools import reduce\n\n\nclass Message(NamedTuple):\n id: str\n name: str\n message: str\n reason: str\n\n def to_dict(self) -> Dict[str, Tuple[str, str, str]]:\n return {self.id: (self.message, self.name, self.reason)}\n\n\ndef to_msgs(*messages: Message) -> Dict[str, Tuple[str, str, str]]:\n return reduce(lambda x, y: {**x, **y.to_dict()}, messages, {})\n\n\nPYTEST_RAISES_WITHOUT_MATCH = Message(\n id=\"W0001\",\n name=\"pytest-raises-without-match\",\n message=\"`pytest.raises` must be called with `match` argument`.\",\n reason=\"`pytest.raises` without `match` argument can lead to false positives.\",\n)\n\n\nUNITTEST_PYTEST_RAISES = Message(\n id=\"W0003\",\n name=\"unittest-assert-raises\",\n message=\"Use `pytest.raises` instead of `unittest.TestCase.assertRaises`.\",\n reason=\"To enforce 'pytest-raises-multiple-statements' Message.\",\n)\n\n\nLAZY_BUILTIN_IMPORT = Message(\n id=\"W0007\",\n name=\"lazy-builtin-import\",\n message=\"Import built-in module(s) (%s) at the top of the file.\",\n reason=\"There is no reason they should be imported inside a function.\",\n)\n\nUSELESS_ASSIGNMENT = Message(\n id=\"W0008\",\n name=\"useless-assignment\",\n message=\"Useless assignment. 
Use immediate return instead.\",\n reason=\"For simplicity and readability\",\n)\n", "path": "pylint_plugins/errors.py"}, {"content": "import astroid\nfrom pylint.interfaces import IAstroidChecker\nfrom pylint.checkers import BaseChecker\n\nfrom pylint_plugins.errors import PYTEST_RAISES_WITHOUT_MATCH, to_msgs\n\n\ndef _is_pytest_raises_call(node: astroid.NodeNG):\n if not isinstance(node, astroid.Call):\n return False\n if not isinstance(node.func, astroid.Attribute) or not isinstance(node.func.expr, astroid.Name):\n return False\n return node.func.expr.name == \"pytest\" and node.func.attrname == \"raises\"\n\n\ndef _called_with_match(node: astroid.Call):\n # Note `match` is a keyword-only argument:\n # https://docs.pytest.org/en/latest/reference/reference.html#pytest.raises\n return any(k.arg == \"match\" for k in node.keywords)\n\n\ndef _contains_multiple_statements(raises_with: astroid.With):\n return len(raises_with.body) > 1\n\n\nclass PytestRaisesChecker(BaseChecker):\n __implements__ = IAstroidChecker\n\n name = \"pytest-raises-checker\"\n msgs = to_msgs(PYTEST_RAISES_WITHOUT_MATCH)\n priority = -1\n\n def visit_call(self, node: astroid.Call):\n if not _is_pytest_raises_call(node):\n return\n\n if not _called_with_match(node):\n self.add_message(PYTEST_RAISES_WITHOUT_MATCH.name, node=node)\n", "path": "pylint_plugins/pytest_raises_checker/__init__.py"}], "after_files": [{"content": "from pylint_plugins.unittest_assert_raises import UnittestAssertRaises\nfrom pylint_plugins.import_checker import ImportChecker\nfrom pylint_plugins.assign_checker import AssignChecker\n\n\ndef register(linter):\n linter.register_checker(UnittestAssertRaises(linter))\n linter.register_checker(ImportChecker(linter))\n linter.register_checker(AssignChecker(linter))\n", "path": "pylint_plugins/__init__.py"}, {"content": "from typing import NamedTuple, Dict, Tuple\nfrom functools import reduce\n\n\nclass Message(NamedTuple):\n id: str\n name: str\n message: str\n reason: str\n\n def to_dict(self) -> Dict[str, Tuple[str, str, str]]:\n return {self.id: (self.message, self.name, self.reason)}\n\n\ndef to_msgs(*messages: Message) -> Dict[str, Tuple[str, str, str]]:\n return reduce(lambda x, y: {**x, **y.to_dict()}, messages, {})\n\n\nUNITTEST_PYTEST_RAISES = Message(\n id=\"W0003\",\n name=\"unittest-assert-raises\",\n message=\"Use `pytest.raises` instead of `unittest.TestCase.assertRaises`.\",\n reason=\"To enforce 'pytest-raises-multiple-statements' Message.\",\n)\n\n\nLAZY_BUILTIN_IMPORT = Message(\n id=\"W0007\",\n name=\"lazy-builtin-import\",\n message=\"Import built-in module(s) (%s) at the top of the file.\",\n reason=\"There is no reason they should be imported inside a function.\",\n)\n\nUSELESS_ASSIGNMENT = Message(\n id=\"W0008\",\n name=\"useless-assignment\",\n message=\"Useless assignment. Use immediate return instead.\",\n reason=\"For simplicity and readability\",\n)\n", "path": "pylint_plugins/errors.py"}, {"content": null, "path": "pylint_plugins/pytest_raises_checker/__init__.py"}]}
| 1,564 | 710 |
gh_patches_debug_12537
|
rasdani/github-patches
|
git_diff
|
Parsl__parsl-3238
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
parsl.providers.cluster_provider _write_submit_script should return nothing, rather than constant True
**Describe the bug**
_write_submit_script in parsl.providers.cluster_provider indicates failure by raising an exception, and so should not be returning a True (or False) value. Instead it should return None by either `return` on its own, or falling off the end of the method without a return statement.
To tidy this up, change that return handling. Edit the docstring to match.
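A minimal sketch of the intended contract (my illustration, error handling elided; the real method with its exception handling appears in the file below):

```python
def _write_submit_script(self, template, script_filename, job_name, configs):
    """Generate submit script and write it to a file.

    Returns None; problems are reported by raising SchedulerMissingArgs
    or ScriptPathError rather than by returning False.
    """
    submit_script = Template(template).substitute(jobname=job_name, **configs)
    with open(script_filename, 'w') as f:
        f.write(submit_script)
    # fall off the end: the method implicitly returns None
```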
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `parsl/providers/cluster_provider.py`
Content:
```
1 import logging
2 from abc import abstractmethod
3 from string import Template
4
5 from parsl.providers.errors import SchedulerMissingArgs, ScriptPathError
6 from parsl.launchers.base import Launcher
7 from parsl.launchers.errors import BadLauncher
8 from parsl.providers.base import ExecutionProvider
9
10 logger = logging.getLogger(__name__)
11
12
13 class ClusterProvider(ExecutionProvider):
14 """ This class defines behavior common to all cluster/supercompute-style scheduler systems.
15
16 Parameters
17 ----------
18 label : str
19 Label for this provider.
20 channel : Channel
21 Channel for accessing this provider. Possible channels include
22 :class:`~parsl.channels.LocalChannel` (the default),
23 :class:`~parsl.channels.SSHChannel`, or
24 :class:`~parsl.channels.SSHInteractiveLoginChannel`.
25 walltime : str
26 Walltime requested per block in HH:MM:SS.
27 launcher : Launcher
28 Launcher for this provider.
29 cmd_timeout : int
30 Timeout for commands made to the scheduler in seconds
31
32 .. code:: python
33
34 +------------------
35 |
36 script_string ------->| submit
37 id <--------|---+
38 |
39 [ ids ] ------->| status
40 [statuses] <--------|----+
41 |
42 [ ids ] ------->| cancel
43 [cancel] <--------|----+
44 |
45 +-------------------
46 """
47
48 def __init__(self,
49 label,
50 channel,
51 nodes_per_block,
52 init_blocks,
53 min_blocks,
54 max_blocks,
55 parallelism,
56 walltime,
57 launcher,
58 cmd_timeout=10):
59
60 self._label = label
61 self.channel = channel
62 self.nodes_per_block = nodes_per_block
63 self.init_blocks = init_blocks
64 self.min_blocks = min_blocks
65 self.max_blocks = max_blocks
66 self.parallelism = parallelism
67 self.launcher = launcher
68 self.walltime = walltime
69 self.cmd_timeout = cmd_timeout
70 if not isinstance(self.launcher, Launcher):
71 raise BadLauncher(self.launcher)
72
73 self.script_dir = None
74
75 # Dictionary that keeps track of jobs, keyed on job_id
76 self.resources = {}
77
78 def execute_wait(self, cmd, timeout=None):
79 t = self.cmd_timeout
80 if timeout is not None:
81 t = timeout
82 return self.channel.execute_wait(cmd, t)
83
84 def _write_submit_script(self, template, script_filename, job_name, configs):
85 """Generate submit script and write it to a file.
86
87 Args:
88 - template (string) : The template string to be used for the writing submit script
89 - script_filename (string) : Name of the submit script
90 - job_name (string) : job name
91 - configs (dict) : configs that get pushed into the template
92
93 Returns:
94 - True: on success
95
96 Raises:
97 SchedulerMissingArgs : If template is missing args
98 ScriptPathError : Unable to write submit script out
99 """
100
101 try:
102 submit_script = Template(template).substitute(jobname=job_name, **configs)
103 with open(script_filename, 'w') as f:
104 f.write(submit_script)
105
106 except KeyError as e:
107 logger.error("Missing keys for submit script : %s", e)
108 raise SchedulerMissingArgs(e.args, self.label)
109
110 except IOError as e:
111 logger.error("Failed writing to submit script: %s", script_filename)
112 raise ScriptPathError(script_filename, e)
113 except Exception as e:
114 print("Template : ", template)
115 print("Args : ", job_name)
116 print("Kwargs : ", configs)
117 logger.error("Uncategorized error: %s", e)
118 raise e
119
120 return True
121
122 @abstractmethod
123 def _status(self):
124 pass
125
126 def status(self, job_ids):
127 """ Get the status of a list of jobs identified by the job identifiers
128 returned from the submit request.
129
130 Args:
131 - job_ids (list) : A list of job identifiers
132
133 Returns:
134 - A list of JobStatus objects corresponding to each job_id in the job_ids list.
135
136 Raises:
137 - ExecutionProviderException or its subclasses
138
139 """
140 if job_ids:
141 self._status()
142 return [self.resources[jid]['status'] for jid in job_ids]
143
144 @property
145 def label(self):
146 return self._label
147
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/parsl/providers/cluster_provider.py b/parsl/providers/cluster_provider.py
--- a/parsl/providers/cluster_provider.py
+++ b/parsl/providers/cluster_provider.py
@@ -91,7 +91,7 @@
- configs (dict) : configs that get pushed into the template
Returns:
- - True: on success
+ - None
Raises:
SchedulerMissingArgs : If template is missing args
@@ -117,8 +117,6 @@
logger.error("Uncategorized error: %s", e)
raise e
- return True
-
@abstractmethod
def _status(self):
pass
|
{"golden_diff": "diff --git a/parsl/providers/cluster_provider.py b/parsl/providers/cluster_provider.py\n--- a/parsl/providers/cluster_provider.py\n+++ b/parsl/providers/cluster_provider.py\n@@ -91,7 +91,7 @@\n - configs (dict) : configs that get pushed into the template\n \n Returns:\n- - True: on success\n+ - None\n \n Raises:\n SchedulerMissingArgs : If template is missing args\n@@ -117,8 +117,6 @@\n logger.error(\"Uncategorized error: %s\", e)\n raise e\n \n- return True\n-\n @abstractmethod\n def _status(self):\n pass\n", "issue": "parsl.providers.cluster_provider _write_submit_script should return nothing, rather than constant True\n**Describe the bug**\r\n\r\n_write_submit_script in parsl.providers.cluster_provider indicates failure by raising an exception, and so should not be returning a True (or False) value. Instead it should return None by either `return` on its own, or falling off the end of the method without a return statement.\r\n\r\nTo tidy this up, change that return handling. Edit the docstring to match.\r\n\r\n\n", "before_files": [{"content": "import logging\nfrom abc import abstractmethod\nfrom string import Template\n\nfrom parsl.providers.errors import SchedulerMissingArgs, ScriptPathError\nfrom parsl.launchers.base import Launcher\nfrom parsl.launchers.errors import BadLauncher\nfrom parsl.providers.base import ExecutionProvider\n\nlogger = logging.getLogger(__name__)\n\n\nclass ClusterProvider(ExecutionProvider):\n \"\"\" This class defines behavior common to all cluster/supercompute-style scheduler systems.\n\n Parameters\n ----------\n label : str\n Label for this provider.\n channel : Channel\n Channel for accessing this provider. Possible channels include\n :class:`~parsl.channels.LocalChannel` (the default),\n :class:`~parsl.channels.SSHChannel`, or\n :class:`~parsl.channels.SSHInteractiveLoginChannel`.\n walltime : str\n Walltime requested per block in HH:MM:SS.\n launcher : Launcher\n Launcher for this provider.\n cmd_timeout : int\n Timeout for commands made to the scheduler in seconds\n\n .. 
code:: python\n\n +------------------\n |\n script_string ------->| submit\n id <--------|---+\n |\n [ ids ] ------->| status\n [statuses] <--------|----+\n |\n [ ids ] ------->| cancel\n [cancel] <--------|----+\n |\n +-------------------\n \"\"\"\n\n def __init__(self,\n label,\n channel,\n nodes_per_block,\n init_blocks,\n min_blocks,\n max_blocks,\n parallelism,\n walltime,\n launcher,\n cmd_timeout=10):\n\n self._label = label\n self.channel = channel\n self.nodes_per_block = nodes_per_block\n self.init_blocks = init_blocks\n self.min_blocks = min_blocks\n self.max_blocks = max_blocks\n self.parallelism = parallelism\n self.launcher = launcher\n self.walltime = walltime\n self.cmd_timeout = cmd_timeout\n if not isinstance(self.launcher, Launcher):\n raise BadLauncher(self.launcher)\n\n self.script_dir = None\n\n # Dictionary that keeps track of jobs, keyed on job_id\n self.resources = {}\n\n def execute_wait(self, cmd, timeout=None):\n t = self.cmd_timeout\n if timeout is not None:\n t = timeout\n return self.channel.execute_wait(cmd, t)\n\n def _write_submit_script(self, template, script_filename, job_name, configs):\n \"\"\"Generate submit script and write it to a file.\n\n Args:\n - template (string) : The template string to be used for the writing submit script\n - script_filename (string) : Name of the submit script\n - job_name (string) : job name\n - configs (dict) : configs that get pushed into the template\n\n Returns:\n - True: on success\n\n Raises:\n SchedulerMissingArgs : If template is missing args\n ScriptPathError : Unable to write submit script out\n \"\"\"\n\n try:\n submit_script = Template(template).substitute(jobname=job_name, **configs)\n with open(script_filename, 'w') as f:\n f.write(submit_script)\n\n except KeyError as e:\n logger.error(\"Missing keys for submit script : %s\", e)\n raise SchedulerMissingArgs(e.args, self.label)\n\n except IOError as e:\n logger.error(\"Failed writing to submit script: %s\", script_filename)\n raise ScriptPathError(script_filename, e)\n except Exception as e:\n print(\"Template : \", template)\n print(\"Args : \", job_name)\n print(\"Kwargs : \", configs)\n logger.error(\"Uncategorized error: %s\", e)\n raise e\n\n return True\n\n @abstractmethod\n def _status(self):\n pass\n\n def status(self, job_ids):\n \"\"\" Get the status of a list of jobs identified by the job identifiers\n returned from the submit request.\n\n Args:\n - job_ids (list) : A list of job identifiers\n\n Returns:\n - A list of JobStatus objects corresponding to each job_id in the job_ids list.\n\n Raises:\n - ExecutionProviderException or its subclasses\n\n \"\"\"\n if job_ids:\n self._status()\n return [self.resources[jid]['status'] for jid in job_ids]\n\n @property\n def label(self):\n return self._label\n", "path": "parsl/providers/cluster_provider.py"}], "after_files": [{"content": "import logging\nfrom abc import abstractmethod\nfrom string import Template\n\nfrom parsl.providers.errors import SchedulerMissingArgs, ScriptPathError\nfrom parsl.launchers.base import Launcher\nfrom parsl.launchers.errors import BadLauncher\nfrom parsl.providers.base import ExecutionProvider\n\nlogger = logging.getLogger(__name__)\n\n\nclass ClusterProvider(ExecutionProvider):\n \"\"\" This class defines behavior common to all cluster/supercompute-style scheduler systems.\n\n Parameters\n ----------\n label : str\n Label for this provider.\n channel : Channel\n Channel for accessing this provider. 
Possible channels include\n :class:`~parsl.channels.LocalChannel` (the default),\n :class:`~parsl.channels.SSHChannel`, or\n :class:`~parsl.channels.SSHInteractiveLoginChannel`.\n walltime : str\n Walltime requested per block in HH:MM:SS.\n launcher : Launcher\n Launcher for this provider.\n cmd_timeout : int\n Timeout for commands made to the scheduler in seconds\n\n .. code:: python\n\n +------------------\n |\n script_string ------->| submit\n id <--------|---+\n |\n [ ids ] ------->| status\n [statuses] <--------|----+\n |\n [ ids ] ------->| cancel\n [cancel] <--------|----+\n |\n +-------------------\n \"\"\"\n\n def __init__(self,\n label,\n channel,\n nodes_per_block,\n init_blocks,\n min_blocks,\n max_blocks,\n parallelism,\n walltime,\n launcher,\n cmd_timeout=10):\n\n self._label = label\n self.channel = channel\n self.nodes_per_block = nodes_per_block\n self.init_blocks = init_blocks\n self.min_blocks = min_blocks\n self.max_blocks = max_blocks\n self.parallelism = parallelism\n self.launcher = launcher\n self.walltime = walltime\n self.cmd_timeout = cmd_timeout\n if not isinstance(self.launcher, Launcher):\n raise BadLauncher(self.launcher)\n\n self.script_dir = None\n\n # Dictionary that keeps track of jobs, keyed on job_id\n self.resources = {}\n\n def execute_wait(self, cmd, timeout=None):\n t = self.cmd_timeout\n if timeout is not None:\n t = timeout\n return self.channel.execute_wait(cmd, t)\n\n def _write_submit_script(self, template, script_filename, job_name, configs):\n \"\"\"Generate submit script and write it to a file.\n\n Args:\n - template (string) : The template string to be used for the writing submit script\n - script_filename (string) : Name of the submit script\n - job_name (string) : job name\n - configs (dict) : configs that get pushed into the template\n\n Returns:\n - None\n\n Raises:\n SchedulerMissingArgs : If template is missing args\n ScriptPathError : Unable to write submit script out\n \"\"\"\n\n try:\n submit_script = Template(template).substitute(jobname=job_name, **configs)\n with open(script_filename, 'w') as f:\n f.write(submit_script)\n\n except KeyError as e:\n logger.error(\"Missing keys for submit script : %s\", e)\n raise SchedulerMissingArgs(e.args, self.label)\n\n except IOError as e:\n logger.error(\"Failed writing to submit script: %s\", script_filename)\n raise ScriptPathError(script_filename, e)\n except Exception as e:\n print(\"Template : \", template)\n print(\"Args : \", job_name)\n print(\"Kwargs : \", configs)\n logger.error(\"Uncategorized error: %s\", e)\n raise e\n\n @abstractmethod\n def _status(self):\n pass\n\n def status(self, job_ids):\n \"\"\" Get the status of a list of jobs identified by the job identifiers\n returned from the submit request.\n\n Args:\n - job_ids (list) : A list of job identifiers\n\n Returns:\n - A list of JobStatus objects corresponding to each job_id in the job_ids list.\n\n Raises:\n - ExecutionProviderException or its subclasses\n\n \"\"\"\n if job_ids:\n self._status()\n return [self.resources[jid]['status'] for jid in job_ids]\n\n @property\n def label(self):\n return self._label\n", "path": "parsl/providers/cluster_provider.py"}]}
| 1,646 | 151 |
gh_patches_debug_28089
|
rasdani/github-patches
|
git_diff
|
open-mmlab__mmdetection-1603
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
NMS fails on any non-default GPU.
I'm creating an issue that corresponds to the problem that came up in #1603. I'm reposting the body of that issue here:
I get an error when I try to run NMS code on any GPU except 0.
The issue is that I get RuntimeError: cuda runtime error (700) : an illegal memory access was encountered at mmdet/ops/nms/src/nms_kernel.cu:103 when I try to run NMS with a Tensor on any device except CPU or 0. The error happens on this line:
THCudaCheck(cudaMemcpy(&mask_host[0],
mask_dev,
sizeof(unsigned long long) * boxes_num * col_blocks,
cudaMemcpyDeviceToHost));
But I believe the issue is actually here:
THCState *state = at::globalContext().lazyInitCUDA(); // TODO replace with getTHCState
unsigned long long* mask_dev = NULL;
//THCudaCheck(THCudaMalloc(state, (void**) &mask_dev,
// boxes_num * col_blocks * sizeof(unsigned long long)));
mask_dev = (unsigned long long*) THCudaMalloc(state, boxes_num * col_blocks * sizeof(unsigned long long));
My guess is that THCudaMalloc is creating the mask_dev array on device 0 and not the device corresponding to the input at::Tensor boxes. It looks like state might encode which device a new cuda array is allocated on, so my intuition would be to try and grab the state from boxes. However, I'm not a CUDA expert, so I'm probably totally off base for how to use THCState objects. I was attempting to look through the pytorch docs / source to see if I could figure something out, but I'm not having any luck.
Any pointers on how this issue might be handled would be appreciated. Note that if you have two GPUs you can reproduce the error by checking out this PR and running: xdoctest -m tests/test_nms.py test_nms_device_and_dtypes_gpu
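One Python-side way to express the idea (my sketch, not necessarily the fix the maintainers chose): make the input tensor's device current before the kernel runs, so scratch buffers such as `mask_dev` end up on the same GPU as the boxes.

```python
import torch

from mmdet.ops.nms import nms_cuda  # the compiled extension used by nms_wrapper.py


def nms_on_input_device(dets_th, iou_thr):
    # dets_th may live on cuda:1, cuda:2, ...; switching the current device
    # keeps the kernel's scratch allocation on the same GPU as the boxes.
    with torch.cuda.device(dets_th.device):
        return nms_cuda.nms(dets_th, iou_thr)
```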
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mmdet/ops/nms/nms_wrapper.py`
Content:
```
1 import numpy as np
2 import torch
3
4 from . import nms_cpu, nms_cuda
5 from .soft_nms_cpu import soft_nms_cpu
6
7
8 def nms(dets, iou_thr, device_id=None):
9 """Dispatch to either CPU or GPU NMS implementations.
10
11 The input can be either a torch tensor or numpy array. GPU NMS will be used
12 if the input is a gpu tensor or device_id is specified, otherwise CPU NMS
13 will be used. The returned type will always be the same as inputs.
14
15 Arguments:
16 dets (torch.Tensor or np.ndarray): bboxes with scores.
17 iou_thr (float): IoU threshold for NMS.
18 device_id (int, optional): when `dets` is a numpy array, if `device_id`
19 is None, then cpu nms is used, otherwise gpu_nms will be used.
20
21 Returns:
22 tuple: kept bboxes and indice, which is always the same data type as
23 the input.
24 """
25 # convert dets (tensor or numpy array) to tensor
26 if isinstance(dets, torch.Tensor):
27 is_numpy = False
28 dets_th = dets
29 elif isinstance(dets, np.ndarray):
30 is_numpy = True
31 device = 'cpu' if device_id is None else 'cuda:{}'.format(device_id)
32 dets_th = torch.from_numpy(dets).to(device)
33 else:
34 raise TypeError(
35 'dets must be either a Tensor or numpy array, but got {}'.format(
36 type(dets)))
37
38 # execute cpu or cuda nms
39 if dets_th.shape[0] == 0:
40 inds = dets_th.new_zeros(0, dtype=torch.long)
41 else:
42 if dets_th.is_cuda:
43 inds = nms_cuda.nms(dets_th, iou_thr)
44 else:
45 inds = nms_cpu.nms(dets_th, iou_thr)
46
47 if is_numpy:
48 inds = inds.cpu().numpy()
49 return dets[inds, :], inds
50
51
52 def soft_nms(dets, iou_thr, method='linear', sigma=0.5, min_score=1e-3):
53 if isinstance(dets, torch.Tensor):
54 is_tensor = True
55 dets_np = dets.detach().cpu().numpy()
56 elif isinstance(dets, np.ndarray):
57 is_tensor = False
58 dets_np = dets
59 else:
60 raise TypeError(
61 'dets must be either a Tensor or numpy array, but got {}'.format(
62 type(dets)))
63
64 method_codes = {'linear': 1, 'gaussian': 2}
65 if method not in method_codes:
66 raise ValueError('Invalid method for SoftNMS: {}'.format(method))
67 new_dets, inds = soft_nms_cpu(
68 dets_np,
69 iou_thr,
70 method=method_codes[method],
71 sigma=sigma,
72 min_score=min_score)
73
74 if is_tensor:
75 return dets.new_tensor(new_dets), dets.new_tensor(
76 inds, dtype=torch.long)
77 else:
78 return new_dets.astype(np.float32), inds.astype(np.int64)
79
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mmdet/ops/nms/nms_wrapper.py b/mmdet/ops/nms/nms_wrapper.py
--- a/mmdet/ops/nms/nms_wrapper.py
+++ b/mmdet/ops/nms/nms_wrapper.py
@@ -21,6 +21,18 @@
Returns:
tuple: kept bboxes and indice, which is always the same data type as
the input.
+
+ Example:
+ >>> dets = np.array([[49.1, 32.4, 51.0, 35.9, 0.9],
+ >>> [49.3, 32.9, 51.0, 35.3, 0.9],
+ >>> [49.2, 31.8, 51.0, 35.4, 0.5],
+ >>> [35.1, 11.5, 39.1, 15.7, 0.5],
+ >>> [35.6, 11.8, 39.3, 14.2, 0.5],
+ >>> [35.3, 11.5, 39.9, 14.5, 0.4],
+ >>> [35.2, 11.7, 39.7, 15.7, 0.3]], dtype=np.float32)
+ >>> iou_thr = 0.7
+ >>> supressed, inds = nms(dets, iou_thr)
+ >>> assert len(inds) == len(supressed) == 3
"""
# convert dets (tensor or numpy array) to tensor
if isinstance(dets, torch.Tensor):
@@ -50,6 +62,18 @@
def soft_nms(dets, iou_thr, method='linear', sigma=0.5, min_score=1e-3):
+ """
+ Example:
+ >>> dets = np.array([[4., 3., 5., 3., 0.9],
+ >>> [4., 3., 5., 4., 0.9],
+ >>> [3., 1., 3., 1., 0.5],
+ >>> [3., 1., 3., 1., 0.5],
+ >>> [3., 1., 3., 1., 0.4],
+ >>> [3., 1., 3., 1., 0.0]], dtype=np.float32)
+ >>> iou_thr = 0.7
+ >>> supressed, inds = soft_nms(dets, iou_thr, sigma=0.5)
+ >>> assert len(inds) == len(supressed) == 3
+ """
if isinstance(dets, torch.Tensor):
is_tensor = True
dets_np = dets.detach().cpu().numpy()
|
{"golden_diff": "diff --git a/mmdet/ops/nms/nms_wrapper.py b/mmdet/ops/nms/nms_wrapper.py\n--- a/mmdet/ops/nms/nms_wrapper.py\n+++ b/mmdet/ops/nms/nms_wrapper.py\n@@ -21,6 +21,18 @@\n Returns:\n tuple: kept bboxes and indice, which is always the same data type as\n the input.\n+\n+ Example:\n+ >>> dets = np.array([[49.1, 32.4, 51.0, 35.9, 0.9],\n+ >>> [49.3, 32.9, 51.0, 35.3, 0.9],\n+ >>> [49.2, 31.8, 51.0, 35.4, 0.5],\n+ >>> [35.1, 11.5, 39.1, 15.7, 0.5],\n+ >>> [35.6, 11.8, 39.3, 14.2, 0.5],\n+ >>> [35.3, 11.5, 39.9, 14.5, 0.4],\n+ >>> [35.2, 11.7, 39.7, 15.7, 0.3]], dtype=np.float32)\n+ >>> iou_thr = 0.7\n+ >>> supressed, inds = nms(dets, iou_thr)\n+ >>> assert len(inds) == len(supressed) == 3\n \"\"\"\n # convert dets (tensor or numpy array) to tensor\n if isinstance(dets, torch.Tensor):\n@@ -50,6 +62,18 @@\n \n \n def soft_nms(dets, iou_thr, method='linear', sigma=0.5, min_score=1e-3):\n+ \"\"\"\n+ Example:\n+ >>> dets = np.array([[4., 3., 5., 3., 0.9],\n+ >>> [4., 3., 5., 4., 0.9],\n+ >>> [3., 1., 3., 1., 0.5],\n+ >>> [3., 1., 3., 1., 0.5],\n+ >>> [3., 1., 3., 1., 0.4],\n+ >>> [3., 1., 3., 1., 0.0]], dtype=np.float32)\n+ >>> iou_thr = 0.7\n+ >>> supressed, inds = soft_nms(dets, iou_thr, sigma=0.5)\n+ >>> assert len(inds) == len(supressed) == 3\n+ \"\"\"\n if isinstance(dets, torch.Tensor):\n is_tensor = True\n dets_np = dets.detach().cpu().numpy()\n", "issue": "NMS fails on any non-default GPU. \nI'm creating an issue that corresponds to the problem that came up in #1603. I'm reposting the body of that issue here:\r\n\r\nI get an error when I try to run NMS code on any GPU except 0. \r\n\r\nThe issue is that I get RuntimeError: cuda runtime error (700) : an illegal memory access was encountered at mmdet/ops/nms/src/nms_kernel.cu:103 when I try to run NMS with a Tensor on any device except CPU or 0. The error happens on this line:\r\n\r\n THCudaCheck(cudaMemcpy(&mask_host[0],\r\n mask_dev,\r\n sizeof(unsigned long long) * boxes_num * col_blocks,\r\n cudaMemcpyDeviceToHost));\r\nBut I believe the issue is actually here:\r\n\r\n THCState *state = at::globalContext().lazyInitCUDA(); // TODO replace with getTHCState\r\n\r\n unsigned long long* mask_dev = NULL;\r\n //THCudaCheck(THCudaMalloc(state, (void**) &mask_dev,\r\n // boxes_num * col_blocks * sizeof(unsigned long long)));\r\n\r\n mask_dev = (unsigned long long*) THCudaMalloc(state, boxes_num * col_blocks * sizeof(unsigned long long));\r\nMy guess is that THCudaMalloc is creating the mask_dev array on device 0 and not the device corresponding to the input at::Tensor boxes. It looks like state might encode which device a new cuda array is allocated on, so my intuition would be to try and grab the state from boxes. However, I'm not a CUDA expert, so I'm probably totally off base for how to use THCState objects. I was attempting to look through the pytorch docs / source to see if I could figure something out, but I'm not having any luck.\r\n\r\nAny pointers on how this issue might be handled would be appreciated. Note that if you have two GPUs you can reproduce the error by checking out this PR and running: xdoctest -m tests/test_nms.py test_nms_device_and_dtypes_gpu\r\n\n", "before_files": [{"content": "import numpy as np\nimport torch\n\nfrom . import nms_cpu, nms_cuda\nfrom .soft_nms_cpu import soft_nms_cpu\n\n\ndef nms(dets, iou_thr, device_id=None):\n \"\"\"Dispatch to either CPU or GPU NMS implementations.\n\n The input can be either a torch tensor or numpy array. 
GPU NMS will be used\n if the input is a gpu tensor or device_id is specified, otherwise CPU NMS\n will be used. The returned type will always be the same as inputs.\n\n Arguments:\n dets (torch.Tensor or np.ndarray): bboxes with scores.\n iou_thr (float): IoU threshold for NMS.\n device_id (int, optional): when `dets` is a numpy array, if `device_id`\n is None, then cpu nms is used, otherwise gpu_nms will be used.\n\n Returns:\n tuple: kept bboxes and indice, which is always the same data type as\n the input.\n \"\"\"\n # convert dets (tensor or numpy array) to tensor\n if isinstance(dets, torch.Tensor):\n is_numpy = False\n dets_th = dets\n elif isinstance(dets, np.ndarray):\n is_numpy = True\n device = 'cpu' if device_id is None else 'cuda:{}'.format(device_id)\n dets_th = torch.from_numpy(dets).to(device)\n else:\n raise TypeError(\n 'dets must be either a Tensor or numpy array, but got {}'.format(\n type(dets)))\n\n # execute cpu or cuda nms\n if dets_th.shape[0] == 0:\n inds = dets_th.new_zeros(0, dtype=torch.long)\n else:\n if dets_th.is_cuda:\n inds = nms_cuda.nms(dets_th, iou_thr)\n else:\n inds = nms_cpu.nms(dets_th, iou_thr)\n\n if is_numpy:\n inds = inds.cpu().numpy()\n return dets[inds, :], inds\n\n\ndef soft_nms(dets, iou_thr, method='linear', sigma=0.5, min_score=1e-3):\n if isinstance(dets, torch.Tensor):\n is_tensor = True\n dets_np = dets.detach().cpu().numpy()\n elif isinstance(dets, np.ndarray):\n is_tensor = False\n dets_np = dets\n else:\n raise TypeError(\n 'dets must be either a Tensor or numpy array, but got {}'.format(\n type(dets)))\n\n method_codes = {'linear': 1, 'gaussian': 2}\n if method not in method_codes:\n raise ValueError('Invalid method for SoftNMS: {}'.format(method))\n new_dets, inds = soft_nms_cpu(\n dets_np,\n iou_thr,\n method=method_codes[method],\n sigma=sigma,\n min_score=min_score)\n\n if is_tensor:\n return dets.new_tensor(new_dets), dets.new_tensor(\n inds, dtype=torch.long)\n else:\n return new_dets.astype(np.float32), inds.astype(np.int64)\n", "path": "mmdet/ops/nms/nms_wrapper.py"}], "after_files": [{"content": "import numpy as np\nimport torch\n\nfrom . import nms_cpu, nms_cuda\nfrom .soft_nms_cpu import soft_nms_cpu\n\n\ndef nms(dets, iou_thr, device_id=None):\n \"\"\"Dispatch to either CPU or GPU NMS implementations.\n\n The input can be either a torch tensor or numpy array. GPU NMS will be used\n if the input is a gpu tensor or device_id is specified, otherwise CPU NMS\n will be used. 
The returned type will always be the same as inputs.\n\n Arguments:\n dets (torch.Tensor or np.ndarray): bboxes with scores.\n iou_thr (float): IoU threshold for NMS.\n device_id (int, optional): when `dets` is a numpy array, if `device_id`\n is None, then cpu nms is used, otherwise gpu_nms will be used.\n\n Returns:\n tuple: kept bboxes and indice, which is always the same data type as\n the input.\n\n Example:\n >>> dets = np.array([[49.1, 32.4, 51.0, 35.9, 0.9],\n >>> [49.3, 32.9, 51.0, 35.3, 0.9],\n >>> [49.2, 31.8, 51.0, 35.4, 0.5],\n >>> [35.1, 11.5, 39.1, 15.7, 0.5],\n >>> [35.6, 11.8, 39.3, 14.2, 0.5],\n >>> [35.3, 11.5, 39.9, 14.5, 0.4],\n >>> [35.2, 11.7, 39.7, 15.7, 0.3]], dtype=np.float32)\n >>> iou_thr = 0.7\n >>> supressed, inds = nms(dets, iou_thr)\n >>> assert len(inds) == len(supressed) == 3\n \"\"\"\n # convert dets (tensor or numpy array) to tensor\n if isinstance(dets, torch.Tensor):\n is_numpy = False\n dets_th = dets\n elif isinstance(dets, np.ndarray):\n is_numpy = True\n device = 'cpu' if device_id is None else 'cuda:{}'.format(device_id)\n dets_th = torch.from_numpy(dets).to(device)\n else:\n raise TypeError(\n 'dets must be either a Tensor or numpy array, but got {}'.format(\n type(dets)))\n\n # execute cpu or cuda nms\n if dets_th.shape[0] == 0:\n inds = dets_th.new_zeros(0, dtype=torch.long)\n else:\n if dets_th.is_cuda:\n inds = nms_cuda.nms(dets_th, iou_thr)\n else:\n inds = nms_cpu.nms(dets_th, iou_thr)\n\n if is_numpy:\n inds = inds.cpu().numpy()\n return dets[inds, :], inds\n\n\ndef soft_nms(dets, iou_thr, method='linear', sigma=0.5, min_score=1e-3):\n \"\"\"\n Example:\n >>> dets = np.array([[4., 3., 5., 3., 0.9],\n >>> [4., 3., 5., 4., 0.9],\n >>> [3., 1., 3., 1., 0.5],\n >>> [3., 1., 3., 1., 0.5],\n >>> [3., 1., 3., 1., 0.4],\n >>> [3., 1., 3., 1., 0.0]], dtype=np.float32)\n >>> iou_thr = 0.7\n >>> supressed, inds = soft_nms(dets, iou_thr, sigma=0.5)\n >>> assert len(inds) == len(supressed) == 3\n \"\"\"\n if isinstance(dets, torch.Tensor):\n is_tensor = True\n dets_np = dets.detach().cpu().numpy()\n elif isinstance(dets, np.ndarray):\n is_tensor = False\n dets_np = dets\n else:\n raise TypeError(\n 'dets must be either a Tensor or numpy array, but got {}'.format(\n type(dets)))\n\n method_codes = {'linear': 1, 'gaussian': 2}\n if method not in method_codes:\n raise ValueError('Invalid method for SoftNMS: {}'.format(method))\n new_dets, inds = soft_nms_cpu(\n dets_np,\n iou_thr,\n method=method_codes[method],\n sigma=sigma,\n min_score=min_score)\n\n if is_tensor:\n return dets.new_tensor(new_dets), dets.new_tensor(\n inds, dtype=torch.long)\n else:\n return new_dets.astype(np.float32), inds.astype(np.int64)\n", "path": "mmdet/ops/nms/nms_wrapper.py"}]}
| 1,531 | 683 |
gh_patches_debug_28277
|
rasdani/github-patches
|
git_diff
|
pyqtgraph__pyqtgraph-458
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
export image bug related to type casting
Hi there,
I've got a Qt5-only Python distro installed (WinPython 3.5 Qt5) which includes pyqtgraph 0.10.0. Exporting images from PlotWidgets and ImageViews doesn't work anymore and gives this exception:
`Traceback (most recent call last):
File "C:\WinPython35_Qt5\python-3.5.3.amd64\lib\site-packages\pyqtgraph\exporters\Exporter.py", line 77, in fileSaveFinished
self.export(fileName=fileName, **self.fileDialog.opts)
File "C:\WinPython35_Qt5\python-3.5.3.amd64\lib\site-packages\pyqtgraph\exporters\ImageExporter.py", line 70, in export
bg = np.empty((self.params['width'], self.params['height'], 4), dtype=np.ubyte)
TypeError: 'float' object cannot be interpreted as an integer
QWaitCondition: Destroyed while threads are still waiting`
Didn't happen with WinPython 3.5 Qt4 (pyqtgraph 0.9.10 I think). Am I the only one experiencing this?
Update: simple fix: in ImageExporter.py, line 70:
`bg = np.empty((int(self.params['width']), int(self.params['height']), 4), dtype=np.ubyte)`
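For context (my reproduction sketch, not from the original report): under Qt5 the target rect dimensions come back as floats, and recent NumPy refuses float shapes, which is exactly the `TypeError` in the traceback:

```python
import numpy as np

w, h = 640.0, 480.0  # Qt5 QRectF.width()/height() return floats
try:
    np.empty((w, h, 4), dtype=np.ubyte)
except TypeError as err:
    print(err)  # 'float' object cannot be interpreted as an integer
bg = np.empty((int(w), int(h), 4), dtype=np.ubyte)  # explicit casts fix it
```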
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyqtgraph/exporters/ImageExporter.py`
Content:
```
1 from .Exporter import Exporter
2 from ..parametertree import Parameter
3 from ..Qt import QtGui, QtCore, QtSvg, USE_PYSIDE
4 from .. import functions as fn
5 import numpy as np
6
7 __all__ = ['ImageExporter']
8
9 class ImageExporter(Exporter):
10 Name = "Image File (PNG, TIF, JPG, ...)"
11 allowCopy = True
12
13 def __init__(self, item):
14 Exporter.__init__(self, item)
15 tr = self.getTargetRect()
16 if isinstance(item, QtGui.QGraphicsItem):
17 scene = item.scene()
18 else:
19 scene = item
20 bgbrush = scene.views()[0].backgroundBrush()
21 bg = bgbrush.color()
22 if bgbrush.style() == QtCore.Qt.NoBrush:
23 bg.setAlpha(0)
24
25 self.params = Parameter(name='params', type='group', children=[
26 {'name': 'width', 'type': 'int', 'value': tr.width(), 'limits': (0, None)},
27 {'name': 'height', 'type': 'int', 'value': tr.height(), 'limits': (0, None)},
28 {'name': 'antialias', 'type': 'bool', 'value': True},
29 {'name': 'background', 'type': 'color', 'value': bg},
30 ])
31 self.params.param('width').sigValueChanged.connect(self.widthChanged)
32 self.params.param('height').sigValueChanged.connect(self.heightChanged)
33
34 def widthChanged(self):
35 sr = self.getSourceRect()
36 ar = float(sr.height()) / sr.width()
37 self.params.param('height').setValue(self.params['width'] * ar, blockSignal=self.heightChanged)
38
39 def heightChanged(self):
40 sr = self.getSourceRect()
41 ar = float(sr.width()) / sr.height()
42 self.params.param('width').setValue(self.params['height'] * ar, blockSignal=self.widthChanged)
43
44 def parameters(self):
45 return self.params
46
47 def export(self, fileName=None, toBytes=False, copy=False):
48 if fileName is None and not toBytes and not copy:
49 if USE_PYSIDE:
50 filter = ["*."+str(f) for f in QtGui.QImageWriter.supportedImageFormats()]
51 else:
52 filter = ["*."+bytes(f).decode('utf-8') for f in QtGui.QImageWriter.supportedImageFormats()]
53 preferred = ['*.png', '*.tif', '*.jpg']
54 for p in preferred[::-1]:
55 if p in filter:
56 filter.remove(p)
57 filter.insert(0, p)
58 self.fileSaveDialog(filter=filter)
59 return
60
61 targetRect = QtCore.QRect(0, 0, self.params['width'], self.params['height'])
62 sourceRect = self.getSourceRect()
63
64
65 #self.png = QtGui.QImage(targetRect.size(), QtGui.QImage.Format_ARGB32)
66 #self.png.fill(pyqtgraph.mkColor(self.params['background']))
67 w, h = self.params['width'], self.params['height']
68 if w == 0 or h == 0:
69 raise Exception("Cannot export image with size=0 (requested export size is %dx%d)" % (w,h))
70 bg = np.empty((self.params['width'], self.params['height'], 4), dtype=np.ubyte)
71 color = self.params['background']
72 bg[:,:,0] = color.blue()
73 bg[:,:,1] = color.green()
74 bg[:,:,2] = color.red()
75 bg[:,:,3] = color.alpha()
76 self.png = fn.makeQImage(bg, alpha=True)
77
78 ## set resolution of image:
79 origTargetRect = self.getTargetRect()
80 resolutionScale = targetRect.width() / origTargetRect.width()
81 #self.png.setDotsPerMeterX(self.png.dotsPerMeterX() * resolutionScale)
82 #self.png.setDotsPerMeterY(self.png.dotsPerMeterY() * resolutionScale)
83
84 painter = QtGui.QPainter(self.png)
85 #dtr = painter.deviceTransform()
86 try:
87 self.setExportMode(True, {'antialias': self.params['antialias'], 'background': self.params['background'], 'painter': painter, 'resolutionScale': resolutionScale})
88 painter.setRenderHint(QtGui.QPainter.Antialiasing, self.params['antialias'])
89 self.getScene().render(painter, QtCore.QRectF(targetRect), QtCore.QRectF(sourceRect))
90 finally:
91 self.setExportMode(False)
92 painter.end()
93
94 if copy:
95 QtGui.QApplication.clipboard().setImage(self.png)
96 elif toBytes:
97 return self.png
98 else:
99 self.png.save(fileName)
100
101 ImageExporter.register()
102
103
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pyqtgraph/exporters/ImageExporter.py b/pyqtgraph/exporters/ImageExporter.py
--- a/pyqtgraph/exporters/ImageExporter.py
+++ b/pyqtgraph/exporters/ImageExporter.py
@@ -23,8 +23,8 @@
bg.setAlpha(0)
self.params = Parameter(name='params', type='group', children=[
- {'name': 'width', 'type': 'int', 'value': tr.width(), 'limits': (0, None)},
- {'name': 'height', 'type': 'int', 'value': tr.height(), 'limits': (0, None)},
+ {'name': 'width', 'type': 'int', 'value': int(tr.width()), 'limits': (0, None)},
+ {'name': 'height', 'type': 'int', 'value': int(tr.height()), 'limits': (0, None)},
{'name': 'antialias', 'type': 'bool', 'value': True},
{'name': 'background', 'type': 'color', 'value': bg},
])
@@ -34,12 +34,12 @@
def widthChanged(self):
sr = self.getSourceRect()
ar = float(sr.height()) / sr.width()
- self.params.param('height').setValue(self.params['width'] * ar, blockSignal=self.heightChanged)
+ self.params.param('height').setValue(int(self.params['width'] * ar), blockSignal=self.heightChanged)
def heightChanged(self):
sr = self.getSourceRect()
ar = float(sr.width()) / sr.height()
- self.params.param('width').setValue(self.params['height'] * ar, blockSignal=self.widthChanged)
+ self.params.param('width').setValue(int(self.params['height'] * ar), blockSignal=self.widthChanged)
def parameters(self):
return self.params
|
{"golden_diff": "diff --git a/pyqtgraph/exporters/ImageExporter.py b/pyqtgraph/exporters/ImageExporter.py\n--- a/pyqtgraph/exporters/ImageExporter.py\n+++ b/pyqtgraph/exporters/ImageExporter.py\n@@ -23,8 +23,8 @@\n bg.setAlpha(0)\n \n self.params = Parameter(name='params', type='group', children=[\n- {'name': 'width', 'type': 'int', 'value': tr.width(), 'limits': (0, None)},\n- {'name': 'height', 'type': 'int', 'value': tr.height(), 'limits': (0, None)},\n+ {'name': 'width', 'type': 'int', 'value': int(tr.width()), 'limits': (0, None)},\n+ {'name': 'height', 'type': 'int', 'value': int(tr.height()), 'limits': (0, None)},\n {'name': 'antialias', 'type': 'bool', 'value': True},\n {'name': 'background', 'type': 'color', 'value': bg},\n ])\n@@ -34,12 +34,12 @@\n def widthChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.height()) / sr.width()\n- self.params.param('height').setValue(self.params['width'] * ar, blockSignal=self.heightChanged)\n+ self.params.param('height').setValue(int(self.params['width'] * ar), blockSignal=self.heightChanged)\n \n def heightChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.width()) / sr.height()\n- self.params.param('width').setValue(self.params['height'] * ar, blockSignal=self.widthChanged)\n+ self.params.param('width').setValue(int(self.params['height'] * ar), blockSignal=self.widthChanged)\n \n def parameters(self):\n return self.params\n", "issue": "export image bug related to type casting\nHi there,\r\n\r\nI've got a Qt5-only Python distro installed (WinPython 3.5 Qt5) which includes pyqtgraph 0.10.0. Exporting images from PlotWidgets and ImageViews doesn't work anymore and gives this exception:\r\n\r\n`Traceback (most recent call last):\r\n File \"C:\\WinPython35_Qt5\\python-3.5.3.amd64\\lib\\site-packages\\pyqtgraph\\exporters\\Exporter.py\", line 77, in fileSaveFinished\r\n self.export(fileName=fileName, **self.fileDialog.opts)\r\n File \"C:\\WinPython35_Qt5\\python-3.5.3.amd64\\lib\\site-packages\\pyqtgraph\\exporters\\ImageExporter.py\", line 70, in export\r\n bg = np.empty((self.params['width'], self.params['height'], 4), dtype=np.ubyte)\r\nTypeError: 'float' object cannot be interpreted as an integer\r\nQWaitCondition: Destroyed while threads are still waiting`\r\n\r\nDidn't happen with WinPython 3.5 Qt4 (pyqtgraph 0.9.10 I think). Am I the only one experiencing this?\r\n\r\nUpdate: simple fix: in ImageExporter.py, line 70:\r\n`bg = np.empty((int(self.params['width']), int(self.params['height']), 4), dtype=np.ubyte)`\n", "before_files": [{"content": "from .Exporter import Exporter\nfrom ..parametertree import Parameter\nfrom ..Qt import QtGui, QtCore, QtSvg, USE_PYSIDE\nfrom .. 
import functions as fn\nimport numpy as np\n\n__all__ = ['ImageExporter']\n\nclass ImageExporter(Exporter):\n Name = \"Image File (PNG, TIF, JPG, ...)\"\n allowCopy = True\n \n def __init__(self, item):\n Exporter.__init__(self, item)\n tr = self.getTargetRect()\n if isinstance(item, QtGui.QGraphicsItem):\n scene = item.scene()\n else:\n scene = item\n bgbrush = scene.views()[0].backgroundBrush()\n bg = bgbrush.color()\n if bgbrush.style() == QtCore.Qt.NoBrush:\n bg.setAlpha(0)\n \n self.params = Parameter(name='params', type='group', children=[\n {'name': 'width', 'type': 'int', 'value': tr.width(), 'limits': (0, None)},\n {'name': 'height', 'type': 'int', 'value': tr.height(), 'limits': (0, None)},\n {'name': 'antialias', 'type': 'bool', 'value': True},\n {'name': 'background', 'type': 'color', 'value': bg},\n ])\n self.params.param('width').sigValueChanged.connect(self.widthChanged)\n self.params.param('height').sigValueChanged.connect(self.heightChanged)\n \n def widthChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.height()) / sr.width()\n self.params.param('height').setValue(self.params['width'] * ar, blockSignal=self.heightChanged)\n \n def heightChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.width()) / sr.height()\n self.params.param('width').setValue(self.params['height'] * ar, blockSignal=self.widthChanged)\n \n def parameters(self):\n return self.params\n \n def export(self, fileName=None, toBytes=False, copy=False):\n if fileName is None and not toBytes and not copy:\n if USE_PYSIDE:\n filter = [\"*.\"+str(f) for f in QtGui.QImageWriter.supportedImageFormats()]\n else:\n filter = [\"*.\"+bytes(f).decode('utf-8') for f in QtGui.QImageWriter.supportedImageFormats()]\n preferred = ['*.png', '*.tif', '*.jpg']\n for p in preferred[::-1]:\n if p in filter:\n filter.remove(p)\n filter.insert(0, p)\n self.fileSaveDialog(filter=filter)\n return\n \n targetRect = QtCore.QRect(0, 0, self.params['width'], self.params['height'])\n sourceRect = self.getSourceRect()\n \n \n #self.png = QtGui.QImage(targetRect.size(), QtGui.QImage.Format_ARGB32)\n #self.png.fill(pyqtgraph.mkColor(self.params['background']))\n w, h = self.params['width'], self.params['height']\n if w == 0 or h == 0:\n raise Exception(\"Cannot export image with size=0 (requested export size is %dx%d)\" % (w,h))\n bg = np.empty((self.params['width'], self.params['height'], 4), dtype=np.ubyte)\n color = self.params['background']\n bg[:,:,0] = color.blue()\n bg[:,:,1] = color.green()\n bg[:,:,2] = color.red()\n bg[:,:,3] = color.alpha()\n self.png = fn.makeQImage(bg, alpha=True)\n \n ## set resolution of image:\n origTargetRect = self.getTargetRect()\n resolutionScale = targetRect.width() / origTargetRect.width()\n #self.png.setDotsPerMeterX(self.png.dotsPerMeterX() * resolutionScale)\n #self.png.setDotsPerMeterY(self.png.dotsPerMeterY() * resolutionScale)\n \n painter = QtGui.QPainter(self.png)\n #dtr = painter.deviceTransform()\n try:\n self.setExportMode(True, {'antialias': self.params['antialias'], 'background': self.params['background'], 'painter': painter, 'resolutionScale': resolutionScale})\n painter.setRenderHint(QtGui.QPainter.Antialiasing, self.params['antialias'])\n self.getScene().render(painter, QtCore.QRectF(targetRect), QtCore.QRectF(sourceRect))\n finally:\n self.setExportMode(False)\n painter.end()\n \n if copy:\n QtGui.QApplication.clipboard().setImage(self.png)\n elif toBytes:\n return self.png\n else:\n self.png.save(fileName)\n \nImageExporter.register() \n \n", "path": 
"pyqtgraph/exporters/ImageExporter.py"}], "after_files": [{"content": "from .Exporter import Exporter\nfrom ..parametertree import Parameter\nfrom ..Qt import QtGui, QtCore, QtSvg, USE_PYSIDE\nfrom .. import functions as fn\nimport numpy as np\n\n__all__ = ['ImageExporter']\n\nclass ImageExporter(Exporter):\n Name = \"Image File (PNG, TIF, JPG, ...)\"\n allowCopy = True\n \n def __init__(self, item):\n Exporter.__init__(self, item)\n tr = self.getTargetRect()\n if isinstance(item, QtGui.QGraphicsItem):\n scene = item.scene()\n else:\n scene = item\n bgbrush = scene.views()[0].backgroundBrush()\n bg = bgbrush.color()\n if bgbrush.style() == QtCore.Qt.NoBrush:\n bg.setAlpha(0)\n \n self.params = Parameter(name='params', type='group', children=[\n {'name': 'width', 'type': 'int', 'value': int(tr.width()), 'limits': (0, None)},\n {'name': 'height', 'type': 'int', 'value': int(tr.height()), 'limits': (0, None)},\n {'name': 'antialias', 'type': 'bool', 'value': True},\n {'name': 'background', 'type': 'color', 'value': bg},\n ])\n self.params.param('width').sigValueChanged.connect(self.widthChanged)\n self.params.param('height').sigValueChanged.connect(self.heightChanged)\n \n def widthChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.height()) / sr.width()\n self.params.param('height').setValue(int(self.params['width'] * ar), blockSignal=self.heightChanged)\n \n def heightChanged(self):\n sr = self.getSourceRect()\n ar = float(sr.width()) / sr.height()\n self.params.param('width').setValue(int(self.params['height'] * ar), blockSignal=self.widthChanged)\n \n def parameters(self):\n return self.params\n \n def export(self, fileName=None, toBytes=False, copy=False):\n if fileName is None and not toBytes and not copy:\n if USE_PYSIDE:\n filter = [\"*.\"+str(f) for f in QtGui.QImageWriter.supportedImageFormats()]\n else:\n filter = [\"*.\"+bytes(f).decode('utf-8') for f in QtGui.QImageWriter.supportedImageFormats()]\n preferred = ['*.png', '*.tif', '*.jpg']\n for p in preferred[::-1]:\n if p in filter:\n filter.remove(p)\n filter.insert(0, p)\n self.fileSaveDialog(filter=filter)\n return\n \n targetRect = QtCore.QRect(0, 0, self.params['width'], self.params['height'])\n sourceRect = self.getSourceRect()\n \n \n #self.png = QtGui.QImage(targetRect.size(), QtGui.QImage.Format_ARGB32)\n #self.png.fill(pyqtgraph.mkColor(self.params['background']))\n w, h = self.params['width'], self.params['height']\n if w == 0 or h == 0:\n raise Exception(\"Cannot export image with size=0 (requested export size is %dx%d)\" % (w,h))\n bg = np.empty((self.params['width'], self.params['height'], 4), dtype=np.ubyte)\n color = self.params['background']\n bg[:,:,0] = color.blue()\n bg[:,:,1] = color.green()\n bg[:,:,2] = color.red()\n bg[:,:,3] = color.alpha()\n self.png = fn.makeQImage(bg, alpha=True)\n \n ## set resolution of image:\n origTargetRect = self.getTargetRect()\n resolutionScale = targetRect.width() / origTargetRect.width()\n #self.png.setDotsPerMeterX(self.png.dotsPerMeterX() * resolutionScale)\n #self.png.setDotsPerMeterY(self.png.dotsPerMeterY() * resolutionScale)\n \n painter = QtGui.QPainter(self.png)\n #dtr = painter.deviceTransform()\n try:\n self.setExportMode(True, {'antialias': self.params['antialias'], 'background': self.params['background'], 'painter': painter, 'resolutionScale': resolutionScale})\n painter.setRenderHint(QtGui.QPainter.Antialiasing, self.params['antialias'])\n self.getScene().render(painter, QtCore.QRectF(targetRect), QtCore.QRectF(sourceRect))\n finally:\n 
self.setExportMode(False)\n painter.end()\n \n if copy:\n QtGui.QApplication.clipboard().setImage(self.png)\n elif toBytes:\n return self.png\n else:\n self.png.save(fileName)\n \nImageExporter.register() \n \n", "path": "pyqtgraph/exporters/ImageExporter.py"}]}
| 1,756 | 400 |
gh_patches_debug_4464
|
rasdani/github-patches
|
git_diff
|
cal-itp__benefits-1229
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Investigate missing event properties
In #1057 we made some changes to the base `Event` type, to include an `eligibility_types` property on all events we send into Amplitude.
## OAuth events
Looking at the event reports over the last 14 days of production data, the [`started sign in` event](https://github.com/cal-itp/benefits/blob/dev/benefits/oauth/analytics.py#L16) is missing the `eligibility_types` property in 100% of recorded events. [See report here](https://data.amplitude.com/compiler/Benefits/events/main/latest/started%20sign%20in).
## Eligibility events
We're also seeing unexpected behavior with the `eligibility` events for `courtesy_card`: the `eligiblity_types` property is coming through with the `event_properties`, but not with `user_properties`, leading to blanks/`(none)` when plotting the events on user-based charts.
For example, this [user checked `courtesy_card` eligibility](https://analytics.amplitude.com/compiler/project/304110/search/amplitude_id%3D555656738955?collapsed=true&sessionHandle=YXoogID_YXoogIG_IFfuPiL_BKPu_2b6a1cea-210a-8693-6f25-38c5de0f39c3&eventId=b505e207-9ca1-11ed-967a-9bb610a8bd25) and we can see the value in the raw `event_properties` output, but not `user_properties`:
```json
{
"amplitude_id": 555656738955,
"app": 304110,
"client_event_time": "2023-01-25 11:16:23.396000",
"client_upload_time": "2023-01-25 11:16:23.506000",
"data_type": "event",
"display_name": "started eligibility",
"event_id": 664,
"event_properties": {
"eligibility_types": [
"courtesy_card"
],
"eligibility_verifier": "MST Courtesy Card Eligibility Server Verifier",
"path": "/eligibility/confirm",
"transit_agency": "Monterey-Salinas Transit"
},
"event_time": "2023-01-25 11:16:23.396000",
"event_type": "started eligibility",
"language": "English",
"library": "http/2.0",
"server_received_time": "2023-01-25 11:16:23.506000",
"server_upload_time": "2023-01-25 11:16:23.508000",
"session_id": 1674645209603,
"source_id": null,
"start_version": "2023.01.1",
"timeline_hidden": false,
"user_creation_time": "2023-01-25 11:13:29.606000",
"user_id": "35aabc7e-d98c-451d-95ec-e4033fcec987",
"user_properties": {
"eligibility_verifier": "MST Courtesy Card Eligibility Server Verifier",
"referrer": "https://benefits.calitp.org/eligibility/confirm",
"referring_domain": "benefits.calitp.org",
"transit_agency": "Monterey-Salinas Transit",
"user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
},
"uuid": "b505e207-9ca1-11ed-967a-9bb610a8bd25",
"version_name": "2023.01.1"
}
```
And the resulting user chart shows categorizes this event as `(none)`:
<img width="680" alt="image" src="https://user-images.githubusercontent.com/1783439/215842237-10a52ea7-5bad-4fa2-ad08-004e01d1f174.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `benefits/eligibility/analytics.py`
Content:
```
1 """
2 The eligibility application: analytics implementation.
3 """
4 from benefits.core import analytics as core
5
6
7 class EligibilityEvent(core.Event):
8 """Base analytics event for eligibility verification."""
9
10 def __init__(self, request, event_type, eligibility_types):
11 super().__init__(request, event_type)
12 # overwrite core.Event eligibility_types
13 self.update_event_properties(eligibility_types=eligibility_types)
14
15
16 class SelectedVerifierEvent(EligibilityEvent):
17 """Analytics event representing the user selecting an eligibility verifier."""
18
19 def __init__(self, request, eligibility_types):
20 super().__init__(request, "selected eligibility verifier", eligibility_types)
21
22
23 class StartedEligibilityEvent(EligibilityEvent):
24 """Analytics event representing the beginning of an eligibility verification check."""
25
26 def __init__(self, request, eligibility_types):
27 super().__init__(request, "started eligibility", eligibility_types)
28
29
30 class ReturnedEligibilityEvent(EligibilityEvent):
31 """Analytics event representing the end of an eligibility verification check."""
32
33 def __init__(self, request, eligibility_types, status, error=None):
34 super().__init__(request, "returned eligibility", eligibility_types)
35 status = str(status).lower()
36 if status in ("error", "fail", "success"):
37 self.update_event_properties(status=status, error=error)
38 if status == "success":
39 self.update_user_properties(eligibility_types=eligibility_types)
40
41
42 def selected_verifier(request, eligibility_types):
43 """Send the "selected eligibility verifier" analytics event."""
44 core.send_event(SelectedVerifierEvent(request, eligibility_types))
45
46
47 def started_eligibility(request, eligibility_types):
48 """Send the "started eligibility" analytics event."""
49 core.send_event(StartedEligibilityEvent(request, eligibility_types))
50
51
52 def returned_error(request, eligibility_types, error):
53 """Send the "returned eligibility" analytics event with an error status."""
54 core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status="error", error=error))
55
56
57 def returned_fail(request, eligibility_types):
58 """Send the "returned eligibility" analytics event with a fail status."""
59 core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status="fail"))
60
61
62 def returned_success(request, eligibility_types):
63 """Send the "returned eligibility" analytics event with a success status."""
64 core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status="success"))
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/benefits/eligibility/analytics.py b/benefits/eligibility/analytics.py
--- a/benefits/eligibility/analytics.py
+++ b/benefits/eligibility/analytics.py
@@ -11,6 +11,7 @@
super().__init__(request, event_type)
# overwrite core.Event eligibility_types
self.update_event_properties(eligibility_types=eligibility_types)
+ self.update_user_properties(eligibility_types=eligibility_types)
class SelectedVerifierEvent(EligibilityEvent):
|
{"golden_diff": "diff --git a/benefits/eligibility/analytics.py b/benefits/eligibility/analytics.py\n--- a/benefits/eligibility/analytics.py\n+++ b/benefits/eligibility/analytics.py\n@@ -11,6 +11,7 @@\n super().__init__(request, event_type)\n # overwrite core.Event eligibility_types\n self.update_event_properties(eligibility_types=eligibility_types)\n+ self.update_user_properties(eligibility_types=eligibility_types)\n \n \n class SelectedVerifierEvent(EligibilityEvent):\n", "issue": "Investigate missing event properties\nIn #1057 we made some changes to the base `Event` type, to include an `eligibility_types` property on all events we send into Amplitude.\r\n\r\n## OAuth events\r\n\r\nLooking at the event reports over the last 14 days of production data, the [`started sign in` event](https://github.com/cal-itp/benefits/blob/dev/benefits/oauth/analytics.py#L16) is missing the `eligibility_types` property in 100% of recorded events. [See report here](https://data.amplitude.com/compiler/Benefits/events/main/latest/started%20sign%20in).\r\n\r\n## Eligibility events\r\n\r\nWe're also seeing unexpected behavior with the `eligibility` events for `courtesy_card`: the `eligiblity_types` property is coming through with the `event_properties`, but not with `user_properties`, leading to blanks/`(none)` when plotting the events on user-based charts. \r\n\r\nFor example, this [user checked `courtesy_card` eligibility](https://analytics.amplitude.com/compiler/project/304110/search/amplitude_id%3D555656738955?collapsed=true&sessionHandle=YXoogID_YXoogIG_IFfuPiL_BKPu_2b6a1cea-210a-8693-6f25-38c5de0f39c3&eventId=b505e207-9ca1-11ed-967a-9bb610a8bd25) and we can see the value in the raw `event_properties` output, but not `user_properties`:\r\n\r\n```json\r\n{\r\n \"amplitude_id\": 555656738955,\r\n \"app\": 304110,\r\n \"client_event_time\": \"2023-01-25 11:16:23.396000\",\r\n \"client_upload_time\": \"2023-01-25 11:16:23.506000\",\r\n \"data_type\": \"event\",\r\n \"display_name\": \"started eligibility\",\r\n \"event_id\": 664,\r\n \"event_properties\": {\r\n \"eligibility_types\": [\r\n \"courtesy_card\"\r\n ],\r\n \"eligibility_verifier\": \"MST Courtesy Card Eligibility Server Verifier\",\r\n \"path\": \"/eligibility/confirm\",\r\n \"transit_agency\": \"Monterey-Salinas Transit\"\r\n },\r\n \"event_time\": \"2023-01-25 11:16:23.396000\",\r\n \"event_type\": \"started eligibility\",\r\n \"language\": \"English\",\r\n \"library\": \"http/2.0\",\r\n \"server_received_time\": \"2023-01-25 11:16:23.506000\",\r\n \"server_upload_time\": \"2023-01-25 11:16:23.508000\",\r\n \"session_id\": 1674645209603,\r\n \"source_id\": null,\r\n \"start_version\": \"2023.01.1\",\r\n \"timeline_hidden\": false,\r\n \"user_creation_time\": \"2023-01-25 11:13:29.606000\",\r\n \"user_id\": \"35aabc7e-d98c-451d-95ec-e4033fcec987\",\r\n \"user_properties\": {\r\n \"eligibility_verifier\": \"MST Courtesy Card Eligibility Server Verifier\",\r\n \"referrer\": \"https://benefits.calitp.org/eligibility/confirm\",\r\n \"referring_domain\": \"benefits.calitp.org\",\r\n \"transit_agency\": \"Monterey-Salinas Transit\",\r\n \"user_agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36\"\r\n },\r\n \"uuid\": \"b505e207-9ca1-11ed-967a-9bb610a8bd25\",\r\n \"version_name\": \"2023.01.1\"\r\n}\r\n```\r\n\r\nAnd the resulting user chart shows categorizes this event as `(none)`:\r\n\r\n<img width=\"680\" alt=\"image\" 
src=\"https://user-images.githubusercontent.com/1783439/215842237-10a52ea7-5bad-4fa2-ad08-004e01d1f174.png\">\r\n\n", "before_files": [{"content": "\"\"\"\nThe eligibility application: analytics implementation.\n\"\"\"\nfrom benefits.core import analytics as core\n\n\nclass EligibilityEvent(core.Event):\n \"\"\"Base analytics event for eligibility verification.\"\"\"\n\n def __init__(self, request, event_type, eligibility_types):\n super().__init__(request, event_type)\n # overwrite core.Event eligibility_types\n self.update_event_properties(eligibility_types=eligibility_types)\n\n\nclass SelectedVerifierEvent(EligibilityEvent):\n \"\"\"Analytics event representing the user selecting an eligibility verifier.\"\"\"\n\n def __init__(self, request, eligibility_types):\n super().__init__(request, \"selected eligibility verifier\", eligibility_types)\n\n\nclass StartedEligibilityEvent(EligibilityEvent):\n \"\"\"Analytics event representing the beginning of an eligibility verification check.\"\"\"\n\n def __init__(self, request, eligibility_types):\n super().__init__(request, \"started eligibility\", eligibility_types)\n\n\nclass ReturnedEligibilityEvent(EligibilityEvent):\n \"\"\"Analytics event representing the end of an eligibility verification check.\"\"\"\n\n def __init__(self, request, eligibility_types, status, error=None):\n super().__init__(request, \"returned eligibility\", eligibility_types)\n status = str(status).lower()\n if status in (\"error\", \"fail\", \"success\"):\n self.update_event_properties(status=status, error=error)\n if status == \"success\":\n self.update_user_properties(eligibility_types=eligibility_types)\n\n\ndef selected_verifier(request, eligibility_types):\n \"\"\"Send the \"selected eligibility verifier\" analytics event.\"\"\"\n core.send_event(SelectedVerifierEvent(request, eligibility_types))\n\n\ndef started_eligibility(request, eligibility_types):\n \"\"\"Send the \"started eligibility\" analytics event.\"\"\"\n core.send_event(StartedEligibilityEvent(request, eligibility_types))\n\n\ndef returned_error(request, eligibility_types, error):\n \"\"\"Send the \"returned eligibility\" analytics event with an error status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"error\", error=error))\n\n\ndef returned_fail(request, eligibility_types):\n \"\"\"Send the \"returned eligibility\" analytics event with a fail status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"fail\"))\n\n\ndef returned_success(request, eligibility_types):\n \"\"\"Send the \"returned eligibility\" analytics event with a success status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"success\"))\n", "path": "benefits/eligibility/analytics.py"}], "after_files": [{"content": "\"\"\"\nThe eligibility application: analytics implementation.\n\"\"\"\nfrom benefits.core import analytics as core\n\n\nclass EligibilityEvent(core.Event):\n \"\"\"Base analytics event for eligibility verification.\"\"\"\n\n def __init__(self, request, event_type, eligibility_types):\n super().__init__(request, event_type)\n # overwrite core.Event eligibility_types\n self.update_event_properties(eligibility_types=eligibility_types)\n self.update_user_properties(eligibility_types=eligibility_types)\n\n\nclass SelectedVerifierEvent(EligibilityEvent):\n \"\"\"Analytics event representing the user selecting an eligibility verifier.\"\"\"\n\n def __init__(self, request, eligibility_types):\n 
super().__init__(request, \"selected eligibility verifier\", eligibility_types)\n\n\nclass StartedEligibilityEvent(EligibilityEvent):\n \"\"\"Analytics event representing the beginning of an eligibility verification check.\"\"\"\n\n def __init__(self, request, eligibility_types):\n super().__init__(request, \"started eligibility\", eligibility_types)\n\n\nclass ReturnedEligibilityEvent(EligibilityEvent):\n \"\"\"Analytics event representing the end of an eligibility verification check.\"\"\"\n\n def __init__(self, request, eligibility_types, status, error=None):\n super().__init__(request, \"returned eligibility\", eligibility_types)\n status = str(status).lower()\n if status in (\"error\", \"fail\", \"success\"):\n self.update_event_properties(status=status, error=error)\n if status == \"success\":\n self.update_user_properties(eligibility_types=eligibility_types)\n\n\ndef selected_verifier(request, eligibility_types):\n \"\"\"Send the \"selected eligibility verifier\" analytics event.\"\"\"\n core.send_event(SelectedVerifierEvent(request, eligibility_types))\n\n\ndef started_eligibility(request, eligibility_types):\n \"\"\"Send the \"started eligibility\" analytics event.\"\"\"\n core.send_event(StartedEligibilityEvent(request, eligibility_types))\n\n\ndef returned_error(request, eligibility_types, error):\n \"\"\"Send the \"returned eligibility\" analytics event with an error status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"error\", error=error))\n\n\ndef returned_fail(request, eligibility_types):\n \"\"\"Send the \"returned eligibility\" analytics event with a fail status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"fail\"))\n\n\ndef returned_success(request, eligibility_types):\n \"\"\"Send the \"returned eligibility\" analytics event with a success status.\"\"\"\n core.send_event(ReturnedEligibilityEvent(request, eligibility_types, status=\"success\"))\n", "path": "benefits/eligibility/analytics.py"}]}
| 1,995 | 118 |
gh_patches_debug_17323
|
rasdani/github-patches
|
git_diff
|
angr__angr-3374
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
MIPS32BE: stack collided with heap?
Couldn't get to the bottom of this one...
```
#!/home/<user>/angr_pypy/bin/python
import angr
import claripy
import monkeyhex
import logging
import pickle
import time
import sys
import os
import socket
import copy
import argparse
import ipdb
from IPython import embed
proj = angr.Project("httpd", auto_load_libs=False, except_missing_libs=False)
cfg = proj.analyses.CFGFast(normalize=True,
fail_fast=True,
force_complete_scan=False,
data_references=False,
cross_references=False,
show_progressbar=True)
# some functions we're interested in
funcs = proj.kb.functions
parse_http_req = funcs[0x408f90]
s = proj.factory.blank_state(addr=parse_http_req.addr)
# running it in a simulation manager will allow us to examine the state after it errors out
sm = proj.factory.simulation_manager(s)
sm.run()
embed()
```
[httpd.zip](https://github.com/angr/angr/files/7671480/httpd.zip)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py`
Content:
```
1 import logging
2
3 from .paged_memory_mixin import PagedMemoryMixin
4 from ....errors import SimSegfaultException, SimMemoryError
5
6 l = logging.getLogger(__name__)
7
8 class StackAllocationMixin(PagedMemoryMixin):
9 """
10 This mixin adds automatic allocation for a stack region based on the stack_end and stack_size parameters.
11 """
12 # TODO: multiple stacks. this scheme should scale p well
13 # TODO tbh this should be handled by an actual fault handler in simos or something
14 def __init__(self, stack_end=None, stack_size=None, stack_perms=None, **kwargs):
15 super().__init__(**kwargs)
16 self._red_pageno = (stack_end - 1) // self.page_size if stack_end is not None else None
17 self._remaining_stack = stack_size
18 self._stack_perms = stack_perms
19
20 def copy(self, memo):
21 o = super().copy(memo)
22 o._red_pageno = self._red_pageno
23 o._remaining_stack = self._remaining_stack
24 o._stack_perms = self._stack_perms
25 return o
26
27 def allocate_stack_pages(self, addr: int, size: int, **kwargs):
28 """
29 Pre-allocates pages for the stack without triggering any logic related to reading from them.
30
31 :param addr: The highest address that should be mapped
32 :param size: The number of bytes to be allocated. byte 1 is the one at addr, byte 2 is the one before that, and so on.
33 :return: A list of the new page objects
34 """
35 # weird off-by-ones here. we want to calculate the last byte requested, find its pageno, and then use that to determine what the last page allocated will be and then how many pages are touched
36 pageno = addr // self.page_size
37 if pageno != self._red_pageno:
38 raise SimMemoryError("Trying to allocate stack space in a place that isn't the top of the stack")
39 num = pageno - ((addr - size + 1) // self.page_size) + 1
40
41 result = []
42 for _ in range(num):
43 new_red_pageno = (self._red_pageno - 1) % ((1 << self.state.arch.bits) // self.page_size)
44 if new_red_pageno in self._pages:
45 raise SimSegfaultException(self._red_pageno * self.page_size, "stack collided with heap")
46
47 if self._remaining_stack is not None and self._remaining_stack < self.page_size:
48 raise SimSegfaultException(self._red_pageno * self.page_size, "exhausted stack quota")
49
50 l.debug("Allocating new stack page at %#x", self._red_pageno * self.page_size)
51 result.append(PagedMemoryMixin._initialize_default_page(self, self._red_pageno, permissions=self._stack_perms, **kwargs))
52 self._pages[self._red_pageno] = result[-1]
53
54 self._red_pageno = new_red_pageno
55 if self._remaining_stack is not None:
56 self._remaining_stack -= self.page_size
57
58 return result
59
60 def _initialize_page(self, pageno: int, **kwargs):
61 if pageno != self._red_pageno:
62 return super()._initialize_page(pageno, **kwargs)
63
64 return self.allocate_stack_pages((pageno + 1) * self.page_size - 1, self.page_size)[0]
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py b/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py
--- a/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py
+++ b/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py
@@ -2,6 +2,7 @@
from .paged_memory_mixin import PagedMemoryMixin
from ....errors import SimSegfaultException, SimMemoryError
+from ....sim_options import STRICT_PAGE_ACCESS
l = logging.getLogger(__name__)
@@ -58,7 +59,7 @@
return result
def _initialize_page(self, pageno: int, **kwargs):
- if pageno != self._red_pageno:
+ if pageno != self._red_pageno or STRICT_PAGE_ACCESS not in self.state.options:
return super()._initialize_page(pageno, **kwargs)
return self.allocate_stack_pages((pageno + 1) * self.page_size - 1, self.page_size)[0]
|
{"golden_diff": "diff --git a/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py b/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py\n--- a/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py\n+++ b/angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py\n@@ -2,6 +2,7 @@\n \n from .paged_memory_mixin import PagedMemoryMixin\n from ....errors import SimSegfaultException, SimMemoryError\n+from ....sim_options import STRICT_PAGE_ACCESS\n \n l = logging.getLogger(__name__)\n \n@@ -58,7 +59,7 @@\n return result\n \n def _initialize_page(self, pageno: int, **kwargs):\n- if pageno != self._red_pageno:\n+ if pageno != self._red_pageno or STRICT_PAGE_ACCESS not in self.state.options:\n return super()._initialize_page(pageno, **kwargs)\n \n return self.allocate_stack_pages((pageno + 1) * self.page_size - 1, self.page_size)[0]\n", "issue": "MIPS32BE: stack collided with heap?\nCouldn't get to the bottom of this one...\r\n\r\n```\r\n#!/home/<user>/angr_pypy/bin/python\r\n\r\nimport angr\r\nimport claripy\r\nimport monkeyhex\r\nimport logging\r\nimport pickle\r\nimport time\r\nimport sys\r\nimport os\r\nimport socket\r\nimport copy\r\nimport argparse\r\nimport ipdb\r\n\r\nfrom IPython import embed\r\n\r\nproj = angr.Project(\"httpd\", auto_load_libs=False, except_missing_libs=False)\r\n\r\ncfg = proj.analyses.CFGFast(normalize=True,\r\n fail_fast=True,\r\n force_complete_scan=False,\r\n data_references=False,\r\n cross_references=False,\r\n show_progressbar=True)\r\n\r\n# some functions we're interested in\r\nfuncs = proj.kb.functions\r\nparse_http_req = funcs[0x408f90]\r\n\r\ns = proj.factory.blank_state(addr=parse_http_req.addr)\r\n\r\n# running it in a simulation manager will allow us to examine the state after it errors out\r\nsm = proj.factory.simulation_manager(s)\r\nsm.run()\r\n\r\nembed()\r\n```\r\n[httpd.zip](https://github.com/angr/angr/files/7671480/httpd.zip)\r\n\r\n\n", "before_files": [{"content": "import logging\n\nfrom .paged_memory_mixin import PagedMemoryMixin\nfrom ....errors import SimSegfaultException, SimMemoryError\n\nl = logging.getLogger(__name__)\n\nclass StackAllocationMixin(PagedMemoryMixin):\n \"\"\"\n This mixin adds automatic allocation for a stack region based on the stack_end and stack_size parameters.\n \"\"\"\n # TODO: multiple stacks. this scheme should scale p well\n # TODO tbh this should be handled by an actual fault handler in simos or something\n def __init__(self, stack_end=None, stack_size=None, stack_perms=None, **kwargs):\n super().__init__(**kwargs)\n self._red_pageno = (stack_end - 1) // self.page_size if stack_end is not None else None\n self._remaining_stack = stack_size\n self._stack_perms = stack_perms\n\n def copy(self, memo):\n o = super().copy(memo)\n o._red_pageno = self._red_pageno\n o._remaining_stack = self._remaining_stack\n o._stack_perms = self._stack_perms\n return o\n\n def allocate_stack_pages(self, addr: int, size: int, **kwargs):\n \"\"\"\n Pre-allocates pages for the stack without triggering any logic related to reading from them.\n\n :param addr: The highest address that should be mapped\n :param size: The number of bytes to be allocated. byte 1 is the one at addr, byte 2 is the one before that, and so on.\n :return: A list of the new page objects\n \"\"\"\n # weird off-by-ones here. 
we want to calculate the last byte requested, find its pageno, and then use that to determine what the last page allocated will be and then how many pages are touched\n pageno = addr // self.page_size\n if pageno != self._red_pageno:\n raise SimMemoryError(\"Trying to allocate stack space in a place that isn't the top of the stack\")\n num = pageno - ((addr - size + 1) // self.page_size) + 1\n\n result = []\n for _ in range(num):\n new_red_pageno = (self._red_pageno - 1) % ((1 << self.state.arch.bits) // self.page_size)\n if new_red_pageno in self._pages:\n raise SimSegfaultException(self._red_pageno * self.page_size, \"stack collided with heap\")\n\n if self._remaining_stack is not None and self._remaining_stack < self.page_size:\n raise SimSegfaultException(self._red_pageno * self.page_size, \"exhausted stack quota\")\n\n l.debug(\"Allocating new stack page at %#x\", self._red_pageno * self.page_size)\n result.append(PagedMemoryMixin._initialize_default_page(self, self._red_pageno, permissions=self._stack_perms, **kwargs))\n self._pages[self._red_pageno] = result[-1]\n\n self._red_pageno = new_red_pageno\n if self._remaining_stack is not None:\n self._remaining_stack -= self.page_size\n\n return result\n\n def _initialize_page(self, pageno: int, **kwargs):\n if pageno != self._red_pageno:\n return super()._initialize_page(pageno, **kwargs)\n\n return self.allocate_stack_pages((pageno + 1) * self.page_size - 1, self.page_size)[0]\n", "path": "angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py"}], "after_files": [{"content": "import logging\n\nfrom .paged_memory_mixin import PagedMemoryMixin\nfrom ....errors import SimSegfaultException, SimMemoryError\nfrom ....sim_options import STRICT_PAGE_ACCESS\n\nl = logging.getLogger(__name__)\n\nclass StackAllocationMixin(PagedMemoryMixin):\n \"\"\"\n This mixin adds automatic allocation for a stack region based on the stack_end and stack_size parameters.\n \"\"\"\n # TODO: multiple stacks. this scheme should scale p well\n # TODO tbh this should be handled by an actual fault handler in simos or something\n def __init__(self, stack_end=None, stack_size=None, stack_perms=None, **kwargs):\n super().__init__(**kwargs)\n self._red_pageno = (stack_end - 1) // self.page_size if stack_end is not None else None\n self._remaining_stack = stack_size\n self._stack_perms = stack_perms\n\n def copy(self, memo):\n o = super().copy(memo)\n o._red_pageno = self._red_pageno\n o._remaining_stack = self._remaining_stack\n o._stack_perms = self._stack_perms\n return o\n\n def allocate_stack_pages(self, addr: int, size: int, **kwargs):\n \"\"\"\n Pre-allocates pages for the stack without triggering any logic related to reading from them.\n\n :param addr: The highest address that should be mapped\n :param size: The number of bytes to be allocated. byte 1 is the one at addr, byte 2 is the one before that, and so on.\n :return: A list of the new page objects\n \"\"\"\n # weird off-by-ones here. 
we want to calculate the last byte requested, find its pageno, and then use that to determine what the last page allocated will be and then how many pages are touched\n pageno = addr // self.page_size\n if pageno != self._red_pageno:\n raise SimMemoryError(\"Trying to allocate stack space in a place that isn't the top of the stack\")\n num = pageno - ((addr - size + 1) // self.page_size) + 1\n\n result = []\n for _ in range(num):\n new_red_pageno = (self._red_pageno - 1) % ((1 << self.state.arch.bits) // self.page_size)\n if new_red_pageno in self._pages:\n raise SimSegfaultException(self._red_pageno * self.page_size, \"stack collided with heap\")\n\n if self._remaining_stack is not None and self._remaining_stack < self.page_size:\n raise SimSegfaultException(self._red_pageno * self.page_size, \"exhausted stack quota\")\n\n l.debug(\"Allocating new stack page at %#x\", self._red_pageno * self.page_size)\n result.append(PagedMemoryMixin._initialize_default_page(self, self._red_pageno, permissions=self._stack_perms, **kwargs))\n self._pages[self._red_pageno] = result[-1]\n\n self._red_pageno = new_red_pageno\n if self._remaining_stack is not None:\n self._remaining_stack -= self.page_size\n\n return result\n\n def _initialize_page(self, pageno: int, **kwargs):\n if pageno != self._red_pageno or STRICT_PAGE_ACCESS not in self.state.options:\n return super()._initialize_page(pageno, **kwargs)\n\n return self.allocate_stack_pages((pageno + 1) * self.page_size - 1, self.page_size)[0]\n", "path": "angr/storage/memory_mixins/paged_memory/stack_allocation_mixin.py"}]}
| 1,405 | 240 |
gh_patches_debug_47849
|
rasdani/github-patches
|
git_diff
|
searx__searx-1304
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Engines cannot retrieve results: piratebay (request exception): PirateBay changed URL
When some text is entered, and I click on General and Files several times, it shows this error:
```
Error! Engines cannot retrieve results.
piratebay (request exception)
Please, try again later or find another searx instance.
```
Version 0.14.0 on FreeBSD.
Default config.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `searx/engines/piratebay.py`
Content:
```
1 # Piratebay (Videos, Music, Files)
2 #
3 # @website https://thepiratebay.se
4 # @provide-api no (nothing found)
5 #
6 # @using-api no
7 # @results HTML (using search portal)
8 # @stable yes (HTML can change)
9 # @parse url, title, content, seed, leech, magnetlink
10
11 from lxml import html
12 from operator import itemgetter
13 from searx.engines.xpath import extract_text
14 from searx.url_utils import quote, urljoin
15
16 # engine dependent config
17 categories = ['videos', 'music', 'files']
18 paging = True
19
20 # search-url
21 url = 'https://thepiratebay.se/'
22 search_url = url + 'search/{search_term}/{pageno}/99/{search_type}'
23
24 # piratebay specific type-definitions
25 search_types = {'files': '0',
26 'music': '100',
27 'videos': '200'}
28
29 # specific xpath variables
30 magnet_xpath = './/a[@title="Download this torrent using magnet"]'
31 torrent_xpath = './/a[@title="Download this torrent"]'
32 content_xpath = './/font[@class="detDesc"]'
33
34
35 # do search-request
36 def request(query, params):
37 search_type = search_types.get(params['category'], '0')
38
39 params['url'] = search_url.format(search_term=quote(query),
40 search_type=search_type,
41 pageno=params['pageno'] - 1)
42
43 return params
44
45
46 # get response from search-request
47 def response(resp):
48 results = []
49
50 dom = html.fromstring(resp.text)
51
52 search_res = dom.xpath('//table[@id="searchResult"]//tr')
53
54 # return empty array if nothing is found
55 if not search_res:
56 return []
57
58 # parse results
59 for result in search_res[1:]:
60 link = result.xpath('.//div[@class="detName"]//a')[0]
61 href = urljoin(url, link.attrib.get('href'))
62 title = extract_text(link)
63 content = extract_text(result.xpath(content_xpath))
64 seed, leech = result.xpath('.//td[@align="right"]/text()')[:2]
65
66 # convert seed to int if possible
67 if seed.isdigit():
68 seed = int(seed)
69 else:
70 seed = 0
71
72 # convert leech to int if possible
73 if leech.isdigit():
74 leech = int(leech)
75 else:
76 leech = 0
77
78 magnetlink = result.xpath(magnet_xpath)[0]
79 torrentfile_links = result.xpath(torrent_xpath)
80 if torrentfile_links:
81 torrentfile_link = torrentfile_links[0].attrib.get('href')
82 else:
83 torrentfile_link = None
84
85 # append result
86 results.append({'url': href,
87 'title': title,
88 'content': content,
89 'seed': seed,
90 'leech': leech,
91 'magnetlink': magnetlink.attrib.get('href'),
92 'torrentfile': torrentfile_link,
93 'template': 'torrent.html'})
94
95 # return results sorted by seeder
96 return sorted(results, key=itemgetter('seed'), reverse=True)
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/searx/engines/piratebay.py b/searx/engines/piratebay.py
--- a/searx/engines/piratebay.py
+++ b/searx/engines/piratebay.py
@@ -18,7 +18,7 @@
paging = True
# search-url
-url = 'https://thepiratebay.se/'
+url = 'https://thepiratebay.org/'
search_url = url + 'search/{search_term}/{pageno}/99/{search_type}'
# piratebay specific type-definitions
|
{"golden_diff": "diff --git a/searx/engines/piratebay.py b/searx/engines/piratebay.py\n--- a/searx/engines/piratebay.py\n+++ b/searx/engines/piratebay.py\n@@ -18,7 +18,7 @@\n paging = True\n \n # search-url\n-url = 'https://thepiratebay.se/'\n+url = 'https://thepiratebay.org/'\n search_url = url + 'search/{search_term}/{pageno}/99/{search_type}'\n \n # piratebay specific type-definitions\n", "issue": "Engines cannot retrieve results: piratebay (request exception): PirateBay changed URL\nWhen some text is entered, and I click on General and Files several times, it shows this error:\r\n```\r\nError! Engines cannot retrieve results.\r\npiratebay (request exception)\r\nPlease, try again later or find another searx instance.\r\n```\r\n\r\nVersion 0.14.0 on FreeBSD.\r\nDefault config.\n", "before_files": [{"content": "# Piratebay (Videos, Music, Files)\n#\n# @website https://thepiratebay.se\n# @provide-api no (nothing found)\n#\n# @using-api no\n# @results HTML (using search portal)\n# @stable yes (HTML can change)\n# @parse url, title, content, seed, leech, magnetlink\n\nfrom lxml import html\nfrom operator import itemgetter\nfrom searx.engines.xpath import extract_text\nfrom searx.url_utils import quote, urljoin\n\n# engine dependent config\ncategories = ['videos', 'music', 'files']\npaging = True\n\n# search-url\nurl = 'https://thepiratebay.se/'\nsearch_url = url + 'search/{search_term}/{pageno}/99/{search_type}'\n\n# piratebay specific type-definitions\nsearch_types = {'files': '0',\n 'music': '100',\n 'videos': '200'}\n\n# specific xpath variables\nmagnet_xpath = './/a[@title=\"Download this torrent using magnet\"]'\ntorrent_xpath = './/a[@title=\"Download this torrent\"]'\ncontent_xpath = './/font[@class=\"detDesc\"]'\n\n\n# do search-request\ndef request(query, params):\n search_type = search_types.get(params['category'], '0')\n\n params['url'] = search_url.format(search_term=quote(query),\n search_type=search_type,\n pageno=params['pageno'] - 1)\n\n return params\n\n\n# get response from search-request\ndef response(resp):\n results = []\n\n dom = html.fromstring(resp.text)\n\n search_res = dom.xpath('//table[@id=\"searchResult\"]//tr')\n\n # return empty array if nothing is found\n if not search_res:\n return []\n\n # parse results\n for result in search_res[1:]:\n link = result.xpath('.//div[@class=\"detName\"]//a')[0]\n href = urljoin(url, link.attrib.get('href'))\n title = extract_text(link)\n content = extract_text(result.xpath(content_xpath))\n seed, leech = result.xpath('.//td[@align=\"right\"]/text()')[:2]\n\n # convert seed to int if possible\n if seed.isdigit():\n seed = int(seed)\n else:\n seed = 0\n\n # convert leech to int if possible\n if leech.isdigit():\n leech = int(leech)\n else:\n leech = 0\n\n magnetlink = result.xpath(magnet_xpath)[0]\n torrentfile_links = result.xpath(torrent_xpath)\n if torrentfile_links:\n torrentfile_link = torrentfile_links[0].attrib.get('href')\n else:\n torrentfile_link = None\n\n # append result\n results.append({'url': href,\n 'title': title,\n 'content': content,\n 'seed': seed,\n 'leech': leech,\n 'magnetlink': magnetlink.attrib.get('href'),\n 'torrentfile': torrentfile_link,\n 'template': 'torrent.html'})\n\n # return results sorted by seeder\n return sorted(results, key=itemgetter('seed'), reverse=True)\n", "path": "searx/engines/piratebay.py"}], "after_files": [{"content": "# Piratebay (Videos, Music, Files)\n#\n# @website https://thepiratebay.se\n# @provide-api no (nothing found)\n#\n# @using-api no\n# @results HTML (using search 
portal)\n# @stable yes (HTML can change)\n# @parse url, title, content, seed, leech, magnetlink\n\nfrom lxml import html\nfrom operator import itemgetter\nfrom searx.engines.xpath import extract_text\nfrom searx.url_utils import quote, urljoin\n\n# engine dependent config\ncategories = ['videos', 'music', 'files']\npaging = True\n\n# search-url\nurl = 'https://thepiratebay.org/'\nsearch_url = url + 'search/{search_term}/{pageno}/99/{search_type}'\n\n# piratebay specific type-definitions\nsearch_types = {'files': '0',\n 'music': '100',\n 'videos': '200'}\n\n# specific xpath variables\nmagnet_xpath = './/a[@title=\"Download this torrent using magnet\"]'\ntorrent_xpath = './/a[@title=\"Download this torrent\"]'\ncontent_xpath = './/font[@class=\"detDesc\"]'\n\n\n# do search-request\ndef request(query, params):\n search_type = search_types.get(params['category'], '0')\n\n params['url'] = search_url.format(search_term=quote(query),\n search_type=search_type,\n pageno=params['pageno'] - 1)\n\n return params\n\n\n# get response from search-request\ndef response(resp):\n results = []\n\n dom = html.fromstring(resp.text)\n\n search_res = dom.xpath('//table[@id=\"searchResult\"]//tr')\n\n # return empty array if nothing is found\n if not search_res:\n return []\n\n # parse results\n for result in search_res[1:]:\n link = result.xpath('.//div[@class=\"detName\"]//a')[0]\n href = urljoin(url, link.attrib.get('href'))\n title = extract_text(link)\n content = extract_text(result.xpath(content_xpath))\n seed, leech = result.xpath('.//td[@align=\"right\"]/text()')[:2]\n\n # convert seed to int if possible\n if seed.isdigit():\n seed = int(seed)\n else:\n seed = 0\n\n # convert leech to int if possible\n if leech.isdigit():\n leech = int(leech)\n else:\n leech = 0\n\n magnetlink = result.xpath(magnet_xpath)[0]\n torrentfile_links = result.xpath(torrent_xpath)\n if torrentfile_links:\n torrentfile_link = torrentfile_links[0].attrib.get('href')\n else:\n torrentfile_link = None\n\n # append result\n results.append({'url': href,\n 'title': title,\n 'content': content,\n 'seed': seed,\n 'leech': leech,\n 'magnetlink': magnetlink.attrib.get('href'),\n 'torrentfile': torrentfile_link,\n 'template': 'torrent.html'})\n\n # return results sorted by seeder\n return sorted(results, key=itemgetter('seed'), reverse=True)\n", "path": "searx/engines/piratebay.py"}]}
| 1,221 | 124 |
gh_patches_debug_250
|
rasdani/github-patches
|
git_diff
|
hpcaitech__ColossalAI-3944
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
[elixir] make README consistent in style
The README for the `Elixir` module is rather a draft, we should polish it to make it consistent with the README files found in other modules.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `colossalai/elixir/__init__.py`
Content:
```
1 from .wrapper import ElixirModule, ElixirOptimizer
2
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/colossalai/elixir/__init__.py b/colossalai/elixir/__init__.py
--- a/colossalai/elixir/__init__.py
+++ b/colossalai/elixir/__init__.py
@@ -1 +1,4 @@
+from .search import minimum_waste_search, optimal_search
from .wrapper import ElixirModule, ElixirOptimizer
+
+__all__ = ['ElixirModule', 'ElixirOptimizer', 'minimum_waste_search', 'optimal_search']
|
{"golden_diff": "diff --git a/colossalai/elixir/__init__.py b/colossalai/elixir/__init__.py\n--- a/colossalai/elixir/__init__.py\n+++ b/colossalai/elixir/__init__.py\n@@ -1 +1,4 @@\n+from .search import minimum_waste_search, optimal_search\n from .wrapper import ElixirModule, ElixirOptimizer\n+\n+__all__ = ['ElixirModule', 'ElixirOptimizer', 'minimum_waste_search', 'optimal_search']\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[elixir] make README consistent in style\nThe README for the `Elixir` module is rather a draft, we should polish it to make it consistent with the README files found in other modules.\n", "before_files": [{"content": "from .wrapper import ElixirModule, ElixirOptimizer\n", "path": "colossalai/elixir/__init__.py"}], "after_files": [{"content": "from .search import minimum_waste_search, optimal_search\nfrom .wrapper import ElixirModule, ElixirOptimizer\n\n__all__ = ['ElixirModule', 'ElixirOptimizer', 'minimum_waste_search', 'optimal_search']\n", "path": "colossalai/elixir/__init__.py"}]}
| 343 | 116 |
gh_patches_debug_17690
|
rasdani/github-patches
|
git_diff
|
wemake-services__wemake-python-styleguide-1688
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Method `given_function_called` should only take the function name into account.
# Bug report
## What's wrong
The method `given_function_called(node: ast.Call, to_check: Container[str]) -> str` in `logic.tree.functions` is described as a method that returns the name of the function being called in `node`, in case it is included in `to_check`. For example:
```python
# Let's imagine we are visiting the Call node in `print(123, 456)` stored in `node`
called_function = given_function_called(node, ['print'])
print(called_function)
# Prints `print`
# But, if we are visiting `datetime.timedelta(days=1)`
called_function = given_function_called(node, ['timedelta'])
print(called_function)
# Prints an empty string, as if `timedelta` had not been called. The way for it to be shown is:
called_function = given_function_called(node, ['datetime.timedelta'])
print(called_function)
# Prints `datetime.timedelta`
```
This is related to https://github.com/wemake-services/wemake-python-styleguide/pull/1676#discussion_r508471791
## How is that should be
```python
# If we are visiting `datetime.timedelta(days=1)`
called_function = given_function_called(node, ['timedelta'])
print(called_function)
# Prints `timedelta`
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wemake_python_styleguide/logic/tree/functions.py`
Content:
```
1 from ast import Call, Yield, YieldFrom, arg
2 from typing import Container, List, Optional
3
4 from wemake_python_styleguide.compat.functions import get_posonlyargs
5 from wemake_python_styleguide.logic import source
6 from wemake_python_styleguide.logic.walk import is_contained
7 from wemake_python_styleguide.types import (
8 AnyFunctionDef,
9 AnyFunctionDefAndLambda,
10 )
11
12
13 def given_function_called(node: Call, to_check: Container[str]) -> str:
14 """
15 Returns function name if it is called and contained in the container.
16
17 >>> import ast
18 >>> module = ast.parse('print(123, 456)')
19 >>> given_function_called(module.body[0].value, ['print'])
20 'print'
21
22 >>> given_function_called(module.body[0].value, ['adjust'])
23 ''
24
25 """
26 function_name = source.node_to_string(node.func)
27 if function_name in to_check:
28 return function_name
29 return ''
30
31
32 def is_method(function_type: Optional[str]) -> bool:
33 """
34 Returns whether a given function type belongs to a class.
35
36 >>> is_method('function')
37 False
38
39 >>> is_method(None)
40 False
41
42 >>> is_method('method')
43 True
44
45 >>> is_method('classmethod')
46 True
47
48 >>> is_method('staticmethod')
49 True
50
51 >>> is_method('')
52 False
53
54 """
55 return function_type in {'method', 'classmethod', 'staticmethod'}
56
57
58 def get_all_arguments(node: AnyFunctionDefAndLambda) -> List[arg]:
59 """
60 Returns list of all arguments that exist in a function.
61
62 Respects the correct parameters order.
63 Positional only args, regular argument,
64 ``*args``, keyword-only, ``**kwargs``.
65
66 Positional only args are only added for ``python3.8+``
67 other versions are ignoring this type of arguments.
68 """
69 names = [
70 *get_posonlyargs(node),
71 *node.args.args,
72 ]
73
74 if node.args.vararg:
75 names.append(node.args.vararg)
76
77 names.extend(node.args.kwonlyargs)
78
79 if node.args.kwarg:
80 names.append(node.args.kwarg)
81
82 return names
83
84
85 def is_first_argument(node: AnyFunctionDefAndLambda, name: str) -> bool:
86 """Tells whether an argument name is the logically first in function."""
87 positional_args = [
88 *get_posonlyargs(node),
89 *node.args.args,
90 ]
91
92 if not positional_args:
93 return False
94
95 return name == positional_args[0].arg
96
97
98 def is_generator(node: AnyFunctionDef) -> bool:
99 """Tells whether a given function is a generator."""
100 for body_item in node.body:
101 if is_contained(node=body_item, to_check=(Yield, YieldFrom)):
102 return True
103 return False
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/wemake_python_styleguide/logic/tree/functions.py b/wemake_python_styleguide/logic/tree/functions.py
--- a/wemake_python_styleguide/logic/tree/functions.py
+++ b/wemake_python_styleguide/logic/tree/functions.py
@@ -10,20 +10,21 @@
)
-def given_function_called(node: Call, to_check: Container[str]) -> str:
+def given_function_called(
+ node: Call,
+ to_check: Container[str],
+ *,
+ split_modules: bool = False,
+) -> str:
"""
Returns function name if it is called and contained in the container.
- >>> import ast
- >>> module = ast.parse('print(123, 456)')
- >>> given_function_called(module.body[0].value, ['print'])
- 'print'
-
- >>> given_function_called(module.body[0].value, ['adjust'])
- ''
-
+ If `split_modules`, takes the modules or objects into account. Otherwise,
+ it only cares about the function's name.
"""
function_name = source.node_to_string(node.func)
+ if split_modules:
+ function_name = function_name.split('.')[-1]
if function_name in to_check:
return function_name
return ''
|
{"golden_diff": "diff --git a/wemake_python_styleguide/logic/tree/functions.py b/wemake_python_styleguide/logic/tree/functions.py\n--- a/wemake_python_styleguide/logic/tree/functions.py\n+++ b/wemake_python_styleguide/logic/tree/functions.py\n@@ -10,20 +10,21 @@\n )\n \n \n-def given_function_called(node: Call, to_check: Container[str]) -> str:\n+def given_function_called(\n+ node: Call,\n+ to_check: Container[str],\n+ *,\n+ split_modules: bool = False,\n+) -> str:\n \"\"\"\n Returns function name if it is called and contained in the container.\n \n- >>> import ast\n- >>> module = ast.parse('print(123, 456)')\n- >>> given_function_called(module.body[0].value, ['print'])\n- 'print'\n-\n- >>> given_function_called(module.body[0].value, ['adjust'])\n- ''\n-\n+ If `split_modules`, takes the modules or objects into account. Otherwise,\n+ it only cares about the function's name.\n \"\"\"\n function_name = source.node_to_string(node.func)\n+ if split_modules:\n+ function_name = function_name.split('.')[-1]\n if function_name in to_check:\n return function_name\n return ''\n", "issue": "Method `given_function_called` should only take the function name into account.\n# Bug report\r\n\r\n## What's wrong\r\n\r\nThe method `given_function_called(node: ast.Call, to_check: Container[str]) -> str` in `logic.tree.functions` is described as a method that returns the name of the function being called in `node`, in case it is included in `to_check`. For example:\r\n ```python\r\n# Let's imagine we are visiting the Call node in `print(123, 456)` stored in `node`\r\ncalled_function = given_function_called(node, ['print'])\r\nprint(called_function)\r\n# Prints `print`\r\n# But, if we are visiting `datetime.timedelta(days=1)`\r\ncalled_function = given_function_called(node, ['timedelta'])\r\nprint(called_function)\r\n# Prints an empty string, as if `timedelta` had not been called. 
The way for it to be shown is:\r\ncalled_function = given_function_called(node, ['datetime.timedelta'])\r\nprint(called_function)\r\n# Prints `datetime.timedelta`\r\n```\r\n\r\nThis is related to https://github.com/wemake-services/wemake-python-styleguide/pull/1676#discussion_r508471791\r\n\r\n## How is that should be\r\n\r\n```python\r\n# If we are visiting `datetime.timedelta(days=1)`\r\ncalled_function = given_function_called(node, ['timedelta'])\r\nprint(called_function)\r\n# Prints `timedelta`\r\n```\r\n\n", "before_files": [{"content": "from ast import Call, Yield, YieldFrom, arg\nfrom typing import Container, List, Optional\n\nfrom wemake_python_styleguide.compat.functions import get_posonlyargs\nfrom wemake_python_styleguide.logic import source\nfrom wemake_python_styleguide.logic.walk import is_contained\nfrom wemake_python_styleguide.types import (\n AnyFunctionDef,\n AnyFunctionDefAndLambda,\n)\n\n\ndef given_function_called(node: Call, to_check: Container[str]) -> str:\n \"\"\"\n Returns function name if it is called and contained in the container.\n\n >>> import ast\n >>> module = ast.parse('print(123, 456)')\n >>> given_function_called(module.body[0].value, ['print'])\n 'print'\n\n >>> given_function_called(module.body[0].value, ['adjust'])\n ''\n\n \"\"\"\n function_name = source.node_to_string(node.func)\n if function_name in to_check:\n return function_name\n return ''\n\n\ndef is_method(function_type: Optional[str]) -> bool:\n \"\"\"\n Returns whether a given function type belongs to a class.\n\n >>> is_method('function')\n False\n\n >>> is_method(None)\n False\n\n >>> is_method('method')\n True\n\n >>> is_method('classmethod')\n True\n\n >>> is_method('staticmethod')\n True\n\n >>> is_method('')\n False\n\n \"\"\"\n return function_type in {'method', 'classmethod', 'staticmethod'}\n\n\ndef get_all_arguments(node: AnyFunctionDefAndLambda) -> List[arg]:\n \"\"\"\n Returns list of all arguments that exist in a function.\n\n Respects the correct parameters order.\n Positional only args, regular argument,\n ``*args``, keyword-only, ``**kwargs``.\n\n Positional only args are only added for ``python3.8+``\n other versions are ignoring this type of arguments.\n \"\"\"\n names = [\n *get_posonlyargs(node),\n *node.args.args,\n ]\n\n if node.args.vararg:\n names.append(node.args.vararg)\n\n names.extend(node.args.kwonlyargs)\n\n if node.args.kwarg:\n names.append(node.args.kwarg)\n\n return names\n\n\ndef is_first_argument(node: AnyFunctionDefAndLambda, name: str) -> bool:\n \"\"\"Tells whether an argument name is the logically first in function.\"\"\"\n positional_args = [\n *get_posonlyargs(node),\n *node.args.args,\n ]\n\n if not positional_args:\n return False\n\n return name == positional_args[0].arg\n\n\ndef is_generator(node: AnyFunctionDef) -> bool:\n \"\"\"Tells whether a given function is a generator.\"\"\"\n for body_item in node.body:\n if is_contained(node=body_item, to_check=(Yield, YieldFrom)):\n return True\n return False\n", "path": "wemake_python_styleguide/logic/tree/functions.py"}], "after_files": [{"content": "from ast import Call, Yield, YieldFrom, arg\nfrom typing import Container, List, Optional\n\nfrom wemake_python_styleguide.compat.functions import get_posonlyargs\nfrom wemake_python_styleguide.logic import source\nfrom wemake_python_styleguide.logic.walk import is_contained\nfrom wemake_python_styleguide.types import (\n AnyFunctionDef,\n AnyFunctionDefAndLambda,\n)\n\n\ndef given_function_called(\n node: Call,\n to_check: Container[str],\n *,\n 
split_modules: bool = False,\n) -> str:\n \"\"\"\n Returns function name if it is called and contained in the container.\n\n If `split_modules`, takes the modules or objects into account. Otherwise,\n it only cares about the function's name.\n \"\"\"\n function_name = source.node_to_string(node.func)\n if split_modules:\n function_name = function_name.split('.')[-1]\n if function_name in to_check:\n return function_name\n return ''\n\n\ndef is_method(function_type: Optional[str]) -> bool:\n \"\"\"\n Returns whether a given function type belongs to a class.\n\n >>> is_method('function')\n False\n\n >>> is_method(None)\n False\n\n >>> is_method('method')\n True\n\n >>> is_method('classmethod')\n True\n\n >>> is_method('staticmethod')\n True\n\n >>> is_method('')\n False\n\n \"\"\"\n return function_type in {'method', 'classmethod', 'staticmethod'}\n\n\ndef get_all_arguments(node: AnyFunctionDefAndLambda) -> List[arg]:\n \"\"\"\n Returns list of all arguments that exist in a function.\n\n Respects the correct parameters order.\n Positional only args, regular argument,\n ``*args``, keyword-only, ``**kwargs``.\n\n Positional only args are only added for ``python3.8+``\n other versions are ignoring this type of arguments.\n \"\"\"\n names = [\n *get_posonlyargs(node),\n *node.args.args,\n ]\n\n if node.args.vararg:\n names.append(node.args.vararg)\n\n names.extend(node.args.kwonlyargs)\n\n if node.args.kwarg:\n names.append(node.args.kwarg)\n\n return names\n\n\ndef is_first_argument(node: AnyFunctionDefAndLambda, name: str) -> bool:\n \"\"\"Tells whether an argument name is the logically first in function.\"\"\"\n positional_args = [\n *get_posonlyargs(node),\n *node.args.args,\n ]\n\n if not positional_args:\n return False\n\n return name == positional_args[0].arg\n\n\ndef is_generator(node: AnyFunctionDef) -> bool:\n \"\"\"Tells whether a given function is a generator.\"\"\"\n for body_item in node.body:\n if is_contained(node=body_item, to_check=(Yield, YieldFrom)):\n return True\n return False\n", "path": "wemake_python_styleguide/logic/tree/functions.py"}]}
| 1,394 | 284 |
gh_patches_debug_57811
|
rasdani/github-patches
|
git_diff
|
mozilla__pontoon-3117
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Hitting a server error when accessing a Tag page of a Tag without any resources associated to it
This is a regression from https://github.com/mozilla/pontoon/commit/1dcd7382221f7b943b9b743ee32322f7233f6a86.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pontoon/tags/utils.py`
Content:
```
1 from django.db.models import Q, Max, Sum
2
3 from pontoon.base.models import TranslatedResource, Translation
4 from pontoon.tags.models import Tag
5
6
7 class Tags:
8 """This provides an API for retrieving related ``Tags`` for given filters,
9 providing statistical information and latest activity data.
10 """
11
12 def __init__(self, **kwargs):
13 self.project = kwargs.get("project")
14 self.locale = kwargs.get("locale")
15 self.slug = kwargs.get("slug")
16 self.tag = Tag.objects.filter(project=self.project, slug=self.slug).first()
17
18 def get(self):
19 tags = (
20 Tag.objects.filter(project=self.project, resources__isnull=False)
21 .distinct()
22 .order_by("-priority", "name")
23 )
24
25 chart = self.chart(Q(), "resource__tag")
26 latest_activity = self.latest_activity(Q(), "resource__tag")
27 for tag in tags:
28 tag.chart = chart.get(tag.pk)
29 tag.latest_activity = latest_activity.get(tag.pk)
30
31 return tags
32
33 def get_tag_locales(self):
34 tag = self.tag
35
36 if tag is None:
37 return None
38
39 chart = self.chart(Q(resource__tag=self.tag), "resource__tag")
40 tag.chart = chart.get(tag.pk)
41 tag.locales = self.project.locales.all()
42
43 locale_chart = self.chart(Q(resource__tag=self.tag), "locale")
44 locale_latest_activity = self.latest_activity(
45 Q(resource__tag=self.tag), "locale"
46 )
47 for locale in tag.locales:
48 locale.chart = locale_chart.get(locale.pk)
49 locale.latest_activity = locale_latest_activity.get(locale.pk)
50
51 return tag
52
53 def chart(self, query, group_by):
54 trs = (
55 self.translated_resources.filter(query)
56 .values(group_by)
57 .annotate(
58 total_strings=Sum("resource__total_strings"),
59 approved_strings=Sum("approved_strings"),
60 pretranslated_strings=Sum("pretranslated_strings"),
61 strings_with_errors=Sum("strings_with_errors"),
62 strings_with_warnings=Sum("strings_with_warnings"),
63 unreviewed_strings=Sum("unreviewed_strings"),
64 )
65 )
66
67 return {
68 tr[group_by]: TranslatedResource.get_chart_dict(
69 TranslatedResource(**{key: tr[key] for key in list(tr.keys())[1:]})
70 )
71 for tr in trs
72 }
73
74 def latest_activity(self, query, group_by):
75 latest_activity = {}
76 dates = {}
77 translations = Translation.objects.none()
78
79 trs = (
80 self.translated_resources.exclude(latest_translation__isnull=True)
81 .filter(query)
82 .values(group_by)
83 .annotate(
84 date=Max("latest_translation__date"),
85 approved_date=Max("latest_translation__approved_date"),
86 )
87 )
88
89 for tr in trs:
90 date = max(tr["date"], tr["approved_date"] or tr["date"])
91 dates[date] = tr[group_by]
92 prefix = "entity__" if group_by == "resource__tag" else ""
93
94 # Find translations with matching date and tag/locale
95 translations |= Translation.objects.filter(
96 Q(**{"date": date, f"{prefix}{group_by}": tr[group_by]})
97 ).prefetch_related("user", "approved_user")
98
99 for t in translations:
100 key = dates[t.latest_activity["date"]]
101 latest_activity[key] = t.latest_activity
102
103 return latest_activity
104
105 @property
106 def translated_resources(self):
107 trs = TranslatedResource.objects
108
109 if self.project is not None:
110 trs = trs.filter(resource__project=self.project)
111
112 if self.locale is not None:
113 trs = trs.filter(locale=self.locale)
114
115 return trs
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pontoon/tags/utils.py b/pontoon/tags/utils.py
--- a/pontoon/tags/utils.py
+++ b/pontoon/tags/utils.py
@@ -13,7 +13,9 @@
self.project = kwargs.get("project")
self.locale = kwargs.get("locale")
self.slug = kwargs.get("slug")
- self.tag = Tag.objects.filter(project=self.project, slug=self.slug).first()
+ self.tag = Tag.objects.filter(
+ project=self.project, slug=self.slug, resources__isnull=False
+ ).first()
def get(self):
tags = (
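With this patch a tag that has no resources resolves to `None` in the `Tags` helper (the `resources__isnull=False` filter excludes it), so `get_tag_locales()` hits its early `return None` instead of proceeding with an empty tag. A small pure-Python sketch of that short-circuit, using stand-in data rather than the Django ORM:

```python
# Stand-in for Tag.objects.filter(..., resources__isnull=False).first():
# pick the first tag that has at least one resource, or None if there is none.
def first_tag_with_resources(tags):
    return next((tag for tag in tags if tag["resources"]), None)

tags = [{"slug": "empty-tag", "resources": []}]  # illustrative data
tag = first_tag_with_resources(tags)
assert tag is None  # get_tag_locales() then returns None rather than erroring
```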
|
{"golden_diff": "diff --git a/pontoon/tags/utils.py b/pontoon/tags/utils.py\n--- a/pontoon/tags/utils.py\n+++ b/pontoon/tags/utils.py\n@@ -13,7 +13,9 @@\n self.project = kwargs.get(\"project\")\n self.locale = kwargs.get(\"locale\")\n self.slug = kwargs.get(\"slug\")\n- self.tag = Tag.objects.filter(project=self.project, slug=self.slug).first()\n+ self.tag = Tag.objects.filter(\n+ project=self.project, slug=self.slug, resources__isnull=False\n+ ).first()\n \n def get(self):\n tags = (\n", "issue": "Hitting a server error when accessing a Tag page of a Tag without any resoures associated to it\nThis is a regression from https://github.com/mozilla/pontoon/commit/1dcd7382221f7b943b9b743ee32322f7233f6a86.\n", "before_files": [{"content": "from django.db.models import Q, Max, Sum\n\nfrom pontoon.base.models import TranslatedResource, Translation\nfrom pontoon.tags.models import Tag\n\n\nclass Tags:\n \"\"\"This provides an API for retrieving related ``Tags`` for given filters,\n providing statistical information and latest activity data.\n \"\"\"\n\n def __init__(self, **kwargs):\n self.project = kwargs.get(\"project\")\n self.locale = kwargs.get(\"locale\")\n self.slug = kwargs.get(\"slug\")\n self.tag = Tag.objects.filter(project=self.project, slug=self.slug).first()\n\n def get(self):\n tags = (\n Tag.objects.filter(project=self.project, resources__isnull=False)\n .distinct()\n .order_by(\"-priority\", \"name\")\n )\n\n chart = self.chart(Q(), \"resource__tag\")\n latest_activity = self.latest_activity(Q(), \"resource__tag\")\n for tag in tags:\n tag.chart = chart.get(tag.pk)\n tag.latest_activity = latest_activity.get(tag.pk)\n\n return tags\n\n def get_tag_locales(self):\n tag = self.tag\n\n if tag is None:\n return None\n\n chart = self.chart(Q(resource__tag=self.tag), \"resource__tag\")\n tag.chart = chart.get(tag.pk)\n tag.locales = self.project.locales.all()\n\n locale_chart = self.chart(Q(resource__tag=self.tag), \"locale\")\n locale_latest_activity = self.latest_activity(\n Q(resource__tag=self.tag), \"locale\"\n )\n for locale in tag.locales:\n locale.chart = locale_chart.get(locale.pk)\n locale.latest_activity = locale_latest_activity.get(locale.pk)\n\n return tag\n\n def chart(self, query, group_by):\n trs = (\n self.translated_resources.filter(query)\n .values(group_by)\n .annotate(\n total_strings=Sum(\"resource__total_strings\"),\n approved_strings=Sum(\"approved_strings\"),\n pretranslated_strings=Sum(\"pretranslated_strings\"),\n strings_with_errors=Sum(\"strings_with_errors\"),\n strings_with_warnings=Sum(\"strings_with_warnings\"),\n unreviewed_strings=Sum(\"unreviewed_strings\"),\n )\n )\n\n return {\n tr[group_by]: TranslatedResource.get_chart_dict(\n TranslatedResource(**{key: tr[key] for key in list(tr.keys())[1:]})\n )\n for tr in trs\n }\n\n def latest_activity(self, query, group_by):\n latest_activity = {}\n dates = {}\n translations = Translation.objects.none()\n\n trs = (\n self.translated_resources.exclude(latest_translation__isnull=True)\n .filter(query)\n .values(group_by)\n .annotate(\n date=Max(\"latest_translation__date\"),\n approved_date=Max(\"latest_translation__approved_date\"),\n )\n )\n\n for tr in trs:\n date = max(tr[\"date\"], tr[\"approved_date\"] or tr[\"date\"])\n dates[date] = tr[group_by]\n prefix = \"entity__\" if group_by == \"resource__tag\" else \"\"\n\n # Find translations with matching date and tag/locale\n translations |= Translation.objects.filter(\n Q(**{\"date\": date, f\"{prefix}{group_by}\": tr[group_by]})\n 
).prefetch_related(\"user\", \"approved_user\")\n\n for t in translations:\n key = dates[t.latest_activity[\"date\"]]\n latest_activity[key] = t.latest_activity\n\n return latest_activity\n\n @property\n def translated_resources(self):\n trs = TranslatedResource.objects\n\n if self.project is not None:\n trs = trs.filter(resource__project=self.project)\n\n if self.locale is not None:\n trs = trs.filter(locale=self.locale)\n\n return trs\n", "path": "pontoon/tags/utils.py"}], "after_files": [{"content": "from django.db.models import Q, Max, Sum\n\nfrom pontoon.base.models import TranslatedResource, Translation\nfrom pontoon.tags.models import Tag\n\n\nclass Tags:\n \"\"\"This provides an API for retrieving related ``Tags`` for given filters,\n providing statistical information and latest activity data.\n \"\"\"\n\n def __init__(self, **kwargs):\n self.project = kwargs.get(\"project\")\n self.locale = kwargs.get(\"locale\")\n self.slug = kwargs.get(\"slug\")\n self.tag = Tag.objects.filter(\n project=self.project, slug=self.slug, resources__isnull=False\n ).first()\n\n def get(self):\n tags = (\n Tag.objects.filter(project=self.project, resources__isnull=False)\n .distinct()\n .order_by(\"-priority\", \"name\")\n )\n\n chart = self.chart(Q(), \"resource__tag\")\n latest_activity = self.latest_activity(Q(), \"resource__tag\")\n for tag in tags:\n tag.chart = chart.get(tag.pk)\n tag.latest_activity = latest_activity.get(tag.pk)\n\n return tags\n\n def get_tag_locales(self):\n tag = self.tag\n\n if tag is None:\n return None\n\n chart = self.chart(Q(resource__tag=self.tag), \"resource__tag\")\n tag.chart = chart.get(tag.pk)\n tag.locales = self.project.locales.all()\n\n locale_chart = self.chart(Q(resource__tag=self.tag), \"locale\")\n locale_latest_activity = self.latest_activity(\n Q(resource__tag=self.tag), \"locale\"\n )\n for locale in tag.locales:\n locale.chart = locale_chart.get(locale.pk)\n locale.latest_activity = locale_latest_activity.get(locale.pk)\n\n return tag\n\n def chart(self, query, group_by):\n trs = (\n self.translated_resources.filter(query)\n .values(group_by)\n .annotate(\n total_strings=Sum(\"resource__total_strings\"),\n approved_strings=Sum(\"approved_strings\"),\n pretranslated_strings=Sum(\"pretranslated_strings\"),\n strings_with_errors=Sum(\"strings_with_errors\"),\n strings_with_warnings=Sum(\"strings_with_warnings\"),\n unreviewed_strings=Sum(\"unreviewed_strings\"),\n )\n )\n\n return {\n tr[group_by]: TranslatedResource.get_chart_dict(\n TranslatedResource(**{key: tr[key] for key in list(tr.keys())[1:]})\n )\n for tr in trs\n }\n\n def latest_activity(self, query, group_by):\n latest_activity = {}\n dates = {}\n translations = Translation.objects.none()\n\n trs = (\n self.translated_resources.exclude(latest_translation__isnull=True)\n .filter(query)\n .values(group_by)\n .annotate(\n date=Max(\"latest_translation__date\"),\n approved_date=Max(\"latest_translation__approved_date\"),\n )\n )\n\n for tr in trs:\n date = max(tr[\"date\"], tr[\"approved_date\"] or tr[\"date\"])\n dates[date] = tr[group_by]\n prefix = \"entity__\" if group_by == \"resource__tag\" else \"\"\n\n # Find translations with matching date and tag/locale\n translations |= Translation.objects.filter(\n Q(**{\"date\": date, f\"{prefix}{group_by}\": tr[group_by]})\n ).prefetch_related(\"user\", \"approved_user\")\n\n for t in translations:\n key = dates[t.latest_activity[\"date\"]]\n latest_activity[key] = t.latest_activity\n\n return latest_activity\n\n @property\n def 
translated_resources(self):\n trs = TranslatedResource.objects\n\n if self.project is not None:\n trs = trs.filter(resource__project=self.project)\n\n if self.locale is not None:\n trs = trs.filter(locale=self.locale)\n\n return trs\n", "path": "pontoon/tags/utils.py"}]}
| 1,379 | 133 |
gh_patches_debug_12816 | rasdani/github-patches | git_diff | ansible__ansible-3017 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support scripts dir in roles
Could we extend roles to support scripts?
So
```
- script: myscript.sh
```
Gets automagically sucked up from roles/myrole/scripts/myscript.sh
?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/ansible/runner/action_plugins/script.py`
Content:
```
1 # (c) 2012, Michael DeHaan <[email protected]>
2 #
3 # This file is part of Ansible
4 #
5 # Ansible is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU General Public License as published by
7 # the Free Software Foundation, either version 3 of the License, or
8 # (at your option) any later version.
9 #
10 # Ansible is distributed in the hope that it will be useful,
11 # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 # GNU General Public License for more details.
14 #
15 # You should have received a copy of the GNU General Public License
16 # along with Ansible. If not, see <http://www.gnu.org/licenses/>.
17
18 import os
19 import shlex
20
21 import ansible.constants as C
22 from ansible.utils import template
23 from ansible import utils
24 from ansible import errors
25 from ansible.runner.return_data import ReturnData
26
27 class ActionModule(object):
28
29 def __init__(self, runner):
30 self.runner = runner
31
32 def run(self, conn, tmp, module_name, module_args, inject, complex_args=None, **kwargs):
33 ''' handler for file transfer operations '''
34
35 if self.runner.check:
36 # in check mode, always skip this module
37 return ReturnData(conn=conn, comm_ok=True, result=dict(skipped=True, msg='check mode not supported for this module'))
38
39 tokens = shlex.split(module_args)
40 source = tokens[0]
41 # FIXME: error handling
42 args = " ".join(tokens[1:])
43 source = template.template(self.runner.basedir, source, inject)
44 source = utils.path_dwim(self.runner.basedir, source)
45
46 # transfer the file to a remote tmp location
47 source = source.replace('\x00','') # why does this happen here?
48 args = args.replace('\x00','') # why does this happen here?
49 tmp_src = os.path.join(tmp, os.path.basename(source))
50 tmp_src = tmp_src.replace('\x00', '')
51
52 conn.put_file(source, tmp_src)
53
54 # fix file permissions when the copy is done as a different user
55 if self.runner.sudo and self.runner.sudo_user != 'root':
56 prepcmd = 'chmod a+rx %s' % tmp_src
57 else:
58 prepcmd = 'chmod +x %s' % tmp_src
59
60 # add preparation steps to one ssh roundtrip executing the script
61 module_args = prepcmd + '; ' + tmp_src + ' ' + args
62
63 handler = utils.plugins.action_loader.get('raw', self.runner)
64 result = handler.run(conn, tmp, 'raw', module_args, inject)
65
66 # clean up after
67 if tmp.find("tmp") != -1 and C.DEFAULT_KEEP_REMOTE_FILES != '1':
68 self.runner._low_level_exec_command(conn, 'rm -rf %s >/dev/null 2>&1' % tmp, tmp)
69
70 return result
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lib/ansible/runner/action_plugins/script.py b/lib/ansible/runner/action_plugins/script.py
--- a/lib/ansible/runner/action_plugins/script.py
+++ b/lib/ansible/runner/action_plugins/script.py
@@ -41,7 +41,10 @@
# FIXME: error handling
args = " ".join(tokens[1:])
source = template.template(self.runner.basedir, source, inject)
- source = utils.path_dwim(self.runner.basedir, source)
+ if '_original_file' in inject:
+ source = utils.path_dwim_relative(inject['_original_file'], 'files', source, self.runner.basedir)
+ else:
+ source = utils.path_dwim(self.runner.basedir, source)
# transfer the file to a remote tmp location
source = source.replace('\x00','') # why does this happen here?
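The patch makes the `script` action resolve its source relative to the calling role whenever `_original_file` is present in `inject`, falling back to the old playbook-relative lookup otherwise. A rough, self-contained sketch of that lookup order (the real `utils.path_dwim_relative` does more; the helper and paths below are illustrative):

```python
import os

def resolve_script(source, basedir, original_file=None):
    if original_file is not None:
        # e.g. .../roles/myrole/tasks/main.yml -> .../roles/myrole/files/<source>
        role_dir = os.path.dirname(os.path.dirname(original_file))
        return os.path.join(role_dir, 'files', source)
    return os.path.join(basedir, source)

print(resolve_script('myscript.sh', '/play'))
# -> /play/myscript.sh
print(resolve_script('myscript.sh', '/play',
                     original_file='/play/roles/myrole/tasks/main.yml'))
# -> /play/roles/myrole/files/myscript.sh
```

Note that the merged change looks in the role's `files/` directory rather than the `scripts/` directory the issue proposed.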
|
{"golden_diff": "diff --git a/lib/ansible/runner/action_plugins/script.py b/lib/ansible/runner/action_plugins/script.py\n--- a/lib/ansible/runner/action_plugins/script.py\n+++ b/lib/ansible/runner/action_plugins/script.py\n@@ -41,7 +41,10 @@\n # FIXME: error handling\n args = \" \".join(tokens[1:])\n source = template.template(self.runner.basedir, source, inject)\n- source = utils.path_dwim(self.runner.basedir, source)\n+ if '_original_file' in inject:\n+ source = utils.path_dwim_relative(inject['_original_file'], 'files', source, self.runner.basedir)\n+ else:\n+ source = utils.path_dwim(self.runner.basedir, source)\n \n # transfer the file to a remote tmp location\n source = source.replace('\\x00','') # why does this happen here?\n", "issue": "Support scripts dir in roles\nCould we extend roles to support scripts?\n\nSo \n\n```\n- script: myscript.sh\n```\n\nGets automagically sucked up from, roles/myrole/scripts/myscript.sh\n\n?\n\n", "before_files": [{"content": "# (c) 2012, Michael DeHaan <[email protected]>\n#\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Ansible is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Ansible. If not, see <http://www.gnu.org/licenses/>.\n\nimport os\nimport shlex\n\nimport ansible.constants as C\nfrom ansible.utils import template\nfrom ansible import utils\nfrom ansible import errors\nfrom ansible.runner.return_data import ReturnData\n\nclass ActionModule(object):\n\n def __init__(self, runner):\n self.runner = runner\n\n def run(self, conn, tmp, module_name, module_args, inject, complex_args=None, **kwargs):\n ''' handler for file transfer operations '''\n\n if self.runner.check:\n # in check mode, always skip this module\n return ReturnData(conn=conn, comm_ok=True, result=dict(skipped=True, msg='check mode not supported for this module'))\n\n tokens = shlex.split(module_args)\n source = tokens[0]\n # FIXME: error handling\n args = \" \".join(tokens[1:])\n source = template.template(self.runner.basedir, source, inject)\n source = utils.path_dwim(self.runner.basedir, source)\n\n # transfer the file to a remote tmp location\n source = source.replace('\\x00','') # why does this happen here?\n args = args.replace('\\x00','') # why does this happen here?\n tmp_src = os.path.join(tmp, os.path.basename(source))\n tmp_src = tmp_src.replace('\\x00', '') \n\n conn.put_file(source, tmp_src)\n\n # fix file permissions when the copy is done as a different user\n if self.runner.sudo and self.runner.sudo_user != 'root':\n prepcmd = 'chmod a+rx %s' % tmp_src\n else:\n prepcmd = 'chmod +x %s' % tmp_src\n\n # add preparation steps to one ssh roundtrip executing the script\n module_args = prepcmd + '; ' + tmp_src + ' ' + args\n\n handler = utils.plugins.action_loader.get('raw', self.runner)\n result = handler.run(conn, tmp, 'raw', module_args, inject)\n\n # clean up after\n if tmp.find(\"tmp\") != -1 and C.DEFAULT_KEEP_REMOTE_FILES != '1':\n self.runner._low_level_exec_command(conn, 'rm -rf %s >/dev/null 2>&1' % tmp, tmp)\n\n return result\n", "path": 
"lib/ansible/runner/action_plugins/script.py"}], "after_files": [{"content": "# (c) 2012, Michael DeHaan <[email protected]>\n#\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Ansible is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Ansible. If not, see <http://www.gnu.org/licenses/>.\n\nimport os\nimport shlex\n\nimport ansible.constants as C\nfrom ansible.utils import template\nfrom ansible import utils\nfrom ansible import errors\nfrom ansible.runner.return_data import ReturnData\n\nclass ActionModule(object):\n\n def __init__(self, runner):\n self.runner = runner\n\n def run(self, conn, tmp, module_name, module_args, inject, complex_args=None, **kwargs):\n ''' handler for file transfer operations '''\n\n if self.runner.check:\n # in check mode, always skip this module\n return ReturnData(conn=conn, comm_ok=True, result=dict(skipped=True, msg='check mode not supported for this module'))\n\n tokens = shlex.split(module_args)\n source = tokens[0]\n # FIXME: error handling\n args = \" \".join(tokens[1:])\n source = template.template(self.runner.basedir, source, inject)\n if '_original_file' in inject:\n source = utils.path_dwim_relative(inject['_original_file'], 'files', source, self.runner.basedir)\n else:\n source = utils.path_dwim(self.runner.basedir, source)\n\n # transfer the file to a remote tmp location\n source = source.replace('\\x00','') # why does this happen here?\n args = args.replace('\\x00','') # why does this happen here?\n tmp_src = os.path.join(tmp, os.path.basename(source))\n tmp_src = tmp_src.replace('\\x00', '') \n\n conn.put_file(source, tmp_src)\n\n # fix file permissions when the copy is done as a different user\n if self.runner.sudo and self.runner.sudo_user != 'root':\n prepcmd = 'chmod a+rx %s' % tmp_src\n else:\n prepcmd = 'chmod +x %s' % tmp_src\n\n # add preparation steps to one ssh roundtrip executing the script\n module_args = prepcmd + '; ' + tmp_src + ' ' + args\n\n handler = utils.plugins.action_loader.get('raw', self.runner)\n result = handler.run(conn, tmp, 'raw', module_args, inject)\n\n # clean up after\n if tmp.find(\"tmp\") != -1 and C.DEFAULT_KEEP_REMOTE_FILES != '1':\n self.runner._low_level_exec_command(conn, 'rm -rf %s >/dev/null 2>&1' % tmp, tmp)\n\n return result\n", "path": "lib/ansible/runner/action_plugins/script.py"}]}
| 1,108 | 199 |
gh_patches_debug_21836 | rasdani/github-patches | git_diff | uccser__cs-unplugged-82 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Always ask for page size and copies for resource generation
Add these questions to `base-resource.html` template
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `csunplugged/resources/views/sorting_network.py`
Content:
```
1 from PIL import Image, ImageDraw, ImageFont
2 from random import sample
3
4
5 def resource_image(get_request, resource):
6 """Creates a image for Sorting Network resource.
7
8 Returns:
9 A Pillow image object.
10 """
11 image_path = 'static/img/resource-sorting-network-colour.png'
12 image = Image.open(image_path)
13 draw = ImageDraw.Draw(image)
14
15 (range_min, range_max, font_size) = number_range(get_request)
16
17 font_path = 'static/fonts/PatrickHand-Regular.ttf'
18
19 # Add numbers to text if needed
20 if get_request['prefilled_values'] != 'blank':
21 font = ImageFont.truetype(font_path, font_size)
22 numbers = sample(range(range_min, range_max), 6)
23 base_coord_x = 70
24 base_coord_y = 2560
25 coord_x_increment = 204
26 for number in numbers:
27 text = str(number)
28 text_width, text_height = draw.textsize(text, font=font)
29 coord_x = base_coord_x - (text_width / 2)
30 coord_y = base_coord_y - (text_height / 2)
31 draw.text(
32 (coord_x, coord_y),
33 text,
34 font=font,
35 fill='#000'
36 )
37 base_coord_x += coord_x_increment
38
39 return image
40
41
42 def subtitle(get_request, resource):
43 """Returns the subtitle string of the resource.
44
45 Used after the resource name in the filename, and
46 also on the resource image.
47 """
48 TEMPLATE = '{} to {}'
49 range_min, range_max, font_size = number_range(get_request)
50 text = TEMPLATE.format(range_min, range_max - 1)
51 return text
52
53
54 def number_range(get_request):
55 """Returns a tuple of (range_min, range_max, font_size)
56 for the requested resource.
57 """
58 prefilled_values = get_request['prefilled_values']
59 if prefilled_values == 'easy':
60 range_min = 1
61 range_max = 10
62 font_size = 150
63 elif prefilled_values == 'medium':
64 range_min = 10
65 range_max = 100
66 font_size = 120
67 elif prefilled_values == 'hard':
68 range_min = 100
69 range_max = 1000
70 font_size = 90
71 return (range_min, range_max, font_size)
72
```
Path: `csunplugged/resources/views/generate_resource_pdf.py`
Content:
```
1 from django.http import HttpResponse
2 from django.template.loader import render_to_string
3 from django.contrib.staticfiles import finders
4 from multiprocessing import Pool
5 from functools import partial
6 from weasyprint import HTML, CSS
7 from PIL import Image
8 from io import BytesIO
9 import importlib
10 import base64
11
12 RESPONSE_CONTENT_DISPOSITION = 'attachment; filename="{filename}.pdf"'
13 MM_TO_PIXEL_RATIO = 3.78
14
15
16 def generate_resource_pdf(request, resource, module_path):
17 """Returns a response containing a randomly generated PDF resource.
18
19 Returns:
20 HTTP Response containing generated resource PDF
21 """
22 context = dict()
23 get_request = request.GET
24 context['paper_size'] = get_request['paper_size']
25 context['resource'] = resource
26
27 resource_image_generator = importlib.import_module(module_path)
28 filename = '{} ({})'.format(resource.name, resource_image_generator.subtitle(get_request, resource))
29 context['filename'] = filename
30
31 num_copies = range(0, int(get_request['copies']))
32 image_generator = partial(
33 generate_resource_image,
34 get_request,
35 resource,
36 module_path
37 )
38 with Pool() as pool:
39 context['resource_images'] = pool.map(image_generator, num_copies)
40 pool.close()
41
42 pdf_html = render_to_string('resources/base-resource-pdf.html', context)
43 html = HTML(string=pdf_html, base_url=request.build_absolute_uri())
44 css_file = finders.find('css/print-resource-pdf.css')
45 css_string = open(css_file, encoding='UTF-8').read()
46 base_css = CSS(string=css_string)
47 pdf_file = html.write_pdf(stylesheets=[base_css])
48
49 response = HttpResponse(pdf_file, content_type='application/pdf')
50 response['Content-Disposition'] = RESPONSE_CONTENT_DISPOSITION.format(filename=filename)
51 return response
52
53
54 def generate_resource_image(get_request, resource, module_path, copy_num):
55 """Calls the resource's image generator and returns the generated
56 image. This function also resizes the generated image for the paper
57 size requested.
58
59 Returns:
60 Base 64 string of a generated resource image.
61 """
62 # Get image from resource image creator
63 resource_image_generator = importlib.import_module(module_path)
64 image = resource_image_generator.resource_image(get_request, resource)
65
66 # Resize image to reduce file size
67 if get_request['paper_size'] == "a4":
68 max_pixel_height = 267 * MM_TO_PIXEL_RATIO
69 elif get_request['paper_size'] == "letter":
70 max_pixel_height = 249 * MM_TO_PIXEL_RATIO
71 (width, height) = image.size
72 if height > max_pixel_height:
73 ratio = max_pixel_height / height
74 width *= ratio
75 height *= ratio
76 image = image.resize((int(width), int(height)), Image.ANTIALIAS)
77
78 # Save image to buffer
79 image_buffer = BytesIO()
80 image.save(image_buffer, format='PNG')
81
82 # Return base64 of image
83 return base64.b64encode(image_buffer.getvalue())
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/csunplugged/resources/views/generate_resource_pdf.py b/csunplugged/resources/views/generate_resource_pdf.py
--- a/csunplugged/resources/views/generate_resource_pdf.py
+++ b/csunplugged/resources/views/generate_resource_pdf.py
@@ -23,6 +23,7 @@
get_request = request.GET
context['paper_size'] = get_request['paper_size']
context['resource'] = resource
+ context['header_text'] = get_request['header_text']
resource_image_generator = importlib.import_module(module_path)
filename = '{} ({})'.format(resource.name, resource_image_generator.subtitle(get_request, resource))
diff --git a/csunplugged/resources/views/sorting_network.py b/csunplugged/resources/views/sorting_network.py
--- a/csunplugged/resources/views/sorting_network.py
+++ b/csunplugged/resources/views/sorting_network.py
@@ -56,10 +56,12 @@
for the requested resource.
"""
prefilled_values = get_request['prefilled_values']
+ range_min = 0
+ range_max = 0
+ font_size = 150
if prefilled_values == 'easy':
range_min = 1
range_max = 10
- font_size = 150
elif prefilled_values == 'medium':
range_min = 10
range_max = 100
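Besides passing `header_text` through to the template context, the patch gives `number_range()` defaults so that a request whose `prefilled_values` matches none of the branches (e.g. `blank`) no longer leaves the variables unbound. A compact, runnable sketch of the resulting behaviour (function body condensed; values taken from the diff):

```python
def number_range(prefilled_values):
    range_min, range_max, font_size = 0, 0, 150  # new defaults from the patch
    if prefilled_values == 'easy':
        range_min, range_max = 1, 10
    elif prefilled_values == 'medium':
        range_min, range_max, font_size = 10, 100, 120
    elif prefilled_values == 'hard':
        range_min, range_max, font_size = 100, 1000, 90
    return range_min, range_max, font_size

assert number_range('blank') == (0, 0, 150)   # previously an UnboundLocalError
assert number_range('hard') == (100, 1000, 90)
```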
|
{"golden_diff": "diff --git a/csunplugged/resources/views/generate_resource_pdf.py b/csunplugged/resources/views/generate_resource_pdf.py\n--- a/csunplugged/resources/views/generate_resource_pdf.py\n+++ b/csunplugged/resources/views/generate_resource_pdf.py\n@@ -23,6 +23,7 @@\n get_request = request.GET\n context['paper_size'] = get_request['paper_size']\n context['resource'] = resource\n+ context['header_text'] = get_request['header_text']\n \n resource_image_generator = importlib.import_module(module_path)\n filename = '{} ({})'.format(resource.name, resource_image_generator.subtitle(get_request, resource))\ndiff --git a/csunplugged/resources/views/sorting_network.py b/csunplugged/resources/views/sorting_network.py\n--- a/csunplugged/resources/views/sorting_network.py\n+++ b/csunplugged/resources/views/sorting_network.py\n@@ -56,10 +56,12 @@\n for the requested resource.\n \"\"\"\n prefilled_values = get_request['prefilled_values']\n+ range_min = 0\n+ range_max = 0\n+ font_size = 150\n if prefilled_values == 'easy':\n range_min = 1\n range_max = 10\n- font_size = 150\n elif prefilled_values == 'medium':\n range_min = 10\n range_max = 100\n", "issue": "Always ask for page size and copies for resource generation\nAdd these questions to `base-resource.html` template\n", "before_files": [{"content": "from PIL import Image, ImageDraw, ImageFont\nfrom random import sample\n\n\ndef resource_image(get_request, resource):\n \"\"\"Creates a image for Sorting Network resource.\n\n Returns:\n A Pillow image object.\n \"\"\"\n image_path = 'static/img/resource-sorting-network-colour.png'\n image = Image.open(image_path)\n draw = ImageDraw.Draw(image)\n\n (range_min, range_max, font_size) = number_range(get_request)\n\n font_path = 'static/fonts/PatrickHand-Regular.ttf'\n\n # Add numbers to text if needed\n if get_request['prefilled_values'] != 'blank':\n font = ImageFont.truetype(font_path, font_size)\n numbers = sample(range(range_min, range_max), 6)\n base_coord_x = 70\n base_coord_y = 2560\n coord_x_increment = 204\n for number in numbers:\n text = str(number)\n text_width, text_height = draw.textsize(text, font=font)\n coord_x = base_coord_x - (text_width / 2)\n coord_y = base_coord_y - (text_height / 2)\n draw.text(\n (coord_x, coord_y),\n text,\n font=font,\n fill='#000'\n )\n base_coord_x += coord_x_increment\n\n return image\n\n\ndef subtitle(get_request, resource):\n \"\"\"Returns the subtitle string of the resource.\n\n Used after the resource name in the filename, and\n also on the resource image.\n \"\"\"\n TEMPLATE = '{} to {}'\n range_min, range_max, font_size = number_range(get_request)\n text = TEMPLATE.format(range_min, range_max - 1)\n return text\n\n\ndef number_range(get_request):\n \"\"\"Returns a tuple of (range_min, range_max, font_size)\n for the requested resource.\n \"\"\"\n prefilled_values = get_request['prefilled_values']\n if prefilled_values == 'easy':\n range_min = 1\n range_max = 10\n font_size = 150\n elif prefilled_values == 'medium':\n range_min = 10\n range_max = 100\n font_size = 120\n elif prefilled_values == 'hard':\n range_min = 100\n range_max = 1000\n font_size = 90\n return (range_min, range_max, font_size)\n", "path": "csunplugged/resources/views/sorting_network.py"}, {"content": "from django.http import HttpResponse\nfrom django.template.loader import render_to_string\nfrom django.contrib.staticfiles import finders\nfrom multiprocessing import Pool\nfrom functools import partial\nfrom weasyprint import HTML, CSS\nfrom PIL import Image\nfrom io import 
BytesIO\nimport importlib\nimport base64\n\nRESPONSE_CONTENT_DISPOSITION = 'attachment; filename=\"{filename}.pdf\"'\nMM_TO_PIXEL_RATIO = 3.78\n\n\ndef generate_resource_pdf(request, resource, module_path):\n \"\"\"Returns a response containing a randomly generated PDF resource.\n\n Returns:\n HTTP Response containing generated resource PDF\n \"\"\"\n context = dict()\n get_request = request.GET\n context['paper_size'] = get_request['paper_size']\n context['resource'] = resource\n\n resource_image_generator = importlib.import_module(module_path)\n filename = '{} ({})'.format(resource.name, resource_image_generator.subtitle(get_request, resource))\n context['filename'] = filename\n\n num_copies = range(0, int(get_request['copies']))\n image_generator = partial(\n generate_resource_image,\n get_request,\n resource,\n module_path\n )\n with Pool() as pool:\n context['resource_images'] = pool.map(image_generator, num_copies)\n pool.close()\n\n pdf_html = render_to_string('resources/base-resource-pdf.html', context)\n html = HTML(string=pdf_html, base_url=request.build_absolute_uri())\n css_file = finders.find('css/print-resource-pdf.css')\n css_string = open(css_file, encoding='UTF-8').read()\n base_css = CSS(string=css_string)\n pdf_file = html.write_pdf(stylesheets=[base_css])\n\n response = HttpResponse(pdf_file, content_type='application/pdf')\n response['Content-Disposition'] = RESPONSE_CONTENT_DISPOSITION.format(filename=filename)\n return response\n\n\ndef generate_resource_image(get_request, resource, module_path, copy_num):\n \"\"\"Calls the resource's image generator and returns the generated\n image. This function also resizes the generated image for the paper\n size requested.\n\n Returns:\n Base 64 string of a generated resource image.\n \"\"\"\n # Get image from resource image creator\n resource_image_generator = importlib.import_module(module_path)\n image = resource_image_generator.resource_image(get_request, resource)\n\n # Resize image to reduce file size\n if get_request['paper_size'] == \"a4\":\n max_pixel_height = 267 * MM_TO_PIXEL_RATIO\n elif get_request['paper_size'] == \"letter\":\n max_pixel_height = 249 * MM_TO_PIXEL_RATIO\n (width, height) = image.size\n if height > max_pixel_height:\n ratio = max_pixel_height / height\n width *= ratio\n height *= ratio\n image = image.resize((int(width), int(height)), Image.ANTIALIAS)\n\n # Save image to buffer\n image_buffer = BytesIO()\n image.save(image_buffer, format='PNG')\n\n # Return base64 of image\n return base64.b64encode(image_buffer.getvalue())\n", "path": "csunplugged/resources/views/generate_resource_pdf.py"}], "after_files": [{"content": "from PIL import Image, ImageDraw, ImageFont\nfrom random import sample\n\n\ndef resource_image(get_request, resource):\n \"\"\"Creates a image for Sorting Network resource.\n\n Returns:\n A Pillow image object.\n \"\"\"\n image_path = 'static/img/resource-sorting-network-colour.png'\n image = Image.open(image_path)\n draw = ImageDraw.Draw(image)\n\n (range_min, range_max, font_size) = number_range(get_request)\n\n font_path = 'static/fonts/PatrickHand-Regular.ttf'\n\n # Add numbers to text if needed\n if get_request['prefilled_values'] != 'blank':\n font = ImageFont.truetype(font_path, font_size)\n numbers = sample(range(range_min, range_max), 6)\n base_coord_x = 70\n base_coord_y = 2560\n coord_x_increment = 204\n for number in numbers:\n text = str(number)\n text_width, text_height = draw.textsize(text, font=font)\n coord_x = base_coord_x - (text_width / 2)\n coord_y = 
base_coord_y - (text_height / 2)\n draw.text(\n (coord_x, coord_y),\n text,\n font=font,\n fill='#000'\n )\n base_coord_x += coord_x_increment\n\n return image\n\n\ndef subtitle(get_request, resource):\n \"\"\"Returns the subtitle string of the resource.\n\n Used after the resource name in the filename, and\n also on the resource image.\n \"\"\"\n TEMPLATE = '{} to {}'\n range_min, range_max, font_size = number_range(get_request)\n text = TEMPLATE.format(range_min, range_max - 1)\n return text\n\n\ndef number_range(get_request):\n \"\"\"Returns a tuple of (range_min, range_max, font_size)\n for the requested resource.\n \"\"\"\n prefilled_values = get_request['prefilled_values']\n range_min = 0\n range_max = 0\n font_size = 150\n if prefilled_values == 'easy':\n range_min = 1\n range_max = 10\n elif prefilled_values == 'medium':\n range_min = 10\n range_max = 100\n font_size = 120\n elif prefilled_values == 'hard':\n range_min = 100\n range_max = 1000\n font_size = 90\n return (range_min, range_max, font_size)\n", "path": "csunplugged/resources/views/sorting_network.py"}, {"content": "from django.http import HttpResponse\nfrom django.template.loader import render_to_string\nfrom django.contrib.staticfiles import finders\nfrom multiprocessing import Pool\nfrom functools import partial\nfrom weasyprint import HTML, CSS\nfrom PIL import Image\nfrom io import BytesIO\nimport importlib\nimport base64\n\nRESPONSE_CONTENT_DISPOSITION = 'attachment; filename=\"{filename}.pdf\"'\nMM_TO_PIXEL_RATIO = 3.78\n\n\ndef generate_resource_pdf(request, resource, module_path):\n \"\"\"Returns a response containing a randomly generated PDF resource.\n\n Returns:\n HTTP Response containing generated resource PDF\n \"\"\"\n context = dict()\n get_request = request.GET\n context['paper_size'] = get_request['paper_size']\n context['resource'] = resource\n context['header_text'] = get_request['header_text']\n\n resource_image_generator = importlib.import_module(module_path)\n filename = '{} ({})'.format(resource.name, resource_image_generator.subtitle(get_request, resource))\n context['filename'] = filename\n\n num_copies = range(0, int(get_request['copies']))\n image_generator = partial(\n generate_resource_image,\n get_request,\n resource,\n module_path\n )\n with Pool() as pool:\n context['resource_images'] = pool.map(image_generator, num_copies)\n pool.close()\n\n pdf_html = render_to_string('resources/base-resource-pdf.html', context)\n html = HTML(string=pdf_html, base_url=request.build_absolute_uri())\n css_file = finders.find('css/print-resource-pdf.css')\n css_string = open(css_file, encoding='UTF-8').read()\n base_css = CSS(string=css_string)\n pdf_file = html.write_pdf(stylesheets=[base_css])\n\n response = HttpResponse(pdf_file, content_type='application/pdf')\n response['Content-Disposition'] = RESPONSE_CONTENT_DISPOSITION.format(filename=filename)\n return response\n\n\ndef generate_resource_image(get_request, resource, module_path, copy_num):\n \"\"\"Calls the resource's image generator and returns the generated\n image. 
This function also resizes the generated image for the paper\n size requested.\n\n Returns:\n Base 64 string of a generated resource image.\n \"\"\"\n # Get image from resource image creator\n resource_image_generator = importlib.import_module(module_path)\n image = resource_image_generator.resource_image(get_request, resource)\n\n # Resize image to reduce file size\n if get_request['paper_size'] == \"a4\":\n max_pixel_height = 267 * MM_TO_PIXEL_RATIO\n elif get_request['paper_size'] == \"letter\":\n max_pixel_height = 249 * MM_TO_PIXEL_RATIO\n (width, height) = image.size\n if height > max_pixel_height:\n ratio = max_pixel_height / height\n width *= ratio\n height *= ratio\n image = image.resize((int(width), int(height)), Image.ANTIALIAS)\n\n # Save image to buffer\n image_buffer = BytesIO()\n image.save(image_buffer, format='PNG')\n\n # Return base64 of image\n return base64.b64encode(image_buffer.getvalue())\n", "path": "csunplugged/resources/views/generate_resource_pdf.py"}]}
| 1,794 | 315 |
gh_patches_debug_2045 | rasdani/github-patches | git_diff | dotkom__onlineweb4-1220 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tags with a '.' will crash
Ref. http://moonshine.online.ntnu.no/article/10/online-far-ny-nettside
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/article/admin.py`
Content:
```
1 from django.contrib import admin
2 from apps.article.models import Article, Tag, ArticleTag
3 from django.conf import settings
4 from filebrowser.settings import VERSIONS, ADMIN_THUMBNAIL
5
6
7 class ArticleTagAdmin(admin.ModelAdmin):
8 model = ArticleTag
9
10
11 class ArticleTagInline(admin.TabularInline):
12 model = ArticleTag
13 max_num = 99
14 extra = 0
15
16
17 class TagAdmin(admin.ModelAdmin):
18 def save_model(self, request, obj, form, change):
19 obj.changed_by = request.user
20 if not change:
21 obj.created_by = request.user
22 obj.save()
23
24
25 class ArticleAdmin(admin.ModelAdmin):
26 inlines = (ArticleTagInline,)
27 list_display = ("heading", "created_by", "changed_by")
28
29 # set the created and changed by fields
30 def save_model(self, request, obj, form, change):
31 if (obj.image):
32 obj.image.version_generate(ADMIN_THUMBNAIL).url
33
34 # Itterate the different versions (by key)
35 for ver in VERSIONS.keys():
36 # Check if the key start with article_ (if it does, we want to crop to that size)
37 if ver.startswith('article_'):
38 obj.image.version_generate(ver).url
39
40 obj.changed_by = request.user
41
42 if not change:
43 obj.created_by = request.user
44 obj.save()
45
46 def save_formset(self, request, form, formset, change):
47 instances = formset.save(commit=False)
48 for instances in instances:
49 instances.save()
50
51 admin.site.register(Article, ArticleAdmin)
52 admin.site.register(Tag, TagAdmin)
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/apps/article/admin.py b/apps/article/admin.py
--- a/apps/article/admin.py
+++ b/apps/article/admin.py
@@ -18,6 +18,7 @@
def save_model(self, request, obj, form, change):
obj.changed_by = request.user
if not change:
+ obj.name = obj.name.replace('.', '')
obj.created_by = request.user
obj.save()
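The patch simply strips `.` characters from a new tag's name when it is saved through `TagAdmin`, which is what the linked crash report is about (presumably the dot broke the tag's URL, though the record does not spell that out). A one-line sketch of that sanitisation (the sample value is illustrative):

```python
def sanitise_tag_name(name: str) -> str:
    # mirrors obj.name = obj.name.replace('.', '') in TagAdmin.save_model
    return name.replace('.', '')

assert sanitise_tag_name('web 2.0') == 'web 20'
```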
|
{"golden_diff": "diff --git a/apps/article/admin.py b/apps/article/admin.py\n--- a/apps/article/admin.py\n+++ b/apps/article/admin.py\n@@ -18,6 +18,7 @@\n def save_model(self, request, obj, form, change):\n obj.changed_by = request.user\n if not change:\n+ obj.name = obj.name.replace('.', '')\n obj.created_by = request.user\n obj.save()\n", "issue": "Tags with a '.' will crash\nRef. http://moonshine.online.ntnu.no/article/10/online-far-ny-nettside\n\n", "before_files": [{"content": "from django.contrib import admin\nfrom apps.article.models import Article, Tag, ArticleTag\nfrom django.conf import settings\nfrom filebrowser.settings import VERSIONS, ADMIN_THUMBNAIL\n\n\nclass ArticleTagAdmin(admin.ModelAdmin):\n model = ArticleTag\n\n\nclass ArticleTagInline(admin.TabularInline):\n model = ArticleTag\n max_num = 99\n extra = 0\n\n\nclass TagAdmin(admin.ModelAdmin):\n def save_model(self, request, obj, form, change):\n obj.changed_by = request.user\n if not change:\n obj.created_by = request.user\n obj.save()\n\n\nclass ArticleAdmin(admin.ModelAdmin):\n inlines = (ArticleTagInline,)\n list_display = (\"heading\", \"created_by\", \"changed_by\")\n\n # set the created and changed by fields\n def save_model(self, request, obj, form, change):\n if (obj.image):\n obj.image.version_generate(ADMIN_THUMBNAIL).url\n\n # Itterate the different versions (by key)\n for ver in VERSIONS.keys():\n # Check if the key start with article_ (if it does, we want to crop to that size)\n if ver.startswith('article_'):\n obj.image.version_generate(ver).url\n\n obj.changed_by = request.user\n\n if not change:\n obj.created_by = request.user\n obj.save()\n\n def save_formset(self, request, form, formset, change):\n instances = formset.save(commit=False)\n for instances in instances:\n instances.save()\n\nadmin.site.register(Article, ArticleAdmin)\nadmin.site.register(Tag, TagAdmin)\n", "path": "apps/article/admin.py"}], "after_files": [{"content": "from django.contrib import admin\nfrom apps.article.models import Article, Tag, ArticleTag\nfrom django.conf import settings\nfrom filebrowser.settings import VERSIONS, ADMIN_THUMBNAIL\n\n\nclass ArticleTagAdmin(admin.ModelAdmin):\n model = ArticleTag\n\n\nclass ArticleTagInline(admin.TabularInline):\n model = ArticleTag\n max_num = 99\n extra = 0\n\n\nclass TagAdmin(admin.ModelAdmin):\n def save_model(self, request, obj, form, change):\n obj.changed_by = request.user\n if not change:\n obj.name = obj.name.replace('.', '')\n obj.created_by = request.user\n obj.save()\n\n\nclass ArticleAdmin(admin.ModelAdmin):\n inlines = (ArticleTagInline,)\n list_display = (\"heading\", \"created_by\", \"changed_by\")\n\n # set the created and changed by fields\n def save_model(self, request, obj, form, change):\n if (obj.image):\n obj.image.version_generate(ADMIN_THUMBNAIL).url\n\n # Itterate the different versions (by key)\n for ver in VERSIONS.keys():\n # Check if the key start with article_ (if it does, we want to crop to that size)\n if ver.startswith('article_'):\n obj.image.version_generate(ver).url\n\n obj.changed_by = request.user\n\n if not change:\n obj.created_by = request.user\n obj.save()\n\n def save_formset(self, request, form, formset, change):\n instances = formset.save(commit=False)\n for instances in instances:\n instances.save()\n\nadmin.site.register(Article, ArticleAdmin)\nadmin.site.register(Tag, TagAdmin)\n", "path": "apps/article/admin.py"}]}
| 736 | 89 |
gh_patches_debug_14853 | rasdani/github-patches | git_diff | Cloud-CV__EvalAI-1276 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Disable the Challenge Phase buttons which are inactive while making submissions
## Observed Behaviour
Currently, if a challenge phase is inactive then also a participant can select the phase to make submissions.
See screenshot for example:
<img width="1440" alt="screen shot 2017-08-08 at 7 21 16 pm" src="https://user-images.githubusercontent.com/2945708/29098709-ca4c67a8-7c6e-11e7-8729-73122eb9982e.png">
## Expected Behaviour
The challenge phases that are not active now should be ideally disabled and the user shouldn't be allowed to select those challenge phases.
Disable the Challenge Phase buttons which are inactive while making submissions
## Observed Behaviour
Currently, if a challenge phase is inactive then also a participant can select the phase to make submissions.
See screenshot for example:
<img width="1440" alt="screen shot 2017-08-08 at 7 21 16 pm" src="https://user-images.githubusercontent.com/2945708/29098709-ca4c67a8-7c6e-11e7-8729-73122eb9982e.png">
## Expected Behaviour
The challenge phases that are not active now should be ideally disabled and the user shouldn't be allowed to select those challenge phases.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/participants/serializers.py`
Content:
```
1 from django.contrib.auth.models import User
2
3 from rest_framework import serializers
4
5 from challenges.serializers import ChallengeSerializer
6 from .models import (Participant, ParticipantTeam)
7
8
9 class ParticipantTeamSerializer(serializers.ModelSerializer):
10 """Serializer class to map Participants to Teams."""
11 created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())
12
13 def __init__(self, *args, **kwargs):
14 super(ParticipantTeamSerializer, self).__init__(*args, **kwargs)
15 context = kwargs.get('context')
16 if context:
17 request = context.get('request')
18 kwargs['data']['created_by'] = request.user.username
19
20 class Meta:
21 model = ParticipantTeam
22 fields = ('id', 'team_name', 'created_by')
23
24
25 class InviteParticipantToTeamSerializer(serializers.Serializer):
26 """Serializer class for inviting Participant to Team."""
27 email = serializers.EmailField()
28
29 def __init__(self, *args, **kwargs):
30 super(InviteParticipantToTeamSerializer, self).__init__(*args, **kwargs)
31 context = kwargs.get('context')
32 if context:
33 self.participant_team = context.get('participant_team')
34 self.user = context.get('request').user
35
36 def validate_email(self, value):
37 if value == self.user.email:
38 raise serializers.ValidationError('A participant cannot invite himself')
39 try:
40 User.objects.get(email=value)
41 except User.DoesNotExist:
42 raise serializers.ValidationError('User does not exist')
43 return value
44
45 def save(self):
46 email = self.validated_data.get('email')
47 return Participant.objects.get_or_create(user=User.objects.get(email=email),
48 status=Participant.ACCEPTED,
49 team=self.participant_team)
50
51
52 class ParticipantSerializer(serializers.ModelSerializer):
53 """Serializer class for Participants."""
54 member_name = serializers.SerializerMethodField()
55 member_id = serializers.SerializerMethodField()
56
57 class Meta:
58 model = Participant
59 fields = ('member_name', 'status', 'member_id')
60
61 def get_member_name(self, obj):
62 return obj.user.username
63
64 def get_member_id(self, obj):
65 return obj.user.id
66
67
68 class ParticipantTeamDetailSerializer(serializers.ModelSerializer):
69 """Serializer for Participant Teams and Participant Combined."""
70 members = serializers.SerializerMethodField()
71 created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())
72
73 class Meta:
74 model = ParticipantTeam
75 fields = ('id', 'team_name', 'created_by', 'members')
76
77 def get_members(self, obj):
78 participants = Participant.objects.filter(team__pk=obj.id)
79 serializer = ParticipantSerializer(participants, many=True)
80 return serializer.data
81
82
83 class ChallengeParticipantTeam(object):
84 """Serializer to map Challenge and Participant Teams."""
85 def __init__(self, challenge, participant_team):
86 self.challenge = challenge
87 self.participant_team = participant_team
88
89
90 class ChallengeParticipantTeamSerializer(serializers.Serializer):
91 """Serializer to initialize Challenge and Participant's Team"""
92 challenge = ChallengeSerializer()
93 participant_team = ParticipantTeamSerializer()
94
95
96 class ChallengeParticipantTeamList(object):
97 """Class to create a list of Challenge and Participant Teams."""
98 def __init__(self, challenge_participant_team_list):
99 self.challenge_participant_team_list = challenge_participant_team_list
100
101
102 class ChallengeParticipantTeamListSerializer(serializers.Serializer):
103 """Serializer to map a challenge's participant team lists."""
104 challenge_participant_team_list = ChallengeParticipantTeamSerializer(many=True)
105
106
107 class ParticipantTeamCount(object):
108 def __init__(self, participant_team_count):
109 self.participant_team_count = participant_team_count
110
111
112 class ParticipantTeamCountSerializer(serializers.Serializer):
113 participant_team_count = serializers.IntegerField()
114
115
116 class ParticipantCount(object):
117 def __init__(self, participant_count):
118 self.participant_count = participant_count
119
120
121 class ParticipantCountSerializer(serializers.Serializer):
122 participant_count = serializers.IntegerField()
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/apps/participants/serializers.py b/apps/participants/serializers.py
--- a/apps/participants/serializers.py
+++ b/apps/participants/serializers.py
@@ -1,4 +1,5 @@
from django.contrib.auth.models import User
+from django.utils import timezone
from rest_framework import serializers
@@ -102,6 +103,10 @@
class ChallengeParticipantTeamListSerializer(serializers.Serializer):
"""Serializer to map a challenge's participant team lists."""
challenge_participant_team_list = ChallengeParticipantTeamSerializer(many=True)
+ datetime_now = serializers.SerializerMethodField()
+
+ def get_datetime_now(self, obj):
+ return timezone.now()
class ParticipantTeamCount(object):
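The patch exposes the server's current time as a `datetime_now` field on the serialized list, presumably so the frontend has a reliable clock to compare against each phase's start and end dates when deciding which phase buttons to disable. A small, self-contained sketch of that client-side check (phase bounds and names are illustrative, not from the record):

```python
from datetime import datetime, timedelta, timezone

def phase_is_active(start, end, server_now):
    return start <= server_now <= end

server_now = datetime.now(timezone.utc)   # stand-in for the serialized datetime_now
start = server_now - timedelta(days=2)
end = server_now - timedelta(days=1)      # a phase that already ended
assert phase_is_active(start, end, server_now) is False   # -> disable its button
```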
|
{"golden_diff": "diff --git a/apps/participants/serializers.py b/apps/participants/serializers.py\n--- a/apps/participants/serializers.py\n+++ b/apps/participants/serializers.py\n@@ -1,4 +1,5 @@\n from django.contrib.auth.models import User\n+from django.utils import timezone\n \n from rest_framework import serializers\n \n@@ -102,6 +103,10 @@\n class ChallengeParticipantTeamListSerializer(serializers.Serializer):\n \"\"\"Serializer to map a challenge's participant team lists.\"\"\"\n challenge_participant_team_list = ChallengeParticipantTeamSerializer(many=True)\n+ datetime_now = serializers.SerializerMethodField()\n+\n+ def get_datetime_now(self, obj):\n+ return timezone.now()\n \n \n class ParticipantTeamCount(object):\n", "issue": "Disable the Challenge Phase buttons which are inactive while making submissions\n## Observed Behaviour\r\nCurrently, if a challenge phase is inactive then also a participant can select the phase to make submissions. \r\n\r\nSee screenshot for example:\r\n<img width=\"1440\" alt=\"screen shot 2017-08-08 at 7 21 16 pm\" src=\"https://user-images.githubusercontent.com/2945708/29098709-ca4c67a8-7c6e-11e7-8729-73122eb9982e.png\">\r\n\r\n## Expected Behaviour\r\n\r\nThe challenge phases that are not active now should be ideally disabled and the user shouldn't be allowed to select those challenge phases. \nDisable the Challenge Phase buttons which are inactive while making submissions\n## Observed Behaviour\r\nCurrently, if a challenge phase is inactive then also a participant can select the phase to make submissions. \r\n\r\nSee screenshot for example:\r\n<img width=\"1440\" alt=\"screen shot 2017-08-08 at 7 21 16 pm\" src=\"https://user-images.githubusercontent.com/2945708/29098709-ca4c67a8-7c6e-11e7-8729-73122eb9982e.png\">\r\n\r\n## Expected Behaviour\r\n\r\nThe challenge phases that are not active now should be ideally disabled and the user shouldn't be allowed to select those challenge phases. 
\n", "before_files": [{"content": "from django.contrib.auth.models import User\n\nfrom rest_framework import serializers\n\nfrom challenges.serializers import ChallengeSerializer\nfrom .models import (Participant, ParticipantTeam)\n\n\nclass ParticipantTeamSerializer(serializers.ModelSerializer):\n \"\"\"Serializer class to map Participants to Teams.\"\"\"\n created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())\n\n def __init__(self, *args, **kwargs):\n super(ParticipantTeamSerializer, self).__init__(*args, **kwargs)\n context = kwargs.get('context')\n if context:\n request = context.get('request')\n kwargs['data']['created_by'] = request.user.username\n\n class Meta:\n model = ParticipantTeam\n fields = ('id', 'team_name', 'created_by')\n\n\nclass InviteParticipantToTeamSerializer(serializers.Serializer):\n \"\"\"Serializer class for inviting Participant to Team.\"\"\"\n email = serializers.EmailField()\n\n def __init__(self, *args, **kwargs):\n super(InviteParticipantToTeamSerializer, self).__init__(*args, **kwargs)\n context = kwargs.get('context')\n if context:\n self.participant_team = context.get('participant_team')\n self.user = context.get('request').user\n\n def validate_email(self, value):\n if value == self.user.email:\n raise serializers.ValidationError('A participant cannot invite himself')\n try:\n User.objects.get(email=value)\n except User.DoesNotExist:\n raise serializers.ValidationError('User does not exist')\n return value\n\n def save(self):\n email = self.validated_data.get('email')\n return Participant.objects.get_or_create(user=User.objects.get(email=email),\n status=Participant.ACCEPTED,\n team=self.participant_team)\n\n\nclass ParticipantSerializer(serializers.ModelSerializer):\n \"\"\"Serializer class for Participants.\"\"\"\n member_name = serializers.SerializerMethodField()\n member_id = serializers.SerializerMethodField()\n\n class Meta:\n model = Participant\n fields = ('member_name', 'status', 'member_id')\n\n def get_member_name(self, obj):\n return obj.user.username\n\n def get_member_id(self, obj):\n return obj.user.id\n\n\nclass ParticipantTeamDetailSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for Participant Teams and Participant Combined.\"\"\"\n members = serializers.SerializerMethodField()\n created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())\n\n class Meta:\n model = ParticipantTeam\n fields = ('id', 'team_name', 'created_by', 'members')\n\n def get_members(self, obj):\n participants = Participant.objects.filter(team__pk=obj.id)\n serializer = ParticipantSerializer(participants, many=True)\n return serializer.data\n\n\nclass ChallengeParticipantTeam(object):\n \"\"\"Serializer to map Challenge and Participant Teams.\"\"\"\n def __init__(self, challenge, participant_team):\n self.challenge = challenge\n self.participant_team = participant_team\n\n\nclass ChallengeParticipantTeamSerializer(serializers.Serializer):\n \"\"\"Serializer to initialize Challenge and Participant's Team\"\"\"\n challenge = ChallengeSerializer()\n participant_team = ParticipantTeamSerializer()\n\n\nclass ChallengeParticipantTeamList(object):\n \"\"\"Class to create a list of Challenge and Participant Teams.\"\"\"\n def __init__(self, challenge_participant_team_list):\n self.challenge_participant_team_list = challenge_participant_team_list\n\n\nclass ChallengeParticipantTeamListSerializer(serializers.Serializer):\n \"\"\"Serializer to map a challenge's participant team lists.\"\"\"\n 
challenge_participant_team_list = ChallengeParticipantTeamSerializer(many=True)\n\n\nclass ParticipantTeamCount(object):\n def __init__(self, participant_team_count):\n self.participant_team_count = participant_team_count\n\n\nclass ParticipantTeamCountSerializer(serializers.Serializer):\n participant_team_count = serializers.IntegerField()\n\n\nclass ParticipantCount(object):\n def __init__(self, participant_count):\n self.participant_count = participant_count\n\n\nclass ParticipantCountSerializer(serializers.Serializer):\n participant_count = serializers.IntegerField()\n", "path": "apps/participants/serializers.py"}], "after_files": [{"content": "from django.contrib.auth.models import User\nfrom django.utils import timezone\n\nfrom rest_framework import serializers\n\nfrom challenges.serializers import ChallengeSerializer\nfrom .models import (Participant, ParticipantTeam)\n\n\nclass ParticipantTeamSerializer(serializers.ModelSerializer):\n \"\"\"Serializer class to map Participants to Teams.\"\"\"\n created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())\n\n def __init__(self, *args, **kwargs):\n super(ParticipantTeamSerializer, self).__init__(*args, **kwargs)\n context = kwargs.get('context')\n if context:\n request = context.get('request')\n kwargs['data']['created_by'] = request.user.username\n\n class Meta:\n model = ParticipantTeam\n fields = ('id', 'team_name', 'created_by')\n\n\nclass InviteParticipantToTeamSerializer(serializers.Serializer):\n \"\"\"Serializer class for inviting Participant to Team.\"\"\"\n email = serializers.EmailField()\n\n def __init__(self, *args, **kwargs):\n super(InviteParticipantToTeamSerializer, self).__init__(*args, **kwargs)\n context = kwargs.get('context')\n if context:\n self.participant_team = context.get('participant_team')\n self.user = context.get('request').user\n\n def validate_email(self, value):\n if value == self.user.email:\n raise serializers.ValidationError('A participant cannot invite himself')\n try:\n User.objects.get(email=value)\n except User.DoesNotExist:\n raise serializers.ValidationError('User does not exist')\n return value\n\n def save(self):\n email = self.validated_data.get('email')\n return Participant.objects.get_or_create(user=User.objects.get(email=email),\n status=Participant.ACCEPTED,\n team=self.participant_team)\n\n\nclass ParticipantSerializer(serializers.ModelSerializer):\n \"\"\"Serializer class for Participants.\"\"\"\n member_name = serializers.SerializerMethodField()\n member_id = serializers.SerializerMethodField()\n\n class Meta:\n model = Participant\n fields = ('member_name', 'status', 'member_id')\n\n def get_member_name(self, obj):\n return obj.user.username\n\n def get_member_id(self, obj):\n return obj.user.id\n\n\nclass ParticipantTeamDetailSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for Participant Teams and Participant Combined.\"\"\"\n members = serializers.SerializerMethodField()\n created_by = serializers.SlugRelatedField(slug_field='username', queryset=User.objects.all())\n\n class Meta:\n model = ParticipantTeam\n fields = ('id', 'team_name', 'created_by', 'members')\n\n def get_members(self, obj):\n participants = Participant.objects.filter(team__pk=obj.id)\n serializer = ParticipantSerializer(participants, many=True)\n return serializer.data\n\n\nclass ChallengeParticipantTeam(object):\n \"\"\"Serializer to map Challenge and Participant Teams.\"\"\"\n def __init__(self, challenge, participant_team):\n self.challenge = challenge\n 
self.participant_team = participant_team\n\n\nclass ChallengeParticipantTeamSerializer(serializers.Serializer):\n \"\"\"Serializer to initialize Challenge and Participant's Team\"\"\"\n challenge = ChallengeSerializer()\n participant_team = ParticipantTeamSerializer()\n\n\nclass ChallengeParticipantTeamList(object):\n \"\"\"Class to create a list of Challenge and Participant Teams.\"\"\"\n def __init__(self, challenge_participant_team_list):\n self.challenge_participant_team_list = challenge_participant_team_list\n\n\nclass ChallengeParticipantTeamListSerializer(serializers.Serializer):\n \"\"\"Serializer to map a challenge's participant team lists.\"\"\"\n challenge_participant_team_list = ChallengeParticipantTeamSerializer(many=True)\n datetime_now = serializers.SerializerMethodField()\n\n def get_datetime_now(self, obj):\n return timezone.now()\n\n\nclass ParticipantTeamCount(object):\n def __init__(self, participant_team_count):\n self.participant_team_count = participant_team_count\n\n\nclass ParticipantTeamCountSerializer(serializers.Serializer):\n participant_team_count = serializers.IntegerField()\n\n\nclass ParticipantCount(object):\n def __init__(self, participant_count):\n self.participant_count = participant_count\n\n\nclass ParticipantCountSerializer(serializers.Serializer):\n participant_count = serializers.IntegerField()\n", "path": "apps/participants/serializers.py"}]}
| 1,677 | 156 |
gh_patches_debug_1558
|
rasdani/github-patches
|
git_diff
|
streamlink__streamlink-928
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Socket is not connected error when closing currently open stream with VLC
### Checklist
- [x] This is a bug report.
- [ ] This is a feature request.
- [ ] This is a plugin (improvement) request.
- [ ] I have read the contribution guidelines.
### Description
Every time I close a stream that was playing in VLC, I get the following error:
```
[cli][info] Closing currently open stream...
Traceback (most recent call last):
File "/usr/local/bin/streamlink", line 11, in <module>
load_entry_point('streamlink==0.6.0', 'console_scripts', 'streamlink')()
File "/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py", line 1027, in main
handle_url()
File "/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py", line 502, in handle_url
handle_stream(plugin, streams, stream_name)
File "/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py", line 380, in handle_stream
return output_stream_http(plugin, streams)
File "/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py", line 192, in output_stream_http
server.close()
File "/usr/local/lib/python2.7/site-packages/streamlink_cli/utils/http_server.py", line 116, in close
self.socket.shutdown(2)
File "/usr/local/lib/python2.7/socket.py", line 228, in meth
return getattr(self._sock,name)(*args)
socket.error: [Errno 57] Socket is not connected
```
This has been happening to me since 0.4.0, but I haven't had a chance to report it earlier. I've only been watching streams on Twtich so I have no idea if other services are affected by this too.
Issue #604 might be something similar, but the error is quite different although some parts of the backtrace are similar.
### Expected / Actual behavior
Expected: No error when closing the stream.
Actual: The above error happens.
### Reproduction steps / Explicit stream URLs to test
1. Load a Twitch stream with VLC as the player.
2. Close VLC.
This happens regardless of if the stream was still running when VLC is closed or if the stream already ended and VLC is not playing anything.
### Environment details
Operating system and version: FreeBSD 11.0-RELEASE-p8
Streamlink and Python version: Streamlink 0.6.0, Python 2.7.13
VLC version: 2.2.5.1
My .streamlinkrc file contains the following (excluding my Twitch OAuth token):
```
player-continuous-http
default-stream=best
hls-segment-threads=10
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink_cli/utils/http_server.py`
Content:
```
1 import socket
2
3 from io import BytesIO
4
5 try:
6 from BaseHTTPServer import BaseHTTPRequestHandler
7 except ImportError:
8 from http.server import BaseHTTPRequestHandler
9
10
11 class HTTPRequest(BaseHTTPRequestHandler):
12 def __init__(self, request_text):
13 self.rfile = BytesIO(request_text)
14 self.raw_requestline = self.rfile.readline()
15 self.error_code = self.error_message = None
16 self.parse_request()
17
18 def send_error(self, code, message):
19 self.error_code = code
20 self.error_message = message
21
22
23 class HTTPServer(object):
24 def __init__(self):
25 self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
26 self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
27 self.conn = self.host = self.port = None
28 self.bound = False
29
30 @property
31 def addresses(self):
32 if self.host:
33 return [self.host]
34
35 addrs = set()
36 try:
37 for info in socket.getaddrinfo(socket.gethostname(), self.port,
38 socket.AF_INET):
39 addrs.add(info[4][0])
40 except socket.gaierror:
41 pass
42
43 addrs.add("127.0.0.1")
44 return sorted(addrs)
45
46 @property
47 def urls(self):
48 for addr in self.addresses:
49 yield "http://{0}:{1}/".format(addr, self.port)
50
51 @property
52 def url(self):
53 return next(self.urls, None)
54
55 def bind(self, host="127.0.0.1", port=0):
56 try:
57 self.socket.bind((host or "", port))
58 except socket.error as err:
59 raise OSError(err)
60
61 self.socket.listen(1)
62 self.bound = True
63 self.host, self.port = self.socket.getsockname()
64 if self.host == "0.0.0.0":
65 self.host = None
66
67 def open(self, timeout=30):
68 self.socket.settimeout(timeout)
69
70 try:
71 conn, addr = self.socket.accept()
72 conn.settimeout(None)
73 except socket.timeout:
74 raise OSError("Socket accept timed out")
75
76 try:
77 req_data = conn.recv(1024)
78 except socket.error:
79 raise OSError("Failed to read data from socket")
80
81 req = HTTPRequest(req_data)
82 if req.command not in ("GET", "HEAD"):
83 conn.send(b"HTTP/1.1 501 Not Implemented\r\n")
84 conn.close()
85 raise OSError("Invalid request method: {0}".format(req.command))
86
87 try:
88 conn.send(b"HTTP/1.1 200 OK\r\n")
89 conn.send(b"Server: Streamlink\r\n")
90 conn.send(b"Content-Type: video/unknown\r\n")
91 conn.send(b"\r\n")
92 except socket.error:
93 raise OSError("Failed to write data to socket")
94
95 # We don't want to send any data on HEAD requests.
96 if req.command == "HEAD":
97 conn.close()
98 raise OSError
99
100 self.conn = conn
101
102 return req
103
104 def write(self, data):
105 if not self.conn:
106 raise IOError("No connection")
107
108 self.conn.sendall(data)
109
110 def close(self, client_only=False):
111 if self.conn:
112 self.conn.close()
113
114 if not client_only:
115 try:
116 self.socket.shutdown(2)
117 except OSError:
118 pass
119 self.socket.close()
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/streamlink_cli/utils/http_server.py b/src/streamlink_cli/utils/http_server.py
--- a/src/streamlink_cli/utils/http_server.py
+++ b/src/streamlink_cli/utils/http_server.py
@@ -114,6 +114,6 @@
if not client_only:
try:
self.socket.shutdown(2)
- except OSError:
+ except (OSError, socket.error):
pass
self.socket.close()
|
{"golden_diff": "diff --git a/src/streamlink_cli/utils/http_server.py b/src/streamlink_cli/utils/http_server.py\n--- a/src/streamlink_cli/utils/http_server.py\n+++ b/src/streamlink_cli/utils/http_server.py\n@@ -114,6 +114,6 @@\n if not client_only:\n try:\n self.socket.shutdown(2)\n- except OSError:\n+ except (OSError, socket.error):\n pass\n self.socket.close()\n", "issue": "Socket is not connected error when closing currently open stream with VLC\n### Checklist\r\n\r\n- [x] This is a bug report.\r\n- [ ] This is a feature request.\r\n- [ ] This is a plugin (improvement) request.\r\n- [ ] I have read the contribution guidelines.\r\n\r\n### Description\r\n\r\nEvery time I close a stream that was playing in VLC, I get the following error:\r\n\r\n```\r\n[cli][info] Closing currently open stream...\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/streamlink\", line 11, in <module>\r\n load_entry_point('streamlink==0.6.0', 'console_scripts', 'streamlink')()\r\n File \"/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py\", line 1027, in main\r\n handle_url()\r\n File \"/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py\", line 502, in handle_url\r\n handle_stream(plugin, streams, stream_name)\r\n File \"/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py\", line 380, in handle_stream\r\n return output_stream_http(plugin, streams)\r\n File \"/usr/local/lib/python2.7/site-packages/streamlink_cli/main.py\", line 192, in output_stream_http\r\n server.close()\r\n File \"/usr/local/lib/python2.7/site-packages/streamlink_cli/utils/http_server.py\", line 116, in close\r\n self.socket.shutdown(2)\r\n File \"/usr/local/lib/python2.7/socket.py\", line 228, in meth\r\n return getattr(self._sock,name)(*args)\r\nsocket.error: [Errno 57] Socket is not connected\r\n```\r\n\r\nThis has been happening to me since 0.4.0, but I haven't had a chance to report it earlier. I've only been watching streams on Twtich so I have no idea if other services are affected by this too.\r\n\r\nIssue #604 might be something similar, but the error is quite different although some parts of the backtrace are similar.\r\n\r\n### Expected / Actual behavior\r\n\r\nExpected: No error when closing the stream.\r\nActual: The above error happens.\r\n\r\n### Reproduction steps / Explicit stream URLs to test\r\n\r\n1. Load a Twitch stream with VLC as the player.\r\n2. 
Close VLC.\r\n\r\nThis happens regardless of if the stream was still running when VLC is closed or if the stream already ended and VLC is not playing anything.\r\n\r\n### Environment details\r\n\r\nOperating system and version: FreeBSD 11.0-RELEASE-p8\r\nStreamlink and Python version: Streamlink 0.6.0, Python 2.7.13\r\nVLC version: 2.2.5.1\r\n\r\nMy .streamlinkrc file contains the following (excluding my Twitch OAuth token):\r\n\r\n```\r\nplayer-continuous-http\r\ndefault-stream=best\r\nhls-segment-threads=10\r\n```\n", "before_files": [{"content": "import socket\n\nfrom io import BytesIO\n\ntry:\n from BaseHTTPServer import BaseHTTPRequestHandler\nexcept ImportError:\n from http.server import BaseHTTPRequestHandler\n\n\nclass HTTPRequest(BaseHTTPRequestHandler):\n def __init__(self, request_text):\n self.rfile = BytesIO(request_text)\n self.raw_requestline = self.rfile.readline()\n self.error_code = self.error_message = None\n self.parse_request()\n\n def send_error(self, code, message):\n self.error_code = code\n self.error_message = message\n\n\nclass HTTPServer(object):\n def __init__(self):\n self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)\n self.conn = self.host = self.port = None\n self.bound = False\n\n @property\n def addresses(self):\n if self.host:\n return [self.host]\n\n addrs = set()\n try:\n for info in socket.getaddrinfo(socket.gethostname(), self.port,\n socket.AF_INET):\n addrs.add(info[4][0])\n except socket.gaierror:\n pass\n\n addrs.add(\"127.0.0.1\")\n return sorted(addrs)\n\n @property\n def urls(self):\n for addr in self.addresses:\n yield \"http://{0}:{1}/\".format(addr, self.port)\n\n @property\n def url(self):\n return next(self.urls, None)\n\n def bind(self, host=\"127.0.0.1\", port=0):\n try:\n self.socket.bind((host or \"\", port))\n except socket.error as err:\n raise OSError(err)\n\n self.socket.listen(1)\n self.bound = True\n self.host, self.port = self.socket.getsockname()\n if self.host == \"0.0.0.0\":\n self.host = None\n\n def open(self, timeout=30):\n self.socket.settimeout(timeout)\n\n try:\n conn, addr = self.socket.accept()\n conn.settimeout(None)\n except socket.timeout:\n raise OSError(\"Socket accept timed out\")\n\n try:\n req_data = conn.recv(1024)\n except socket.error:\n raise OSError(\"Failed to read data from socket\")\n\n req = HTTPRequest(req_data)\n if req.command not in (\"GET\", \"HEAD\"):\n conn.send(b\"HTTP/1.1 501 Not Implemented\\r\\n\")\n conn.close()\n raise OSError(\"Invalid request method: {0}\".format(req.command))\n\n try:\n conn.send(b\"HTTP/1.1 200 OK\\r\\n\")\n conn.send(b\"Server: Streamlink\\r\\n\")\n conn.send(b\"Content-Type: video/unknown\\r\\n\")\n conn.send(b\"\\r\\n\")\n except socket.error:\n raise OSError(\"Failed to write data to socket\")\n\n # We don't want to send any data on HEAD requests.\n if req.command == \"HEAD\":\n conn.close()\n raise OSError\n\n self.conn = conn\n\n return req\n\n def write(self, data):\n if not self.conn:\n raise IOError(\"No connection\")\n\n self.conn.sendall(data)\n\n def close(self, client_only=False):\n if self.conn:\n self.conn.close()\n\n if not client_only:\n try:\n self.socket.shutdown(2)\n except OSError:\n pass\n self.socket.close()\n", "path": "src/streamlink_cli/utils/http_server.py"}], "after_files": [{"content": "import socket\n\nfrom io import BytesIO\n\ntry:\n from BaseHTTPServer import BaseHTTPRequestHandler\nexcept ImportError:\n from http.server import 
BaseHTTPRequestHandler\n\n\nclass HTTPRequest(BaseHTTPRequestHandler):\n def __init__(self, request_text):\n self.rfile = BytesIO(request_text)\n self.raw_requestline = self.rfile.readline()\n self.error_code = self.error_message = None\n self.parse_request()\n\n def send_error(self, code, message):\n self.error_code = code\n self.error_message = message\n\n\nclass HTTPServer(object):\n def __init__(self):\n self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)\n self.conn = self.host = self.port = None\n self.bound = False\n\n @property\n def addresses(self):\n if self.host:\n return [self.host]\n\n addrs = set()\n try:\n for info in socket.getaddrinfo(socket.gethostname(), self.port,\n socket.AF_INET):\n addrs.add(info[4][0])\n except socket.gaierror:\n pass\n\n addrs.add(\"127.0.0.1\")\n return sorted(addrs)\n\n @property\n def urls(self):\n for addr in self.addresses:\n yield \"http://{0}:{1}/\".format(addr, self.port)\n\n @property\n def url(self):\n return next(self.urls, None)\n\n def bind(self, host=\"127.0.0.1\", port=0):\n try:\n self.socket.bind((host or \"\", port))\n except socket.error as err:\n raise OSError(err)\n\n self.socket.listen(1)\n self.bound = True\n self.host, self.port = self.socket.getsockname()\n if self.host == \"0.0.0.0\":\n self.host = None\n\n def open(self, timeout=30):\n self.socket.settimeout(timeout)\n\n try:\n conn, addr = self.socket.accept()\n conn.settimeout(None)\n except socket.timeout:\n raise OSError(\"Socket accept timed out\")\n\n try:\n req_data = conn.recv(1024)\n except socket.error:\n raise OSError(\"Failed to read data from socket\")\n\n req = HTTPRequest(req_data)\n if req.command not in (\"GET\", \"HEAD\"):\n conn.send(b\"HTTP/1.1 501 Not Implemented\\r\\n\")\n conn.close()\n raise OSError(\"Invalid request method: {0}\".format(req.command))\n\n try:\n conn.send(b\"HTTP/1.1 200 OK\\r\\n\")\n conn.send(b\"Server: Streamlink\\r\\n\")\n conn.send(b\"Content-Type: video/unknown\\r\\n\")\n conn.send(b\"\\r\\n\")\n except socket.error:\n raise OSError(\"Failed to write data to socket\")\n\n # We don't want to send any data on HEAD requests.\n if req.command == \"HEAD\":\n conn.close()\n raise OSError\n\n self.conn = conn\n\n return req\n\n def write(self, data):\n if not self.conn:\n raise IOError(\"No connection\")\n\n self.conn.sendall(data)\n\n def close(self, client_only=False):\n if self.conn:\n self.conn.close()\n\n if not client_only:\n try:\n self.socket.shutdown(2)\n except (OSError, socket.error):\n pass\n self.socket.close()\n", "path": "src/streamlink_cli/utils/http_server.py"}]}
| 1,886 | 97 |
gh_patches_debug_28152
|
rasdani/github-patches
|
git_diff
|
rasterio__rasterio-662
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
empty window intersection
I've been playing around with the cool windowing functions @brendan-ward put in a while back. I'm wondering if this is the desired behavior when two windows intersect at a point or along a row/column. Because of the ways windows work in rasterio, the upper row/column bound is open, so `windows_intersect` is returning `True` but returning an empty array, e.g.
```
>>> with rasterio.open('tests/data/RGB.byte.tif') as src:
... w1 = ((0, 2), (0, 2))
... w2 = ((2, 4), (2, 4))
... if windows_intersect((w1, w2)): #returns True
... print(src.read(1, window=window_intersection((w1, w2))))
[]
```
True, the windows do intersect at a point, but I assume the way people will use `window_intersection` is to expect a window that they can use in further processing.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rasterio/windows.py`
Content:
```
1 """Windows and related functions."""
2
3
4 def get_data_window(arr, nodata=None):
5 """Return a window for the non-nodata pixels within the input array.
6
7 Parameters
8 ----------
9 arr: numpy ndarray, <= 3 dimensions
10 nodata: number
11 If None, will either return a full window if arr is not a masked
12 array, or will use the mask to determine non-nodata pixels.
13 If provided, it must be a number within the valid range of the dtype
14 of the input array.
15
16 Returns
17 -------
18 ((row_start, row_stop), (col_start, col_stop))
19
20 """
21 from rasterio._io import get_data_window
22 return get_data_window(arr, nodata)
23
24
25 def union(windows):
26 """Union windows and return the outermost extent they cover.
27
28 Parameters
29 ----------
30 windows: list-like of window objects
31 ((row_start, row_stop), (col_start, col_stop))
32
33 Returns
34 -------
35 ((row_start, row_stop), (col_start, col_stop))
36 """
37 from rasterio._io import window_union
38 return window_union(windows)
39
40
41 def intersection(windows):
42 """Intersect windows and return the innermost extent they cover.
43
44 Will raise ValueError if windows do not intersect.
45
46 Parameters
47 ----------
48 windows: list-like of window objects
49 ((row_start, row_stop), (col_start, col_stop))
50
51 Returns
52 -------
53 ((row_start, row_stop), (col_start, col_stop))
54 """
55 from rasterio._io import window_intersection
56 return window_intersection(windows)
57
58
59 def intersect(windows):
60 """Test if windows intersect.
61
62 Parameters
63 ----------
64 windows: list-like of window objects
65 ((row_start, row_stop), (col_start, col_stop))
66
67 Returns
68 -------
69 boolean:
70 True if all windows intersect.
71 """
72 from rasterio._io import windows_intersect
73 return windows_intersect(windows)
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/rasterio/windows.py b/rasterio/windows.py
--- a/rasterio/windows.py
+++ b/rasterio/windows.py
@@ -1,5 +1,21 @@
"""Windows and related functions."""
+import functools
+import collections
+
+
+def iter_args(function):
+ """Decorator to allow function to take either *args or
+ a single iterable which gets expanded to *args.
+ """
+ @functools.wraps(function)
+ def wrapper(*args, **kwargs):
+ if len(args) == 1 and isinstance(args[0], collections.Iterable):
+ return function(*args[0])
+ else:
+ return function(*args)
+ return wrapper
+
def get_data_window(arr, nodata=None):
"""Return a window for the non-nodata pixels within the input array.
@@ -22,7 +38,8 @@
return get_data_window(arr, nodata)
-def union(windows):
+@iter_args
+def union(*windows):
"""Union windows and return the outermost extent they cover.
Parameters
@@ -38,7 +55,8 @@
return window_union(windows)
-def intersection(windows):
+@iter_args
+def intersection(*windows):
"""Intersect windows and return the innermost extent they cover.
Will raise ValueError if windows do not intersect.
@@ -56,7 +74,8 @@
return window_intersection(windows)
-def intersect(windows):
+@iter_args
+def intersect(*windows):
"""Test if windows intersect.
Parameters
|
{"golden_diff": "diff --git a/rasterio/windows.py b/rasterio/windows.py\n--- a/rasterio/windows.py\n+++ b/rasterio/windows.py\n@@ -1,5 +1,21 @@\n \"\"\"Windows and related functions.\"\"\"\n \n+import functools\n+import collections\n+\n+\n+def iter_args(function):\n+ \"\"\"Decorator to allow function to take either *args or\n+ a single iterable which gets expanded to *args.\n+ \"\"\"\n+ @functools.wraps(function)\n+ def wrapper(*args, **kwargs):\n+ if len(args) == 1 and isinstance(args[0], collections.Iterable):\n+ return function(*args[0])\n+ else:\n+ return function(*args)\n+ return wrapper\n+\n \n def get_data_window(arr, nodata=None):\n \"\"\"Return a window for the non-nodata pixels within the input array.\n@@ -22,7 +38,8 @@\n return get_data_window(arr, nodata)\n \n \n-def union(windows):\n+@iter_args\n+def union(*windows):\n \"\"\"Union windows and return the outermost extent they cover.\n \n Parameters\n@@ -38,7 +55,8 @@\n return window_union(windows)\n \n \n-def intersection(windows):\n+@iter_args\n+def intersection(*windows):\n \"\"\"Intersect windows and return the innermost extent they cover.\n \n Will raise ValueError if windows do not intersect.\n@@ -56,7 +74,8 @@\n return window_intersection(windows)\n \n \n-def intersect(windows):\n+@iter_args\n+def intersect(*windows):\n \"\"\"Test if windows intersect.\n \n Parameters\n", "issue": "empty window intersection\nI've been playing around with the cool windowing functions @brendan-ward put in a while back. I'm wondering if this is the desired behavior when two windows intersect at a point or along a row/column. Because of the ways windows work in rasterio, the upper row/column bound is open, so `windows_intersect` is returning `True` but returning an empty array, e.g.\n\n```\n>>> with rasterio.open('tests/data/RGB.byte.tif') as src:\n... w1 = ((0, 2), (0, 2))\n... w2 = ((2, 4), (2, 4))\n... if windows_intersect((w1, w2)): #returns True\n... print(src.read(1, window=window_intersection((w1, w2))))\n[]\n```\n\nTrue, the windows do intersect at a point, but I assume the way people will use `window_intersection` is to expect a window that they can use in further processing. 
\n\n", "before_files": [{"content": "\"\"\"Windows and related functions.\"\"\"\n\n\ndef get_data_window(arr, nodata=None):\n \"\"\"Return a window for the non-nodata pixels within the input array.\n\n Parameters\n ----------\n arr: numpy ndarray, <= 3 dimensions\n nodata: number\n If None, will either return a full window if arr is not a masked\n array, or will use the mask to determine non-nodata pixels.\n If provided, it must be a number within the valid range of the dtype\n of the input array.\n\n Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n\n \"\"\"\n from rasterio._io import get_data_window\n return get_data_window(arr, nodata)\n\n\ndef union(windows):\n \"\"\"Union windows and return the outermost extent they cover.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n \"\"\"\n from rasterio._io import window_union\n return window_union(windows)\n\n\ndef intersection(windows):\n \"\"\"Intersect windows and return the innermost extent they cover.\n\n Will raise ValueError if windows do not intersect.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n \"\"\"\n from rasterio._io import window_intersection\n return window_intersection(windows)\n\n\ndef intersect(windows):\n \"\"\"Test if windows intersect.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n Returns\n -------\n boolean:\n True if all windows intersect.\n \"\"\"\n from rasterio._io import windows_intersect\n return windows_intersect(windows)\n", "path": "rasterio/windows.py"}], "after_files": [{"content": "\"\"\"Windows and related functions.\"\"\"\n\nimport functools\nimport collections\n\n\ndef iter_args(function):\n \"\"\"Decorator to allow function to take either *args or\n a single iterable which gets expanded to *args.\n \"\"\"\n @functools.wraps(function)\n def wrapper(*args, **kwargs):\n if len(args) == 1 and isinstance(args[0], collections.Iterable):\n return function(*args[0])\n else:\n return function(*args)\n return wrapper\n\n\ndef get_data_window(arr, nodata=None):\n \"\"\"Return a window for the non-nodata pixels within the input array.\n\n Parameters\n ----------\n arr: numpy ndarray, <= 3 dimensions\n nodata: number\n If None, will either return a full window if arr is not a masked\n array, or will use the mask to determine non-nodata pixels.\n If provided, it must be a number within the valid range of the dtype\n of the input array.\n\n Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n\n \"\"\"\n from rasterio._io import get_data_window\n return get_data_window(arr, nodata)\n\n\n@iter_args\ndef union(*windows):\n \"\"\"Union windows and return the outermost extent they cover.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n \"\"\"\n from rasterio._io import window_union\n return window_union(windows)\n\n\n@iter_args\ndef intersection(*windows):\n \"\"\"Intersect windows and return the innermost extent they cover.\n\n Will raise ValueError if windows do not intersect.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n 
Returns\n -------\n ((row_start, row_stop), (col_start, col_stop))\n \"\"\"\n from rasterio._io import window_intersection\n return window_intersection(windows)\n\n\n@iter_args\ndef intersect(*windows):\n \"\"\"Test if windows intersect.\n\n Parameters\n ----------\n windows: list-like of window objects\n ((row_start, row_stop), (col_start, col_stop))\n\n Returns\n -------\n boolean:\n True if all windows intersect.\n \"\"\"\n from rasterio._io import windows_intersect\n return windows_intersect(windows)\n", "path": "rasterio/windows.py"}]}
| 1,040 | 351 |
gh_patches_debug_26285
|
rasdani/github-patches
|
git_diff
|
pypi__warehouse-568
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use the latest version of pip-tools
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License");
2 # you may not use this file except in compliance with the License.
3 # You may obtain a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS,
9 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
10 # See the License for the specific language governing permissions and
11 # limitations under the License.
12
13 import os
14
15 import setuptools
16
17
18 base_dir = os.path.dirname(__file__)
19
20 about = {}
21 with open(os.path.join(base_dir, "warehouse", "__about__.py")) as f:
22 exec(f.read(), about)
23
24 with open(os.path.join(base_dir, "README.rst")) as f:
25 long_description = f.read()
26
27
28 setuptools.setup(
29 name=about["__title__"],
30 version=about["__version__"],
31
32 description=about["__summary__"],
33 long_description=long_description,
34 license=about["__license__"],
35 url=about["__uri__"],
36
37 author=about["__author__"],
38 author_email=about["__email__"],
39
40 classifiers=[
41 "Intended Audience :: Developers",
42
43 "License :: OSI Approved :: Apache Software License",
44
45 "Programming Language :: Python",
46 "Programming Language :: Python :: 3",
47 "Programming Language :: Python :: 3.4",
48 ],
49
50 packages=[
51 "warehouse",
52 "warehouse.accounts",
53 "warehouse.cache",
54 "warehouse.cache.origin",
55 "warehouse.classifiers",
56 "warehouse.cli",
57 "warehouse.cli.db",
58 "warehouse.i18n",
59 "warehouse.legacy",
60 "warehouse.legacy.api",
61 "warehouse.migrations",
62 "warehouse.packaging",
63 "warehouse.utils",
64 ],
65
66 include_package_data=True,
67
68 install_requires=[
69 "alembic>=0.7.0",
70 "Babel",
71 "bcrypt",
72 "boto3",
73 "click",
74 "fs",
75 "gunicorn",
76 "hiredis",
77 "html5lib",
78 "itsdangerous",
79 "msgpack-python",
80 "passlib>=1.6.4",
81 "psycopg2",
82 "pyramid>=1.6a1",
83 "pyramid_jinja2>=2.5",
84 "pyramid_multiauth",
85 "pyramid_services",
86 "pyramid_tm>=0.12",
87 "readme>=0.5.1",
88 "redis",
89 "setproctitle",
90 "sqlalchemy>=0.9",
91 "sqlalchemy-citext",
92 "structlog",
93 "transaction",
94 "WTForms>=2.0.0",
95 "zope.sqlalchemy",
96 ],
97
98 entry_points={
99 "console_scripts": [
100 "warehouse = warehouse.cli:warehouse",
101 ],
102 },
103 )
104
```
Path: `tasks/pip.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License");
2 # you may not use this file except in compliance with the License.
3 # You may obtain a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS,
9 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
10 # See the License for the specific language governing permissions and
11 # limitations under the License.
12
13 import os
14 import re
15
16 import invoke
17
18
19 REQUIREMENTS_HEADER = """
20 #
21 # This file is autogenerated by pip-compile
22 # Make changes in setup.py, then run this to update:
23 #
24 # $ invoke pip.compile
25 #
26
27 -f https://github.com/Pylons/webob/archive/master.zip#egg=webob-1.5.dev0
28
29 """.lstrip()
30
31
32 @invoke.task
33 def compile():
34 with open("requirements.in", "w") as fp:
35 fp.write("-e .")
36
37 try:
38 invoke.run("pip-compile --no-header requirements.in", hide="out")
39 finally:
40 os.remove("requirements.in")
41
42 lines = [REQUIREMENTS_HEADER]
43 with open("requirements.txt", "r") as fp:
44 for line in fp:
45 line = re.sub(r"^webob==(\S+)(.*)$", r"webob==1.5.dev0\2", line)
46
47 # The boto3 wheel includes a futures==2.2.0 even though that is a
48 # Python 2 only dependency. This dependency comes by default on
49 # Python 3, so the backport is never needed. See boto/boto3#163.
50 if re.search(r"^futures==2\.2\.0", line.strip()) is not None:
51 continue
52
53 if re.search(r"^-e file:///.+/warehouse$", line.strip()) is None:
54 lines.append(line)
55
56 with open("requirements.txt", "w") as fp:
57 for line in lines:
58 fp.write(line)
59
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -76,6 +76,7 @@
"hiredis",
"html5lib",
"itsdangerous",
+ "Jinja2>=2.8",
"msgpack-python",
"passlib>=1.6.4",
"psycopg2",
@@ -91,6 +92,7 @@
"sqlalchemy-citext",
"structlog",
"transaction",
+ "WebOb>=1.5.0a0",
"WTForms>=2.0.0",
"zope.sqlalchemy",
],
diff --git a/tasks/pip.py b/tasks/pip.py
--- a/tasks/pip.py
+++ b/tasks/pip.py
@@ -24,8 +24,6 @@
# $ invoke pip.compile
#
--f https://github.com/Pylons/webob/archive/master.zip#egg=webob-1.5.dev0
-
""".lstrip()
@@ -42,8 +40,6 @@
lines = [REQUIREMENTS_HEADER]
with open("requirements.txt", "r") as fp:
for line in fp:
- line = re.sub(r"^webob==(\S+)(.*)$", r"webob==1.5.dev0\2", line)
-
# The boto3 wheel includes a futures==2.2.0 even though that is a
# Python 2 only dependency. This dependency comes by default on
# Python 3, so the backport is never needed. See boto/boto3#163.
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -76,6 +76,7 @@\n \"hiredis\",\n \"html5lib\",\n \"itsdangerous\",\n+ \"Jinja2>=2.8\",\n \"msgpack-python\",\n \"passlib>=1.6.4\",\n \"psycopg2\",\n@@ -91,6 +92,7 @@\n \"sqlalchemy-citext\",\n \"structlog\",\n \"transaction\",\n+ \"WebOb>=1.5.0a0\",\n \"WTForms>=2.0.0\",\n \"zope.sqlalchemy\",\n ],\ndiff --git a/tasks/pip.py b/tasks/pip.py\n--- a/tasks/pip.py\n+++ b/tasks/pip.py\n@@ -24,8 +24,6 @@\n # $ invoke pip.compile\n #\n \n--f https://github.com/Pylons/webob/archive/master.zip#egg=webob-1.5.dev0\n-\n \"\"\".lstrip()\n \n \n@@ -42,8 +40,6 @@\n lines = [REQUIREMENTS_HEADER]\n with open(\"requirements.txt\", \"r\") as fp:\n for line in fp:\n- line = re.sub(r\"^webob==(\\S+)(.*)$\", r\"webob==1.5.dev0\\2\", line)\n-\n # The boto3 wheel includes a futures==2.2.0 even though that is a\n # Python 2 only dependency. This dependency comes by default on\n # Python 3, so the backport is never needed. See boto/boto3#163.\n", "issue": "Use the latest version of pip-tools\n\n", "before_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport setuptools\n\n\nbase_dir = os.path.dirname(__file__)\n\nabout = {}\nwith open(os.path.join(base_dir, \"warehouse\", \"__about__.py\")) as f:\n exec(f.read(), about)\n\nwith open(os.path.join(base_dir, \"README.rst\")) as f:\n long_description = f.read()\n\n\nsetuptools.setup(\n name=about[\"__title__\"],\n version=about[\"__version__\"],\n\n description=about[\"__summary__\"],\n long_description=long_description,\n license=about[\"__license__\"],\n url=about[\"__uri__\"],\n\n author=about[\"__author__\"],\n author_email=about[\"__email__\"],\n\n classifiers=[\n \"Intended Audience :: Developers\",\n\n \"License :: OSI Approved :: Apache Software License\",\n\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.4\",\n ],\n\n packages=[\n \"warehouse\",\n \"warehouse.accounts\",\n \"warehouse.cache\",\n \"warehouse.cache.origin\",\n \"warehouse.classifiers\",\n \"warehouse.cli\",\n \"warehouse.cli.db\",\n \"warehouse.i18n\",\n \"warehouse.legacy\",\n \"warehouse.legacy.api\",\n \"warehouse.migrations\",\n \"warehouse.packaging\",\n \"warehouse.utils\",\n ],\n\n include_package_data=True,\n\n install_requires=[\n \"alembic>=0.7.0\",\n \"Babel\",\n \"bcrypt\",\n \"boto3\",\n \"click\",\n \"fs\",\n \"gunicorn\",\n \"hiredis\",\n \"html5lib\",\n \"itsdangerous\",\n \"msgpack-python\",\n \"passlib>=1.6.4\",\n \"psycopg2\",\n \"pyramid>=1.6a1\",\n \"pyramid_jinja2>=2.5\",\n \"pyramid_multiauth\",\n \"pyramid_services\",\n \"pyramid_tm>=0.12\",\n \"readme>=0.5.1\",\n \"redis\",\n \"setproctitle\",\n \"sqlalchemy>=0.9\",\n \"sqlalchemy-citext\",\n \"structlog\",\n \"transaction\",\n \"WTForms>=2.0.0\",\n \"zope.sqlalchemy\",\n ],\n\n entry_points={\n \"console_scripts\": [\n \"warehouse = warehouse.cli:warehouse\",\n ],\n },\n)\n", "path": "setup.py"}, {"content": "# Licensed under the 
Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport re\n\nimport invoke\n\n\nREQUIREMENTS_HEADER = \"\"\"\n#\n# This file is autogenerated by pip-compile\n# Make changes in setup.py, then run this to update:\n#\n# $ invoke pip.compile\n#\n\n-f https://github.com/Pylons/webob/archive/master.zip#egg=webob-1.5.dev0\n\n\"\"\".lstrip()\n\n\[email protected]\ndef compile():\n with open(\"requirements.in\", \"w\") as fp:\n fp.write(\"-e .\")\n\n try:\n invoke.run(\"pip-compile --no-header requirements.in\", hide=\"out\")\n finally:\n os.remove(\"requirements.in\")\n\n lines = [REQUIREMENTS_HEADER]\n with open(\"requirements.txt\", \"r\") as fp:\n for line in fp:\n line = re.sub(r\"^webob==(\\S+)(.*)$\", r\"webob==1.5.dev0\\2\", line)\n\n # The boto3 wheel includes a futures==2.2.0 even though that is a\n # Python 2 only dependency. This dependency comes by default on\n # Python 3, so the backport is never needed. See boto/boto3#163.\n if re.search(r\"^futures==2\\.2\\.0\", line.strip()) is not None:\n continue\n\n if re.search(r\"^-e file:///.+/warehouse$\", line.strip()) is None:\n lines.append(line)\n\n with open(\"requirements.txt\", \"w\") as fp:\n for line in lines:\n fp.write(line)\n", "path": "tasks/pip.py"}], "after_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport setuptools\n\n\nbase_dir = os.path.dirname(__file__)\n\nabout = {}\nwith open(os.path.join(base_dir, \"warehouse\", \"__about__.py\")) as f:\n exec(f.read(), about)\n\nwith open(os.path.join(base_dir, \"README.rst\")) as f:\n long_description = f.read()\n\n\nsetuptools.setup(\n name=about[\"__title__\"],\n version=about[\"__version__\"],\n\n description=about[\"__summary__\"],\n long_description=long_description,\n license=about[\"__license__\"],\n url=about[\"__uri__\"],\n\n author=about[\"__author__\"],\n author_email=about[\"__email__\"],\n\n classifiers=[\n \"Intended Audience :: Developers\",\n\n \"License :: OSI Approved :: Apache Software License\",\n\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.4\",\n ],\n\n packages=[\n \"warehouse\",\n \"warehouse.accounts\",\n \"warehouse.cache\",\n \"warehouse.cache.origin\",\n \"warehouse.classifiers\",\n \"warehouse.cli\",\n \"warehouse.cli.db\",\n \"warehouse.i18n\",\n \"warehouse.legacy\",\n \"warehouse.legacy.api\",\n \"warehouse.migrations\",\n \"warehouse.packaging\",\n \"warehouse.utils\",\n ],\n\n include_package_data=True,\n\n install_requires=[\n 
\"alembic>=0.7.0\",\n \"Babel\",\n \"bcrypt\",\n \"boto3\",\n \"click\",\n \"fs\",\n \"gunicorn\",\n \"hiredis\",\n \"html5lib\",\n \"itsdangerous\",\n \"Jinja2>=2.8\",\n \"msgpack-python\",\n \"passlib>=1.6.4\",\n \"psycopg2\",\n \"pyramid>=1.6a1\",\n \"pyramid_jinja2>=2.5\",\n \"pyramid_multiauth\",\n \"pyramid_services\",\n \"pyramid_tm>=0.12\",\n \"readme>=0.5.1\",\n \"redis\",\n \"setproctitle\",\n \"sqlalchemy>=0.9\",\n \"sqlalchemy-citext\",\n \"structlog\",\n \"transaction\",\n \"WebOb>=1.5.0a0\",\n \"WTForms>=2.0.0\",\n \"zope.sqlalchemy\",\n ],\n\n entry_points={\n \"console_scripts\": [\n \"warehouse = warehouse.cli:warehouse\",\n ],\n },\n)\n", "path": "setup.py"}, {"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport re\n\nimport invoke\n\n\nREQUIREMENTS_HEADER = \"\"\"\n#\n# This file is autogenerated by pip-compile\n# Make changes in setup.py, then run this to update:\n#\n# $ invoke pip.compile\n#\n\n\"\"\".lstrip()\n\n\[email protected]\ndef compile():\n with open(\"requirements.in\", \"w\") as fp:\n fp.write(\"-e .\")\n\n try:\n invoke.run(\"pip-compile --no-header requirements.in\", hide=\"out\")\n finally:\n os.remove(\"requirements.in\")\n\n lines = [REQUIREMENTS_HEADER]\n with open(\"requirements.txt\", \"r\") as fp:\n for line in fp:\n # The boto3 wheel includes a futures==2.2.0 even though that is a\n # Python 2 only dependency. This dependency comes by default on\n # Python 3, so the backport is never needed. See boto/boto3#163.\n if re.search(r\"^futures==2\\.2\\.0\", line.strip()) is not None:\n continue\n\n if re.search(r\"^-e file:///.+/warehouse$\", line.strip()) is None:\n lines.append(line)\n\n with open(\"requirements.txt\", \"w\") as fp:\n for line in lines:\n fp.write(line)\n", "path": "tasks/pip.py"}]}
| 1,690 | 361 |
gh_patches_debug_37068
|
rasdani/github-patches
|
git_diff
|
microsoft__AzureTRE-1757
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
make all/tre-deploy fails due to db-migrate
**Describe the bug**
Running make `all` (or `tre-deploy` for that matter) fails because it runs `db-migrate` that needs direct access to cosmos.
**Steps to reproduce**
1. Make sure you don't have `ENABLE_LOCAL_DEBUGGING=true` in your .env files
2. Deploy a new environment by running `make all` or `make tre-deploy`
3. See that the activity fail due to lack of access to cosmos
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scripts/db_migrations.py`
Content:
```
1 #!/usr/local/bin/python3
2
3 from datetime import datetime
4 import os
5 from azure.cosmos.cosmos_client import CosmosClient
6 from azure.mgmt.cosmosdb import CosmosDBManagementClient
7 from azure.cosmos import PartitionKey
8 from azure.identity import DefaultAzureCredential
9 import json
10 import semantic_version
11 import uuid
12
13 STATE_STORE_DATABASE = "AzureTRE"
14
15
16 class TRECosmosDBMigrations:
17
18 def __init__(self):
19
20 url = os.environ['STATE_STORE_ENDPOINT']
21 key = self.get_store_key()
22 self.client = CosmosClient(url=url, credential=key)
23 self.database = self.client.get_database_client(STATE_STORE_DATABASE)
24
25 def get_store_key(self) -> str:
26 if 'STATE_STORE_KEY' in os.environ:
27 primary_master_key = os.getenv('STATE_STORE_KEY')
28 else:
29 credential = DefaultAzureCredential()
30 cosmosdb_client = CosmosDBManagementClient(credential, subscription_id=os.environ['SUBSCRIPTION_ID'])
31 database_keys = cosmosdb_client.database_accounts.list_keys(resource_group_name=os.environ['RESOURCE_GROUP_NAME'], account_name=os.environ['COSMOSDB_ACCOUNT_NAME'])
32 primary_master_key = database_keys.primary_master_key
33
34 return primary_master_key
35
36 def renameCosmosDBFields(self, container_name, old_field_name, new_field_name):
37
38 container = self.database.get_container_client(container_name)
39
40 for item in container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):
41 print(json.dumps(item, indent=True))
42 if old_field_name in item:
43 item[new_field_name] = item[old_field_name]
44 del item[old_field_name]
45 container.upsert_item(item)
46
47 def moveDeploymentsToOperations(self, resources_container_name, operations_container_name):
48 resources_container = self.database.get_container_client(resources_container_name)
49
50 # create operations container if needed
51 self.database.create_container_if_not_exists(id=operations_container_name, partition_key=PartitionKey(path="/id"))
52 operations_container = self.database.get_container_client(operations_container_name)
53
54 for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):
55 isActive = True
56 if ("deployment" in item):
57 newOperation = {
58 "id": str(uuid.uuid4()),
59 "resourceId": item["id"],
60 "status": item["deployment"]["status"],
61 "message": item["deployment"]["message"],
62 "resourceVersion": 0,
63 "createdWhen": datetime.utcnow().timestamp(),
64 "updatedWhen": datetime.utcnow().timestamp()
65 }
66 operations_container.create_item(newOperation)
67
68 if item["deployment"]["status"] == "deleted":
69 isActive = False
70
71 del item["deployment"]
72 item["isActive"] = isActive
73 resources_container.upsert_item(item)
74 print(f'Moved deployment from resource id {item["id"]} to operations')
75
76 def deleteDuplicatedSharedServices(self, resource_container_name):
77 resources_container = self.database.get_container_client(resource_container_name)
78
79 template_names = ['tre-shared-service-firewall', 'tre-shared-service-nexus', 'tre-shared-service-gitea']
80
81 for template_name in template_names:
82 for item in resources_container.query_items(query=f'SELECT * FROM c WHERE c.resourceType = "shared-service" AND c.templateName = "{template_name}" \
83 ORDER BY c.updatedWhen ASC OFFSET 1 LIMIT 10000', enable_cross_partition_query=True):
84 print(f"Deleting element {item}")
85 resources_container.delete_item(item, partition_key=item["id"])
86
87 def moveAuthInformationToProperties(self, resources_container_name):
88 resources_container = self.database.get_container_client(resources_container_name)
89
90 for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):
91 template_version = semantic_version.Version(item["templateVersion"])
92 if (template_version > semantic_version.Version('0.3.0') and "authInformation" in item):
93 print(f'Found workspace {item["id"]} that needs migrating')
94
95 # Rename app_id to be client_id
96 item["properties"]["client_id"] = item["properties"]["app_id"]
97 del item["properties"]["app_id"]
98 del item["authInformation"]["app_id"]
99
100 # merge authInformation into properties
101 item["properties"] = {**item["authInformation"], **item["properties"]}
102 del item["authInformation"]
103 resources_container.upsert_item(item)
104 print(f'Upgraded authentication info for workspace id {item["id"]}')
105
106
107 def main():
108 migrations = TRECosmosDBMigrations()
109 # PR 1030
110 migrations.renameCosmosDBFields("Resources", 'resourceTemplateName', 'templateName')
111 migrations.renameCosmosDBFields("Resources", 'resourceTemplateVersion', 'templateVersion')
112 migrations.renameCosmosDBFields("Resources", 'resourceTemplateParameters', 'properties')
113
114 # PR 1031
115 migrations.renameCosmosDBFields("Resources", 'workspaceType', 'templateName')
116 migrations.renameCosmosDBFields("Resources", 'workspaceServiceType', 'templateName')
117 migrations.renameCosmosDBFields("Resources", 'userResourceType', 'templateName')
118
119 # Operations History
120 migrations.moveDeploymentsToOperations("Resources", "Operations")
121
122 # Shared services (PR #1717)
123 migrations.deleteDuplicatedSharedServices("Resources")
124
125 # Authentication needs to be in properties so we can update them. (PR #1726)
126 migrations.moveAuthInformationToProperties("Resources")
127
128
129 if __name__ == "__main__":
130 main()
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/scripts/db_migrations.py b/scripts/db_migrations.py
--- a/scripts/db_migrations.py
+++ b/scripts/db_migrations.py
@@ -16,11 +16,14 @@
class TRECosmosDBMigrations:
def __init__(self):
+ if (self.can_connect_to_cosmos()):
+ url = os.environ['STATE_STORE_ENDPOINT']
+ key = self.get_store_key()
+ self.client = CosmosClient(url=url, credential=key)
+ self.database = self.client.get_database_client(STATE_STORE_DATABASE)
- url = os.environ['STATE_STORE_ENDPOINT']
- key = self.get_store_key()
- self.client = CosmosClient(url=url, credential=key)
- self.database = self.client.get_database_client(STATE_STORE_DATABASE)
+ def can_connect_to_cosmos(self) -> bool:
+ return os.getenv('ENABLE_LOCAL_DEBUGGING', 'False').lower() in ('true', 1, 't') if 'ENABLE_LOCAL_DEBUGGING' in os.environ else False
def get_store_key(self) -> str:
if 'STATE_STORE_KEY' in os.environ:
@@ -106,24 +109,27 @@
def main():
migrations = TRECosmosDBMigrations()
- # PR 1030
- migrations.renameCosmosDBFields("Resources", 'resourceTemplateName', 'templateName')
- migrations.renameCosmosDBFields("Resources", 'resourceTemplateVersion', 'templateVersion')
- migrations.renameCosmosDBFields("Resources", 'resourceTemplateParameters', 'properties')
-
- # PR 1031
- migrations.renameCosmosDBFields("Resources", 'workspaceType', 'templateName')
- migrations.renameCosmosDBFields("Resources", 'workspaceServiceType', 'templateName')
- migrations.renameCosmosDBFields("Resources", 'userResourceType', 'templateName')
-
- # Operations History
- migrations.moveDeploymentsToOperations("Resources", "Operations")
-
- # Shared services (PR #1717)
- migrations.deleteDuplicatedSharedServices("Resources")
-
- # Authentication needs to be in properties so we can update them. (PR #1726)
- migrations.moveAuthInformationToProperties("Resources")
+ if not migrations.can_connect_to_cosmos():
+ print('You cannot migrate the cosmos database without setting ENABLE_LOCAL_DEBUGGING to true.')
+ else:
+ # PR 1030
+ migrations.renameCosmosDBFields("Resources", 'resourceTemplateName', 'templateName')
+ migrations.renameCosmosDBFields("Resources", 'resourceTemplateVersion', 'templateVersion')
+ migrations.renameCosmosDBFields("Resources", 'resourceTemplateParameters', 'properties')
+
+ # PR 1031
+ migrations.renameCosmosDBFields("Resources", 'workspaceType', 'templateName')
+ migrations.renameCosmosDBFields("Resources", 'workspaceServiceType', 'templateName')
+ migrations.renameCosmosDBFields("Resources", 'userResourceType', 'templateName')
+
+ # Operations History
+ migrations.moveDeploymentsToOperations("Resources", "Operations")
+
+ # Shared services (PR #1717)
+ migrations.deleteDuplicatedSharedServices("Resources")
+
+ # Authentication needs to be in properties so we can update them. (PR #1726)
+ migrations.moveAuthInformationToProperties("Resources")
if __name__ == "__main__":
|
{"golden_diff": "diff --git a/scripts/db_migrations.py b/scripts/db_migrations.py\n--- a/scripts/db_migrations.py\n+++ b/scripts/db_migrations.py\n@@ -16,11 +16,14 @@\n class TRECosmosDBMigrations:\n \n def __init__(self):\n+ if (self.can_connect_to_cosmos()):\n+ url = os.environ['STATE_STORE_ENDPOINT']\n+ key = self.get_store_key()\n+ self.client = CosmosClient(url=url, credential=key)\n+ self.database = self.client.get_database_client(STATE_STORE_DATABASE)\n \n- url = os.environ['STATE_STORE_ENDPOINT']\n- key = self.get_store_key()\n- self.client = CosmosClient(url=url, credential=key)\n- self.database = self.client.get_database_client(STATE_STORE_DATABASE)\n+ def can_connect_to_cosmos(self) -> bool:\n+ return os.getenv('ENABLE_LOCAL_DEBUGGING', 'False').lower() in ('true', 1, 't') if 'ENABLE_LOCAL_DEBUGGING' in os.environ else False\n \n def get_store_key(self) -> str:\n if 'STATE_STORE_KEY' in os.environ:\n@@ -106,24 +109,27 @@\n \n def main():\n migrations = TRECosmosDBMigrations()\n- # PR 1030\n- migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateName', 'templateName')\n- migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateVersion', 'templateVersion')\n- migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateParameters', 'properties')\n-\n- # PR 1031\n- migrations.renameCosmosDBFields(\"Resources\", 'workspaceType', 'templateName')\n- migrations.renameCosmosDBFields(\"Resources\", 'workspaceServiceType', 'templateName')\n- migrations.renameCosmosDBFields(\"Resources\", 'userResourceType', 'templateName')\n-\n- # Operations History\n- migrations.moveDeploymentsToOperations(\"Resources\", \"Operations\")\n-\n- # Shared services (PR #1717)\n- migrations.deleteDuplicatedSharedServices(\"Resources\")\n-\n- # Authentication needs to be in properties so we can update them. (PR #1726)\n- migrations.moveAuthInformationToProperties(\"Resources\")\n+ if not migrations.can_connect_to_cosmos():\n+ print('You cannot migrate the cosmos database without setting ENABLE_LOCAL_DEBUGGING to true.')\n+ else:\n+ # PR 1030\n+ migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateName', 'templateName')\n+ migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateVersion', 'templateVersion')\n+ migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateParameters', 'properties')\n+\n+ # PR 1031\n+ migrations.renameCosmosDBFields(\"Resources\", 'workspaceType', 'templateName')\n+ migrations.renameCosmosDBFields(\"Resources\", 'workspaceServiceType', 'templateName')\n+ migrations.renameCosmosDBFields(\"Resources\", 'userResourceType', 'templateName')\n+\n+ # Operations History\n+ migrations.moveDeploymentsToOperations(\"Resources\", \"Operations\")\n+\n+ # Shared services (PR #1717)\n+ migrations.deleteDuplicatedSharedServices(\"Resources\")\n+\n+ # Authentication needs to be in properties so we can update them. (PR #1726)\n+ migrations.moveAuthInformationToProperties(\"Resources\")\n \n \n if __name__ == \"__main__\":\n", "issue": "make all/tre-deploy fails due to db-migrate\n**Describe the bug**\r\nRunning make `all` (or `tre-deploy` for that matter) fails because it runs `db-migrate` that needs direct access to cosmos.\r\n\r\n**Steps to reproduce**\r\n\r\n1. Make sure you don't have `ENABLE_LOCAL_DEBUGGING=true` in your .env files\r\n2. Deploy a new environment by running `make all` or `make tre-deploy`\r\n3. 
See that the activity fail due to lack of access to cosmos\r\n\n", "before_files": [{"content": "#!/usr/local/bin/python3\n\nfrom datetime import datetime\nimport os\nfrom azure.cosmos.cosmos_client import CosmosClient\nfrom azure.mgmt.cosmosdb import CosmosDBManagementClient\nfrom azure.cosmos import PartitionKey\nfrom azure.identity import DefaultAzureCredential\nimport json\nimport semantic_version\nimport uuid\n\nSTATE_STORE_DATABASE = \"AzureTRE\"\n\n\nclass TRECosmosDBMigrations:\n\n def __init__(self):\n\n url = os.environ['STATE_STORE_ENDPOINT']\n key = self.get_store_key()\n self.client = CosmosClient(url=url, credential=key)\n self.database = self.client.get_database_client(STATE_STORE_DATABASE)\n\n def get_store_key(self) -> str:\n if 'STATE_STORE_KEY' in os.environ:\n primary_master_key = os.getenv('STATE_STORE_KEY')\n else:\n credential = DefaultAzureCredential()\n cosmosdb_client = CosmosDBManagementClient(credential, subscription_id=os.environ['SUBSCRIPTION_ID'])\n database_keys = cosmosdb_client.database_accounts.list_keys(resource_group_name=os.environ['RESOURCE_GROUP_NAME'], account_name=os.environ['COSMOSDB_ACCOUNT_NAME'])\n primary_master_key = database_keys.primary_master_key\n\n return primary_master_key\n\n def renameCosmosDBFields(self, container_name, old_field_name, new_field_name):\n\n container = self.database.get_container_client(container_name)\n\n for item in container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n print(json.dumps(item, indent=True))\n if old_field_name in item:\n item[new_field_name] = item[old_field_name]\n del item[old_field_name]\n container.upsert_item(item)\n\n def moveDeploymentsToOperations(self, resources_container_name, operations_container_name):\n resources_container = self.database.get_container_client(resources_container_name)\n\n # create operations container if needed\n self.database.create_container_if_not_exists(id=operations_container_name, partition_key=PartitionKey(path=\"/id\"))\n operations_container = self.database.get_container_client(operations_container_name)\n\n for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n isActive = True\n if (\"deployment\" in item):\n newOperation = {\n \"id\": str(uuid.uuid4()),\n \"resourceId\": item[\"id\"],\n \"status\": item[\"deployment\"][\"status\"],\n \"message\": item[\"deployment\"][\"message\"],\n \"resourceVersion\": 0,\n \"createdWhen\": datetime.utcnow().timestamp(),\n \"updatedWhen\": datetime.utcnow().timestamp()\n }\n operations_container.create_item(newOperation)\n\n if item[\"deployment\"][\"status\"] == \"deleted\":\n isActive = False\n\n del item[\"deployment\"]\n item[\"isActive\"] = isActive\n resources_container.upsert_item(item)\n print(f'Moved deployment from resource id {item[\"id\"]} to operations')\n\n def deleteDuplicatedSharedServices(self, resource_container_name):\n resources_container = self.database.get_container_client(resource_container_name)\n\n template_names = ['tre-shared-service-firewall', 'tre-shared-service-nexus', 'tre-shared-service-gitea']\n\n for template_name in template_names:\n for item in resources_container.query_items(query=f'SELECT * FROM c WHERE c.resourceType = \"shared-service\" AND c.templateName = \"{template_name}\" \\\n ORDER BY c.updatedWhen ASC OFFSET 1 LIMIT 10000', enable_cross_partition_query=True):\n print(f\"Deleting element {item}\")\n resources_container.delete_item(item, partition_key=item[\"id\"])\n\n def 
moveAuthInformationToProperties(self, resources_container_name):\n resources_container = self.database.get_container_client(resources_container_name)\n\n for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n template_version = semantic_version.Version(item[\"templateVersion\"])\n if (template_version > semantic_version.Version('0.3.0') and \"authInformation\" in item):\n print(f'Found workspace {item[\"id\"]} that needs migrating')\n\n # Rename app_id to be client_id\n item[\"properties\"][\"client_id\"] = item[\"properties\"][\"app_id\"]\n del item[\"properties\"][\"app_id\"]\n del item[\"authInformation\"][\"app_id\"]\n\n # merge authInformation into properties\n item[\"properties\"] = {**item[\"authInformation\"], **item[\"properties\"]}\n del item[\"authInformation\"]\n resources_container.upsert_item(item)\n print(f'Upgraded authentication info for workspace id {item[\"id\"]}')\n\n\ndef main():\n migrations = TRECosmosDBMigrations()\n # PR 1030\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateName', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateVersion', 'templateVersion')\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateParameters', 'properties')\n\n # PR 1031\n migrations.renameCosmosDBFields(\"Resources\", 'workspaceType', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'workspaceServiceType', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'userResourceType', 'templateName')\n\n # Operations History\n migrations.moveDeploymentsToOperations(\"Resources\", \"Operations\")\n\n # Shared services (PR #1717)\n migrations.deleteDuplicatedSharedServices(\"Resources\")\n\n # Authentication needs to be in properties so we can update them. 
(PR #1726)\n migrations.moveAuthInformationToProperties(\"Resources\")\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "scripts/db_migrations.py"}], "after_files": [{"content": "#!/usr/local/bin/python3\n\nfrom datetime import datetime\nimport os\nfrom azure.cosmos.cosmos_client import CosmosClient\nfrom azure.mgmt.cosmosdb import CosmosDBManagementClient\nfrom azure.cosmos import PartitionKey\nfrom azure.identity import DefaultAzureCredential\nimport json\nimport semantic_version\nimport uuid\n\nSTATE_STORE_DATABASE = \"AzureTRE\"\n\n\nclass TRECosmosDBMigrations:\n\n def __init__(self):\n if (self.can_connect_to_cosmos()):\n url = os.environ['STATE_STORE_ENDPOINT']\n key = self.get_store_key()\n self.client = CosmosClient(url=url, credential=key)\n self.database = self.client.get_database_client(STATE_STORE_DATABASE)\n\n def can_connect_to_cosmos(self) -> bool:\n return os.getenv('ENABLE_LOCAL_DEBUGGING', 'False').lower() in ('true', 1, 't') if 'ENABLE_LOCAL_DEBUGGING' in os.environ else False\n\n def get_store_key(self) -> str:\n if 'STATE_STORE_KEY' in os.environ:\n primary_master_key = os.getenv('STATE_STORE_KEY')\n else:\n credential = DefaultAzureCredential()\n cosmosdb_client = CosmosDBManagementClient(credential, subscription_id=os.environ['SUBSCRIPTION_ID'])\n database_keys = cosmosdb_client.database_accounts.list_keys(resource_group_name=os.environ['RESOURCE_GROUP_NAME'], account_name=os.environ['COSMOSDB_ACCOUNT_NAME'])\n primary_master_key = database_keys.primary_master_key\n\n return primary_master_key\n\n def renameCosmosDBFields(self, container_name, old_field_name, new_field_name):\n\n container = self.database.get_container_client(container_name)\n\n for item in container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n print(json.dumps(item, indent=True))\n if old_field_name in item:\n item[new_field_name] = item[old_field_name]\n del item[old_field_name]\n container.upsert_item(item)\n\n def moveDeploymentsToOperations(self, resources_container_name, operations_container_name):\n resources_container = self.database.get_container_client(resources_container_name)\n\n # create operations container if needed\n self.database.create_container_if_not_exists(id=operations_container_name, partition_key=PartitionKey(path=\"/id\"))\n operations_container = self.database.get_container_client(operations_container_name)\n\n for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n isActive = True\n if (\"deployment\" in item):\n newOperation = {\n \"id\": str(uuid.uuid4()),\n \"resourceId\": item[\"id\"],\n \"status\": item[\"deployment\"][\"status\"],\n \"message\": item[\"deployment\"][\"message\"],\n \"resourceVersion\": 0,\n \"createdWhen\": datetime.utcnow().timestamp(),\n \"updatedWhen\": datetime.utcnow().timestamp()\n }\n operations_container.create_item(newOperation)\n\n if item[\"deployment\"][\"status\"] == \"deleted\":\n isActive = False\n\n del item[\"deployment\"]\n item[\"isActive\"] = isActive\n resources_container.upsert_item(item)\n print(f'Moved deployment from resource id {item[\"id\"]} to operations')\n\n def deleteDuplicatedSharedServices(self, resource_container_name):\n resources_container = self.database.get_container_client(resource_container_name)\n\n template_names = ['tre-shared-service-firewall', 'tre-shared-service-nexus', 'tre-shared-service-gitea']\n\n for template_name in template_names:\n for item in resources_container.query_items(query=f'SELECT * FROM c WHERE 
c.resourceType = \"shared-service\" AND c.templateName = \"{template_name}\" \\\n ORDER BY c.updatedWhen ASC OFFSET 1 LIMIT 10000', enable_cross_partition_query=True):\n print(f\"Deleting element {item}\")\n resources_container.delete_item(item, partition_key=item[\"id\"])\n\n def moveAuthInformationToProperties(self, resources_container_name):\n resources_container = self.database.get_container_client(resources_container_name)\n\n for item in resources_container.query_items(query='SELECT * FROM c', enable_cross_partition_query=True):\n template_version = semantic_version.Version(item[\"templateVersion\"])\n if (template_version > semantic_version.Version('0.3.0') and \"authInformation\" in item):\n print(f'Found workspace {item[\"id\"]} that needs migrating')\n\n # Rename app_id to be client_id\n item[\"properties\"][\"client_id\"] = item[\"properties\"][\"app_id\"]\n del item[\"properties\"][\"app_id\"]\n del item[\"authInformation\"][\"app_id\"]\n\n # merge authInformation into properties\n item[\"properties\"] = {**item[\"authInformation\"], **item[\"properties\"]}\n del item[\"authInformation\"]\n resources_container.upsert_item(item)\n print(f'Upgraded authentication info for workspace id {item[\"id\"]}')\n\n\ndef main():\n migrations = TRECosmosDBMigrations()\n if not migrations.can_connect_to_cosmos():\n print('You cannot migrate the cosmos database without setting ENABLE_LOCAL_DEBUGGING to true.')\n else:\n # PR 1030\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateName', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateVersion', 'templateVersion')\n migrations.renameCosmosDBFields(\"Resources\", 'resourceTemplateParameters', 'properties')\n\n # PR 1031\n migrations.renameCosmosDBFields(\"Resources\", 'workspaceType', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'workspaceServiceType', 'templateName')\n migrations.renameCosmosDBFields(\"Resources\", 'userResourceType', 'templateName')\n\n # Operations History\n migrations.moveDeploymentsToOperations(\"Resources\", \"Operations\")\n\n # Shared services (PR #1717)\n migrations.deleteDuplicatedSharedServices(\"Resources\")\n\n # Authentication needs to be in properties so we can update them. (PR #1726)\n migrations.moveAuthInformationToProperties(\"Resources\")\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "scripts/db_migrations.py"}]}
| 1,845 | 754 |
gh_patches_debug_1294
|
rasdani/github-patches
|
git_diff
|
Nitrate__Nitrate-564
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove Django 2.0
Django 2.0 is not supported and marked as insecure. Refer to https://docs.djangoproject.com/en/2.0/
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 from setuptools import setup, find_packages
4
5
6 with open('VERSION.txt', 'r') as f:
7 pkg_version = f.read().strip()
8
9
10 def get_long_description():
11 with open('README.rst', 'r') as f:
12 return f.read()
13
14
15 install_requires = [
16 'beautifulsoup4 >= 4.1.1',
17 'django >= 2.0,<3.0',
18 'django-contrib-comments == 1.9.1',
19 'django-tinymce == 2.7.0',
20 'django-uuslug == 1.1.8',
21 'html2text',
22 'odfpy >= 0.9.6',
23 'python-bugzilla',
24 'xmltodict',
25 'kobo == 0.9.0'
26 ]
27
28 extras_require = {
29 'mysql': ['mysqlclient >= 1.2.3'],
30 'pgsql': ['psycopg2 == 2.7.5'],
31
32 # Required for tcms.auth.backends.KerberosBackend
33 'krbauth': [
34 'kerberos == 1.2.5'
35 ],
36
37 # Packages for building documentation
38 'docs': [
39 'Sphinx >= 1.1.2',
40 'sphinx_rtd_theme',
41 ],
42
43 # Necessary packages for running tests
44 'tests': [
45 'beautifulsoup4',
46 'coverage',
47 'factory_boy',
48 'flake8',
49 'pytest',
50 'pytest-cov',
51 'pytest-django',
52 ],
53
54 # Contain tools that assists the development
55 'devtools': [
56 'django-debug-toolbar',
57 'tox',
58 'django-extensions',
59 'pygraphviz',
60 ],
61
62 # Required packages required to run async tasks
63 'async': [
64 'celery == 4.2.0',
65 ],
66
67 'multiauth': [
68 'social-auth-app-django == 3.1.0',
69 ]
70 }
71
72 setup(
73 name='nitrate-tcms',
74 version=pkg_version,
75 description='A full-featured Test Case Management System',
76 long_description=get_long_description(),
77 author='Nitrate Team',
78 maintainer='Chenxiong Qi',
79 maintainer_email='[email protected]',
80 url='https://github.com/Nitrate/Nitrate/',
81 license='GPLv2+',
82 keywords='test case',
83 install_requires=install_requires,
84 extras_require=extras_require,
85 python_requires='>=3.6',
86 package_dir={'': 'src'},
87 packages=find_packages('src', exclude=['test*']),
88 include_package_data=True,
89 zip_safe=False,
90 classifiers=[
91 'Framework :: Django',
92 'Framework :: Django :: 2.0',
93 'Framework :: Django :: 2.1',
94 'Framework :: Django :: 2.2',
95 'Intended Audience :: Developers',
96 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',
97 'Programming Language :: Python :: 3',
98 'Programming Language :: Python :: 3.6',
99 'Programming Language :: Python :: 3.7',
100 'Programming Language :: Python :: 3 :: Only',
101 'Topic :: Software Development :: Quality Assurance',
102 'Topic :: Software Development :: Testing',
103 ],
104 project_urls={
105 'Issue Tracker': 'https://github.com/Nitrate/Nitrate/issues',
106 'Source Code': 'https://github.com/Nitrate/Nitrate',
107 'Documentation': 'https://nitrate.readthedocs.io/',
108 },
109 )
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -14,7 +14,7 @@
install_requires = [
'beautifulsoup4 >= 4.1.1',
- 'django >= 2.0,<3.0',
+ 'django >= 2.1,<3.0',
'django-contrib-comments == 1.9.1',
'django-tinymce == 2.7.0',
'django-uuslug == 1.1.8',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -14,7 +14,7 @@\n \n install_requires = [\n 'beautifulsoup4 >= 4.1.1',\n- 'django >= 2.0,<3.0',\n+ 'django >= 2.1,<3.0',\n 'django-contrib-comments == 1.9.1',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n", "issue": "Remove Django 2.0\nDjango 2.0 is not supported and marked as insecure. Refer to https://docs.djangoproject.com/en/2.0/\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'beautifulsoup4 >= 4.1.1',\n 'django >= 2.0,<3.0',\n 'django-contrib-comments == 1.9.1',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n 'html2text',\n 'odfpy >= 0.9.6',\n 'python-bugzilla',\n 'xmltodict',\n 'kobo == 0.9.0'\n]\n\nextras_require = {\n 'mysql': ['mysqlclient >= 1.2.3'],\n 'pgsql': ['psycopg2 == 2.7.5'],\n\n # Required for tcms.auth.backends.KerberosBackend\n 'krbauth': [\n 'kerberos == 1.2.5'\n ],\n\n # Packages for building documentation\n 'docs': [\n 'Sphinx >= 1.1.2',\n 'sphinx_rtd_theme',\n ],\n\n # Necessary packages for running tests\n 'tests': [\n 'beautifulsoup4',\n 'coverage',\n 'factory_boy',\n 'flake8',\n 'pytest',\n 'pytest-cov',\n 'pytest-django',\n ],\n\n # Contain tools that assists the development\n 'devtools': [\n 'django-debug-toolbar',\n 'tox',\n 'django-extensions',\n 'pygraphviz',\n ],\n\n # Required packages required to run async tasks\n 'async': [\n 'celery == 4.2.0',\n ],\n\n 'multiauth': [\n 'social-auth-app-django == 3.1.0',\n ]\n}\n\nsetup(\n name='nitrate-tcms',\n version=pkg_version,\n description='A full-featured Test Case Management System',\n long_description=get_long_description(),\n author='Nitrate Team',\n maintainer='Chenxiong Qi',\n maintainer_email='[email protected]',\n url='https://github.com/Nitrate/Nitrate/',\n license='GPLv2+',\n keywords='test case',\n install_requires=install_requires,\n extras_require=extras_require,\n python_requires='>=3.6',\n package_dir={'': 'src'},\n packages=find_packages('src', exclude=['test*']),\n include_package_data=True,\n zip_safe=False,\n classifiers=[\n 'Framework :: Django',\n 'Framework :: Django :: 2.0',\n 'Framework :: Django :: 2.1',\n 'Framework :: Django :: 2.2',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Software Development :: Quality Assurance',\n 'Topic :: Software Development :: Testing',\n ],\n project_urls={\n 'Issue Tracker': 'https://github.com/Nitrate/Nitrate/issues',\n 'Source Code': 'https://github.com/Nitrate/Nitrate',\n 'Documentation': 'https://nitrate.readthedocs.io/',\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'beautifulsoup4 >= 4.1.1',\n 'django >= 2.1,<3.0',\n 'django-contrib-comments == 1.9.1',\n 'django-tinymce == 2.7.0',\n 'django-uuslug == 1.1.8',\n 'html2text',\n 'odfpy >= 0.9.6',\n 'python-bugzilla',\n 
'xmltodict',\n 'kobo == 0.9.0'\n]\n\nextras_require = {\n 'mysql': ['mysqlclient >= 1.2.3'],\n 'pgsql': ['psycopg2 == 2.7.5'],\n\n # Required for tcms.auth.backends.KerberosBackend\n 'krbauth': [\n 'kerberos == 1.2.5'\n ],\n\n # Packages for building documentation\n 'docs': [\n 'Sphinx >= 1.1.2',\n 'sphinx_rtd_theme',\n ],\n\n # Necessary packages for running tests\n 'tests': [\n 'beautifulsoup4',\n 'coverage',\n 'factory_boy',\n 'flake8',\n 'pytest',\n 'pytest-cov',\n 'pytest-django',\n ],\n\n # Contain tools that assists the development\n 'devtools': [\n 'django-debug-toolbar',\n 'tox',\n 'django-extensions',\n 'pygraphviz',\n ],\n\n # Required packages required to run async tasks\n 'async': [\n 'celery == 4.2.0',\n ],\n\n 'multiauth': [\n 'social-auth-app-django == 3.1.0',\n ]\n}\n\nsetup(\n name='nitrate-tcms',\n version=pkg_version,\n description='A full-featured Test Case Management System',\n long_description=get_long_description(),\n author='Nitrate Team',\n maintainer='Chenxiong Qi',\n maintainer_email='[email protected]',\n url='https://github.com/Nitrate/Nitrate/',\n license='GPLv2+',\n keywords='test case',\n install_requires=install_requires,\n extras_require=extras_require,\n python_requires='>=3.6',\n package_dir={'': 'src'},\n packages=find_packages('src', exclude=['test*']),\n include_package_data=True,\n zip_safe=False,\n classifiers=[\n 'Framework :: Django',\n 'Framework :: Django :: 2.0',\n 'Framework :: Django :: 2.1',\n 'Framework :: Django :: 2.2',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Software Development :: Quality Assurance',\n 'Topic :: Software Development :: Testing',\n ],\n project_urls={\n 'Issue Tracker': 'https://github.com/Nitrate/Nitrate/issues',\n 'Source Code': 'https://github.com/Nitrate/Nitrate',\n 'Documentation': 'https://nitrate.readthedocs.io/',\n },\n)\n", "path": "setup.py"}]}
| 1,294 | 119 |
gh_patches_debug_12358
|
rasdani/github-patches
|
git_diff
|
fossasia__open-event-server-6307
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Wrong event invoice status fields in db schema
**Describe the bug**
The values allowed for invoices are paid, due & incoming [schema choices]. They are currently equal to those of orders.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/api/schema/event_invoices.py`
Content:
```
1 from marshmallow import validate as validate
2 from marshmallow_jsonapi import fields
3 from marshmallow_jsonapi.flask import Relationship
4
5 from app.api.helpers.static import PAYMENT_COUNTRIES
6 from app.api.helpers.utilities import dasherize
7 from app.api.schema.base import SoftDeletionSchema
8 from utils.common import use_defaults
9
10
11 @use_defaults()
12 class EventInvoiceSchema(SoftDeletionSchema):
13 """
14 Event Invoice API Schema based on event invoice model
15 """
16 class Meta:
17 type_ = 'event-invoice'
18 self_view = 'v1.event_invoice_detail'
19 self_view_kwargs = {'id': '<id>'}
20 inflect = dasherize
21
22 id = fields.Str(dump_only=True)
23 identifier = fields.Str(allow_none=True)
24 amount = fields.Float(validate=lambda n: n >= 0, allow_none=True)
25 address = fields.Str(allow_none=True)
26 city = fields.Str(allow_none=True)
27 state = fields.Str(allow_none=True)
28 country = fields.Str(validate=validate.OneOf(choices=PAYMENT_COUNTRIES), allow_none=True)
29 zipcode = fields.Str(allow_none=True)
30 created_at = fields.DateTime(allow_none=True)
31 completed_at = fields.DateTime(default=None)
32 transaction_id = fields.Str(allow_none=True)
33 paid_via = fields.Str(validate=validate.OneOf(
34 choices=["free", "stripe", "paypal", "transfer", "onsite", "cheque"]), allow_none=True)
35 payment_mode = fields.Str(allow_none=True)
36 brand = fields.Str(allow_none=True)
37 exp_month = fields.Integer(validate=lambda n: 0 <= n <= 12, allow_none=True)
38 exp_year = fields.Integer(validate=lambda n: n >= 2015, allow_none=True)
39 last4 = fields.Str(allow_none=True)
40 stripe_token = fields.Str(allow_none=True)
41 paypal_token = fields.Str(allow_none=True)
42 status = fields.Str(validate=validate.OneOf(
43 choices=["expired", "deleted", "initialized" "completed", "placed", "pending", "cancelled"]), allow_none=True)
44 invoice_pdf_url = fields.Url(allow_none=True)
45 user = Relationship(attribute='user',
46 self_view='v1.event_invoice_user',
47 self_view_kwargs={'id': '<id>'},
48 related_view='v1.user_detail',
49 related_view_kwargs={'event_invoice_id': '<id>'},
50 schema='UserSchemaPublic',
51 type_='user')
52 order = Relationship(attribute='order',
53 self_view='v1.event_invoice_order',
54 self_view_kwargs={'id': '<id>'},
55 related_view='v1.order_detail',
56 related_view_kwargs={'id': '<id>'},
57 schema='OrderSchema',
58 type_='order')
59 event = Relationship(attribute='event',
60 self_view='v1.event_invoice_event',
61 self_view_kwargs={'id': '<id>'},
62 related_view='v1.event_detail',
63 related_view_kwargs={'event_invoice_id': '<id>'},
64 schema='EventSchemaPublic',
65 type_='event')
66 discount_code = Relationship(attribute='discount_code',
67 self_view='v1.event_invoice_discount_code',
68 self_view_kwargs={'id': '<id>'},
69 related_view='v1.discount_code_detail',
70 related_view_kwargs={'event_invoice_id': '<id>'},
71 schema='DiscountCodeSchemaPublic',
72 type_='discount-code')
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/app/api/schema/event_invoices.py b/app/api/schema/event_invoices.py
--- a/app/api/schema/event_invoices.py
+++ b/app/api/schema/event_invoices.py
@@ -39,8 +39,7 @@
last4 = fields.Str(allow_none=True)
stripe_token = fields.Str(allow_none=True)
paypal_token = fields.Str(allow_none=True)
- status = fields.Str(validate=validate.OneOf(
- choices=["expired", "deleted", "initialized" "completed", "placed", "pending", "cancelled"]), allow_none=True)
+ status = fields.Str(validate=validate.OneOf(choices=["paid", "due"]), allow_none=True)
invoice_pdf_url = fields.Url(allow_none=True)
user = Relationship(attribute='user',
self_view='v1.event_invoice_user',
|
{"golden_diff": "diff --git a/app/api/schema/event_invoices.py b/app/api/schema/event_invoices.py\n--- a/app/api/schema/event_invoices.py\n+++ b/app/api/schema/event_invoices.py\n@@ -39,8 +39,7 @@\n last4 = fields.Str(allow_none=True)\n stripe_token = fields.Str(allow_none=True)\n paypal_token = fields.Str(allow_none=True)\n- status = fields.Str(validate=validate.OneOf(\n- choices=[\"expired\", \"deleted\", \"initialized\" \"completed\", \"placed\", \"pending\", \"cancelled\"]), allow_none=True)\n+ status = fields.Str(validate=validate.OneOf(choices=[\"paid\", \"due\"]), allow_none=True)\n invoice_pdf_url = fields.Url(allow_none=True)\n user = Relationship(attribute='user',\n self_view='v1.event_invoice_user',\n", "issue": "Wrong event invoice status fields in db schema\n**Describe the bug**\r\nThe values allowed for invoices are paid, due &incoming [schema choices]. They are currently equal to those of orders.\r\n\r\n\n", "before_files": [{"content": "from marshmallow import validate as validate\nfrom marshmallow_jsonapi import fields\nfrom marshmallow_jsonapi.flask import Relationship\n\nfrom app.api.helpers.static import PAYMENT_COUNTRIES\nfrom app.api.helpers.utilities import dasherize\nfrom app.api.schema.base import SoftDeletionSchema\nfrom utils.common import use_defaults\n\n\n@use_defaults()\nclass EventInvoiceSchema(SoftDeletionSchema):\n \"\"\"\n Event Invoice API Schema based on event invoice model\n \"\"\"\n class Meta:\n type_ = 'event-invoice'\n self_view = 'v1.event_invoice_detail'\n self_view_kwargs = {'id': '<id>'}\n inflect = dasherize\n\n id = fields.Str(dump_only=True)\n identifier = fields.Str(allow_none=True)\n amount = fields.Float(validate=lambda n: n >= 0, allow_none=True)\n address = fields.Str(allow_none=True)\n city = fields.Str(allow_none=True)\n state = fields.Str(allow_none=True)\n country = fields.Str(validate=validate.OneOf(choices=PAYMENT_COUNTRIES), allow_none=True)\n zipcode = fields.Str(allow_none=True)\n created_at = fields.DateTime(allow_none=True)\n completed_at = fields.DateTime(default=None)\n transaction_id = fields.Str(allow_none=True)\n paid_via = fields.Str(validate=validate.OneOf(\n choices=[\"free\", \"stripe\", \"paypal\", \"transfer\", \"onsite\", \"cheque\"]), allow_none=True)\n payment_mode = fields.Str(allow_none=True)\n brand = fields.Str(allow_none=True)\n exp_month = fields.Integer(validate=lambda n: 0 <= n <= 12, allow_none=True)\n exp_year = fields.Integer(validate=lambda n: n >= 2015, allow_none=True)\n last4 = fields.Str(allow_none=True)\n stripe_token = fields.Str(allow_none=True)\n paypal_token = fields.Str(allow_none=True)\n status = fields.Str(validate=validate.OneOf(\n choices=[\"expired\", \"deleted\", \"initialized\" \"completed\", \"placed\", \"pending\", \"cancelled\"]), allow_none=True)\n invoice_pdf_url = fields.Url(allow_none=True)\n user = Relationship(attribute='user',\n self_view='v1.event_invoice_user',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.user_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n schema='UserSchemaPublic',\n type_='user')\n order = Relationship(attribute='order',\n self_view='v1.event_invoice_order',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.order_detail',\n related_view_kwargs={'id': '<id>'},\n schema='OrderSchema',\n type_='order')\n event = Relationship(attribute='event',\n self_view='v1.event_invoice_event',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.event_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n 
schema='EventSchemaPublic',\n type_='event')\n discount_code = Relationship(attribute='discount_code',\n self_view='v1.event_invoice_discount_code',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.discount_code_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n schema='DiscountCodeSchemaPublic',\n type_='discount-code')\n", "path": "app/api/schema/event_invoices.py"}], "after_files": [{"content": "from marshmallow import validate as validate\nfrom marshmallow_jsonapi import fields\nfrom marshmallow_jsonapi.flask import Relationship\n\nfrom app.api.helpers.static import PAYMENT_COUNTRIES\nfrom app.api.helpers.utilities import dasherize\nfrom app.api.schema.base import SoftDeletionSchema\nfrom utils.common import use_defaults\n\n\n@use_defaults()\nclass EventInvoiceSchema(SoftDeletionSchema):\n \"\"\"\n Event Invoice API Schema based on event invoice model\n \"\"\"\n class Meta:\n type_ = 'event-invoice'\n self_view = 'v1.event_invoice_detail'\n self_view_kwargs = {'id': '<id>'}\n inflect = dasherize\n\n id = fields.Str(dump_only=True)\n identifier = fields.Str(allow_none=True)\n amount = fields.Float(validate=lambda n: n >= 0, allow_none=True)\n address = fields.Str(allow_none=True)\n city = fields.Str(allow_none=True)\n state = fields.Str(allow_none=True)\n country = fields.Str(validate=validate.OneOf(choices=PAYMENT_COUNTRIES), allow_none=True)\n zipcode = fields.Str(allow_none=True)\n created_at = fields.DateTime(allow_none=True)\n completed_at = fields.DateTime(default=None)\n transaction_id = fields.Str(allow_none=True)\n paid_via = fields.Str(validate=validate.OneOf(\n choices=[\"free\", \"stripe\", \"paypal\", \"transfer\", \"onsite\", \"cheque\"]), allow_none=True)\n payment_mode = fields.Str(allow_none=True)\n brand = fields.Str(allow_none=True)\n exp_month = fields.Integer(validate=lambda n: 0 <= n <= 12, allow_none=True)\n exp_year = fields.Integer(validate=lambda n: n >= 2015, allow_none=True)\n last4 = fields.Str(allow_none=True)\n stripe_token = fields.Str(allow_none=True)\n paypal_token = fields.Str(allow_none=True)\n status = fields.Str(validate=validate.OneOf(choices=[\"paid\", \"due\"]), allow_none=True)\n invoice_pdf_url = fields.Url(allow_none=True)\n user = Relationship(attribute='user',\n self_view='v1.event_invoice_user',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.user_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n schema='UserSchemaPublic',\n type_='user')\n order = Relationship(attribute='order',\n self_view='v1.event_invoice_order',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.order_detail',\n related_view_kwargs={'id': '<id>'},\n schema='OrderSchema',\n type_='order')\n event = Relationship(attribute='event',\n self_view='v1.event_invoice_event',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.event_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n schema='EventSchemaPublic',\n type_='event')\n discount_code = Relationship(attribute='discount_code',\n self_view='v1.event_invoice_discount_code',\n self_view_kwargs={'id': '<id>'},\n related_view='v1.discount_code_detail',\n related_view_kwargs={'event_invoice_id': '<id>'},\n schema='DiscountCodeSchemaPublic',\n type_='discount-code')\n", "path": "app/api/schema/event_invoices.py"}]}
| 1,154 | 177 |
gh_patches_debug_24982
|
rasdani/github-patches
|
git_diff
|
facebookresearch__ParlAI-1923
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Instructions at https://parl.ai yield: Parse Error: unrecognized arguments: -n 100
From the examples section at https://parl.ai/, when running:
```bash
python examples/display_data.py -t babi:task1k:1,squad -n 100
```
I get the error:
```
Parse Error: unrecognized arguments: -n 100
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `parlai/scripts/display_model.py`
Content:
```
1 #!/usr/bin/env python3
2
3 # Copyright (c) Facebook, Inc. and its affiliates.
4 # This source code is licensed under the MIT license found in the
5 # LICENSE file in the root directory of this source tree.
6 """Basic example which iterates through the tasks specified and runs the given
7 model on them.
8
9 Examples
10 --------
11
12 .. code-block:: shell
13
14 python examples/display_model.py -t babi:task1k:1 -m "repeat_label"
15 python examples/display_model.py -t "#MovieDD-Reddit" -m "ir_baseline" -mp "-lp 0.5" -dt test
16 """ # noqa: E501
17
18 from parlai.core.params import ParlaiParser
19 from parlai.core.agents import create_agent
20 from parlai.core.worlds import create_task
21
22 import random
23
24
25 def setup_args():
26 parser = ParlaiParser(True, True, 'Display model predictions.')
27 parser.add_argument('-n', '--num-examples', default=10)
28 parser.add_argument('--display-ignore-fields', type=str, default='')
29 # by default we want to display info about the validation set
30 parser.set_defaults(datatype='valid')
31 return parser
32
33
34 def display_model(opt):
35 random.seed(42)
36
37 # Create model and assign it to the specified task
38 agent = create_agent(opt)
39 world = create_task(opt, agent)
40
41 # Show some example dialogs.
42 with world:
43 for _k in range(int(opt['num_examples'])):
44 world.parley()
45 print(world.display() + "\n~~")
46 if world.epoch_done():
47 print("EPOCH DONE")
48 break
49
50
51 if __name__ == '__main__':
52 # Get command line arguments
53 parser = setup_args()
54 opt = parser.parse_args()
55 display_model(opt)
56
```
Path: `parlai/scripts/display_data.py`
Content:
```
1 #!/usr/bin/env python3
2
3 # Copyright (c) Facebook, Inc. and its affiliates.
4 # This source code is licensed under the MIT license found in the
5 # LICENSE file in the root directory of this source tree.
6 """Basic example which iterates through the tasks specified and prints them out.
7 Used for verification of data loading and iteration.
8
9 For example, to make sure that bAbI task 1 (1k exs) loads one can run and to
10 see a few of them:
11
12 Examples
13 --------
14
15 .. code-block:: shell
16
17 python display_data.py -t babi:task1k:1
18 """
19
20 from parlai.core.params import ParlaiParser
21 from parlai.agents.repeat_label.repeat_label import RepeatLabelAgent
22 from parlai.core.worlds import create_task
23
24 import random
25
26
27 def setup_args(parser=None):
28 if parser is None:
29 parser = ParlaiParser(True, True, 'Display data from a task')
30 parser.add_pytorch_datateacher_args()
31 # Get command line arguments
32 parser.add_argument('-ne', '--num_examples', type=int, default=10)
33 parser.add_argument('-mdl', '--max_display_len', type=int, default=1000)
34 parser.add_argument('--display_ignore_fields', type=str, default='agent_reply')
35 parser.set_defaults(datatype='train:stream')
36 return parser
37
38
39 def display_data(opt):
40 # create repeat label agent and assign it to the specified task
41 agent = RepeatLabelAgent(opt)
42 world = create_task(opt, agent)
43
44 # Show some example dialogs.
45 for _ in range(opt['num_examples']):
46 world.parley()
47
48 # NOTE: If you want to look at the data from here rather than calling
49 # world.display() you could access world.acts[0] directly
50 print(world.display() + '\n~~')
51
52 if world.epoch_done():
53 print('EPOCH DONE')
54 break
55
56 try:
57 # print dataset size if available
58 print(
59 '[ loaded {} episodes with a total of {} examples ]'.format(
60 world.num_episodes(), world.num_examples()
61 )
62 )
63 except Exception:
64 pass
65
66
67 if __name__ == '__main__':
68 random.seed(42)
69
70 # Get command line arguments
71 parser = setup_args()
72 opt = parser.parse_args()
73 display_data(opt)
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/parlai/scripts/display_data.py b/parlai/scripts/display_data.py
--- a/parlai/scripts/display_data.py
+++ b/parlai/scripts/display_data.py
@@ -29,9 +29,9 @@
parser = ParlaiParser(True, True, 'Display data from a task')
parser.add_pytorch_datateacher_args()
# Get command line arguments
- parser.add_argument('-ne', '--num_examples', type=int, default=10)
- parser.add_argument('-mdl', '--max_display_len', type=int, default=1000)
- parser.add_argument('--display_ignore_fields', type=str, default='agent_reply')
+ parser.add_argument('-n', '-ne', '--num-examples', type=int, default=10)
+ parser.add_argument('-mdl', '--max-display-len', type=int, default=1000)
+ parser.add_argument('--display-ignore-fields', type=str, default='agent_reply')
parser.set_defaults(datatype='train:stream')
return parser
diff --git a/parlai/scripts/display_model.py b/parlai/scripts/display_model.py
--- a/parlai/scripts/display_model.py
+++ b/parlai/scripts/display_model.py
@@ -24,7 +24,7 @@
def setup_args():
parser = ParlaiParser(True, True, 'Display model predictions.')
- parser.add_argument('-n', '--num-examples', default=10)
+ parser.add_argument('-n', '-ne', '--num-examples', default=10)
parser.add_argument('--display-ignore-fields', type=str, default='')
# by default we want to display info about the validation set
parser.set_defaults(datatype='valid')
|
{"golden_diff": "diff --git a/parlai/scripts/display_data.py b/parlai/scripts/display_data.py\n--- a/parlai/scripts/display_data.py\n+++ b/parlai/scripts/display_data.py\n@@ -29,9 +29,9 @@\n parser = ParlaiParser(True, True, 'Display data from a task')\n parser.add_pytorch_datateacher_args()\n # Get command line arguments\n- parser.add_argument('-ne', '--num_examples', type=int, default=10)\n- parser.add_argument('-mdl', '--max_display_len', type=int, default=1000)\n- parser.add_argument('--display_ignore_fields', type=str, default='agent_reply')\n+ parser.add_argument('-n', '-ne', '--num-examples', type=int, default=10)\n+ parser.add_argument('-mdl', '--max-display-len', type=int, default=1000)\n+ parser.add_argument('--display-ignore-fields', type=str, default='agent_reply')\n parser.set_defaults(datatype='train:stream')\n return parser\n \ndiff --git a/parlai/scripts/display_model.py b/parlai/scripts/display_model.py\n--- a/parlai/scripts/display_model.py\n+++ b/parlai/scripts/display_model.py\n@@ -24,7 +24,7 @@\n \n def setup_args():\n parser = ParlaiParser(True, True, 'Display model predictions.')\n- parser.add_argument('-n', '--num-examples', default=10)\n+ parser.add_argument('-n', '-ne', '--num-examples', default=10)\n parser.add_argument('--display-ignore-fields', type=str, default='')\n # by default we want to display info about the validation set\n parser.set_defaults(datatype='valid')\n", "issue": "Instructions at https://parl.ai yield: Parse Error: unrecognized arguments: -n 100\nFrom the examples section at https://parl.ai/, when running:\r\n\r\n```bash\r\npython examples/display_data.py -t babi:task1k:1,squad -n 100\r\n```\r\n\r\nI get the error:\r\n```\r\nParse Error: unrecognized arguments: -n 100\r\n```\n", "before_files": [{"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"Basic example which iterates through the tasks specified and runs the given\nmodel on them.\n\nExamples\n--------\n\n.. code-block:: shell\n\n python examples/display_model.py -t babi:task1k:1 -m \"repeat_label\"\n python examples/display_model.py -t \"#MovieDD-Reddit\" -m \"ir_baseline\" -mp \"-lp 0.5\" -dt test\n\"\"\" # noqa: E501\n\nfrom parlai.core.params import ParlaiParser\nfrom parlai.core.agents import create_agent\nfrom parlai.core.worlds import create_task\n\nimport random\n\n\ndef setup_args():\n parser = ParlaiParser(True, True, 'Display model predictions.')\n parser.add_argument('-n', '--num-examples', default=10)\n parser.add_argument('--display-ignore-fields', type=str, default='')\n # by default we want to display info about the validation set\n parser.set_defaults(datatype='valid')\n return parser\n\n\ndef display_model(opt):\n random.seed(42)\n\n # Create model and assign it to the specified task\n agent = create_agent(opt)\n world = create_task(opt, agent)\n\n # Show some example dialogs.\n with world:\n for _k in range(int(opt['num_examples'])):\n world.parley()\n print(world.display() + \"\\n~~\")\n if world.epoch_done():\n print(\"EPOCH DONE\")\n break\n\n\nif __name__ == '__main__':\n # Get command line arguments\n parser = setup_args()\n opt = parser.parse_args()\n display_model(opt)\n", "path": "parlai/scripts/display_model.py"}, {"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. 
and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"Basic example which iterates through the tasks specified and prints them out.\nUsed for verification of data loading and iteration.\n\nFor example, to make sure that bAbI task 1 (1k exs) loads one can run and to\nsee a few of them:\n\nExamples\n--------\n\n.. code-block:: shell\n\n python display_data.py -t babi:task1k:1\n\"\"\"\n\nfrom parlai.core.params import ParlaiParser\nfrom parlai.agents.repeat_label.repeat_label import RepeatLabelAgent\nfrom parlai.core.worlds import create_task\n\nimport random\n\n\ndef setup_args(parser=None):\n if parser is None:\n parser = ParlaiParser(True, True, 'Display data from a task')\n parser.add_pytorch_datateacher_args()\n # Get command line arguments\n parser.add_argument('-ne', '--num_examples', type=int, default=10)\n parser.add_argument('-mdl', '--max_display_len', type=int, default=1000)\n parser.add_argument('--display_ignore_fields', type=str, default='agent_reply')\n parser.set_defaults(datatype='train:stream')\n return parser\n\n\ndef display_data(opt):\n # create repeat label agent and assign it to the specified task\n agent = RepeatLabelAgent(opt)\n world = create_task(opt, agent)\n\n # Show some example dialogs.\n for _ in range(opt['num_examples']):\n world.parley()\n\n # NOTE: If you want to look at the data from here rather than calling\n # world.display() you could access world.acts[0] directly\n print(world.display() + '\\n~~')\n\n if world.epoch_done():\n print('EPOCH DONE')\n break\n\n try:\n # print dataset size if available\n print(\n '[ loaded {} episodes with a total of {} examples ]'.format(\n world.num_episodes(), world.num_examples()\n )\n )\n except Exception:\n pass\n\n\nif __name__ == '__main__':\n random.seed(42)\n\n # Get command line arguments\n parser = setup_args()\n opt = parser.parse_args()\n display_data(opt)\n", "path": "parlai/scripts/display_data.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"Basic example which iterates through the tasks specified and runs the given\nmodel on them.\n\nExamples\n--------\n\n.. 
code-block:: shell\n\n python examples/display_model.py -t babi:task1k:1 -m \"repeat_label\"\n python examples/display_model.py -t \"#MovieDD-Reddit\" -m \"ir_baseline\" -mp \"-lp 0.5\" -dt test\n\"\"\" # noqa: E501\n\nfrom parlai.core.params import ParlaiParser\nfrom parlai.core.agents import create_agent\nfrom parlai.core.worlds import create_task\n\nimport random\n\n\ndef setup_args():\n parser = ParlaiParser(True, True, 'Display model predictions.')\n parser.add_argument('-n', '-ne', '--num-examples', default=10)\n parser.add_argument('--display-ignore-fields', type=str, default='')\n # by default we want to display info about the validation set\n parser.set_defaults(datatype='valid')\n return parser\n\n\ndef display_model(opt):\n random.seed(42)\n\n # Create model and assign it to the specified task\n agent = create_agent(opt)\n world = create_task(opt, agent)\n\n # Show some example dialogs.\n with world:\n for _k in range(int(opt['num_examples'])):\n world.parley()\n print(world.display() + \"\\n~~\")\n if world.epoch_done():\n print(\"EPOCH DONE\")\n break\n\n\nif __name__ == '__main__':\n # Get command line arguments\n parser = setup_args()\n opt = parser.parse_args()\n display_model(opt)\n", "path": "parlai/scripts/display_model.py"}, {"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"Basic example which iterates through the tasks specified and prints them out.\nUsed for verification of data loading and iteration.\n\nFor example, to make sure that bAbI task 1 (1k exs) loads one can run and to\nsee a few of them:\n\nExamples\n--------\n\n.. code-block:: shell\n\n python display_data.py -t babi:task1k:1\n\"\"\"\n\nfrom parlai.core.params import ParlaiParser\nfrom parlai.agents.repeat_label.repeat_label import RepeatLabelAgent\nfrom parlai.core.worlds import create_task\n\nimport random\n\n\ndef setup_args(parser=None):\n if parser is None:\n parser = ParlaiParser(True, True, 'Display data from a task')\n parser.add_pytorch_datateacher_args()\n # Get command line arguments\n parser.add_argument('-n', '-ne', '--num-examples', type=int, default=10)\n parser.add_argument('-mdl', '--max-display-len', type=int, default=1000)\n parser.add_argument('--display-ignore-fields', type=str, default='agent_reply')\n parser.set_defaults(datatype='train:stream')\n return parser\n\n\ndef display_data(opt):\n # create repeat label agent and assign it to the specified task\n agent = RepeatLabelAgent(opt)\n world = create_task(opt, agent)\n\n # Show some example dialogs.\n for _ in range(opt['num_examples']):\n world.parley()\n\n # NOTE: If you want to look at the data from here rather than calling\n # world.display() you could access world.acts[0] directly\n print(world.display() + '\\n~~')\n\n if world.epoch_done():\n print('EPOCH DONE')\n break\n\n try:\n # print dataset size if available\n print(\n '[ loaded {} episodes with a total of {} examples ]'.format(\n world.num_episodes(), world.num_examples()\n )\n )\n except Exception:\n pass\n\n\nif __name__ == '__main__':\n random.seed(42)\n\n # Get command line arguments\n parser = setup_args()\n opt = parser.parse_args()\n display_data(opt)\n", "path": "parlai/scripts/display_data.py"}]}
| 1,510 | 383 |
gh_patches_debug_8666
|
rasdani/github-patches
|
git_diff
|
open-telemetry__opentelemetry-python-570
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove unnecessary Flask app
The current auto-instrumentation example includes an unnecessary Flask app in its client; remove that.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/examples/auto-instrumentation/client.py`
Content:
```
1 # Copyright The OpenTelemetry Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from sys import argv
16
17 from flask import Flask
18 from requests import get
19
20 from opentelemetry import propagators, trace
21 from opentelemetry.sdk.trace import TracerProvider
22 from opentelemetry.sdk.trace.export import (
23 ConsoleSpanExporter,
24 SimpleExportSpanProcessor,
25 )
26
27 app = Flask(__name__)
28
29 trace.set_tracer_provider(TracerProvider())
30 tracer = trace.get_tracer_provider().get_tracer(__name__)
31
32 trace.get_tracer_provider().add_span_processor(
33 SimpleExportSpanProcessor(ConsoleSpanExporter())
34 )
35
36
37 assert len(argv) == 2
38
39 with tracer.start_as_current_span("client"):
40
41 with tracer.start_as_current_span("client-server"):
42 headers = {}
43 propagators.inject(dict.__setitem__, headers)
44 requested = get(
45 "http://localhost:8082/server_request",
46 params={"param": argv[1]},
47 headers=headers,
48 )
49
50 assert requested.status_code == 200
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/docs/examples/auto-instrumentation/client.py b/docs/examples/auto-instrumentation/client.py
--- a/docs/examples/auto-instrumentation/client.py
+++ b/docs/examples/auto-instrumentation/client.py
@@ -14,7 +14,6 @@
from sys import argv
-from flask import Flask
from requests import get
from opentelemetry import propagators, trace
@@ -24,8 +23,6 @@
SimpleExportSpanProcessor,
)
-app = Flask(__name__)
-
trace.set_tracer_provider(TracerProvider())
tracer = trace.get_tracer_provider().get_tracer(__name__)
|
{"golden_diff": "diff --git a/docs/examples/auto-instrumentation/client.py b/docs/examples/auto-instrumentation/client.py\n--- a/docs/examples/auto-instrumentation/client.py\n+++ b/docs/examples/auto-instrumentation/client.py\n@@ -14,7 +14,6 @@\n \n from sys import argv\n \n-from flask import Flask\n from requests import get\n \n from opentelemetry import propagators, trace\n@@ -24,8 +23,6 @@\n SimpleExportSpanProcessor,\n )\n \n-app = Flask(__name__)\n-\n trace.set_tracer_provider(TracerProvider())\n tracer = trace.get_tracer_provider().get_tracer(__name__)\n", "issue": "Remove unnecessary Flask app\nThe current auto instrumentation example includes an unnecessary Flask app in its client, remove that.\n", "before_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom sys import argv\n\nfrom flask import Flask\nfrom requests import get\n\nfrom opentelemetry import propagators, trace\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import (\n ConsoleSpanExporter,\n SimpleExportSpanProcessor,\n)\n\napp = Flask(__name__)\n\ntrace.set_tracer_provider(TracerProvider())\ntracer = trace.get_tracer_provider().get_tracer(__name__)\n\ntrace.get_tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n)\n\n\nassert len(argv) == 2\n\nwith tracer.start_as_current_span(\"client\"):\n\n with tracer.start_as_current_span(\"client-server\"):\n headers = {}\n propagators.inject(dict.__setitem__, headers)\n requested = get(\n \"http://localhost:8082/server_request\",\n params={\"param\": argv[1]},\n headers=headers,\n )\n\n assert requested.status_code == 200\n", "path": "docs/examples/auto-instrumentation/client.py"}], "after_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom sys import argv\n\nfrom requests import get\n\nfrom opentelemetry import propagators, trace\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import (\n ConsoleSpanExporter,\n SimpleExportSpanProcessor,\n)\n\ntrace.set_tracer_provider(TracerProvider())\ntracer = trace.get_tracer_provider().get_tracer(__name__)\n\ntrace.get_tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n)\n\n\nassert len(argv) == 2\n\nwith tracer.start_as_current_span(\"client\"):\n\n with tracer.start_as_current_span(\"client-server\"):\n headers = {}\n propagators.inject(dict.__setitem__, 
headers)\n requested = get(\n \"http://localhost:8082/server_request\",\n params={\"param\": argv[1]},\n headers=headers,\n )\n\n assert requested.status_code == 200\n", "path": "docs/examples/auto-instrumentation/client.py"}]}
| 710 | 133 |
gh_patches_debug_22214
|
rasdani/github-patches
|
git_diff
|
vega__altair-167
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
numpy scalar in Dataframe not JSON serializable
Is it possible to allow numpy scalars in a dataframe?
The following code does not work, because an np.array is not JSON serializable:
```
df_numpy = pd.DataFrame([dict(
x=np.array(1),
y=np.array(2),
)])
Chart(df_numpy).encode(x='x', y='y')
# TypeError: array(1) is not JSON serializable
```
Full Example: [https://github.com/boeddeker/ipynb/blob/master/bug_altair_numpy.ipynb](https://github.com/boeddeker/ipynb/blob/master/bug_altair_numpy.ipynb)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `altair/utils/core.py`
Content:
```
1 """
2 Utility routines
3 """
4 import re
5 import warnings
6
7 import pandas as pd
8 import numpy as np
9
10
11 TYPECODE_MAP = {'ordinal': 'O',
12 'nominal': 'N',
13 'quantitative': 'Q',
14 'temporal': 'T'}
15
16 INV_TYPECODE_MAP = {v: k for k, v in TYPECODE_MAP.items()}
17
18 TYPE_ABBR = TYPECODE_MAP.values()
19
20
21 def parse_shorthand(shorthand):
22 """
23 Parse the shorthand expression for aggregation, field, and type.
24
25 These are of the form:
26
27 - "col_name"
28 - "col_name:O"
29 - "average(col_name)"
30 - "average(col_name):O"
31
32 Parameters
33 ----------
34 shorthand: str
35 Shorthand string
36
37 Returns
38 -------
39 D : dict
40 Dictionary containing the field, aggregate, and typecode
41 """
42 if not shorthand:
43 return {}
44
45 # Must import this here to avoid circular imports
46 from ..schema import AggregateOp
47 valid_aggregates = AggregateOp().values
48 valid_typecodes = list(TYPECODE_MAP) + list(INV_TYPECODE_MAP)
49
50 # build regular expressions
51 units = dict(field='(?P<field>.*)',
52 type='(?P<type>{0})'.format('|'.join(valid_typecodes)),
53 aggregate='(?P<aggregate>{0})'.format('|'.join(valid_aggregates)))
54 patterns = [r'{field}',
55 r'{field}:{type}',
56 r'{aggregate}\({field}\)',
57 r'{aggregate}\({field}\):{type}']
58 regexps = (re.compile('\A' + p.format(**units) + '\Z', re.DOTALL)
59 for p in patterns[::-1])
60
61 # find matches depending on valid fields passed
62 match = next(exp.match(shorthand).groupdict() for exp in regexps
63 if exp.match(shorthand))
64
65 # Use short form of the type expression
66 typ = match.get('type', None)
67 if typ:
68 match['type'] = INV_TYPECODE_MAP.get(typ, typ)
69 return match
70
71
72 def construct_shorthand(field=None, aggregate=None, type=None):
73 """Construct a shorthand representation.
74
75 See also: parse_shorthand"""
76 if field is None:
77 return ''
78
79 sh = field
80
81 if aggregate is not None:
82 sh = '{0}({1})'.format(aggregate, sh)
83
84 if type is not None:
85 type = TYPECODE_MAP.get(type, type)
86 if type not in TYPE_ABBR:
87 raise ValueError('Unrecognized Type: {0}'.format(type))
88 sh = '{0}:{1}'.format(sh, type)
89
90 return sh
91
92
93 def infer_vegalite_type(data, field=None):
94 """
95 From an array-like input, infer the correct vega typecode
96 ('ordinal', 'nominal', 'quantitative', or 'temporal')
97
98 Parameters
99 ----------
100 data: Numpy array or Pandas Series
101 field: str column name
102 """
103 # See if we can read the type from the field
104 if field is not None:
105 parsed = parse_shorthand(field)
106 if parsed.get('type'):
107 return parsed['type']
108
109 # Otherwise, infer based on the dtype of the input
110 typ = pd.lib.infer_dtype(data)
111
112 # TODO: Once this returns 'O', please update test_select_x and test_select_y in test_api.py
113
114 if typ in ['floating', 'mixed-integer-float', 'integer',
115 'mixed-integer', 'complex']:
116 return 'quantitative'
117 elif typ in ['string', 'bytes', 'categorical', 'boolean', 'mixed', 'unicode']:
118 return 'nominal'
119 elif typ in ['datetime', 'datetime64', 'timedelta',
120 'timedelta64', 'date', 'time', 'period']:
121 return 'temporal'
122 else:
123 warnings.warn("I don't know how to infer vegalite type from '{0}'. "
124 "Defaulting to nominal.".format(typ))
125 return 'nominal'
126
127
128 def sanitize_dataframe(df):
129 """Sanitize a DataFrame to prepare it for serialization.
130
131 * Make a copy
132 * Raise ValueError if it has a hierarchical index.
133 * Convert categoricals to strings.
134 * Convert np.int dtypes to Python int objects
135 * Convert floats to objects and replace NaNs by None.
136 * Convert DateTime dtypes into appropriate string representations
137 """
138 df = df.copy()
139
140 if isinstance(df.index, pd.core.index.MultiIndex):
141 raise ValueError('Hierarchical indices not supported')
142 if isinstance(df.columns, pd.core.index.MultiIndex):
143 raise ValueError('Hierarchical indices not supported')
144
145 for col_name, dtype in df.dtypes.iteritems():
146 if str(dtype) == 'category':
147 # XXXX: work around bug in to_json for categorical types
148 # https://github.com/pydata/pandas/issues/10778
149 df[col_name] = df[col_name].astype(str)
150 elif np.issubdtype(dtype, np.integer):
151 # convert integers to objects; np.int is not JSON serializable
152 df[col_name] = df[col_name].astype(object)
153 elif np.issubdtype(dtype, np.floating):
154 # For floats, convert nan->None: np.float is not JSON serializable
155 col = df[col_name].astype(object)
156 df[col_name] = col.where(col.notnull(), None)
157 elif str(dtype).startswith('datetime'):
158 # Convert datetimes to strings
159 # astype(str) will choose the appropriate resolution
160 df[col_name] = df[col_name].astype(str).replace('NaT', '')
161 return df
162
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/altair/utils/core.py b/altair/utils/core.py
--- a/altair/utils/core.py
+++ b/altair/utils/core.py
@@ -142,6 +142,12 @@
if isinstance(df.columns, pd.core.index.MultiIndex):
raise ValueError('Hierarchical indices not supported')
+ def to_list_if_array(val):
+ if isinstance(val, np.ndarray):
+ return val.tolist()
+ else:
+ return val
+
for col_name, dtype in df.dtypes.iteritems():
if str(dtype) == 'category':
# XXXX: work around bug in to_json for categorical types
@@ -158,4 +164,9 @@
# Convert datetimes to strings
# astype(str) will choose the appropriate resolution
df[col_name] = df[col_name].astype(str).replace('NaT', '')
+ elif dtype == object:
+ # Convert numpy arrays saved as objects to lists
+ # Arrays are not JSON serializable
+ col = df[col_name].apply(to_list_if_array, convert_dtype=False)
+ df[col_name] = col.where(col.notnull(), None)
return df
|
{"golden_diff": "diff --git a/altair/utils/core.py b/altair/utils/core.py\n--- a/altair/utils/core.py\n+++ b/altair/utils/core.py\n@@ -142,6 +142,12 @@\n if isinstance(df.columns, pd.core.index.MultiIndex):\n raise ValueError('Hierarchical indices not supported')\n \n+ def to_list_if_array(val):\n+ if isinstance(val, np.ndarray):\n+ return val.tolist()\n+ else:\n+ return val\n+\n for col_name, dtype in df.dtypes.iteritems():\n if str(dtype) == 'category':\n # XXXX: work around bug in to_json for categorical types\n@@ -158,4 +164,9 @@\n # Convert datetimes to strings\n # astype(str) will choose the appropriate resolution\n df[col_name] = df[col_name].astype(str).replace('NaT', '')\n+ elif dtype == object:\n+ # Convert numpy arrays saved as objects to lists\n+ # Arrays are not JSON serializable\n+ col = df[col_name].apply(to_list_if_array, convert_dtype=False)\n+ df[col_name] = col.where(col.notnull(), None)\n return df\n", "issue": "numpy scalar in Dataframe not JSON serializable\nIs it possible to allow numpy scalars in a dataframe?\n\nThe following code is not working, because a np.array is not json serializable:\n\n```\ndf_numpy = pd.DataFrame([dict(\n x=np.array(1),\n y=np.array(2),\n )])\nChart(df_numpy).encode(x='x', y='y')\n# TypeError: array(1) is not JSON serializable\n```\n\nFull Example: [\nhttps://github.com/boeddeker/ipynb/blob/master/bug_altair_numpy.ipynb](https://github.com/boeddeker/ipynb/blob/master/bug_altair_numpy.ipynb)\n\n", "before_files": [{"content": "\"\"\"\nUtility routines\n\"\"\"\nimport re\nimport warnings\n\nimport pandas as pd\nimport numpy as np\n\n\nTYPECODE_MAP = {'ordinal': 'O',\n 'nominal': 'N',\n 'quantitative': 'Q',\n 'temporal': 'T'}\n\nINV_TYPECODE_MAP = {v: k for k, v in TYPECODE_MAP.items()}\n\nTYPE_ABBR = TYPECODE_MAP.values()\n\n\ndef parse_shorthand(shorthand):\n \"\"\"\n Parse the shorthand expression for aggregation, field, and type.\n\n These are of the form:\n\n - \"col_name\"\n - \"col_name:O\"\n - \"average(col_name)\"\n - \"average(col_name):O\"\n\n Parameters\n ----------\n shorthand: str\n Shorthand string\n\n Returns\n -------\n D : dict\n Dictionary containing the field, aggregate, and typecode\n \"\"\"\n if not shorthand:\n return {}\n\n # Must import this here to avoid circular imports\n from ..schema import AggregateOp\n valid_aggregates = AggregateOp().values\n valid_typecodes = list(TYPECODE_MAP) + list(INV_TYPECODE_MAP)\n\n # build regular expressions\n units = dict(field='(?P<field>.*)',\n type='(?P<type>{0})'.format('|'.join(valid_typecodes)),\n aggregate='(?P<aggregate>{0})'.format('|'.join(valid_aggregates)))\n patterns = [r'{field}',\n r'{field}:{type}',\n r'{aggregate}\\({field}\\)',\n r'{aggregate}\\({field}\\):{type}']\n regexps = (re.compile('\\A' + p.format(**units) + '\\Z', re.DOTALL)\n for p in patterns[::-1])\n\n # find matches depending on valid fields passed\n match = next(exp.match(shorthand).groupdict() for exp in regexps\n if exp.match(shorthand))\n\n # Use short form of the type expression\n typ = match.get('type', None)\n if typ:\n match['type'] = INV_TYPECODE_MAP.get(typ, typ)\n return match\n\n\ndef construct_shorthand(field=None, aggregate=None, type=None):\n \"\"\"Construct a shorthand representation.\n\n See also: parse_shorthand\"\"\"\n if field is None:\n return ''\n\n sh = field\n\n if aggregate is not None:\n sh = '{0}({1})'.format(aggregate, sh)\n\n if type is not None:\n type = TYPECODE_MAP.get(type, type)\n if type not in TYPE_ABBR:\n raise ValueError('Unrecognized Type: {0}'.format(type))\n sh = 
'{0}:{1}'.format(sh, type)\n\n return sh\n\n\ndef infer_vegalite_type(data, field=None):\n \"\"\"\n From an array-like input, infer the correct vega typecode\n ('ordinal', 'nominal', 'quantitative', or 'temporal')\n\n Parameters\n ----------\n data: Numpy array or Pandas Series\n field: str column name\n \"\"\"\n # See if we can read the type from the field\n if field is not None:\n parsed = parse_shorthand(field)\n if parsed.get('type'):\n return parsed['type']\n\n # Otherwise, infer based on the dtype of the input\n typ = pd.lib.infer_dtype(data)\n\n # TODO: Once this returns 'O', please update test_select_x and test_select_y in test_api.py\n\n if typ in ['floating', 'mixed-integer-float', 'integer',\n 'mixed-integer', 'complex']:\n return 'quantitative'\n elif typ in ['string', 'bytes', 'categorical', 'boolean', 'mixed', 'unicode']:\n return 'nominal'\n elif typ in ['datetime', 'datetime64', 'timedelta',\n 'timedelta64', 'date', 'time', 'period']:\n return 'temporal'\n else:\n warnings.warn(\"I don't know how to infer vegalite type from '{0}'. \"\n \"Defaulting to nominal.\".format(typ))\n return 'nominal'\n\n\ndef sanitize_dataframe(df):\n \"\"\"Sanitize a DataFrame to prepare it for serialization.\n\n * Make a copy\n * Raise ValueError if it has a hierarchical index.\n * Convert categoricals to strings.\n * Convert np.int dtypes to Python int objects\n * Convert floats to objects and replace NaNs by None.\n * Convert DateTime dtypes into appropriate string representations\n \"\"\"\n df = df.copy()\n\n if isinstance(df.index, pd.core.index.MultiIndex):\n raise ValueError('Hierarchical indices not supported')\n if isinstance(df.columns, pd.core.index.MultiIndex):\n raise ValueError('Hierarchical indices not supported')\n\n for col_name, dtype in df.dtypes.iteritems():\n if str(dtype) == 'category':\n # XXXX: work around bug in to_json for categorical types\n # https://github.com/pydata/pandas/issues/10778\n df[col_name] = df[col_name].astype(str)\n elif np.issubdtype(dtype, np.integer):\n # convert integers to objects; np.int is not JSON serializable\n df[col_name] = df[col_name].astype(object)\n elif np.issubdtype(dtype, np.floating):\n # For floats, convert nan->None: np.float is not JSON serializable\n col = df[col_name].astype(object)\n df[col_name] = col.where(col.notnull(), None)\n elif str(dtype).startswith('datetime'):\n # Convert datetimes to strings\n # astype(str) will choose the appropriate resolution\n df[col_name] = df[col_name].astype(str).replace('NaT', '')\n return df\n", "path": "altair/utils/core.py"}], "after_files": [{"content": "\"\"\"\nUtility routines\n\"\"\"\nimport re\nimport warnings\n\nimport pandas as pd\nimport numpy as np\n\n\nTYPECODE_MAP = {'ordinal': 'O',\n 'nominal': 'N',\n 'quantitative': 'Q',\n 'temporal': 'T'}\n\nINV_TYPECODE_MAP = {v: k for k, v in TYPECODE_MAP.items()}\n\nTYPE_ABBR = TYPECODE_MAP.values()\n\n\ndef parse_shorthand(shorthand):\n \"\"\"\n Parse the shorthand expression for aggregation, field, and type.\n\n These are of the form:\n\n - \"col_name\"\n - \"col_name:O\"\n - \"average(col_name)\"\n - \"average(col_name):O\"\n\n Parameters\n ----------\n shorthand: str\n Shorthand string\n\n Returns\n -------\n D : dict\n Dictionary containing the field, aggregate, and typecode\n \"\"\"\n if not shorthand:\n return {}\n\n # Must import this here to avoid circular imports\n from ..schema import AggregateOp\n valid_aggregates = AggregateOp().values\n valid_typecodes = list(TYPECODE_MAP) + list(INV_TYPECODE_MAP)\n\n # build regular 
expressions\n units = dict(field='(?P<field>.*)',\n type='(?P<type>{0})'.format('|'.join(valid_typecodes)),\n aggregate='(?P<aggregate>{0})'.format('|'.join(valid_aggregates)))\n patterns = [r'{field}',\n r'{field}:{type}',\n r'{aggregate}\\({field}\\)',\n r'{aggregate}\\({field}\\):{type}']\n regexps = (re.compile('\\A' + p.format(**units) + '\\Z', re.DOTALL)\n for p in patterns[::-1])\n\n # find matches depending on valid fields passed\n match = next(exp.match(shorthand).groupdict() for exp in regexps\n if exp.match(shorthand))\n\n # Use short form of the type expression\n typ = match.get('type', None)\n if typ:\n match['type'] = INV_TYPECODE_MAP.get(typ, typ)\n return match\n\n\ndef construct_shorthand(field=None, aggregate=None, type=None):\n \"\"\"Construct a shorthand representation.\n\n See also: parse_shorthand\"\"\"\n if field is None:\n return ''\n\n sh = field\n\n if aggregate is not None:\n sh = '{0}({1})'.format(aggregate, sh)\n\n if type is not None:\n type = TYPECODE_MAP.get(type, type)\n if type not in TYPE_ABBR:\n raise ValueError('Unrecognized Type: {0}'.format(type))\n sh = '{0}:{1}'.format(sh, type)\n\n return sh\n\n\ndef infer_vegalite_type(data, field=None):\n \"\"\"\n From an array-like input, infer the correct vega typecode\n ('ordinal', 'nominal', 'quantitative', or 'temporal')\n\n Parameters\n ----------\n data: Numpy array or Pandas Series\n field: str column name\n \"\"\"\n # See if we can read the type from the field\n if field is not None:\n parsed = parse_shorthand(field)\n if parsed.get('type'):\n return parsed['type']\n\n # Otherwise, infer based on the dtype of the input\n typ = pd.lib.infer_dtype(data)\n\n # TODO: Once this returns 'O', please update test_select_x and test_select_y in test_api.py\n\n if typ in ['floating', 'mixed-integer-float', 'integer',\n 'mixed-integer', 'complex']:\n return 'quantitative'\n elif typ in ['string', 'bytes', 'categorical', 'boolean', 'mixed', 'unicode']:\n return 'nominal'\n elif typ in ['datetime', 'datetime64', 'timedelta',\n 'timedelta64', 'date', 'time', 'period']:\n return 'temporal'\n else:\n warnings.warn(\"I don't know how to infer vegalite type from '{0}'. 
\"\n \"Defaulting to nominal.\".format(typ))\n return 'nominal'\n\n\ndef sanitize_dataframe(df):\n \"\"\"Sanitize a DataFrame to prepare it for serialization.\n\n * Make a copy\n * Raise ValueError if it has a hierarchical index.\n * Convert categoricals to strings.\n * Convert np.int dtypes to Python int objects\n * Convert floats to objects and replace NaNs by None.\n * Convert DateTime dtypes into appropriate string representations\n \"\"\"\n df = df.copy()\n\n if isinstance(df.index, pd.core.index.MultiIndex):\n raise ValueError('Hierarchical indices not supported')\n if isinstance(df.columns, pd.core.index.MultiIndex):\n raise ValueError('Hierarchical indices not supported')\n\n def to_list_if_array(val):\n if isinstance(val, np.ndarray):\n return val.tolist()\n else:\n return val\n\n for col_name, dtype in df.dtypes.iteritems():\n if str(dtype) == 'category':\n # XXXX: work around bug in to_json for categorical types\n # https://github.com/pydata/pandas/issues/10778\n df[col_name] = df[col_name].astype(str)\n elif np.issubdtype(dtype, np.integer):\n # convert integers to objects; np.int is not JSON serializable\n df[col_name] = df[col_name].astype(object)\n elif np.issubdtype(dtype, np.floating):\n # For floats, convert nan->None: np.float is not JSON serializable\n col = df[col_name].astype(object)\n df[col_name] = col.where(col.notnull(), None)\n elif str(dtype).startswith('datetime'):\n # Convert datetimes to strings\n # astype(str) will choose the appropriate resolution\n df[col_name] = df[col_name].astype(str).replace('NaT', '')\n elif dtype == object:\n # Convert numpy arrays saved as objects to lists\n # Arrays are not JSON serializable\n col = df[col_name].apply(to_list_if_array, convert_dtype=False)\n df[col_name] = col.where(col.notnull(), None)\n return df\n", "path": "altair/utils/core.py"}]}
| 2,023 | 263 |
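For illustration, the conversion pattern applied in the patch above — turning `np.ndarray` values held in object columns into plain Python lists before serialization — can be reproduced with nothing beyond NumPy and the standard library. The snippet below is a minimal sketch of that idea, not code taken from the Altair repository.

```python
import json
import numpy as np

def to_list_if_array(val):
    # numpy arrays (including 0-d "scalar" arrays) are not JSON serializable;
    # .tolist() converts them to plain Python ints/floats/lists.
    if isinstance(val, np.ndarray):
        return val.tolist()
    return val

row = {"x": np.array(1), "y": np.array([2, 3])}

# json.dumps(row) would raise TypeError because of the ndarray values.
clean = {key: to_list_if_array(value) for key, value in row.items()}
print(json.dumps(clean))  # {"x": 1, "y": [2, 3]}
```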
gh_patches_debug_21070
|
rasdani/github-patches
|
git_diff
|
Mailu__Mailu-2808
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SCAN_MACROS is incorrect case in Rspamd configs
In Mailu/core/rspamd/conf/external_services.conf, the first line is '{% if SCAN_MACROS == 'True' %}'. The same appears in external_services_group.conf; note the capital 'T' in 'True'. When mailu.env is generated, it sets SCAN_MACROS=true, i.e. all lowercase. Thus, in order to enable oletools, one must put SCAN_MACROS=True in the environment file. I'm not sure what other items rely on SCAN_MACROS, but they should be checked as well.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/rspamd/start.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import os
4 import glob
5 import logging as log
6 import requests
7 import shutil
8 import sys
9 import time
10 from socrate import system,conf
11
12 system.set_env()
13
14 # Actual startup script
15
16 config_files = []
17 for rspamd_file in glob.glob("/conf/*"):
18 conf.jinja(rspamd_file, os.environ, os.path.join("/etc/rspamd/local.d", os.path.basename(rspamd_file)))
19 config_files.append(os.path.basename(rspamd_file))
20
21 for override_file in glob.glob("/overrides/*"):
22 if os.path.basename(override_file) not in config_files:
23 shutil.copyfile(override_file, os.path.join("/etc/rspamd/local.d", os.path.basename(override_file)))
24
25 # Admin may not be up just yet
26 healthcheck = f'http://{os.environ["ADMIN_ADDRESS"]}/internal/rspamd/local_domains'
27 while True:
28 time.sleep(1)
29 try:
30 if requests.get(healthcheck,timeout=2).ok:
31 break
32 except:
33 pass
34 log.warning("Admin is not up just yet, retrying in 1 second")
35
36 # Run rspamd
37 os.system("mkdir -m 755 -p /run/rspamd")
38 os.system("chown rspamd:rspamd /run/rspamd")
39 os.system("find /var/lib/rspamd | grep -v /filter | xargs -n1 chown rspamd:rspamd")
40 os.execv("/usr/sbin/rspamd", ["rspamd", "-f", "-u", "rspamd", "-g", "rspamd"])
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/core/rspamd/start.py b/core/rspamd/start.py
--- a/core/rspamd/start.py
+++ b/core/rspamd/start.py
@@ -9,13 +9,13 @@
import time
from socrate import system,conf
-system.set_env()
+env = system.set_env()
# Actual startup script
config_files = []
for rspamd_file in glob.glob("/conf/*"):
- conf.jinja(rspamd_file, os.environ, os.path.join("/etc/rspamd/local.d", os.path.basename(rspamd_file)))
+ conf.jinja(rspamd_file, env, os.path.join("/etc/rspamd/local.d", os.path.basename(rspamd_file)))
config_files.append(os.path.basename(rspamd_file))
for override_file in glob.glob("/overrides/*"):
@@ -23,7 +23,7 @@
shutil.copyfile(override_file, os.path.join("/etc/rspamd/local.d", os.path.basename(override_file)))
# Admin may not be up just yet
-healthcheck = f'http://{os.environ["ADMIN_ADDRESS"]}/internal/rspamd/local_domains'
+healthcheck = f'http://{env["ADMIN_ADDRESS"]}/internal/rspamd/local_domains'
while True:
time.sleep(1)
try:
|
{"golden_diff": "diff --git a/core/rspamd/start.py b/core/rspamd/start.py\n--- a/core/rspamd/start.py\n+++ b/core/rspamd/start.py\n@@ -9,13 +9,13 @@\n import time\n from socrate import system,conf\n \n-system.set_env()\n+env = system.set_env()\n \n # Actual startup script\n \n config_files = []\n for rspamd_file in glob.glob(\"/conf/*\"):\n- conf.jinja(rspamd_file, os.environ, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(rspamd_file)))\n+ conf.jinja(rspamd_file, env, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(rspamd_file)))\n config_files.append(os.path.basename(rspamd_file))\n \n for override_file in glob.glob(\"/overrides/*\"):\n@@ -23,7 +23,7 @@\n shutil.copyfile(override_file, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(override_file)))\n \n # Admin may not be up just yet\n-healthcheck = f'http://{os.environ[\"ADMIN_ADDRESS\"]}/internal/rspamd/local_domains'\n+healthcheck = f'http://{env[\"ADMIN_ADDRESS\"]}/internal/rspamd/local_domains'\n while True:\n time.sleep(1)\n try:\n", "issue": "SCAN_MACROS is incorrect case in Rspamd configs\nIn Mailu/core/rspamd/conf/external_services.conf, the first line is '{% if SCAN_MACROS == 'True' %}'. It is also the same in external_services_group.conf, note the capital 'T' in 'True'. When the mailu.env is generated, it generates SCAN_MACROS=true, i.e . all lowercase. Thus, in order to enable oletools, one must put SCAN_MACROS=True in environment file. I'm not sure what other items are reliant on SCAN_MACROS, but they should be checked as well. \n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport os\nimport glob\nimport logging as log\nimport requests\nimport shutil\nimport sys\nimport time\nfrom socrate import system,conf\n\nsystem.set_env()\n\n# Actual startup script\n\nconfig_files = []\nfor rspamd_file in glob.glob(\"/conf/*\"):\n conf.jinja(rspamd_file, os.environ, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(rspamd_file)))\n config_files.append(os.path.basename(rspamd_file))\n\nfor override_file in glob.glob(\"/overrides/*\"):\n if os.path.basename(override_file) not in config_files:\n shutil.copyfile(override_file, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(override_file)))\n\n# Admin may not be up just yet\nhealthcheck = f'http://{os.environ[\"ADMIN_ADDRESS\"]}/internal/rspamd/local_domains'\nwhile True:\n time.sleep(1)\n try:\n if requests.get(healthcheck,timeout=2).ok:\n break\n except:\n pass\n log.warning(\"Admin is not up just yet, retrying in 1 second\")\n\n# Run rspamd\nos.system(\"mkdir -m 755 -p /run/rspamd\")\nos.system(\"chown rspamd:rspamd /run/rspamd\")\nos.system(\"find /var/lib/rspamd | grep -v /filter | xargs -n1 chown rspamd:rspamd\")\nos.execv(\"/usr/sbin/rspamd\", [\"rspamd\", \"-f\", \"-u\", \"rspamd\", \"-g\", \"rspamd\"])\n", "path": "core/rspamd/start.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport os\nimport glob\nimport logging as log\nimport requests\nimport shutil\nimport sys\nimport time\nfrom socrate import system,conf\n\nenv = system.set_env()\n\n# Actual startup script\n\nconfig_files = []\nfor rspamd_file in glob.glob(\"/conf/*\"):\n conf.jinja(rspamd_file, env, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(rspamd_file)))\n config_files.append(os.path.basename(rspamd_file))\n\nfor override_file in glob.glob(\"/overrides/*\"):\n if os.path.basename(override_file) not in config_files:\n shutil.copyfile(override_file, os.path.join(\"/etc/rspamd/local.d\", os.path.basename(override_file)))\n\n# Admin may not be up 
just yet\nhealthcheck = f'http://{env[\"ADMIN_ADDRESS\"]}/internal/rspamd/local_domains'\nwhile True:\n time.sleep(1)\n try:\n if requests.get(healthcheck,timeout=2).ok:\n break\n except:\n pass\n log.warning(\"Admin is not up just yet, retrying in 1 second\")\n\n# Run rspamd\nos.system(\"mkdir -m 755 -p /run/rspamd\")\nos.system(\"chown rspamd:rspamd /run/rspamd\")\nos.system(\"find /var/lib/rspamd | grep -v /filter | xargs -n1 chown rspamd:rspamd\")\nos.execv(\"/usr/sbin/rspamd\", [\"rspamd\", \"-f\", \"-u\", \"rspamd\", \"-g\", \"rspamd\"])\n", "path": "core/rspamd/start.py"}]}
| 805 | 278 |
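The root cause in the record above is a case-sensitive string comparison: the Jinja condition tests for the literal 'True' while the generated mailu.env supplies 'true'. Below is a minimal, library-agnostic sketch of normalizing boolean-like environment values before rendering templates; the helper name and behaviour are assumptions for illustration and do not reflect what Mailu's `system.set_env()` actually does.

```python
import os

TRUTHY = {"1", "true", "yes", "on"}
FALSY = {"0", "false", "no", "off"}

def normalized_env() -> dict:
    # Copy the process environment and canonicalize boolean-like values so a
    # template comparing against the literal string "True" matches regardless
    # of how the value was spelled in the env file.
    env = dict(os.environ)
    for key, value in env.items():
        if value.lower() in TRUTHY:
            env[key] = "True"
        elif value.lower() in FALSY:
            env[key] = "False"
    return env

os.environ["SCAN_MACROS"] = "true"   # what the generated mailu.env contains
env = normalized_env()
print(env["SCAN_MACROS"] == "True")  # True, so the Jinja condition now matches
```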
gh_patches_debug_9817
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-211
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add new check: IAM Policies should not be attached to a user
Attach policies only to groups or roles
https://www.terraform.io/docs/providers/aws/r/iam_user_policy.html
https://www.terraform.io/docs/providers/aws/r/iam_user_policy_attachment.html
users: https://www.terraform.io/docs/providers/aws/r/iam_policy_attachment.html
Policies / ManagedPolicyArns: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iam-user.html
Users: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-iam-policy.html
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py`
Content:
```
1 from checkov.common.models.enums import CheckResult, CheckCategories
2 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
3
4
5 class IAMPolicyAttachedToGroupOrRoles(BaseResourceCheck):
6 def __init__(self):
7 name = "Ensure IAM policies are attached only to groups or roles (Reducing access management complexity may " \
8 "in-turn reduce opportunity for a principal to inadvertently receive or retain excessive privileges.) "
9 id = "CKV_AWS_40"
10 supported_resources = ['aws_iam_user_policy_attachment', 'aws_iam_user_policy', 'aws_iam_policy_attachment']
11 categories = [CheckCategories.IAM]
12 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
13
14 def scan_resource_conf(self, conf):
15 if 'user' in conf.keys() or 'users' in conf.keys():
16 return CheckResult.FAILED
17 return CheckResult.PASSED
18
19 check = IAMPolicyAttachedToGroupOrRoles()
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py b/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py
--- a/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py
+++ b/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py
@@ -12,8 +12,9 @@
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
def scan_resource_conf(self, conf):
- if 'user' in conf.keys() or 'users' in conf.keys():
+ if 'user' in conf.keys() or ('users' in conf.keys() and len(conf['users'][0]) > 0):
return CheckResult.FAILED
return CheckResult.PASSED
+
check = IAMPolicyAttachedToGroupOrRoles()
|
{"golden_diff": "diff --git a/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py b/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py\n--- a/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py\n+++ b/checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py\n@@ -12,8 +12,9 @@\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n def scan_resource_conf(self, conf):\n- if 'user' in conf.keys() or 'users' in conf.keys():\n+ if 'user' in conf.keys() or ('users' in conf.keys() and len(conf['users'][0]) > 0):\n return CheckResult.FAILED\n return CheckResult.PASSED\n \n+\n check = IAMPolicyAttachedToGroupOrRoles()\n", "issue": "Add new check: IAM Policies should not be attached to a user\nAttach policies only to groups or roles \r\n\r\nhttps://www.terraform.io/docs/providers/aws/r/iam_user_policy.html\r\nhttps://www.terraform.io/docs/providers/aws/r/iam_user_policy_attachment.html\r\nusers: https://www.terraform.io/docs/providers/aws/r/iam_policy_attachment.html\r\n\r\nPolicies / ManagedPolicyArns: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iam-user.html\r\nUsers: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-iam-policy.html\r\n\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass IAMPolicyAttachedToGroupOrRoles(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure IAM policies are attached only to groups or roles (Reducing access management complexity may \" \\\n \"in-turn reduce opportunity for a principal to inadvertently receive or retain excessive privileges.) \"\n id = \"CKV_AWS_40\"\n supported_resources = ['aws_iam_user_policy_attachment', 'aws_iam_user_policy', 'aws_iam_policy_attachment']\n categories = [CheckCategories.IAM]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if 'user' in conf.keys() or 'users' in conf.keys():\n return CheckResult.FAILED\n return CheckResult.PASSED\n\ncheck = IAMPolicyAttachedToGroupOrRoles()\n", "path": "checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py"}], "after_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass IAMPolicyAttachedToGroupOrRoles(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure IAM policies are attached only to groups or roles (Reducing access management complexity may \" \\\n \"in-turn reduce opportunity for a principal to inadvertently receive or retain excessive privileges.) \"\n id = \"CKV_AWS_40\"\n supported_resources = ['aws_iam_user_policy_attachment', 'aws_iam_user_policy', 'aws_iam_policy_attachment']\n categories = [CheckCategories.IAM]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if 'user' in conf.keys() or ('users' in conf.keys() and len(conf['users'][0]) > 0):\n return CheckResult.FAILED\n return CheckResult.PASSED\n\n\ncheck = IAMPolicyAttachedToGroupOrRoles()\n", "path": "checkov/terraform/checks/resource/aws/IAMPolicyAttachedToGroupOrRoles.py"}]}
| 640 | 198 |
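The one-line change in the patch above matters because checkov hands each resource's arguments to the check as lists of parsed values, so an `aws_iam_policy_attachment` with no users can still carry a `users` key holding an empty inner list. The standalone function below imitates that check logic for illustration; it is a simplification, not the real checkov API.

```python
def attaches_policy_to_user(conf: dict) -> bool:
    # `conf` mimics checkov's parsed resource configuration, where every
    # argument maps to a list of values.
    if "user" in conf:
        return True
    users = conf.get("users")
    # An empty inner list means the attachment targets only groups/roles.
    return bool(users) and len(users[0]) > 0

passing = {"roles": [["${aws_iam_role.role.name}"]], "users": [[]]}
failing = {"users": [["${aws_iam_user.user.name}"]]}

print(attaches_policy_to_user(passing))  # False -> check passes
print(attaches_policy_to_user(failing))  # True  -> check fails
```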
gh_patches_debug_24624
|
rasdani/github-patches
|
git_diff
|
ivy-llc__ivy-17570
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
multinomial
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/backends/paddle/random.py`
Content:
```
1 """Collection of Paddle random functions, wrapped to fit Ivy syntax and signature."""
2
3 # global
4 import paddle
5 import ivy.functional.backends.paddle as paddle_backend
6 from typing import Optional, Union, Sequence
7
8 # local
9 import ivy
10 from paddle.fluid.libpaddle import Place
11 from ivy.utils.exceptions import IvyNotImplementedException
12 from ivy.functional.backends.paddle.device import to_device
13 from ivy.functional.ivy.random import (
14 _check_bounds_and_get_shape,
15 _randint_check_dtype_and_bound,
16 _check_valid_scale,
17 )
18 from ivy.func_wrapper import with_unsupported_device_and_dtypes
19 from . import backend_version
20
21 # Extra #
22 # ------#
23
24
25 @with_unsupported_device_and_dtypes(
26 {"2.5.0 and below": {"cpu": ("int8",)}},
27 backend_version,
28 )
29 def random_uniform(
30 *,
31 low: Union[float, paddle.Tensor] = 0.0,
32 high: Union[float, paddle.Tensor] = 1.0,
33 shape: Optional[Union[paddle.Tensor, ivy.NativeShape, Sequence[int]]] = None,
34 dtype: paddle.dtype,
35 device: Place,
36 seed=None,
37 out: Optional[paddle.Tensor] = None,
38 ) -> paddle.Tensor:
39 if not dtype:
40 dtype = ivy.default_int_dtype()
41 dtype = ivy.as_native_dtype(dtype)
42 low = paddle.cast(low, "float32") if isinstance(low, paddle.Tensor) else low
43 high = paddle.cast(high, "float32") if isinstance(high, paddle.Tensor) else high
44 shape = _check_bounds_and_get_shape(low, high, shape).shape
45 # Set range and seed
46 rng = high - low
47 if seed:
48 _ = paddle.seed(seed)
49 random_base = paddle.uniform(shape, min=0.0, max=1.0)
50
51 return paddle_backend.add(paddle_backend.multiply(random_base, rng), low).cast(
52 dtype
53 )
54
55
56 @with_unsupported_device_and_dtypes(
57 {"2.5.0 and below": {"cpu": ("complex64", "complex128")}},
58 backend_version,
59 )
60 def random_normal(
61 *,
62 mean: Union[float, paddle.Tensor] = 0.0,
63 std: Union[float, paddle.Tensor] = 1.0,
64 shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,
65 dtype: paddle.dtype,
66 seed: Optional[int] = None,
67 device: Place,
68 out: Optional[paddle.Tensor] = None,
69 ) -> paddle.Tensor:
70 _check_valid_scale(std)
71 shape = _check_bounds_and_get_shape(mean, std, shape).shape
72 if seed:
73 paddle.seed(seed)
74 if isinstance(mean, (int, float)) and isinstance(std, (int, float)):
75 return paddle.normal(mean, std, shape).cast(dtype)
76 if mean.dtype not in [paddle.float32, paddle.float64]:
77 mean = mean.cast("float32")
78 std = std.cast(mean.dtype)
79 return paddle.normal(mean, std).cast(dtype)
80
81
82 def multinomial(
83 population_size: int,
84 num_samples: int,
85 /,
86 *,
87 batch_size: int = 1,
88 probs: Optional[paddle.Tensor] = None,
89 replace: bool = True,
90 device: Place,
91 seed: Optional[int] = None,
92 out: Optional[paddle.Tensor] = None,
93 ) -> paddle.Tensor:
94 raise IvyNotImplementedException()
95
96
97 @with_unsupported_device_and_dtypes(
98 {"2.5.0 and below": {"cpu": ("int8",)}},
99 backend_version,
100 )
101 def randint(
102 low: Union[int, paddle.Tensor],
103 high: Union[int, paddle.Tensor],
104 /,
105 *,
106 shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,
107 device: Place,
108 dtype: Optional[Union[paddle.dtype, ivy.Dtype]] = None,
109 seed: Optional[int] = None,
110 out: Optional[paddle.Tensor] = None,
111 ) -> paddle.Tensor:
112 if not dtype:
113 dtype = ivy.default_int_dtype()
114 dtype = ivy.as_native_dtype(dtype)
115 _randint_check_dtype_and_bound(low, high, dtype)
116 low = paddle.cast(low, "float32") if isinstance(low, paddle.Tensor) else low
117 high = paddle.cast(high, "float32") if isinstance(high, paddle.Tensor) else high
118 shape = _check_bounds_and_get_shape(low, high, shape).shape
119 range = high - low
120 if seed:
121 _ = paddle.seed(seed)
122 _retval = to_device(
123 paddle.cast(
124 paddle.uniform(shape or [1], min=0.0, max=1.0) * range + low, dtype
125 ),
126 device,
127 )
128 return _retval if shape else _retval.squeeze(axis=0)
129
130
131 def seed(*, seed_value: int = 0) -> None:
132 _ = paddle.seed(seed_value)
133 return
134
135
136 def shuffle(
137 x: paddle.Tensor,
138 axis: Optional[int] = 0,
139 /,
140 *,
141 seed: Optional[int] = None,
142 out: Optional[paddle.Tensor] = None,
143 ) -> paddle.Tensor:
144 if seed:
145 _ = paddle.seed(seed)
146 # Use Paddle's randperm function to generate shuffled indices
147 indices = paddle.randperm(x.shape[0], dtype="int64")
148 if x.dtype in [
149 paddle.int8,
150 paddle.int16,
151 paddle.uint8,
152 paddle.float16,
153 paddle.complex64,
154 paddle.complex128,
155 paddle.bool,
156 ]:
157 if paddle.is_complex(x):
158 shuffled_real = paddle.index_select(x.real(), indices)
159 shuffled_imag = paddle.index_select(x.imag(), indices)
160 return paddle.complex(shuffled_real, shuffled_imag)
161 return paddle.index_select(x.cast("float32"), indices).cast(x.dtype)
162 return paddle.index_select(x, indices)
163
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ivy/functional/backends/paddle/random.py b/ivy/functional/backends/paddle/random.py
--- a/ivy/functional/backends/paddle/random.py
+++ b/ivy/functional/backends/paddle/random.py
@@ -8,7 +8,6 @@
# local
import ivy
from paddle.fluid.libpaddle import Place
-from ivy.utils.exceptions import IvyNotImplementedException
from ivy.functional.backends.paddle.device import to_device
from ivy.functional.ivy.random import (
_check_bounds_and_get_shape,
@@ -79,6 +78,10 @@
return paddle.normal(mean, std).cast(dtype)
+@with_unsupported_device_and_dtypes(
+ {"2.4.2 and below": {"cpu": ("float16",)}},
+ backend_version,
+)
def multinomial(
population_size: int,
num_samples: int,
@@ -91,7 +94,13 @@
seed: Optional[int] = None,
out: Optional[paddle.Tensor] = None,
) -> paddle.Tensor:
- raise IvyNotImplementedException()
+ if probs is None:
+ probs = paddle.ones((batch_size, num_samples)) / population_size
+ probs = paddle.cast(probs, paddle.float32)
+ if seed:
+ paddle.seed(seed)
+ x = paddle.multinomial(probs, num_samples=num_samples, replacement=replace)
+ return x
@with_unsupported_device_and_dtypes(
|
{"golden_diff": "diff --git a/ivy/functional/backends/paddle/random.py b/ivy/functional/backends/paddle/random.py\n--- a/ivy/functional/backends/paddle/random.py\n+++ b/ivy/functional/backends/paddle/random.py\n@@ -8,7 +8,6 @@\n # local\n import ivy\n from paddle.fluid.libpaddle import Place\n-from ivy.utils.exceptions import IvyNotImplementedException\n from ivy.functional.backends.paddle.device import to_device\n from ivy.functional.ivy.random import (\n _check_bounds_and_get_shape,\n@@ -79,6 +78,10 @@\n return paddle.normal(mean, std).cast(dtype)\n \n \n+@with_unsupported_device_and_dtypes(\n+ {\"2.4.2 and below\": {\"cpu\": (\"float16\",)}},\n+ backend_version,\n+)\n def multinomial(\n population_size: int,\n num_samples: int,\n@@ -91,7 +94,13 @@\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n ) -> paddle.Tensor:\n- raise IvyNotImplementedException()\n+ if probs is None:\n+ probs = paddle.ones((batch_size, num_samples)) / population_size\n+ probs = paddle.cast(probs, paddle.float32)\n+ if seed:\n+ paddle.seed(seed)\n+ x = paddle.multinomial(probs, num_samples=num_samples, replacement=replace)\n+ return x\n \n \n @with_unsupported_device_and_dtypes(\n", "issue": "multinomial\n\n", "before_files": [{"content": "\"\"\"Collection of Paddle random functions, wrapped to fit Ivy syntax and signature.\"\"\"\n\n# global\nimport paddle\nimport ivy.functional.backends.paddle as paddle_backend\nfrom typing import Optional, Union, Sequence\n\n# local\nimport ivy\nfrom paddle.fluid.libpaddle import Place\nfrom ivy.utils.exceptions import IvyNotImplementedException\nfrom ivy.functional.backends.paddle.device import to_device\nfrom ivy.functional.ivy.random import (\n _check_bounds_and_get_shape,\n _randint_check_dtype_and_bound,\n _check_valid_scale,\n)\nfrom ivy.func_wrapper import with_unsupported_device_and_dtypes\nfrom . 
import backend_version\n\n# Extra #\n# ------#\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"int8\",)}},\n backend_version,\n)\ndef random_uniform(\n *,\n low: Union[float, paddle.Tensor] = 0.0,\n high: Union[float, paddle.Tensor] = 1.0,\n shape: Optional[Union[paddle.Tensor, ivy.NativeShape, Sequence[int]]] = None,\n dtype: paddle.dtype,\n device: Place,\n seed=None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if not dtype:\n dtype = ivy.default_int_dtype()\n dtype = ivy.as_native_dtype(dtype)\n low = paddle.cast(low, \"float32\") if isinstance(low, paddle.Tensor) else low\n high = paddle.cast(high, \"float32\") if isinstance(high, paddle.Tensor) else high\n shape = _check_bounds_and_get_shape(low, high, shape).shape\n # Set range and seed\n rng = high - low\n if seed:\n _ = paddle.seed(seed)\n random_base = paddle.uniform(shape, min=0.0, max=1.0)\n\n return paddle_backend.add(paddle_backend.multiply(random_base, rng), low).cast(\n dtype\n )\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"complex64\", \"complex128\")}},\n backend_version,\n)\ndef random_normal(\n *,\n mean: Union[float, paddle.Tensor] = 0.0,\n std: Union[float, paddle.Tensor] = 1.0,\n shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,\n dtype: paddle.dtype,\n seed: Optional[int] = None,\n device: Place,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n _check_valid_scale(std)\n shape = _check_bounds_and_get_shape(mean, std, shape).shape\n if seed:\n paddle.seed(seed)\n if isinstance(mean, (int, float)) and isinstance(std, (int, float)):\n return paddle.normal(mean, std, shape).cast(dtype)\n if mean.dtype not in [paddle.float32, paddle.float64]:\n mean = mean.cast(\"float32\")\n std = std.cast(mean.dtype)\n return paddle.normal(mean, std).cast(dtype)\n\n\ndef multinomial(\n population_size: int,\n num_samples: int,\n /,\n *,\n batch_size: int = 1,\n probs: Optional[paddle.Tensor] = None,\n replace: bool = True,\n device: Place,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n raise IvyNotImplementedException()\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"int8\",)}},\n backend_version,\n)\ndef randint(\n low: Union[int, paddle.Tensor],\n high: Union[int, paddle.Tensor],\n /,\n *,\n shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,\n device: Place,\n dtype: Optional[Union[paddle.dtype, ivy.Dtype]] = None,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if not dtype:\n dtype = ivy.default_int_dtype()\n dtype = ivy.as_native_dtype(dtype)\n _randint_check_dtype_and_bound(low, high, dtype)\n low = paddle.cast(low, \"float32\") if isinstance(low, paddle.Tensor) else low\n high = paddle.cast(high, \"float32\") if isinstance(high, paddle.Tensor) else high\n shape = _check_bounds_and_get_shape(low, high, shape).shape\n range = high - low\n if seed:\n _ = paddle.seed(seed)\n _retval = to_device(\n paddle.cast(\n paddle.uniform(shape or [1], min=0.0, max=1.0) * range + low, dtype\n ),\n device,\n )\n return _retval if shape else _retval.squeeze(axis=0)\n\n\ndef seed(*, seed_value: int = 0) -> None:\n _ = paddle.seed(seed_value)\n return\n\n\ndef shuffle(\n x: paddle.Tensor,\n axis: Optional[int] = 0,\n /,\n *,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if seed:\n _ = paddle.seed(seed)\n # Use Paddle's randperm function to generate shuffled indices\n 
indices = paddle.randperm(x.shape[0], dtype=\"int64\")\n if x.dtype in [\n paddle.int8,\n paddle.int16,\n paddle.uint8,\n paddle.float16,\n paddle.complex64,\n paddle.complex128,\n paddle.bool,\n ]:\n if paddle.is_complex(x):\n shuffled_real = paddle.index_select(x.real(), indices)\n shuffled_imag = paddle.index_select(x.imag(), indices)\n return paddle.complex(shuffled_real, shuffled_imag)\n return paddle.index_select(x.cast(\"float32\"), indices).cast(x.dtype)\n return paddle.index_select(x, indices)\n", "path": "ivy/functional/backends/paddle/random.py"}], "after_files": [{"content": "\"\"\"Collection of Paddle random functions, wrapped to fit Ivy syntax and signature.\"\"\"\n\n# global\nimport paddle\nimport ivy.functional.backends.paddle as paddle_backend\nfrom typing import Optional, Union, Sequence\n\n# local\nimport ivy\nfrom paddle.fluid.libpaddle import Place\nfrom ivy.functional.backends.paddle.device import to_device\nfrom ivy.functional.ivy.random import (\n _check_bounds_and_get_shape,\n _randint_check_dtype_and_bound,\n _check_valid_scale,\n)\nfrom ivy.func_wrapper import with_unsupported_device_and_dtypes\nfrom . import backend_version\n\n# Extra #\n# ------#\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"int8\",)}},\n backend_version,\n)\ndef random_uniform(\n *,\n low: Union[float, paddle.Tensor] = 0.0,\n high: Union[float, paddle.Tensor] = 1.0,\n shape: Optional[Union[paddle.Tensor, ivy.NativeShape, Sequence[int]]] = None,\n dtype: paddle.dtype,\n device: Place,\n seed=None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if not dtype:\n dtype = ivy.default_int_dtype()\n dtype = ivy.as_native_dtype(dtype)\n low = paddle.cast(low, \"float32\") if isinstance(low, paddle.Tensor) else low\n high = paddle.cast(high, \"float32\") if isinstance(high, paddle.Tensor) else high\n shape = _check_bounds_and_get_shape(low, high, shape).shape\n # Set range and seed\n rng = high - low\n if seed:\n _ = paddle.seed(seed)\n random_base = paddle.uniform(shape, min=0.0, max=1.0)\n\n return paddle_backend.add(paddle_backend.multiply(random_base, rng), low).cast(\n dtype\n )\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"complex64\", \"complex128\")}},\n backend_version,\n)\ndef random_normal(\n *,\n mean: Union[float, paddle.Tensor] = 0.0,\n std: Union[float, paddle.Tensor] = 1.0,\n shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,\n dtype: paddle.dtype,\n seed: Optional[int] = None,\n device: Place,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n _check_valid_scale(std)\n shape = _check_bounds_and_get_shape(mean, std, shape).shape\n if seed:\n paddle.seed(seed)\n if isinstance(mean, (int, float)) and isinstance(std, (int, float)):\n return paddle.normal(mean, std, shape).cast(dtype)\n if mean.dtype not in [paddle.float32, paddle.float64]:\n mean = mean.cast(\"float32\")\n std = std.cast(mean.dtype)\n return paddle.normal(mean, std).cast(dtype)\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.4.2 and below\": {\"cpu\": (\"float16\",)}},\n backend_version,\n)\ndef multinomial(\n population_size: int,\n num_samples: int,\n /,\n *,\n batch_size: int = 1,\n probs: Optional[paddle.Tensor] = None,\n replace: bool = True,\n device: Place,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if probs is None:\n probs = paddle.ones((batch_size, num_samples)) / population_size\n probs = paddle.cast(probs, paddle.float32)\n if seed:\n paddle.seed(seed)\n x = 
paddle.multinomial(probs, num_samples=num_samples, replacement=replace)\n return x\n\n\n@with_unsupported_device_and_dtypes(\n {\"2.5.0 and below\": {\"cpu\": (\"int8\",)}},\n backend_version,\n)\ndef randint(\n low: Union[int, paddle.Tensor],\n high: Union[int, paddle.Tensor],\n /,\n *,\n shape: Optional[Union[ivy.NativeShape, Sequence[int]]] = None,\n device: Place,\n dtype: Optional[Union[paddle.dtype, ivy.Dtype]] = None,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if not dtype:\n dtype = ivy.default_int_dtype()\n dtype = ivy.as_native_dtype(dtype)\n _randint_check_dtype_and_bound(low, high, dtype)\n low = paddle.cast(low, \"float32\") if isinstance(low, paddle.Tensor) else low\n high = paddle.cast(high, \"float32\") if isinstance(high, paddle.Tensor) else high\n shape = _check_bounds_and_get_shape(low, high, shape).shape\n range = high - low\n if seed:\n _ = paddle.seed(seed)\n _retval = to_device(\n paddle.cast(\n paddle.uniform(shape or [1], min=0.0, max=1.0) * range + low, dtype\n ),\n device,\n )\n return _retval if shape else _retval.squeeze(axis=0)\n\n\ndef seed(*, seed_value: int = 0) -> None:\n _ = paddle.seed(seed_value)\n return\n\n\ndef shuffle(\n x: paddle.Tensor,\n axis: Optional[int] = 0,\n /,\n *,\n seed: Optional[int] = None,\n out: Optional[paddle.Tensor] = None,\n) -> paddle.Tensor:\n if seed:\n _ = paddle.seed(seed)\n # Use Paddle's randperm function to generate shuffled indices\n indices = paddle.randperm(x.shape[0], dtype=\"int64\")\n if x.dtype in [\n paddle.int8,\n paddle.int16,\n paddle.uint8,\n paddle.float16,\n paddle.complex64,\n paddle.complex128,\n paddle.bool,\n ]:\n if paddle.is_complex(x):\n shuffled_real = paddle.index_select(x.real(), indices)\n shuffled_imag = paddle.index_select(x.imag(), indices)\n return paddle.complex(shuffled_real, shuffled_imag)\n return paddle.index_select(x.cast(\"float32\"), indices).cast(x.dtype)\n return paddle.index_select(x, indices)\n", "path": "ivy/functional/backends/paddle/random.py"}]}
| 1,944 | 328 |
gh_patches_debug_24218
|
rasdani/github-patches
|
git_diff
|
Cloud-CV__EvalAI-1042
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Change path of storage of Evaluation Scripts
## Current behaviour
Currently, the evaluation scripts are stored in the `submission_files` directory which should not happen ideally.
## Solution
Modify the `RandomFile()` method such that the evaluation scripts should be uploaded in the `evaluation_scripts` directory instead of the `submission_files` directory.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/base/utils.py`
Content:
```
1 import os
2 import uuid
3
4 from django.conf import settings
5 from django.utils.deconstruct import deconstructible
6
7 from rest_framework.exceptions import NotFound
8 from rest_framework.pagination import PageNumberPagination
9
10
11 class StandardResultSetPagination(PageNumberPagination):
12 page_size = 100
13 page_size_query_param = 'page_size'
14 max_page_size = 1000
15
16
17 def paginated_queryset(queryset, request, pagination_class=PageNumberPagination()):
18 '''
19 Return a paginated result for a queryset
20 '''
21 paginator = pagination_class
22 paginator.page_size = settings.REST_FRAMEWORK['PAGE_SIZE']
23 result_page = paginator.paginate_queryset(queryset, request)
24 return (paginator, result_page)
25
26
27 @deconstructible
28 class RandomFileName(object):
29 def __init__(self, path):
30 self.path = path
31
32 def __call__(self, instance, filename):
33 extension = os.path.splitext(filename)[1]
34 if 'id' in self.path and instance.pk:
35 self.path = self.path.format(id=instance.pk)
36 elif 'id' not in self.path and instance.pk:
37 path = "submission_files/submission_{id}"
38 self.path = path.format(id=instance.pk)
39 filename = '{}{}'.format(uuid.uuid4(), extension)
40 filename = os.path.join(self.path, filename)
41 return filename
42
43
44 def get_model_object(model_name):
45 def get_model_by_pk(pk):
46 try:
47 model_object = model_name.objects.get(pk=pk)
48 return model_object
49 except model_name.DoesNotExist:
50 raise NotFound('{} {} does not exist'.format(model_name.__name__, pk))
51 get_model_by_pk.__name__ = 'get_{}_object'.format(model_name.__name__.lower())
52 return get_model_by_pk
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/apps/base/utils.py b/apps/base/utils.py
--- a/apps/base/utils.py
+++ b/apps/base/utils.py
@@ -2,7 +2,6 @@
import uuid
from django.conf import settings
-from django.utils.deconstruct import deconstructible
from rest_framework.exceptions import NotFound
from rest_framework.pagination import PageNumberPagination
@@ -24,20 +23,17 @@
return (paginator, result_page)
-@deconstructible
class RandomFileName(object):
def __init__(self, path):
self.path = path
def __call__(self, instance, filename):
extension = os.path.splitext(filename)[1]
+ path = self.path
if 'id' in self.path and instance.pk:
- self.path = self.path.format(id=instance.pk)
- elif 'id' not in self.path and instance.pk:
- path = "submission_files/submission_{id}"
- self.path = path.format(id=instance.pk)
+ path = self.path.format(id=instance.pk)
filename = '{}{}'.format(uuid.uuid4(), extension)
- filename = os.path.join(self.path, filename)
+ filename = os.path.join(path, filename)
return filename
|
{"golden_diff": "diff --git a/apps/base/utils.py b/apps/base/utils.py\n--- a/apps/base/utils.py\n+++ b/apps/base/utils.py\n@@ -2,7 +2,6 @@\n import uuid\n \n from django.conf import settings\n-from django.utils.deconstruct import deconstructible\n \n from rest_framework.exceptions import NotFound\n from rest_framework.pagination import PageNumberPagination\n@@ -24,20 +23,17 @@\n return (paginator, result_page)\n \n \n-@deconstructible\n class RandomFileName(object):\n def __init__(self, path):\n self.path = path\n \n def __call__(self, instance, filename):\n extension = os.path.splitext(filename)[1]\n+ path = self.path\n if 'id' in self.path and instance.pk:\n- self.path = self.path.format(id=instance.pk)\n- elif 'id' not in self.path and instance.pk:\n- path = \"submission_files/submission_{id}\"\n- self.path = path.format(id=instance.pk)\n+ path = self.path.format(id=instance.pk)\n filename = '{}{}'.format(uuid.uuid4(), extension)\n- filename = os.path.join(self.path, filename)\n+ filename = os.path.join(path, filename)\n return filename\n", "issue": "Change path of storage of Evaluation Scripts\n## Current behaviour\r\n\r\nCurrently, the evaluation scripts are stored in the `submission_files` directory which should not happen ideally.\r\n\r\n## Solution\r\nModify the `RandomFile()` method such that the evaluation scripts should be uploaded in the `evaluation_scripts` directory instead of the `submission_files` directory. \r\n\n", "before_files": [{"content": "import os\nimport uuid\n\nfrom django.conf import settings\nfrom django.utils.deconstruct import deconstructible\n\nfrom rest_framework.exceptions import NotFound\nfrom rest_framework.pagination import PageNumberPagination\n\n\nclass StandardResultSetPagination(PageNumberPagination):\n page_size = 100\n page_size_query_param = 'page_size'\n max_page_size = 1000\n\n\ndef paginated_queryset(queryset, request, pagination_class=PageNumberPagination()):\n '''\n Return a paginated result for a queryset\n '''\n paginator = pagination_class\n paginator.page_size = settings.REST_FRAMEWORK['PAGE_SIZE']\n result_page = paginator.paginate_queryset(queryset, request)\n return (paginator, result_page)\n\n\n@deconstructible\nclass RandomFileName(object):\n def __init__(self, path):\n self.path = path\n\n def __call__(self, instance, filename):\n extension = os.path.splitext(filename)[1]\n if 'id' in self.path and instance.pk:\n self.path = self.path.format(id=instance.pk)\n elif 'id' not in self.path and instance.pk:\n path = \"submission_files/submission_{id}\"\n self.path = path.format(id=instance.pk)\n filename = '{}{}'.format(uuid.uuid4(), extension)\n filename = os.path.join(self.path, filename)\n return filename\n\n\ndef get_model_object(model_name):\n def get_model_by_pk(pk):\n try:\n model_object = model_name.objects.get(pk=pk)\n return model_object\n except model_name.DoesNotExist:\n raise NotFound('{} {} does not exist'.format(model_name.__name__, pk))\n get_model_by_pk.__name__ = 'get_{}_object'.format(model_name.__name__.lower())\n return get_model_by_pk\n", "path": "apps/base/utils.py"}], "after_files": [{"content": "import os\nimport uuid\n\nfrom django.conf import settings\n\nfrom rest_framework.exceptions import NotFound\nfrom rest_framework.pagination import PageNumberPagination\n\n\nclass StandardResultSetPagination(PageNumberPagination):\n page_size = 100\n page_size_query_param = 'page_size'\n max_page_size = 1000\n\n\ndef paginated_queryset(queryset, request, pagination_class=PageNumberPagination()):\n '''\n Return a 
paginated result for a queryset\n '''\n paginator = pagination_class\n paginator.page_size = settings.REST_FRAMEWORK['PAGE_SIZE']\n result_page = paginator.paginate_queryset(queryset, request)\n return (paginator, result_page)\n\n\nclass RandomFileName(object):\n def __init__(self, path):\n self.path = path\n\n def __call__(self, instance, filename):\n extension = os.path.splitext(filename)[1]\n path = self.path\n if 'id' in self.path and instance.pk:\n path = self.path.format(id=instance.pk)\n filename = '{}{}'.format(uuid.uuid4(), extension)\n filename = os.path.join(path, filename)\n return filename\n\n\ndef get_model_object(model_name):\n def get_model_by_pk(pk):\n try:\n model_object = model_name.objects.get(pk=pk)\n return model_object\n except model_name.DoesNotExist:\n raise NotFound('{} {} does not exist'.format(model_name.__name__, pk))\n get_model_by_pk.__name__ = 'get_{}_object'.format(model_name.__name__.lower())\n return get_model_by_pk\n", "path": "apps/base/utils.py"}]}
| 801 | 264 |
gh_patches_debug_2828
|
rasdani/github-patches
|
git_diff
|
svthalia__concrexit-3188
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Non-closable site announcements can secretly be closed if you send the right cookie
### Describe the bug
Non-closable site announcements can secretly be closed if you send the right cookie
### How to reproduce
Steps to reproduce the behaviour:
1. Have an announcement that is closable
2. Close it on your machine
3. Change the announcement to be non-closable
4. It still isn't there because you already closed it.
The `closable` field just controls whether the close button appears or not, but it doesn't influence the actual logic
### Expected behaviour
Always show non-closable announcements
### Screenshots
### Additional context
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/announcements/context_processors.py`
Content:
```
1 """These context processors can be used to expand the context provided to admin views."""
2 from .models import Announcement
3
4
5 def announcements(request):
6 """Get a list of announcements.
7
8 Filters out announcements that have been closed already.
9
10 :param request: the request object
11 :return: a dict containing the list announcements
12 :rtype: dict
13 """
14 closed_announcements = request.session.get("closed_announcements", [])
15 announcements_list = [
16 a
17 for a in Announcement.objects.all()
18 if a.is_visible and a.pk not in closed_announcements
19 ]
20
21 # Announcements set by AnnouncementMiddleware.
22 persistent_announcements = getattr(request, "_announcements", [])
23 return {
24 "announcements": announcements_list,
25 "persistent_announcements": persistent_announcements,
26 }
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/website/announcements/context_processors.py b/website/announcements/context_processors.py
--- a/website/announcements/context_processors.py
+++ b/website/announcements/context_processors.py
@@ -15,7 +15,7 @@
announcements_list = [
a
for a in Announcement.objects.all()
- if a.is_visible and a.pk not in closed_announcements
+ if a.is_visible and (not a.closeable or a.pk not in closed_announcements)
]
# Announcements set by AnnouncementMiddleware.
|
{"golden_diff": "diff --git a/website/announcements/context_processors.py b/website/announcements/context_processors.py\n--- a/website/announcements/context_processors.py\n+++ b/website/announcements/context_processors.py\n@@ -15,7 +15,7 @@\n announcements_list = [\n a\n for a in Announcement.objects.all()\n- if a.is_visible and a.pk not in closed_announcements\n+ if a.is_visible and (not a.closeable or a.pk not in closed_announcements)\n ]\n \n # Announcements set by AnnouncementMiddleware.\n", "issue": "Non-closable site announcements can secretly be closed if you send the right cookie\n### Describe the bug\r\nNon-closable site announcements can secretly be closed if you send the right cookie\r\n\r\n### How to reproduce\r\nSteps to reproduce the behaviour:\r\n1. Have an announcement that is closable\r\n2. Close it on your machine\r\n3. Change the announcement to be non-closable\r\n4. It still isn't there because you already closed it.\r\n\r\nThe `closable` field just controls whether the close button appears or not, but it doesn't influence the actual logic\r\n\r\n### Expected behaviour\r\nAlways show non-closable announcements\r\n\r\n### Screenshots\r\n\r\n### Additional context\r\n\n", "before_files": [{"content": "\"\"\"These context processors can be used to expand the context provided to admin views.\"\"\"\nfrom .models import Announcement\n\n\ndef announcements(request):\n \"\"\"Get a list of announcements.\n\n Filters out announcements that have been closed already.\n\n :param request: the request object\n :return: a dict containing the list announcements\n :rtype: dict\n \"\"\"\n closed_announcements = request.session.get(\"closed_announcements\", [])\n announcements_list = [\n a\n for a in Announcement.objects.all()\n if a.is_visible and a.pk not in closed_announcements\n ]\n\n # Announcements set by AnnouncementMiddleware.\n persistent_announcements = getattr(request, \"_announcements\", [])\n return {\n \"announcements\": announcements_list,\n \"persistent_announcements\": persistent_announcements,\n }\n", "path": "website/announcements/context_processors.py"}], "after_files": [{"content": "\"\"\"These context processors can be used to expand the context provided to admin views.\"\"\"\nfrom .models import Announcement\n\n\ndef announcements(request):\n \"\"\"Get a list of announcements.\n\n Filters out announcements that have been closed already.\n\n :param request: the request object\n :return: a dict containing the list announcements\n :rtype: dict\n \"\"\"\n closed_announcements = request.session.get(\"closed_announcements\", [])\n announcements_list = [\n a\n for a in Announcement.objects.all()\n if a.is_visible and (not a.closeable or a.pk not in closed_announcements)\n ]\n\n # Announcements set by AnnouncementMiddleware.\n persistent_announcements = getattr(request, \"_announcements\", [])\n return {\n \"announcements\": announcements_list,\n \"persistent_announcements\": persistent_announcements,\n }\n", "path": "website/announcements/context_processors.py"}]}
| 614 | 120 |
gh_patches_debug_25342
|
rasdani/github-patches
|
git_diff
|
getredash__redash-4239
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Change PagerDuty's default summary text
Currently PagerDuty's Alert destination default summary text uses the query id and name. We should change it to use the alert name as it's usually better explains what the alert is.
While #4153 implements ability to customize the summary text, it's good to have a saner default regardless.
(If #4153 is not merged before implementing, should be implemented based on its branch)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redash/destinations/pagerduty.py`
Content:
```
1 import logging
2 from redash.destinations import *
3
4 enabled = True
5
6 try:
7 import pypd
8 except ImportError:
9 enabled = False
10
11
12 class PagerDuty(BaseDestination):
13
14 KEY_STRING = '{alert_id}_{query_id}'
15 DESCRIPTION_STR = u'Alert - Redash Query #{query_id}: {query_name}'
16
17 @classmethod
18 def enabled(cls):
19 return enabled
20
21 @classmethod
22 def configuration_schema(cls):
23 return {
24 'type': 'object',
25 'properties': {
26 'integration_key': {
27 'type': 'string',
28 'title': 'PagerDuty Service Integration Key'
29 },
30 'description': {
31 'type': 'string',
32 'title': 'Description for the event, defaults to query',
33 }
34 },
35 "required": ["integration_key"]
36 }
37
38 @classmethod
39 def icon(cls):
40 return 'creative-commons-pd-alt'
41
42 def notify(self, alert, query, user, new_state, app, host, options):
43
44 if alert.custom_subject:
45 default_desc = alert.custom_subject
46 elif options.get('description'):
47 default_desc = options.get('description')
48 else:
49 default_desc = self.DESCRIPTION_STR.format(query_id=query.id, query_name=query.name)
50
51 incident_key = self.KEY_STRING.format(alert_id=alert.id, query_id=query.id)
52 data = {
53 'routing_key': options.get('integration_key'),
54 'incident_key': incident_key,
55 'dedup_key': incident_key,
56 'payload': {
57 'summary': default_desc,
58 'severity': 'error',
59 'source': 'redash',
60 }
61 }
62
63 if alert.custom_body:
64 data['payload']['custom_details'] = alert.custom_body
65
66 if new_state == 'triggered':
67 data['event_action'] = 'trigger'
68 elif new_state == "unknown":
69 logging.info('Unknown state, doing nothing')
70 return
71 else:
72 data['event_action'] = 'resolve'
73
74 try:
75
76 ev = pypd.EventV2.create(data=data)
77 logging.warning(ev)
78
79 except Exception:
80 logging.exception("PagerDuty trigger failed!")
81
82
83 register(PagerDuty)
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/redash/destinations/pagerduty.py b/redash/destinations/pagerduty.py
--- a/redash/destinations/pagerduty.py
+++ b/redash/destinations/pagerduty.py
@@ -12,7 +12,7 @@
class PagerDuty(BaseDestination):
KEY_STRING = '{alert_id}_{query_id}'
- DESCRIPTION_STR = u'Alert - Redash Query #{query_id}: {query_name}'
+ DESCRIPTION_STR = u'Alert: {alert_name}'
@classmethod
def enabled(cls):
@@ -29,7 +29,7 @@
},
'description': {
'type': 'string',
- 'title': 'Description for the event, defaults to query',
+ 'title': 'Description for the event, defaults to alert name',
}
},
"required": ["integration_key"]
@@ -46,7 +46,7 @@
elif options.get('description'):
default_desc = options.get('description')
else:
- default_desc = self.DESCRIPTION_STR.format(query_id=query.id, query_name=query.name)
+ default_desc = self.DESCRIPTION_STR.format(alert_name=alert.name)
incident_key = self.KEY_STRING.format(alert_id=alert.id, query_id=query.id)
data = {
|
{"golden_diff": "diff --git a/redash/destinations/pagerduty.py b/redash/destinations/pagerduty.py\n--- a/redash/destinations/pagerduty.py\n+++ b/redash/destinations/pagerduty.py\n@@ -12,7 +12,7 @@\n class PagerDuty(BaseDestination):\n \n KEY_STRING = '{alert_id}_{query_id}'\n- DESCRIPTION_STR = u'Alert - Redash Query #{query_id}: {query_name}'\n+ DESCRIPTION_STR = u'Alert: {alert_name}'\n \n @classmethod\n def enabled(cls):\n@@ -29,7 +29,7 @@\n },\n 'description': {\n 'type': 'string',\n- 'title': 'Description for the event, defaults to query',\n+ 'title': 'Description for the event, defaults to alert name',\n }\n },\n \"required\": [\"integration_key\"]\n@@ -46,7 +46,7 @@\n elif options.get('description'):\n default_desc = options.get('description')\n else:\n- default_desc = self.DESCRIPTION_STR.format(query_id=query.id, query_name=query.name)\n+ default_desc = self.DESCRIPTION_STR.format(alert_name=alert.name)\n \n incident_key = self.KEY_STRING.format(alert_id=alert.id, query_id=query.id)\n data = {\n", "issue": "Change PagerDuty's default summary text\nCurrently PagerDuty's Alert destination default summary text uses the query id and name. We should change it to use the alert name as it's usually better explains what the alert is.\r\n\r\nWhile #4153 implements ability to customize the summary text, it's good to have a saner default regardless.\r\n\r\n(If #4153 is not merged before implementing, should be implemented based on its branch)\n", "before_files": [{"content": "import logging\nfrom redash.destinations import *\n\nenabled = True\n\ntry:\n import pypd\nexcept ImportError:\n enabled = False\n\n\nclass PagerDuty(BaseDestination):\n\n KEY_STRING = '{alert_id}_{query_id}'\n DESCRIPTION_STR = u'Alert - Redash Query #{query_id}: {query_name}'\n\n @classmethod\n def enabled(cls):\n return enabled\n\n @classmethod\n def configuration_schema(cls):\n return {\n 'type': 'object',\n 'properties': {\n 'integration_key': {\n 'type': 'string',\n 'title': 'PagerDuty Service Integration Key'\n },\n 'description': {\n 'type': 'string',\n 'title': 'Description for the event, defaults to query',\n }\n },\n \"required\": [\"integration_key\"]\n }\n\n @classmethod\n def icon(cls):\n return 'creative-commons-pd-alt'\n\n def notify(self, alert, query, user, new_state, app, host, options):\n\n if alert.custom_subject:\n default_desc = alert.custom_subject\n elif options.get('description'):\n default_desc = options.get('description')\n else:\n default_desc = self.DESCRIPTION_STR.format(query_id=query.id, query_name=query.name)\n\n incident_key = self.KEY_STRING.format(alert_id=alert.id, query_id=query.id)\n data = {\n 'routing_key': options.get('integration_key'),\n 'incident_key': incident_key,\n 'dedup_key': incident_key,\n 'payload': {\n 'summary': default_desc,\n 'severity': 'error',\n 'source': 'redash',\n }\n }\n\n if alert.custom_body:\n data['payload']['custom_details'] = alert.custom_body\n\n if new_state == 'triggered':\n data['event_action'] = 'trigger'\n elif new_state == \"unknown\":\n logging.info('Unknown state, doing nothing')\n return\n else:\n data['event_action'] = 'resolve'\n\n try:\n\n ev = pypd.EventV2.create(data=data)\n logging.warning(ev)\n\n except Exception:\n logging.exception(\"PagerDuty trigger failed!\")\n\n\nregister(PagerDuty)\n", "path": "redash/destinations/pagerduty.py"}], "after_files": [{"content": "import logging\nfrom redash.destinations import *\n\nenabled = True\n\ntry:\n import pypd\nexcept ImportError:\n enabled = False\n\n\nclass 
PagerDuty(BaseDestination):\n\n KEY_STRING = '{alert_id}_{query_id}'\n DESCRIPTION_STR = u'Alert: {alert_name}'\n\n @classmethod\n def enabled(cls):\n return enabled\n\n @classmethod\n def configuration_schema(cls):\n return {\n 'type': 'object',\n 'properties': {\n 'integration_key': {\n 'type': 'string',\n 'title': 'PagerDuty Service Integration Key'\n },\n 'description': {\n 'type': 'string',\n 'title': 'Description for the event, defaults to alert name',\n }\n },\n \"required\": [\"integration_key\"]\n }\n\n @classmethod\n def icon(cls):\n return 'creative-commons-pd-alt'\n\n def notify(self, alert, query, user, new_state, app, host, options):\n\n if alert.custom_subject:\n default_desc = alert.custom_subject\n elif options.get('description'):\n default_desc = options.get('description')\n else:\n default_desc = self.DESCRIPTION_STR.format(alert_name=alert.name)\n\n incident_key = self.KEY_STRING.format(alert_id=alert.id, query_id=query.id)\n data = {\n 'routing_key': options.get('integration_key'),\n 'incident_key': incident_key,\n 'dedup_key': incident_key,\n 'payload': {\n 'summary': default_desc,\n 'severity': 'error',\n 'source': 'redash',\n }\n }\n\n if alert.custom_body:\n data['payload']['custom_details'] = alert.custom_body\n\n if new_state == 'triggered':\n data['event_action'] = 'trigger'\n elif new_state == \"unknown\":\n logging.info('Unknown state, doing nothing')\n return\n else:\n data['event_action'] = 'resolve'\n\n try:\n\n ev = pypd.EventV2.create(data=data)\n logging.warning(ev)\n\n except Exception:\n logging.exception(\"PagerDuty trigger failed!\")\n\n\nregister(PagerDuty)\n", "path": "redash/destinations/pagerduty.py"}]}
| 991 | 286 |
gh_patches_debug_32543
|
rasdani/github-patches
|
git_diff
|
streamlink__streamlink-1935
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Can't Download Picarto Video
## Bug Report
- [X ] I have read the contribution guidelines.
### Description
Picarto.tv plugin doesn't seem to download any recorded streams anymore
### Expected / Actual behavior
Expected it to download video, didn't
### Reproduction steps / Explicit stream URLs to test
1. streamlink -o Amarynceus.flv https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv
### Logs
```
streamlink -o Amarynceus.flv https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv -l debug
[cli][debug] OS: Linux-4.15.0-24-generic-x86_64-with-Ubuntu-18.04-bionic
[cli][debug] Python: 2.7.15rc1
[cli][debug] Streamlink: 0.14.2+24.g6bf654a2
[cli][debug] Requests(2.19.1), Socks(1.6.7), Websocket(0.48.0)
[cli][info] Found matching plugin picarto for URL https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv
[plugin.picarto][debug] Possible VOD stream...
[plugin.picarto][warning] Probably a VOD stream but no VOD found?
[plugin.picarto][error] The channel Amarynceus_2018.07.04.22.55.48.flv does not exist
error: No playable streams found on this URL: https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/picarto.py`
Content:
```
1 from __future__ import print_function
2
3 import re
4 import json
5
6 from streamlink.plugin import Plugin
7 from streamlink.stream import HLSStream
8 from streamlink.stream import RTMPStream
9
10
11 class Picarto(Plugin):
12 CHANNEL_API_URL = "https://api.picarto.tv/v1/channel/name/{channel}"
13 VIDEO_API_URL = "https://picarto.tv/process/channel"
14 RTMP_URL = "rtmp://{server}:1935/play/"
15 RTMP_PLAYPATH = "golive+{channel}?token={token}"
16 HLS_URL = "https://{server}/hls/{channel}/index.m3u8?token={token}"
17
18 # Regex for all usable URLs
19 _url_re = re.compile(r"""
20 https?://(?:\w+\.)?picarto\.tv/(?:videopopout/)?([^&?/]+)
21 """, re.VERBOSE)
22
23 # Regex for VOD extraction
24 _vod_re = re.compile(r'''vod: "(https?://[\S]+?/index.m3u8)",''')
25
26 @classmethod
27 def can_handle_url(cls, url):
28 return cls._url_re.match(url) is not None
29
30 def _create_hls_stream(self, server, channel, token):
31 streams = HLSStream.parse_variant_playlist(self.session,
32 self.HLS_URL.format(
33 server=server,
34 channel=channel,
35 token=token),
36 verify=False)
37 if len(streams) > 1:
38 self.logger.debug("Multiple HLS streams found")
39 return streams
40 elif len(streams) == 0:
41 self.logger.warning("No HLS streams found when expected")
42 return {}
43 else:
44 # one HLS streams, rename it to live
45 return {"live": list(streams.values())[0]}
46
47 def _create_flash_stream(self, server, channel, token):
48 params = {
49 "rtmp": self.RTMP_URL.format(server=server),
50 "playpath": self.RTMP_PLAYPATH.format(token=token, channel=channel)
51 }
52 return RTMPStream(self.session, params=params)
53
54 def _get_vod_stream(self, page):
55 m = self._vod_re.search(page.text)
56 if m:
57 return HLSStream.parse_variant_playlist(self.session, m.group(1))
58
59 def _get_streams(self):
60 url_channel_name = self._url_re.match(self.url).group(1)
61
62 # Handle VODs first, since their "channel name" is different
63 if url_channel_name.endswith(".flv"):
64 self.logger.debug("Possible VOD stream...")
65 page = self.session.http.get(self.url)
66 vod_streams = self._get_vod_stream(page)
67 if vod_streams:
68 for s in vod_streams.items():
69 yield s
70 return
71 else:
72 self.logger.warning("Probably a VOD stream but no VOD found?")
73
74 ci = self.session.http.get(self.CHANNEL_API_URL.format(channel=url_channel_name), raise_for_status=False)
75
76 if ci.status_code == 404:
77 self.logger.error("The channel {0} does not exist".format(url_channel_name))
78 return
79
80 channel_api_json = json.loads(ci.text)
81
82 if channel_api_json["online"] != True:
83 self.logger.error("The channel {0} is currently offline".format(url_channel_name))
84 return
85
86 server = None
87 token = "public"
88 channel = channel_api_json["name"]
89
90 # Extract preferred edge server and available techs from the undocumented channel API
91 channel_server_res = self.session.http.post(self.VIDEO_API_URL, data={"loadbalancinginfo": channel})
92 info_json = json.loads(channel_server_res.text)
93 pref = info_json["preferedEdge"]
94 for i in info_json["edges"]:
95 if i["id"] == pref:
96 server = i["ep"]
97 break
98 self.logger.debug("Using load balancing server {0} : {1} for channel {2}",
99 pref,
100 server,
101 channel)
102
103 for i in info_json["techs"]:
104 if i["label"] == "HLS":
105 for s in self._create_hls_stream(server, channel, token).items():
106 yield s
107 elif i["label"] == "RTMP Flash":
108 stream = self._create_flash_stream(server, channel, token)
109 yield "live", stream
110
111
112 __plugin__ = Picarto
113
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/streamlink/plugins/picarto.py b/src/streamlink/plugins/picarto.py
--- a/src/streamlink/plugins/picarto.py
+++ b/src/streamlink/plugins/picarto.py
@@ -4,8 +4,10 @@
import json
from streamlink.plugin import Plugin
+from streamlink.plugin.api import validate
from streamlink.stream import HLSStream
from streamlink.stream import RTMPStream
+from streamlink.utils import parse_json
class Picarto(Plugin):
@@ -15,13 +17,29 @@
RTMP_PLAYPATH = "golive+{channel}?token={token}"
HLS_URL = "https://{server}/hls/{channel}/index.m3u8?token={token}"
+
# Regex for all usable URLs
_url_re = re.compile(r"""
https?://(?:\w+\.)?picarto\.tv/(?:videopopout/)?([^&?/]+)
""", re.VERBOSE)
# Regex for VOD extraction
- _vod_re = re.compile(r'''vod: "(https?://[\S]+?/index.m3u8)",''')
+ _vod_re = re.compile(r'''(?<=#vod-player", )(\{.*?\})''')
+
+ data_schema = validate.Schema(
+ validate.transform(_vod_re.search),
+ validate.any(
+ None,
+ validate.all(
+ validate.get(0),
+ validate.transform(parse_json),
+ {
+ "vod": validate.url(),
+ }
+ )
+ )
+ )
+
@classmethod
def can_handle_url(cls, url):
@@ -52,9 +70,10 @@
return RTMPStream(self.session, params=params)
def _get_vod_stream(self, page):
- m = self._vod_re.search(page.text)
- if m:
- return HLSStream.parse_variant_playlist(self.session, m.group(1))
+ data = self.data_schema.validate(page.text)
+
+ if data:
+ return HLSStream.parse_variant_playlist(self.session, data["vod"])
def _get_streams(self):
url_channel_name = self._url_re.match(self.url).group(1)
|
{"golden_diff": "diff --git a/src/streamlink/plugins/picarto.py b/src/streamlink/plugins/picarto.py\n--- a/src/streamlink/plugins/picarto.py\n+++ b/src/streamlink/plugins/picarto.py\n@@ -4,8 +4,10 @@\n import json\n \n from streamlink.plugin import Plugin\n+from streamlink.plugin.api import validate\n from streamlink.stream import HLSStream\n from streamlink.stream import RTMPStream\n+from streamlink.utils import parse_json\n \n \n class Picarto(Plugin):\n@@ -15,13 +17,29 @@\n RTMP_PLAYPATH = \"golive+{channel}?token={token}\"\n HLS_URL = \"https://{server}/hls/{channel}/index.m3u8?token={token}\"\n \n+\n # Regex for all usable URLs\n _url_re = re.compile(r\"\"\"\n https?://(?:\\w+\\.)?picarto\\.tv/(?:videopopout/)?([^&?/]+)\n \"\"\", re.VERBOSE)\n \n # Regex for VOD extraction\n- _vod_re = re.compile(r'''vod: \"(https?://[\\S]+?/index.m3u8)\",''')\n+ _vod_re = re.compile(r'''(?<=#vod-player\", )(\\{.*?\\})''')\n+\n+ data_schema = validate.Schema(\n+ validate.transform(_vod_re.search),\n+ validate.any(\n+ None,\n+ validate.all(\n+ validate.get(0),\n+ validate.transform(parse_json),\n+ {\n+ \"vod\": validate.url(),\n+ }\n+ )\n+ )\n+ )\n+\n \n @classmethod\n def can_handle_url(cls, url):\n@@ -52,9 +70,10 @@\n return RTMPStream(self.session, params=params)\n \n def _get_vod_stream(self, page):\n- m = self._vod_re.search(page.text)\n- if m:\n- return HLSStream.parse_variant_playlist(self.session, m.group(1))\n+ data = self.data_schema.validate(page.text)\n+\n+ if data:\n+ return HLSStream.parse_variant_playlist(self.session, data[\"vod\"])\n \n def _get_streams(self):\n url_channel_name = self._url_re.match(self.url).group(1)\n", "issue": "Can't Download Picarto Video\n## Bug Report\r\n\r\n- [X ] I have read the contribution guidelines. \r\n\r\n### Description\r\n\r\nPicarto.tv plugin doesn't seem to download any recorded streams anymore\r\n\r\n### Expected / Actual behavior\r\n\r\nExpected it to download video, didn't\r\n\r\n### Reproduction steps / Explicit stream URLs to test\r\n\r\n1. 
streamlink -o Amarynceus.flv https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv\r\n\r\n### Logs\r\n\r\n\r\n```\r\nstreamlink -o Amarynceus.flv https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv -l debug\r\n[cli][debug] OS: Linux-4.15.0-24-generic-x86_64-with-Ubuntu-18.04-bionic\r\n[cli][debug] Python: 2.7.15rc1\r\n[cli][debug] Streamlink: 0.14.2+24.g6bf654a2\r\n[cli][debug] Requests(2.19.1), Socks(1.6.7), Websocket(0.48.0)\r\n[cli][info] Found matching plugin picarto for URL https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv\r\n[plugin.picarto][debug] Possible VOD stream...\r\n[plugin.picarto][warning] Probably a VOD stream but no VOD found?\r\n[plugin.picarto][error] The channel Amarynceus_2018.07.04.22.55.48.flv does not exist\r\nerror: No playable streams found on this URL: https://picarto.tv/videopopout/Amarynceus_2018.07.04.22.55.48.flv\r\n```\n", "before_files": [{"content": "from __future__ import print_function\n\nimport re\nimport json\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.stream import HLSStream\nfrom streamlink.stream import RTMPStream\n\n\nclass Picarto(Plugin):\n CHANNEL_API_URL = \"https://api.picarto.tv/v1/channel/name/{channel}\"\n VIDEO_API_URL = \"https://picarto.tv/process/channel\"\n RTMP_URL = \"rtmp://{server}:1935/play/\"\n RTMP_PLAYPATH = \"golive+{channel}?token={token}\"\n HLS_URL = \"https://{server}/hls/{channel}/index.m3u8?token={token}\"\n\n # Regex for all usable URLs\n _url_re = re.compile(r\"\"\"\n https?://(?:\\w+\\.)?picarto\\.tv/(?:videopopout/)?([^&?/]+)\n \"\"\", re.VERBOSE)\n\n # Regex for VOD extraction\n _vod_re = re.compile(r'''vod: \"(https?://[\\S]+?/index.m3u8)\",''')\n\n @classmethod\n def can_handle_url(cls, url):\n return cls._url_re.match(url) is not None\n\n def _create_hls_stream(self, server, channel, token):\n streams = HLSStream.parse_variant_playlist(self.session,\n self.HLS_URL.format(\n server=server,\n channel=channel,\n token=token),\n verify=False)\n if len(streams) > 1:\n self.logger.debug(\"Multiple HLS streams found\")\n return streams\n elif len(streams) == 0:\n self.logger.warning(\"No HLS streams found when expected\")\n return {}\n else:\n # one HLS streams, rename it to live\n return {\"live\": list(streams.values())[0]}\n\n def _create_flash_stream(self, server, channel, token):\n params = {\n \"rtmp\": self.RTMP_URL.format(server=server),\n \"playpath\": self.RTMP_PLAYPATH.format(token=token, channel=channel)\n }\n return RTMPStream(self.session, params=params)\n\n def _get_vod_stream(self, page):\n m = self._vod_re.search(page.text)\n if m:\n return HLSStream.parse_variant_playlist(self.session, m.group(1))\n\n def _get_streams(self):\n url_channel_name = self._url_re.match(self.url).group(1)\n\n # Handle VODs first, since their \"channel name\" is different\n if url_channel_name.endswith(\".flv\"):\n self.logger.debug(\"Possible VOD stream...\")\n page = self.session.http.get(self.url)\n vod_streams = self._get_vod_stream(page)\n if vod_streams:\n for s in vod_streams.items():\n yield s\n return\n else:\n self.logger.warning(\"Probably a VOD stream but no VOD found?\")\n\n ci = self.session.http.get(self.CHANNEL_API_URL.format(channel=url_channel_name), raise_for_status=False)\n\n if ci.status_code == 404:\n self.logger.error(\"The channel {0} does not exist\".format(url_channel_name))\n return\n\n channel_api_json = json.loads(ci.text)\n\n if channel_api_json[\"online\"] != True:\n self.logger.error(\"The channel {0} is currently 
offline\".format(url_channel_name))\n return\n\n server = None\n token = \"public\"\n channel = channel_api_json[\"name\"]\n\n # Extract preferred edge server and available techs from the undocumented channel API\n channel_server_res = self.session.http.post(self.VIDEO_API_URL, data={\"loadbalancinginfo\": channel})\n info_json = json.loads(channel_server_res.text)\n pref = info_json[\"preferedEdge\"]\n for i in info_json[\"edges\"]:\n if i[\"id\"] == pref:\n server = i[\"ep\"]\n break\n self.logger.debug(\"Using load balancing server {0} : {1} for channel {2}\",\n pref,\n server,\n channel)\n\n for i in info_json[\"techs\"]:\n if i[\"label\"] == \"HLS\":\n for s in self._create_hls_stream(server, channel, token).items():\n yield s\n elif i[\"label\"] == \"RTMP Flash\":\n stream = self._create_flash_stream(server, channel, token)\n yield \"live\", stream\n\n\n__plugin__ = Picarto\n", "path": "src/streamlink/plugins/picarto.py"}], "after_files": [{"content": "from __future__ import print_function\n\nimport re\nimport json\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\nfrom streamlink.stream import RTMPStream\nfrom streamlink.utils import parse_json\n\n\nclass Picarto(Plugin):\n CHANNEL_API_URL = \"https://api.picarto.tv/v1/channel/name/{channel}\"\n VIDEO_API_URL = \"https://picarto.tv/process/channel\"\n RTMP_URL = \"rtmp://{server}:1935/play/\"\n RTMP_PLAYPATH = \"golive+{channel}?token={token}\"\n HLS_URL = \"https://{server}/hls/{channel}/index.m3u8?token={token}\"\n\n\n # Regex for all usable URLs\n _url_re = re.compile(r\"\"\"\n https?://(?:\\w+\\.)?picarto\\.tv/(?:videopopout/)?([^&?/]+)\n \"\"\", re.VERBOSE)\n\n # Regex for VOD extraction\n _vod_re = re.compile(r'''(?<=#vod-player\", )(\\{.*?\\})''')\n\n data_schema = validate.Schema(\n validate.transform(_vod_re.search),\n validate.any(\n None,\n validate.all(\n validate.get(0),\n validate.transform(parse_json),\n {\n \"vod\": validate.url(),\n }\n )\n )\n )\n\n\n @classmethod\n def can_handle_url(cls, url):\n return cls._url_re.match(url) is not None\n\n def _create_hls_stream(self, server, channel, token):\n streams = HLSStream.parse_variant_playlist(self.session,\n self.HLS_URL.format(\n server=server,\n channel=channel,\n token=token),\n verify=False)\n if len(streams) > 1:\n self.logger.debug(\"Multiple HLS streams found\")\n return streams\n elif len(streams) == 0:\n self.logger.warning(\"No HLS streams found when expected\")\n return {}\n else:\n # one HLS streams, rename it to live\n return {\"live\": list(streams.values())[0]}\n\n def _create_flash_stream(self, server, channel, token):\n params = {\n \"rtmp\": self.RTMP_URL.format(server=server),\n \"playpath\": self.RTMP_PLAYPATH.format(token=token, channel=channel)\n }\n return RTMPStream(self.session, params=params)\n\n def _get_vod_stream(self, page):\n data = self.data_schema.validate(page.text)\n\n if data:\n return HLSStream.parse_variant_playlist(self.session, data[\"vod\"])\n\n def _get_streams(self):\n url_channel_name = self._url_re.match(self.url).group(1)\n\n # Handle VODs first, since their \"channel name\" is different\n if url_channel_name.endswith(\".flv\"):\n self.logger.debug(\"Possible VOD stream...\")\n page = self.session.http.get(self.url)\n vod_streams = self._get_vod_stream(page)\n if vod_streams:\n for s in vod_streams.items():\n yield s\n return\n else:\n self.logger.warning(\"Probably a VOD stream but no VOD found?\")\n\n ci = 
self.session.http.get(self.CHANNEL_API_URL.format(channel=url_channel_name), raise_for_status=False)\n\n if ci.status_code == 404:\n self.logger.error(\"The channel {0} does not exist\".format(url_channel_name))\n return\n\n channel_api_json = json.loads(ci.text)\n\n if channel_api_json[\"online\"] != True:\n self.logger.error(\"The channel {0} is currently offline\".format(url_channel_name))\n return\n\n server = None\n token = \"public\"\n channel = channel_api_json[\"name\"]\n\n # Extract preferred edge server and available techs from the undocumented channel API\n channel_server_res = self.session.http.post(self.VIDEO_API_URL, data={\"loadbalancinginfo\": channel})\n info_json = json.loads(channel_server_res.text)\n pref = info_json[\"preferedEdge\"]\n for i in info_json[\"edges\"]:\n if i[\"id\"] == pref:\n server = i[\"ep\"]\n break\n self.logger.debug(\"Using load balancing server {0} : {1} for channel {2}\",\n pref,\n server,\n channel)\n\n for i in info_json[\"techs\"]:\n if i[\"label\"] == \"HLS\":\n for s in self._create_hls_stream(server, channel, token).items():\n yield s\n elif i[\"label\"] == \"RTMP Flash\":\n stream = self._create_flash_stream(server, channel, token)\n yield \"live\", stream\n\n\n__plugin__ = Picarto\n", "path": "src/streamlink/plugins/picarto.py"}]}
| 1,896 | 489 |
gh_patches_debug_18680
|
rasdani/github-patches
|
git_diff
|
wagtail__wagtail-118
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
'AnonymousUser' object has no attribute 'has_usable_password'
Visiting http://localhost:8000/admin/account/ as not logged in user gives the AttributeError above...
Traceback;
``` Environment:
Request Method: GET
Request URL: http://localhost:8000/admin/account/
Django Version: 1.6.2
Python Version: 2.7.6
Installed Applications:
('django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'south',
'compressor',
'taggit',
'modelcluster',
'django.contrib.admin',
'wagtail.wagtailcore',
'wagtail.wagtailadmin',
'wagtail.wagtaildocs',
'wagtail.wagtailsnippets',
'wagtail.wagtailusers',
'wagtail.wagtailimages',
'wagtail.wagtailembeds',
'wagtail.wagtailsearch',
'wagtail.wagtailredirects')
Installed Middleware:
('django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
'wagtail.wagtailcore.middleware.SiteMiddleware',
'wagtail.wagtailredirects.middleware.RedirectMiddleware')
Traceback:
File "C:\Users\drager\developing\django-env\lib\site-packages\django\core\handlers\base.py" in get_response
114. response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "c:\users\drager\developing\django-env\src\wagtail\wagtail\wagtailadmin\views\account.py" in account
9. 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),
File "C:\Users\drager\developing\django-env\lib\site-packages\django\utils\functional.py" in inner
214. return func(self._wrapped, *args)
Exception Type: AttributeError at /admin/account/
Exception Value: 'AnonymousUser' object has no attribute 'has_usable_password'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/wagtailadmin/views/account.py`
Content:
```
1 from django.conf import settings
2 from django.shortcuts import render, redirect
3 from django.contrib import messages
4 from django.contrib.auth.forms import SetPasswordForm
5 from django.utils.translation import ugettext as _
6
7 def account(request):
8 return render(request, 'wagtailadmin/account/account.html', {
9 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),
10 })
11
12
13 def change_password(request):
14 can_change_password = request.user.has_usable_password()
15
16 if can_change_password:
17 if request.POST:
18 form = SetPasswordForm(request.user, request.POST)
19
20 if form.is_valid():
21 form.save()
22
23 messages.success(request, _("Your password has been changed successfully!"))
24 return redirect('wagtailadmin_account')
25 else:
26 form = SetPasswordForm(request.user)
27 else:
28 form = None
29
30 return render(request, 'wagtailadmin/account/change_password.html', {
31 'form': form,
32 'can_change_password': can_change_password,
33 })
34
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/wagtail/wagtailadmin/views/account.py b/wagtail/wagtailadmin/views/account.py
--- a/wagtail/wagtailadmin/views/account.py
+++ b/wagtail/wagtailadmin/views/account.py
@@ -2,14 +2,17 @@
from django.shortcuts import render, redirect
from django.contrib import messages
from django.contrib.auth.forms import SetPasswordForm
+from django.contrib.auth.decorators import permission_required
from django.utils.translation import ugettext as _
+@permission_required('wagtailadmin.access_admin')
def account(request):
return render(request, 'wagtailadmin/account/account.html', {
'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),
})
+@permission_required('wagtailadmin.access_admin')
def change_password(request):
can_change_password = request.user.has_usable_password()
|
{"golden_diff": "diff --git a/wagtail/wagtailadmin/views/account.py b/wagtail/wagtailadmin/views/account.py\n--- a/wagtail/wagtailadmin/views/account.py\n+++ b/wagtail/wagtailadmin/views/account.py\n@@ -2,14 +2,17 @@\n from django.shortcuts import render, redirect\n from django.contrib import messages\n from django.contrib.auth.forms import SetPasswordForm\n+from django.contrib.auth.decorators import permission_required\n from django.utils.translation import ugettext as _ \n \n+@permission_required('wagtailadmin.access_admin')\n def account(request):\n return render(request, 'wagtailadmin/account/account.html', {\n 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),\n })\n \n \n+@permission_required('wagtailadmin.access_admin')\n def change_password(request):\n can_change_password = request.user.has_usable_password()\n", "issue": "'AnonymousUser' object has no attribute 'has_usable_password'\nVisiting http://localhost:8000/admin/account/ as not logged in user gives the AttributeError above...\n\nTraceback;\n\n``` Environment:\n\n\nRequest Method: GET\nRequest URL: http://localhost:8000/admin/account/\n\nDjango Version: 1.6.2\nPython Version: 2.7.6\nInstalled Applications:\n('django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'south',\n 'compressor',\n 'taggit',\n 'modelcluster',\n 'django.contrib.admin',\n 'wagtail.wagtailcore',\n 'wagtail.wagtailadmin',\n 'wagtail.wagtaildocs',\n 'wagtail.wagtailsnippets',\n 'wagtail.wagtailusers',\n 'wagtail.wagtailimages',\n 'wagtail.wagtailembeds',\n 'wagtail.wagtailsearch',\n 'wagtail.wagtailredirects')\nInstalled Middleware:\n('django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.common.CommonMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n 'wagtail.wagtailcore.middleware.SiteMiddleware',\n 'wagtail.wagtailredirects.middleware.RedirectMiddleware')\n\n\nTraceback:\nFile \"C:\\Users\\drager\\developing\\django-env\\lib\\site-packages\\django\\core\\handlers\\base.py\" in get_response\n 114. response = wrapped_callback(request, *callback_args, **callback_kwargs)\nFile \"c:\\users\\drager\\developing\\django-env\\src\\wagtail\\wagtail\\wagtailadmin\\views\\account.py\" in account\n 9. 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),\nFile \"C:\\Users\\drager\\developing\\django-env\\lib\\site-packages\\django\\utils\\functional.py\" in inner\n 214. 
return func(self._wrapped, *args)\n\nException Type: AttributeError at /admin/account/\nException Value: 'AnonymousUser' object has no attribute 'has_usable_password'\n```\n\n", "before_files": [{"content": "from django.conf import settings\nfrom django.shortcuts import render, redirect\nfrom django.contrib import messages\nfrom django.contrib.auth.forms import SetPasswordForm\nfrom django.utils.translation import ugettext as _ \n\ndef account(request):\n return render(request, 'wagtailadmin/account/account.html', {\n 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),\n })\n\n\ndef change_password(request):\n can_change_password = request.user.has_usable_password()\n\n if can_change_password:\n if request.POST:\n form = SetPasswordForm(request.user, request.POST)\n\n if form.is_valid():\n form.save()\n\n messages.success(request, _(\"Your password has been changed successfully!\"))\n return redirect('wagtailadmin_account')\n else:\n form = SetPasswordForm(request.user)\n else:\n form = None\n\n return render(request, 'wagtailadmin/account/change_password.html', {\n 'form': form,\n 'can_change_password': can_change_password,\n })\n", "path": "wagtail/wagtailadmin/views/account.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.shortcuts import render, redirect\nfrom django.contrib import messages\nfrom django.contrib.auth.forms import SetPasswordForm\nfrom django.contrib.auth.decorators import permission_required\nfrom django.utils.translation import ugettext as _ \n\n@permission_required('wagtailadmin.access_admin')\ndef account(request):\n return render(request, 'wagtailadmin/account/account.html', {\n 'show_change_password': getattr(settings, 'WAGTAIL_PASSWORD_MANAGEMENT_ENABLED', True) and request.user.has_usable_password(),\n })\n\n\n@permission_required('wagtailadmin.access_admin')\ndef change_password(request):\n can_change_password = request.user.has_usable_password()\n\n if can_change_password:\n if request.POST:\n form = SetPasswordForm(request.user, request.POST)\n\n if form.is_valid():\n form.save()\n\n messages.success(request, _(\"Your password has been changed successfully!\"))\n return redirect('wagtailadmin_account')\n else:\n form = SetPasswordForm(request.user)\n else:\n form = None\n\n return render(request, 'wagtailadmin/account/change_password.html', {\n 'form': form,\n 'can_change_password': can_change_password,\n })\n", "path": "wagtail/wagtailadmin/views/account.py"}]}
| 1,044 | 199 |
gh_patches_debug_29670
|
rasdani/github-patches
|
git_diff
|
streamlink__streamlink-3457
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
https://www.micous.com/ Site closing webcasts
<!--
Thanks for reporting a plugin issue!
USE THE TEMPLATE. Otherwise your plugin issue may be rejected.
First, see the contribution guidelines:
https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink
Also check the list of open and closed plugin issues:
https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22
Please see the text preview to avoid unnecessary formatting errors.
-->
## Plugin Issue
<!-- Replace the space character between the square brackets with an x in order to check the boxes -->
- [ ] This is a plugin issue and I have read the contribution guidelines.
- [ ] I am using the latest development version from the master branch.
### Description
I was checking the m3u8 codes for Tango Live so that it would benefit people and maybe we can bring plugins but now I have checked and the mico.us plugin does not work because mico.us is no longer working. It looks like it has turned off broadcasting and playback, I guess only broadcasts can be followed through the application and need to be checked.
<!-- Explain the plugin issue as thoroughly as you can. -->
### Reproduction steps / Explicit stream URLs to test
this? Please note the exact steps below using the list format supplied. If you need more steps please add them. -->
1. ...https://www.micous.com/
2. ...
3. ...
### Log output
<!--
TEXT LOG OUTPUT IS REQUIRED for a plugin issue!
Use the `--loglevel debug` parameter and avoid using parameters which suppress log output.
https://streamlink.github.io/cli.html#cmdoption-l
Make sure to **remove usernames and passwords**
You can copy the output to https://gist.github.com/ or paste it below.
Don't post screenshots of the log output and instead copy the text from your terminal application.
-->
```
REPLACE THIS TEXT WITH THE LOG OUTPUT
```
### Additional comments, etc.
[Love Streamlink? Please consider supporting our collective. Thanks!](https://opencollective.com/streamlink/donate)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/mico.py`
Content:
```
1 import logging
2 import re
3
4 from streamlink.plugin import Plugin
5 from streamlink.plugin.api import validate
6 from streamlink.stream import HLSStream
7 from streamlink.utils import parse_json
8 from streamlink.utils.url import update_scheme
9
10 log = logging.getLogger(__name__)
11
12
13 class Mico(Plugin):
14 author = None
15 category = None
16 title = None
17
18 url_re = re.compile(r'https?://(?:www\.)?micous\.com/live/\d+')
19 json_data_re = re.compile(r'win._profile\s*=\s*({.*})')
20
21 _json_data_schema = validate.Schema(
22 validate.transform(json_data_re.search),
23 validate.any(None, validate.all(
24 validate.get(1),
25 validate.transform(parse_json),
26 validate.any(None, validate.all({
27 'mico_id': int,
28 'nickname': validate.text,
29 'h5_url': validate.all(
30 validate.transform(lambda x: update_scheme('http:', x)),
31 validate.url(),
32 ),
33 'is_live': bool,
34 })),
35 )),
36 )
37
38 @classmethod
39 def can_handle_url(cls, url):
40 return cls.url_re.match(url) is not None
41
42 def get_author(self):
43 if self.author is not None:
44 return self.author
45
46 def get_category(self):
47 if self.category is not None:
48 return self.category
49
50 def get_title(self):
51 if self.title is not None:
52 return self.title
53
54 def _get_streams(self):
55 json_data = self.session.http.get(self.url, schema=self._json_data_schema)
56
57 if not json_data:
58 log.error('Failed to get JSON data')
59 return
60
61 if not json_data['is_live']:
62 log.info('This stream is no longer online')
63 return
64
65 self.author = json_data['mico_id']
66 self.category = 'Live'
67 self.title = json_data['nickname']
68
69 return HLSStream.parse_variant_playlist(self.session, json_data['h5_url'])
70
71
72 __plugin__ = Mico
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/streamlink/plugins/mico.py b/src/streamlink/plugins/mico.py
deleted file mode 100644
--- a/src/streamlink/plugins/mico.py
+++ /dev/null
@@ -1,72 +0,0 @@
-import logging
-import re
-
-from streamlink.plugin import Plugin
-from streamlink.plugin.api import validate
-from streamlink.stream import HLSStream
-from streamlink.utils import parse_json
-from streamlink.utils.url import update_scheme
-
-log = logging.getLogger(__name__)
-
-
-class Mico(Plugin):
- author = None
- category = None
- title = None
-
- url_re = re.compile(r'https?://(?:www\.)?micous\.com/live/\d+')
- json_data_re = re.compile(r'win._profile\s*=\s*({.*})')
-
- _json_data_schema = validate.Schema(
- validate.transform(json_data_re.search),
- validate.any(None, validate.all(
- validate.get(1),
- validate.transform(parse_json),
- validate.any(None, validate.all({
- 'mico_id': int,
- 'nickname': validate.text,
- 'h5_url': validate.all(
- validate.transform(lambda x: update_scheme('http:', x)),
- validate.url(),
- ),
- 'is_live': bool,
- })),
- )),
- )
-
- @classmethod
- def can_handle_url(cls, url):
- return cls.url_re.match(url) is not None
-
- def get_author(self):
- if self.author is not None:
- return self.author
-
- def get_category(self):
- if self.category is not None:
- return self.category
-
- def get_title(self):
- if self.title is not None:
- return self.title
-
- def _get_streams(self):
- json_data = self.session.http.get(self.url, schema=self._json_data_schema)
-
- if not json_data:
- log.error('Failed to get JSON data')
- return
-
- if not json_data['is_live']:
- log.info('This stream is no longer online')
- return
-
- self.author = json_data['mico_id']
- self.category = 'Live'
- self.title = json_data['nickname']
-
- return HLSStream.parse_variant_playlist(self.session, json_data['h5_url'])
-
-
-__plugin__ = Mico
|
{"golden_diff": "diff --git a/src/streamlink/plugins/mico.py b/src/streamlink/plugins/mico.py\ndeleted file mode 100644\n--- a/src/streamlink/plugins/mico.py\n+++ /dev/null\n@@ -1,72 +0,0 @@\n-import logging\n-import re\n-\n-from streamlink.plugin import Plugin\n-from streamlink.plugin.api import validate\n-from streamlink.stream import HLSStream\n-from streamlink.utils import parse_json\n-from streamlink.utils.url import update_scheme\n-\n-log = logging.getLogger(__name__)\n-\n-\n-class Mico(Plugin):\n- author = None\n- category = None\n- title = None\n-\n- url_re = re.compile(r'https?://(?:www\\.)?micous\\.com/live/\\d+')\n- json_data_re = re.compile(r'win._profile\\s*=\\s*({.*})')\n-\n- _json_data_schema = validate.Schema(\n- validate.transform(json_data_re.search),\n- validate.any(None, validate.all(\n- validate.get(1),\n- validate.transform(parse_json),\n- validate.any(None, validate.all({\n- 'mico_id': int,\n- 'nickname': validate.text,\n- 'h5_url': validate.all(\n- validate.transform(lambda x: update_scheme('http:', x)),\n- validate.url(),\n- ),\n- 'is_live': bool,\n- })),\n- )),\n- )\n-\n- @classmethod\n- def can_handle_url(cls, url):\n- return cls.url_re.match(url) is not None\n-\n- def get_author(self):\n- if self.author is not None:\n- return self.author\n-\n- def get_category(self):\n- if self.category is not None:\n- return self.category\n-\n- def get_title(self):\n- if self.title is not None:\n- return self.title\n-\n- def _get_streams(self):\n- json_data = self.session.http.get(self.url, schema=self._json_data_schema)\n-\n- if not json_data:\n- log.error('Failed to get JSON data')\n- return\n-\n- if not json_data['is_live']:\n- log.info('This stream is no longer online')\n- return\n-\n- self.author = json_data['mico_id']\n- self.category = 'Live'\n- self.title = json_data['nickname']\n-\n- return HLSStream.parse_variant_playlist(self.session, json_data['h5_url'])\n-\n-\n-__plugin__ = Mico\n", "issue": "https://www.micous.com/ Site closing webcasts\n<!--\r\nThanks for reporting a plugin issue!\r\nUSE THE TEMPLATE. Otherwise your plugin issue may be rejected.\r\n\r\nFirst, see the contribution guidelines:\r\nhttps://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink\r\n\r\nAlso check the list of open and closed plugin issues:\r\nhttps://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22\r\n\r\nPlease see the text preview to avoid unnecessary formatting errors.\r\n-->\r\n\r\n\r\n## Plugin Issue\r\n\r\n<!-- Replace the space character between the square brackets with an x in order to check the boxes -->\r\n- [ ] This is a plugin issue and I have read the contribution guidelines.\r\n- [ ] I am using the latest development version from the master branch.\r\n\r\n\r\n### Description\r\nI was checking the m3u8 codes for Tango Live so that it would benefit people and maybe we can bring plugins but now I have checked and the mico.us plugin does not work because mico.us is no longer working. It looks like it has turned off broadcasting and playback, I guess only broadcasts can be followed through the application and need to be checked. \r\n<!-- Explain the plugin issue as thoroughly as you can. -->\r\n\r\n\r\n### Reproduction steps / Explicit stream URLs to test\r\nthis? Please note the exact steps below using the list format supplied. If you need more steps please add them. -->\r\n\r\n1. ...https://www.micous.com/\r\n2. ...\r\n3. 
...\r\n\r\n\r\n### Log output\r\n\r\n<!--\r\nTEXT LOG OUTPUT IS REQUIRED for a plugin issue!\r\nUse the `--loglevel debug` parameter and avoid using parameters which suppress log output.\r\nhttps://streamlink.github.io/cli.html#cmdoption-l\r\n\r\nMake sure to **remove usernames and passwords**\r\nYou can copy the output to https://gist.github.com/ or paste it below.\r\n\r\nDon't post screenshots of the log output and instead copy the text from your terminal application.\r\n-->\r\n\r\n```\r\nREPLACE THIS TEXT WITH THE LOG OUTPUT\r\n```\r\n\r\n\r\n### Additional comments, etc.\r\n\r\n\r\n\r\n[Love Streamlink? Please consider supporting our collective. Thanks!](https://opencollective.com/streamlink/donate)\r\n\n", "before_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\nfrom streamlink.utils import parse_json\nfrom streamlink.utils.url import update_scheme\n\nlog = logging.getLogger(__name__)\n\n\nclass Mico(Plugin):\n author = None\n category = None\n title = None\n\n url_re = re.compile(r'https?://(?:www\\.)?micous\\.com/live/\\d+')\n json_data_re = re.compile(r'win._profile\\s*=\\s*({.*})')\n\n _json_data_schema = validate.Schema(\n validate.transform(json_data_re.search),\n validate.any(None, validate.all(\n validate.get(1),\n validate.transform(parse_json),\n validate.any(None, validate.all({\n 'mico_id': int,\n 'nickname': validate.text,\n 'h5_url': validate.all(\n validate.transform(lambda x: update_scheme('http:', x)),\n validate.url(),\n ),\n 'is_live': bool,\n })),\n )),\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return cls.url_re.match(url) is not None\n\n def get_author(self):\n if self.author is not None:\n return self.author\n\n def get_category(self):\n if self.category is not None:\n return self.category\n\n def get_title(self):\n if self.title is not None:\n return self.title\n\n def _get_streams(self):\n json_data = self.session.http.get(self.url, schema=self._json_data_schema)\n\n if not json_data:\n log.error('Failed to get JSON data')\n return\n\n if not json_data['is_live']:\n log.info('This stream is no longer online')\n return\n\n self.author = json_data['mico_id']\n self.category = 'Live'\n self.title = json_data['nickname']\n\n return HLSStream.parse_variant_playlist(self.session, json_data['h5_url'])\n\n\n__plugin__ = Mico\n", "path": "src/streamlink/plugins/mico.py"}], "after_files": [{"content": null, "path": "src/streamlink/plugins/mico.py"}]}
| 1,294 | 540 |
gh_patches_debug_30756
|
rasdani/github-patches
|
git_diff
|
apache__airflow-19592
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cast to string in ds macro functions
As already written in issue https://github.com/apache/airflow/issues/19241, the strptime function requires a string but gets a Proxy when the variables ds/next_ds (the types of these variables changed in version 2.2.0) are passed.
This change will make the functions `ds_add` and `ds_format` backward compatible.
<!--
Thank you for contributing! Please make sure that your code changes
are covered with tests. And in case of new features or big changes
remember to adjust the documentation.
Feel free to ping committers for the review!
In case of existing issue, reference it using one of the following:
closes: #ISSUE
related: #ISSUE
How to write a good git commit message:
http://chris.beams.io/posts/git-commit/
-->
---
**^ Add meaningful description above**
Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
next_ds changed to a Proxy and cannot be used in the ds_add macro function
### Apache Airflow version
2.2.0 (latest released)
### Operating System
Ubuntu
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### What happened
Tried to use this code:
`some_variable='{{macros.ds_format(macros.ds_add(next_ds, '
'(ti.start_date - ti.execution_date).days), '
'"%Y-%m-%d", "%Y-%m-%d 21:00:00")}}')`
but got this error:
`strptime() argument 1 must be str, not Proxy`
because the `next_ds` variable changed to proxy.
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `airflow/macros/__init__.py`
Content:
```
1 #
2 # Licensed to the Apache Software Foundation (ASF) under one
3 # or more contributor license agreements. See the NOTICE file
4 # distributed with this work for additional information
5 # regarding copyright ownership. The ASF licenses this file
6 # to you under the Apache License, Version 2.0 (the
7 # "License"); you may not use this file except in compliance
8 # with the License. You may obtain a copy of the License at
9 #
10 # http://www.apache.org/licenses/LICENSE-2.0
11 #
12 # Unless required by applicable law or agreed to in writing,
13 # software distributed under the License is distributed on an
14 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
15 # KIND, either express or implied. See the License for the
16 # specific language governing permissions and limitations
17 # under the License.
18 import time # noqa
19 import uuid # noqa
20 from datetime import datetime, timedelta
21 from random import random # noqa
22
23 import dateutil # noqa
24
25 from airflow.macros import hive # noqa
26
27
28 def ds_add(ds, days):
29 """
30 Add or subtract days from a YYYY-MM-DD
31
32 :param ds: anchor date in ``YYYY-MM-DD`` format to add to
33 :type ds: str
34 :param days: number of days to add to the ds, you can use negative values
35 :type days: int
36
37 >>> ds_add('2015-01-01', 5)
38 '2015-01-06'
39 >>> ds_add('2015-01-06', -5)
40 '2015-01-01'
41 """
42 ds = datetime.strptime(ds, '%Y-%m-%d')
43 if days:
44 ds = ds + timedelta(days)
45 return ds.isoformat()[:10]
46
47
48 def ds_format(ds, input_format, output_format):
49 """
50 Takes an input string and outputs another string
51 as specified in the output format
52
53 :param ds: input string which contains a date
54 :type ds: str
55 :param input_format: input string format. E.g. %Y-%m-%d
56 :type input_format: str
57 :param output_format: output string format E.g. %Y-%m-%d
58 :type output_format: str
59
60 >>> ds_format('2015-01-01', "%Y-%m-%d", "%m-%d-%y")
61 '01-01-15'
62 >>> ds_format('1/5/2015', "%m/%d/%Y", "%Y-%m-%d")
63 '2015-01-05'
64 """
65 return datetime.strptime(ds, input_format).strftime(output_format)
66
67
68 def datetime_diff_for_humans(dt, since=None):
69 """
70 Return a human-readable/approximate difference between two datetimes, or
71 one and now.
72
73 :param dt: The datetime to display the diff for
74 :type dt: datetime.datetime
75 :param since: When to display the date from. If ``None`` then the diff is
76 between ``dt`` and now.
77 :type since: None or datetime.datetime
78 :rtype: str
79 """
80 import pendulum
81
82 return pendulum.instance(dt).diff_for_humans(since)
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/airflow/macros/__init__.py b/airflow/macros/__init__.py
--- a/airflow/macros/__init__.py
+++ b/airflow/macros/__init__.py
@@ -19,13 +19,17 @@
import uuid # noqa
from datetime import datetime, timedelta
from random import random # noqa
+from typing import Any, Optional, Union
import dateutil # noqa
+import lazy_object_proxy
from airflow.macros import hive # noqa
+TemplateStringInput = Union[str, lazy_object_proxy.Proxy]
-def ds_add(ds, days):
+
+def ds_add(ds: TemplateStringInput, days: int) -> str:
"""
Add or subtract days from a YYYY-MM-DD
@@ -39,13 +43,13 @@
>>> ds_add('2015-01-06', -5)
'2015-01-01'
"""
- ds = datetime.strptime(ds, '%Y-%m-%d')
- if days:
- ds = ds + timedelta(days)
- return ds.isoformat()[:10]
+ if not days:
+ return str(ds)
+ dt = datetime.strptime(str(ds), "%Y-%m-%d") + timedelta(days=days)
+ return dt.strftime("%Y-%m-%d")
-def ds_format(ds, input_format, output_format):
+def ds_format(ds: TemplateStringInput, input_format: str, output_format: str) -> str:
"""
Takes an input string and outputs another string
as specified in the output format
@@ -62,10 +66,10 @@
>>> ds_format('1/5/2015', "%m/%d/%Y", "%Y-%m-%d")
'2015-01-05'
"""
- return datetime.strptime(ds, input_format).strftime(output_format)
+ return datetime.strptime(str(ds), input_format).strftime(output_format)
-def datetime_diff_for_humans(dt, since=None):
+def datetime_diff_for_humans(dt: Any, since: Optional[datetime] = None) -> str:
"""
Return a human-readable/approximate difference between two datetimes, or
one and now.
|
{"golden_diff": "diff --git a/airflow/macros/__init__.py b/airflow/macros/__init__.py\n--- a/airflow/macros/__init__.py\n+++ b/airflow/macros/__init__.py\n@@ -19,13 +19,17 @@\n import uuid # noqa\n from datetime import datetime, timedelta\n from random import random # noqa\n+from typing import Any, Optional, Union\n \n import dateutil # noqa\n+import lazy_object_proxy\n \n from airflow.macros import hive # noqa\n \n+TemplateStringInput = Union[str, lazy_object_proxy.Proxy]\n \n-def ds_add(ds, days):\n+\n+def ds_add(ds: TemplateStringInput, days: int) -> str:\n \"\"\"\n Add or subtract days from a YYYY-MM-DD\n \n@@ -39,13 +43,13 @@\n >>> ds_add('2015-01-06', -5)\n '2015-01-01'\n \"\"\"\n- ds = datetime.strptime(ds, '%Y-%m-%d')\n- if days:\n- ds = ds + timedelta(days)\n- return ds.isoformat()[:10]\n+ if not days:\n+ return str(ds)\n+ dt = datetime.strptime(str(ds), \"%Y-%m-%d\") + timedelta(days=days)\n+ return dt.strftime(\"%Y-%m-%d\")\n \n \n-def ds_format(ds, input_format, output_format):\n+def ds_format(ds: TemplateStringInput, input_format: str, output_format: str) -> str:\n \"\"\"\n Takes an input string and outputs another string\n as specified in the output format\n@@ -62,10 +66,10 @@\n >>> ds_format('1/5/2015', \"%m/%d/%Y\", \"%Y-%m-%d\")\n '2015-01-05'\n \"\"\"\n- return datetime.strptime(ds, input_format).strftime(output_format)\n+ return datetime.strptime(str(ds), input_format).strftime(output_format)\n \n \n-def datetime_diff_for_humans(dt, since=None):\n+def datetime_diff_for_humans(dt: Any, since: Optional[datetime] = None) -> str:\n \"\"\"\n Return a human-readable/approximate difference between two datetimes, or\n one and now.\n", "issue": "Cast to string in ds macro functions\n\r\nAs already written in this issue https://github.com/apache/airflow/issues/19241 strptime function required string, but got proxy if the variables ds/next_ds (the types of these variables changed on version 2.2.0) sent.\r\nThis change will make the function `ds_add` and `ds_format` backward compatible.\r\n\r\n<!--\r\nThank you for contributing! Please make sure that your code changes\r\nare covered with tests. 
And in case of new features or big changes\r\nremember to adjust the documentation.\r\n\r\nFeel free to ping committers for the review!\r\n\r\nIn case of existing issue, reference it using one of the following:\r\n\r\ncloses: #ISSUE\r\nrelated: #ISSUE\r\n\r\nHow to write a good git commit message:\r\nhttp://chris.beams.io/posts/git-commit/\r\n-->\r\n\r\n---\r\n**^ Add meaningful description above**\r\n\r\nRead the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.\r\nIn case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.\r\nIn case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).\r\nIn case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).\r\n\nnext_ds changed to proxy and it cannot be used in ds_add macro function\n### Apache Airflow version\n\n2.2.0 (latest released)\n\n### Operating System\n\nUbuntu\n\n### Versions of Apache Airflow Providers\n\n_No response_\n\n### Deployment\n\nDocker-Compose\n\n### Deployment details\n\n_No response_\n\n### What happened\n\nTried to use this this code:\r\n`some_variable='{{macros.ds_format(macros.ds_add(next_ds, '\r\n '(ti.start_date - ti.execution_date).days), '\r\n '\"%Y-%m-%d\", \"%Y-%m-%d 21:00:00\")}}')`\r\nbut got this error:\r\n`strptime() argument 1 must be str, not Proxy`\r\nbecause the `next_ds` variable changed to proxy.\n\n### What you expected to happen\n\n_No response_\n\n### How to reproduce\n\n_No response_\n\n### Anything else\n\n_No response_\n\n### Are you willing to submit PR?\n\n- [ ] Yes I am willing to submit a PR!\n\n### Code of Conduct\n\n- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)\n\n", "before_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. 
See the License for the\n# specific language governing permissions and limitations\n# under the License.\nimport time # noqa\nimport uuid # noqa\nfrom datetime import datetime, timedelta\nfrom random import random # noqa\n\nimport dateutil # noqa\n\nfrom airflow.macros import hive # noqa\n\n\ndef ds_add(ds, days):\n \"\"\"\n Add or subtract days from a YYYY-MM-DD\n\n :param ds: anchor date in ``YYYY-MM-DD`` format to add to\n :type ds: str\n :param days: number of days to add to the ds, you can use negative values\n :type days: int\n\n >>> ds_add('2015-01-01', 5)\n '2015-01-06'\n >>> ds_add('2015-01-06', -5)\n '2015-01-01'\n \"\"\"\n ds = datetime.strptime(ds, '%Y-%m-%d')\n if days:\n ds = ds + timedelta(days)\n return ds.isoformat()[:10]\n\n\ndef ds_format(ds, input_format, output_format):\n \"\"\"\n Takes an input string and outputs another string\n as specified in the output format\n\n :param ds: input string which contains a date\n :type ds: str\n :param input_format: input string format. E.g. %Y-%m-%d\n :type input_format: str\n :param output_format: output string format E.g. %Y-%m-%d\n :type output_format: str\n\n >>> ds_format('2015-01-01', \"%Y-%m-%d\", \"%m-%d-%y\")\n '01-01-15'\n >>> ds_format('1/5/2015', \"%m/%d/%Y\", \"%Y-%m-%d\")\n '2015-01-05'\n \"\"\"\n return datetime.strptime(ds, input_format).strftime(output_format)\n\n\ndef datetime_diff_for_humans(dt, since=None):\n \"\"\"\n Return a human-readable/approximate difference between two datetimes, or\n one and now.\n\n :param dt: The datetime to display the diff for\n :type dt: datetime.datetime\n :param since: When to display the date from. If ``None`` then the diff is\n between ``dt`` and now.\n :type since: None or datetime.datetime\n :rtype: str\n \"\"\"\n import pendulum\n\n return pendulum.instance(dt).diff_for_humans(since)\n", "path": "airflow/macros/__init__.py"}], "after_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. 
See the License for the\n# specific language governing permissions and limitations\n# under the License.\nimport time # noqa\nimport uuid # noqa\nfrom datetime import datetime, timedelta\nfrom random import random # noqa\nfrom typing import Any, Optional, Union\n\nimport dateutil # noqa\nimport lazy_object_proxy\n\nfrom airflow.macros import hive # noqa\n\nTemplateStringInput = Union[str, lazy_object_proxy.Proxy]\n\n\ndef ds_add(ds: TemplateStringInput, days: int) -> str:\n \"\"\"\n Add or subtract days from a YYYY-MM-DD\n\n :param ds: anchor date in ``YYYY-MM-DD`` format to add to\n :type ds: str\n :param days: number of days to add to the ds, you can use negative values\n :type days: int\n\n >>> ds_add('2015-01-01', 5)\n '2015-01-06'\n >>> ds_add('2015-01-06', -5)\n '2015-01-01'\n \"\"\"\n if not days:\n return str(ds)\n dt = datetime.strptime(str(ds), \"%Y-%m-%d\") + timedelta(days=days)\n return dt.strftime(\"%Y-%m-%d\")\n\n\ndef ds_format(ds: TemplateStringInput, input_format: str, output_format: str) -> str:\n \"\"\"\n Takes an input string and outputs another string\n as specified in the output format\n\n :param ds: input string which contains a date\n :type ds: str\n :param input_format: input string format. E.g. %Y-%m-%d\n :type input_format: str\n :param output_format: output string format E.g. %Y-%m-%d\n :type output_format: str\n\n >>> ds_format('2015-01-01', \"%Y-%m-%d\", \"%m-%d-%y\")\n '01-01-15'\n >>> ds_format('1/5/2015', \"%m/%d/%Y\", \"%Y-%m-%d\")\n '2015-01-05'\n \"\"\"\n return datetime.strptime(str(ds), input_format).strftime(output_format)\n\n\ndef datetime_diff_for_humans(dt: Any, since: Optional[datetime] = None) -> str:\n \"\"\"\n Return a human-readable/approximate difference between two datetimes, or\n one and now.\n\n :param dt: The datetime to display the diff for\n :type dt: datetime.datetime\n :param since: When to display the date from. If ``None`` then the diff is\n between ``dt`` and now.\n :type since: None or datetime.datetime\n :rtype: str\n \"\"\"\n import pendulum\n\n return pendulum.instance(dt).diff_for_humans(since)\n", "path": "airflow/macros/__init__.py"}]}
| 1,727 | 504 |
gh_patches_debug_22849
|
rasdani/github-patches
|
git_diff
|
uclapi__uclapi-51
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error responses missing OK property and correct HTTP Status Code
Hello there,
when making an invalid request, say, an invalid token, an error is given but the HTTP code is still `200 OK` when it should give an authentication error (a `400 Bad Request` or even a `401 Unauthorized`).
Request:
```http
GET /roombookings/rooms?token=uclapi-5d58c3c4e6bf9c-c2910ad3b6e054-7ef60f44f1c14f-a05147bfd17fdb HTTP/1.1
Host: uclapi.com
User-Agent: Paw/3.0.16 (Macintosh; OS X/10.12.4) NSURLConnection/1349.63
Cookie: AWSALB=8q9+FZmk9TOAZ/GG2tFsVUuckVO8STONoiGDn6/jd9FBEwFi5Ke/kvz+hIjdCmAwtpOIXGlnudL7LU3AaVxKt1sHWYGjLJnoMypHqu53r7Ub4b73trfiMx5NMVZ6
```
Response:
```http
HTTP/1.1 200 OK
Connection: keep-alive
Content-Type: application/json
Allow: GET, OPTIONS
Server: nginx/1.11.10
Set-Cookie: AWSALB=U2qTk7k+LrfxXibfskgPN1RoWvDQqKhaSBmgvUMOa3AcYgag1BZMcyz+5h5rQ2qhQc+Cm2PYzfjKV466PHcc9dleZHkLYE5O8d5q2WO+7WVbQT6VQMSSSHpOh3xy; Expires=Sat, 11 Mar 2017 14:18:40 GMT; Path=/
Transfer-Encoding: Identity
Date: Sat, 04 Mar 2017 14:18:40 GMT
X-Frame-Options: SAMEORIGIN
Vary: Accept, Cookie
{"error": "Token does not exist"}
```
Furthermore, when a request is successful, the first property is an `"ok": true`; however, with invalid requests, the `"ok": false` is missing from the error response.
Actual:
```json
{
"error": "Token does not exist"
}
```
Expected:
```json
{
"ok": false,
"error": "Token does not exist"
}
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `backend/uclapi/roombookings/token_auth.py`
Content:
```
1 from rest_framework.response import Response
2 from dashboard.models import App
3 from django.core.exceptions import ObjectDoesNotExist
4 from django.http import JsonResponse
5
6
7 def does_token_exist(view_func):
8 def wrapped(request, *args, **kwargs):
9 token = request.GET.get("token")
10
11 if not token:
12 return JsonResponse({
13 "error": "No token provided"
14 })
15
16 try:
17 App.objects.get(api_token=token)
18 except ObjectDoesNotExist:
19 return JsonResponse({
20 "error": "Token does not exist"
21 })
22
23 return view_func(request, *args, **kwargs)
24 return wrapped
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/backend/uclapi/roombookings/token_auth.py b/backend/uclapi/roombookings/token_auth.py
--- a/backend/uclapi/roombookings/token_auth.py
+++ b/backend/uclapi/roombookings/token_auth.py
@@ -1,4 +1,3 @@
-from rest_framework.response import Response
from dashboard.models import App
from django.core.exceptions import ObjectDoesNotExist
from django.http import JsonResponse
@@ -9,16 +8,22 @@
token = request.GET.get("token")
if not token:
- return JsonResponse({
+ response = JsonResponse({
+ "ok": False,
"error": "No token provided"
})
+ response.status_code = 400
+ return response
try:
App.objects.get(api_token=token)
except ObjectDoesNotExist:
- return JsonResponse({
+ response = JsonResponse({
+ "ok": False,
"error": "Token does not exist"
})
+ response.status_code = 400
+ return response
return view_func(request, *args, **kwargs)
return wrapped
|
{"golden_diff": "diff --git a/backend/uclapi/roombookings/token_auth.py b/backend/uclapi/roombookings/token_auth.py\n--- a/backend/uclapi/roombookings/token_auth.py\n+++ b/backend/uclapi/roombookings/token_auth.py\n@@ -1,4 +1,3 @@\n-from rest_framework.response import Response\n from dashboard.models import App\n from django.core.exceptions import ObjectDoesNotExist\n from django.http import JsonResponse\n@@ -9,16 +8,22 @@\n token = request.GET.get(\"token\")\n \n if not token:\n- return JsonResponse({\n+ response = JsonResponse({\n+ \"ok\": False,\n \"error\": \"No token provided\"\n })\n+ response.status_code = 400\n+ return response\n \n try:\n App.objects.get(api_token=token)\n except ObjectDoesNotExist:\n- return JsonResponse({\n+ response = JsonResponse({\n+ \"ok\": False,\n \"error\": \"Token does not exist\"\n })\n+ response.status_code = 400\n+ return response\n \n return view_func(request, *args, **kwargs)\n return wrapped\n", "issue": "Error responses missing OK property and correct HTTP Status Code\nHello there,\r\nwhen making an invalid request, say, an invalid token, an error is given but the HTTP code is still `200 OK` when it should give an authentication error (a `400 Bad Request` or even a `401 Unauthorized`).\r\n\r\nRequest:\r\n```http\r\nGET /roombookings/rooms?token=uclapi-5d58c3c4e6bf9c-c2910ad3b6e054-7ef60f44f1c14f-a05147bfd17fdb HTTP/1.1\r\nHost: uclapi.com\r\nUser-Agent: Paw/3.0.16 (Macintosh; OS X/10.12.4) NSURLConnection/1349.63\r\nCookie: AWSALB=8q9+FZmk9TOAZ/GG2tFsVUuckVO8STONoiGDn6/jd9FBEwFi5Ke/kvz+hIjdCmAwtpOIXGlnudL7LU3AaVxKt1sHWYGjLJnoMypHqu53r7Ub4b73trfiMx5NMVZ6\r\n```\r\n\r\nResponse:\r\n```http\r\nHTTP/1.1 200 OK\r\nConnection: keep-alive\r\nContent-Type: application/json\r\nAllow: GET, OPTIONS\r\nServer: nginx/1.11.10\r\nSet-Cookie: AWSALB=U2qTk7k+LrfxXibfskgPN1RoWvDQqKhaSBmgvUMOa3AcYgag1BZMcyz+5h5rQ2qhQc+Cm2PYzfjKV466PHcc9dleZHkLYE5O8d5q2WO+7WVbQT6VQMSSSHpOh3xy; Expires=Sat, 11 Mar 2017 14:18:40 GMT; Path=/\r\nTransfer-Encoding: Identity\r\nDate: Sat, 04 Mar 2017 14:18:40 GMT\r\nX-Frame-Options: SAMEORIGIN\r\nVary: Accept, Cookie\r\n\r\n{\"error\": \"Token does not exist\"}\r\n```\r\n\r\nFurthermore, when a request is successful, the first property is an `\"ok\": true`, however, with invalid requests, the `\"ok\": false` is missing from the error response.\r\n\r\nActual:\r\n```json\r\n{\r\n \"error\": \"Token does not exist\"\r\n}\r\n```\r\n\r\nExpected:\r\n```json\r\n{\r\n \"ok\": false,\r\n \"error\": \"Token does not exist\"\r\n}\r\n```\n", "before_files": [{"content": "from rest_framework.response import Response\nfrom dashboard.models import App\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.http import JsonResponse\n\n\ndef does_token_exist(view_func):\n def wrapped(request, *args, **kwargs):\n token = request.GET.get(\"token\")\n\n if not token:\n return JsonResponse({\n \"error\": \"No token provided\"\n })\n\n try:\n App.objects.get(api_token=token)\n except ObjectDoesNotExist:\n return JsonResponse({\n \"error\": \"Token does not exist\"\n })\n\n return view_func(request, *args, **kwargs)\n return wrapped\n", "path": "backend/uclapi/roombookings/token_auth.py"}], "after_files": [{"content": "from dashboard.models import App\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.http import JsonResponse\n\n\ndef does_token_exist(view_func):\n def wrapped(request, *args, **kwargs):\n token = request.GET.get(\"token\")\n\n if not token:\n response = JsonResponse({\n \"ok\": False,\n \"error\": \"No token 
provided\"\n })\n response.status_code = 400\n return response\n\n try:\n App.objects.get(api_token=token)\n except ObjectDoesNotExist:\n response = JsonResponse({\n \"ok\": False,\n \"error\": \"Token does not exist\"\n })\n response.status_code = 400\n return response\n\n return view_func(request, *args, **kwargs)\n return wrapped\n", "path": "backend/uclapi/roombookings/token_auth.py"}]}
| 1,034 | 246 |
gh_patches_debug_26093
|
rasdani/github-patches
|
git_diff
|
lightly-ai__lightly-618
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update docs version
Update docs version and link copyright in docs footer to lightly website
Closes #618
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/source/conf.py`
Content:
```
1 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file only contains a selection of the most common options. For a full
4 # list see the documentation:
5 # https://www.sphinx-doc.org/en/master/usage/configuration.html
6
7 # -- Path setup --------------------------------------------------------------
8
9 # If extensions (or modules to document with autodoc) are in another directory,
10 # add these directories to sys.path here. If the directory is relative to the
11 # documentation root, use os.path.abspath to make it absolute, like shown here.
12 #
13 import os
14 import sys
15 sys.path.insert(0, os.path.abspath('../..'))
16
17 import sphinx_rtd_theme
18
19
20 # -- Project information -----------------------------------------------------
21
22 project = 'lightly'
23 copyright = '2020, Lightly AG'
24 author = 'Philipp Wirth, Igor Susmelj'
25
26 # The full version, including alpha/beta/rc tags
27 release = '1.0.0'
28 master_doc = 'index'
29
30
31 # -- General configuration ---------------------------------------------------
32
33 # Add any Sphinx extension module names here, as strings. They can be
34 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
35 # ones.
36 extensions = [
37 "sphinx_rtd_theme",
38 #'sphinx.ext.napoleon',
39 "sphinx.ext.autosummary",
40 "sphinx_gallery.gen_gallery",
41 "sphinx_tabs.tabs",
42 "sphinx_copybutton",
43 ]
44
45 sphinx_gallery_conf = {
46 'examples_dirs': ['tutorials_source/package', 'tutorials_source/platform'],
47 'gallery_dirs': ['tutorials/package', 'tutorials/platform'], # path to where to save gallery generated output
48 'filename_pattern': '/tutorial_',
49 }
50
51 napoleon_google_docstring = True
52 napoleon_numpy_docstring = False
53 napoleon_include_init_with_doc = False
54 napoleon_include_private_with_doc = False
55 napoleon_include_special_with_doc = False
56 napoleon_use_admonition_for_examples = False
57 napoleon_use_admonition_for_notes = False
58 napoleon_use_admonition_for_references = False
59 napoleon_use_ivar = False
60 napoleon_use_param = False
61 napoleon_use_rtype = False
62 napoleon_type_aliases = None
63
64 # Add any paths that contain templates here, relative to this directory.
65 templates_path = ['_templates']
66
67 # List of patterns, relative to source directory, that match files and
68 # directories to ignore when looking for source files.
69 # This pattern also affects html_static_path and html_extra_path.
70 exclude_patterns = []
71
72
73 # -- Options for HTML output -------------------------------------------------
74
75 # The theme to use for HTML and HTML Help pages. See the documentation for
76 # a list of builtin themes.
77 #
78 html_theme = 'sphinx_rtd_theme'
79
80 html_theme_options = {
81 'collapse_navigation': False, # set to false to prevent menu item collapse
82 }
83
84 # Add any paths that contain custom static files (such as style sheets) here,
85 # relative to this directory. They are copied after the builtin static files,
86 # so a file named "default.css" will overwrite the builtin "default.css".
87 html_static_path = ['_static']
88
89 html_favicon = 'favicon.png'
90
91 #html_logo = "../logos/lightly_logo_crop.png"
92 def setup(app):
93 app.add_css_file('css/my-styles.css')
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/docs/source/conf.py b/docs/source/conf.py
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -15,16 +15,19 @@
sys.path.insert(0, os.path.abspath('../..'))
import sphinx_rtd_theme
+import lightly
# -- Project information -----------------------------------------------------
project = 'lightly'
-copyright = '2020, Lightly AG'
+copyright_year = '2020'
+copyright = "Lightly AG"
+website_url = 'https://www.lightly.ai/'
author = 'Philipp Wirth, Igor Susmelj'
# The full version, including alpha/beta/rc tags
-release = '1.0.0'
+release = lightly.__version__
master_doc = 'index'
@@ -79,6 +82,7 @@
html_theme_options = {
'collapse_navigation': False, # set to false to prevent menu item collapse
+ 'logo_only': True
}
# Add any paths that contain custom static files (such as style sheets) here,
@@ -88,6 +92,10 @@
html_favicon = 'favicon.png'
-#html_logo = "../logos/lightly_logo_crop.png"
-def setup(app):
- app.add_css_file('css/my-styles.css')
+html_logo = '../logos/lightly_logo_crop_white_text.png'
+
+# Exposes variables so that they can be used by django
+html_context = {
+ 'copyright_year': copyright_year,
+ 'website_url': website_url,
+}
\ No newline at end of file
|
{"golden_diff": "diff --git a/docs/source/conf.py b/docs/source/conf.py\n--- a/docs/source/conf.py\n+++ b/docs/source/conf.py\n@@ -15,16 +15,19 @@\n sys.path.insert(0, os.path.abspath('../..'))\n \n import sphinx_rtd_theme\n+import lightly\n \n \n # -- Project information -----------------------------------------------------\n \n project = 'lightly'\n-copyright = '2020, Lightly AG'\n+copyright_year = '2020'\n+copyright = \"Lightly AG\"\n+website_url = 'https://www.lightly.ai/'\n author = 'Philipp Wirth, Igor Susmelj'\n \n # The full version, including alpha/beta/rc tags\n-release = '1.0.0'\n+release = lightly.__version__\n master_doc = 'index'\n \n \n@@ -79,6 +82,7 @@\n \n html_theme_options = {\n 'collapse_navigation': False, # set to false to prevent menu item collapse\n+ 'logo_only': True\n }\n \n # Add any paths that contain custom static files (such as style sheets) here,\n@@ -88,6 +92,10 @@\n \n html_favicon = 'favicon.png'\n \n-#html_logo = \"../logos/lightly_logo_crop.png\"\n-def setup(app):\n- app.add_css_file('css/my-styles.css')\n+html_logo = '../logos/lightly_logo_crop_white_text.png'\n+\n+#\u00a0Exposes variables so that they can be used by django\n+html_context = {\n+ 'copyright_year': copyright_year,\n+ 'website_url': website_url,\n+}\n\\ No newline at end of file\n", "issue": "Update docs version\nUpdate docs version and link copyright in docs footer to lightly website\r\n\r\nCloses #618 \n", "before_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nsys.path.insert(0, os.path.abspath('../..'))\n\nimport sphinx_rtd_theme\n\n\n# -- Project information -----------------------------------------------------\n\nproject = 'lightly'\ncopyright = '2020, Lightly AG'\nauthor = 'Philipp Wirth, Igor Susmelj'\n\n# The full version, including alpha/beta/rc tags\nrelease = '1.0.0'\nmaster_doc = 'index'\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. 
They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx_rtd_theme\",\n #'sphinx.ext.napoleon',\n \"sphinx.ext.autosummary\",\n \"sphinx_gallery.gen_gallery\",\n \"sphinx_tabs.tabs\",\n \"sphinx_copybutton\",\n]\n\nsphinx_gallery_conf = {\n 'examples_dirs': ['tutorials_source/package', 'tutorials_source/platform'],\n 'gallery_dirs': ['tutorials/package', 'tutorials/platform'], # path to where to save gallery generated output\n 'filename_pattern': '/tutorial_',\n}\n\nnapoleon_google_docstring = True\nnapoleon_numpy_docstring = False\nnapoleon_include_init_with_doc = False\nnapoleon_include_private_with_doc = False\nnapoleon_include_special_with_doc = False\nnapoleon_use_admonition_for_examples = False\nnapoleon_use_admonition_for_notes = False\nnapoleon_use_admonition_for_references = False\nnapoleon_use_ivar = False\nnapoleon_use_param = False\nnapoleon_use_rtype = False\nnapoleon_type_aliases = None\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = 'sphinx_rtd_theme'\n\nhtml_theme_options = {\n 'collapse_navigation': False, # set to false to prevent menu item collapse\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_favicon = 'favicon.png'\n\n#html_logo = \"../logos/lightly_logo_crop.png\"\ndef setup(app):\n app.add_css_file('css/my-styles.css')\n", "path": "docs/source/conf.py"}], "after_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nsys.path.insert(0, os.path.abspath('../..'))\n\nimport sphinx_rtd_theme\nimport lightly\n\n\n# -- Project information -----------------------------------------------------\n\nproject = 'lightly'\ncopyright_year = '2020'\ncopyright = \"Lightly AG\"\nwebsite_url = 'https://www.lightly.ai/'\nauthor = 'Philipp Wirth, Igor Susmelj'\n\n# The full version, including alpha/beta/rc tags\nrelease = lightly.__version__\nmaster_doc = 'index'\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. 
They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx_rtd_theme\",\n #'sphinx.ext.napoleon',\n \"sphinx.ext.autosummary\",\n \"sphinx_gallery.gen_gallery\",\n \"sphinx_tabs.tabs\",\n \"sphinx_copybutton\",\n]\n\nsphinx_gallery_conf = {\n 'examples_dirs': ['tutorials_source/package', 'tutorials_source/platform'],\n 'gallery_dirs': ['tutorials/package', 'tutorials/platform'], # path to where to save gallery generated output\n 'filename_pattern': '/tutorial_',\n}\n\nnapoleon_google_docstring = True\nnapoleon_numpy_docstring = False\nnapoleon_include_init_with_doc = False\nnapoleon_include_private_with_doc = False\nnapoleon_include_special_with_doc = False\nnapoleon_use_admonition_for_examples = False\nnapoleon_use_admonition_for_notes = False\nnapoleon_use_admonition_for_references = False\nnapoleon_use_ivar = False\nnapoleon_use_param = False\nnapoleon_use_rtype = False\nnapoleon_type_aliases = None\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = 'sphinx_rtd_theme'\n\nhtml_theme_options = {\n 'collapse_navigation': False, # set to false to prevent menu item collapse\n 'logo_only': True\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_favicon = 'favicon.png'\n\nhtml_logo = '../logos/lightly_logo_crop_white_text.png'\n\n#\u00a0Exposes variables so that they can be used by django\nhtml_context = {\n 'copyright_year': copyright_year,\n 'website_url': website_url,\n}", "path": "docs/source/conf.py"}]}
| 1,161 | 350 |
gh_patches_debug_16649
|
rasdani/github-patches
|
git_diff
|
pyjanitor-devs__pyjanitor-1259
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Deprecate functions?
Central point to discuss functions to deprecate, if any?
- [x] `process_text` - `transform_columns` covers this very well
- [x] `impute` vs `fill_empty` - `impute` has the advantage of extra statistics functions (mean, mode, ...)
- [x] `rename_columns` - use pandas `rename`
- [x] `rename_column` - use `pd.rename`
- [x] `remove_columns` - use `pd.drop` or `select`
- [x] `filter_on` - use `query` or `select`
- [x] `fill_direction` - use `transform_columns` or `pd.DataFrame.assign`
- [x] `groupby_agg` - use `transform_columns` - once `by` is implemented
- [x] `then` - use `pd.DataFrame.pipe`
- [x] `to_datetime` - use `jn.transform_columns`
- [x] `pivot_wider` - use `pd.DataFrame.pivot`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `janitor/functions/then.py`
Content:
```
1 """Implementation source for `then`."""
2 from typing import Callable
3 import pandas_flavor as pf
4 import pandas as pd
5
6
7 @pf.register_dataframe_method
8 def then(df: pd.DataFrame, func: Callable) -> pd.DataFrame:
9 """Add an arbitrary function to run in the `pyjanitor` method chain.
10
11 This method does not mutate the original DataFrame.
12
13 Examples:
14 A trivial example using a lambda `func`.
15
16 >>> import pandas as pd
17 >>> import janitor
18 >>> (pd.DataFrame({"a": [1, 2, 3], "b": [7, 8, 9]})
19 ... .then(lambda df: df * 2))
20 a b
21 0 2 14
22 1 4 16
23 2 6 18
24
25 Args:
26 df: A pandas DataFrame.
27 func: A function you would like to run in the method chain.
28 It should take one parameter and return one parameter, each being
29 the DataFrame object. After that, do whatever you want in the
30 middle. Go crazy.
31
32 Returns:
33 A pandas DataFrame.
34 """
35 df = func(df)
36 return df
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/janitor/functions/then.py b/janitor/functions/then.py
--- a/janitor/functions/then.py
+++ b/janitor/functions/then.py
@@ -2,14 +2,25 @@
from typing import Callable
import pandas_flavor as pf
import pandas as pd
+from janitor.utils import refactored_function
[email protected]_dataframe_method
+@refactored_function(
+ message="This function will be deprecated in a 1.x release. "
+ "Kindly use `pd.DataFrame.pipe` instead."
+)
@pf.register_dataframe_method
def then(df: pd.DataFrame, func: Callable) -> pd.DataFrame:
"""Add an arbitrary function to run in the `pyjanitor` method chain.
This method does not mutate the original DataFrame.
+ !!!note
+
+ This function will be deprecated in a 1.x release.
+ Please use `pd.DataFrame.pipe` instead.
+
Examples:
A trivial example using a lambda `func`.
|
{"golden_diff": "diff --git a/janitor/functions/then.py b/janitor/functions/then.py\n--- a/janitor/functions/then.py\n+++ b/janitor/functions/then.py\n@@ -2,14 +2,25 @@\n from typing import Callable\n import pandas_flavor as pf\n import pandas as pd\n+from janitor.utils import refactored_function\n \n \[email protected]_dataframe_method\n+@refactored_function(\n+ message=\"This function will be deprecated in a 1.x release. \"\n+ \"Kindly use `pd.DataFrame.pipe` instead.\"\n+)\n @pf.register_dataframe_method\n def then(df: pd.DataFrame, func: Callable) -> pd.DataFrame:\n \"\"\"Add an arbitrary function to run in the `pyjanitor` method chain.\n \n This method does not mutate the original DataFrame.\n \n+ !!!note\n+\n+ This function will be deprecated in a 1.x release.\n+ Please use `pd.DataFrame.pipe` instead.\n+\n Examples:\n A trivial example using a lambda `func`.\n", "issue": "Deprecate functions ?\nCentral point to discuss functions to deprecate, if any?\r\n\r\n- [x] `process_text` - `transform_columns` covers this very well\r\n- [x] `impute` vs `fill_empty` - `impute` has the advantage of extra statistics functions (mean, mode, ...)\r\n- [x] `rename_columns` - use pandas `rename`\r\n- [x] `rename_column` - use `pd.rename`\r\n- [x] `remove_columns` - use `pd.drop` or `select`\r\n- [x] `filter_on` - use `query` or `select`\r\n- [x] `fill_direction` - use `transform_columns` or `pd.DataFrame.assign`\r\n- [x] `groupby_agg` - use `transform_columns` - once `by` is implemented\r\n- [x] `then` - use `pd.DataFrame.pipe`\r\n- [x] `to_datetime` - use `jn.transform_columns`\r\n- [x] `pivot_wider` - use `pd.DataFrame.pivot`\n", "before_files": [{"content": "\"\"\"Implementation source for `then`.\"\"\"\nfrom typing import Callable\nimport pandas_flavor as pf\nimport pandas as pd\n\n\[email protected]_dataframe_method\ndef then(df: pd.DataFrame, func: Callable) -> pd.DataFrame:\n \"\"\"Add an arbitrary function to run in the `pyjanitor` method chain.\n\n This method does not mutate the original DataFrame.\n\n Examples:\n A trivial example using a lambda `func`.\n\n >>> import pandas as pd\n >>> import janitor\n >>> (pd.DataFrame({\"a\": [1, 2, 3], \"b\": [7, 8, 9]})\n ... .then(lambda df: df * 2))\n a b\n 0 2 14\n 1 4 16\n 2 6 18\n\n Args:\n df: A pandas DataFrame.\n func: A function you would like to run in the method chain.\n It should take one parameter and return one parameter, each being\n the DataFrame object. After that, do whatever you want in the\n middle. Go crazy.\n\n Returns:\n A pandas DataFrame.\n \"\"\"\n df = func(df)\n return df\n", "path": "janitor/functions/then.py"}], "after_files": [{"content": "\"\"\"Implementation source for `then`.\"\"\"\nfrom typing import Callable\nimport pandas_flavor as pf\nimport pandas as pd\nfrom janitor.utils import refactored_function\n\n\[email protected]_dataframe_method\n@refactored_function(\n message=\"This function will be deprecated in a 1.x release. \"\n \"Kindly use `pd.DataFrame.pipe` instead.\"\n)\[email protected]_dataframe_method\ndef then(df: pd.DataFrame, func: Callable) -> pd.DataFrame:\n \"\"\"Add an arbitrary function to run in the `pyjanitor` method chain.\n\n This method does not mutate the original DataFrame.\n\n !!!note\n\n This function will be deprecated in a 1.x release.\n Please use `pd.DataFrame.pipe` instead.\n\n Examples:\n A trivial example using a lambda `func`.\n\n >>> import pandas as pd\n >>> import janitor\n >>> (pd.DataFrame({\"a\": [1, 2, 3], \"b\": [7, 8, 9]})\n ... 
.then(lambda df: df * 2))\n a b\n 0 2 14\n 1 4 16\n 2 6 18\n\n Args:\n df: A pandas DataFrame.\n func: A function you would like to run in the method chain.\n It should take one parameter and return one parameter, each being\n the DataFrame object. After that, do whatever you want in the\n middle. Go crazy.\n\n Returns:\n A pandas DataFrame.\n \"\"\"\n df = func(df)\n return df\n", "path": "janitor/functions/then.py"}]}
| 813 | 223 |
gh_patches_debug_12049
|
rasdani/github-patches
|
git_diff
|
nf-core__tools-1755
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use of future functionality
### Description of the bug
The `shutil.copytree` argument [`dirs_exist_ok`](https://docs.python.org/3/library/shutil.html#shutil.copytree) is only available in Python 3.8+, but nf-core/tools still supports Python 3.7
https://github.com/nf-core/tools/blob/b5400d84d307343353b6ab09aad204231f74fb0e/nf_core/modules/lint/module_changes.py#L28
Two options: drop Python 3.7 support or change that code.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nf_core/modules/lint/module_changes.py`
Content:
```
1 """
2 Check whether the content of a module has changed compared to the original repository
3 """
4 import shutil
5 import tempfile
6 from pathlib import Path
7
8 from nf_core.modules.modules_differ import ModulesDiffer
9
10
11 def module_changes(module_lint_object, module):
12 """
13 Checks whether installed nf-core modules have changed compared to the
14 original repository
15
16 Downloads the ``main.nf`` and ``meta.yml`` files for every module
17 and compares them to the local copies
18
19 If the module has a commit SHA entry in the ``modules.json``, the file content is
20 compared against the files in the remote at this SHA.
21
22 Only runs when linting a pipeline, not the modules repository
23 """
24 if module.is_patched:
25 # If the module is patched, we need to apply
26 # the patch in reverse before comparing with the remote
27 tempdir = Path(tempfile.mkdtemp())
28 shutil.copytree(module.module_dir, tempdir, dirs_exist_ok=True)
29 try:
30 new_lines = ModulesDiffer.try_apply_patch(
31 module.module_name, module_lint_object.modules_repo.fullname, module.patch_path, tempdir, reverse=True
32 )
33 for file, lines in new_lines.items():
34 with open(tempdir / file, "w") as fh:
35 fh.writelines(lines)
36 except LookupError:
37 # This error is already reported by module_patch, so just return
38 return
39 else:
40 tempdir = module.module_dir
41
42 for f, same in module_lint_object.modules_repo.module_files_identical(
43 module.module_name, tempdir, module.git_sha
44 ).items():
45 if same:
46 module.passed.append(
47 (
48 "check_local_copy",
49 "Local copy of module up to date",
50 f"{Path(module.module_dir, f)}",
51 )
52 )
53 else:
54 module.failed.append(
55 (
56 "check_local_copy",
57 "Local copy of module does not match remote",
58 f"{Path(module.module_dir, f)}",
59 )
60 )
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/nf_core/modules/lint/module_changes.py b/nf_core/modules/lint/module_changes.py
--- a/nf_core/modules/lint/module_changes.py
+++ b/nf_core/modules/lint/module_changes.py
@@ -25,7 +25,7 @@
# If the module is patched, we need to apply
# the patch in reverse before comparing with the remote
tempdir = Path(tempfile.mkdtemp())
- shutil.copytree(module.module_dir, tempdir, dirs_exist_ok=True)
+ shutil.copytree(module.module_dir, tempdir)
try:
new_lines = ModulesDiffer.try_apply_patch(
module.module_name, module_lint_object.modules_repo.fullname, module.patch_path, tempdir, reverse=True
|
{"golden_diff": "diff --git a/nf_core/modules/lint/module_changes.py b/nf_core/modules/lint/module_changes.py\n--- a/nf_core/modules/lint/module_changes.py\n+++ b/nf_core/modules/lint/module_changes.py\n@@ -25,7 +25,7 @@\n # If the module is patched, we need to apply\n # the patch in reverse before comparing with the remote\n tempdir = Path(tempfile.mkdtemp())\n- shutil.copytree(module.module_dir, tempdir, dirs_exist_ok=True)\n+ shutil.copytree(module.module_dir, tempdir)\n try:\n new_lines = ModulesDiffer.try_apply_patch(\n module.module_name, module_lint_object.modules_repo.fullname, module.patch_path, tempdir, reverse=True\n", "issue": "Use of future functionality\n### Description of the bug\r\n\r\nThe `shutil.copytree` argument [`dirs_exist_ok`](https://docs.python.org/3/library/shutil.html#shutil.copytree) is only available in Python version 3.8+ but nf-core tool still support Python 3.7\r\n\r\nhttps://github.com/nf-core/tools/blob/b5400d84d307343353b6ab09aad204231f74fb0e/nf_core/modules/lint/module_changes.py#L28\r\n\r\nTwo options: drop Python 3.7 support or change that code.\r\n\n", "before_files": [{"content": "\"\"\"\nCheck whether the content of a module has changed compared to the original repository\n\"\"\"\nimport shutil\nimport tempfile\nfrom pathlib import Path\n\nfrom nf_core.modules.modules_differ import ModulesDiffer\n\n\ndef module_changes(module_lint_object, module):\n \"\"\"\n Checks whether installed nf-core modules have changed compared to the\n original repository\n\n Downloads the ``main.nf`` and ``meta.yml`` files for every module\n and compares them to the local copies\n\n If the module has a commit SHA entry in the ``modules.json``, the file content is\n compared against the files in the remote at this SHA.\n\n Only runs when linting a pipeline, not the modules repository\n \"\"\"\n if module.is_patched:\n # If the module is patched, we need to apply\n # the patch in reverse before comparing with the remote\n tempdir = Path(tempfile.mkdtemp())\n shutil.copytree(module.module_dir, tempdir, dirs_exist_ok=True)\n try:\n new_lines = ModulesDiffer.try_apply_patch(\n module.module_name, module_lint_object.modules_repo.fullname, module.patch_path, tempdir, reverse=True\n )\n for file, lines in new_lines.items():\n with open(tempdir / file, \"w\") as fh:\n fh.writelines(lines)\n except LookupError:\n # This error is already reported by module_patch, so just return\n return\n else:\n tempdir = module.module_dir\n\n for f, same in module_lint_object.modules_repo.module_files_identical(\n module.module_name, tempdir, module.git_sha\n ).items():\n if same:\n module.passed.append(\n (\n \"check_local_copy\",\n \"Local copy of module up to date\",\n f\"{Path(module.module_dir, f)}\",\n )\n )\n else:\n module.failed.append(\n (\n \"check_local_copy\",\n \"Local copy of module does not match remote\",\n f\"{Path(module.module_dir, f)}\",\n )\n )\n", "path": "nf_core/modules/lint/module_changes.py"}], "after_files": [{"content": "\"\"\"\nCheck whether the content of a module has changed compared to the original repository\n\"\"\"\nimport shutil\nimport tempfile\nfrom pathlib import Path\n\nfrom nf_core.modules.modules_differ import ModulesDiffer\n\n\ndef module_changes(module_lint_object, module):\n \"\"\"\n Checks whether installed nf-core modules have changed compared to the\n original repository\n\n Downloads the ``main.nf`` and ``meta.yml`` files for every module\n and compares them to the local copies\n\n If the module has a commit SHA entry in the ``modules.json``, the 
file content is\n compared against the files in the remote at this SHA.\n\n Only runs when linting a pipeline, not the modules repository\n \"\"\"\n if module.is_patched:\n # If the module is patched, we need to apply\n # the patch in reverse before comparing with the remote\n tempdir = Path(tempfile.mkdtemp())\n shutil.copytree(module.module_dir, tempdir)\n try:\n new_lines = ModulesDiffer.try_apply_patch(\n module.module_name, module_lint_object.modules_repo.fullname, module.patch_path, tempdir, reverse=True\n )\n for file, lines in new_lines.items():\n with open(tempdir / file, \"w\") as fh:\n fh.writelines(lines)\n except LookupError:\n # This error is already reported by module_patch, so just return\n return\n else:\n tempdir = module.module_dir\n\n for f, same in module_lint_object.modules_repo.module_files_identical(\n module.module_name, tempdir, module.git_sha\n ).items():\n if same:\n module.passed.append(\n (\n \"check_local_copy\",\n \"Local copy of module up to date\",\n f\"{Path(module.module_dir, f)}\",\n )\n )\n else:\n module.failed.append(\n (\n \"check_local_copy\",\n \"Local copy of module does not match remote\",\n f\"{Path(module.module_dir, f)}\",\n )\n )\n", "path": "nf_core/modules/lint/module_changes.py"}]}
| 947 | 163 |
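`dirs_exist_ok` only exists from Python 3.8 onwards, so on 3.7 `shutil.copytree` requires that the destination not exist yet. A small 3.7-compatible sketch that copies into a not-yet-created child of a fresh temporary directory; the file names here are made up for the example.

```python
import shutil
import tempfile
from pathlib import Path

# Stand-in for module.module_dir: a small source tree to copy.
source = Path(tempfile.mkdtemp()) / "module_src"
source.mkdir()
(source / "main.nf").write_text("process FOO {}\n")

# mkdtemp() creates its directory, so copy into a child path that does not exist yet;
# plain copytree() then works on Python 3.7 without dirs_exist_ok.
dest = Path(tempfile.mkdtemp()) / "module_copy"
shutil.copytree(source, dest)

print(sorted(p.name for p in dest.iterdir()))  # ['main.nf']
```
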
gh_patches_debug_2117
|
rasdani/github-patches
|
git_diff
|
streamlit__streamlit-1469
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spelling mistake while running streamlit hello , DataFrame Demo
# Summary
I noticed a spelling mistake in dataframe demo while runing streamlit hello , It displays UN Data Exlorer instead of UN Data Explorer
# Steps to reproduce
1. Go to terminal
2. Run `streamlit hello`
3. Open browser at localhost:8501 and choose dataframe demo
## Expected behavior:
It should display correct spelling as `(Data courtesy of the UN Data Exlporer.)`
## Actual behavior:
It's displaying `(Data courtesy of the UN Data Exlorer.)`

## Is this a regression?
no
# Debug info
- Streamlit version: 0.57.3
- Python version: 3.8.2
- Using Conda? PipEnv? PyEnv? Pex? Conda
- OS version: Windows 10
- Browser version: Chrome v81.0
# Additional information
If needed, add any other context about the problem here. For example, did this bug come from https://discuss.streamlit.io or another site? Link the original source here!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/streamlit/hello/hello.py`
Content:
```
1 # Copyright 2018-2020 Streamlit Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import inspect
16 import textwrap
17 from collections import OrderedDict
18
19 import streamlit as st
20 from streamlit.logger import get_logger
21 from streamlit.hello import demos
22
23 LOGGER = get_logger(__name__)
24
25 # Dictionary of
26 # demo_name -> (demo_function, demo_description)
27 DEMOS = OrderedDict(
28 [
29 ("—", (demos.intro, None)),
30 (
31 "Animation Demo",
32 (
33 demos.fractal_demo,
34 """
35 This app shows how you can use Streamlit to build cool animations.
36 It displays an animated fractal based on the the Julia Set. Use the slider
37 to tune different parameters.
38 """,
39 ),
40 ),
41 (
42 "Plotting Demo",
43 (
44 demos.plotting_demo,
45 """
46 This demo illustrates a combination of plotting and animation with
47 Streamlit. We're generating a bunch of random numbers in a loop for around
48 5 seconds. Enjoy!
49 """,
50 ),
51 ),
52 (
53 "Mapping Demo",
54 (
55 demos.mapping_demo,
56 """
57 This demo shows how to use
58 [`st.deck_gl_chart`](https://docs.streamlit.io/api.html#streamlit.deck_gl_chart)
59 to display geospatial data.
60 """,
61 ),
62 ),
63 (
64 "DataFrame Demo",
65 (
66 demos.data_frame_demo,
67 """
68 This demo shows how to use `st.write` to visualize Pandas DataFrames.
69
70 (Data courtesy of the [UN Data Exlorer](http://data.un.org/Explorer.aspx).)
71 """,
72 ),
73 ),
74 ]
75 )
76
77
78 def run():
79 demo_name = st.sidebar.selectbox("Choose a demo", list(DEMOS.keys()), 0)
80 demo = DEMOS[demo_name][0]
81
82 if demo_name == "—":
83 show_code = False
84 st.write("# Welcome to Streamlit! 👋")
85 else:
86 show_code = st.sidebar.checkbox("Show code", True)
87 st.markdown("# %s" % demo_name)
88 description = DEMOS[demo_name][1]
89 if description:
90 st.write(description)
91 # Clear everything from the intro page.
92 # We only have 4 elements in the page so this is intentional overkill.
93 for i in range(10):
94 st.empty()
95
96 demo()
97
98 if show_code:
99 st.markdown("## Code")
100 sourcelines, _ = inspect.getsourcelines(demo)
101 st.code(textwrap.dedent("".join(sourcelines[1:])))
102
103
104 if __name__ == "__main__":
105 run()
106
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lib/streamlit/hello/hello.py b/lib/streamlit/hello/hello.py
--- a/lib/streamlit/hello/hello.py
+++ b/lib/streamlit/hello/hello.py
@@ -67,7 +67,7 @@
"""
This demo shows how to use `st.write` to visualize Pandas DataFrames.
-(Data courtesy of the [UN Data Exlorer](http://data.un.org/Explorer.aspx).)
+(Data courtesy of the [UN Data Explorer](http://data.un.org/Explorer.aspx).)
""",
),
),
|
{"golden_diff": "diff --git a/lib/streamlit/hello/hello.py b/lib/streamlit/hello/hello.py\n--- a/lib/streamlit/hello/hello.py\n+++ b/lib/streamlit/hello/hello.py\n@@ -67,7 +67,7 @@\n \"\"\"\n This demo shows how to use `st.write` to visualize Pandas DataFrames.\n \n-(Data courtesy of the [UN Data Exlorer](http://data.un.org/Explorer.aspx).)\n+(Data courtesy of the [UN Data Explorer](http://data.un.org/Explorer.aspx).)\n \"\"\",\n ),\n ),\n", "issue": "Spelling mistake while running streamlit hello , DataFrame Demo\n# Summary\r\n\r\nI noticed a spelling mistake in dataframe demo while runing streamlit hello , It displays UN Data Exlorer instead of UN Data Explorer\r\n\r\n# Steps to reproduce\r\n\r\n1. Go to terminal\r\n2. Run `streamlit hello`\r\n3. Open browser at localhost:8501 and choose dataframe demo\r\n\r\n## Expected behavior:\r\n\r\nIt should display correct spelling as `(Data courtesy of the UN Data Exlporer.)`\r\n\r\n## Actual behavior:\r\n\r\nIt's displaying `(Data courtesy of the UN Data Exlorer.)`\r\n\r\n\r\n\r\n## Is this a regression?\r\n no\r\n\r\n# Debug info\r\n\r\n- Streamlit version: 0.57.3\r\n- Python version: 3.8.2\r\n- Using Conda? PipEnv? PyEnv? Pex? Conda\r\n- OS version: Windows 10\r\n- Browser version: Chrome v81.0\r\n\r\n# Additional information\r\n\r\nIf needed, add any other context about the problem here. For example, did this bug come from https://discuss.streamlit.io or another site? Link the original source here!\r\n\n", "before_files": [{"content": "# Copyright 2018-2020 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport inspect\nimport textwrap\nfrom collections import OrderedDict\n\nimport streamlit as st\nfrom streamlit.logger import get_logger\nfrom streamlit.hello import demos\n\nLOGGER = get_logger(__name__)\n\n# Dictionary of\n# demo_name -> (demo_function, demo_description)\nDEMOS = OrderedDict(\n [\n (\"\u2014\", (demos.intro, None)),\n (\n \"Animation Demo\",\n (\n demos.fractal_demo,\n \"\"\"\nThis app shows how you can use Streamlit to build cool animations.\nIt displays an animated fractal based on the the Julia Set. Use the slider\nto tune different parameters.\n\"\"\",\n ),\n ),\n (\n \"Plotting Demo\",\n (\n demos.plotting_demo,\n \"\"\"\nThis demo illustrates a combination of plotting and animation with\nStreamlit. We're generating a bunch of random numbers in a loop for around\n5 seconds. 
Enjoy!\n\"\"\",\n ),\n ),\n (\n \"Mapping Demo\",\n (\n demos.mapping_demo,\n \"\"\"\nThis demo shows how to use\n[`st.deck_gl_chart`](https://docs.streamlit.io/api.html#streamlit.deck_gl_chart)\nto display geospatial data.\n\"\"\",\n ),\n ),\n (\n \"DataFrame Demo\",\n (\n demos.data_frame_demo,\n \"\"\"\nThis demo shows how to use `st.write` to visualize Pandas DataFrames.\n\n(Data courtesy of the [UN Data Exlorer](http://data.un.org/Explorer.aspx).)\n\"\"\",\n ),\n ),\n ]\n)\n\n\ndef run():\n demo_name = st.sidebar.selectbox(\"Choose a demo\", list(DEMOS.keys()), 0)\n demo = DEMOS[demo_name][0]\n\n if demo_name == \"\u2014\":\n show_code = False\n st.write(\"# Welcome to Streamlit! \ud83d\udc4b\")\n else:\n show_code = st.sidebar.checkbox(\"Show code\", True)\n st.markdown(\"# %s\" % demo_name)\n description = DEMOS[demo_name][1]\n if description:\n st.write(description)\n # Clear everything from the intro page.\n # We only have 4 elements in the page so this is intentional overkill.\n for i in range(10):\n st.empty()\n\n demo()\n\n if show_code:\n st.markdown(\"## Code\")\n sourcelines, _ = inspect.getsourcelines(demo)\n st.code(textwrap.dedent(\"\".join(sourcelines[1:])))\n\n\nif __name__ == \"__main__\":\n run()\n", "path": "lib/streamlit/hello/hello.py"}], "after_files": [{"content": "# Copyright 2018-2020 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport inspect\nimport textwrap\nfrom collections import OrderedDict\n\nimport streamlit as st\nfrom streamlit.logger import get_logger\nfrom streamlit.hello import demos\n\nLOGGER = get_logger(__name__)\n\n# Dictionary of\n# demo_name -> (demo_function, demo_description)\nDEMOS = OrderedDict(\n [\n (\"\u2014\", (demos.intro, None)),\n (\n \"Animation Demo\",\n (\n demos.fractal_demo,\n \"\"\"\nThis app shows how you can use Streamlit to build cool animations.\nIt displays an animated fractal based on the the Julia Set. Use the slider\nto tune different parameters.\n\"\"\",\n ),\n ),\n (\n \"Plotting Demo\",\n (\n demos.plotting_demo,\n \"\"\"\nThis demo illustrates a combination of plotting and animation with\nStreamlit. We're generating a bunch of random numbers in a loop for around\n5 seconds. Enjoy!\n\"\"\",\n ),\n ),\n (\n \"Mapping Demo\",\n (\n demos.mapping_demo,\n \"\"\"\nThis demo shows how to use\n[`st.deck_gl_chart`](https://docs.streamlit.io/api.html#streamlit.deck_gl_chart)\nto display geospatial data.\n\"\"\",\n ),\n ),\n (\n \"DataFrame Demo\",\n (\n demos.data_frame_demo,\n \"\"\"\nThis demo shows how to use `st.write` to visualize Pandas DataFrames.\n\n(Data courtesy of the [UN Data Explorer](http://data.un.org/Explorer.aspx).)\n\"\"\",\n ),\n ),\n ]\n)\n\n\ndef run():\n demo_name = st.sidebar.selectbox(\"Choose a demo\", list(DEMOS.keys()), 0)\n demo = DEMOS[demo_name][0]\n\n if demo_name == \"\u2014\":\n show_code = False\n st.write(\"# Welcome to Streamlit! 
\ud83d\udc4b\")\n else:\n show_code = st.sidebar.checkbox(\"Show code\", True)\n st.markdown(\"# %s\" % demo_name)\n description = DEMOS[demo_name][1]\n if description:\n st.write(description)\n # Clear everything from the intro page.\n # We only have 4 elements in the page so this is intentional overkill.\n for i in range(10):\n st.empty()\n\n demo()\n\n if show_code:\n st.markdown(\"## Code\")\n sourcelines, _ = inspect.getsourcelines(demo)\n st.code(textwrap.dedent(\"\".join(sourcelines[1:])))\n\n\nif __name__ == \"__main__\":\n run()\n", "path": "lib/streamlit/hello/hello.py"}]}
| 1,445 | 124 |
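Beyond the one-character spelling fix, the `hello.py` in this record dispatches demos from a name to (function, description) registry. A dependency-free sketch of that pattern, with placeholder demo bodies instead of Streamlit calls.

```python
from collections import OrderedDict


def fractal_demo():
    return "drawing a Julia-set animation"


def data_frame_demo():
    return "rendering a DataFrame"


# Same shape as the DEMOS registry in hello.py: name -> (function, description).
DEMOS = OrderedDict(
    [
        ("Animation Demo", (fractal_demo, "Cool animations.")),
        ("DataFrame Demo", (data_frame_demo, "Visualize Pandas DataFrames.")),
    ]
)


def run(demo_name):
    demo, description = DEMOS[demo_name]
    print("# {}".format(demo_name))
    print(description)
    print(demo())


run("DataFrame Demo")
```
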
gh_patches_debug_3743
|
rasdani/github-patches
|
git_diff
|
openai__gym-2576
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Proposal] Add license to the license field in the setup.py file
### Proposal
Add license to the license field in the setup.py file.
### Motivation
The license field is defined but not completed in the setup.py file.
Some package dependency scanners will block the usage of this package as the license is not specified.
### Checklist
- [x] I have checked that there is no similar [issue](https://github.com/openai/gym/issues) in the repo (**required**)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import os.path
2 import sys
3 import itertools
4
5 from setuptools import find_packages, setup
6
7 # Don't import gym module here, since deps may not be installed
8 sys.path.insert(0, os.path.join(os.path.dirname(__file__), "gym"))
9 from version import VERSION
10
11 # Environment-specific dependencies.
12 extras = {
13 "atari": ["ale-py~=0.7.1"],
14 "accept-rom-license": ["autorom[accept-rom-license]~=0.4.2"],
15 "box2d": ["box2d-py==2.3.5", "pyglet>=1.4.0"],
16 "classic_control": ["pyglet>=1.4.0"],
17 "mujoco": ["mujoco_py>=1.50, <2.0"],
18 "toy_text": ["pygame==2.1.0", "scipy>=1.4.1"],
19 "other": ["lz4>=3.1.0", "opencv-python>=3.0"],
20 }
21
22 # Meta dependency groups.
23 nomujoco_blacklist = set(["mujoco", "accept-rom-license", "atari"])
24 nomujoco_groups = set(extras.keys()) - nomujoco_blacklist
25
26 extras["nomujoco"] = list(
27 itertools.chain.from_iterable(map(lambda group: extras[group], nomujoco_groups))
28 )
29
30
31 all_blacklist = set(["accept-rom-license"])
32 all_groups = set(extras.keys()) - all_blacklist
33
34 extras["all"] = list(
35 itertools.chain.from_iterable(map(lambda group: extras[group], all_groups))
36 )
37
38 setup(
39 name="gym",
40 version=VERSION,
41 description="Gym: A universal API for reinforcement learning environments.",
42 url="https://github.com/openai/gym",
43 author="Gym Community",
44 author_email="[email protected]",
45 license="",
46 packages=[package for package in find_packages() if package.startswith("gym")],
47 zip_safe=False,
48 install_requires=[
49 "numpy>=1.18.0",
50 "cloudpickle>=1.2.0",
51 "importlib_metadata>=4.10.0; python_version < '3.10'",
52 ],
53 extras_require=extras,
54 package_data={
55 "gym": [
56 "envs/mujoco/assets/*.xml",
57 "envs/classic_control/assets/*.png",
58 "envs/toy_text/font/*.ttf",
59 "envs/toy_text/img/*.png",
60 ]
61 },
62 tests_require=["pytest", "mock"],
63 python_requires=">=3.7",
64 classifiers=[
65 "Programming Language :: Python :: 3",
66 "Programming Language :: Python :: 3.7",
67 "Programming Language :: Python :: 3.8",
68 "Programming Language :: Python :: 3.9",
69 "Programming Language :: Python :: 3.10",
70 ],
71 )
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -42,7 +42,7 @@
url="https://github.com/openai/gym",
author="Gym Community",
author_email="[email protected]",
- license="",
+ license="MIT",
packages=[package for package in find_packages() if package.startswith("gym")],
zip_safe=False,
install_requires=[
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -42,7 +42,7 @@\n url=\"https://github.com/openai/gym\",\n author=\"Gym Community\",\n author_email=\"[email protected]\",\n- license=\"\",\n+ license=\"MIT\",\n packages=[package for package in find_packages() if package.startswith(\"gym\")],\n zip_safe=False,\n install_requires=[\n", "issue": "[Proposal] Add license to the license field in the setup.py file\n### Proposal \r\n\r\nAdd license to the license field in the setup.py file.\r\n\r\n### Motivation\r\n\r\nThe license field is defined but not completed in the setup.py file. \r\n\r\nSome package dependency scanners will block the usage of this package as the license is not specified.\r\n\r\n### Checklist\r\n\r\n- [x] I have checked that there is no similar [issue](https://github.com/openai/gym/issues) in the repo (**required**)\r\n\n", "before_files": [{"content": "import os.path\nimport sys\nimport itertools\n\nfrom setuptools import find_packages, setup\n\n# Don't import gym module here, since deps may not be installed\nsys.path.insert(0, os.path.join(os.path.dirname(__file__), \"gym\"))\nfrom version import VERSION\n\n# Environment-specific dependencies.\nextras = {\n \"atari\": [\"ale-py~=0.7.1\"],\n \"accept-rom-license\": [\"autorom[accept-rom-license]~=0.4.2\"],\n \"box2d\": [\"box2d-py==2.3.5\", \"pyglet>=1.4.0\"],\n \"classic_control\": [\"pyglet>=1.4.0\"],\n \"mujoco\": [\"mujoco_py>=1.50, <2.0\"],\n \"toy_text\": [\"pygame==2.1.0\", \"scipy>=1.4.1\"],\n \"other\": [\"lz4>=3.1.0\", \"opencv-python>=3.0\"],\n}\n\n# Meta dependency groups.\nnomujoco_blacklist = set([\"mujoco\", \"accept-rom-license\", \"atari\"])\nnomujoco_groups = set(extras.keys()) - nomujoco_blacklist\n\nextras[\"nomujoco\"] = list(\n itertools.chain.from_iterable(map(lambda group: extras[group], nomujoco_groups))\n)\n\n\nall_blacklist = set([\"accept-rom-license\"])\nall_groups = set(extras.keys()) - all_blacklist\n\nextras[\"all\"] = list(\n itertools.chain.from_iterable(map(lambda group: extras[group], all_groups))\n)\n\nsetup(\n name=\"gym\",\n version=VERSION,\n description=\"Gym: A universal API for reinforcement learning environments.\",\n url=\"https://github.com/openai/gym\",\n author=\"Gym Community\",\n author_email=\"[email protected]\",\n license=\"\",\n packages=[package for package in find_packages() if package.startswith(\"gym\")],\n zip_safe=False,\n install_requires=[\n \"numpy>=1.18.0\",\n \"cloudpickle>=1.2.0\",\n \"importlib_metadata>=4.10.0; python_version < '3.10'\",\n ],\n extras_require=extras,\n package_data={\n \"gym\": [\n \"envs/mujoco/assets/*.xml\",\n \"envs/classic_control/assets/*.png\",\n \"envs/toy_text/font/*.ttf\",\n \"envs/toy_text/img/*.png\",\n ]\n },\n tests_require=[\"pytest\", \"mock\"],\n python_requires=\">=3.7\",\n classifiers=[\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "import os.path\nimport sys\nimport itertools\n\nfrom setuptools import find_packages, setup\n\n# Don't import gym module here, since deps may not be installed\nsys.path.insert(0, os.path.join(os.path.dirname(__file__), \"gym\"))\nfrom version import VERSION\n\n# Environment-specific dependencies.\nextras = {\n \"atari\": [\"ale-py~=0.7.1\"],\n \"accept-rom-license\": [\"autorom[accept-rom-license]~=0.4.2\"],\n \"box2d\": 
[\"box2d-py==2.3.5\", \"pyglet>=1.4.0\"],\n \"classic_control\": [\"pyglet>=1.4.0\"],\n \"mujoco\": [\"mujoco_py>=1.50, <2.0\"],\n \"toy_text\": [\"pygame==2.1.0\", \"scipy>=1.4.1\"],\n \"other\": [\"lz4>=3.1.0\", \"opencv-python>=3.0\"],\n}\n\n# Meta dependency groups.\nnomujoco_blacklist = set([\"mujoco\", \"accept-rom-license\", \"atari\"])\nnomujoco_groups = set(extras.keys()) - nomujoco_blacklist\n\nextras[\"nomujoco\"] = list(\n itertools.chain.from_iterable(map(lambda group: extras[group], nomujoco_groups))\n)\n\n\nall_blacklist = set([\"accept-rom-license\"])\nall_groups = set(extras.keys()) - all_blacklist\n\nextras[\"all\"] = list(\n itertools.chain.from_iterable(map(lambda group: extras[group], all_groups))\n)\n\nsetup(\n name=\"gym\",\n version=VERSION,\n description=\"Gym: A universal API for reinforcement learning environments.\",\n url=\"https://github.com/openai/gym\",\n author=\"Gym Community\",\n author_email=\"[email protected]\",\n license=\"MIT\",\n packages=[package for package in find_packages() if package.startswith(\"gym\")],\n zip_safe=False,\n install_requires=[\n \"numpy>=1.18.0\",\n \"cloudpickle>=1.2.0\",\n \"importlib_metadata>=4.10.0; python_version < '3.10'\",\n ],\n extras_require=extras,\n package_data={\n \"gym\": [\n \"envs/mujoco/assets/*.xml\",\n \"envs/classic_control/assets/*.png\",\n \"envs/toy_text/font/*.ttf\",\n \"envs/toy_text/img/*.png\",\n ]\n },\n tests_require=[\"pytest\", \"mock\"],\n python_requires=\">=3.7\",\n classifiers=[\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n ],\n)\n", "path": "setup.py"}]}
| 1,124 | 100 |
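Once `license="MIT"` is declared in `setup.py`, scanners can read it back from the installed distribution's metadata. A short sketch using `importlib.metadata`; it assumes a build of `gym` with the patched metadata is installed in the current environment.

```python
from importlib.metadata import PackageNotFoundError, metadata

try:
    meta = metadata("gym")
    # The License field comes straight from the license= argument in setup().
    print(meta.get("Name"), meta.get("Version"))
    print(meta.get("License"))  # e.g. "MIT" once setup.py declares it
except PackageNotFoundError:
    print("gym is not installed in this environment")
```
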
gh_patches_debug_4907
|
rasdani/github-patches
|
git_diff
|
Mailu__Mailu-1487
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
We need to check the (persistent) postfix mailqueue permissions
As the postfix and postdrop uid/gid might change (especially with a change of base image), it is needed to check the directory permissions on /queue before starting postfix
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/postfix/start.py`
Content:
```
1 #!/usr/bin/python3
2
3 import os
4 import glob
5 import shutil
6 import multiprocessing
7 import logging as log
8 import sys
9
10 from podop import run_server
11 from socrate import system, conf
12
13 log.basicConfig(stream=sys.stderr, level=os.environ.get("LOG_LEVEL", "WARNING"))
14
15 def start_podop():
16 os.setuid(100)
17 url = "http://" + os.environ["ADMIN_ADDRESS"] + "/internal/postfix/"
18 # TODO: Remove verbosity setting from Podop?
19 run_server(0, "postfix", "/tmp/podop.socket", [
20 ("transport", "url", url + "transport/§"),
21 ("alias", "url", url + "alias/§"),
22 ("domain", "url", url + "domain/§"),
23 ("mailbox", "url", url + "mailbox/§"),
24 ("recipientmap", "url", url + "recipient/map/§"),
25 ("sendermap", "url", url + "sender/map/§"),
26 ("senderaccess", "url", url + "sender/access/§"),
27 ("senderlogin", "url", url + "sender/login/§")
28 ])
29
30 def is_valid_postconf_line(line):
31 return not line.startswith("#") \
32 and not line == ''
33
34 # Actual startup script
35 os.environ["FRONT_ADDRESS"] = system.get_host_address_from_environment("FRONT", "front")
36 os.environ["ADMIN_ADDRESS"] = system.get_host_address_from_environment("ADMIN", "admin")
37 os.environ["ANTISPAM_MILTER_ADDRESS"] = system.get_host_address_from_environment("ANTISPAM_MILTER", "antispam:11332")
38 os.environ["LMTP_ADDRESS"] = system.get_host_address_from_environment("LMTP", "imap:2525")
39
40 for postfix_file in glob.glob("/conf/*.cf"):
41 conf.jinja(postfix_file, os.environ, os.path.join("/etc/postfix", os.path.basename(postfix_file)))
42
43 if os.path.exists("/overrides/postfix.cf"):
44 for line in open("/overrides/postfix.cf").read().strip().split("\n"):
45 if is_valid_postconf_line(line):
46 os.system('postconf -e "{}"'.format(line))
47
48 if os.path.exists("/overrides/postfix.master"):
49 for line in open("/overrides/postfix.master").read().strip().split("\n"):
50 if is_valid_postconf_line(line):
51 os.system('postconf -Me "{}"'.format(line))
52
53 for map_file in glob.glob("/overrides/*.map"):
54 destination = os.path.join("/etc/postfix", os.path.basename(map_file))
55 shutil.copyfile(map_file, destination)
56 os.system("postmap {}".format(destination))
57 os.remove(destination)
58
59 if "RELAYUSER" in os.environ:
60 path = "/etc/postfix/sasl_passwd"
61 conf.jinja("/conf/sasl_passwd", os.environ, path)
62 os.system("postmap {}".format(path))
63
64 # Run Podop and Postfix
65 multiprocessing.Process(target=start_podop).start()
66 os.system("/usr/libexec/postfix/post-install meta_directory=/etc/postfix create-missing")
67 os.system("postfix start-fg")
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/core/postfix/start.py b/core/postfix/start.py
--- a/core/postfix/start.py
+++ b/core/postfix/start.py
@@ -64,4 +64,7 @@
# Run Podop and Postfix
multiprocessing.Process(target=start_podop).start()
os.system("/usr/libexec/postfix/post-install meta_directory=/etc/postfix create-missing")
+# Before starting postfix, we need to check permissions on /queue
+# in the event that postfix,postdrop id have changed
+os.system("postfix set-permissions")
os.system("postfix start-fg")
|
{"golden_diff": "diff --git a/core/postfix/start.py b/core/postfix/start.py\n--- a/core/postfix/start.py\n+++ b/core/postfix/start.py\n@@ -64,4 +64,7 @@\n # Run Podop and Postfix\n multiprocessing.Process(target=start_podop).start()\n os.system(\"/usr/libexec/postfix/post-install meta_directory=/etc/postfix create-missing\")\n+# Before starting postfix, we need to check permissions on /queue\n+# in the event that postfix,postdrop id have changed\n+os.system(\"postfix set-permissions\")\n os.system(\"postfix start-fg\")\n", "issue": "We need to check the (persistent) postfix mailqueue permissions\nAs the postfix and postdrop uid/gid might change (especially with a change of base image), it is needed to check the directory permissions on /queue before starting postfix\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport glob\nimport shutil\nimport multiprocessing\nimport logging as log\nimport sys\n\nfrom podop import run_server\nfrom socrate import system, conf\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\ndef start_podop():\n os.setuid(100)\n url = \"http://\" + os.environ[\"ADMIN_ADDRESS\"] + \"/internal/postfix/\"\n # TODO: Remove verbosity setting from Podop?\n run_server(0, \"postfix\", \"/tmp/podop.socket\", [\n\t\t(\"transport\", \"url\", url + \"transport/\u00a7\"),\n\t\t(\"alias\", \"url\", url + \"alias/\u00a7\"),\n\t\t(\"domain\", \"url\", url + \"domain/\u00a7\"),\n (\"mailbox\", \"url\", url + \"mailbox/\u00a7\"),\n (\"recipientmap\", \"url\", url + \"recipient/map/\u00a7\"),\n (\"sendermap\", \"url\", url + \"sender/map/\u00a7\"),\n (\"senderaccess\", \"url\", url + \"sender/access/\u00a7\"),\n (\"senderlogin\", \"url\", url + \"sender/login/\u00a7\")\n ])\n\ndef is_valid_postconf_line(line):\n return not line.startswith(\"#\") \\\n and not line == ''\n\n# Actual startup script\nos.environ[\"FRONT_ADDRESS\"] = system.get_host_address_from_environment(\"FRONT\", \"front\")\nos.environ[\"ADMIN_ADDRESS\"] = system.get_host_address_from_environment(\"ADMIN\", \"admin\")\nos.environ[\"ANTISPAM_MILTER_ADDRESS\"] = system.get_host_address_from_environment(\"ANTISPAM_MILTER\", \"antispam:11332\")\nos.environ[\"LMTP_ADDRESS\"] = system.get_host_address_from_environment(\"LMTP\", \"imap:2525\")\n\nfor postfix_file in glob.glob(\"/conf/*.cf\"):\n conf.jinja(postfix_file, os.environ, os.path.join(\"/etc/postfix\", os.path.basename(postfix_file)))\n\nif os.path.exists(\"/overrides/postfix.cf\"):\n for line in open(\"/overrides/postfix.cf\").read().strip().split(\"\\n\"):\n if is_valid_postconf_line(line):\n os.system('postconf -e \"{}\"'.format(line))\n\nif os.path.exists(\"/overrides/postfix.master\"):\n for line in open(\"/overrides/postfix.master\").read().strip().split(\"\\n\"):\n if is_valid_postconf_line(line):\n os.system('postconf -Me \"{}\"'.format(line))\n\nfor map_file in glob.glob(\"/overrides/*.map\"):\n destination = os.path.join(\"/etc/postfix\", os.path.basename(map_file))\n shutil.copyfile(map_file, destination)\n os.system(\"postmap {}\".format(destination))\n os.remove(destination)\n\nif \"RELAYUSER\" in os.environ:\n path = \"/etc/postfix/sasl_passwd\"\n conf.jinja(\"/conf/sasl_passwd\", os.environ, path)\n os.system(\"postmap {}\".format(path))\n\n# Run Podop and Postfix\nmultiprocessing.Process(target=start_podop).start()\nos.system(\"/usr/libexec/postfix/post-install meta_directory=/etc/postfix create-missing\")\nos.system(\"postfix start-fg\")\n", "path": "core/postfix/start.py"}], "after_files": 
[{"content": "#!/usr/bin/python3\n\nimport os\nimport glob\nimport shutil\nimport multiprocessing\nimport logging as log\nimport sys\n\nfrom podop import run_server\nfrom socrate import system, conf\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\ndef start_podop():\n os.setuid(100)\n url = \"http://\" + os.environ[\"ADMIN_ADDRESS\"] + \"/internal/postfix/\"\n # TODO: Remove verbosity setting from Podop?\n run_server(0, \"postfix\", \"/tmp/podop.socket\", [\n\t\t(\"transport\", \"url\", url + \"transport/\u00a7\"),\n\t\t(\"alias\", \"url\", url + \"alias/\u00a7\"),\n\t\t(\"domain\", \"url\", url + \"domain/\u00a7\"),\n (\"mailbox\", \"url\", url + \"mailbox/\u00a7\"),\n (\"recipientmap\", \"url\", url + \"recipient/map/\u00a7\"),\n (\"sendermap\", \"url\", url + \"sender/map/\u00a7\"),\n (\"senderaccess\", \"url\", url + \"sender/access/\u00a7\"),\n (\"senderlogin\", \"url\", url + \"sender/login/\u00a7\")\n ])\n\ndef is_valid_postconf_line(line):\n return not line.startswith(\"#\") \\\n and not line == ''\n\n# Actual startup script\nos.environ[\"FRONT_ADDRESS\"] = system.get_host_address_from_environment(\"FRONT\", \"front\")\nos.environ[\"ADMIN_ADDRESS\"] = system.get_host_address_from_environment(\"ADMIN\", \"admin\")\nos.environ[\"ANTISPAM_MILTER_ADDRESS\"] = system.get_host_address_from_environment(\"ANTISPAM_MILTER\", \"antispam:11332\")\nos.environ[\"LMTP_ADDRESS\"] = system.get_host_address_from_environment(\"LMTP\", \"imap:2525\")\n\nfor postfix_file in glob.glob(\"/conf/*.cf\"):\n conf.jinja(postfix_file, os.environ, os.path.join(\"/etc/postfix\", os.path.basename(postfix_file)))\n\nif os.path.exists(\"/overrides/postfix.cf\"):\n for line in open(\"/overrides/postfix.cf\").read().strip().split(\"\\n\"):\n if is_valid_postconf_line(line):\n os.system('postconf -e \"{}\"'.format(line))\n\nif os.path.exists(\"/overrides/postfix.master\"):\n for line in open(\"/overrides/postfix.master\").read().strip().split(\"\\n\"):\n if is_valid_postconf_line(line):\n os.system('postconf -Me \"{}\"'.format(line))\n\nfor map_file in glob.glob(\"/overrides/*.map\"):\n destination = os.path.join(\"/etc/postfix\", os.path.basename(map_file))\n shutil.copyfile(map_file, destination)\n os.system(\"postmap {}\".format(destination))\n os.remove(destination)\n\nif \"RELAYUSER\" in os.environ:\n path = \"/etc/postfix/sasl_passwd\"\n conf.jinja(\"/conf/sasl_passwd\", os.environ, path)\n os.system(\"postmap {}\".format(path))\n\n# Run Podop and Postfix\nmultiprocessing.Process(target=start_podop).start()\nos.system(\"/usr/libexec/postfix/post-install meta_directory=/etc/postfix create-missing\")\n# Before starting postfix, we need to check permissions on /queue\n# in the event that postfix,postdrop id have changed\nos.system(\"postfix set-permissions\")\nos.system(\"postfix start-fg\")\n", "path": "core/postfix/start.py"}]}
| 1,107 | 127 |
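The patch shells out with `os.system`, which ignores non-zero exit codes. A hedged alternative sketch that checks the `postfix set-permissions` step before handing the process over to `postfix start-fg`; this is an illustration of the idea, not Mailu's actual start script, and it assumes `postfix` is on PATH inside the container.

```python
import os
import subprocess


def run_checked(*cmd):
    """Run a command, raising if it exits non-zero instead of silently continuing."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("{} failed: {}".format(" ".join(cmd), result.stderr.strip()))


# Fix /queue ownership first (the postfix/postdrop uid or gid may have changed with
# a new base image), then replace this process with postfix running in the foreground.
run_checked("postfix", "set-permissions")
os.execvp("postfix", ["postfix", "start-fg"])
```
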
gh_patches_debug_24129
|
rasdani/github-patches
|
git_diff
|
fossasia__open-event-server-7875
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Delete space in front of colons
Some text pieces coming from the server has a space in front of colons " :". Please check all text including email texts and delete the space in front of colons.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `create_db.py`
Content:
```
1 import argparse
2 import getpass
3 import re
4
5 from flask_migrate import stamp
6
7 from app.instance import current_app
8 from app.models import db
9 from populate_db import populate
10 from tests.all.integration.auth_helper import create_super_admin
11
12
13 def create_default_user(email, password):
14 print("Your login is 'super_admin'.")
15 if not email:
16 ask_email = True
17 while ask_email:
18 email = input("Enter email for super_admin : ")
19 if not re.match(r'[^@]+@[^@]+\.[^@]+', email):
20 print('\nInvalid email address\n')
21 continue
22 ask_email = False
23 if not password:
24 ask_password = True
25 while ask_password:
26 password = getpass.getpass("Enter password for super_admin : ")
27 if len(password) < 8:
28 print('\nPassword should have minimum 8 characters')
29 continue
30 repassword = getpass.getpass("Enter your password again to confirm : ")
31 if password != repassword:
32 print('\nPassword did not match')
33 continue
34 ask_password = False
35 create_super_admin(email, password)
36
37
38 if __name__ == "__main__":
39 parser = argparse.ArgumentParser()
40 parser.add_argument("email", nargs='?', help="The email for super_admin.", default='')
41 parser.add_argument(
42 "password", nargs='?', help="The password for super_admin.", default=''
43 )
44 parsed = parser.parse_args()
45 with current_app.app_context():
46 db.engine.execute('create extension if not exists citext')
47 db.create_all()
48 stamp()
49 create_default_user(parsed.email, parsed.password)
50 populate()
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/create_db.py b/create_db.py
--- a/create_db.py
+++ b/create_db.py
@@ -15,7 +15,7 @@
if not email:
ask_email = True
while ask_email:
- email = input("Enter email for super_admin : ")
+ email = input("Enter email for super_admin: ")
if not re.match(r'[^@]+@[^@]+\.[^@]+', email):
print('\nInvalid email address\n')
continue
@@ -23,11 +23,11 @@
if not password:
ask_password = True
while ask_password:
- password = getpass.getpass("Enter password for super_admin : ")
+ password = getpass.getpass("Enter password for super_admin: ")
if len(password) < 8:
print('\nPassword should have minimum 8 characters')
continue
- repassword = getpass.getpass("Enter your password again to confirm : ")
+ repassword = getpass.getpass("Enter your password again to confirm: ")
if password != repassword:
print('\nPassword did not match')
continue
|
{"golden_diff": "diff --git a/create_db.py b/create_db.py\n--- a/create_db.py\n+++ b/create_db.py\n@@ -15,7 +15,7 @@\n if not email:\n ask_email = True\n while ask_email:\n- email = input(\"Enter email for super_admin : \")\n+ email = input(\"Enter email for super_admin: \")\n if not re.match(r'[^@]+@[^@]+\\.[^@]+', email):\n print('\\nInvalid email address\\n')\n continue\n@@ -23,11 +23,11 @@\n if not password:\n ask_password = True\n while ask_password:\n- password = getpass.getpass(\"Enter password for super_admin : \")\n+ password = getpass.getpass(\"Enter password for super_admin: \")\n if len(password) < 8:\n print('\\nPassword should have minimum 8 characters')\n continue\n- repassword = getpass.getpass(\"Enter your password again to confirm : \")\n+ repassword = getpass.getpass(\"Enter your password again to confirm: \")\n if password != repassword:\n print('\\nPassword did not match')\n continue\n", "issue": "Delete space in front of colons\nSome text pieces coming from the server has a space in front of colons \" :\". Please check all text including email texts and delete the space in front of colons.\n", "before_files": [{"content": "import argparse\nimport getpass\nimport re\n\nfrom flask_migrate import stamp\n\nfrom app.instance import current_app\nfrom app.models import db\nfrom populate_db import populate\nfrom tests.all.integration.auth_helper import create_super_admin\n\n\ndef create_default_user(email, password):\n print(\"Your login is 'super_admin'.\")\n if not email:\n ask_email = True\n while ask_email:\n email = input(\"Enter email for super_admin : \")\n if not re.match(r'[^@]+@[^@]+\\.[^@]+', email):\n print('\\nInvalid email address\\n')\n continue\n ask_email = False\n if not password:\n ask_password = True\n while ask_password:\n password = getpass.getpass(\"Enter password for super_admin : \")\n if len(password) < 8:\n print('\\nPassword should have minimum 8 characters')\n continue\n repassword = getpass.getpass(\"Enter your password again to confirm : \")\n if password != repassword:\n print('\\nPassword did not match')\n continue\n ask_password = False\n create_super_admin(email, password)\n\n\nif __name__ == \"__main__\":\n parser = argparse.ArgumentParser()\n parser.add_argument(\"email\", nargs='?', help=\"The email for super_admin.\", default='')\n parser.add_argument(\n \"password\", nargs='?', help=\"The password for super_admin.\", default=''\n )\n parsed = parser.parse_args()\n with current_app.app_context():\n db.engine.execute('create extension if not exists citext')\n db.create_all()\n stamp()\n create_default_user(parsed.email, parsed.password)\n populate()\n", "path": "create_db.py"}], "after_files": [{"content": "import argparse\nimport getpass\nimport re\n\nfrom flask_migrate import stamp\n\nfrom app.instance import current_app\nfrom app.models import db\nfrom populate_db import populate\nfrom tests.all.integration.auth_helper import create_super_admin\n\n\ndef create_default_user(email, password):\n print(\"Your login is 'super_admin'.\")\n if not email:\n ask_email = True\n while ask_email:\n email = input(\"Enter email for super_admin: \")\n if not re.match(r'[^@]+@[^@]+\\.[^@]+', email):\n print('\\nInvalid email address\\n')\n continue\n ask_email = False\n if not password:\n ask_password = True\n while ask_password:\n password = getpass.getpass(\"Enter password for super_admin: \")\n if len(password) < 8:\n print('\\nPassword should have minimum 8 characters')\n continue\n repassword = getpass.getpass(\"Enter your password again to 
confirm: \")\n if password != repassword:\n print('\\nPassword did not match')\n continue\n ask_password = False\n create_super_admin(email, password)\n\n\nif __name__ == \"__main__\":\n parser = argparse.ArgumentParser()\n parser.add_argument(\"email\", nargs='?', help=\"The email for super_admin.\", default='')\n parser.add_argument(\n \"password\", nargs='?', help=\"The password for super_admin.\", default=''\n )\n parsed = parser.parse_args()\n with current_app.app_context():\n db.engine.execute('create extension if not exists citext')\n db.create_all()\n stamp()\n create_default_user(parsed.email, parsed.password)\n populate()\n", "path": "create_db.py"}]}
| 744 | 253 |
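A quick way to find the remaining " :" occurrences the issue complains about is to scan string literals in the source tree. A rough sketch; the regular expression and the `*.py` glob are heuristics chosen for illustration, not part of the actual fix.

```python
import re
from pathlib import Path

# Match a quoted string literal that contains whitespace immediately before a colon.
pattern = re.compile(r"['\"][^'\"]*\s:[^'\"]*['\"]")


def find_space_before_colon(root="."):
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if pattern.search(line):
                print("{}:{}: {}".format(path, lineno, line.strip()))


find_space_before_colon()
```
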
gh_patches_debug_31190
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-5890
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
@spider=zabka_pl opening hours are missing Sunday data, proposed `opening_hours` are broken
https://www.alltheplaces.xyz/map/#16.47/50.073227/20.037421

https://www.openstreetmap.org/node/4271289403
Opening hours in OSM (`Mo-Sa 06:00-22:00; Su 11:00-20:00`) are correct.
At https://www.zabka.pl/znajdz-sklep it refuses to show Sunday opening hours.
Maybe it is caused by their internal structure? This shops are franchises and maybe they are obligated to be open `Mo-Sa 06:00-22:00` and Sundays can be decided by operator?
Overall it seems that Monday to Saturday data is likely correct.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/zabka_pl.py`
Content:
```
1 import scrapy
2 from scrapy.http import JsonRequest
3
4 from locations.dict_parser import DictParser
5 from locations.hours import OpeningHours
6
7
8 class ZabkaPLSpider(scrapy.Spider):
9 name = "zabka_pl"
10 item_attributes = {"brand": "Żabka", "brand_wikidata": "Q2589061"}
11
12 # URL extracted by observing request made by Żappka Android app (using HTTP Toolkit)
13 start_urls = ["https://partner-api.zabkamobile.pl/v2/shops"]
14
15 def start_requests(self):
16 # Authorization header is hard-coded into the Żappka app and does not appear to change (as of version 3.14.10).
17 headers = {
18 "Authorization": "PartnerKey 424A0B7AD0E9EA136510474D89061BBDC007B9BE5256A638EA28CC19D2BB15CD",
19 }
20 yield JsonRequest(url=self.start_urls[0], headers=headers)
21
22 def parse(self, response):
23 for location in response.json():
24 item = DictParser.parse(location)
25 item["street_address"] = item.pop("addr_full", "")
26 # unset "state" field, it is taken from the "region" field which is some internal Żabka ID
27 item["state"] = None
28 item["opening_hours"] = OpeningHours()
29
30 # Each franchisee is required to be open Mon-Sat with the same hours
31 # But the hours for Sundays are set in the "nonTradingDays" field, which
32 # contains the opening hours for each specific Sunday.
33 item["opening_hours"].add_days_range(
34 ["Mo", "Tu", "We", "Th", "Fr", "Sa"], location["openTime"], location["closeTime"]
35 )
36 yield item
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/zabka_pl.py b/locations/spiders/zabka_pl.py
--- a/locations/spiders/zabka_pl.py
+++ b/locations/spiders/zabka_pl.py
@@ -1,3 +1,5 @@
+from datetime import datetime
+
import scrapy
from scrapy.http import JsonRequest
@@ -20,6 +22,7 @@
yield JsonRequest(url=self.start_urls[0], headers=headers)
def parse(self, response):
+ today = datetime.now()
for location in response.json():
item = DictParser.parse(location)
item["street_address"] = item.pop("addr_full", "")
@@ -33,4 +36,25 @@
item["opening_hours"].add_days_range(
["Mo", "Tu", "We", "Th", "Fr", "Sa"], location["openTime"], location["closeTime"]
)
+
+ if location["nonTradingDays"]:
+ sunday_open = None
+ sunday_close = None
+ for rule in location["nonTradingDays"]:
+ d = datetime.strptime(rule["date"], "%Y-%m-%d")
+ if d.weekday() != 6 or d < today:
+ continue # In the past, ignore
+ if sunday_open is None:
+ sunday_open = rule["openTime"]
+ sunday_close = rule["closeTime"]
+ else:
+ if sunday_open != rule["openTime"] or sunday_close != rule["closeTime"]:
+ self.crawler.stats.inc_value("atp/zabka_pl/nonTradingDays/mismatching")
+ break # Mismatching future Sundays, skip
+ else:
+ self.crawler.stats.inc_value("atp/zabka_pl/nonTradingDays/fine")
+ item["opening_hours"].add_range("Su", sunday_open, sunday_close)
+ else:
+ self.crawler.stats.inc_value("atp/zabka_pl/nonTradingDays/missing") # Sunday closed? Missing data?
+
yield item
|
{"golden_diff": "diff --git a/locations/spiders/zabka_pl.py b/locations/spiders/zabka_pl.py\n--- a/locations/spiders/zabka_pl.py\n+++ b/locations/spiders/zabka_pl.py\n@@ -1,3 +1,5 @@\n+from datetime import datetime\n+\n import scrapy\n from scrapy.http import JsonRequest\n \n@@ -20,6 +22,7 @@\n yield JsonRequest(url=self.start_urls[0], headers=headers)\n \n def parse(self, response):\n+ today = datetime.now()\n for location in response.json():\n item = DictParser.parse(location)\n item[\"street_address\"] = item.pop(\"addr_full\", \"\")\n@@ -33,4 +36,25 @@\n item[\"opening_hours\"].add_days_range(\n [\"Mo\", \"Tu\", \"We\", \"Th\", \"Fr\", \"Sa\"], location[\"openTime\"], location[\"closeTime\"]\n )\n+\n+ if location[\"nonTradingDays\"]:\n+ sunday_open = None\n+ sunday_close = None\n+ for rule in location[\"nonTradingDays\"]:\n+ d = datetime.strptime(rule[\"date\"], \"%Y-%m-%d\")\n+ if d.weekday() != 6 or d < today:\n+ continue # In the past, ignore\n+ if sunday_open is None:\n+ sunday_open = rule[\"openTime\"]\n+ sunday_close = rule[\"closeTime\"]\n+ else:\n+ if sunday_open != rule[\"openTime\"] or sunday_close != rule[\"closeTime\"]:\n+ self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/mismatching\")\n+ break # Mismatching future Sundays, skip\n+ else:\n+ self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/fine\")\n+ item[\"opening_hours\"].add_range(\"Su\", sunday_open, sunday_close)\n+ else:\n+ self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/missing\") # Sunday closed? Missing data?\n+\n yield item\n", "issue": "@spider=zabka_pl opening hours are missing Sunday data, proposed `opening_hours` are broken\nhttps://www.alltheplaces.xyz/map/#16.47/50.073227/20.037421\r\n\r\n\r\n\r\nhttps://www.openstreetmap.org/node/4271289403\r\n\r\nOpening hours in OSM (`Mo-Sa 06:00-22:00; Su 11:00-20:00`) are correct.\r\n\r\nAt https://www.zabka.pl/znajdz-sklep it refuses to show Sunday opening hours.\r\n\r\nMaybe it is caused by their internal structure? 
This shops are franchises and maybe they are obligated to be open `Mo-Sa 06:00-22:00` and Sundays can be decided by operator?\r\n\r\nOverall it seems that Monday to Saturday data is likely correct.\n", "before_files": [{"content": "import scrapy\nfrom scrapy.http import JsonRequest\n\nfrom locations.dict_parser import DictParser\nfrom locations.hours import OpeningHours\n\n\nclass ZabkaPLSpider(scrapy.Spider):\n name = \"zabka_pl\"\n item_attributes = {\"brand\": \"\u017babka\", \"brand_wikidata\": \"Q2589061\"}\n\n # URL extracted by observing request made by \u017bappka Android app (using HTTP Toolkit)\n start_urls = [\"https://partner-api.zabkamobile.pl/v2/shops\"]\n\n def start_requests(self):\n # Authorization header is hard-coded into the \u017bappka app and does not appear to change (as of version 3.14.10).\n headers = {\n \"Authorization\": \"PartnerKey 424A0B7AD0E9EA136510474D89061BBDC007B9BE5256A638EA28CC19D2BB15CD\",\n }\n yield JsonRequest(url=self.start_urls[0], headers=headers)\n\n def parse(self, response):\n for location in response.json():\n item = DictParser.parse(location)\n item[\"street_address\"] = item.pop(\"addr_full\", \"\")\n # unset \"state\" field, it is taken from the \"region\" field which is some internal \u017babka ID\n item[\"state\"] = None\n item[\"opening_hours\"] = OpeningHours()\n\n # Each franchisee is required to be open Mon-Sat with the same hours\n # But the hours for Sundays are set in the \"nonTradingDays\" field, which\n # contains the opening hours for each specific Sunday.\n item[\"opening_hours\"].add_days_range(\n [\"Mo\", \"Tu\", \"We\", \"Th\", \"Fr\", \"Sa\"], location[\"openTime\"], location[\"closeTime\"]\n )\n yield item\n", "path": "locations/spiders/zabka_pl.py"}], "after_files": [{"content": "from datetime import datetime\n\nimport scrapy\nfrom scrapy.http import JsonRequest\n\nfrom locations.dict_parser import DictParser\nfrom locations.hours import OpeningHours\n\n\nclass ZabkaPLSpider(scrapy.Spider):\n name = \"zabka_pl\"\n item_attributes = {\"brand\": \"\u017babka\", \"brand_wikidata\": \"Q2589061\"}\n\n # URL extracted by observing request made by \u017bappka Android app (using HTTP Toolkit)\n start_urls = [\"https://partner-api.zabkamobile.pl/v2/shops\"]\n\n def start_requests(self):\n # Authorization header is hard-coded into the \u017bappka app and does not appear to change (as of version 3.14.10).\n headers = {\n \"Authorization\": \"PartnerKey 424A0B7AD0E9EA136510474D89061BBDC007B9BE5256A638EA28CC19D2BB15CD\",\n }\n yield JsonRequest(url=self.start_urls[0], headers=headers)\n\n def parse(self, response):\n today = datetime.now()\n for location in response.json():\n item = DictParser.parse(location)\n item[\"street_address\"] = item.pop(\"addr_full\", \"\")\n # unset \"state\" field, it is taken from the \"region\" field which is some internal \u017babka ID\n item[\"state\"] = None\n item[\"opening_hours\"] = OpeningHours()\n\n # Each franchisee is required to be open Mon-Sat with the same hours\n # But the hours for Sundays are set in the \"nonTradingDays\" field, which\n # contains the opening hours for each specific Sunday.\n item[\"opening_hours\"].add_days_range(\n [\"Mo\", \"Tu\", \"We\", \"Th\", \"Fr\", \"Sa\"], location[\"openTime\"], location[\"closeTime\"]\n )\n\n if location[\"nonTradingDays\"]:\n sunday_open = None\n sunday_close = None\n for rule in location[\"nonTradingDays\"]:\n d = datetime.strptime(rule[\"date\"], \"%Y-%m-%d\")\n if d.weekday() != 6 or d < today:\n continue # In the past, ignore\n if 
sunday_open is None:\n sunday_open = rule[\"openTime\"]\n sunday_close = rule[\"closeTime\"]\n else:\n if sunday_open != rule[\"openTime\"] or sunday_close != rule[\"closeTime\"]:\n self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/mismatching\")\n break # Mismatching future Sundays, skip\n else:\n self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/fine\")\n item[\"opening_hours\"].add_range(\"Su\", sunday_open, sunday_close)\n else:\n self.crawler.stats.inc_value(\"atp/zabka_pl/nonTradingDays/missing\") # Sunday closed? Missing data?\n\n yield item\n", "path": "locations/spiders/zabka_pl.py"}]}
| 993 | 443 |
gh_patches_debug_4671
|
rasdani/github-patches
|
git_diff
|
holoviz__panel-609
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Replacing periodic.period doesn't change period
```
import panel as pn
pn.extension()
test = pn.widgets.Toggle(name='Test')
def toggle():
test.value = not test.value
periodic = test.add_periodic_callback(toggle, period=1000)
test
```
```
periodic.period = 20000 #this doesnt update the periodic milliseconds
```
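The behaviour suggests the new `period` value is never pushed to the already-scheduled tornado/bokeh callback; watching the parameter and restarting the callback when it changes fixes that (this is what the patch further below does). A trimmed sketch of the idea:

```python
import param

class PeriodicCallback(param.Parameterized):
    # Trimmed to the new bit only; `_cb`, `start()` and `stop()` are the existing
    # attributes/methods of panel/callbacks.py shown below this issue.
    period = param.Integer(default=500)

    @param.depends('period', watch=True)
    def _update_period(self):
        if self._cb:      # a callback is currently scheduled
            self.stop()   # tear down the old tornado/bokeh callback
            self.start()  # reschedule it so the new period takes effect
```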
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/callbacks.py`
Content:
```
1 """
2 Defines callbacks to be executed on a thread or by scheduling it
3 on a running bokeh server.
4 """
5 from __future__ import absolute_import, division, unicode_literals
6
7
8 import time
9 import param
10
11 from bokeh.io import curdoc as _curdoc
12
13
14 class PeriodicCallback(param.Parameterized):
15 """
16 Periodic encapsulates a periodic callback which will run both
17 in tornado based notebook environments and on bokeh server. By
18 default the callback will run until the stop method is called,
19 but count and timeout values can be set to limit the number of
20 executions or the maximum length of time for which the callback
21 will run.
22 """
23
24 callback = param.Callable(doc="""
25 The callback to execute periodically.""")
26
27 count = param.Integer(default=None, doc="""
28 Number of times the callback will be executed, by default
29 this is unlimited.""")
30
31 period = param.Integer(default=500, doc="""
32 Period in milliseconds at which the callback is executed.""")
33
34 timeout = param.Integer(default=None, doc="""
35 Timeout in seconds from the start time at which the callback
36 expires""")
37
38 def __init__(self, **params):
39 super(PeriodicCallback, self).__init__(**params)
40 self._counter = 0
41 self._start_time = None
42 self._timeout = None
43 self._cb = None
44 self._doc = None
45
46 def start(self):
47 if self._cb is not None:
48 raise RuntimeError('Periodic callback has already started.')
49 self._start_time = time.time()
50 if _curdoc().session_context:
51 self._doc = _curdoc()
52 self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)
53 else:
54 from tornado.ioloop import PeriodicCallback
55 self._cb = PeriodicCallback(self._periodic_callback, self.period)
56 self._cb.start()
57
58 def _periodic_callback(self):
59 self.callback()
60 self._counter += 1
61 if self._timeout is not None:
62 dt = (time.time() - self._start_time)
63 if dt > self._timeout:
64 self.stop()
65 if self._counter == self.count:
66 self.stop()
67
68 def stop(self):
69 self._counter = 0
70 self._timeout = None
71 if self._doc:
72 self._doc.remove_periodic_callback(self._cb)
73 else:
74 self._cb.stop()
75 self._cb = None
76
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/panel/callbacks.py b/panel/callbacks.py
--- a/panel/callbacks.py
+++ b/panel/callbacks.py
@@ -55,6 +55,12 @@
self._cb = PeriodicCallback(self._periodic_callback, self.period)
self._cb.start()
+ @param.depends('period', watch=True)
+ def _update_period(self):
+ if self._cb:
+ self.stop()
+ self.start()
+
def _periodic_callback(self):
self.callback()
self._counter += 1
|
{"golden_diff": "diff --git a/panel/callbacks.py b/panel/callbacks.py\n--- a/panel/callbacks.py\n+++ b/panel/callbacks.py\n@@ -55,6 +55,12 @@\n self._cb = PeriodicCallback(self._periodic_callback, self.period)\n self._cb.start()\n \n+ @param.depends('period', watch=True)\n+ def _update_period(self):\n+ if self._cb:\n+ self.stop()\n+ self.start()\n+\n def _periodic_callback(self):\n self.callback()\n self._counter += 1\n", "issue": "Replacing periodic.period doesn't change period\n```\r\nimport panel as pn\r\npn.extension()\r\n\r\ntest = pn.widgets.Toggle(name='Test')\r\n\r\ndef toggle():\r\n test.value = not test.value\r\n\r\nperiodic = test.add_periodic_callback(toggle, period=1000)\r\ntest\r\n```\r\n\r\n```\r\nperiodic.period = 20000 #this doesnt update the periodic milliseconds\r\n```\n", "before_files": [{"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nfrom __future__ import absolute_import, division, unicode_literals\n\n\nimport time\nimport param\n\nfrom bokeh.io import curdoc as _curdoc\n\n\nclass PeriodicCallback(param.Parameterized):\n \"\"\"\n Periodic encapsulates a periodic callback which will run both\n in tornado based notebook environments and on bokeh server. By\n default the callback will run until the stop method is called,\n but count and timeout values can be set to limit the number of\n executions or the maximum length of time for which the callback\n will run.\n \"\"\"\n\n callback = param.Callable(doc=\"\"\"\n The callback to execute periodically.\"\"\")\n\n count = param.Integer(default=None, doc=\"\"\"\n Number of times the callback will be executed, by default\n this is unlimited.\"\"\")\n\n period = param.Integer(default=500, doc=\"\"\"\n Period in milliseconds at which the callback is executed.\"\"\")\n\n timeout = param.Integer(default=None, doc=\"\"\"\n Timeout in seconds from the start time at which the callback\n expires\"\"\")\n\n def __init__(self, **params):\n super(PeriodicCallback, self).__init__(**params)\n self._counter = 0\n self._start_time = None\n self._timeout = None\n self._cb = None\n self._doc = None\n\n def start(self):\n if self._cb is not None:\n raise RuntimeError('Periodic callback has already started.')\n self._start_time = time.time()\n if _curdoc().session_context:\n self._doc = _curdoc()\n self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)\n else:\n from tornado.ioloop import PeriodicCallback\n self._cb = PeriodicCallback(self._periodic_callback, self.period)\n self._cb.start()\n\n def _periodic_callback(self):\n self.callback()\n self._counter += 1\n if self._timeout is not None:\n dt = (time.time() - self._start_time)\n if dt > self._timeout:\n self.stop()\n if self._counter == self.count:\n self.stop()\n\n def stop(self):\n self._counter = 0\n self._timeout = None\n if self._doc:\n self._doc.remove_periodic_callback(self._cb)\n else:\n self._cb.stop()\n self._cb = None\n\n", "path": "panel/callbacks.py"}], "after_files": [{"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nfrom __future__ import absolute_import, division, unicode_literals\n\n\nimport time\nimport param\n\nfrom bokeh.io import curdoc as _curdoc\n\n\nclass PeriodicCallback(param.Parameterized):\n \"\"\"\n Periodic encapsulates a periodic callback which will run both\n in tornado based notebook environments and on bokeh server. 
By\n default the callback will run until the stop method is called,\n but count and timeout values can be set to limit the number of\n executions or the maximum length of time for which the callback\n will run.\n \"\"\"\n\n callback = param.Callable(doc=\"\"\"\n The callback to execute periodically.\"\"\")\n\n count = param.Integer(default=None, doc=\"\"\"\n Number of times the callback will be executed, by default\n this is unlimited.\"\"\")\n\n period = param.Integer(default=500, doc=\"\"\"\n Period in milliseconds at which the callback is executed.\"\"\")\n\n timeout = param.Integer(default=None, doc=\"\"\"\n Timeout in seconds from the start time at which the callback\n expires\"\"\")\n\n def __init__(self, **params):\n super(PeriodicCallback, self).__init__(**params)\n self._counter = 0\n self._start_time = None\n self._timeout = None\n self._cb = None\n self._doc = None\n\n def start(self):\n if self._cb is not None:\n raise RuntimeError('Periodic callback has already started.')\n self._start_time = time.time()\n if _curdoc().session_context:\n self._doc = _curdoc()\n self._cb = self._doc.add_periodic_callback(self._periodic_callback, self.period)\n else:\n from tornado.ioloop import PeriodicCallback\n self._cb = PeriodicCallback(self._periodic_callback, self.period)\n self._cb.start()\n\n @param.depends('period', watch=True)\n def _update_period(self):\n if self._cb:\n self.stop()\n self.start()\n\n def _periodic_callback(self):\n self.callback()\n self._counter += 1\n if self._timeout is not None:\n dt = (time.time() - self._start_time)\n if dt > self._timeout:\n self.stop()\n if self._counter == self.count:\n self.stop()\n\n def stop(self):\n self._counter = 0\n self._timeout = None\n if self._doc:\n self._doc.remove_periodic_callback(self._cb)\n else:\n self._cb.stop()\n self._cb = None\n\n", "path": "panel/callbacks.py"}]}
| 1,029 | 127 |
gh_patches_debug_16665
|
rasdani/github-patches
|
git_diff
|
opendatacube__datacube-core-1446
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Feature request: make netcdf4 dependency optional
Right now the netcdf4 library is a non-optional dependency of datacube, but its use within the library is actually fairly limited. With the deprecation of the ingestion step and the move towards cloud, netcdf data sources are becoming less relevant in many deployments of the datacube. netcdf4 is a rather heavy dependency both in terms of "disk used" and complexity of installation. Having a lean dependency set is particularly beneficial for things like cloud deployments (the AWS Lambda layer limit is 250MB, for example).
I believe that in the case of netcdf4, making it optional is relatively low cost and the benefit is significant. Ideally, of course, this would require automated testing across different Python environments, and that can be tricky and complex to set up, but we can start with just a manual test.
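For illustration only, the usual way a code base copes with a dependency that becomes optional is an import guard around the driver that needs it. The sketch below is hypothetical — it is not datacube's actual code, and only the `reader_driver_init` entry-point name is taken from the setup file shown further down:

```python
# Hypothetical sketch of an optional-import guard; the function body is invented
# for illustration and is not datacube's real driver code.
try:
    import netCDF4  # provided by the optional netcdf4 package
    _HAVE_NETCDF4 = True
except ImportError:
    _HAVE_NETCDF4 = False


def reader_driver_init():
    if not _HAVE_NETCDF4:
        raise RuntimeError(
            "NetCDF support requires the optional netcdf4 package; "
            "install it (or the project's netcdf extra) to enable this driver."
        )
    # ... build and return the NetCDF reader driver here ...
```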
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 from setuptools import setup, find_packages
4
5 tests_require = [
6 'hypothesis',
7 'pycodestyle',
8 'pylint',
9 'pytest',
10 'pytest-cov',
11 'pytest-timeout',
12 'pytest-httpserver',
13 'moto',
14 ]
15 doc_require = [
16 'Sphinx',
17 'sphinx_autodoc_typehints', # Propagate mypy info into docs
18 'sphinx-click',
19 'recommonmark',
20 'autodocsumm',
21 'beautifulsoup4',
22 'nbsphinx',
23 'pydata-sphinx-theme==0.9.0',
24 ]
25
26 extras_require = {
27 'performance': ['ciso8601', 'bottleneck'],
28 'distributed': ['distributed', 'dask[distributed]'],
29 'doc': doc_require,
30 's3': ['boto3', 'botocore'],
31 'test': tests_require,
32 'cf': ['compliance-checker>=4.0.0'],
33 }
34
35 extras_require['dev'] = sorted(set(sum([extras_require[k] for k in [
36 'test',
37 'doc',
38 'performance',
39 's3',
40 'distributed',
41 ]], [])))
42
43 # An 'all' option, following ipython naming conventions.
44 extras_require['all'] = sorted(set(sum(extras_require.values(), [])))
45
46 extra_plugins = dict(read=[], write=[], index=[])
47
48 setup(
49 name='datacube',
50 python_requires='>=3.8.0',
51
52 url='https://github.com/opendatacube/datacube-core',
53 author='Open Data Cube',
54 maintainer='Open Data Cube',
55 maintainer_email='',
56 description='An analysis environment for satellite and other earth observation data',
57 long_description=open('README.rst').read(),
58 long_description_content_type='text/x-rst',
59 license='Apache License 2.0',
60 classifiers=[
61 "Development Status :: 4 - Beta",
62 "Intended Audience :: Developers",
63 "Intended Audience :: Science/Research",
64 "License :: OSI Approved :: Apache Software License",
65 "Natural Language :: English",
66 "Operating System :: MacOS :: MacOS X",
67 "Operating System :: POSIX",
68 "Operating System :: POSIX :: BSD",
69 "Operating System :: POSIX :: Linux",
70 "Operating System :: Microsoft :: Windows",
71 "Programming Language :: Python",
72 "Programming Language :: Python :: 3",
73 "Programming Language :: Python :: 3.8",
74 "Programming Language :: Python :: 3.9",
75 "Programming Language :: Python :: 3.10",
76 "Topic :: Scientific/Engineering :: GIS",
77 "Topic :: Scientific/Engineering :: Information Analysis",
78 ],
79
80 packages=find_packages(
81 exclude=('tests', 'tests.*',
82 'integration_tests', 'integration_tests.*')
83 ),
84 package_data={
85 '': ['*.yaml', '*/*.yaml'],
86 'datacube': ['py.typed'],
87 },
88 scripts=[],
89 install_requires=[
90 'affine',
91 'attrs>=18.1',
92 'pyproj>=2.5',
93 'shapely>=2.0',
94 'cachetools',
95 'click>=5.0',
96 'cloudpickle>=0.4',
97 'dask[array]',
98 'distributed',
99 'jsonschema',
100 'netcdf4',
101 'numpy',
102 'psycopg2',
103 'lark',
104 'pandas',
105 'python-dateutil',
106 'pyyaml',
107 'rasterio>=1.3.2', # Warping broken in 1.3.0 and 1.3.1
108 'ruamel.yaml',
109 'sqlalchemy>=2.0', # GeoAlchemy2 requires >=1.4. SqlAlchemy2 *may* work but has not been tested yet.
110 'GeoAlchemy2',
111 'toolz',
112 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost
113 'packaging',
114 'odc-geo',
115 'deprecat',
116 ],
117 extras_require=extras_require,
118 tests_require=tests_require,
119
120 entry_points={
121 'console_scripts': [
122 'datacube = datacube.scripts.cli_app:cli',
123 'datacube-search = datacube.scripts.search_tool:cli',
124 'datacube-worker = datacube.execution.worker:main',
125 ],
126 'datacube.plugins.io.read': [
127 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',
128 *extra_plugins['read'],
129 ],
130 'datacube.plugins.io.write': [
131 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',
132 *extra_plugins['write'],
133 ],
134 'datacube.plugins.index': [
135 'default = datacube.index.postgres.index:index_driver_init',
136 'null = datacube.index.null.index:index_driver_init',
137 'memory = datacube.index.memory.index:index_driver_init',
138 'postgis = datacube.index.postgis.index:index_driver_init',
139 *extra_plugins['index'],
140 ],
141 },
142 )
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -30,6 +30,7 @@
's3': ['boto3', 'botocore'],
'test': tests_require,
'cf': ['compliance-checker>=4.0.0'],
+ 'netcdf': ['netcdf4'],
}
extras_require['dev'] = sorted(set(sum([extras_require[k] for k in [
@@ -38,6 +39,7 @@
'performance',
's3',
'distributed',
+ 'netcdf',
]], [])))
# An 'all' option, following ipython naming conventions.
@@ -97,7 +99,6 @@
'dask[array]',
'distributed',
'jsonschema',
- 'netcdf4',
'numpy',
'psycopg2',
'lark',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -30,6 +30,7 @@\n 's3': ['boto3', 'botocore'],\n 'test': tests_require,\n 'cf': ['compliance-checker>=4.0.0'],\n+ 'netcdf': ['netcdf4'],\n }\n \n extras_require['dev'] = sorted(set(sum([extras_require[k] for k in [\n@@ -38,6 +39,7 @@\n 'performance',\n 's3',\n 'distributed',\n+ 'netcdf',\n ]], [])))\n \n # An 'all' option, following ipython naming conventions.\n@@ -97,7 +99,6 @@\n 'dask[array]',\n 'distributed',\n 'jsonschema',\n- 'netcdf4',\n 'numpy',\n 'psycopg2',\n 'lark',\n", "issue": "Feature request: make netcdf4 dependency optional\nRight now netcdf4 library is a non-optional dependency of datacube, but it's use within the library is actually fairly limited. With deprecation of ingestion step and move towards cloud, netcdf data sources are becoming less relevant in many deployments of the datacube. netcdf4 is a rather heavy dependency both in terms of \"disk used\" and complexity of installation. Having lean dependency set is particularly beneficial for things like cloud deployments (AWS lambda layer limit is 250Mb for example).\r\n\r\nI believe that in the case of netdcf4 making it optional is relatively low cost and the benefit is significant. Ideally of course this would require automated testing across different python environments and that can be tricky and complex to setup, but we can start with just a manual test.\n", "before_files": [{"content": "#!/usr/bin/env python\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'hypothesis',\n 'pycodestyle',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-timeout',\n 'pytest-httpserver',\n 'moto',\n]\ndoc_require = [\n 'Sphinx',\n 'sphinx_autodoc_typehints', # Propagate mypy info into docs\n 'sphinx-click',\n 'recommonmark',\n 'autodocsumm',\n 'beautifulsoup4',\n 'nbsphinx',\n 'pydata-sphinx-theme==0.9.0',\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'doc': doc_require,\n 's3': ['boto3', 'botocore'],\n 'test': tests_require,\n 'cf': ['compliance-checker>=4.0.0'],\n}\n\nextras_require['dev'] = sorted(set(sum([extras_require[k] for k in [\n 'test',\n 'doc',\n 'performance',\n 's3',\n 'distributed',\n]], [])))\n\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nextra_plugins = dict(read=[], write=[], index=[])\n\nsetup(\n name='datacube',\n python_requires='>=3.8.0',\n\n url='https://github.com/opendatacube/datacube-core',\n author='Open Data Cube',\n maintainer='Open Data Cube',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n long_description=open('README.rst').read(),\n long_description_content_type='text/x-rst',\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: 
Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n 'datacube': ['py.typed'],\n },\n scripts=[],\n install_requires=[\n 'affine',\n 'attrs>=18.1',\n 'pyproj>=2.5',\n 'shapely>=2.0',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'distributed',\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n 'psycopg2',\n 'lark',\n 'pandas',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio>=1.3.2', # Warping broken in 1.3.0 and 1.3.1\n 'ruamel.yaml',\n 'sqlalchemy>=2.0', # GeoAlchemy2 requires >=1.4. SqlAlchemy2 *may* work but has not been tested yet.\n 'GeoAlchemy2',\n 'toolz',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n 'packaging',\n 'odc-geo',\n 'deprecat',\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube-worker = datacube.execution.worker:main',\n ],\n 'datacube.plugins.io.read': [\n 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',\n *extra_plugins['read'],\n ],\n 'datacube.plugins.io.write': [\n 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',\n *extra_plugins['write'],\n ],\n 'datacube.plugins.index': [\n 'default = datacube.index.postgres.index:index_driver_init',\n 'null = datacube.index.null.index:index_driver_init',\n 'memory = datacube.index.memory.index:index_driver_init',\n 'postgis = datacube.index.postgis.index:index_driver_init',\n *extra_plugins['index'],\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'hypothesis',\n 'pycodestyle',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-timeout',\n 'pytest-httpserver',\n 'moto',\n]\ndoc_require = [\n 'Sphinx',\n 'sphinx_autodoc_typehints', # Propagate mypy info into docs\n 'sphinx-click',\n 'recommonmark',\n 'autodocsumm',\n 'beautifulsoup4',\n 'nbsphinx',\n 'pydata-sphinx-theme==0.9.0',\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'doc': doc_require,\n 's3': ['boto3', 'botocore'],\n 'test': tests_require,\n 'cf': ['compliance-checker>=4.0.0'],\n 'netcdf': ['netcdf4'],\n}\n\nextras_require['dev'] = sorted(set(sum([extras_require[k] for k in [\n 'test',\n 'doc',\n 'performance',\n 's3',\n 'distributed',\n 'netcdf',\n]], [])))\n\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nextra_plugins = dict(read=[], write=[], index=[])\n\nsetup(\n name='datacube',\n python_requires='>=3.8.0',\n\n url='https://github.com/opendatacube/datacube-core',\n author='Open Data Cube',\n maintainer='Open Data Cube',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n long_description=open('README.rst').read(),\n long_description_content_type='text/x-rst',\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: 
POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n 'datacube': ['py.typed'],\n },\n scripts=[],\n install_requires=[\n 'affine',\n 'attrs>=18.1',\n 'pyproj>=2.5',\n 'shapely>=2.0',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'distributed',\n 'jsonschema',\n 'numpy',\n 'psycopg2',\n 'lark',\n 'pandas',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio>=1.3.2', # Warping broken in 1.3.0 and 1.3.1\n 'ruamel.yaml',\n 'sqlalchemy>=2.0', # GeoAlchemy2 requires >=1.4. SqlAlchemy2 *may* work but has not been tested yet.\n 'GeoAlchemy2',\n 'toolz',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n 'packaging',\n 'odc-geo',\n 'deprecat',\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube-worker = datacube.execution.worker:main',\n ],\n 'datacube.plugins.io.read': [\n 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',\n *extra_plugins['read'],\n ],\n 'datacube.plugins.io.write': [\n 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',\n *extra_plugins['write'],\n ],\n 'datacube.plugins.index': [\n 'default = datacube.index.postgres.index:index_driver_init',\n 'null = datacube.index.null.index:index_driver_init',\n 'memory = datacube.index.memory.index:index_driver_init',\n 'postgis = datacube.index.postgis.index:index_driver_init',\n *extra_plugins['index'],\n ],\n },\n)\n", "path": "setup.py"}]}
| 1,843 | 199 |
gh_patches_debug_36254
|
rasdani/github-patches
|
git_diff
|
open-telemetry__opentelemetry-python-494
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Author a full getting started guide
As part of the final beta announce on March 30th, we should have a full getting started guide, similar to this one for js:
https://github.com/open-telemetry/opentelemetry-js/blob/master/getting-started/README.md
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/trace_example.py`
Content:
```
1 from opentelemetry import trace
2 from opentelemetry.sdk.trace import TracerProvider
3 from opentelemetry.sdk.trace.export import (
4 ConsoleSpanExporter,
5 SimpleExportSpanProcessor,
6 )
7
8 trace.set_preferred_tracer_provider_implementation(lambda T: TracerProvider())
9 trace.tracer_provider().add_span_processor(
10 SimpleExportSpanProcessor(ConsoleSpanExporter())
11 )
12
13 tracer = trace.get_tracer(__name__)
14
15 with tracer.start_as_current_span("foo"):
16 with tracer.start_as_current_span("bar"):
17 with tracer.start_as_current_span("baz"):
18 print("Hello world from OpenTelemetry Python!")
19
```
Path: `docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py`
Content:
```
1 # Copyright 2019, OpenTelemetry Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 #
15 """
16 This module serves as an example to integrate with flask, using
17 the requests library to perform downstream requests
18 """
19 import flask
20 import pkg_resources
21 import requests
22
23 import opentelemetry.ext.http_requests
24 from opentelemetry import trace
25 from opentelemetry.ext.flask import instrument_app
26 from opentelemetry.sdk.trace import TracerProvider
27
28
29 def configure_opentelemetry(flask_app: flask.Flask):
30 """Configure a flask application to use OpenTelemetry.
31
32 This activates the specific components:
33
34 * sets tracer to the SDK's Tracer
35 * enables requests integration on the Tracer
36 * uses a WSGI middleware to enable configuration
37 """
38 # Start by configuring all objects required to ensure a complete end to end
39 # workflow.
40 trace.set_tracer_provider(TracerProvider())
41
42 # Next, we need to configure how the values that are used by traces and
43 # metrics are propagated (such as what specific headers carry this value).
44 # Integrations are the glue that binds the OpenTelemetry API and the
45 # frameworks and libraries that are used together, automatically creating
46 # Spans and propagating context as appropriate.
47 opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())
48 instrument_app(flask_app)
49
50
51 app = flask.Flask(__name__)
52
53
54 @app.route("/")
55 def hello():
56 # Emit a trace that measures how long the sleep takes
57 version = pkg_resources.get_distribution(
58 "opentelemetry-example-app"
59 ).version
60 tracer = trace.get_tracer(__name__, version)
61 with tracer.start_as_current_span("example-request"):
62 requests.get("http://www.example.com")
63 return "hello"
64
65
66 configure_opentelemetry(app)
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py b/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py
--- a/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py
+++ b/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py
@@ -17,50 +17,31 @@
the requests library to perform downstream requests
"""
import flask
-import pkg_resources
import requests
import opentelemetry.ext.http_requests
from opentelemetry import trace
from opentelemetry.ext.flask import instrument_app
from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import ConsoleSpanExporter
+from opentelemetry.sdk.trace.export import SimpleExportSpanProcessor
-
-def configure_opentelemetry(flask_app: flask.Flask):
- """Configure a flask application to use OpenTelemetry.
-
- This activates the specific components:
-
- * sets tracer to the SDK's Tracer
- * enables requests integration on the Tracer
- * uses a WSGI middleware to enable configuration
- """
- # Start by configuring all objects required to ensure a complete end to end
- # workflow.
- trace.set_tracer_provider(TracerProvider())
-
- # Next, we need to configure how the values that are used by traces and
- # metrics are propagated (such as what specific headers carry this value).
- # Integrations are the glue that binds the OpenTelemetry API and the
- # frameworks and libraries that are used together, automatically creating
- # Spans and propagating context as appropriate.
- opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())
- instrument_app(flask_app)
-
+trace.set_tracer_provider(TracerProvider())
+trace.get_tracer_provider().add_span_processor(
+ SimpleExportSpanProcessor(ConsoleSpanExporter())
+)
app = flask.Flask(__name__)
+opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())
+instrument_app(app)
@app.route("/")
def hello():
- # Emit a trace that measures how long the sleep takes
- version = pkg_resources.get_distribution(
- "opentelemetry-example-app"
- ).version
- tracer = trace.get_tracer(__name__, version)
+ tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("example-request"):
requests.get("http://www.example.com")
return "hello"
-configure_opentelemetry(app)
+app.run(debug=True)
diff --git a/docs/trace_example.py b/docs/trace_example.py
--- a/docs/trace_example.py
+++ b/docs/trace_example.py
@@ -5,8 +5,8 @@
SimpleExportSpanProcessor,
)
-trace.set_preferred_tracer_provider_implementation(lambda T: TracerProvider())
-trace.tracer_provider().add_span_processor(
+trace.set_tracer_provider(TracerProvider())
+trace.get_tracer_provider().add_span_processor(
SimpleExportSpanProcessor(ConsoleSpanExporter())
)
|
{"golden_diff": "diff --git a/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py b/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py\n--- a/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py\n+++ b/docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py\n@@ -17,50 +17,31 @@\n the requests library to perform downstream requests\n \"\"\"\n import flask\n-import pkg_resources\n import requests\n \n import opentelemetry.ext.http_requests\n from opentelemetry import trace\n from opentelemetry.ext.flask import instrument_app\n from opentelemetry.sdk.trace import TracerProvider\n+from opentelemetry.sdk.trace.export import ConsoleSpanExporter\n+from opentelemetry.sdk.trace.export import SimpleExportSpanProcessor\n \n-\n-def configure_opentelemetry(flask_app: flask.Flask):\n- \"\"\"Configure a flask application to use OpenTelemetry.\n-\n- This activates the specific components:\n-\n- * sets tracer to the SDK's Tracer\n- * enables requests integration on the Tracer\n- * uses a WSGI middleware to enable configuration\n- \"\"\"\n- # Start by configuring all objects required to ensure a complete end to end\n- # workflow.\n- trace.set_tracer_provider(TracerProvider())\n-\n- # Next, we need to configure how the values that are used by traces and\n- # metrics are propagated (such as what specific headers carry this value).\n- # Integrations are the glue that binds the OpenTelemetry API and the\n- # frameworks and libraries that are used together, automatically creating\n- # Spans and propagating context as appropriate.\n- opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())\n- instrument_app(flask_app)\n-\n+trace.set_tracer_provider(TracerProvider())\n+trace.get_tracer_provider().add_span_processor(\n+ SimpleExportSpanProcessor(ConsoleSpanExporter())\n+)\n \n app = flask.Flask(__name__)\n+opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())\n+instrument_app(app)\n \n \n @app.route(\"/\")\n def hello():\n- # Emit a trace that measures how long the sleep takes\n- version = pkg_resources.get_distribution(\n- \"opentelemetry-example-app\"\n- ).version\n- tracer = trace.get_tracer(__name__, version)\n+ tracer = trace.get_tracer(__name__)\n with tracer.start_as_current_span(\"example-request\"):\n requests.get(\"http://www.example.com\")\n return \"hello\"\n \n \n-configure_opentelemetry(app)\n+app.run(debug=True)\ndiff --git a/docs/trace_example.py b/docs/trace_example.py\n--- a/docs/trace_example.py\n+++ b/docs/trace_example.py\n@@ -5,8 +5,8 @@\n SimpleExportSpanProcessor,\n )\n \n-trace.set_preferred_tracer_provider_implementation(lambda T: TracerProvider())\n-trace.tracer_provider().add_span_processor(\n+trace.set_tracer_provider(TracerProvider())\n+trace.get_tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n )\n", "issue": "Author a full getting started guide\nAs part of the final beta announce on March 30th, we should have a full getting started guide, similar to this one for js:\r\n\r\nhttps://github.com/open-telemetry/opentelemetry-js/blob/master/getting-started/README.md\r\n\r\n\n", "before_files": [{"content": "from opentelemetry import trace\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import (\n ConsoleSpanExporter,\n SimpleExportSpanProcessor,\n)\n\ntrace.set_preferred_tracer_provider_implementation(lambda T: 
TracerProvider())\ntrace.tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n)\n\ntracer = trace.get_tracer(__name__)\n\nwith tracer.start_as_current_span(\"foo\"):\n with tracer.start_as_current_span(\"bar\"):\n with tracer.start_as_current_span(\"baz\"):\n print(\"Hello world from OpenTelemetry Python!\")\n", "path": "docs/trace_example.py"}, {"content": "# Copyright 2019, OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n#\n\"\"\"\nThis module serves as an example to integrate with flask, using\nthe requests library to perform downstream requests\n\"\"\"\nimport flask\nimport pkg_resources\nimport requests\n\nimport opentelemetry.ext.http_requests\nfrom opentelemetry import trace\nfrom opentelemetry.ext.flask import instrument_app\nfrom opentelemetry.sdk.trace import TracerProvider\n\n\ndef configure_opentelemetry(flask_app: flask.Flask):\n \"\"\"Configure a flask application to use OpenTelemetry.\n\n This activates the specific components:\n\n * sets tracer to the SDK's Tracer\n * enables requests integration on the Tracer\n * uses a WSGI middleware to enable configuration\n \"\"\"\n # Start by configuring all objects required to ensure a complete end to end\n # workflow.\n trace.set_tracer_provider(TracerProvider())\n\n # Next, we need to configure how the values that are used by traces and\n # metrics are propagated (such as what specific headers carry this value).\n # Integrations are the glue that binds the OpenTelemetry API and the\n # frameworks and libraries that are used together, automatically creating\n # Spans and propagating context as appropriate.\n opentelemetry.ext.http_requests.enable(trace.get_tracer_provider())\n instrument_app(flask_app)\n\n\napp = flask.Flask(__name__)\n\n\[email protected](\"/\")\ndef hello():\n # Emit a trace that measures how long the sleep takes\n version = pkg_resources.get_distribution(\n \"opentelemetry-example-app\"\n ).version\n tracer = trace.get_tracer(__name__, version)\n with tracer.start_as_current_span(\"example-request\"):\n requests.get(\"http://www.example.com\")\n return \"hello\"\n\n\nconfigure_opentelemetry(app)\n", "path": "docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py"}], "after_files": [{"content": "from opentelemetry import trace\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import (\n ConsoleSpanExporter,\n SimpleExportSpanProcessor,\n)\n\ntrace.set_tracer_provider(TracerProvider())\ntrace.get_tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n)\n\ntracer = trace.get_tracer(__name__)\n\nwith tracer.start_as_current_span(\"foo\"):\n with tracer.start_as_current_span(\"bar\"):\n with tracer.start_as_current_span(\"baz\"):\n print(\"Hello world from OpenTelemetry Python!\")\n", "path": "docs/trace_example.py"}, {"content": "# Copyright 2019, OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not 
use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n#\n\"\"\"\nThis module serves as an example to integrate with flask, using\nthe requests library to perform downstream requests\n\"\"\"\nimport flask\nimport requests\n\nimport opentelemetry.ext.http_requests\nfrom opentelemetry import trace\nfrom opentelemetry.ext.flask import instrument_app\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter\nfrom opentelemetry.sdk.trace.export import SimpleExportSpanProcessor\n\ntrace.set_tracer_provider(TracerProvider())\ntrace.get_tracer_provider().add_span_processor(\n SimpleExportSpanProcessor(ConsoleSpanExporter())\n)\n\napp = flask.Flask(__name__)\nopentelemetry.ext.http_requests.enable(trace.get_tracer_provider())\ninstrument_app(app)\n\n\[email protected](\"/\")\ndef hello():\n tracer = trace.get_tracer(__name__)\n with tracer.start_as_current_span(\"example-request\"):\n requests.get(\"http://www.example.com\")\n return \"hello\"\n\n\napp.run(debug=True)\n", "path": "docs/examples/opentelemetry-example-app/src/opentelemetry_example_app/flask_example.py"}]}
| 1,128 | 671 |
gh_patches_debug_2148
|
rasdani/github-patches
|
git_diff
|
voxel51__fiftyone-3439
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] Fiftyone v0.21.6 localhost Not found, while v0.21.4 Could not connect session, despite working before 7Aug23
### System information
- **OS Platform and Distribution** (`Windows 11 Pro, build 22621.2134`)
- **Google Chrome** (`Version 115.0.5790.171 (Official Build) (64-bit)`)
- **Python version** (`python --version 3.10.0`)
- **FiftyOne version** (`fiftyone --version 0.21.4`)
- **FiftyOne installed from** (`pip`)
### Commands to reproduce
I have my own custom Python script (including an option to run fiftyone's quickstart) that had been working for weeks, at least until 7 Aug 2023. I'm unable to share the custom scripts.
Today (17 Aug 2023), as I run through installing fiftyone and running my scripts, I encounter the following problems.
I had a few updates to Google Chrome browser in these 10 days too.
### Describe the problem
Here's what i've tried.
Scenario A: fiftyone v0.21.6
Problem: `App launched. Point your browser to http://localhost:5151`. But unable to load App page at localhost 5151 ("Not found" displayed on page)
Solutions tried but did not work: [Registry settings](https://github.com/voxel51/fiftyone/issues/2010) and/or [mimetype](https://github.com/voxel51/fiftyone/issues/2522#issuecomment-1416318362)
Scenario B: fiftyone v0.21.4
Problem: unable to fully display the App (it only shows the fiftyone skeleton page with "Select Dataset", and no layout or data in the middle). The terminal repeatedly gives `could not connect session, retrying in 10 seconds`.
I'm unsure what the cause of the above is and would appreciate your assistance.
### What areas of FiftyOne does this bug affect?
- [x] `App`: FiftyOne application issue
- [ ] `Core`: Core Python library issue
- [ ] `Server`: FiftyOne server issue
### Willingness to contribute
The FiftyOne Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the FiftyOne codebase?
- [ ] Yes. I can contribute a fix for this bug independently
- [ ] Yes. I would be willing to contribute a fix for this bug with guidance from the FiftyOne community
- [x] No. I cannot contribute a bug fix at this time
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `fiftyone/server/app.py`
Content:
```
1 """
2 FiftyOne Server app.
3
4 | Copyright 2017-2023, Voxel51, Inc.
5 | `voxel51.com <https://voxel51.com/>`_
6 |
7 """
8 from datetime import date, datetime
9 import os
10 import pathlib
11
12 import eta.core.utils as etau
13 from starlette.applications import Starlette
14 from starlette.middleware import Middleware
15 from starlette.middleware.base import (
16 BaseHTTPMiddleware,
17 RequestResponseEndpoint,
18 )
19 from starlette.middleware.cors import CORSMiddleware
20 from starlette.requests import Request
21 from starlette.responses import Response
22 from starlette.routing import Mount, Route
23 from starlette.staticfiles import StaticFiles
24 from starlette.types import Scope
25 import strawberry as gql
26
27 import fiftyone as fo
28 import fiftyone.constants as foc
29 from fiftyone.server.context import GraphQL
30 from fiftyone.server.extensions import EndSession
31 from fiftyone.server.mutation import Mutation
32 from fiftyone.server.query import Query
33 from fiftyone.server.routes import routes
34 from fiftyone.server.scalars import Date, DateTime
35
36
37 etau.ensure_dir(os.path.join(os.path.dirname(__file__), "static"))
38
39
40 class Static(StaticFiles):
41 async def get_response(self, path: str, scope: Scope) -> Response:
42 response = await super().get_response(path, scope)
43
44 if response.status_code == 404:
45 path = pathlib.Path(
46 *pathlib.Path(path).parts[2:]
47 ) # strip dataset/{name}
48 response = await super().get_response(path, scope)
49 if response.status_code == 404:
50 full_path, stat_result = self.lookup_path("index.html")
51 return self.file_response(full_path, stat_result, scope)
52
53 return response
54
55
56 class HeadersMiddleware(BaseHTTPMiddleware):
57 async def dispatch(
58 self, request: Request, call_next: RequestResponseEndpoint
59 ) -> Response:
60 response = await call_next(request)
61 response.headers["x-colab-notebook-cache-control"] = "no-cache"
62 return response
63
64
65 schema = gql.Schema(
66 mutation=Mutation,
67 query=Query,
68 extensions=[EndSession],
69 scalar_overrides={
70 date: Date,
71 datetime: DateTime,
72 },
73 )
74
75
76 app = Starlette(
77 middleware=[
78 Middleware(
79 CORSMiddleware,
80 allow_origins=["*"],
81 allow_methods=["GET", "POST", "HEAD", "OPTIONS"],
82 allow_headers=[
83 "access-control-allow-origin",
84 "authorization",
85 "content-type",
86 ],
87 ),
88 Middleware(HeadersMiddleware),
89 ],
90 debug=foc.DEV_INSTALL,
91 routes=[Route(route, endpoint) for route, endpoint in routes]
92 + [
93 Route(
94 "/graphql",
95 GraphQL(
96 schema,
97 graphiql=foc.DEV_INSTALL,
98 ),
99 ),
100 Mount(
101 "/plugins",
102 app=Static(
103 directory=fo.config.plugins_dir,
104 html=True,
105 check_dir=False,
106 follow_symlink=True,
107 ),
108 name="plugins",
109 ),
110 Mount(
111 "/",
112 app=Static(
113 directory=os.path.join(os.path.dirname(__file__), "static"),
114 html=True,
115 follow_symlink=True,
116 ),
117 name="static",
118 ),
119 ],
120 )
121
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/fiftyone/server/app.py b/fiftyone/server/app.py
--- a/fiftyone/server/app.py
+++ b/fiftyone/server/app.py
@@ -112,7 +112,6 @@
app=Static(
directory=os.path.join(os.path.dirname(__file__), "static"),
html=True,
- follow_symlink=True,
),
name="static",
),
|
{"golden_diff": "diff --git a/fiftyone/server/app.py b/fiftyone/server/app.py\n--- a/fiftyone/server/app.py\n+++ b/fiftyone/server/app.py\n@@ -112,7 +112,6 @@\n app=Static(\n directory=os.path.join(os.path.dirname(__file__), \"static\"),\n html=True,\n- follow_symlink=True,\n ),\n name=\"static\",\n ),\n", "issue": "[BUG] Fiftyone v0.21.6 localhost Not found, while v0.21.4 Could not connect session, despite working before 7Aug23\n### System information\r\n\r\n- **OS Platform and Distribution** (`Windows 11 Pro, build 22621.2134`)\r\n- **Google Chrome** (`Version 115.0.5790.171 (Official Build) (64-bit)`)\r\n- **Python version** (`python --version 3.10.0`)\r\n- **FiftyOne version** (`fiftyone --version 0.21.4`)\r\n- **FiftyOne installed from** (`pip`)\r\n\r\n### Commands to reproduce\r\n\r\nI have my own custom Python script (including option to run fiftyone's quickstart) working for weeks at least until 7 Aug 2023. I'm unable to share the custom scripts.\r\n\r\nToday (17 Aug 2023), as I run through installing fiftyone and running my scripts, I encounter the following problems.\r\n\r\nI had a few updates to Google Chrome browser in these 10 days too.\r\n\r\n### Describe the problem\r\n\r\nHere's what i've tried.\r\n\r\nScenario A: fiftyone v0.21.6\r\nProblem: `App launched. Point your browser to http://localhost:5151`. But unable to load App page at localhost 5151 (\"Not found\" displayed on page)\r\nSolutions tried but did not work: [Registry settings](https://github.com/voxel51/fiftyone/issues/2010) and/or [mimetype](https://github.com/voxel51/fiftyone/issues/2522#issuecomment-1416318362)\r\n\r\nScenario B: fiftyone v0.21.4\r\nProblem: unable to fully display the App (it only shows the fiftyone skeleton page with \"Select Dataset\", and no layout and data in the middle). The terminal repeatedly gives `could not connect session, retrying in 10 seconds`.\r\n\r\nI'm unsure what is the cause to the above and would appreciate your assistance. \r\n\r\n### What areas of FiftyOne does this bug affect?\r\n\r\n- [x] `App`: FiftyOne application issue\r\n- [ ] `Core`: Core Python library issue\r\n- [ ] `Server`: FiftyOne server issue\r\n\r\n### Willingness to contribute\r\n\r\nThe FiftyOne Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the FiftyOne codebase?\r\n\r\n- [ ] Yes. I can contribute a fix for this bug independently\r\n- [ ] Yes. I would be willing to contribute a fix for this bug with guidance from the FiftyOne community\r\n- [x] No. 
I cannot contribute a bug fix at this time\r\n\n", "before_files": [{"content": "\"\"\"\nFiftyOne Server app.\n\n| Copyright 2017-2023, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nfrom datetime import date, datetime\nimport os\nimport pathlib\n\nimport eta.core.utils as etau\nfrom starlette.applications import Starlette\nfrom starlette.middleware import Middleware\nfrom starlette.middleware.base import (\n BaseHTTPMiddleware,\n RequestResponseEndpoint,\n)\nfrom starlette.middleware.cors import CORSMiddleware\nfrom starlette.requests import Request\nfrom starlette.responses import Response\nfrom starlette.routing import Mount, Route\nfrom starlette.staticfiles import StaticFiles\nfrom starlette.types import Scope\nimport strawberry as gql\n\nimport fiftyone as fo\nimport fiftyone.constants as foc\nfrom fiftyone.server.context import GraphQL\nfrom fiftyone.server.extensions import EndSession\nfrom fiftyone.server.mutation import Mutation\nfrom fiftyone.server.query import Query\nfrom fiftyone.server.routes import routes\nfrom fiftyone.server.scalars import Date, DateTime\n\n\netau.ensure_dir(os.path.join(os.path.dirname(__file__), \"static\"))\n\n\nclass Static(StaticFiles):\n async def get_response(self, path: str, scope: Scope) -> Response:\n response = await super().get_response(path, scope)\n\n if response.status_code == 404:\n path = pathlib.Path(\n *pathlib.Path(path).parts[2:]\n ) # strip dataset/{name}\n response = await super().get_response(path, scope)\n if response.status_code == 404:\n full_path, stat_result = self.lookup_path(\"index.html\")\n return self.file_response(full_path, stat_result, scope)\n\n return response\n\n\nclass HeadersMiddleware(BaseHTTPMiddleware):\n async def dispatch(\n self, request: Request, call_next: RequestResponseEndpoint\n ) -> Response:\n response = await call_next(request)\n response.headers[\"x-colab-notebook-cache-control\"] = \"no-cache\"\n return response\n\n\nschema = gql.Schema(\n mutation=Mutation,\n query=Query,\n extensions=[EndSession],\n scalar_overrides={\n date: Date,\n datetime: DateTime,\n },\n)\n\n\napp = Starlette(\n middleware=[\n Middleware(\n CORSMiddleware,\n allow_origins=[\"*\"],\n allow_methods=[\"GET\", \"POST\", \"HEAD\", \"OPTIONS\"],\n allow_headers=[\n \"access-control-allow-origin\",\n \"authorization\",\n \"content-type\",\n ],\n ),\n Middleware(HeadersMiddleware),\n ],\n debug=foc.DEV_INSTALL,\n routes=[Route(route, endpoint) for route, endpoint in routes]\n + [\n Route(\n \"/graphql\",\n GraphQL(\n schema,\n graphiql=foc.DEV_INSTALL,\n ),\n ),\n Mount(\n \"/plugins\",\n app=Static(\n directory=fo.config.plugins_dir,\n html=True,\n check_dir=False,\n follow_symlink=True,\n ),\n name=\"plugins\",\n ),\n Mount(\n \"/\",\n app=Static(\n directory=os.path.join(os.path.dirname(__file__), \"static\"),\n html=True,\n follow_symlink=True,\n ),\n name=\"static\",\n ),\n ],\n)\n", "path": "fiftyone/server/app.py"}], "after_files": [{"content": "\"\"\"\nFiftyOne Server app.\n\n| Copyright 2017-2023, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\nfrom datetime import date, datetime\nimport os\nimport pathlib\n\nimport eta.core.utils as etau\nfrom starlette.applications import Starlette\nfrom starlette.middleware import Middleware\nfrom starlette.middleware.base import (\n BaseHTTPMiddleware,\n RequestResponseEndpoint,\n)\nfrom starlette.middleware.cors import CORSMiddleware\nfrom starlette.requests import Request\nfrom starlette.responses import Response\nfrom starlette.routing import 
Mount, Route\nfrom starlette.staticfiles import StaticFiles\nfrom starlette.types import Scope\nimport strawberry as gql\n\nimport fiftyone as fo\nimport fiftyone.constants as foc\nfrom fiftyone.server.context import GraphQL\nfrom fiftyone.server.extensions import EndSession\nfrom fiftyone.server.mutation import Mutation\nfrom fiftyone.server.query import Query\nfrom fiftyone.server.routes import routes\nfrom fiftyone.server.scalars import Date, DateTime\n\n\netau.ensure_dir(os.path.join(os.path.dirname(__file__), \"static\"))\n\n\nclass Static(StaticFiles):\n async def get_response(self, path: str, scope: Scope) -> Response:\n response = await super().get_response(path, scope)\n\n if response.status_code == 404:\n path = pathlib.Path(\n *pathlib.Path(path).parts[2:]\n ) # strip dataset/{name}\n response = await super().get_response(path, scope)\n if response.status_code == 404:\n full_path, stat_result = self.lookup_path(\"index.html\")\n return self.file_response(full_path, stat_result, scope)\n\n return response\n\n\nclass HeadersMiddleware(BaseHTTPMiddleware):\n async def dispatch(\n self, request: Request, call_next: RequestResponseEndpoint\n ) -> Response:\n response = await call_next(request)\n response.headers[\"x-colab-notebook-cache-control\"] = \"no-cache\"\n return response\n\n\nschema = gql.Schema(\n mutation=Mutation,\n query=Query,\n extensions=[EndSession],\n scalar_overrides={\n date: Date,\n datetime: DateTime,\n },\n)\n\n\napp = Starlette(\n middleware=[\n Middleware(\n CORSMiddleware,\n allow_origins=[\"*\"],\n allow_methods=[\"GET\", \"POST\", \"HEAD\", \"OPTIONS\"],\n allow_headers=[\n \"access-control-allow-origin\",\n \"authorization\",\n \"content-type\",\n ],\n ),\n Middleware(HeadersMiddleware),\n ],\n debug=foc.DEV_INSTALL,\n routes=[Route(route, endpoint) for route, endpoint in routes]\n + [\n Route(\n \"/graphql\",\n GraphQL(\n schema,\n graphiql=foc.DEV_INSTALL,\n ),\n ),\n Mount(\n \"/plugins\",\n app=Static(\n directory=fo.config.plugins_dir,\n html=True,\n check_dir=False,\n follow_symlink=True,\n ),\n name=\"plugins\",\n ),\n Mount(\n \"/\",\n app=Static(\n directory=os.path.join(os.path.dirname(__file__), \"static\"),\n html=True,\n ),\n name=\"static\",\n ),\n ],\n)\n", "path": "fiftyone/server/app.py"}]}
| 1,825 | 89 |
gh_patches_debug_17216
|
rasdani/github-patches
|
git_diff
|
medtagger__MedTagger-407
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Scans with low number of Slices are not properly loaded on Labeling page
## Current Behavior
When a Scan has fewer than 10 Slices, the Scan Viewer keeps displaying a spinner indicator and never gets to displaying the actual Scan.
## Expected Behavior
Scan should be displayed properly, no matter how many Slices it has.
## Steps to Reproduce the Problem
1. Upload less than 10 Slices.
2. Go to Labelling page.
3. Voilà, the Scan is not displayed; instead you're presented with a spinner indicator that doesn't disappear.
## Additional comment (optional)
The reason for that happening is that `SLICE_BATCH_SIZE` (`validation-page.component.ts`, Line 23) is fixed to 10. When we upload a Scan that has fewer than `SLICE_BATCH_SIZE` slices, the method `slice.isLastInBatch()` (`marker-page.component.ts`, Line 101) will always return `false`, because it will never hit the slice with id = 9 (because there isn't one...). Therefore, the Slice download will never be marked as completed.
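To make that concrete: the completion index has to follow the number of Slices actually fetched rather than the requested batch size. A minimal sketch of that idea against the handler shown below (hypothetical — not necessarily the project's actual patch; all names come from `service_web_socket.py`):

```python
# Hypothetical sketch: derive last_in_batch from the Slices that really exist,
# so a Scan with fewer Slices than SLICE_BATCH_SIZE still completes the batch.
slices = list(business.get_slices_for_scan(scan_id, begin, count, orientation=orientation))
last_in_batch = begin if reversed_order else begin + len(slices) - 1
```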
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `backend/medtagger/api/scans/service_web_socket.py`
Content:
```
1 """Module responsible for definition of Scans service available via WebSockets."""
2 from typing import Dict
3
4 from flask_socketio import Namespace, emit
5
6 from medtagger.api import web_socket
7 from medtagger.database.models import SliceOrientation
8 from medtagger.types import ScanID
9 from medtagger.api.exceptions import InvalidArgumentsException
10 from medtagger.api.scans import business
11
12
13 class Slices(Namespace):
14 """WebSocket handler for /slices namespace."""
15
16 MAX_NUMBER_OF_SLICES_PER_REQUEST = 25
17
18 def on_request_slices(self, request: Dict) -> None:
19 """Handle slices request triggered by `request_slices` event."""
20 assert request.get('scan_id'), 'ScanID is required!'
21 scan_id = ScanID(str(request['scan_id']))
22 begin = max(0, request.get('begin', 0))
23 count = request.get('count', 1)
24 reversed_order = request.get('reversed', False)
25 orientation = request.get('orientation', SliceOrientation.Z.value)
26 self._raise_on_invalid_request_slices(count, orientation)
27
28 orientation = SliceOrientation[orientation]
29 slices = business.get_slices_for_scan(scan_id, begin, count, orientation=orientation)
30 slices_to_send = reversed(list(enumerate(slices))) if reversed_order else enumerate(slices)
31 last_in_batch = begin if reversed_order else begin + count - 1
32 for index, (_slice, image) in slices_to_send:
33 emit('slice', {
34 'scan_id': scan_id,
35 'index': begin + index,
36 'last_in_batch': last_in_batch,
37 'image': image,
38 })
39
40 def _raise_on_invalid_request_slices(self, count: int, orientation: str) -> None:
41 """Validate incoming request and raise an exception if there are issues with given arguments.
42
43 :param count: number of slices that should be returned
44 :param orientation: Slice's orientation as a string
45 """
46 # Make sure that passed orientation is proper one
47 if orientation not in SliceOrientation.__members__:
48 raise InvalidArgumentsException('Invalid Slice orientation.')
49
50 # Make sure that nobody will fetch whole scan at once. It could freeze our backend application.
51 if count > self.MAX_NUMBER_OF_SLICES_PER_REQUEST:
52 message = 'Cannot return more than {} slices per request.'.format(self.MAX_NUMBER_OF_SLICES_PER_REQUEST)
53 raise InvalidArgumentsException(message)
54
55
56 # Register above namespace
57 web_socket.on_namespace(Slices('/slices'))
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/backend/medtagger/api/scans/service_web_socket.py b/backend/medtagger/api/scans/service_web_socket.py
--- a/backend/medtagger/api/scans/service_web_socket.py
+++ b/backend/medtagger/api/scans/service_web_socket.py
@@ -26,10 +26,10 @@
self._raise_on_invalid_request_slices(count, orientation)
orientation = SliceOrientation[orientation]
- slices = business.get_slices_for_scan(scan_id, begin, count, orientation=orientation)
- slices_to_send = reversed(list(enumerate(slices))) if reversed_order else enumerate(slices)
- last_in_batch = begin if reversed_order else begin + count - 1
- for index, (_slice, image) in slices_to_send:
+ slices = list(business.get_slices_for_scan(scan_id, begin, count, orientation=orientation))
+ slices_to_send = list(reversed(slices)) if reversed_order else slices
+ last_in_batch = begin if reversed_order else begin + len(slices_to_send) - 1
+ for index, (_slice, image) in enumerate(slices_to_send):
emit('slice', {
'scan_id': scan_id,
'index': begin + index,
|
{"golden_diff": "diff --git a/backend/medtagger/api/scans/service_web_socket.py b/backend/medtagger/api/scans/service_web_socket.py\n--- a/backend/medtagger/api/scans/service_web_socket.py\n+++ b/backend/medtagger/api/scans/service_web_socket.py\n@@ -26,10 +26,10 @@\n self._raise_on_invalid_request_slices(count, orientation)\n \n orientation = SliceOrientation[orientation]\n- slices = business.get_slices_for_scan(scan_id, begin, count, orientation=orientation)\n- slices_to_send = reversed(list(enumerate(slices))) if reversed_order else enumerate(slices)\n- last_in_batch = begin if reversed_order else begin + count - 1\n- for index, (_slice, image) in slices_to_send:\n+ slices = list(business.get_slices_for_scan(scan_id, begin, count, orientation=orientation))\n+ slices_to_send = list(reversed(slices)) if reversed_order else slices\n+ last_in_batch = begin if reversed_order else begin + len(slices_to_send) - 1\n+ for index, (_slice, image) in enumerate(slices_to_send):\n emit('slice', {\n 'scan_id': scan_id,\n 'index': begin + index,\n", "issue": "Scans with low number of Slices are not properly loaded on Labeling page\n## Current Behavior\r\n\r\nWhen Scan has less than 10 Slices, Scan Viewer keeps displaying spinner indicator and never gets to displaying actual Scan.\r\n\r\n## Expected Behavior\r\n\r\nScan should be displayed properly, no matter how many Slices it has.\r\n\r\n## Steps to Reproduce the Problem\r\n\r\n 1. Upload less than 10 Slices.\r\n 2. Go to Labelling page.\r\n 3. Voi'la, Scan is not displayed, instead you're presented with spinner indicator that doesn't dissapear.\r\n\r\n## Additional comment (optional)\r\n\r\nReason for that happening is that `SLICE_BATCH_SIZE` (`validation-page.component.ts`, Line 23) is fixed to 10. When we upload Scan that has less than `SLICE_BATCH_SIZE` slices, method `slice.isLastInBatch()` (`marker-page.component.ts`, Line 101) will always return `false`, because it will never hit slice with id = 9 (because there isn't one...). 
Therefore, Slice download will never be marked as completed.\r\n\n", "before_files": [{"content": "\"\"\"Module responsible for definition of Scans service available via WebSockets.\"\"\"\nfrom typing import Dict\n\nfrom flask_socketio import Namespace, emit\n\nfrom medtagger.api import web_socket\nfrom medtagger.database.models import SliceOrientation\nfrom medtagger.types import ScanID\nfrom medtagger.api.exceptions import InvalidArgumentsException\nfrom medtagger.api.scans import business\n\n\nclass Slices(Namespace):\n \"\"\"WebSocket handler for /slices namespace.\"\"\"\n\n MAX_NUMBER_OF_SLICES_PER_REQUEST = 25\n\n def on_request_slices(self, request: Dict) -> None:\n \"\"\"Handle slices request triggered by `request_slices` event.\"\"\"\n assert request.get('scan_id'), 'ScanID is required!'\n scan_id = ScanID(str(request['scan_id']))\n begin = max(0, request.get('begin', 0))\n count = request.get('count', 1)\n reversed_order = request.get('reversed', False)\n orientation = request.get('orientation', SliceOrientation.Z.value)\n self._raise_on_invalid_request_slices(count, orientation)\n\n orientation = SliceOrientation[orientation]\n slices = business.get_slices_for_scan(scan_id, begin, count, orientation=orientation)\n slices_to_send = reversed(list(enumerate(slices))) if reversed_order else enumerate(slices)\n last_in_batch = begin if reversed_order else begin + count - 1\n for index, (_slice, image) in slices_to_send:\n emit('slice', {\n 'scan_id': scan_id,\n 'index': begin + index,\n 'last_in_batch': last_in_batch,\n 'image': image,\n })\n\n def _raise_on_invalid_request_slices(self, count: int, orientation: str) -> None:\n \"\"\"Validate incoming request and raise an exception if there are issues with given arguments.\n\n :param count: number of slices that should be returned\n :param orientation: Slice's orientation as a string\n \"\"\"\n # Make sure that passed orientation is proper one\n if orientation not in SliceOrientation.__members__:\n raise InvalidArgumentsException('Invalid Slice orientation.')\n\n # Make sure that nobody will fetch whole scan at once. 
It could freeze our backend application.\n if count > self.MAX_NUMBER_OF_SLICES_PER_REQUEST:\n message = 'Cannot return more than {} slices per request.'.format(self.MAX_NUMBER_OF_SLICES_PER_REQUEST)\n raise InvalidArgumentsException(message)\n\n\n# Register above namespace\nweb_socket.on_namespace(Slices('/slices'))\n", "path": "backend/medtagger/api/scans/service_web_socket.py"}], "after_files": [{"content": "\"\"\"Module responsible for definition of Scans service available via WebSockets.\"\"\"\nfrom typing import Dict\n\nfrom flask_socketio import Namespace, emit\n\nfrom medtagger.api import web_socket\nfrom medtagger.database.models import SliceOrientation\nfrom medtagger.types import ScanID\nfrom medtagger.api.exceptions import InvalidArgumentsException\nfrom medtagger.api.scans import business\n\n\nclass Slices(Namespace):\n \"\"\"WebSocket handler for /slices namespace.\"\"\"\n\n MAX_NUMBER_OF_SLICES_PER_REQUEST = 25\n\n def on_request_slices(self, request: Dict) -> None:\n \"\"\"Handle slices request triggered by `request_slices` event.\"\"\"\n assert request.get('scan_id'), 'ScanID is required!'\n scan_id = ScanID(str(request['scan_id']))\n begin = max(0, request.get('begin', 0))\n count = request.get('count', 1)\n reversed_order = request.get('reversed', False)\n orientation = request.get('orientation', SliceOrientation.Z.value)\n self._raise_on_invalid_request_slices(count, orientation)\n\n orientation = SliceOrientation[orientation]\n slices = list(business.get_slices_for_scan(scan_id, begin, count, orientation=orientation))\n slices_to_send = list(reversed(slices)) if reversed_order else slices\n last_in_batch = begin if reversed_order else begin + len(slices_to_send) - 1\n for index, (_slice, image) in enumerate(slices_to_send):\n emit('slice', {\n 'scan_id': scan_id,\n 'index': begin + index,\n 'last_in_batch': last_in_batch,\n 'image': image,\n })\n\n def _raise_on_invalid_request_slices(self, count: int, orientation: str) -> None:\n \"\"\"Validate incoming request and raise an exception if there are issues with given arguments.\n\n :param count: number of slices that should be returned\n :param orientation: Slice's orientation as a string\n \"\"\"\n # Make sure that passed orientation is proper one\n if orientation not in SliceOrientation.__members__:\n raise InvalidArgumentsException('Invalid Slice orientation.')\n\n # Make sure that nobody will fetch whole scan at once. It could freeze our backend application.\n if count > self.MAX_NUMBER_OF_SLICES_PER_REQUEST:\n message = 'Cannot return more than {} slices per request.'.format(self.MAX_NUMBER_OF_SLICES_PER_REQUEST)\n raise InvalidArgumentsException(message)\n\n\n# Register above namespace\nweb_socket.on_namespace(Slices('/slices'))\n", "path": "backend/medtagger/api/scans/service_web_socket.py"}]}
| 1,136 | 271 |
gh_patches_debug_42742
|
rasdani/github-patches
|
git_diff
|
Lightning-AI__torchmetrics-70
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`log_dict` is missing `prefix` parameter that is mentioned in docs
## 🐛 Bug
[`MetricCollection` documentation](https://pytorch-lightning.readthedocs.io/en/1.2.0/extensions/metrics.html#metriccollection) mentions using `self.log_dict(self.train_metrics, on_step=True, on_epoch=False, prefix='train')`. The `prefix` parameter doesn't seem to be present in the `log_dict` function header.
### Expected behavior
`prefix` is most likely usable in this context, so this feature should be implemented. If not - the documentation should be fixed.
### Environment
- Lightning Version: 1.2.0
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchmetrics/collections.py`
Content:
```
1 # Copyright The PyTorch Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from copy import deepcopy
16 from typing import Any, Dict, List, Tuple, Union
17
18 from torch import nn
19
20 from torchmetrics.metric import Metric
21
22
23 class MetricCollection(nn.ModuleDict):
24 """
25 MetricCollection class can be used to chain metrics that have the same
26 call pattern into one single class.
27
28 Args:
29 metrics: One of the following
30
31 * list or tuple: if metrics are passed in as a list, will use the
32 metrics class name as key for output dict. Therefore, two metrics
33 of the same class cannot be chained this way.
34
35 * dict: if metrics are passed in as a dict, will use each key in the
36 dict as key for output dict. Use this format if you want to chain
37 together multiple of the same metric with different parameters.
38
39 Example (input as list):
40 >>> import torch
41 >>> from torchmetrics import MetricCollection, Accuracy, Precision, Recall
42 >>> target = torch.tensor([0, 2, 0, 2, 0, 1, 0, 2])
43 >>> preds = torch.tensor([2, 1, 2, 0, 1, 2, 2, 2])
44 >>> metrics = MetricCollection([Accuracy(),
45 ... Precision(num_classes=3, average='macro'),
46 ... Recall(num_classes=3, average='macro')])
47 >>> metrics(preds, target)
48 {'Accuracy': tensor(0.1250), 'Precision': tensor(0.0667), 'Recall': tensor(0.1111)}
49
50 Example (input as dict):
51 >>> metrics = MetricCollection({'micro_recall': Recall(num_classes=3, average='micro'),
52 ... 'macro_recall': Recall(num_classes=3, average='macro')})
53 >>> same_metric = metrics.clone()
54 >>> metrics(preds, target)
55 {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}
56 >>> same_metric(preds, target)
57 {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}
58 >>> metrics.persistent()
59
60 """
61
62 def __init__(self, metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]]):
63 super().__init__()
64 if isinstance(metrics, dict):
65 # Check all values are metrics
66 for name, metric in metrics.items():
67 if not isinstance(metric, Metric):
68 raise ValueError(
69 f"Value {metric} belonging to key {name}"
70 " is not an instance of `pl.metrics.Metric`"
71 )
72 self[name] = metric
73 elif isinstance(metrics, (tuple, list)):
74 for metric in metrics:
75 if not isinstance(metric, Metric):
76 raise ValueError(
77 f"Input {metric} to `MetricCollection` is not a instance"
78 " of `pl.metrics.Metric`"
79 )
80 name = metric.__class__.__name__
81 if name in self:
82 raise ValueError(f"Encountered two metrics both named {name}")
83 self[name] = metric
84 else:
85 raise ValueError("Unknown input to MetricCollection.")
86
87 def forward(self, *args, **kwargs) -> Dict[str, Any]: # pylint: disable=E0202
88 """
89 Iteratively call forward for each metric. Positional arguments (args) will
90 be passed to every metric in the collection, while keyword arguments (kwargs)
91 will be filtered based on the signature of the individual metric.
92 """
93 return {k: m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}
94
95 def update(self, *args, **kwargs): # pylint: disable=E0202
96 """
97 Iteratively call update for each metric. Positional arguments (args) will
98 be passed to every metric in the collection, while keyword arguments (kwargs)
99 will be filtered based on the signature of the individual metric.
100 """
101 for _, m in self.items():
102 m_kwargs = m._filter_kwargs(**kwargs)
103 m.update(*args, **m_kwargs)
104
105 def compute(self) -> Dict[str, Any]:
106 return {k: m.compute() for k, m in self.items()}
107
108 def reset(self):
109 """ Iteratively call reset for each metric """
110 for _, m in self.items():
111 m.reset()
112
113 def clone(self):
114 """ Make a copy of the metric collection """
115 return deepcopy(self)
116
117 def persistent(self, mode: bool = True):
118 """Method for post-init to change if metric states should be saved to
119 its state_dict
120 """
121 for _, m in self.items():
122 m.persistent(mode)
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/torchmetrics/collections.py b/torchmetrics/collections.py
--- a/torchmetrics/collections.py
+++ b/torchmetrics/collections.py
@@ -13,7 +13,7 @@
# limitations under the License.
from copy import deepcopy
-from typing import Any, Dict, List, Tuple, Union
+from typing import Any, Dict, List, Optional, Tuple, Union
from torch import nn
@@ -36,6 +36,8 @@
dict as key for output dict. Use this format if you want to chain
together multiple of the same metric with different parameters.
+ prefix: a string to append in front of the keys of the output dict
+
Example (input as list):
>>> import torch
>>> from torchmetrics import MetricCollection, Accuracy, Precision, Recall
@@ -58,8 +60,11 @@
>>> metrics.persistent()
"""
-
- def __init__(self, metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]]):
+ def __init__(
+ self,
+ metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]],
+ prefix: Optional[str] = None
+ ):
super().__init__()
if isinstance(metrics, dict):
# Check all values are metrics
@@ -84,13 +89,15 @@
else:
raise ValueError("Unknown input to MetricCollection.")
+ self.prefix = self._check_prefix_arg(prefix)
+
def forward(self, *args, **kwargs) -> Dict[str, Any]: # pylint: disable=E0202
"""
Iteratively call forward for each metric. Positional arguments (args) will
be passed to every metric in the collection, while keyword arguments (kwargs)
will be filtered based on the signature of the individual metric.
"""
- return {k: m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}
+ return {self._set_prefix(k): m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}
def update(self, *args, **kwargs): # pylint: disable=E0202
"""
@@ -103,20 +110,36 @@
m.update(*args, **m_kwargs)
def compute(self) -> Dict[str, Any]:
- return {k: m.compute() for k, m in self.items()}
+ return {self._set_prefix(k): m.compute() for k, m in self.items()}
- def reset(self):
+ def reset(self) -> None:
""" Iteratively call reset for each metric """
for _, m in self.items():
m.reset()
- def clone(self):
- """ Make a copy of the metric collection """
- return deepcopy(self)
+ def clone(self, prefix: Optional[str] = None) -> 'MetricCollection':
+ """ Make a copy of the metric collection
+ Args:
+ prefix: a string to append in front of the metric keys
+ """
+ mc = deepcopy(self)
+ mc.prefix = self._check_prefix_arg(prefix)
+ return mc
- def persistent(self, mode: bool = True):
+ def persistent(self, mode: bool = True) -> None:
"""Method for post-init to change if metric states should be saved to
its state_dict
"""
for _, m in self.items():
m.persistent(mode)
+
+ def _set_prefix(self, k: str) -> str:
+ return k if self.prefix is None else self.prefix + k
+
+ def _check_prefix_arg(self, prefix: str) -> Optional[str]:
+ if prefix is not None:
+ if isinstance(prefix, str):
+ return prefix
+ else:
+ raise ValueError('Expected input `prefix` to be a string')
+ return None
|
{"golden_diff": "diff --git a/torchmetrics/collections.py b/torchmetrics/collections.py\n--- a/torchmetrics/collections.py\n+++ b/torchmetrics/collections.py\n@@ -13,7 +13,7 @@\n # limitations under the License.\n \n from copy import deepcopy\n-from typing import Any, Dict, List, Tuple, Union\n+from typing import Any, Dict, List, Optional, Tuple, Union\n \n from torch import nn\n \n@@ -36,6 +36,8 @@\n dict as key for output dict. Use this format if you want to chain\n together multiple of the same metric with different parameters.\n \n+ prefix: a string to append in front of the keys of the output dict\n+\n Example (input as list):\n >>> import torch\n >>> from torchmetrics import MetricCollection, Accuracy, Precision, Recall\n@@ -58,8 +60,11 @@\n >>> metrics.persistent()\n \n \"\"\"\n-\n- def __init__(self, metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]]):\n+ def __init__(\n+ self,\n+ metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]],\n+ prefix: Optional[str] = None\n+ ):\n super().__init__()\n if isinstance(metrics, dict):\n # Check all values are metrics\n@@ -84,13 +89,15 @@\n else:\n raise ValueError(\"Unknown input to MetricCollection.\")\n \n+ self.prefix = self._check_prefix_arg(prefix)\n+\n def forward(self, *args, **kwargs) -> Dict[str, Any]: # pylint: disable=E0202\n \"\"\"\n Iteratively call forward for each metric. Positional arguments (args) will\n be passed to every metric in the collection, while keyword arguments (kwargs)\n will be filtered based on the signature of the individual metric.\n \"\"\"\n- return {k: m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}\n+ return {self._set_prefix(k): m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}\n \n def update(self, *args, **kwargs): # pylint: disable=E0202\n \"\"\"\n@@ -103,20 +110,36 @@\n m.update(*args, **m_kwargs)\n \n def compute(self) -> Dict[str, Any]:\n- return {k: m.compute() for k, m in self.items()}\n+ return {self._set_prefix(k): m.compute() for k, m in self.items()}\n \n- def reset(self):\n+ def reset(self) -> None:\n \"\"\" Iteratively call reset for each metric \"\"\"\n for _, m in self.items():\n m.reset()\n \n- def clone(self):\n- \"\"\" Make a copy of the metric collection \"\"\"\n- return deepcopy(self)\n+ def clone(self, prefix: Optional[str] = None) -> 'MetricCollection':\n+ \"\"\" Make a copy of the metric collection\n+ Args:\n+ prefix: a string to append in front of the metric keys\n+ \"\"\"\n+ mc = deepcopy(self)\n+ mc.prefix = self._check_prefix_arg(prefix)\n+ return mc\n \n- def persistent(self, mode: bool = True):\n+ def persistent(self, mode: bool = True) -> None:\n \"\"\"Method for post-init to change if metric states should be saved to\n its state_dict\n \"\"\"\n for _, m in self.items():\n m.persistent(mode)\n+\n+ def _set_prefix(self, k: str) -> str:\n+ return k if self.prefix is None else self.prefix + k\n+\n+ def _check_prefix_arg(self, prefix: str) -> Optional[str]:\n+ if prefix is not None:\n+ if isinstance(prefix, str):\n+ return prefix\n+ else:\n+ raise ValueError('Expected input `prefix` to be a string')\n+ return None\n", "issue": "`log_dict` is missing `prefix` parameter that is mentioned in docs\n## \ud83d\udc1b Bug\r\n\r\n[`MetricCollection` documentation](https://pytorch-lightning.readthedocs.io/en/1.2.0/extensions/metrics.html#metriccollection) mentions using `self.log_dict(self.train_metrics, on_step=True, on_epoch=False, prefix='train')`. 
The `prefix` parameter doesn't seem to be present in `log_dict` function header.\r\n\r\n\r\n### Expected behavior\r\n\r\n`prefix` is most likely usable in this context, so this feature should be implemented. If not - the documentation should be fixed.\r\n\r\n### Environment\r\n\r\n - Lightning Version: 1.2.0\r\n\n", "before_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom copy import deepcopy\nfrom typing import Any, Dict, List, Tuple, Union\n\nfrom torch import nn\n\nfrom torchmetrics.metric import Metric\n\n\nclass MetricCollection(nn.ModuleDict):\n \"\"\"\n MetricCollection class can be used to chain metrics that have the same\n call pattern into one single class.\n\n Args:\n metrics: One of the following\n\n * list or tuple: if metrics are passed in as a list, will use the\n metrics class name as key for output dict. Therefore, two metrics\n of the same class cannot be chained this way.\n\n * dict: if metrics are passed in as a dict, will use each key in the\n dict as key for output dict. Use this format if you want to chain\n together multiple of the same metric with different parameters.\n\n Example (input as list):\n >>> import torch\n >>> from torchmetrics import MetricCollection, Accuracy, Precision, Recall\n >>> target = torch.tensor([0, 2, 0, 2, 0, 1, 0, 2])\n >>> preds = torch.tensor([2, 1, 2, 0, 1, 2, 2, 2])\n >>> metrics = MetricCollection([Accuracy(),\n ... Precision(num_classes=3, average='macro'),\n ... Recall(num_classes=3, average='macro')])\n >>> metrics(preds, target)\n {'Accuracy': tensor(0.1250), 'Precision': tensor(0.0667), 'Recall': tensor(0.1111)}\n\n Example (input as dict):\n >>> metrics = MetricCollection({'micro_recall': Recall(num_classes=3, average='micro'),\n ... 
'macro_recall': Recall(num_classes=3, average='macro')})\n >>> same_metric = metrics.clone()\n >>> metrics(preds, target)\n {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}\n >>> same_metric(preds, target)\n {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}\n >>> metrics.persistent()\n\n \"\"\"\n\n def __init__(self, metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]]):\n super().__init__()\n if isinstance(metrics, dict):\n # Check all values are metrics\n for name, metric in metrics.items():\n if not isinstance(metric, Metric):\n raise ValueError(\n f\"Value {metric} belonging to key {name}\"\n \" is not an instance of `pl.metrics.Metric`\"\n )\n self[name] = metric\n elif isinstance(metrics, (tuple, list)):\n for metric in metrics:\n if not isinstance(metric, Metric):\n raise ValueError(\n f\"Input {metric} to `MetricCollection` is not a instance\"\n \" of `pl.metrics.Metric`\"\n )\n name = metric.__class__.__name__\n if name in self:\n raise ValueError(f\"Encountered two metrics both named {name}\")\n self[name] = metric\n else:\n raise ValueError(\"Unknown input to MetricCollection.\")\n\n def forward(self, *args, **kwargs) -> Dict[str, Any]: # pylint: disable=E0202\n \"\"\"\n Iteratively call forward for each metric. Positional arguments (args) will\n be passed to every metric in the collection, while keyword arguments (kwargs)\n will be filtered based on the signature of the individual metric.\n \"\"\"\n return {k: m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}\n\n def update(self, *args, **kwargs): # pylint: disable=E0202\n \"\"\"\n Iteratively call update for each metric. Positional arguments (args) will\n be passed to every metric in the collection, while keyword arguments (kwargs)\n will be filtered based on the signature of the individual metric.\n \"\"\"\n for _, m in self.items():\n m_kwargs = m._filter_kwargs(**kwargs)\n m.update(*args, **m_kwargs)\n\n def compute(self) -> Dict[str, Any]:\n return {k: m.compute() for k, m in self.items()}\n\n def reset(self):\n \"\"\" Iteratively call reset for each metric \"\"\"\n for _, m in self.items():\n m.reset()\n\n def clone(self):\n \"\"\" Make a copy of the metric collection \"\"\"\n return deepcopy(self)\n\n def persistent(self, mode: bool = True):\n \"\"\"Method for post-init to change if metric states should be saved to\n its state_dict\n \"\"\"\n for _, m in self.items():\n m.persistent(mode)\n", "path": "torchmetrics/collections.py"}], "after_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom copy import deepcopy\nfrom typing import Any, Dict, List, Optional, Tuple, Union\n\nfrom torch import nn\n\nfrom torchmetrics.metric import Metric\n\n\nclass MetricCollection(nn.ModuleDict):\n \"\"\"\n MetricCollection class can be used to chain metrics that have the same\n call pattern into one single class.\n\n Args:\n metrics: One of the following\n\n * list or tuple: if metrics are passed in as a list, will use 
the\n metrics class name as key for output dict. Therefore, two metrics\n of the same class cannot be chained this way.\n\n * dict: if metrics are passed in as a dict, will use each key in the\n dict as key for output dict. Use this format if you want to chain\n together multiple of the same metric with different parameters.\n\n prefix: a string to append in front of the keys of the output dict\n\n Example (input as list):\n >>> import torch\n >>> from torchmetrics import MetricCollection, Accuracy, Precision, Recall\n >>> target = torch.tensor([0, 2, 0, 2, 0, 1, 0, 2])\n >>> preds = torch.tensor([2, 1, 2, 0, 1, 2, 2, 2])\n >>> metrics = MetricCollection([Accuracy(),\n ... Precision(num_classes=3, average='macro'),\n ... Recall(num_classes=3, average='macro')])\n >>> metrics(preds, target)\n {'Accuracy': tensor(0.1250), 'Precision': tensor(0.0667), 'Recall': tensor(0.1111)}\n\n Example (input as dict):\n >>> metrics = MetricCollection({'micro_recall': Recall(num_classes=3, average='micro'),\n ... 'macro_recall': Recall(num_classes=3, average='macro')})\n >>> same_metric = metrics.clone()\n >>> metrics(preds, target)\n {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}\n >>> same_metric(preds, target)\n {'micro_recall': tensor(0.1250), 'macro_recall': tensor(0.1111)}\n >>> metrics.persistent()\n\n \"\"\"\n def __init__(\n self,\n metrics: Union[List[Metric], Tuple[Metric], Dict[str, Metric]],\n prefix: Optional[str] = None\n ):\n super().__init__()\n if isinstance(metrics, dict):\n # Check all values are metrics\n for name, metric in metrics.items():\n if not isinstance(metric, Metric):\n raise ValueError(\n f\"Value {metric} belonging to key {name}\"\n \" is not an instance of `pl.metrics.Metric`\"\n )\n self[name] = metric\n elif isinstance(metrics, (tuple, list)):\n for metric in metrics:\n if not isinstance(metric, Metric):\n raise ValueError(\n f\"Input {metric} to `MetricCollection` is not a instance\"\n \" of `pl.metrics.Metric`\"\n )\n name = metric.__class__.__name__\n if name in self:\n raise ValueError(f\"Encountered two metrics both named {name}\")\n self[name] = metric\n else:\n raise ValueError(\"Unknown input to MetricCollection.\")\n\n self.prefix = self._check_prefix_arg(prefix)\n\n def forward(self, *args, **kwargs) -> Dict[str, Any]: # pylint: disable=E0202\n \"\"\"\n Iteratively call forward for each metric. Positional arguments (args) will\n be passed to every metric in the collection, while keyword arguments (kwargs)\n will be filtered based on the signature of the individual metric.\n \"\"\"\n return {self._set_prefix(k): m(*args, **m._filter_kwargs(**kwargs)) for k, m in self.items()}\n\n def update(self, *args, **kwargs): # pylint: disable=E0202\n \"\"\"\n Iteratively call update for each metric. 
Positional arguments (args) will\n be passed to every metric in the collection, while keyword arguments (kwargs)\n will be filtered based on the signature of the individual metric.\n \"\"\"\n for _, m in self.items():\n m_kwargs = m._filter_kwargs(**kwargs)\n m.update(*args, **m_kwargs)\n\n def compute(self) -> Dict[str, Any]:\n return {self._set_prefix(k): m.compute() for k, m in self.items()}\n\n def reset(self) -> None:\n \"\"\" Iteratively call reset for each metric \"\"\"\n for _, m in self.items():\n m.reset()\n\n def clone(self, prefix: Optional[str] = None) -> 'MetricCollection':\n \"\"\" Make a copy of the metric collection\n Args:\n prefix: a string to append in front of the metric keys\n \"\"\"\n mc = deepcopy(self)\n mc.prefix = self._check_prefix_arg(prefix)\n return mc\n\n def persistent(self, mode: bool = True) -> None:\n \"\"\"Method for post-init to change if metric states should be saved to\n its state_dict\n \"\"\"\n for _, m in self.items():\n m.persistent(mode)\n\n def _set_prefix(self, k: str) -> str:\n return k if self.prefix is None else self.prefix + k\n\n def _check_prefix_arg(self, prefix: str) -> Optional[str]:\n if prefix is not None:\n if isinstance(prefix, str):\n return prefix\n else:\n raise ValueError('Expected input `prefix` to be a string')\n return None\n", "path": "torchmetrics/collections.py"}]}
| 1,807 | 860 |
gh_patches_debug_9706
|
rasdani/github-patches
|
git_diff
|
PokemonGoF__PokemonGo-Bot-2531
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Dev] local variable 'unexpected_reponse_retry' referenced before assignment
There is a typo in a variable
`unexpected_reponse_retry` should be `unexpected_response_retry`
/pokemongo_bot/api_wrapper.py - line 120
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pokemongo_bot/api_wrapper.py`
Content:
```
1 import time
2
3 from pgoapi.exceptions import (ServerSideRequestThrottlingException,
4 NotLoggedInException, ServerBusyOrOfflineException,
5 NoPlayerPositionSetException, EmptySubrequestChainException,
6 UnexpectedResponseException)
7 from pgoapi.pgoapi import PGoApi, PGoApiRequest, RpcApi
8 from pgoapi.protos.POGOProtos.Networking.Requests_pb2 import RequestType
9
10 import pokemongo_bot.logger as logger
11 from human_behaviour import sleep
12
13 class ApiWrapper(PGoApi):
14 def __init__(self):
15 PGoApi.__init__(self)
16 self.useVanillaRequest = False
17
18 def create_request(self):
19 RequestClass = ApiRequest
20 if self.useVanillaRequest:
21 RequestClass = PGoApiRequest
22
23 return RequestClass(
24 self._api_endpoint,
25 self._auth_provider,
26 self._position_lat,
27 self._position_lng,
28 self._position_alt
29 )
30
31 def login(self, *args):
32 # login needs base class "create_request"
33 self.useVanillaRequest = True
34 try:
35 ret_value = PGoApi.login(self, *args)
36 finally:
37 # cleanup code
38 self.useVanillaRequest = False
39 return ret_value
40
41
42 class ApiRequest(PGoApiRequest):
43 def __init__(self, *args):
44 PGoApiRequest.__init__(self, *args)
45 self.request_callers = []
46 self.last_api_request_time = None
47 self.requests_per_seconds = 2
48
49 def can_call(self):
50 if not self._req_method_list:
51 raise EmptySubrequestChainException()
52
53 if (self._position_lat is None) or (self._position_lng is None) or (self._position_alt is None):
54 raise NoPlayerPositionSetException()
55
56 if self._auth_provider is None or not self._auth_provider.is_login():
57 self.log.info('Not logged in')
58 raise NotLoggedInException()
59
60 return True
61
62 def _call(self):
63 return PGoApiRequest.call(self)
64
65 def _pop_request_callers(self):
66 r = self.request_callers
67 self.request_callers = []
68 return [i.upper() for i in r]
69
70 def is_response_valid(self, result, request_callers):
71 if not result or result is None or not isinstance(result, dict):
72 return False
73
74 if not 'responses' in result or not 'status_code' in result:
75 return False
76
77 if not isinstance(result['responses'], dict):
78 return False
79
80 # the response can still programatically be valid at this point
81 # but still be wrong. we need to check if the server did sent what we asked it
82 for request_caller in request_callers:
83 if not request_caller in result['responses']:
84 return False
85
86 return True
87
88 def call(self, max_retry=15):
89 request_callers = self._pop_request_callers()
90 if not self.can_call():
91 return False # currently this is never ran, exceptions are raised before
92
93 request_timestamp = None
94 api_req_method_list = self._req_method_list
95 result = None
96 try_cnt = 0
97 throttling_retry = 0
98 unexpected_response_retry = 0
99 while True:
100 request_timestamp = self.throttle_sleep()
101 # self._call internally clear this field, so save it
102 self._req_method_list = [req_method for req_method in api_req_method_list]
103 try:
104 result = self._call()
105 should_throttle_retry = False
106 should_unexpected_response_retry = False
107 except ServerSideRequestThrottlingException:
108 should_throttle_retry = True
109 except UnexpectedResponseException:
110 should_unexpected_response_retry = True
111
112 if should_throttle_retry:
113 throttling_retry += 1
114 if throttling_retry >= max_retry:
115 raise ServerSideRequestThrottlingException('Server throttled too many times')
116 sleep(1) # huge sleep ?
117 continue # skip response checking
118
119 if should_unexpected_response_retry:
120 unexpected_reponse_retry += 1
121 if unexpected_response_retry >= 5:
122 logger.log('Server is not responding correctly to our requests. Waiting for 30 seconds to reconnect.', 'red')
123 sleep(30)
124 else:
125 sleep(2)
126 continue
127
128 if not self.is_response_valid(result, request_callers):
129 try_cnt += 1
130 if try_cnt > 3:
131 logger.log('Server seems to be busy or offline - try again - {}/{}'.format(try_cnt, max_retry), 'red')
132 if try_cnt >= max_retry:
133 raise ServerBusyOrOfflineException()
134 sleep(1)
135 else:
136 break
137
138 self.last_api_request_time = request_timestamp
139 return result
140
141 def __getattr__(self, func):
142 if func.upper() in RequestType.keys():
143 self.request_callers.append(func)
144 return PGoApiRequest.__getattr__(self, func)
145
146 def throttle_sleep(self):
147 now_milliseconds = time.time() * 1000
148 required_delay_between_requests = 1000 / self.requests_per_seconds
149
150 difference = now_milliseconds - (self.last_api_request_time if self.last_api_request_time else 0)
151
152 if self.last_api_request_time != None and difference < required_delay_between_requests:
153 sleep_time = required_delay_between_requests - difference
154 time.sleep(sleep_time / 1000)
155
156 return now_milliseconds
157
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pokemongo_bot/api_wrapper.py b/pokemongo_bot/api_wrapper.py
--- a/pokemongo_bot/api_wrapper.py
+++ b/pokemongo_bot/api_wrapper.py
@@ -117,7 +117,7 @@
continue # skip response checking
if should_unexpected_response_retry:
- unexpected_reponse_retry += 1
+ unexpected_response_retry += 1
if unexpected_response_retry >= 5:
logger.log('Server is not responding correctly to our requests. Waiting for 30 seconds to reconnect.', 'red')
sleep(30)
|
{"golden_diff": "diff --git a/pokemongo_bot/api_wrapper.py b/pokemongo_bot/api_wrapper.py\n--- a/pokemongo_bot/api_wrapper.py\n+++ b/pokemongo_bot/api_wrapper.py\n@@ -117,7 +117,7 @@\n continue # skip response checking\n \n if should_unexpected_response_retry:\n- unexpected_reponse_retry += 1\n+ unexpected_response_retry += 1\n if unexpected_response_retry >= 5:\n logger.log('Server is not responding correctly to our requests. Waiting for 30 seconds to reconnect.', 'red')\n sleep(30)\n", "issue": "[Dev] local variable 'unexpected_reponse_retry' referenced before assignment\nThere is a typo in a variable\n\n`unexpected_reponse_retry` should be `unexpected_response_retry`\n/pokemongo_bot/api_wrapper.py - line 120 \n\n", "before_files": [{"content": "import time\n\nfrom pgoapi.exceptions import (ServerSideRequestThrottlingException,\n NotLoggedInException, ServerBusyOrOfflineException,\n NoPlayerPositionSetException, EmptySubrequestChainException,\n UnexpectedResponseException)\nfrom pgoapi.pgoapi import PGoApi, PGoApiRequest, RpcApi\nfrom pgoapi.protos.POGOProtos.Networking.Requests_pb2 import RequestType\n\nimport pokemongo_bot.logger as logger\nfrom human_behaviour import sleep\n\nclass ApiWrapper(PGoApi):\n def __init__(self):\n PGoApi.__init__(self)\n self.useVanillaRequest = False\n\n def create_request(self):\n RequestClass = ApiRequest\n if self.useVanillaRequest:\n RequestClass = PGoApiRequest\n\n return RequestClass(\n self._api_endpoint,\n self._auth_provider,\n self._position_lat,\n self._position_lng,\n self._position_alt\n )\n\n def login(self, *args):\n # login needs base class \"create_request\"\n self.useVanillaRequest = True\n try:\n ret_value = PGoApi.login(self, *args)\n finally:\n # cleanup code\n self.useVanillaRequest = False\n return ret_value\n\n\nclass ApiRequest(PGoApiRequest):\n def __init__(self, *args):\n PGoApiRequest.__init__(self, *args)\n self.request_callers = []\n self.last_api_request_time = None\n self.requests_per_seconds = 2\n\n def can_call(self):\n if not self._req_method_list:\n raise EmptySubrequestChainException()\n\n if (self._position_lat is None) or (self._position_lng is None) or (self._position_alt is None):\n raise NoPlayerPositionSetException()\n\n if self._auth_provider is None or not self._auth_provider.is_login():\n self.log.info('Not logged in')\n raise NotLoggedInException()\n\n return True\n\n def _call(self):\n return PGoApiRequest.call(self)\n\n def _pop_request_callers(self):\n r = self.request_callers\n self.request_callers = []\n return [i.upper() for i in r]\n\n def is_response_valid(self, result, request_callers):\n if not result or result is None or not isinstance(result, dict):\n return False\n\n if not 'responses' in result or not 'status_code' in result:\n return False\n\n if not isinstance(result['responses'], dict):\n return False\n\n # the response can still programatically be valid at this point\n # but still be wrong. 
we need to check if the server did sent what we asked it\n for request_caller in request_callers:\n if not request_caller in result['responses']:\n return False\n\n return True\n\n def call(self, max_retry=15):\n request_callers = self._pop_request_callers()\n if not self.can_call():\n return False # currently this is never ran, exceptions are raised before\n\n request_timestamp = None\n api_req_method_list = self._req_method_list\n result = None\n try_cnt = 0\n throttling_retry = 0\n unexpected_response_retry = 0\n while True:\n request_timestamp = self.throttle_sleep()\n # self._call internally clear this field, so save it\n self._req_method_list = [req_method for req_method in api_req_method_list]\n try:\n result = self._call()\n should_throttle_retry = False\n should_unexpected_response_retry = False\n except ServerSideRequestThrottlingException:\n should_throttle_retry = True\n except UnexpectedResponseException:\n should_unexpected_response_retry = True\n\n if should_throttle_retry:\n throttling_retry += 1\n if throttling_retry >= max_retry:\n raise ServerSideRequestThrottlingException('Server throttled too many times')\n sleep(1) # huge sleep ?\n continue # skip response checking\n\n if should_unexpected_response_retry:\n unexpected_reponse_retry += 1\n if unexpected_response_retry >= 5:\n logger.log('Server is not responding correctly to our requests. Waiting for 30 seconds to reconnect.', 'red')\n sleep(30)\n else:\n sleep(2)\n continue\n \n if not self.is_response_valid(result, request_callers):\n try_cnt += 1\n if try_cnt > 3:\n logger.log('Server seems to be busy or offline - try again - {}/{}'.format(try_cnt, max_retry), 'red')\n if try_cnt >= max_retry:\n raise ServerBusyOrOfflineException()\n sleep(1)\n else:\n break\n\n self.last_api_request_time = request_timestamp\n return result\n\n def __getattr__(self, func):\n if func.upper() in RequestType.keys():\n self.request_callers.append(func)\n return PGoApiRequest.__getattr__(self, func)\n\n def throttle_sleep(self):\n now_milliseconds = time.time() * 1000\n required_delay_between_requests = 1000 / self.requests_per_seconds\n\n difference = now_milliseconds - (self.last_api_request_time if self.last_api_request_time else 0)\n\n if self.last_api_request_time != None and difference < required_delay_between_requests:\n sleep_time = required_delay_between_requests - difference\n time.sleep(sleep_time / 1000)\n\n return now_milliseconds\n", "path": "pokemongo_bot/api_wrapper.py"}], "after_files": [{"content": "import time\n\nfrom pgoapi.exceptions import (ServerSideRequestThrottlingException,\n NotLoggedInException, ServerBusyOrOfflineException,\n NoPlayerPositionSetException, EmptySubrequestChainException,\n UnexpectedResponseException)\nfrom pgoapi.pgoapi import PGoApi, PGoApiRequest, RpcApi\nfrom pgoapi.protos.POGOProtos.Networking.Requests_pb2 import RequestType\n\nimport pokemongo_bot.logger as logger\nfrom human_behaviour import sleep\n\nclass ApiWrapper(PGoApi):\n def __init__(self):\n PGoApi.__init__(self)\n self.useVanillaRequest = False\n\n def create_request(self):\n RequestClass = ApiRequest\n if self.useVanillaRequest:\n RequestClass = PGoApiRequest\n\n return RequestClass(\n self._api_endpoint,\n self._auth_provider,\n self._position_lat,\n self._position_lng,\n self._position_alt\n )\n\n def login(self, *args):\n # login needs base class \"create_request\"\n self.useVanillaRequest = True\n try:\n ret_value = PGoApi.login(self, *args)\n finally:\n # cleanup code\n self.useVanillaRequest = False\n return 
ret_value\n\n\nclass ApiRequest(PGoApiRequest):\n def __init__(self, *args):\n PGoApiRequest.__init__(self, *args)\n self.request_callers = []\n self.last_api_request_time = None\n self.requests_per_seconds = 2\n\n def can_call(self):\n if not self._req_method_list:\n raise EmptySubrequestChainException()\n\n if (self._position_lat is None) or (self._position_lng is None) or (self._position_alt is None):\n raise NoPlayerPositionSetException()\n\n if self._auth_provider is None or not self._auth_provider.is_login():\n self.log.info('Not logged in')\n raise NotLoggedInException()\n\n return True\n\n def _call(self):\n return PGoApiRequest.call(self)\n\n def _pop_request_callers(self):\n r = self.request_callers\n self.request_callers = []\n return [i.upper() for i in r]\n\n def is_response_valid(self, result, request_callers):\n if not result or result is None or not isinstance(result, dict):\n return False\n\n if not 'responses' in result or not 'status_code' in result:\n return False\n\n if not isinstance(result['responses'], dict):\n return False\n\n # the response can still programatically be valid at this point\n # but still be wrong. we need to check if the server did sent what we asked it\n for request_caller in request_callers:\n if not request_caller in result['responses']:\n return False\n\n return True\n\n def call(self, max_retry=15):\n request_callers = self._pop_request_callers()\n if not self.can_call():\n return False # currently this is never ran, exceptions are raised before\n\n request_timestamp = None\n api_req_method_list = self._req_method_list\n result = None\n try_cnt = 0\n throttling_retry = 0\n unexpected_response_retry = 0\n while True:\n request_timestamp = self.throttle_sleep()\n # self._call internally clear this field, so save it\n self._req_method_list = [req_method for req_method in api_req_method_list]\n try:\n result = self._call()\n should_throttle_retry = False\n should_unexpected_response_retry = False\n except ServerSideRequestThrottlingException:\n should_throttle_retry = True\n except UnexpectedResponseException:\n should_unexpected_response_retry = True\n\n if should_throttle_retry:\n throttling_retry += 1\n if throttling_retry >= max_retry:\n raise ServerSideRequestThrottlingException('Server throttled too many times')\n sleep(1) # huge sleep ?\n continue # skip response checking\n\n if should_unexpected_response_retry:\n unexpected_response_retry += 1\n if unexpected_response_retry >= 5:\n logger.log('Server is not responding correctly to our requests. 
Waiting for 30 seconds to reconnect.', 'red')\n sleep(30)\n else:\n sleep(2)\n continue\n \n if not self.is_response_valid(result, request_callers):\n try_cnt += 1\n if try_cnt > 3:\n logger.log('Server seems to be busy or offline - try again - {}/{}'.format(try_cnt, max_retry), 'red')\n if try_cnt >= max_retry:\n raise ServerBusyOrOfflineException()\n sleep(1)\n else:\n break\n\n self.last_api_request_time = request_timestamp\n return result\n\n def __getattr__(self, func):\n if func.upper() in RequestType.keys():\n self.request_callers.append(func)\n return PGoApiRequest.__getattr__(self, func)\n\n def throttle_sleep(self):\n now_milliseconds = time.time() * 1000\n required_delay_between_requests = 1000 / self.requests_per_seconds\n\n difference = now_milliseconds - (self.last_api_request_time if self.last_api_request_time else 0)\n\n if self.last_api_request_time != None and difference < required_delay_between_requests:\n sleep_time = required_delay_between_requests - difference\n time.sleep(sleep_time / 1000)\n\n return now_milliseconds\n", "path": "pokemongo_bot/api_wrapper.py"}]}
| 1,875 | 133 |
gh_patches_debug_17992
|
rasdani/github-patches
|
git_diff
|
hpcaitech__ColossalAI-3329
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `colossalai/engine/gradient_handler/__init__.py`
Content:
```
1 from ._base_gradient_handler import BaseGradientHandler
2 from ._data_parallel_gradient_handler import DataParallelGradientHandler
3 from ._zero_gradient_handler import ZeROGradientHandler
4 from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler
5 from ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler
6 from ._moe_gradient_handler import MoeGradientHandler
7 from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler
8
9 __all__ = [
10 'BaseGradientHandler', 'DataParallelGradientHandler', 'ZeROGradientHandler', 'PipelineSharedModuleGradientHandler',
11 'MoeGradientHandler', 'SequenceParallelGradientHandler'
12 ]
13
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/colossalai/engine/gradient_handler/__init__.py b/colossalai/engine/gradient_handler/__init__.py
--- a/colossalai/engine/gradient_handler/__init__.py
+++ b/colossalai/engine/gradient_handler/__init__.py
@@ -1,10 +1,9 @@
from ._base_gradient_handler import BaseGradientHandler
from ._data_parallel_gradient_handler import DataParallelGradientHandler
-from ._zero_gradient_handler import ZeROGradientHandler
-from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler
-from ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler
from ._moe_gradient_handler import MoeGradientHandler
+from ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler
from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler
+from ._zero_gradient_handler import ZeROGradientHandler
__all__ = [
'BaseGradientHandler', 'DataParallelGradientHandler', 'ZeROGradientHandler', 'PipelineSharedModuleGradientHandler',
|
{"golden_diff": "diff --git a/colossalai/engine/gradient_handler/__init__.py b/colossalai/engine/gradient_handler/__init__.py\n--- a/colossalai/engine/gradient_handler/__init__.py\n+++ b/colossalai/engine/gradient_handler/__init__.py\n@@ -1,10 +1,9 @@\n from ._base_gradient_handler import BaseGradientHandler\n from ._data_parallel_gradient_handler import DataParallelGradientHandler\n-from ._zero_gradient_handler import ZeROGradientHandler\n-from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler\n-from ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler\n from ._moe_gradient_handler import MoeGradientHandler\n+from ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler\n from ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler\n+from ._zero_gradient_handler import ZeROGradientHandler\n \n __all__ = [\n 'BaseGradientHandler', 'DataParallelGradientHandler', 'ZeROGradientHandler', 'PipelineSharedModuleGradientHandler',\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "from ._base_gradient_handler import BaseGradientHandler\nfrom ._data_parallel_gradient_handler import DataParallelGradientHandler\nfrom ._zero_gradient_handler import ZeROGradientHandler\nfrom ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler\nfrom ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler\nfrom ._moe_gradient_handler import MoeGradientHandler\nfrom ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler\n\n__all__ = [\n 'BaseGradientHandler', 'DataParallelGradientHandler', 'ZeROGradientHandler', 'PipelineSharedModuleGradientHandler',\n 'MoeGradientHandler', 'SequenceParallelGradientHandler'\n]\n", "path": "colossalai/engine/gradient_handler/__init__.py"}], "after_files": [{"content": "from ._base_gradient_handler import BaseGradientHandler\nfrom ._data_parallel_gradient_handler import DataParallelGradientHandler\nfrom ._moe_gradient_handler import MoeGradientHandler\nfrom ._pipeline_parallel_gradient_handler import PipelineSharedModuleGradientHandler\nfrom ._sequence_parallel_gradient_handler import SequenceParallelGradientHandler\nfrom ._zero_gradient_handler import ZeROGradientHandler\n\n__all__ = [\n 'BaseGradientHandler', 'DataParallelGradientHandler', 'ZeROGradientHandler', 'PipelineSharedModuleGradientHandler',\n 'MoeGradientHandler', 'SequenceParallelGradientHandler'\n]\n", "path": "colossalai/engine/gradient_handler/__init__.py"}]}
| 433 | 209 |
gh_patches_debug_386
|
rasdani/github-patches
|
git_diff
|
UTNkar__moore-554
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix cookies for Utnarm
Utnarm recently switched to utnarm.se instead of utnarm.utn.se. As a result, you can’t sign in to utnarm.se. By default, moore uses utn.se as the cookie domain, and since utnarm.se is a different top-level domain, the cookies can’t be used.
We need to dynamically add utnarm.se as a cookie domain. This python package might be useful https://github.com/ViktorStiskala/django-shared-session
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/moore/settings/production.py`
Content:
```
1 """
2 Django settings for the production environment of Project Moore.
3
4 For more information regarding running in production see,
5 See https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/
6
7 For more information on this file, see
8 https://docs.djangoproject.com/en/1.10/topics/settings/
9
10 For the full list of settings and their values, see
11 https://docs.djangoproject.com/en/1.10/ref/settings/
12 """
13 from __future__ import absolute_import, unicode_literals
14 import raven
15
16 from .base import *
17
18 # SECURITY WARNING: don't run with debug turned on in production!
19 DEBUG = False
20
21 # SECURITY WARNING: keep the secret key used in production secret!
22 SECRET_KEY = os.environ.get(
23 'DJANGO_SECRET',
24 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'
25 )
26
27 # Database
28 # https://docs.djangoproject.com/en/1.10/ref/settings/#databases
29
30 DATABASES = {
31 'default': {
32 'ENGINE': 'django.db.backends.postgresql',
33 'NAME': os.environ.get('DJANGO_DB_NAME', 'moore'),
34 'USER': os.environ.get('DJANGO_DB_USER', 'moore'),
35 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),
36 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),
37 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),
38 }
39 }
40
41
42 # CONN_MAX_AGE = 0
43
44 # Base URL to use when referring to full URLs within the Wagtail admin
45 # backend - e.g. in notification emails. Don't include '/admin' or a
46 # trailing slash
47 BASE_URL = 'https://utn.se'
48
49 ALLOWED_HOSTS = ['.utn.se', '.utnarm.se']
50
51 # Email settings
52 DEFAULT_FROM_EMAIL = '[email protected]'
53
54 EMAIL_SUBJECT_PREFIX = '[UTN] '
55
56 # Sentry Configuration - will be sent error messages
57 RAVEN_CONFIG = {
58 'dsn': os.environ.get('SENTRY_DSN'),
59 'release': raven.fetch_git_sha(os.path.dirname(BASE_DIR)),
60 }
61
62 LOGGING = {
63 'version': 1,
64 'disable_existing_loggers': True,
65 'root': {
66 'level': 'WARNING',
67 'handlers': ['sentry'],
68 },
69 'formatters': {
70 'verbose': {
71 'format': '%(levelname)s %(asctime)s %(module)s '
72 '%(process)d %(thread)d %(message)s'
73 },
74 },
75 'handlers': {
76 'sentry': {
77 'level': 'ERROR',
78 'class': 'raven.contrib.django.raven_compat'
79 '.handlers.SentryHandler',
80 'tags': {'custom-tag': 'x'},
81 },
82 'console': {
83 'level': 'DEBUG',
84 'class': 'logging.StreamHandler',
85 'formatter': 'verbose'
86 }
87 },
88 'loggers': {
89 'django.db.backends': {
90 'level': 'ERROR',
91 'handlers': ['console'],
92 'propagate': False,
93 },
94 'raven': {
95 'level': 'DEBUG',
96 'handlers': ['console'],
97 'propagate': False,
98 },
99 'sentry.errors': {
100 'level': 'DEBUG',
101 'handlers': ['console'],
102 'propagate': False,
103 },
104 },
105 }
106
107 CSRF_COOKIE_SECURE = True
108
109 SESSION_COOKIE_DOMAIN = '.utn.se'
110
111 SESSION_COOKIE_SECURE = True
112
113 MELOS_URL = os.environ.get('MELOS_URL')
114 MELOS_ORG_ID = os.environ.get('MELOS_ORG_ID')
115 MELOS_ADMIN = os.environ.get('MELOS_ADMIN')
116
117 # Google API
118 GOOGLE_API_KEY = os.environ.get('GOOGLE_API_KEY')
119
120 RECAPTCHA_PUBLIC_KEY = os.environ.get("RECAPTCHA_PUBLIC_KEY", "")
121 RECAPTCHA_PRIVATE_KEY = os.environ.get("RECAPTCHA_PRIVATE_KEY", "")
122
123 try:
124 from .local import *
125 except ImportError:
126 pass
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/moore/settings/production.py b/src/moore/settings/production.py
--- a/src/moore/settings/production.py
+++ b/src/moore/settings/production.py
@@ -106,8 +106,6 @@
CSRF_COOKIE_SECURE = True
-SESSION_COOKIE_DOMAIN = '.utn.se'
-
SESSION_COOKIE_SECURE = True
MELOS_URL = os.environ.get('MELOS_URL')
|
{"golden_diff": "diff --git a/src/moore/settings/production.py b/src/moore/settings/production.py\n--- a/src/moore/settings/production.py\n+++ b/src/moore/settings/production.py\n@@ -106,8 +106,6 @@\n \n CSRF_COOKIE_SECURE = True\n \n-SESSION_COOKIE_DOMAIN = '.utn.se'\n-\n SESSION_COOKIE_SECURE = True\n \n MELOS_URL = os.environ.get('MELOS_URL')\n", "issue": "Fix cookies for Utnarm\nUtnarm recently switched to utnarm.se instead of utnarm.utn.se. This lead to that you can\u2019t sign in to utnarm.se. Per default, moore uses utn.se as cookie domain and since utnarm.se is a different top level domain the cookies can\u2019t be used. \r\n\r\nWe need to dynamically add utnarm.se as a cookie domain. This python package might be useful https://github.com/ViktorStiskala/django-shared-session\n", "before_files": [{"content": "\"\"\"\nDjango settings for the production environment of Project Moore.\n\nFor more information regarding running in production see,\nSee https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.10/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.10/ref/settings/\n\"\"\"\nfrom __future__ import absolute_import, unicode_literals\nimport raven\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = False\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os.environ.get(\n 'DJANGO_SECRET',\n 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'\n)\n\n# Database\n# https://docs.djangoproject.com/en/1.10/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.postgresql',\n 'NAME': os.environ.get('DJANGO_DB_NAME', 'moore'),\n 'USER': os.environ.get('DJANGO_DB_USER', 'moore'),\n 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),\n 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),\n 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),\n }\n}\n\n\n# CONN_MAX_AGE = 0\n\n# Base URL to use when referring to full URLs within the Wagtail admin\n# backend - e.g. in notification emails. 
Don't include '/admin' or a\n# trailing slash\nBASE_URL = 'https://utn.se'\n\nALLOWED_HOSTS = ['.utn.se', '.utnarm.se']\n\n# Email settings\nDEFAULT_FROM_EMAIL = '[email protected]'\n\nEMAIL_SUBJECT_PREFIX = '[UTN] '\n\n# Sentry Configuration - will be sent error messages\nRAVEN_CONFIG = {\n 'dsn': os.environ.get('SENTRY_DSN'),\n 'release': raven.fetch_git_sha(os.path.dirname(BASE_DIR)),\n}\n\nLOGGING = {\n 'version': 1,\n 'disable_existing_loggers': True,\n 'root': {\n 'level': 'WARNING',\n 'handlers': ['sentry'],\n },\n 'formatters': {\n 'verbose': {\n 'format': '%(levelname)s %(asctime)s %(module)s '\n '%(process)d %(thread)d %(message)s'\n },\n },\n 'handlers': {\n 'sentry': {\n 'level': 'ERROR',\n 'class': 'raven.contrib.django.raven_compat'\n '.handlers.SentryHandler',\n 'tags': {'custom-tag': 'x'},\n },\n 'console': {\n 'level': 'DEBUG',\n 'class': 'logging.StreamHandler',\n 'formatter': 'verbose'\n }\n },\n 'loggers': {\n 'django.db.backends': {\n 'level': 'ERROR',\n 'handlers': ['console'],\n 'propagate': False,\n },\n 'raven': {\n 'level': 'DEBUG',\n 'handlers': ['console'],\n 'propagate': False,\n },\n 'sentry.errors': {\n 'level': 'DEBUG',\n 'handlers': ['console'],\n 'propagate': False,\n },\n },\n}\n\nCSRF_COOKIE_SECURE = True\n\nSESSION_COOKIE_DOMAIN = '.utn.se'\n\nSESSION_COOKIE_SECURE = True\n\nMELOS_URL = os.environ.get('MELOS_URL')\nMELOS_ORG_ID = os.environ.get('MELOS_ORG_ID')\nMELOS_ADMIN = os.environ.get('MELOS_ADMIN')\n\n# Google API\nGOOGLE_API_KEY = os.environ.get('GOOGLE_API_KEY')\n\nRECAPTCHA_PUBLIC_KEY = os.environ.get(\"RECAPTCHA_PUBLIC_KEY\", \"\")\nRECAPTCHA_PRIVATE_KEY = os.environ.get(\"RECAPTCHA_PRIVATE_KEY\", \"\")\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "src/moore/settings/production.py"}], "after_files": [{"content": "\"\"\"\nDjango settings for the production environment of Project Moore.\n\nFor more information regarding running in production see,\nSee https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.10/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.10/ref/settings/\n\"\"\"\nfrom __future__ import absolute_import, unicode_literals\nimport raven\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = False\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os.environ.get(\n 'DJANGO_SECRET',\n 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'\n)\n\n# Database\n# https://docs.djangoproject.com/en/1.10/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.postgresql',\n 'NAME': os.environ.get('DJANGO_DB_NAME', 'moore'),\n 'USER': os.environ.get('DJANGO_DB_USER', 'moore'),\n 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),\n 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),\n 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),\n }\n}\n\n\n# CONN_MAX_AGE = 0\n\n# Base URL to use when referring to full URLs within the Wagtail admin\n# backend - e.g. in notification emails. 
Don't include '/admin' or a\n# trailing slash\nBASE_URL = 'https://utn.se'\n\nALLOWED_HOSTS = ['.utn.se', '.utnarm.se']\n\n# Email settings\nDEFAULT_FROM_EMAIL = '[email protected]'\n\nEMAIL_SUBJECT_PREFIX = '[UTN] '\n\n# Sentry Configuration - will be sent error messages\nRAVEN_CONFIG = {\n 'dsn': os.environ.get('SENTRY_DSN'),\n 'release': raven.fetch_git_sha(os.path.dirname(BASE_DIR)),\n}\n\nLOGGING = {\n 'version': 1,\n 'disable_existing_loggers': True,\n 'root': {\n 'level': 'WARNING',\n 'handlers': ['sentry'],\n },\n 'formatters': {\n 'verbose': {\n 'format': '%(levelname)s %(asctime)s %(module)s '\n '%(process)d %(thread)d %(message)s'\n },\n },\n 'handlers': {\n 'sentry': {\n 'level': 'ERROR',\n 'class': 'raven.contrib.django.raven_compat'\n '.handlers.SentryHandler',\n 'tags': {'custom-tag': 'x'},\n },\n 'console': {\n 'level': 'DEBUG',\n 'class': 'logging.StreamHandler',\n 'formatter': 'verbose'\n }\n },\n 'loggers': {\n 'django.db.backends': {\n 'level': 'ERROR',\n 'handlers': ['console'],\n 'propagate': False,\n },\n 'raven': {\n 'level': 'DEBUG',\n 'handlers': ['console'],\n 'propagate': False,\n },\n 'sentry.errors': {\n 'level': 'DEBUG',\n 'handlers': ['console'],\n 'propagate': False,\n },\n },\n}\n\nCSRF_COOKIE_SECURE = True\n\nSESSION_COOKIE_SECURE = True\n\nMELOS_URL = os.environ.get('MELOS_URL')\nMELOS_ORG_ID = os.environ.get('MELOS_ORG_ID')\nMELOS_ADMIN = os.environ.get('MELOS_ADMIN')\n\n# Google API\nGOOGLE_API_KEY = os.environ.get('GOOGLE_API_KEY')\n\nRECAPTCHA_PUBLIC_KEY = os.environ.get(\"RECAPTCHA_PUBLIC_KEY\", \"\")\nRECAPTCHA_PRIVATE_KEY = os.environ.get(\"RECAPTCHA_PRIVATE_KEY\", \"\")\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "src/moore/settings/production.py"}]}
| 1,532 | 94 |
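For this record the entire fix is the removal of the hard-coded `SESSION_COOKIE_DOMAIN = '.utn.se'`. With the setting gone, Django falls back to its default of `None`, so the session cookie is scoped to whichever host served the request and both utn.se and utnarm.se can set usable cookies. The relevant slice of `src/moore/settings/production.py` after the patch (taken from `after_files`) reads:

```python
# Excerpt of production.py after the patch.
CSRF_COOKIE_SECURE = True

# SESSION_COOKIE_DOMAIN is intentionally absent: Django's default (None)
# ties the cookie to the requesting host instead of pinning it to '.utn.se'.
SESSION_COOKIE_SECURE = True
```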
gh_patches_debug_116
|
rasdani/github-patches
|
git_diff
|
sanic-org__sanic-1530
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Publish 19.3 release to PyPI
Thank you for the release 3 days ago!
https://github.com/huge-success/sanic/releases/tag/19.3
It's missing from PyPI at the moment:
https://pypi.org/project/sanic/#history
Please publish it at your convenience 🙇
Keep up the awesome work ❤️
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sanic/__init__.py`
Content:
```
1 from sanic.app import Sanic
2 from sanic.blueprints import Blueprint
3
4
5 __version__ = "19.03.0"
6
7 __all__ = ["Sanic", "Blueprint"]
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sanic/__init__.py b/sanic/__init__.py
--- a/sanic/__init__.py
+++ b/sanic/__init__.py
@@ -2,6 +2,6 @@
from sanic.blueprints import Blueprint
-__version__ = "19.03.0"
+__version__ = "19.03.1"
__all__ = ["Sanic", "Blueprint"]
|
{"golden_diff": "diff --git a/sanic/__init__.py b/sanic/__init__.py\n--- a/sanic/__init__.py\n+++ b/sanic/__init__.py\n@@ -2,6 +2,6 @@\n from sanic.blueprints import Blueprint\n \n \n-__version__ = \"19.03.0\"\n+__version__ = \"19.03.1\"\n \n __all__ = [\"Sanic\", \"Blueprint\"]\n", "issue": "Publish 19.3 release to PyPI\nThank you for the release 3 days ago!\r\n\r\nhttps://github.com/huge-success/sanic/releases/tag/19.3\r\n\r\nIt's missing from PyPI at the moment:\r\n\r\nhttps://pypi.org/project/sanic/#history\r\n\r\nPlease publish it at your convenience \ud83d\ude47 \r\n\r\nKeep up the awesome work \u2764\ufe0f \n", "before_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n\n__version__ = \"19.03.0\"\n\n__all__ = [\"Sanic\", \"Blueprint\"]\n", "path": "sanic/__init__.py"}], "after_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n\n__version__ = \"19.03.1\"\n\n__all__ = [\"Sanic\", \"Blueprint\"]\n", "path": "sanic/__init__.py"}]}
| 389 | 95 |
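The whole sanic fix is the version-string bump from 19.03.0 to 19.03.1 shown in the golden diff, presumably so the release can be published to PyPI under a fresh version number. A trivial check that an installed wheel actually carries the bump might look like this (illustrative only):

```python
# Illustrative: confirm the installed package reports the bumped version.
import sanic

assert sanic.__version__ == "19.03.1", sanic.__version__
```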
gh_patches_debug_38710
|
rasdani/github-patches
|
git_diff
|
nvaccess__nvda-12486
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
No braille feedback in Windows Calculator
### Steps to reproduce:
1. Open Windows Calculator.
2. Type 2
3. Type +
4. Type 3
5. Type = (or Enter)
Note that the new appmodule listens for the Enter key, but not the equals (=) key. However, both work to display the result. Speech reports the result when you press Enter, but braille provides no feedback regardless of what key you press.
### Actual behavior:
The braille display continues to show "Display is 0".
There are two problems with this:
1. As you type (without using speech) you cannot verify what you just entered.
2. Once the result appears, you cannot read it unless you temporarily move the focus away from the display and back again.
### Expected behavior:
The braille display shows each number, operator and result as you type.
### System configuration
#### NVDA installed/portable/running from source:
Installed
#### NVDA version:
2020.4
#### Windows version:
Version 10.0.19042 Build 19042
#### Name and version of other software in use when reproducing the issue:
#### Other information about your system:
### Other questions
#### Does the issue still occur after restarting your computer?
Yes
#### Have you tried any other versions of NVDA? If so, please report their behaviors.
Yes, it worked in NVDA 2019.3.
#### If add-ons are disabled, is your problem still occurring?
Yes
#### Did you try to run the COM registry fixing tool in NVDA menu / tools?
Yes
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/appModules/calculator.py`
Content:
```
1 # A part of NonVisual Desktop Access (NVDA)
2 # Copyright (C) 2020 NV Access Limited, Joseph Lee
3 # This file is covered by the GNU General Public License.
4 # See the file COPYING for more details.
5
6 """App module for Windows 10 Calculator"""
7
8 import appModuleHandler
9 import api
10 from NVDAObjects.UIA import UIA
11 import queueHandler
12 import ui
13 import scriptHandler
14
15 # #9428: do not announce current values until calculations are done in order to avoid repetitions.
16 noCalculatorEntryAnnouncements = [
17 # Display field with Calculator set to full screen mode.
18 "CalculatorResults",
19 # In the middle of a calculation expression entry.
20 "CalculatorExpression",
21 # Results display with Calculator set to compact overlay i.e. always on top mode.
22 "CalculatorAlwaysOnTopResults",
23 # Calculator expressions with Calculator set to always on top mode.
24 "ExpressionContainer",
25 # Date range selector.
26 "ContentPresenter",
27 # Briefly shown when closing date calculation calendar.
28 "Light Dismiss",
29 # Unit conversion/convert from.
30 "Value1",
31 # Unit conversion/converts into.
32 "Value2",
33 ]
34
35
36 class AppModule(appModuleHandler.AppModule):
37
38 _shouldAnnounceResult = False
39 # Name change says the same thing multiple times for some items.
40 _resultsCache = ""
41
42 def event_nameChange(self, obj, nextHandler):
43 if not isinstance(obj, UIA):
44 return
45 # No, announce value changes immediately except for calculator results and expressions.
46 if (
47 obj.UIAAutomationId in noCalculatorEntryAnnouncements
48 or obj.UIAElement.cachedClassName == "LandmarkTarget"
49 ):
50 self._shouldAnnounceResult = False
51 # For the rest:
52 elif (
53 obj.UIAAutomationId not in noCalculatorEntryAnnouncements
54 and obj.name != self._resultsCache
55 ):
56 # For unit conversion, both name change and notification events are fired,
57 # although UIA notification event presents much better messages.
58 # For date calculation, live region change event is also fired for difference between dates.
59 if obj.UIAAutomationId != "DateDiffAllUnitsResultLabel":
60 ui.message(obj.name)
61 self._resultsCache = obj.name
62 if not self._shouldAnnounceResult:
63 return
64 self._shouldAnnounceResult = False
65 nextHandler()
66
67 def event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):
68 try:
69 shouldAnnounceNotification = (
70 obj.previous.UIAAutomationId in
71 ("numberPad", "UnitConverterRootGrid")
72 )
73 except AttributeError:
74 resultElement = api.getForegroundObject().children[1].lastChild
75 # Redesigned in 2019 due to introduction of "always on top" i.e. compact overlay mode.
76 if resultElement.UIAElement.cachedClassName != "LandmarkTarget":
77 resultElement = resultElement.parent.children[1]
78 shouldAnnounceNotification = (
79 resultElement
80 and resultElement.firstChild
81 and resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements
82 )
83 # Display updated activity ID seen when entering calculations should be ignored
84 # as as it is redundant if speak typed characters is on.
85 if shouldAnnounceNotification or activityId != "DisplayUpdated":
86 nextHandler()
87
88 # A list of native commands to handle calculator result announcement.
89 _calculatorResultGestures = ("kb:enter", "kb:numpadEnter", "kb:escape")
90
91 @scriptHandler.script(gestures=_calculatorResultGestures)
92 def script_calculatorResult(self, gesture):
93 # To prevent double focus announcement, check where we are.
94 focus = api.getFocusObject()
95 gesture.send()
96 # In redstone, calculator result keeps firing name change,
97 # so tell it to do so if and only if enter has been pressed.
98 self._shouldAnnounceResult = True
99 # Hack: only announce display text when an actual calculator button (usually equals button) is pressed.
100 # In redstone, pressing enter does not move focus to equals button.
101 if isinstance(focus, UIA):
102 if focus.UIAAutomationId in ("CalculatorResults", "CalculatorAlwaysOnTopResults"):
103 queueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)
104 else:
105 resultsScreen = api.getForegroundObject().children[1].lastChild
106 if isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == "LandmarkTarget":
107 # And no, do not allow focus to move.
108 queueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)
109
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/source/appModules/calculator.py b/source/appModules/calculator.py
--- a/source/appModules/calculator.py
+++ b/source/appModules/calculator.py
@@ -1,5 +1,5 @@
# A part of NonVisual Desktop Access (NVDA)
-# Copyright (C) 2020 NV Access Limited, Joseph Lee
+# Copyright (C) 2020-2021 NV Access Limited, Joseph Lee
# This file is covered by the GNU General Public License.
# See the file COPYING for more details.
@@ -11,6 +11,7 @@
import queueHandler
import ui
import scriptHandler
+import braille
# #9428: do not announce current values until calculations are done in order to avoid repetitions.
noCalculatorEntryAnnouncements = [
@@ -64,7 +65,10 @@
self._shouldAnnounceResult = False
nextHandler()
- def event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):
+ def event_UIA_notification(self, obj, nextHandler, displayString=None, activityId=None, **kwargs):
+ # #12268: for "DisplayUpdated", announce display strings in braille and move on.
+ if activityId == "DisplayUpdated":
+ braille.handler.message(displayString)
try:
shouldAnnounceNotification = (
obj.previous.UIAAutomationId in
@@ -86,7 +90,13 @@
nextHandler()
# A list of native commands to handle calculator result announcement.
- _calculatorResultGestures = ("kb:enter", "kb:numpadEnter", "kb:escape")
+ _calculatorResultGestures = (
+ "kb:enter",
+ "kb:numpadEnter",
+ "kb:escape",
+ "kb:delete",
+ "kb:numpadDelete"
+ )
@scriptHandler.script(gestures=_calculatorResultGestures)
def script_calculatorResult(self, gesture):
@@ -100,9 +110,9 @@
# In redstone, pressing enter does not move focus to equals button.
if isinstance(focus, UIA):
if focus.UIAAutomationId in ("CalculatorResults", "CalculatorAlwaysOnTopResults"):
- queueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)
+ queueHandler.queueFunction(queueHandler.eventQueue, ui.message, focus.name)
else:
resultsScreen = api.getForegroundObject().children[1].lastChild
if isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == "LandmarkTarget":
# And no, do not allow focus to move.
- queueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)
+ queueHandler.queueFunction(queueHandler.eventQueue, ui.message, resultsScreen.firstChild.name)
|
{"golden_diff": "diff --git a/source/appModules/calculator.py b/source/appModules/calculator.py\n--- a/source/appModules/calculator.py\n+++ b/source/appModules/calculator.py\n@@ -1,5 +1,5 @@\n # A part of NonVisual Desktop Access (NVDA)\n-# Copyright (C) 2020 NV Access Limited, Joseph Lee\n+# Copyright (C) 2020-2021 NV Access Limited, Joseph Lee\n # This file is covered by the GNU General Public License.\n # See the file COPYING for more details.\n \n@@ -11,6 +11,7 @@\n import queueHandler\n import ui\n import scriptHandler\n+import braille\n \n # #9428: do not announce current values until calculations are done in order to avoid repetitions.\n noCalculatorEntryAnnouncements = [\n@@ -64,7 +65,10 @@\n \t\tself._shouldAnnounceResult = False\n \t\tnextHandler()\n \n-\tdef event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):\n+\tdef event_UIA_notification(self, obj, nextHandler, displayString=None, activityId=None, **kwargs):\n+\t\t# #12268: for \"DisplayUpdated\", announce display strings in braille and move on.\n+\t\tif activityId == \"DisplayUpdated\":\n+\t\t\tbraille.handler.message(displayString)\n \t\ttry:\n \t\t\tshouldAnnounceNotification = (\n \t\t\t\tobj.previous.UIAAutomationId in\n@@ -86,7 +90,13 @@\n \t\t\tnextHandler()\n \n \t# A list of native commands to handle calculator result announcement.\n-\t_calculatorResultGestures = (\"kb:enter\", \"kb:numpadEnter\", \"kb:escape\")\n+\t_calculatorResultGestures = (\n+\t\t\"kb:enter\",\n+\t\t\"kb:numpadEnter\",\n+\t\t\"kb:escape\",\n+\t\t\"kb:delete\",\n+\t\t\"kb:numpadDelete\"\n+\t)\n \n \[email protected](gestures=_calculatorResultGestures)\n \tdef script_calculatorResult(self, gesture):\n@@ -100,9 +110,9 @@\n \t\t# In redstone, pressing enter does not move focus to equals button.\n \t\tif isinstance(focus, UIA):\n \t\t\tif focus.UIAAutomationId in (\"CalculatorResults\", \"CalculatorAlwaysOnTopResults\"):\n-\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)\n+\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, ui.message, focus.name)\n \t\t\telse:\n \t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n \t\t\t\tif isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == \"LandmarkTarget\":\n \t\t\t\t\t# And no, do not allow focus to move.\n-\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)\n+\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, ui.message, resultsScreen.firstChild.name)\n", "issue": "No braille feedback in Windows Calculator\n### Steps to reproduce:\r\n1. Open Windows Calculator.\r\n2. Type 2\r\n3. Type +\r\n4. Type 3\r\n5. Type = (or Enter)\r\n\r\nNote that the new appmodule listens for the Enter key, but not the equals (=) key. However, both work to display the result. Speech reports the result when you press Enter, but braille provides no feedback regardless of what key you press.\r\n\r\n### Actual behavior:\r\nThe braille display continues to show \"Display is 0\".\r\nThere are two problems with this:\r\n1. As you type (without using speech) you cannot verify what you just entered.\r\n2. 
Once the result appears, you cannot read it unless you temporarily move the focus away from the display and back again.\r\n\r\n### Expected behavior:\r\nThe braille display shows each number, operator and result as you type.\r\n\r\n### System configuration\r\n#### NVDA installed/portable/running from source:\r\nInstalled\r\n\r\n#### NVDA version:\r\n2020.4\r\n\r\n#### Windows version:\r\nVersion\t10.0.19042 Build 19042\r\n\r\n#### Name and version of other software in use when reproducing the issue:\r\n\r\n#### Other information about your system:\r\n\r\n### Other questions\r\n#### Does the issue still occur after restarting your computer?\r\nYes\r\n\r\n#### Have you tried any other versions of NVDA? If so, please report their behaviors.\r\nYes, it worked in NVDA 2019.3.\r\n\r\n#### If add-ons are disabled, is your problem still occurring?\r\nYes\r\n\r\n#### Did you try to run the COM registry fixing tool in NVDA menu / tools?\r\nYes\n", "before_files": [{"content": "# A part of NonVisual Desktop Access (NVDA)\n# Copyright (C) 2020 NV Access Limited, Joseph Lee\n# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n\n\"\"\"App module for Windows 10 Calculator\"\"\"\n\nimport appModuleHandler\nimport api\nfrom NVDAObjects.UIA import UIA\nimport queueHandler\nimport ui\nimport scriptHandler\n\n# #9428: do not announce current values until calculations are done in order to avoid repetitions.\nnoCalculatorEntryAnnouncements = [\n\t# Display field with Calculator set to full screen mode.\n\t\"CalculatorResults\",\n\t# In the middle of a calculation expression entry.\n\t\"CalculatorExpression\",\n\t# Results display with Calculator set to compact overlay i.e. always on top mode.\n\t\"CalculatorAlwaysOnTopResults\",\n\t# Calculator expressions with Calculator set to always on top mode.\n\t\"ExpressionContainer\",\n\t# Date range selector.\n\t\"ContentPresenter\",\n\t# Briefly shown when closing date calculation calendar.\n\t\"Light Dismiss\",\n\t# Unit conversion/convert from.\n\t\"Value1\",\n\t# Unit conversion/converts into.\n\t\"Value2\",\n]\n\n\nclass AppModule(appModuleHandler.AppModule):\n\n\t_shouldAnnounceResult = False\n\t# Name change says the same thing multiple times for some items.\n\t_resultsCache = \"\"\n\n\tdef event_nameChange(self, obj, nextHandler):\n\t\tif not isinstance(obj, UIA):\n\t\t\treturn\n\t\t# No, announce value changes immediately except for calculator results and expressions.\n\t\tif (\n\t\t\tobj.UIAAutomationId in noCalculatorEntryAnnouncements\n\t\t\tor obj.UIAElement.cachedClassName == \"LandmarkTarget\"\n\t\t):\n\t\t\tself._shouldAnnounceResult = False\n\t\t# For the rest:\n\t\telif (\n\t\t\tobj.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\tand obj.name != self._resultsCache\n\t\t):\n\t\t\t# For unit conversion, both name change and notification events are fired,\n\t\t\t# although UIA notification event presents much better messages.\n\t\t\t# For date calculation, live region change event is also fired for difference between dates.\n\t\t\tif obj.UIAAutomationId != \"DateDiffAllUnitsResultLabel\":\n\t\t\t\tui.message(obj.name)\n\t\t\tself._resultsCache = obj.name\n\t\tif not self._shouldAnnounceResult:\n\t\t\treturn\n\t\tself._shouldAnnounceResult = False\n\t\tnextHandler()\n\n\tdef event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):\n\t\ttry:\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tobj.previous.UIAAutomationId in\n\t\t\t\t(\"numberPad\", 
\"UnitConverterRootGrid\")\n\t\t\t)\n\t\texcept AttributeError:\n\t\t\tresultElement = api.getForegroundObject().children[1].lastChild\n\t\t\t# Redesigned in 2019 due to introduction of \"always on top\" i.e. compact overlay mode.\n\t\t\tif resultElement.UIAElement.cachedClassName != \"LandmarkTarget\":\n\t\t\t\tresultElement = resultElement.parent.children[1]\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tresultElement\n\t\t\t\tand resultElement.firstChild\n\t\t\t\tand resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\t)\n\t\t# Display updated activity ID seen when entering calculations should be ignored\n\t\t# as as it is redundant if speak typed characters is on.\n\t\tif shouldAnnounceNotification or activityId != \"DisplayUpdated\":\n\t\t\tnextHandler()\n\n\t# A list of native commands to handle calculator result announcement.\n\t_calculatorResultGestures = (\"kb:enter\", \"kb:numpadEnter\", \"kb:escape\")\n\n\[email protected](gestures=_calculatorResultGestures)\n\tdef script_calculatorResult(self, gesture):\n\t\t# To prevent double focus announcement, check where we are.\n\t\tfocus = api.getFocusObject()\n\t\tgesture.send()\n\t\t# In redstone, calculator result keeps firing name change,\n\t\t# so tell it to do so if and only if enter has been pressed.\n\t\tself._shouldAnnounceResult = True\n\t\t# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.\n\t\t# In redstone, pressing enter does not move focus to equals button.\n\t\tif isinstance(focus, UIA):\n\t\t\tif focus.UIAAutomationId in (\"CalculatorResults\", \"CalculatorAlwaysOnTopResults\"):\n\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)\n\t\t\telse:\n\t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n\t\t\t\tif isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == \"LandmarkTarget\":\n\t\t\t\t\t# And no, do not allow focus to move.\n\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)\n", "path": "source/appModules/calculator.py"}], "after_files": [{"content": "# A part of NonVisual Desktop Access (NVDA)\n# Copyright (C) 2020-2021 NV Access Limited, Joseph Lee\n# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n\n\"\"\"App module for Windows 10 Calculator\"\"\"\n\nimport appModuleHandler\nimport api\nfrom NVDAObjects.UIA import UIA\nimport queueHandler\nimport ui\nimport scriptHandler\nimport braille\n\n# #9428: do not announce current values until calculations are done in order to avoid repetitions.\nnoCalculatorEntryAnnouncements = [\n\t# Display field with Calculator set to full screen mode.\n\t\"CalculatorResults\",\n\t# In the middle of a calculation expression entry.\n\t\"CalculatorExpression\",\n\t# Results display with Calculator set to compact overlay i.e. 
always on top mode.\n\t\"CalculatorAlwaysOnTopResults\",\n\t# Calculator expressions with Calculator set to always on top mode.\n\t\"ExpressionContainer\",\n\t# Date range selector.\n\t\"ContentPresenter\",\n\t# Briefly shown when closing date calculation calendar.\n\t\"Light Dismiss\",\n\t# Unit conversion/convert from.\n\t\"Value1\",\n\t# Unit conversion/converts into.\n\t\"Value2\",\n]\n\n\nclass AppModule(appModuleHandler.AppModule):\n\n\t_shouldAnnounceResult = False\n\t# Name change says the same thing multiple times for some items.\n\t_resultsCache = \"\"\n\n\tdef event_nameChange(self, obj, nextHandler):\n\t\tif not isinstance(obj, UIA):\n\t\t\treturn\n\t\t# No, announce value changes immediately except for calculator results and expressions.\n\t\tif (\n\t\t\tobj.UIAAutomationId in noCalculatorEntryAnnouncements\n\t\t\tor obj.UIAElement.cachedClassName == \"LandmarkTarget\"\n\t\t):\n\t\t\tself._shouldAnnounceResult = False\n\t\t# For the rest:\n\t\telif (\n\t\t\tobj.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\tand obj.name != self._resultsCache\n\t\t):\n\t\t\t# For unit conversion, both name change and notification events are fired,\n\t\t\t# although UIA notification event presents much better messages.\n\t\t\t# For date calculation, live region change event is also fired for difference between dates.\n\t\t\tif obj.UIAAutomationId != \"DateDiffAllUnitsResultLabel\":\n\t\t\t\tui.message(obj.name)\n\t\t\tself._resultsCache = obj.name\n\t\tif not self._shouldAnnounceResult:\n\t\t\treturn\n\t\tself._shouldAnnounceResult = False\n\t\tnextHandler()\n\n\tdef event_UIA_notification(self, obj, nextHandler, displayString=None, activityId=None, **kwargs):\n\t\t# #12268: for \"DisplayUpdated\", announce display strings in braille and move on.\n\t\tif activityId == \"DisplayUpdated\":\n\t\t\tbraille.handler.message(displayString)\n\t\ttry:\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tobj.previous.UIAAutomationId in\n\t\t\t\t(\"numberPad\", \"UnitConverterRootGrid\")\n\t\t\t)\n\t\texcept AttributeError:\n\t\t\tresultElement = api.getForegroundObject().children[1].lastChild\n\t\t\t# Redesigned in 2019 due to introduction of \"always on top\" i.e. 
compact overlay mode.\n\t\t\tif resultElement.UIAElement.cachedClassName != \"LandmarkTarget\":\n\t\t\t\tresultElement = resultElement.parent.children[1]\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tresultElement\n\t\t\t\tand resultElement.firstChild\n\t\t\t\tand resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\t)\n\t\t# Display updated activity ID seen when entering calculations should be ignored\n\t\t# as as it is redundant if speak typed characters is on.\n\t\tif shouldAnnounceNotification or activityId != \"DisplayUpdated\":\n\t\t\tnextHandler()\n\n\t# A list of native commands to handle calculator result announcement.\n\t_calculatorResultGestures = (\n\t\t\"kb:enter\",\n\t\t\"kb:numpadEnter\",\n\t\t\"kb:escape\",\n\t\t\"kb:delete\",\n\t\t\"kb:numpadDelete\"\n\t)\n\n\[email protected](gestures=_calculatorResultGestures)\n\tdef script_calculatorResult(self, gesture):\n\t\t# To prevent double focus announcement, check where we are.\n\t\tfocus = api.getFocusObject()\n\t\tgesture.send()\n\t\t# In redstone, calculator result keeps firing name change,\n\t\t# so tell it to do so if and only if enter has been pressed.\n\t\tself._shouldAnnounceResult = True\n\t\t# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.\n\t\t# In redstone, pressing enter does not move focus to equals button.\n\t\tif isinstance(focus, UIA):\n\t\t\tif focus.UIAAutomationId in (\"CalculatorResults\", \"CalculatorAlwaysOnTopResults\"):\n\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, ui.message, focus.name)\n\t\t\telse:\n\t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n\t\t\t\tif isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == \"LandmarkTarget\":\n\t\t\t\t\t# And no, do not allow focus to move.\n\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, ui.message, resultsScreen.firstChild.name)\n", "path": "source/appModules/calculator.py"}]}
| 1,868 | 639 |
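The core of the NVDA golden diff above is the new early branch in `event_UIA_notification` that forwards the calculator's display text to braille, plus the extra result gestures and the switch from `reportFocus` to `ui.message`. Condensed for readability, and only meaningful inside an NVDA app module where these imports exist:

```python
# Condensed from the golden diff; not runnable outside NVDA itself.
import appModuleHandler
import braille


class AppModule(appModuleHandler.AppModule):
    def event_UIA_notification(self, obj, nextHandler, displayString=None, activityId=None, **kwargs):
        # #12268: for "DisplayUpdated", announce display strings in braille and move on.
        if activityId == "DisplayUpdated":
            braille.handler.message(displayString)
        # ...the original notification filtering then continues unchanged...
```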
gh_patches_debug_34427
|
rasdani/github-patches
|
git_diff
|
mozilla__pontoon-2520
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add an option to link to multiple strings
*This issue was created automatically by a [script](https://github.com/mathjazz/bugzilla2github/).*
## [Bug 1324933](https://bugzilla.mozilla.org/show_bug.cgi?id=1324933)
Bug Reporter: @tomer
CC: @gaby2300, @ItielMaN, @mathjazz
Blocker for: [Bug 1390931](https://bugzilla.mozilla.org/show_bug.cgi?id=1390931)
In dxr.mozilla.org I can select multiple lines in a file and then have a link that will show these lines highlighted. I'd suggest having a similar feature in pontoon, so we could send a team member a link that contains multiple strings, instead of one of the following workarounds:
* Sending one string per link
* Sending a link to search results or a resource, and asking them to look at the second string, the third from the bottom, etc.
I'd like to have a link similar to this one:
https://pontoon.mozilla.org/he/firefox-aurora/all-resources/?string=75295,75296
… Which will show me both strings in the list.
As for the UI for creating such lists, I guess having control-click to toggle multiple selections will be enough; this is an advanced feature after all. ☺
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pontoon/projects/management/commands/send_review_notifications.py`
Content:
```
1 from collections import defaultdict
2 from datetime import timedelta
3 from urllib.parse import urlencode
4
5 from django.core.management.base import BaseCommand
6 from django.db.models import Q
7 from django.urls import reverse
8 from django.utils import timezone
9 from notifications.signals import notify
10 from pontoon.base.models import Translation
11
12
13 class Command(BaseCommand):
14 help = "Notify translators about their newly reviewed suggestions"
15
16 def get_description(self, author, notifyData):
17 desc = "Your suggestions have been reviewed:\n<ul>"
18
19 for (locale, project), (approved, rejected) in notifyData.items():
20 url = reverse(
21 "pontoon.translate",
22 kwargs={
23 "locale": locale.code,
24 "project": project.slug,
25 "resource": "all-resources",
26 },
27 )
28 url += "?" + urlencode({"author": author.email})
29 if len(approved) == 1 and len(rejected) == 0:
30 url += "&" + urlencode({"string": approved[0]})
31 elif len(approved) == 0 and len(rejected) == 1:
32 url += "&" + urlencode({"string": rejected[0]})
33
34 # Filter out rejections where the author's own suggestion replaced the previous
35 rejected = [x for x in rejected if x not in approved]
36
37 if len(approved) == 0:
38 msg = f"{len(rejected)} Rejected"
39 else:
40 msg = f"{len(approved)} Approved"
41 if len(rejected) > 0:
42 msg += f", {len(rejected)} Rejected"
43
44 desc += (
45 f'\n<li><a href="{url}">{project.name} ({locale.code})</a>: {msg}</li>'
46 )
47
48 return desc + "\n</ul>"
49
50 def handle(self, *args, **options):
51 """
52 This command sends notifications about newly reviewed
53 suggestions to the authors of those suggestions.
54
55 The command is designed to run on a daily basis.
56 """
57 self.stdout.write("Sending review notifications...")
58
59 # (author) -> (locale, project) -> (approved, rejected)
60 data = defaultdict(lambda: defaultdict(lambda: (list(), list())))
61 start = timezone.now() - timedelta(days=1)
62 for suggestion in Translation.objects.filter(
63 (Q(approved_date__gt=start) | Q(rejected_date__gt=start))
64 & Q(user__profile__review_notifications=True)
65 ):
66 author = suggestion.user
67 locale = suggestion.locale
68 project = suggestion.entity.resource.project
69
70 if suggestion.approved and suggestion.approved_user != author:
71 data[author][(locale, project)][0].append(suggestion.entity.pk)
72 elif suggestion.rejected and suggestion.rejected_user != author:
73 data[author][(locale, project)][1].append(suggestion.entity.pk)
74
75 for author, notifyData in data.items():
76 desc = self.get_description(author, notifyData)
77 notify.send(
78 sender=author,
79 recipient=author,
80 verb="has reviewed suggestions",
81 description=desc,
82 )
83
84 self.stdout.write(f"Sent {len(data)} review notifications.")
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pontoon/projects/management/commands/send_review_notifications.py b/pontoon/projects/management/commands/send_review_notifications.py
--- a/pontoon/projects/management/commands/send_review_notifications.py
+++ b/pontoon/projects/management/commands/send_review_notifications.py
@@ -1,6 +1,5 @@
from collections import defaultdict
from datetime import timedelta
-from urllib.parse import urlencode
from django.core.management.base import BaseCommand
from django.db.models import Q
@@ -13,7 +12,7 @@
class Command(BaseCommand):
help = "Notify translators about their newly reviewed suggestions"
- def get_description(self, author, notifyData):
+ def get_description(self, notifyData):
desc = "Your suggestions have been reviewed:\n<ul>"
for (locale, project), (approved, rejected) in notifyData.items():
@@ -25,11 +24,8 @@
"resource": "all-resources",
},
)
- url += "?" + urlencode({"author": author.email})
- if len(approved) == 1 and len(rejected) == 0:
- url += "&" + urlencode({"string": approved[0]})
- elif len(approved) == 0 and len(rejected) == 1:
- url += "&" + urlencode({"string": rejected[0]})
+ list = map(str, approved + rejected)
+ url += "?list=" + ",".join(list)
# Filter out rejections where the author's own suggestion replaced the previous
rejected = [x for x in rejected if x not in approved]
@@ -73,7 +69,7 @@
data[author][(locale, project)][1].append(suggestion.entity.pk)
for author, notifyData in data.items():
- desc = self.get_description(author, notifyData)
+ desc = self.get_description(notifyData)
notify.send(
sender=author,
recipient=author,
|
{"golden_diff": "diff --git a/pontoon/projects/management/commands/send_review_notifications.py b/pontoon/projects/management/commands/send_review_notifications.py\n--- a/pontoon/projects/management/commands/send_review_notifications.py\n+++ b/pontoon/projects/management/commands/send_review_notifications.py\n@@ -1,6 +1,5 @@\n from collections import defaultdict\n from datetime import timedelta\n-from urllib.parse import urlencode\n \n from django.core.management.base import BaseCommand\n from django.db.models import Q\n@@ -13,7 +12,7 @@\n class Command(BaseCommand):\n help = \"Notify translators about their newly reviewed suggestions\"\n \n- def get_description(self, author, notifyData):\n+ def get_description(self, notifyData):\n desc = \"Your suggestions have been reviewed:\\n<ul>\"\n \n for (locale, project), (approved, rejected) in notifyData.items():\n@@ -25,11 +24,8 @@\n \"resource\": \"all-resources\",\n },\n )\n- url += \"?\" + urlencode({\"author\": author.email})\n- if len(approved) == 1 and len(rejected) == 0:\n- url += \"&\" + urlencode({\"string\": approved[0]})\n- elif len(approved) == 0 and len(rejected) == 1:\n- url += \"&\" + urlencode({\"string\": rejected[0]})\n+ list = map(str, approved + rejected)\n+ url += \"?list=\" + \",\".join(list)\n \n # Filter out rejections where the author's own suggestion replaced the previous\n rejected = [x for x in rejected if x not in approved]\n@@ -73,7 +69,7 @@\n data[author][(locale, project)][1].append(suggestion.entity.pk)\n \n for author, notifyData in data.items():\n- desc = self.get_description(author, notifyData)\n+ desc = self.get_description(notifyData)\n notify.send(\n sender=author,\n recipient=author,\n", "issue": "Add an option to link to multiple strings\n*This issue was created automatically by a [script](https://github.com/mathjazz/bugzilla2github/).*\n## [Bug 1324933](https://bugzilla.mozilla.org/show_bug.cgi?id=1324933)\nBug Reporter: @tomer\nCC: @gaby2300, @ItielMaN, @mathjazz\nBlocker for: [Bug 1390931](https://bugzilla.mozilla.org/show_bug.cgi?id=1390931)\n\nIn dxr.mozilla.org I can select multiple lines in a file and than have a link that will show these lines highlighted. I'd suggest having a similar feature in pontoon, so we could send a team member a link that contains multiple strings, instead of one of the following workarounds:\n\n* Sending one string per link\n* Sending link to a search results or a resource, and asking them to look on the second string, the third from the bottom, etc. \n\nI'd like to have a link similar to this one: \nhttps://pontoon.mozilla.org/he/firefox-aurora/all-resources/?string=75295,75296 \n\u2026 Which will show me both strings in the list.\n\n\nAs for the UI for creating such lists, I guess having control-click to toggle multiple selections will be enough; this is an advanced feature after all. 
\u263a\n", "before_files": [{"content": "from collections import defaultdict\nfrom datetime import timedelta\nfrom urllib.parse import urlencode\n\nfrom django.core.management.base import BaseCommand\nfrom django.db.models import Q\nfrom django.urls import reverse\nfrom django.utils import timezone\nfrom notifications.signals import notify\nfrom pontoon.base.models import Translation\n\n\nclass Command(BaseCommand):\n help = \"Notify translators about their newly reviewed suggestions\"\n\n def get_description(self, author, notifyData):\n desc = \"Your suggestions have been reviewed:\\n<ul>\"\n\n for (locale, project), (approved, rejected) in notifyData.items():\n url = reverse(\n \"pontoon.translate\",\n kwargs={\n \"locale\": locale.code,\n \"project\": project.slug,\n \"resource\": \"all-resources\",\n },\n )\n url += \"?\" + urlencode({\"author\": author.email})\n if len(approved) == 1 and len(rejected) == 0:\n url += \"&\" + urlencode({\"string\": approved[0]})\n elif len(approved) == 0 and len(rejected) == 1:\n url += \"&\" + urlencode({\"string\": rejected[0]})\n\n # Filter out rejections where the author's own suggestion replaced the previous\n rejected = [x for x in rejected if x not in approved]\n\n if len(approved) == 0:\n msg = f\"{len(rejected)} Rejected\"\n else:\n msg = f\"{len(approved)} Approved\"\n if len(rejected) > 0:\n msg += f\", {len(rejected)} Rejected\"\n\n desc += (\n f'\\n<li><a href=\"{url}\">{project.name} ({locale.code})</a>: {msg}</li>'\n )\n\n return desc + \"\\n</ul>\"\n\n def handle(self, *args, **options):\n \"\"\"\n This command sends notifications about newly reviewed\n suggestions to the authors of those suggestions.\n\n The command is designed to run on a daily basis.\n \"\"\"\n self.stdout.write(\"Sending review notifications...\")\n\n # (author) -> (locale, project) -> (approved, rejected)\n data = defaultdict(lambda: defaultdict(lambda: (list(), list())))\n start = timezone.now() - timedelta(days=1)\n for suggestion in Translation.objects.filter(\n (Q(approved_date__gt=start) | Q(rejected_date__gt=start))\n & Q(user__profile__review_notifications=True)\n ):\n author = suggestion.user\n locale = suggestion.locale\n project = suggestion.entity.resource.project\n\n if suggestion.approved and suggestion.approved_user != author:\n data[author][(locale, project)][0].append(suggestion.entity.pk)\n elif suggestion.rejected and suggestion.rejected_user != author:\n data[author][(locale, project)][1].append(suggestion.entity.pk)\n\n for author, notifyData in data.items():\n desc = self.get_description(author, notifyData)\n notify.send(\n sender=author,\n recipient=author,\n verb=\"has reviewed suggestions\",\n description=desc,\n )\n\n self.stdout.write(f\"Sent {len(data)} review notifications.\")\n", "path": "pontoon/projects/management/commands/send_review_notifications.py"}], "after_files": [{"content": "from collections import defaultdict\nfrom datetime import timedelta\n\nfrom django.core.management.base import BaseCommand\nfrom django.db.models import Q\nfrom django.urls import reverse\nfrom django.utils import timezone\nfrom notifications.signals import notify\nfrom pontoon.base.models import Translation\n\n\nclass Command(BaseCommand):\n help = \"Notify translators about their newly reviewed suggestions\"\n\n def get_description(self, notifyData):\n desc = \"Your suggestions have been reviewed:\\n<ul>\"\n\n for (locale, project), (approved, rejected) in notifyData.items():\n url = reverse(\n \"pontoon.translate\",\n kwargs={\n \"locale\": 
locale.code,\n \"project\": project.slug,\n \"resource\": \"all-resources\",\n },\n )\n list = map(str, approved + rejected)\n url += \"?list=\" + \",\".join(list)\n\n # Filter out rejections where the author's own suggestion replaced the previous\n rejected = [x for x in rejected if x not in approved]\n\n if len(approved) == 0:\n msg = f\"{len(rejected)} Rejected\"\n else:\n msg = f\"{len(approved)} Approved\"\n if len(rejected) > 0:\n msg += f\", {len(rejected)} Rejected\"\n\n desc += (\n f'\\n<li><a href=\"{url}\">{project.name} ({locale.code})</a>: {msg}</li>'\n )\n\n return desc + \"\\n</ul>\"\n\n def handle(self, *args, **options):\n \"\"\"\n This command sends notifications about newly reviewed\n suggestions to the authors of those suggestions.\n\n The command is designed to run on a daily basis.\n \"\"\"\n self.stdout.write(\"Sending review notifications...\")\n\n # (author) -> (locale, project) -> (approved, rejected)\n data = defaultdict(lambda: defaultdict(lambda: (list(), list())))\n start = timezone.now() - timedelta(days=1)\n for suggestion in Translation.objects.filter(\n (Q(approved_date__gt=start) | Q(rejected_date__gt=start))\n & Q(user__profile__review_notifications=True)\n ):\n author = suggestion.user\n locale = suggestion.locale\n project = suggestion.entity.resource.project\n\n if suggestion.approved and suggestion.approved_user != author:\n data[author][(locale, project)][0].append(suggestion.entity.pk)\n elif suggestion.rejected and suggestion.rejected_user != author:\n data[author][(locale, project)][1].append(suggestion.entity.pk)\n\n for author, notifyData in data.items():\n desc = self.get_description(notifyData)\n notify.send(\n sender=author,\n recipient=author,\n verb=\"has reviewed suggestions\",\n description=desc,\n )\n\n self.stdout.write(f\"Sent {len(data)} review notifications.\")\n", "path": "pontoon/projects/management/commands/send_review_notifications.py"}]}
| 1,408 | 426 |
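The pontoon patch replaces the single-string `?string=` query parameter with a `?list=` parameter that can carry every reviewed entity at once, which is exactly the multi-string link the issue asks for. A condensed sketch of how the notification URL is assembled after the patch, using the example entity ids from the issue:

```python
# Condensed from the golden diff; 75295/75296 are the example ids from the issue.
approved, rejected = [75295], [75296]
url = "/he/firefox-aurora/all-resources/"
url += "?list=" + ",".join(map(str, approved + rejected))
# -> /he/firefox-aurora/all-resources/?list=75295,75296
```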
gh_patches_debug_36189
|
rasdani/github-patches
|
git_diff
|
streamlink__streamlink-3947
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
sportschau: Does not work with the newly introduced bundesliga livestreams (Konferenz)
### Checklist
- [X] This is a plugin issue and not a different kind of issue
- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)
- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)
- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)
### Streamlink version
Latest stable release
### Description
Does not work with the newly introduced bundesliga livestreams (Konferenz).
e.g. https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html
It works neither under Windows nor Linux. The Linux version (Mint 19) was installed with apt-get, so it may be an older release.
The Windows version is up to date.
### Debug log
```text
Windows:
c:\Users\deiss\Downloads\streamlink>streamlink https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html "best" --loglevel debug
[cli][debug] OS: Windows 10
[cli][debug] Python: 3.7.9
[cli][debug] Streamlink: 2.1.2
[cli][debug] Requests(2.25.1), Socks(1.7.1), Websocket(1.0.0)
[cli][debug] Arguments:
[cli][debug] url=https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html
[cli][debug] stream=['best']
[cli][debug] --config=['c:\\Users\\deiss\\Downloads\\streamlink\\\\streamlinkrc']
[cli][debug] --loglevel=debug
[cli][debug] --rtmp-rtmpdump=c:\Users\deiss\Downloads\streamlink\\rtmpdump\rtmpdump.exe
[cli][debug] --ffmpeg-ffmpeg=c:\Users\deiss\Downloads\streamlink\\ffmpeg\ffmpeg.exe
[cli][info] Found matching plugin sportschau for URL https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html
[plugins.sportschau][debug] Found player js https://deviceids-medp.wdr.de/ondemand/247/2477304.js
[utils.l10n][debug] Language code: de_DE
error: Unable to open URL: https://b'' (HTTPSConnectionPool(host="b''", port=443): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000001BAE8276108>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed')))
Linux:
streamlink https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html "best" --loglevel debug
[cli][info] Found matching plugin sportschau for URL https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html
[plugin.sportschau][info] Found player js http://deviceids-medp.wdr.de/ondemand/247/2477304.js
Traceback (most recent call last):
File "/usr/bin/streamlink", line 11, in <module>
load_entry_point('streamlink==0.10.0', 'console_scripts', 'streamlink')()
File "/usr/share/streamlink/streamlink_cli/main.py", line 1055, in main
handle_url()
File "/usr/share/streamlink/streamlink_cli/main.py", line 486, in handle_url
streams = fetch_streams(plugin)
File "/usr/share/streamlink/streamlink_cli/main.py", line 398, in fetch_streams
sorting_excludes=args.stream_sorting_excludes)
File "/usr/lib/python3/dist-packages/streamlink/plugin/plugin.py", line 385, in get_streams
return self.streams(*args, **kwargs)
File "/usr/lib/python3/dist-packages/streamlink/plugin/plugin.py", line 288, in streams
ostreams = self._get_streams()
File "/usr/lib/python3/dist-packages/streamlink/plugins/sportschau.py", line 40, in _get_streams
return HDSStream.parse_manifest(self.session, stream_metadata['mediaResource']['dflt']['videoURL']).items()
KeyError: 'videoURL'
Error in sys.excepthook:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 145, in apport_excepthook
os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o640), 'wb') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/var/crash/_usr_share_streamlink_streamlink.1000.crash'
Original exception was:
Traceback (most recent call last):
File "/usr/bin/streamlink", line 11, in <module>
load_entry_point('streamlink==0.10.0', 'console_scripts', 'streamlink')()
File "/usr/share/streamlink/streamlink_cli/main.py", line 1055, in main
handle_url()
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/sportschau.py`
Content:
```
1 import logging
2 import re
3
4 from streamlink.plugin import Plugin, pluginmatcher
5 from streamlink.plugin.api import validate
6 from streamlink.stream import HLSStream
7 from streamlink.utils import parse_json, update_scheme
8
9 log = logging.getLogger(__name__)
10
11
12 @pluginmatcher(re.compile(
13 r"https?://(?:\w+\.)*sportschau\.de/"
14 ))
15 class Sportschau(Plugin):
16 _re_player = re.compile(r"https?:(//deviceids-medp.wdr.de/ondemand/\S+\.js)")
17 _re_json = re.compile(r"\$mediaObject.jsonpHelper.storeAndPlay\(({.+})\);?")
18
19 _schema_player = validate.Schema(
20 validate.transform(_re_player.search),
21 validate.any(None, validate.Schema(
22 validate.get(1),
23 validate.transform(lambda url: update_scheme("https:", url))
24 ))
25 )
26 _schema_json = validate.Schema(
27 validate.transform(_re_json.match),
28 validate.get(1),
29 validate.transform(parse_json),
30 validate.get("mediaResource"),
31 validate.get("dflt"),
32 validate.get("videoURL"),
33 validate.transform(lambda url: update_scheme("https:", url))
34 )
35
36 def _get_streams(self):
37 player_js = self.session.http.get(self.url, schema=self._schema_player)
38 if not player_js:
39 return
40
41 log.debug("Found player js {0}".format(player_js))
42
43 hls_url = self.session.http.get(player_js, schema=self._schema_json)
44
45 yield from HLSStream.parse_variant_playlist(self.session, hls_url).items()
46
47
48 __plugin__ = Sportschau
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/streamlink/plugins/sportschau.py b/src/streamlink/plugins/sportschau.py
--- a/src/streamlink/plugins/sportschau.py
+++ b/src/streamlink/plugins/sportschau.py
@@ -3,7 +3,7 @@
from streamlink.plugin import Plugin, pluginmatcher
from streamlink.plugin.api import validate
-from streamlink.stream import HLSStream
+from streamlink.stream import HLSStream, HTTPStream
from streamlink.utils import parse_json, update_scheme
log = logging.getLogger(__name__)
@@ -16,33 +16,34 @@
_re_player = re.compile(r"https?:(//deviceids-medp.wdr.de/ondemand/\S+\.js)")
_re_json = re.compile(r"\$mediaObject.jsonpHelper.storeAndPlay\(({.+})\);?")
- _schema_player = validate.Schema(
- validate.transform(_re_player.search),
- validate.any(None, validate.Schema(
- validate.get(1),
- validate.transform(lambda url: update_scheme("https:", url))
- ))
- )
- _schema_json = validate.Schema(
- validate.transform(_re_json.match),
- validate.get(1),
- validate.transform(parse_json),
- validate.get("mediaResource"),
- validate.get("dflt"),
- validate.get("videoURL"),
- validate.transform(lambda url: update_scheme("https:", url))
- )
-
def _get_streams(self):
- player_js = self.session.http.get(self.url, schema=self._schema_player)
+ player_js = self.session.http.get(self.url, schema=validate.Schema(
+ validate.transform(self._re_player.search),
+ validate.any(None, validate.Schema(
+ validate.get(1),
+ validate.transform(lambda url: update_scheme("https:", url))
+ ))
+ ))
if not player_js:
return
- log.debug("Found player js {0}".format(player_js))
-
- hls_url = self.session.http.get(player_js, schema=self._schema_json)
+ log.debug(f"Found player js {player_js}")
+ data = self.session.http.get(player_js, schema=validate.Schema(
+ validate.transform(self._re_json.match),
+ validate.get(1),
+ validate.transform(parse_json),
+ validate.get("mediaResource"),
+ validate.get("dflt"),
+ {
+ validate.optional("audioURL"): validate.url(),
+ validate.optional("videoURL"): validate.url()
+ }
+ ))
- yield from HLSStream.parse_variant_playlist(self.session, hls_url).items()
+ if data.get("videoURL"):
+ yield from HLSStream.parse_variant_playlist(self.session, update_scheme("https:", data.get("videoURL"))).items()
+ if data.get("audioURL"):
+ yield "audio", HTTPStream(self.session, update_scheme("https:", data.get("audioURL")))
__plugin__ = Sportschau
|
{"golden_diff": "diff --git a/src/streamlink/plugins/sportschau.py b/src/streamlink/plugins/sportschau.py\n--- a/src/streamlink/plugins/sportschau.py\n+++ b/src/streamlink/plugins/sportschau.py\n@@ -3,7 +3,7 @@\n \n from streamlink.plugin import Plugin, pluginmatcher\n from streamlink.plugin.api import validate\n-from streamlink.stream import HLSStream\n+from streamlink.stream import HLSStream, HTTPStream\n from streamlink.utils import parse_json, update_scheme\n \n log = logging.getLogger(__name__)\n@@ -16,33 +16,34 @@\n _re_player = re.compile(r\"https?:(//deviceids-medp.wdr.de/ondemand/\\S+\\.js)\")\n _re_json = re.compile(r\"\\$mediaObject.jsonpHelper.storeAndPlay\\(({.+})\\);?\")\n \n- _schema_player = validate.Schema(\n- validate.transform(_re_player.search),\n- validate.any(None, validate.Schema(\n- validate.get(1),\n- validate.transform(lambda url: update_scheme(\"https:\", url))\n- ))\n- )\n- _schema_json = validate.Schema(\n- validate.transform(_re_json.match),\n- validate.get(1),\n- validate.transform(parse_json),\n- validate.get(\"mediaResource\"),\n- validate.get(\"dflt\"),\n- validate.get(\"videoURL\"),\n- validate.transform(lambda url: update_scheme(\"https:\", url))\n- )\n-\n def _get_streams(self):\n- player_js = self.session.http.get(self.url, schema=self._schema_player)\n+ player_js = self.session.http.get(self.url, schema=validate.Schema(\n+ validate.transform(self._re_player.search),\n+ validate.any(None, validate.Schema(\n+ validate.get(1),\n+ validate.transform(lambda url: update_scheme(\"https:\", url))\n+ ))\n+ ))\n if not player_js:\n return\n \n- log.debug(\"Found player js {0}\".format(player_js))\n-\n- hls_url = self.session.http.get(player_js, schema=self._schema_json)\n+ log.debug(f\"Found player js {player_js}\")\n+ data = self.session.http.get(player_js, schema=validate.Schema(\n+ validate.transform(self._re_json.match),\n+ validate.get(1),\n+ validate.transform(parse_json),\n+ validate.get(\"mediaResource\"),\n+ validate.get(\"dflt\"),\n+ {\n+ validate.optional(\"audioURL\"): validate.url(),\n+ validate.optional(\"videoURL\"): validate.url()\n+ }\n+ ))\n \n- yield from HLSStream.parse_variant_playlist(self.session, hls_url).items()\n+ if data.get(\"videoURL\"):\n+ yield from HLSStream.parse_variant_playlist(self.session, update_scheme(\"https:\", data.get(\"videoURL\"))).items()\n+ if data.get(\"audioURL\"):\n+ yield \"audio\", HTTPStream(self.session, update_scheme(\"https:\", data.get(\"audioURL\")))\n \n \n __plugin__ = Sportschau\n", "issue": "sportschau: Does not work with the newly introduced bundesliga livestreams (Konferenz)\n### Checklist\n\n- [X] This is a plugin issue and not a different kind of issue\n- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)\n- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)\n- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)\n\n### Streamlink version\n\nLatest stable release\n\n### Description\n\nDoes not work with the newly introduced bundesliga livestreams (Konferenz).\r\n\r\ne.g. https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html\r\n\r\nNeither under windows nor linux. 
The linux version (Mint 19) was installed with apt-get, maybe is an older version.\r\nWindows version is actual.\r\n\r\n\n\n### Debug log\n\n```text\nWindows:\r\n\r\nc:\\Users\\deiss\\Downloads\\streamlink>streamlink https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html \"best\" --loglevel debug\r\n[cli][debug] OS: Windows 10\r\n[cli][debug] Python: 3.7.9\r\n[cli][debug] Streamlink: 2.1.2\r\n[cli][debug] Requests(2.25.1), Socks(1.7.1), Websocket(1.0.0)\r\n[cli][debug] Arguments:\r\n[cli][debug] url=https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html\r\n[cli][debug] stream=['best']\r\n[cli][debug] --config=['c:\\\\Users\\\\deiss\\\\Downloads\\\\streamlink\\\\\\\\streamlinkrc']\r\n[cli][debug] --loglevel=debug\r\n[cli][debug] --rtmp-rtmpdump=c:\\Users\\deiss\\Downloads\\streamlink\\\\rtmpdump\\rtmpdump.exe\r\n[cli][debug] --ffmpeg-ffmpeg=c:\\Users\\deiss\\Downloads\\streamlink\\\\ffmpeg\\ffmpeg.exe\r\n[cli][info] Found matching plugin sportschau for URL https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html\r\n[plugins.sportschau][debug] Found player js https://deviceids-medp.wdr.de/ondemand/247/2477304.js\r\n[utils.l10n][debug] Language code: de_DE\r\nerror: Unable to open URL: https://b'' (HTTPSConnectionPool(host=\"b''\", port=443): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000001BAE8276108>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed')))\r\n\r\n\r\nLinux:\r\n\r\nstreamlink https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html \"best\" --loglevel debug\r\n[cli][info] Found matching plugin sportschau for URL https://www.sportschau.de/fussball/bundesliga/audiostream-die-konferenz---bundesliga-live-100.html\r\n[plugin.sportschau][info] Found player js http://deviceids-medp.wdr.de/ondemand/247/2477304.js\r\nTraceback (most recent call last):\r\n File \"/usr/bin/streamlink\", line 11, in <module>\r\n load_entry_point('streamlink==0.10.0', 'console_scripts', 'streamlink')()\r\n File \"/usr/share/streamlink/streamlink_cli/main.py\", line 1055, in main\r\n handle_url()\r\n File \"/usr/share/streamlink/streamlink_cli/main.py\", line 486, in handle_url\r\n streams = fetch_streams(plugin)\r\n File \"/usr/share/streamlink/streamlink_cli/main.py\", line 398, in fetch_streams\r\n sorting_excludes=args.stream_sorting_excludes)\r\n File \"/usr/lib/python3/dist-packages/streamlink/plugin/plugin.py\", line 385, in get_streams\r\n return self.streams(*args, **kwargs)\r\n File \"/usr/lib/python3/dist-packages/streamlink/plugin/plugin.py\", line 288, in streams\r\n ostreams = self._get_streams()\r\n File \"/usr/lib/python3/dist-packages/streamlink/plugins/sportschau.py\", line 40, in _get_streams\r\n return HDSStream.parse_manifest(self.session, stream_metadata['mediaResource']['dflt']['videoURL']).items()\r\nKeyError: 'videoURL'\r\nError in sys.excepthook:\r\nTraceback (most recent call last):\r\n File \"/usr/lib/python3/dist-packages/apport_python_hook.py\", line 145, in apport_excepthook\r\n os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o640), 'wb') as f:\r\nFileNotFoundError: [Errno 2] No such file or directory: '/var/crash/_usr_share_streamlink_streamlink.1000.crash'\r\n\r\nOriginal exception was:\r\nTraceback (most recent call last):\r\n File \"/usr/bin/streamlink\", line 11, in <module>\r\n load_entry_point('streamlink==0.10.0', 
'console_scripts', 'streamlink')()\r\n File \"/usr/share/streamlink/streamlink_cli/main.py\", line 1055, in main\r\n handle_url()\n```\n\n", "before_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\nfrom streamlink.utils import parse_json, update_scheme\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://(?:\\w+\\.)*sportschau\\.de/\"\n))\nclass Sportschau(Plugin):\n _re_player = re.compile(r\"https?:(//deviceids-medp.wdr.de/ondemand/\\S+\\.js)\")\n _re_json = re.compile(r\"\\$mediaObject.jsonpHelper.storeAndPlay\\(({.+})\\);?\")\n\n _schema_player = validate.Schema(\n validate.transform(_re_player.search),\n validate.any(None, validate.Schema(\n validate.get(1),\n validate.transform(lambda url: update_scheme(\"https:\", url))\n ))\n )\n _schema_json = validate.Schema(\n validate.transform(_re_json.match),\n validate.get(1),\n validate.transform(parse_json),\n validate.get(\"mediaResource\"),\n validate.get(\"dflt\"),\n validate.get(\"videoURL\"),\n validate.transform(lambda url: update_scheme(\"https:\", url))\n )\n\n def _get_streams(self):\n player_js = self.session.http.get(self.url, schema=self._schema_player)\n if not player_js:\n return\n\n log.debug(\"Found player js {0}\".format(player_js))\n\n hls_url = self.session.http.get(player_js, schema=self._schema_json)\n\n yield from HLSStream.parse_variant_playlist(self.session, hls_url).items()\n\n\n__plugin__ = Sportschau\n", "path": "src/streamlink/plugins/sportschau.py"}], "after_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream, HTTPStream\nfrom streamlink.utils import parse_json, update_scheme\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://(?:\\w+\\.)*sportschau\\.de/\"\n))\nclass Sportschau(Plugin):\n _re_player = re.compile(r\"https?:(//deviceids-medp.wdr.de/ondemand/\\S+\\.js)\")\n _re_json = re.compile(r\"\\$mediaObject.jsonpHelper.storeAndPlay\\(({.+})\\);?\")\n\n def _get_streams(self):\n player_js = self.session.http.get(self.url, schema=validate.Schema(\n validate.transform(self._re_player.search),\n validate.any(None, validate.Schema(\n validate.get(1),\n validate.transform(lambda url: update_scheme(\"https:\", url))\n ))\n ))\n if not player_js:\n return\n\n log.debug(f\"Found player js {player_js}\")\n data = self.session.http.get(player_js, schema=validate.Schema(\n validate.transform(self._re_json.match),\n validate.get(1),\n validate.transform(parse_json),\n validate.get(\"mediaResource\"),\n validate.get(\"dflt\"),\n {\n validate.optional(\"audioURL\"): validate.url(),\n validate.optional(\"videoURL\"): validate.url()\n }\n ))\n\n if data.get(\"videoURL\"):\n yield from HLSStream.parse_variant_playlist(self.session, update_scheme(\"https:\", data.get(\"videoURL\"))).items()\n if data.get(\"audioURL\"):\n yield \"audio\", HTTPStream(self.session, update_scheme(\"https:\", data.get(\"audioURL\")))\n\n\n__plugin__ = Sportschau\n", "path": "src/streamlink/plugins/sportschau.py"}]}
| 2,047 | 637 |
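The failure mode in this record is visible in the traceback: the old plugin indexed `['mediaResource']['dflt']['videoURL']` unconditionally, but the audio-only "Konferenz" pages expose only `audioURL`, so the lookup raised `KeyError`. The golden diff validates both keys as optional and yields an HLS stream for `videoURL` plus a plain HTTP stream for `audioURL`. A minimal, hypothetical sketch of that branching in plain Python (illustrative data only, not the streamlink API):

```python
# Sketch only: mirrors the key-presence checks the patch adds.
def collect_streams(dflt: dict) -> dict:
    """Return whichever streams the media resource actually advertises."""
    streams = {}
    if dflt.get("videoURL"):
        # the real plugin feeds this URL to HLSStream.parse_variant_playlist
        streams["video"] = dflt["videoURL"]
    if dflt.get("audioURL"):
        # the real plugin wraps this URL in an HTTPStream labelled "audio"
        streams["audio"] = dflt["audioURL"]
    return streams

# An audio-only page no longer raises KeyError:
print(collect_streams({"audioURL": "https://example.invalid/konferenz.mp3"}))
```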
gh_patches_debug_31771
|
rasdani/github-patches
|
git_diff
|
nvaccess__nvda-11883
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Windows 10 Calculator: results are not announced in compact overlay mode
Hi,
Follow-up to PR #9429
### Steps to reproduce:
In Windows 10 Calculator:
1. Switch to standard calculator mode (Alt+number row 1).
2. Press Alt+Up arrow to switch to always on top (compact overlay) mode.
3. Perform some calculations.
### Actual behavior:
Results are not announced.
### Expected behavior:
Results are announced.
### System configuration
#### NVDA installed/portable/running from source:
Installed
#### NVDA version:
alpha-21429,874230a7
#### Windows version:
Windows 10 Version 20H2 (build 19042.662 beta)
#### Name and version of other software in use when reproducing the issue:
Calculator 10.2010.0.0
#### Other information about your system:
N/A
### Other questions
#### Does the issue still occur after restarting your computer?
Yes
#### Have you tried any other versions of NVDA? If so, please report their behaviors.
Same behavior in 2020.3.
#### If addons are disabled, is your problem still occuring?
Yes
#### Did you try to run the COM registry fixing tool in NVDA menu / tools?
Not applicable
### Additional context and solution
Caused by a missing Automation Id that should be included in result announcement script (UIA Automation Id is different for always on top result element). The same PR for #11880 will be applicable to this issue as well.
Thanks.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/appModules/calculator.py`
Content:
```
1 # A part of NonVisual Desktop Access (NVDA)
2 # Copyright (C) 2020 NV Access Limited, Joseph Lee
3 # This file is covered by the GNU General Public License.
4 # See the file COPYING for more details.
5
6 """App module for Windows 10 Calculator"""
7
8 import appModuleHandler
9 import api
10 from NVDAObjects.UIA import UIA
11 import queueHandler
12 import ui
13 import scriptHandler
14
15 # #9428: do not announce current values until calculations are done in order to avoid repetitions.
16 noCalculatorEntryAnnouncements = [
17 # Display field with Calculator set to full screen mode.
18 "CalculatorResults",
19 # In the middle of a calculation expression entry.
20 "CalculatorExpression",
21 # Results display with Calculator set to compact overlay i.e. always on top mode.
22 "CalculatorAlwaysOnTopResults",
23 # Calculator expressions with Calculator set to always on top mode.
24 "ExpressionContainer",
25 # Date range selector.
26 "ContentPresenter",
27 # Briefly shown when closing date calculation calendar.
28 "Light Dismiss",
29 ]
30
31
32 class AppModule(appModuleHandler.AppModule):
33
34 _shouldAnnounceResult = False
35 # Name change says the same thing multiple times for some items.
36 _resultsCache = ""
37
38 def event_nameChange(self, obj, nextHandler):
39 if not isinstance(obj, UIA):
40 return
41 # No, announce value changes immediately except for calculator results and expressions.
42 if (
43 obj.UIAAutomationId in noCalculatorEntryAnnouncements
44 or obj.UIAElement.cachedClassName == "LandmarkTarget"
45 ):
46 self._shouldAnnounceResult = False
47 # For the rest:
48 elif (
49 obj.UIAAutomationId not in noCalculatorEntryAnnouncements
50 and obj.name != self._resultsCache
51 ):
52 # For unit conversion, UIA notification event presents much better messages.
53 # For date calculation, live region change event is also fired for difference between dates.
54 if obj.UIAAutomationId not in ("Value1", "Value2", "DateDiffAllUnitsResultLabel"):
55 ui.message(obj.name)
56 self._resultsCache = obj.name
57 if not self._shouldAnnounceResult:
58 return
59 self._shouldAnnounceResult = False
60 nextHandler()
61
62 def event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):
63 try:
64 shouldAnnounceNotification = (
65 obj.previous.UIAAutomationId in
66 ("numberPad", "UnitConverterRootGrid")
67 )
68 except AttributeError:
69 resultElement = api.getForegroundObject().children[1].lastChild
70 # Redesigned in 2019 due to introduction of "always on top" i.e. compact overlay mode.
71 if resultElement.UIAElement.cachedClassName != "LandmarkTarget":
72 resultElement = resultElement.parent.children[1]
73 shouldAnnounceNotification = (
74 resultElement
75 and resultElement.firstChild
76 and resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements
77 )
78 # Display updated activity ID seen when entering calculations should be ignored
79 # as as it is redundant if speak typed characters is on.
80 if shouldAnnounceNotification or activityId != "DisplayUpdated":
81 nextHandler()
82
83 # A list of native commands to handle calculator result announcement.
84 _calculatorResultGestures = ("kb:enter", "kb:numpadEnter", "kb:escape")
85
86 @scriptHandler.script(gestures=_calculatorResultGestures)
87 def script_calculatorResult(self, gesture):
88 # To prevent double focus announcement, check where we are.
89 focus = api.getFocusObject()
90 gesture.send()
91 # In redstone, calculator result keeps firing name change,
92 # so tell it to do so if and only if enter has been pressed.
93 self._shouldAnnounceResult = True
94 # Hack: only announce display text when an actual calculator button (usually equals button) is pressed.
95 # In redstone, pressing enter does not move focus to equals button.
96 if isinstance(focus, UIA):
97 if focus.UIAAutomationId == "CalculatorResults":
98 queueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)
99 else:
100 resultsScreen = api.getForegroundObject().children[1].lastChild
101 if isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == "LandmarkTarget":
102 # And no, do not allow focus to move.
103 queueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/source/appModules/calculator.py b/source/appModules/calculator.py
--- a/source/appModules/calculator.py
+++ b/source/appModules/calculator.py
@@ -26,6 +26,10 @@
"ContentPresenter",
# Briefly shown when closing date calculation calendar.
"Light Dismiss",
+ # Unit conversion/convert from.
+ "Value1",
+ # Unit conversion/converts into.
+ "Value2",
]
@@ -49,9 +53,10 @@
obj.UIAAutomationId not in noCalculatorEntryAnnouncements
and obj.name != self._resultsCache
):
- # For unit conversion, UIA notification event presents much better messages.
+ # For unit conversion, both name change and notification events are fired,
+ # although UIA notification event presents much better messages.
# For date calculation, live region change event is also fired for difference between dates.
- if obj.UIAAutomationId not in ("Value1", "Value2", "DateDiffAllUnitsResultLabel"):
+ if obj.UIAAutomationId != "DateDiffAllUnitsResultLabel":
ui.message(obj.name)
self._resultsCache = obj.name
if not self._shouldAnnounceResult:
@@ -94,7 +99,7 @@
# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.
# In redstone, pressing enter does not move focus to equals button.
if isinstance(focus, UIA):
- if focus.UIAAutomationId == "CalculatorResults":
+ if focus.UIAAutomationId in ("CalculatorResults", "CalculatorAlwaysOnTopResults"):
queueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)
else:
resultsScreen = api.getForegroundObject().children[1].lastChild
|
{"golden_diff": "diff --git a/source/appModules/calculator.py b/source/appModules/calculator.py\n--- a/source/appModules/calculator.py\n+++ b/source/appModules/calculator.py\n@@ -26,6 +26,10 @@\n \t\"ContentPresenter\",\n \t# Briefly shown when closing date calculation calendar.\n \t\"Light Dismiss\",\n+\t# Unit conversion/convert from.\n+\t\"Value1\",\n+\t# Unit conversion/converts into.\n+\t\"Value2\",\n ]\n \n \n@@ -49,9 +53,10 @@\n \t\t\tobj.UIAAutomationId not in noCalculatorEntryAnnouncements\n \t\t\tand obj.name != self._resultsCache\n \t\t):\n-\t\t\t# For unit conversion, UIA notification event presents much better messages.\n+\t\t\t# For unit conversion, both name change and notification events are fired,\n+\t\t\t# although UIA notification event presents much better messages.\n \t\t\t# For date calculation, live region change event is also fired for difference between dates.\n-\t\t\tif obj.UIAAutomationId not in (\"Value1\", \"Value2\", \"DateDiffAllUnitsResultLabel\"):\n+\t\t\tif obj.UIAAutomationId != \"DateDiffAllUnitsResultLabel\":\n \t\t\t\tui.message(obj.name)\n \t\t\tself._resultsCache = obj.name\n \t\tif not self._shouldAnnounceResult:\n@@ -94,7 +99,7 @@\n \t\t# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.\n \t\t# In redstone, pressing enter does not move focus to equals button.\n \t\tif isinstance(focus, UIA):\n-\t\t\tif focus.UIAAutomationId == \"CalculatorResults\":\n+\t\t\tif focus.UIAAutomationId in (\"CalculatorResults\", \"CalculatorAlwaysOnTopResults\"):\n \t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)\n \t\t\telse:\n \t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n", "issue": "Windows 10 Calculator: results are not announced in compact overlay mode\nHi,\r\n\r\nFollow-up to PR #9429 \r\n\r\n### Steps to reproduce:\r\nIn Windows 10 Calculator:\r\n\r\n1. Switch to standard calculator mode (Alt+number row 1).\r\n2. Press Alt+Up arrow to switch to always on top (compact overlay) mode.\r\n3. Perform some calculations.\r\n\r\n### Actual behavior:\r\nResults are not announced.\r\n\r\n### Expected behavior:\r\nResults are announced.\r\n\r\n### System configuration\r\n#### NVDA installed/portable/running from source:\r\nInstalled\r\n\r\n#### NVDA version:\r\nalpha-21429,874230a7\r\n\r\n#### Windows version:\r\nWindows 10 Version 20H2 (build 19042.662 beta)\r\n\r\n#### Name and version of other software in use when reproducing the issue:\r\nCalculator 10.2010.0.0\r\n\r\n#### Other information about your system:\r\nN/A\r\n\r\n### Other questions\r\n#### Does the issue still occur after restarting your computer?\r\nYes\r\n\r\n#### Have you tried any other versions of NVDA? If so, please report their behaviors.\r\nSame behavior in 2020.3.\r\n\r\n#### If addons are disabled, is your problem still occuring?\r\nYes\r\n\r\n#### Did you try to run the COM registry fixing tool in NVDA menu / tools?\r\nNot applicable\r\n\r\n### Additional context and solution\r\nCaused by a missing Automation Id that should be included in result announcement script (UIA Automation Id is different for always on top result element). 
The same PR for #11880 will be applicable to this issue as well.\r\n\r\nThanks.\n", "before_files": [{"content": "# A part of NonVisual Desktop Access (NVDA)\n# Copyright (C) 2020 NV Access Limited, Joseph Lee\n# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n\n\"\"\"App module for Windows 10 Calculator\"\"\"\n\nimport appModuleHandler\nimport api\nfrom NVDAObjects.UIA import UIA\nimport queueHandler\nimport ui\nimport scriptHandler\n\n# #9428: do not announce current values until calculations are done in order to avoid repetitions.\nnoCalculatorEntryAnnouncements = [\n\t# Display field with Calculator set to full screen mode.\n\t\"CalculatorResults\",\n\t# In the middle of a calculation expression entry.\n\t\"CalculatorExpression\",\n\t# Results display with Calculator set to compact overlay i.e. always on top mode.\n\t\"CalculatorAlwaysOnTopResults\",\n\t# Calculator expressions with Calculator set to always on top mode.\n\t\"ExpressionContainer\",\n\t# Date range selector.\n\t\"ContentPresenter\",\n\t# Briefly shown when closing date calculation calendar.\n\t\"Light Dismiss\",\n]\n\n\nclass AppModule(appModuleHandler.AppModule):\n\n\t_shouldAnnounceResult = False\n\t# Name change says the same thing multiple times for some items.\n\t_resultsCache = \"\"\n\n\tdef event_nameChange(self, obj, nextHandler):\n\t\tif not isinstance(obj, UIA):\n\t\t\treturn\n\t\t# No, announce value changes immediately except for calculator results and expressions.\n\t\tif (\n\t\t\tobj.UIAAutomationId in noCalculatorEntryAnnouncements\n\t\t\tor obj.UIAElement.cachedClassName == \"LandmarkTarget\"\n\t\t):\n\t\t\tself._shouldAnnounceResult = False\n\t\t# For the rest:\n\t\telif (\n\t\t\tobj.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\tand obj.name != self._resultsCache\n\t\t):\n\t\t\t# For unit conversion, UIA notification event presents much better messages.\n\t\t\t# For date calculation, live region change event is also fired for difference between dates.\n\t\t\tif obj.UIAAutomationId not in (\"Value1\", \"Value2\", \"DateDiffAllUnitsResultLabel\"):\n\t\t\t\tui.message(obj.name)\n\t\t\tself._resultsCache = obj.name\n\t\tif not self._shouldAnnounceResult:\n\t\t\treturn\n\t\tself._shouldAnnounceResult = False\n\t\tnextHandler()\n\n\tdef event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):\n\t\ttry:\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tobj.previous.UIAAutomationId in\n\t\t\t\t(\"numberPad\", \"UnitConverterRootGrid\")\n\t\t\t)\n\t\texcept AttributeError:\n\t\t\tresultElement = api.getForegroundObject().children[1].lastChild\n\t\t\t# Redesigned in 2019 due to introduction of \"always on top\" i.e. 
compact overlay mode.\n\t\t\tif resultElement.UIAElement.cachedClassName != \"LandmarkTarget\":\n\t\t\t\tresultElement = resultElement.parent.children[1]\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tresultElement\n\t\t\t\tand resultElement.firstChild\n\t\t\t\tand resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\t)\n\t\t# Display updated activity ID seen when entering calculations should be ignored\n\t\t# as as it is redundant if speak typed characters is on.\n\t\tif shouldAnnounceNotification or activityId != \"DisplayUpdated\":\n\t\t\tnextHandler()\n\n\t# A list of native commands to handle calculator result announcement.\n\t_calculatorResultGestures = (\"kb:enter\", \"kb:numpadEnter\", \"kb:escape\")\n\n\[email protected](gestures=_calculatorResultGestures)\n\tdef script_calculatorResult(self, gesture):\n\t\t# To prevent double focus announcement, check where we are.\n\t\tfocus = api.getFocusObject()\n\t\tgesture.send()\n\t\t# In redstone, calculator result keeps firing name change,\n\t\t# so tell it to do so if and only if enter has been pressed.\n\t\tself._shouldAnnounceResult = True\n\t\t# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.\n\t\t# In redstone, pressing enter does not move focus to equals button.\n\t\tif isinstance(focus, UIA):\n\t\t\tif focus.UIAAutomationId == \"CalculatorResults\":\n\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)\n\t\t\telse:\n\t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n\t\t\t\tif isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == \"LandmarkTarget\":\n\t\t\t\t\t# And no, do not allow focus to move.\n\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)\n", "path": "source/appModules/calculator.py"}], "after_files": [{"content": "# A part of NonVisual Desktop Access (NVDA)\n# Copyright (C) 2020 NV Access Limited, Joseph Lee\n# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n\n\"\"\"App module for Windows 10 Calculator\"\"\"\n\nimport appModuleHandler\nimport api\nfrom NVDAObjects.UIA import UIA\nimport queueHandler\nimport ui\nimport scriptHandler\n\n# #9428: do not announce current values until calculations are done in order to avoid repetitions.\nnoCalculatorEntryAnnouncements = [\n\t# Display field with Calculator set to full screen mode.\n\t\"CalculatorResults\",\n\t# In the middle of a calculation expression entry.\n\t\"CalculatorExpression\",\n\t# Results display with Calculator set to compact overlay i.e. 
always on top mode.\n\t\"CalculatorAlwaysOnTopResults\",\n\t# Calculator expressions with Calculator set to always on top mode.\n\t\"ExpressionContainer\",\n\t# Date range selector.\n\t\"ContentPresenter\",\n\t# Briefly shown when closing date calculation calendar.\n\t\"Light Dismiss\",\n\t# Unit conversion/convert from.\n\t\"Value1\",\n\t# Unit conversion/converts into.\n\t\"Value2\",\n]\n\n\nclass AppModule(appModuleHandler.AppModule):\n\n\t_shouldAnnounceResult = False\n\t# Name change says the same thing multiple times for some items.\n\t_resultsCache = \"\"\n\n\tdef event_nameChange(self, obj, nextHandler):\n\t\tif not isinstance(obj, UIA):\n\t\t\treturn\n\t\t# No, announce value changes immediately except for calculator results and expressions.\n\t\tif (\n\t\t\tobj.UIAAutomationId in noCalculatorEntryAnnouncements\n\t\t\tor obj.UIAElement.cachedClassName == \"LandmarkTarget\"\n\t\t):\n\t\t\tself._shouldAnnounceResult = False\n\t\t# For the rest:\n\t\telif (\n\t\t\tobj.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\tand obj.name != self._resultsCache\n\t\t):\n\t\t\t# For unit conversion, both name change and notification events are fired,\n\t\t\t# although UIA notification event presents much better messages.\n\t\t\t# For date calculation, live region change event is also fired for difference between dates.\n\t\t\tif obj.UIAAutomationId != \"DateDiffAllUnitsResultLabel\":\n\t\t\t\tui.message(obj.name)\n\t\t\tself._resultsCache = obj.name\n\t\tif not self._shouldAnnounceResult:\n\t\t\treturn\n\t\tself._shouldAnnounceResult = False\n\t\tnextHandler()\n\n\tdef event_UIA_notification(self, obj, nextHandler, activityId=None, **kwargs):\n\t\ttry:\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tobj.previous.UIAAutomationId in\n\t\t\t\t(\"numberPad\", \"UnitConverterRootGrid\")\n\t\t\t)\n\t\texcept AttributeError:\n\t\t\tresultElement = api.getForegroundObject().children[1].lastChild\n\t\t\t# Redesigned in 2019 due to introduction of \"always on top\" i.e. 
compact overlay mode.\n\t\t\tif resultElement.UIAElement.cachedClassName != \"LandmarkTarget\":\n\t\t\t\tresultElement = resultElement.parent.children[1]\n\t\t\tshouldAnnounceNotification = (\n\t\t\t\tresultElement\n\t\t\t\tand resultElement.firstChild\n\t\t\t\tand resultElement.firstChild.UIAAutomationId not in noCalculatorEntryAnnouncements\n\t\t\t)\n\t\t# Display updated activity ID seen when entering calculations should be ignored\n\t\t# as as it is redundant if speak typed characters is on.\n\t\tif shouldAnnounceNotification or activityId != \"DisplayUpdated\":\n\t\t\tnextHandler()\n\n\t# A list of native commands to handle calculator result announcement.\n\t_calculatorResultGestures = (\"kb:enter\", \"kb:numpadEnter\", \"kb:escape\")\n\n\[email protected](gestures=_calculatorResultGestures)\n\tdef script_calculatorResult(self, gesture):\n\t\t# To prevent double focus announcement, check where we are.\n\t\tfocus = api.getFocusObject()\n\t\tgesture.send()\n\t\t# In redstone, calculator result keeps firing name change,\n\t\t# so tell it to do so if and only if enter has been pressed.\n\t\tself._shouldAnnounceResult = True\n\t\t# Hack: only announce display text when an actual calculator button (usually equals button) is pressed.\n\t\t# In redstone, pressing enter does not move focus to equals button.\n\t\tif isinstance(focus, UIA):\n\t\t\tif focus.UIAAutomationId in (\"CalculatorResults\", \"CalculatorAlwaysOnTopResults\"):\n\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, focus.reportFocus)\n\t\t\telse:\n\t\t\t\tresultsScreen = api.getForegroundObject().children[1].lastChild\n\t\t\t\tif isinstance(resultsScreen, UIA) and resultsScreen.UIAElement.cachedClassName == \"LandmarkTarget\":\n\t\t\t\t\t# And no, do not allow focus to move.\n\t\t\t\t\tqueueHandler.queueFunction(queueHandler.eventQueue, resultsScreen.firstChild.reportFocus)\n", "path": "source/appModules/calculator.py"}]}
| 1,810 | 408 |
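The golden diff above widens the automation-ID check so the compact-overlay result element (`CalculatorAlwaysOnTopResults`) is reported like the full-screen one, and moves the unit-conversion fields `Value1`/`Value2` into the ignore list because their notification events already announce conversions. A hypothetical condensation of the changed focus-report condition:

```python
# Sketch only: the membership test the patch generalizes.
RESULT_AUTOMATION_IDS = ("CalculatorResults", "CalculatorAlwaysOnTopResults")

def should_report_result(automation_id: str) -> bool:
    # True for both full-screen and always-on-top (compact overlay) displays.
    return automation_id in RESULT_AUTOMATION_IDS

assert should_report_result("CalculatorAlwaysOnTopResults")
assert not should_report_result("ExpressionContainer")
```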
gh_patches_debug_27260
|
rasdani/github-patches
|
git_diff
|
ContinualAI__avalanche-52
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
LWF "warmup_train" fun never used
Hi @AntonioCarta, I've noticed this function in the `LearningWithoutForgetting` class is never used, do we need it?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `avalanche/training/strategies/lwf/lwf.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3
4 ################################################################################
5 # Copyright (c) 2020 ContinualAI Research #
6 # Copyrights licensed under the CC BY 4.0 License. #
7 # See the accompanying LICENSE file for terms. #
8 # #
9 # Date: 1-05-2020 #
10 # Author(s): ContinualAI #
11 # E-mail: [email protected] #
12 # Website: clair.continualai.org #
13 ################################################################################
14
15 """ Rehearsal Strategy Implementation """
16
17 # Python 2-3 compatible
18 from __future__ import print_function
19 from __future__ import division
20 from __future__ import absolute_import
21
22 from training.strategies.strategy import Strategy
23 from avalanche.evaluation.eval_protocol import EvalProtocol
24 from avalanche.evaluation.metrics import ACC
25 from avalanche.training.utils import pad_data, shuffle_in_unison
26 import torch
27 import torch.nn.functional as F
28 import numpy as np
29 import copy
30
31
32 def distillation_loss(y_pred, y_teacher, temperature):
33 """ Distillation loss. """
34 scale = y_teacher.shape[-1] # kl_div is normalized by element instead of observation
35 log_p = F.log_softmax(y_pred / temperature, dim=1)
36 q = F.softmax(y_teacher / temperature, dim=1)
37 res = scale * F.kl_div(log_p, q, reduction='mean')
38 return res
39
40
41 class LearningWithoutForgetting(Strategy):
42 def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=2, optimizer=None,
43 criterion=torch.nn.CrossEntropyLoss(), mb_size=256,
44 train_ep=2, device=None, preproc=None,
45 eval_protocol=EvalProtocol(metrics=[ACC()])):
46 """
47 Learning without Forgetting Strategy.
48
49 paper: https://arxiv.org/abs/1606.09282
50 original implementation (Matlab): https://github.com/lizhitwo/LearningWithoutForgetting
51 reference implementation (pytorch): https://github.com/arunmallya/packnet/blob/master/src/lwf.py
52
53 Args:
54 classes_per_task:
55 alpha: distillation loss coefficient. Can be an integer or a list of values (one for each task).
56 distillation_loss_T: distillation loss temperature
57 warmup_epochs: number of warmup epochs training only the new parameters.
58 """
59 super(LearningWithoutForgetting, self).__init__(
60 model, optimizer, criterion, mb_size, train_ep, multi_head=False,
61 device=device, preproc=preproc, eval_protocol=eval_protocol
62 )
63
64 # LwF parameters
65 self.classes_per_task = classes_per_task
66 self.prev_model = None
67 self.distillation_loss_T = distillation_loss_T
68 self.alpha = alpha
69 self.warmup_epochs = warmup_epochs
70
71 def warmup_train(self):
72 """ Train only the new parameters for the first epochs. """
73 # add only the last layer to the trainable parameters
74 opt = torch.optim.SGD(lr=0.01, params=self.model.classifier.parameters())
75
76 train_x, train_y, it_x_ep = self.preproc_batch_data(self.x, self.y, self.t)
77 model = self.model.to(self.device)
78
79 train_x = torch.tensor(train_x, dtype=torch.float)
80 train_y = torch.tensor(train_y, dtype=torch.long)
81 for ep in range(self.train_ep):
82 for it in range(it_x_ep):
83 start = it * self.mb_size
84 end = (it + 1) * self.mb_size
85
86 self.optimizer.zero_grad()
87 x_mb = train_x[start:end].to(self.device)
88 y_mb = train_y[start:end].to(self.device)
89 logits = model(x_mb)
90 # loss computed only on the new classes
91 loss = self.criterion(logits[:, self.t*self.classes_per_task:(self.t+1)*self.classes_per_task],
92 y_mb - self.t*self.classes_per_task)
93 loss.backward()
94 opt.step()
95
96 def compute_loss(self, logits, y_mb):
97 dist_loss = 0
98 if self.prev_model is not None:
99 y_prev = self.prev_model(self.x_mb).detach()
100 loss = self.criterion(logits, y_mb)
101 dist_loss += distillation_loss(logits, y_prev, self.distillation_loss_T)
102
103 if isinstance(self.alpha, list):
104 loss = loss + self.alpha[self.t] * dist_loss
105 else:
106 loss = loss + self.alpha * dist_loss
107 else:
108 loss = self.criterion(logits, y_mb)
109 return loss
110
111 def after_train(self):
112 self.prev_model = copy.deepcopy(self.model)
113
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/avalanche/training/strategies/lwf/lwf.py b/avalanche/training/strategies/lwf/lwf.py
--- a/avalanche/training/strategies/lwf/lwf.py
+++ b/avalanche/training/strategies/lwf/lwf.py
@@ -39,7 +39,7 @@
class LearningWithoutForgetting(Strategy):
- def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=2, optimizer=None,
+ def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=0, optimizer=None,
criterion=torch.nn.CrossEntropyLoss(), mb_size=256,
train_ep=2, device=None, preproc=None,
eval_protocol=EvalProtocol(metrics=[ACC()])):
@@ -78,7 +78,7 @@
train_x = torch.tensor(train_x, dtype=torch.float)
train_y = torch.tensor(train_y, dtype=torch.long)
- for ep in range(self.train_ep):
+ for ep in range(self.warmup_epochs):
for it in range(it_x_ep):
start = it * self.mb_size
end = (it + 1) * self.mb_size
@@ -108,6 +108,9 @@
loss = self.criterion(logits, y_mb)
return loss
+ def before_train(self):
+ self.warmup_train()
+
def after_train(self):
self.prev_model = copy.deepcopy(self.model)
|
{"golden_diff": "diff --git a/avalanche/training/strategies/lwf/lwf.py b/avalanche/training/strategies/lwf/lwf.py\n--- a/avalanche/training/strategies/lwf/lwf.py\n+++ b/avalanche/training/strategies/lwf/lwf.py\n@@ -39,7 +39,7 @@\n \n \n class LearningWithoutForgetting(Strategy):\n- def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=2, optimizer=None,\n+ def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=0, optimizer=None,\n criterion=torch.nn.CrossEntropyLoss(), mb_size=256,\n train_ep=2, device=None, preproc=None,\n eval_protocol=EvalProtocol(metrics=[ACC()])):\n@@ -78,7 +78,7 @@\n \n train_x = torch.tensor(train_x, dtype=torch.float)\n train_y = torch.tensor(train_y, dtype=torch.long)\n- for ep in range(self.train_ep):\n+ for ep in range(self.warmup_epochs):\n for it in range(it_x_ep):\n start = it * self.mb_size\n end = (it + 1) * self.mb_size\n@@ -108,6 +108,9 @@\n loss = self.criterion(logits, y_mb)\n return loss\n \n+ def before_train(self):\n+ self.warmup_train()\n+\n def after_train(self):\n self.prev_model = copy.deepcopy(self.model)\n", "issue": "LWF \"warmup_train\" fun never used\nHi @AntonioCarta, I've noticed this function in the `LearningWithoutForgetting` class is never used, do we need it?\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n################################################################################\n# Copyright (c) 2020 ContinualAI Research #\n# Copyrights licensed under the CC BY 4.0 License. #\n# See the accompanying LICENSE file for terms. #\n# #\n# Date: 1-05-2020 #\n# Author(s): ContinualAI #\n# E-mail: [email protected] #\n# Website: clair.continualai.org #\n################################################################################\n\n\"\"\" Rehearsal Strategy Implementation \"\"\"\n\n# Python 2-3 compatible\nfrom __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nfrom training.strategies.strategy import Strategy\nfrom avalanche.evaluation.eval_protocol import EvalProtocol\nfrom avalanche.evaluation.metrics import ACC\nfrom avalanche.training.utils import pad_data, shuffle_in_unison\nimport torch\nimport torch.nn.functional as F\nimport numpy as np\nimport copy\n\n\ndef distillation_loss(y_pred, y_teacher, temperature):\n \"\"\" Distillation loss. \"\"\"\n scale = y_teacher.shape[-1] # kl_div is normalized by element instead of observation\n log_p = F.log_softmax(y_pred / temperature, dim=1)\n q = F.softmax(y_teacher / temperature, dim=1)\n res = scale * F.kl_div(log_p, q, reduction='mean')\n return res\n\n\nclass LearningWithoutForgetting(Strategy):\n def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=2, optimizer=None,\n criterion=torch.nn.CrossEntropyLoss(), mb_size=256,\n train_ep=2, device=None, preproc=None,\n eval_protocol=EvalProtocol(metrics=[ACC()])):\n \"\"\"\n Learning without Forgetting Strategy.\n\n paper: https://arxiv.org/abs/1606.09282\n original implementation (Matlab): https://github.com/lizhitwo/LearningWithoutForgetting\n reference implementation (pytorch): https://github.com/arunmallya/packnet/blob/master/src/lwf.py\n\n Args:\n classes_per_task:\n alpha: distillation loss coefficient. 
Can be an integer or a list of values (one for each task).\n distillation_loss_T: distillation loss temperature\n warmup_epochs: number of warmup epochs training only the new parameters.\n \"\"\"\n super(LearningWithoutForgetting, self).__init__(\n model, optimizer, criterion, mb_size, train_ep, multi_head=False,\n device=device, preproc=preproc, eval_protocol=eval_protocol\n )\n\n # LwF parameters\n self.classes_per_task = classes_per_task\n self.prev_model = None\n self.distillation_loss_T = distillation_loss_T\n self.alpha = alpha\n self.warmup_epochs = warmup_epochs\n\n def warmup_train(self):\n \"\"\" Train only the new parameters for the first epochs. \"\"\"\n # add only the last layer to the trainable parameters\n opt = torch.optim.SGD(lr=0.01, params=self.model.classifier.parameters())\n\n train_x, train_y, it_x_ep = self.preproc_batch_data(self.x, self.y, self.t)\n model = self.model.to(self.device)\n\n train_x = torch.tensor(train_x, dtype=torch.float)\n train_y = torch.tensor(train_y, dtype=torch.long)\n for ep in range(self.train_ep):\n for it in range(it_x_ep):\n start = it * self.mb_size\n end = (it + 1) * self.mb_size\n\n self.optimizer.zero_grad()\n x_mb = train_x[start:end].to(self.device)\n y_mb = train_y[start:end].to(self.device)\n logits = model(x_mb)\n # loss computed only on the new classes\n loss = self.criterion(logits[:, self.t*self.classes_per_task:(self.t+1)*self.classes_per_task],\n y_mb - self.t*self.classes_per_task)\n loss.backward()\n opt.step()\n\n def compute_loss(self, logits, y_mb):\n dist_loss = 0\n if self.prev_model is not None:\n y_prev = self.prev_model(self.x_mb).detach()\n loss = self.criterion(logits, y_mb)\n dist_loss += distillation_loss(logits, y_prev, self.distillation_loss_T)\n\n if isinstance(self.alpha, list):\n loss = loss + self.alpha[self.t] * dist_loss\n else:\n loss = loss + self.alpha * dist_loss\n else:\n loss = self.criterion(logits, y_mb)\n return loss\n\n def after_train(self):\n self.prev_model = copy.deepcopy(self.model)\n\n", "path": "avalanche/training/strategies/lwf/lwf.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n################################################################################\n# Copyright (c) 2020 ContinualAI Research #\n# Copyrights licensed under the CC BY 4.0 License. #\n# See the accompanying LICENSE file for terms. #\n# #\n# Date: 1-05-2020 #\n# Author(s): ContinualAI #\n# E-mail: [email protected] #\n# Website: clair.continualai.org #\n################################################################################\n\n\"\"\" Rehearsal Strategy Implementation \"\"\"\n\n# Python 2-3 compatible\nfrom __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nfrom training.strategies.strategy import Strategy\nfrom avalanche.evaluation.eval_protocol import EvalProtocol\nfrom avalanche.evaluation.metrics import ACC\nfrom avalanche.training.utils import pad_data, shuffle_in_unison\nimport torch\nimport torch.nn.functional as F\nimport numpy as np\nimport copy\n\n\ndef distillation_loss(y_pred, y_teacher, temperature):\n \"\"\" Distillation loss. 
\"\"\"\n scale = y_teacher.shape[-1] # kl_div is normalized by element instead of observation\n log_p = F.log_softmax(y_pred / temperature, dim=1)\n q = F.softmax(y_teacher / temperature, dim=1)\n res = scale * F.kl_div(log_p, q, reduction='mean')\n return res\n\n\nclass LearningWithoutForgetting(Strategy):\n def __init__(self, model, classes_per_task, alpha=0.5, distillation_loss_T=2, warmup_epochs=0, optimizer=None,\n criterion=torch.nn.CrossEntropyLoss(), mb_size=256,\n train_ep=2, device=None, preproc=None,\n eval_protocol=EvalProtocol(metrics=[ACC()])):\n \"\"\"\n Learning without Forgetting Strategy.\n\n paper: https://arxiv.org/abs/1606.09282\n original implementation (Matlab): https://github.com/lizhitwo/LearningWithoutForgetting\n reference implementation (pytorch): https://github.com/arunmallya/packnet/blob/master/src/lwf.py\n\n Args:\n classes_per_task:\n alpha: distillation loss coefficient. Can be an integer or a list of values (one for each task).\n distillation_loss_T: distillation loss temperature\n warmup_epochs: number of warmup epochs training only the new parameters.\n \"\"\"\n super(LearningWithoutForgetting, self).__init__(\n model, optimizer, criterion, mb_size, train_ep, multi_head=False,\n device=device, preproc=preproc, eval_protocol=eval_protocol\n )\n\n # LwF parameters\n self.classes_per_task = classes_per_task\n self.prev_model = None\n self.distillation_loss_T = distillation_loss_T\n self.alpha = alpha\n self.warmup_epochs = warmup_epochs\n\n def warmup_train(self):\n \"\"\" Train only the new parameters for the first epochs. \"\"\"\n # add only the last layer to the trainable parameters\n opt = torch.optim.SGD(lr=0.01, params=self.model.classifier.parameters())\n\n train_x, train_y, it_x_ep = self.preproc_batch_data(self.x, self.y, self.t)\n model = self.model.to(self.device)\n\n train_x = torch.tensor(train_x, dtype=torch.float)\n train_y = torch.tensor(train_y, dtype=torch.long)\n for ep in range(self.warmup_epochs):\n for it in range(it_x_ep):\n start = it * self.mb_size\n end = (it + 1) * self.mb_size\n\n self.optimizer.zero_grad()\n x_mb = train_x[start:end].to(self.device)\n y_mb = train_y[start:end].to(self.device)\n logits = model(x_mb)\n # loss computed only on the new classes\n loss = self.criterion(logits[:, self.t*self.classes_per_task:(self.t+1)*self.classes_per_task],\n y_mb - self.t*self.classes_per_task)\n loss.backward()\n opt.step()\n\n def compute_loss(self, logits, y_mb):\n dist_loss = 0\n if self.prev_model is not None:\n y_prev = self.prev_model(self.x_mb).detach()\n loss = self.criterion(logits, y_mb)\n dist_loss += distillation_loss(logits, y_prev, self.distillation_loss_T)\n\n if isinstance(self.alpha, list):\n loss = loss + self.alpha[self.t] * dist_loss\n else:\n loss = loss + self.alpha * dist_loss\n else:\n loss = self.criterion(logits, y_mb)\n return loss\n\n def before_train(self):\n self.warmup_train()\n\n def after_train(self):\n self.prev_model = copy.deepcopy(self.model)\n\n", "path": "avalanche/training/strategies/lwf/lwf.py"}]}
| 1,567 | 350 |
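Two defects are fixed in this record: `warmup_train` was never invoked, and its loop ran over `train_ep` instead of `warmup_epochs`. The golden diff adds a `before_train` hook that calls the warm-up phase and corrects the loop bound, while the overall objective stays cross-entropy on the new task plus `alpha` times the temperature-scaled distillation term against the previous model. A hypothetical sketch of the hook wiring, with training details stubbed out:

```python
# Sketch only: the hook wiring and loop bound the patch introduces.
class LwFSketch:
    def __init__(self, warmup_epochs: int = 0, train_ep: int = 2):
        self.warmup_epochs = warmup_epochs
        self.train_ep = train_ep

    def warmup_train(self):
        # Train only the new classifier head during the warm-up phase.
        for _epoch in range(self.warmup_epochs):  # was range(self.train_ep)
            pass  # one warm-up epoch over the current task's data

    def before_train(self):
        # The missing call: warm-up now runs before the full LwF training loop.
        self.warmup_train()
```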
gh_patches_debug_8384
|
rasdani/github-patches
|
git_diff
|
Qiskit__qiskit-2350
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
disassemble_circuits() suggested in qobj_to_circuits.py DeprecationWarning doesn't exist
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.8.0
- **Python version**: 3.7.2
- **Operating system**: macOS
`qobj_to_circuits` gives the following `DeprecationWarning`:
```python
.../qiskit/converters/qobj_to_circuits.py:34: DeprecationWarning: qiskit.converters.qobj_to_circuit() is deprecated and will be removed in Qiskit Terra 0.9. Please use qiskit.compiler.disassemble_circuits() to convert a qobj to list of circuits.
```
but `qiskit.compiler.disassemble_circuits()` doesn't exist.
### What is the current behavior?
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: cannot import name 'disassemble_circuits' from 'qiskit.compiler' (/Users/matteo/Work/projects/ibmq/env/lib/python3.7/site-packages/qiskit/compiler/__init__.py)
```
### Steps to reproduce the problem
1. Installed qiskit in a new python virtualenv with `pip install qiskit`
2. `from qiskit.compiler import disassemble_circuits`
```
>>> qiskit.__qiskit_version__
{'qiskit': '0.10.0', 'qiskit-terra': '0.8.0', 'qiskit-ignis': '0.1.1', 'qiskit-aer': '0.2.0', 'qiskit-ibmq-provider': '0.2.1', 'qiskit-aqua': '0.5.0'}
```
### What is the expected behavior?
If a function is deprecated, and the warning suggests to use a new function, this function should exist in the current release.
### Suggested solutions
Implement the function or change the deprecation warning.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qiskit/converters/qobj_to_circuits.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # This code is part of Qiskit.
4 #
5 # (C) Copyright IBM 2017, 2018.
6 #
7 # This code is licensed under the Apache License, Version 2.0. You may
8 # obtain a copy of this license in the LICENSE.txt file in the root directory
9 # of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
10 #
11 # Any modifications or derivative works of this code must retain this
12 # copyright notice, and modified files need to carry a notice indicating
13 # that they have been altered from the originals.
14
15 """Helper function for converting qobj to a list of circuits"""
16
17 import warnings
18
19 from qiskit.assembler import disassemble
20
21
22 def qobj_to_circuits(qobj):
23 """Return a list of QuantumCircuit object(s) from a qobj
24
25 Args:
26 qobj (Qobj): The Qobj object to convert to QuantumCircuits
27 Returns:
28 list: A list of QuantumCircuit objects from the qobj
29
30 """
31 warnings.warn('qiskit.converters.qobj_to_circuit() is deprecated and will '
32 'be removed in Qiskit Terra 0.9. Please use '
33 'qiskit.compiler.disassemble_circuits() to convert a qobj '
34 'to list of circuits.', DeprecationWarning)
35
36 variables = disassemble(qobj)
37 return variables[0]
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/qiskit/converters/qobj_to_circuits.py b/qiskit/converters/qobj_to_circuits.py
--- a/qiskit/converters/qobj_to_circuits.py
+++ b/qiskit/converters/qobj_to_circuits.py
@@ -30,7 +30,7 @@
"""
warnings.warn('qiskit.converters.qobj_to_circuit() is deprecated and will '
'be removed in Qiskit Terra 0.9. Please use '
- 'qiskit.compiler.disassemble_circuits() to convert a qobj '
+ 'qiskit.assembler.disassemble() to convert a qobj '
'to list of circuits.', DeprecationWarning)
variables = disassemble(qobj)
|
{"golden_diff": "diff --git a/qiskit/converters/qobj_to_circuits.py b/qiskit/converters/qobj_to_circuits.py\n--- a/qiskit/converters/qobj_to_circuits.py\n+++ b/qiskit/converters/qobj_to_circuits.py\n@@ -30,7 +30,7 @@\n \"\"\"\n warnings.warn('qiskit.converters.qobj_to_circuit() is deprecated and will '\n 'be removed in Qiskit Terra 0.9. Please use '\n- 'qiskit.compiler.disassemble_circuits() to convert a qobj '\n+ 'qiskit.assembler.disassemble() to convert a qobj '\n 'to list of circuits.', DeprecationWarning)\n \n variables = disassemble(qobj)\n", "issue": "disassemble_circuits() suggested in qobj_to_circuits.py DeprecationWarning doesn't exist\n<!-- \u26a0\ufe0f If you do not respect this template, your issue will be closed -->\r\n<!-- \u26a0\ufe0f Make sure to browse the opened and closed issues -->\r\n\r\n### Information\r\n\r\n- **Qiskit Terra version**: 0.8.0\r\n- **Python version**: 3.7.2\r\n- **Operating system**: macOS\r\n\r\n`qobj_to_circuits` gives the following `DeprecationWarning`:\r\n\r\n```python\r\n.../qiskit/converters/qobj_to_circuits.py:34: DeprecationWarning: qiskit.converters.qobj_to_circuit() is deprecated and will be removed in Qiskit Terra 0.9. Please use qiskit.compiler.disassemble_circuits() to convert a qobj to list of circuits.\r\n```\r\n\r\nbut `qiskit.compiler.disassemble_circuits()` doesn't exist.\r\n\r\n### What is the current behavior?\r\n```\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\nImportError: cannot import name 'disassemble_circuits' from 'qiskit.compiler' (/Users/matteo/Work/projects/ibmq/env/lib/python3.7/site-packages/qiskit/compiler/__init__.py)\r\n```\r\n\r\n### Steps to reproduce the problem\r\n1. Installed qiskit in a new python virtualenv with `pip install qiskit`\r\n2. `from qiskit.compiler import disassemble_circuits`\r\n\r\n```\r\n>>> qiskit.__qiskit_version__\r\n{'qiskit': '0.10.0', 'qiskit-terra': '0.8.0', 'qiskit-ignis': '0.1.1', 'qiskit-aer': '0.2.0', 'qiskit-ibmq-provider': '0.2.1', 'qiskit-aqua': '0.5.0'}\r\n```\r\n\r\n### What is the expected behavior?\r\nIf a function is deprecated, and the warning suggests to use a new function, this function should exist in the current release.\r\n\r\n### Suggested solutions\r\n\r\nImplement the function or change the deprecation warning.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017, 2018.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Helper function for converting qobj to a list of circuits\"\"\"\n\nimport warnings\n\nfrom qiskit.assembler import disassemble\n\n\ndef qobj_to_circuits(qobj):\n \"\"\"Return a list of QuantumCircuit object(s) from a qobj\n\n Args:\n qobj (Qobj): The Qobj object to convert to QuantumCircuits\n Returns:\n list: A list of QuantumCircuit objects from the qobj\n\n \"\"\"\n warnings.warn('qiskit.converters.qobj_to_circuit() is deprecated and will '\n 'be removed in Qiskit Terra 0.9. 
Please use '\n 'qiskit.compiler.disassemble_circuits() to convert a qobj '\n 'to list of circuits.', DeprecationWarning)\n\n variables = disassemble(qobj)\n return variables[0]\n", "path": "qiskit/converters/qobj_to_circuits.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017, 2018.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Helper function for converting qobj to a list of circuits\"\"\"\n\nimport warnings\n\nfrom qiskit.assembler import disassemble\n\n\ndef qobj_to_circuits(qobj):\n \"\"\"Return a list of QuantumCircuit object(s) from a qobj\n\n Args:\n qobj (Qobj): The Qobj object to convert to QuantumCircuits\n Returns:\n list: A list of QuantumCircuit objects from the qobj\n\n \"\"\"\n warnings.warn('qiskit.converters.qobj_to_circuit() is deprecated and will '\n 'be removed in Qiskit Terra 0.9. Please use '\n 'qiskit.assembler.disassemble() to convert a qobj '\n 'to list of circuits.', DeprecationWarning)\n\n variables = disassemble(qobj)\n return variables[0]\n", "path": "qiskit/converters/qobj_to_circuits.py"}]}
| 1,128 | 172 |
gh_patches_debug_14386
|
rasdani/github-patches
|
git_diff
|
MycroftAI__mycroft-core-845
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Mycroft doesn't start if /etc/mycroft/mycroft.conf exists and has comments
In log.py there are these statements:
if isfile(SYSTEM_CONFIG):
with open(SYSTEM_CONFIG) as f:
config = json.load(f)
log_level = config.get("log_level", "DEBUG")
Note that no comment filtering has been done using "load_commented_json".
Note: no pull request will be made, because the best solution may be not to load any config here but to use the common configuration procedure.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mycroft/util/log.py`
Content:
```
1 # Copyright 2016 Mycroft AI, Inc.
2 #
3 # This file is part of Mycroft Core.
4 #
5 # Mycroft Core is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU General Public License as published by
7 # the Free Software Foundation, either version 3 of the License, or
8 # (at your option) any later version.
9 #
10 # Mycroft Core is distributed in the hope that it will be useful,
11 # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 # GNU General Public License for more details.
14 #
15 # You should have received a copy of the GNU General Public License
16 # along with Mycroft Core. If not, see <http://www.gnu.org/licenses/>.
17 import json
18 import logging
19
20 from os.path import isfile
21
22 SYSTEM_CONFIG = '/etc/mycroft/mycroft.conf'
23
24 __author__ = 'seanfitz'
25
26 log_level = "DEBUG"
27
28 if isfile(SYSTEM_CONFIG):
29 with open(SYSTEM_CONFIG) as f:
30 config = json.load(f)
31 log_level = config.get("log_level", "DEBUG")
32
33 FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
34 logging.basicConfig(format=FORMAT, level=logging.getLevelName(log_level))
35 logger = logging.getLogger("MYCROFT")
36
37
38 def getLogger(name="MYCROFT"):
39 """
40 Get a python logger
41
42 :param name: Module name for the logger
43
44 :return: an instance of logging.Logger
45 """
46 return logging.getLogger(name)
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mycroft/util/log.py b/mycroft/util/log.py
--- a/mycroft/util/log.py
+++ b/mycroft/util/log.py
@@ -18,6 +18,7 @@
import logging
from os.path import isfile
+from mycroft.util.json_helper import load_commented_json
SYSTEM_CONFIG = '/etc/mycroft/mycroft.conf'
@@ -26,9 +27,8 @@
log_level = "DEBUG"
if isfile(SYSTEM_CONFIG):
- with open(SYSTEM_CONFIG) as f:
- config = json.load(f)
- log_level = config.get("log_level", "DEBUG")
+ config = load_commented_json(SYSTEM_CONFIG)
+ log_level = config.get("log_level", "DEBUG")
FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
logging.basicConfig(format=FORMAT, level=logging.getLevelName(log_level))
|
{"golden_diff": "diff --git a/mycroft/util/log.py b/mycroft/util/log.py\n--- a/mycroft/util/log.py\n+++ b/mycroft/util/log.py\n@@ -18,6 +18,7 @@\n import logging\n \n from os.path import isfile\n+from mycroft.util.json_helper import load_commented_json\n \n SYSTEM_CONFIG = '/etc/mycroft/mycroft.conf'\n \n@@ -26,9 +27,8 @@\n log_level = \"DEBUG\"\n \n if isfile(SYSTEM_CONFIG):\n- with open(SYSTEM_CONFIG) as f:\n- config = json.load(f)\n- log_level = config.get(\"log_level\", \"DEBUG\")\n+ config = load_commented_json(SYSTEM_CONFIG)\n+ log_level = config.get(\"log_level\", \"DEBUG\")\n \n FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'\n logging.basicConfig(format=FORMAT, level=logging.getLevelName(log_level))\n", "issue": "Mycroft doesn't starts if /etc/mycroft/mycroft.conf exists and has comments\nIn log.py there are these statements:\r\n\r\nif isfile(SYSTEM_CONFIG):\r\n with open(SYSTEM_CONFIG) as f:\r\n config = json.load(f)\r\n log_level = config.get(\"log_level\", \"DEBUG\")\r\n\r\nnote no filter of comments has been done using \"load_commented_json\".\r\n\r\nNote: no pull request will be done, because could be the best solution is do not load any config here but use the common configuration procedure.\n", "before_files": [{"content": "# Copyright 2016 Mycroft AI, Inc.\n#\n# This file is part of Mycroft Core.\n#\n# Mycroft Core is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Mycroft Core is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Mycroft Core. If not, see <http://www.gnu.org/licenses/>.\nimport json\nimport logging\n\nfrom os.path import isfile\n\nSYSTEM_CONFIG = '/etc/mycroft/mycroft.conf'\n\n__author__ = 'seanfitz'\n\nlog_level = \"DEBUG\"\n\nif isfile(SYSTEM_CONFIG):\n with open(SYSTEM_CONFIG) as f:\n config = json.load(f)\n log_level = config.get(\"log_level\", \"DEBUG\")\n\nFORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'\nlogging.basicConfig(format=FORMAT, level=logging.getLevelName(log_level))\nlogger = logging.getLogger(\"MYCROFT\")\n\n\ndef getLogger(name=\"MYCROFT\"):\n \"\"\"\n Get a python logger\n\n :param name: Module name for the logger\n\n :return: an instance of logging.Logger\n \"\"\"\n return logging.getLogger(name)\n", "path": "mycroft/util/log.py"}], "after_files": [{"content": "# Copyright 2016 Mycroft AI, Inc.\n#\n# This file is part of Mycroft Core.\n#\n# Mycroft Core is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Mycroft Core is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Mycroft Core. 
If not, see <http://www.gnu.org/licenses/>.\nimport json\nimport logging\n\nfrom os.path import isfile\nfrom mycroft.util.json_helper import load_commented_json\n\nSYSTEM_CONFIG = '/etc/mycroft/mycroft.conf'\n\n__author__ = 'seanfitz'\n\nlog_level = \"DEBUG\"\n\nif isfile(SYSTEM_CONFIG):\n config = load_commented_json(SYSTEM_CONFIG)\n log_level = config.get(\"log_level\", \"DEBUG\")\n\nFORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'\nlogging.basicConfig(format=FORMAT, level=logging.getLevelName(log_level))\nlogger = logging.getLogger(\"MYCROFT\")\n\n\ndef getLogger(name=\"MYCROFT\"):\n \"\"\"\n Get a python logger\n\n :param name: Module name for the logger\n\n :return: an instance of logging.Logger\n \"\"\"\n return logging.getLogger(name)\n", "path": "mycroft/util/log.py"}]}
| 798 | 198 |
gh_patches_debug_9058
|
rasdani/github-patches
|
git_diff
|
wagtail__wagtail-11992
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Wagtail 6.1.x update breaks `RadioSelect` initialisation after replacing jQuery with vanilla JS
### Issue Summary
Wagtail 6.1.x update breaks `RadioSelect` initialisation after replacing jQuery with vanilla JS.
The JS adapter [`BoundRadioSelect`](https://github.com/wagtail/wagtail/blob/main/client/src/entrypoints/admin/telepath/widgets.js#L150) previously tested and set the checked option by comparing the element value against an array value - an array value is passed to the constructor:
#### Wagtail 6.0 implementation
```
setState(state) {
this.element.find('input[name="' + this.name + '"]').val([state]);
}
```
It now tests against a string, and the equality operator fails, so the radio option is not correctly set as `checked` on initialisation:
#### Wagtail 6.1.x implementation
```
setState(state) {
const inputs = this.element.querySelectorAll(`input[name="${this.name}"]`);
for (let i = 0; i < inputs.length; i += 1) {
inputs[i].checked = inputs[i].value === state;
}
}
```
The breaking change is here:
https://github.com/wagtail/wagtail/compare/v6.0.3...v6.1#diff-70fd977dfb7c5b823eab70485fa842987589fc8656203ffb729a4b7d688c1d04L130
### Steps to Reproduce
1. Use a choice block within a Streamfield such as:
```
link_type = blocks.ChoiceBlock(
choices=[
('page', 'Page'),
('document', 'Document'),
('url', 'URL'),
],
widget=forms.RadioSelect,
required=True,
default='page',
)
```
2. View/edit a page/snippet where this field is used
3. In Wagtail 6.0.x the first choice is `checked` as expected
4. In Wagtail 6.1.x the first choice is not `checked` as expected
- I have confirmed that this issue can be reproduced as described on a fresh Wagtail project: yes
### Technical details
- Python version: Python 3.12.3
- Django version: 5.0.6
- Wagtail version: 6.1.x
- Browser version: Chrome 125
### Working on this
Anyone can contribute to this. View our [contributing guidelines](https://docs.wagtail.org/en/latest/contributing/index.html), add a comment to the issue once you’re ready to start.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/widget_adapters.py`
Content:
```
1 """
2 Register Telepath adapters for core Django form widgets, so that they can
3 have corresponding Javascript objects with the ability to render new instances
4 and extract field values.
5 """
6
7 from django import forms
8 from django.core.exceptions import ValidationError
9 from django.utils.functional import cached_property
10
11 from wagtail.admin.staticfiles import versioned_static
12 from wagtail.telepath import Adapter, register
13
14
15 class WidgetAdapter(Adapter):
16 js_constructor = "wagtail.widgets.Widget"
17
18 def js_args(self, widget):
19 return [
20 widget.render("__NAME__", None, attrs={"id": "__ID__"}),
21 widget.id_for_label("__ID__"),
22 ]
23
24 def get_media(self, widget):
25 media = super().get_media(widget)
26 return media + widget.media
27
28 @cached_property
29 def media(self):
30 return forms.Media(
31 js=[
32 versioned_static("wagtailadmin/js/telepath/widgets.js"),
33 ]
34 )
35
36
37 register(WidgetAdapter(), forms.widgets.Input)
38 register(WidgetAdapter(), forms.Textarea)
39 register(WidgetAdapter(), forms.CheckboxSelectMultiple)
40
41
42 class CheckboxInputAdapter(WidgetAdapter):
43 js_constructor = "wagtail.widgets.CheckboxInput"
44
45
46 register(CheckboxInputAdapter(), forms.CheckboxInput)
47
48
49 class RadioSelectAdapter(WidgetAdapter):
50 js_constructor = "wagtail.widgets.RadioSelect"
51
52
53 register(RadioSelectAdapter(), forms.RadioSelect)
54
55
56 class SelectAdapter(WidgetAdapter):
57 js_constructor = "wagtail.widgets.Select"
58
59
60 register(SelectAdapter(), forms.Select)
61
62
63 class ValidationErrorAdapter(Adapter):
64 js_constructor = "wagtail.errors.ValidationError"
65
66 def js_args(self, error):
67 return [
68 error.messages,
69 ]
70
71 @cached_property
72 def media(self):
73 return forms.Media(
74 js=[
75 versioned_static("wagtailadmin/js/telepath/widgets.js"),
76 ]
77 )
78
79
80 register(ValidationErrorAdapter(), ValidationError)
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/wagtail/widget_adapters.py b/wagtail/widget_adapters.py
--- a/wagtail/widget_adapters.py
+++ b/wagtail/widget_adapters.py
@@ -36,7 +36,6 @@
register(WidgetAdapter(), forms.widgets.Input)
register(WidgetAdapter(), forms.Textarea)
-register(WidgetAdapter(), forms.CheckboxSelectMultiple)
class CheckboxInputAdapter(WidgetAdapter):
@@ -51,6 +50,7 @@
register(RadioSelectAdapter(), forms.RadioSelect)
+register(RadioSelectAdapter(), forms.CheckboxSelectMultiple)
class SelectAdapter(WidgetAdapter):
|
{"golden_diff": "diff --git a/wagtail/widget_adapters.py b/wagtail/widget_adapters.py\n--- a/wagtail/widget_adapters.py\n+++ b/wagtail/widget_adapters.py\n@@ -36,7 +36,6 @@\n \n register(WidgetAdapter(), forms.widgets.Input)\n register(WidgetAdapter(), forms.Textarea)\n-register(WidgetAdapter(), forms.CheckboxSelectMultiple)\n \n \n class CheckboxInputAdapter(WidgetAdapter):\n@@ -51,6 +50,7 @@\n \n \n register(RadioSelectAdapter(), forms.RadioSelect)\n+register(RadioSelectAdapter(), forms.CheckboxSelectMultiple)\n \n \n class SelectAdapter(WidgetAdapter):\n", "issue": "Wagtail 6.1.x update breaks `RadioSelect` initialiastion after replacing jQuery with vanilla JS\n### Issue Summary\r\n\r\nWagtail 6.1.x update breaks `RadioSelect` initialiastion after replacing jQuery with vanilla JS. \r\n\r\nThe JS adapter [`BoundRadioSelect`](https://github.com/wagtail/wagtail/blob/main/client/src/entrypoints/admin/telepath/widgets.js#L150) previously tested and set the checked option by comparing the element value against an array value - an array value is passed to the constructor:\r\n\r\n#### Wagtail 6.0 implementation\r\n```\r\nsetState(state) {\r\n this.element.find('input[name=\"' + this.name + '\"]').val([state]);\r\n}\r\n```\r\n\r\nIt now tests against a string, and the equality operator fails, so the radio option is not correctly set as `checked` on initialisation:\r\n\r\n#### Wagtail 6.1.x implementation\r\n```\r\n setState(state) {\r\n const inputs = this.element.querySelectorAll(`input[name=\"${this.name}\"]`);\r\n for (let i = 0; i < inputs.length; i += 1) {\r\n inputs[i].checked = inputs[i].value === state;\r\n }\r\n }\r\n ```\r\n\r\nThe breaking change is here:\r\n\r\nhttps://github.com/wagtail/wagtail/compare/v6.0.3...v6.1#diff-70fd977dfb7c5b823eab70485fa842987589fc8656203ffb729a4b7d688c1d04L130\r\n\r\n### Steps to Reproduce\r\n\r\n1. Use a choice block within a Streamfield such as:\r\n\r\n```\r\nlink_type = blocks.ChoiceBlock(\r\n choices=[\r\n ('page', 'Page'),\r\n ('document', 'Document'),\r\n ('url', 'URL'),\r\n ],\r\n widget=forms.RadioSelect,\r\n required=True,\r\n default='page',\r\n)\r\n```\r\n2. View/edit a page/snippet where this field is used\r\n3. In Wagtail 6.0.x the first choice is `checked` as expected\r\n4. In Wagtail 6.1.x the first choice is not `checked` as expected\r\n\r\n- I have confirmed that this issue can be reproduced as described on a fresh Wagtail project: yes\r\n\r\n### Technical details\r\n\r\n- Python version: Python 3.12.3\r\n- Django version: 5.0.6\r\n- Wagtail version: 6.1.x\r\n- Browser version: Chrome 125\r\n\r\n### Working on this\r\n\r\nAnyone can contribute to this. 
View our [contributing guidelines](https://docs.wagtail.org/en/latest/contributing/index.html), add a comment to the issue once you\u2019re ready to start.\r\n\n", "before_files": [{"content": "\"\"\"\nRegister Telepath adapters for core Django form widgets, so that they can\nhave corresponding Javascript objects with the ability to render new instances\nand extract field values.\n\"\"\"\n\nfrom django import forms\nfrom django.core.exceptions import ValidationError\nfrom django.utils.functional import cached_property\n\nfrom wagtail.admin.staticfiles import versioned_static\nfrom wagtail.telepath import Adapter, register\n\n\nclass WidgetAdapter(Adapter):\n js_constructor = \"wagtail.widgets.Widget\"\n\n def js_args(self, widget):\n return [\n widget.render(\"__NAME__\", None, attrs={\"id\": \"__ID__\"}),\n widget.id_for_label(\"__ID__\"),\n ]\n\n def get_media(self, widget):\n media = super().get_media(widget)\n return media + widget.media\n\n @cached_property\n def media(self):\n return forms.Media(\n js=[\n versioned_static(\"wagtailadmin/js/telepath/widgets.js\"),\n ]\n )\n\n\nregister(WidgetAdapter(), forms.widgets.Input)\nregister(WidgetAdapter(), forms.Textarea)\nregister(WidgetAdapter(), forms.CheckboxSelectMultiple)\n\n\nclass CheckboxInputAdapter(WidgetAdapter):\n js_constructor = \"wagtail.widgets.CheckboxInput\"\n\n\nregister(CheckboxInputAdapter(), forms.CheckboxInput)\n\n\nclass RadioSelectAdapter(WidgetAdapter):\n js_constructor = \"wagtail.widgets.RadioSelect\"\n\n\nregister(RadioSelectAdapter(), forms.RadioSelect)\n\n\nclass SelectAdapter(WidgetAdapter):\n js_constructor = \"wagtail.widgets.Select\"\n\n\nregister(SelectAdapter(), forms.Select)\n\n\nclass ValidationErrorAdapter(Adapter):\n js_constructor = \"wagtail.errors.ValidationError\"\n\n def js_args(self, error):\n return [\n error.messages,\n ]\n\n @cached_property\n def media(self):\n return forms.Media(\n js=[\n versioned_static(\"wagtailadmin/js/telepath/widgets.js\"),\n ]\n )\n\n\nregister(ValidationErrorAdapter(), ValidationError)\n", "path": "wagtail/widget_adapters.py"}], "after_files": [{"content": "\"\"\"\nRegister Telepath adapters for core Django form widgets, so that they can\nhave corresponding Javascript objects with the ability to render new instances\nand extract field values.\n\"\"\"\n\nfrom django import forms\nfrom django.core.exceptions import ValidationError\nfrom django.utils.functional import cached_property\n\nfrom wagtail.admin.staticfiles import versioned_static\nfrom wagtail.telepath import Adapter, register\n\n\nclass WidgetAdapter(Adapter):\n js_constructor = \"wagtail.widgets.Widget\"\n\n def js_args(self, widget):\n return [\n widget.render(\"__NAME__\", None, attrs={\"id\": \"__ID__\"}),\n widget.id_for_label(\"__ID__\"),\n ]\n\n def get_media(self, widget):\n media = super().get_media(widget)\n return media + widget.media\n\n @cached_property\n def media(self):\n return forms.Media(\n js=[\n versioned_static(\"wagtailadmin/js/telepath/widgets.js\"),\n ]\n )\n\n\nregister(WidgetAdapter(), forms.widgets.Input)\nregister(WidgetAdapter(), forms.Textarea)\n\n\nclass CheckboxInputAdapter(WidgetAdapter):\n js_constructor = \"wagtail.widgets.CheckboxInput\"\n\n\nregister(CheckboxInputAdapter(), forms.CheckboxInput)\n\n\nclass RadioSelectAdapter(WidgetAdapter):\n js_constructor = \"wagtail.widgets.RadioSelect\"\n\n\nregister(RadioSelectAdapter(), forms.RadioSelect)\nregister(RadioSelectAdapter(), forms.CheckboxSelectMultiple)\n\n\nclass SelectAdapter(WidgetAdapter):\n js_constructor = 
\"wagtail.widgets.Select\"\n\n\nregister(SelectAdapter(), forms.Select)\n\n\nclass ValidationErrorAdapter(Adapter):\n js_constructor = \"wagtail.errors.ValidationError\"\n\n def js_args(self, error):\n return [\n error.messages,\n ]\n\n @cached_property\n def media(self):\n return forms.Media(\n js=[\n versioned_static(\"wagtailadmin/js/telepath/widgets.js\"),\n ]\n )\n\n\nregister(ValidationErrorAdapter(), ValidationError)\n", "path": "wagtail/widget_adapters.py"}]}
| 1,435 | 136 |
gh_patches_debug_14443
|
rasdani/github-patches
|
git_diff
|
aws__aws-cli-4231
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Check if AWS::Include location is a string before packaging. Fixes #4087
Fixes #4087
If the value of Location in AWS::Include is *not* a string, we should skip
packaging it. This can happen if customers use, say, an intrinsic function
to construct the Include location:
Example:
```
AWS::Include:
Location:
Fn::Sub: "${S3Bucket}/file.txt"
```
*Issue #, if available:*
*Description of changes:*
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 import codecs
3 import os.path
4 import re
5 import sys
6
7 from setuptools import setup, find_packages
8
9
10 here = os.path.abspath(os.path.dirname(__file__))
11
12
13 def read(*parts):
14 return codecs.open(os.path.join(here, *parts), 'r').read()
15
16
17 def find_version(*file_paths):
18 version_file = read(*file_paths)
19 version_match = re.search(r"^__version__ = ['\"]([^'\"]*)['\"]",
20 version_file, re.M)
21 if version_match:
22 return version_match.group(1)
23 raise RuntimeError("Unable to find version string.")
24
25
26 requires = ['botocore==1.12.165',
27 'colorama>=0.2.5,<=0.3.9',
28 'docutils>=0.10',
29 'rsa>=3.1.2,<=3.5.0',
30 's3transfer>=0.2.0,<0.3.0',
31 'PyYAML>=3.10,<=3.13']
32
33
34 if sys.version_info[:2] == (2, 6):
35 # For python2.6 we have to require argparse since it
36 # was not in stdlib until 2.7.
37 requires.append('argparse>=1.1')
38
39
40 setup_options = dict(
41 name='awscli',
42 version=find_version("awscli", "__init__.py"),
43 description='Universal Command Line Environment for AWS.',
44 long_description=read('README.rst'),
45 author='Amazon Web Services',
46 url='http://aws.amazon.com/cli/',
47 scripts=['bin/aws', 'bin/aws.cmd',
48 'bin/aws_completer', 'bin/aws_zsh_completer.sh',
49 'bin/aws_bash_completer'],
50 packages=find_packages(exclude=['tests*']),
51 package_data={'awscli': ['data/*.json', 'examples/*/*.rst',
52 'examples/*/*/*.rst', 'topics/*.rst',
53 'topics/*.json']},
54 install_requires=requires,
55 extras_require={
56 ':python_version=="2.6"': [
57 'argparse>=1.1',
58 ]
59 },
60 license="Apache License 2.0",
61 classifiers=[
62 'Development Status :: 5 - Production/Stable',
63 'Intended Audience :: Developers',
64 'Intended Audience :: System Administrators',
65 'Natural Language :: English',
66 'License :: OSI Approved :: Apache Software License',
67 'Programming Language :: Python',
68 'Programming Language :: Python :: 2',
69 'Programming Language :: Python :: 2.6',
70 'Programming Language :: Python :: 2.7',
71 'Programming Language :: Python :: 3',
72 'Programming Language :: Python :: 3.3',
73 'Programming Language :: Python :: 3.4',
74 'Programming Language :: Python :: 3.5',
75 'Programming Language :: Python :: 3.6',
76 'Programming Language :: Python :: 3.7',
77 ],
78 )
79
80 if 'py2exe' in sys.argv:
81 # This will actually give us a py2exe command.
82 import py2exe
83 # And we have some py2exe specific options.
84 setup_options['options'] = {
85 'py2exe': {
86 'optimize': 0,
87 'skip_archive': True,
88 'dll_excludes': ['crypt32.dll'],
89 'packages': ['docutils', 'urllib', 'httplib', 'HTMLParser',
90 'awscli', 'ConfigParser', 'xml.etree', 'pipes'],
91 }
92 }
93 setup_options['console'] = ['bin/aws']
94
95
96 setup(**setup_options)
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -27,8 +27,7 @@
'colorama>=0.2.5,<=0.3.9',
'docutils>=0.10',
'rsa>=3.1.2,<=3.5.0',
- 's3transfer>=0.2.0,<0.3.0',
- 'PyYAML>=3.10,<=3.13']
+ 's3transfer>=0.2.0,<0.3.0']
if sys.version_info[:2] == (2, 6):
@@ -36,6 +35,12 @@
# was not in stdlib until 2.7.
requires.append('argparse>=1.1')
+ # For Python 2.6, we have to require a different verion of PyYAML since the latest
+ # versions dropped support for Python 2.6.
+ requires.append('PyYAML>=3.10,<=3.13')
+else:
+ requires.append('PyYAML>=3.10,<=5.1')
+
setup_options = dict(
name='awscli',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -27,8 +27,7 @@\n 'colorama>=0.2.5,<=0.3.9',\n 'docutils>=0.10',\n 'rsa>=3.1.2,<=3.5.0',\n- 's3transfer>=0.2.0,<0.3.0',\n- 'PyYAML>=3.10,<=3.13']\n+ 's3transfer>=0.2.0,<0.3.0']\n \n \n if sys.version_info[:2] == (2, 6):\n@@ -36,6 +35,12 @@\n # was not in stdlib until 2.7.\n requires.append('argparse>=1.1')\n \n+ # For Python 2.6, we have to require a different verion of PyYAML since the latest\n+ # versions dropped support for Python 2.6.\n+ requires.append('PyYAML>=3.10,<=3.13')\n+else:\n+ requires.append('PyYAML>=3.10,<=5.1')\n+\n \n setup_options = dict(\n name='awscli',\n", "issue": "Check if AWS::Include location is a string before packacing. Fixes #4087\nFixes #4087 \r\n\r\nIf the value of Location in AWS::Include is *not* a string, we should skip\r\npackaging it. This can happen if customers use, say, an intrinsic function\r\nto construct the Include location:\r\n\r\nExample:\r\n```\r\nAWS::Include:\r\n Location:\r\n Fn::Sub: \"${S3Bucket}/file.txt\"\r\n```\r\n\r\n*Issue #, if available:*\r\n\r\n*Description of changes:*\r\n\r\n\r\nBy submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\nimport codecs\nimport os.path\nimport re\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n return codecs.open(os.path.join(here, *parts), 'r').read()\n\n\ndef find_version(*file_paths):\n version_file = read(*file_paths)\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\",\n version_file, re.M)\n if version_match:\n return version_match.group(1)\n raise RuntimeError(\"Unable to find version string.\")\n\n\nrequires = ['botocore==1.12.165',\n 'colorama>=0.2.5,<=0.3.9',\n 'docutils>=0.10',\n 'rsa>=3.1.2,<=3.5.0',\n 's3transfer>=0.2.0,<0.3.0',\n 'PyYAML>=3.10,<=3.13']\n\n\nif sys.version_info[:2] == (2, 6):\n # For python2.6 we have to require argparse since it\n # was not in stdlib until 2.7.\n requires.append('argparse>=1.1')\n\n\nsetup_options = dict(\n name='awscli',\n version=find_version(\"awscli\", \"__init__.py\"),\n description='Universal Command Line Environment for AWS.',\n long_description=read('README.rst'),\n author='Amazon Web Services',\n url='http://aws.amazon.com/cli/',\n scripts=['bin/aws', 'bin/aws.cmd',\n 'bin/aws_completer', 'bin/aws_zsh_completer.sh',\n 'bin/aws_bash_completer'],\n packages=find_packages(exclude=['tests*']),\n package_data={'awscli': ['data/*.json', 'examples/*/*.rst',\n 'examples/*/*/*.rst', 'topics/*.rst',\n 'topics/*.json']},\n install_requires=requires,\n extras_require={\n ':python_version==\"2.6\"': [\n 'argparse>=1.1',\n ]\n },\n license=\"Apache License 2.0\",\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'Natural Language :: English',\n 'License :: OSI Approved :: Apache Software License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.6',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n ],\n)\n\nif 
'py2exe' in sys.argv:\n # This will actually give us a py2exe command.\n import py2exe\n # And we have some py2exe specific options.\n setup_options['options'] = {\n 'py2exe': {\n 'optimize': 0,\n 'skip_archive': True,\n 'dll_excludes': ['crypt32.dll'],\n 'packages': ['docutils', 'urllib', 'httplib', 'HTMLParser',\n 'awscli', 'ConfigParser', 'xml.etree', 'pipes'],\n }\n }\n setup_options['console'] = ['bin/aws']\n\n\nsetup(**setup_options)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nimport codecs\nimport os.path\nimport re\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n return codecs.open(os.path.join(here, *parts), 'r').read()\n\n\ndef find_version(*file_paths):\n version_file = read(*file_paths)\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\",\n version_file, re.M)\n if version_match:\n return version_match.group(1)\n raise RuntimeError(\"Unable to find version string.\")\n\n\nrequires = ['botocore==1.12.165',\n 'colorama>=0.2.5,<=0.3.9',\n 'docutils>=0.10',\n 'rsa>=3.1.2,<=3.5.0',\n 's3transfer>=0.2.0,<0.3.0']\n\n\nif sys.version_info[:2] == (2, 6):\n # For python2.6 we have to require argparse since it\n # was not in stdlib until 2.7.\n requires.append('argparse>=1.1')\n\n # For Python 2.6, we have to require a different verion of PyYAML since the latest\n # versions dropped support for Python 2.6.\n requires.append('PyYAML>=3.10,<=3.13')\nelse:\n requires.append('PyYAML>=3.10,<=5.1')\n\n\nsetup_options = dict(\n name='awscli',\n version=find_version(\"awscli\", \"__init__.py\"),\n description='Universal Command Line Environment for AWS.',\n long_description=read('README.rst'),\n author='Amazon Web Services',\n url='http://aws.amazon.com/cli/',\n scripts=['bin/aws', 'bin/aws.cmd',\n 'bin/aws_completer', 'bin/aws_zsh_completer.sh',\n 'bin/aws_bash_completer'],\n packages=find_packages(exclude=['tests*']),\n package_data={'awscli': ['data/*.json', 'examples/*/*.rst',\n 'examples/*/*/*.rst', 'topics/*.rst',\n 'topics/*.json']},\n install_requires=requires,\n extras_require={\n ':python_version==\"2.6\"': [\n 'argparse>=1.1',\n ]\n },\n license=\"Apache License 2.0\",\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'Natural Language :: English',\n 'License :: OSI Approved :: Apache Software License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.6',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n ],\n)\n\nif 'py2exe' in sys.argv:\n # This will actually give us a py2exe command.\n import py2exe\n # And we have some py2exe specific options.\n setup_options['options'] = {\n 'py2exe': {\n 'optimize': 0,\n 'skip_archive': True,\n 'dll_excludes': ['crypt32.dll'],\n 'packages': ['docutils', 'urllib', 'httplib', 'HTMLParser',\n 'awscli', 'ConfigParser', 'xml.etree', 'pipes'],\n }\n }\n setup_options['console'] = ['bin/aws']\n\n\nsetup(**setup_options)\n", "path": "setup.py"}]}
| 1,371 | 280 |
gh_patches_debug_1996
|
rasdani/github-patches
|
git_diff
|
ipython__ipython-11978
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
7.10 breaking tests with exception in publish
The new 7.10 release is breaking Bokeh unit tests with an exception coming from within ipython:
```
self = <IPython.core.displaypub.DisplayPublisher object at 0x11883d7f0>
data = {'text/html': '\n <div class="bk-root">\n <a href="https://bokeh.org" target="_blank" class="bk-logo bk-logo...version \'1.0\' from Bokeh development version \'1.0-1-abc\'. This configuration is unsupported and may not work!</p>'}
metadata = None, source = None, transient = None, update = False, kwargs = {}
handlers = {}
<< omitted >>
handlers = {}
if self.shell is not None:
> handlers = self.shell.mime_renderers
E AttributeError: 'InteractiveShell' object has no attribute 'mime_renderers'
../miniconda/envs/testenv/lib/python3.6/site-packages/IPython/core/displaypub.py:108: AttributeError
```
Is this an intentional change (documented anywhere?) or a regression/bug?
cc @Carreau
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/core/displaypub.py`
Content:
```
1 """An interface for publishing rich data to frontends.
2
3 There are two components of the display system:
4
5 * Display formatters, which take a Python object and compute the
6 representation of the object in various formats (text, HTML, SVG, etc.).
7 * The display publisher that is used to send the representation data to the
8 various frontends.
9
10 This module defines the logic display publishing. The display publisher uses
11 the ``display_data`` message type that is defined in the IPython messaging
12 spec.
13 """
14
15 # Copyright (c) IPython Development Team.
16 # Distributed under the terms of the Modified BSD License.
17
18
19 import sys
20
21 from traitlets.config.configurable import Configurable
22 from traitlets import List, Dict
23
24 # This used to be defined here - it is imported for backwards compatibility
25 from .display import publish_display_data
26
27 #-----------------------------------------------------------------------------
28 # Main payload class
29 #-----------------------------------------------------------------------------
30
31
32 class DisplayPublisher(Configurable):
33 """A traited class that publishes display data to frontends.
34
35 Instances of this class are created by the main IPython object and should
36 be accessed there.
37 """
38
39 def __init__(self, shell=None, *args, **kwargs):
40 self.shell = shell
41 super().__init__(*args, **kwargs)
42
43 def _validate_data(self, data, metadata=None):
44 """Validate the display data.
45
46 Parameters
47 ----------
48 data : dict
49 The formata data dictionary.
50 metadata : dict
51 Any metadata for the data.
52 """
53
54 if not isinstance(data, dict):
55 raise TypeError('data must be a dict, got: %r' % data)
56 if metadata is not None:
57 if not isinstance(metadata, dict):
58 raise TypeError('metadata must be a dict, got: %r' % data)
59
60 # use * to indicate transient, update are keyword-only
61 def publish(self, data, metadata=None, source=None, *, transient=None, update=False, **kwargs) -> None:
62 """Publish data and metadata to all frontends.
63
64 See the ``display_data`` message in the messaging documentation for
65 more details about this message type.
66
67 The following MIME types are currently implemented:
68
69 * text/plain
70 * text/html
71 * text/markdown
72 * text/latex
73 * application/json
74 * application/javascript
75 * image/png
76 * image/jpeg
77 * image/svg+xml
78
79 Parameters
80 ----------
81 data : dict
82 A dictionary having keys that are valid MIME types (like
83 'text/plain' or 'image/svg+xml') and values that are the data for
84 that MIME type. The data itself must be a JSON'able data
85 structure. Minimally all data should have the 'text/plain' data,
86 which can be displayed by all frontends. If more than the plain
87 text is given, it is up to the frontend to decide which
88 representation to use.
89 metadata : dict
90 A dictionary for metadata related to the data. This can contain
91 arbitrary key, value pairs that frontends can use to interpret
92 the data. Metadata specific to each mime-type can be specified
93 in the metadata dict with the same mime-type keys as
94 the data itself.
95 source : str, deprecated
96 Unused.
97 transient: dict, keyword-only
98 A dictionary for transient data.
99 Data in this dictionary should not be persisted as part of saving this output.
100 Examples include 'display_id'.
101 update: bool, keyword-only, default: False
102 If True, only update existing outputs with the same display_id,
103 rather than creating a new output.
104 """
105
106 handlers = {}
107 if self.shell is not None:
108 handlers = self.shell.mime_renderers
109
110 for mime, handler in handlers.items():
111 if mime in data:
112 handler(data[mime], metadata.get(mime, None))
113 return
114
115 if 'text/plain' in data:
116 print(data['text/plain'])
117
118 def clear_output(self, wait=False):
119 """Clear the output of the cell receiving output."""
120 print('\033[2K\r', end='')
121 sys.stdout.flush()
122 print('\033[2K\r', end='')
123 sys.stderr.flush()
124
125
126 class CapturingDisplayPublisher(DisplayPublisher):
127 """A DisplayPublisher that stores"""
128 outputs = List()
129
130 def publish(self, data, metadata=None, source=None, *, transient=None, update=False):
131 self.outputs.append({'data':data, 'metadata':metadata,
132 'transient':transient, 'update':update})
133
134 def clear_output(self, wait=False):
135 super(CapturingDisplayPublisher, self).clear_output(wait)
136
137 # empty the list, *do not* reassign a new list
138 self.outputs.clear()
139
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/IPython/core/displaypub.py b/IPython/core/displaypub.py
--- a/IPython/core/displaypub.py
+++ b/IPython/core/displaypub.py
@@ -105,7 +105,7 @@
handlers = {}
if self.shell is not None:
- handlers = self.shell.mime_renderers
+ handlers = getattr(self.shell, 'mime_renderers', {})
for mime, handler in handlers.items():
if mime in data:
|
{"golden_diff": "diff --git a/IPython/core/displaypub.py b/IPython/core/displaypub.py\n--- a/IPython/core/displaypub.py\n+++ b/IPython/core/displaypub.py\n@@ -105,7 +105,7 @@\n \n handlers = {}\n if self.shell is not None:\n- handlers = self.shell.mime_renderers\n+ handlers = getattr(self.shell, 'mime_renderers', {})\n \n for mime, handler in handlers.items():\n if mime in data:\n", "issue": "7.10 breaking tests with exception in publish\nThe new 7.10 release is breaking Bokeh unit tests with an exception coming from within ipython:\r\n```\r\n\r\nself = <IPython.core.displaypub.DisplayPublisher object at 0x11883d7f0>\r\ndata = {'text/html': '\\n <div class=\"bk-root\">\\n <a href=\"https://bokeh.org\" target=\"_blank\" class=\"bk-logo bk-logo...version \\'1.0\\' from Bokeh development version \\'1.0-1-abc\\'. This configuration is unsupported and may not work!</p>'}\r\nmetadata = None, source = None, transient = None, update = False, kwargs = {}\r\nhandlers = {}\r\n\r\n<< omitted >>\r\n\r\n handlers = {}\r\n if self.shell is not None:\r\n> handlers = self.shell.mime_renderers\r\nE AttributeError: 'InteractiveShell' object has no attribute 'mime_renderers'\r\n\r\n../miniconda/envs/testenv/lib/python3.6/site-packages/IPython/core/displaypub.py:108: AttributeError\r\n```\r\n\r\nIs this an intentional change (documented anwhere?) or a regression/bug?\r\n\r\ncc @Carreau \n", "before_files": [{"content": "\"\"\"An interface for publishing rich data to frontends.\n\nThere are two components of the display system:\n\n* Display formatters, which take a Python object and compute the\n representation of the object in various formats (text, HTML, SVG, etc.).\n* The display publisher that is used to send the representation data to the\n various frontends.\n\nThis module defines the logic display publishing. 
The display publisher uses\nthe ``display_data`` message type that is defined in the IPython messaging\nspec.\n\"\"\"\n\n# Copyright (c) IPython Development Team.\n# Distributed under the terms of the Modified BSD License.\n\n\nimport sys\n\nfrom traitlets.config.configurable import Configurable\nfrom traitlets import List, Dict\n\n# This used to be defined here - it is imported for backwards compatibility\nfrom .display import publish_display_data\n\n#-----------------------------------------------------------------------------\n# Main payload class\n#-----------------------------------------------------------------------------\n\n\nclass DisplayPublisher(Configurable):\n \"\"\"A traited class that publishes display data to frontends.\n\n Instances of this class are created by the main IPython object and should\n be accessed there.\n \"\"\"\n\n def __init__(self, shell=None, *args, **kwargs):\n self.shell = shell\n super().__init__(*args, **kwargs)\n\n def _validate_data(self, data, metadata=None):\n \"\"\"Validate the display data.\n\n Parameters\n ----------\n data : dict\n The formata data dictionary.\n metadata : dict\n Any metadata for the data.\n \"\"\"\n\n if not isinstance(data, dict):\n raise TypeError('data must be a dict, got: %r' % data)\n if metadata is not None:\n if not isinstance(metadata, dict):\n raise TypeError('metadata must be a dict, got: %r' % data)\n\n # use * to indicate transient, update are keyword-only\n def publish(self, data, metadata=None, source=None, *, transient=None, update=False, **kwargs) -> None:\n \"\"\"Publish data and metadata to all frontends.\n\n See the ``display_data`` message in the messaging documentation for\n more details about this message type.\n\n The following MIME types are currently implemented:\n\n * text/plain\n * text/html\n * text/markdown\n * text/latex\n * application/json\n * application/javascript\n * image/png\n * image/jpeg\n * image/svg+xml\n\n Parameters\n ----------\n data : dict\n A dictionary having keys that are valid MIME types (like\n 'text/plain' or 'image/svg+xml') and values that are the data for\n that MIME type. The data itself must be a JSON'able data\n structure. Minimally all data should have the 'text/plain' data,\n which can be displayed by all frontends. If more than the plain\n text is given, it is up to the frontend to decide which\n representation to use.\n metadata : dict\n A dictionary for metadata related to the data. This can contain\n arbitrary key, value pairs that frontends can use to interpret\n the data. 
Metadata specific to each mime-type can be specified\n in the metadata dict with the same mime-type keys as\n the data itself.\n source : str, deprecated\n Unused.\n transient: dict, keyword-only\n A dictionary for transient data.\n Data in this dictionary should not be persisted as part of saving this output.\n Examples include 'display_id'.\n update: bool, keyword-only, default: False\n If True, only update existing outputs with the same display_id,\n rather than creating a new output.\n \"\"\"\n\n handlers = {}\n if self.shell is not None:\n handlers = self.shell.mime_renderers\n\n for mime, handler in handlers.items():\n if mime in data:\n handler(data[mime], metadata.get(mime, None))\n return\n\n if 'text/plain' in data:\n print(data['text/plain'])\n\n def clear_output(self, wait=False):\n \"\"\"Clear the output of the cell receiving output.\"\"\"\n print('\\033[2K\\r', end='')\n sys.stdout.flush()\n print('\\033[2K\\r', end='')\n sys.stderr.flush()\n\n\nclass CapturingDisplayPublisher(DisplayPublisher):\n \"\"\"A DisplayPublisher that stores\"\"\"\n outputs = List()\n\n def publish(self, data, metadata=None, source=None, *, transient=None, update=False):\n self.outputs.append({'data':data, 'metadata':metadata,\n 'transient':transient, 'update':update})\n\n def clear_output(self, wait=False):\n super(CapturingDisplayPublisher, self).clear_output(wait)\n\n # empty the list, *do not* reassign a new list\n self.outputs.clear()\n", "path": "IPython/core/displaypub.py"}], "after_files": [{"content": "\"\"\"An interface for publishing rich data to frontends.\n\nThere are two components of the display system:\n\n* Display formatters, which take a Python object and compute the\n representation of the object in various formats (text, HTML, SVG, etc.).\n* The display publisher that is used to send the representation data to the\n various frontends.\n\nThis module defines the logic display publishing. 
The display publisher uses\nthe ``display_data`` message type that is defined in the IPython messaging\nspec.\n\"\"\"\n\n# Copyright (c) IPython Development Team.\n# Distributed under the terms of the Modified BSD License.\n\n\nimport sys\n\nfrom traitlets.config.configurable import Configurable\nfrom traitlets import List, Dict\n\n# This used to be defined here - it is imported for backwards compatibility\nfrom .display import publish_display_data\n\n#-----------------------------------------------------------------------------\n# Main payload class\n#-----------------------------------------------------------------------------\n\n\nclass DisplayPublisher(Configurable):\n \"\"\"A traited class that publishes display data to frontends.\n\n Instances of this class are created by the main IPython object and should\n be accessed there.\n \"\"\"\n\n def __init__(self, shell=None, *args, **kwargs):\n self.shell = shell\n super().__init__(*args, **kwargs)\n\n def _validate_data(self, data, metadata=None):\n \"\"\"Validate the display data.\n\n Parameters\n ----------\n data : dict\n The formata data dictionary.\n metadata : dict\n Any metadata for the data.\n \"\"\"\n\n if not isinstance(data, dict):\n raise TypeError('data must be a dict, got: %r' % data)\n if metadata is not None:\n if not isinstance(metadata, dict):\n raise TypeError('metadata must be a dict, got: %r' % data)\n\n # use * to indicate transient, update are keyword-only\n def publish(self, data, metadata=None, source=None, *, transient=None, update=False, **kwargs) -> None:\n \"\"\"Publish data and metadata to all frontends.\n\n See the ``display_data`` message in the messaging documentation for\n more details about this message type.\n\n The following MIME types are currently implemented:\n\n * text/plain\n * text/html\n * text/markdown\n * text/latex\n * application/json\n * application/javascript\n * image/png\n * image/jpeg\n * image/svg+xml\n\n Parameters\n ----------\n data : dict\n A dictionary having keys that are valid MIME types (like\n 'text/plain' or 'image/svg+xml') and values that are the data for\n that MIME type. The data itself must be a JSON'able data\n structure. Minimally all data should have the 'text/plain' data,\n which can be displayed by all frontends. If more than the plain\n text is given, it is up to the frontend to decide which\n representation to use.\n metadata : dict\n A dictionary for metadata related to the data. This can contain\n arbitrary key, value pairs that frontends can use to interpret\n the data. 
Metadata specific to each mime-type can be specified\n in the metadata dict with the same mime-type keys as\n the data itself.\n source : str, deprecated\n Unused.\n transient: dict, keyword-only\n A dictionary for transient data.\n Data in this dictionary should not be persisted as part of saving this output.\n Examples include 'display_id'.\n update: bool, keyword-only, default: False\n If True, only update existing outputs with the same display_id,\n rather than creating a new output.\n \"\"\"\n\n handlers = {}\n if self.shell is not None:\n handlers = getattr(self.shell, 'mime_renderers', {})\n\n for mime, handler in handlers.items():\n if mime in data:\n handler(data[mime], metadata.get(mime, None))\n return\n\n if 'text/plain' in data:\n print(data['text/plain'])\n\n def clear_output(self, wait=False):\n \"\"\"Clear the output of the cell receiving output.\"\"\"\n print('\\033[2K\\r', end='')\n sys.stdout.flush()\n print('\\033[2K\\r', end='')\n sys.stderr.flush()\n\n\nclass CapturingDisplayPublisher(DisplayPublisher):\n \"\"\"A DisplayPublisher that stores\"\"\"\n outputs = List()\n\n def publish(self, data, metadata=None, source=None, *, transient=None, update=False):\n self.outputs.append({'data':data, 'metadata':metadata,\n 'transient':transient, 'update':update})\n\n def clear_output(self, wait=False):\n super(CapturingDisplayPublisher, self).clear_output(wait)\n\n # empty the list, *do not* reassign a new list\n self.outputs.clear()\n", "path": "IPython/core/displaypub.py"}]}
| 1,843 | 103 |
gh_patches_debug_14119
|
rasdani/github-patches
|
git_diff
|
buildbot__buildbot-3918
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Transferring Files with LocalWorker seems not to be working
This ticket is a migrated Trac ticket [3628](http://trac.buildbot.net/ticket/3628)
People contributed to the original ticket: @unknown_contributor, @sa2ajj
Ticket created on: `Oct 20 2016`
Ticket last modified on: `Oct 27 2016`
---
Hi,
I just updated my buildbot to 0.9 and converted my old Slave to [[LocalWorker]].
But now my builds are stuck during the copy process "DirectoryUpload" or "!Multiple[[FileUpload]]"
I checked the `twistd.log`, and it looks like the first few small files are copied, but then the copy gets stuck during a 5MB transfer. The partially transferred file always has a size of 3227648 bytes.
Thanks for your help
---
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `master/buildbot/worker/protocols/null.py`
Content:
```
1 # This file is part of Buildbot. Buildbot is free software: you can
2 # redistribute it and/or modify it under the terms of the GNU General Public
3 # License as published by the Free Software Foundation, version 2.
4 #
5 # This program is distributed in the hope that it will be useful, but WITHOUT
6 # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
7 # FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
8 # details.
9 #
10 # You should have received a copy of the GNU General Public License along with
11 # this program; if not, write to the Free Software Foundation, Inc., 51
12 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
13 #
14 # Copyright Buildbot Team Members
15
16 from __future__ import absolute_import
17 from __future__ import division
18 from __future__ import print_function
19
20 from twisted.internet import defer
21 from twisted.python import log
22
23 from buildbot.worker.protocols import base
24
25
26 class Listener(base.Listener):
27 pass
28
29
30 class ProxyMixin():
31
32 def __init__(self, impl):
33 assert isinstance(impl, self.ImplClass)
34 self.impl = impl
35 self._disconnect_listeners = []
36
37 def callRemote(self, message, *args, **kw):
38 method = getattr(self.impl, "remote_%s" % message, None)
39 if method is None:
40 raise AttributeError("No such method: remote_%s" % (message,))
41 try:
42 state = method(*args, **kw)
43 except TypeError:
44 log.msg("%s didn't accept %s and %s" % (method, args, kw))
45 raise
46 return defer.maybeDeferred(lambda: state)
47
48 def notifyOnDisconnect(self, cb):
49 pass
50
51 def dontNotifyOnDisconnect(self, cb):
52 pass
53
54
55 # just add ProxyMixin capability to the RemoteCommandProxy
56 # so that callers of callRemote actually directly call the proper method
57 class RemoteCommandProxy(ProxyMixin):
58 ImplClass = base.RemoteCommandImpl
59
60
61 class FileReaderProxy(ProxyMixin):
62 ImplClass = base.FileReaderImpl
63
64
65 class FileWriterProxy(ProxyMixin):
66 ImplClass = base.FileWriterImpl
67
68
69 class Connection(base.Connection):
70 proxies = {base.FileWriterImpl: FileWriterProxy,
71 base.FileReaderImpl: FileReaderProxy}
72
73 def loseConnection(self):
74 pass
75
76 def remotePrint(self, message):
77 return defer.maybeDeferred(self.worker.bot.remote_print, message)
78
79 def remoteGetWorkerInfo(self):
80 return defer.maybeDeferred(self.worker.bot.remote_getWorkerInfo)
81
82 def remoteSetBuilderList(self, builders):
83 return defer.maybeDeferred(self.worker.bot.remote_setBuilderList, builders)
84
85 def remoteStartCommand(self, remoteCommand, builderName, commandId, commandName, args):
86 remoteCommand = RemoteCommandProxy(remoteCommand)
87 args = self.createArgsProxies(args)
88 workerforbuilder = self.worker.bot.builders[builderName]
89 return defer.maybeDeferred(workerforbuilder.remote_startCommand, remoteCommand,
90 commandId, commandName, args)
91
92 def remoteShutdown(self):
93 return defer.maybeDeferred(self.worker.stopService)
94
95 def remoteStartBuild(self, builderName):
96 return defer.succeed(self.worker.bot.builders[builderName].remote_startBuild())
97
98 def remoteInterruptCommand(self, builderName, commandId, why):
99 workerforbuilder = self.worker.bot.builders[builderName]
100 return defer.maybeDeferred(workerforbuilder.remote_interruptCommand, commandId, why)
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/master/buildbot/worker/protocols/null.py b/master/buildbot/worker/protocols/null.py
--- a/master/buildbot/worker/protocols/null.py
+++ b/master/buildbot/worker/protocols/null.py
@@ -20,6 +20,7 @@
from twisted.internet import defer
from twisted.python import log
+from buildbot.util.eventual import fireEventually
from buildbot.worker.protocols import base
@@ -43,7 +44,8 @@
except TypeError:
log.msg("%s didn't accept %s and %s" % (method, args, kw))
raise
- return defer.maybeDeferred(lambda: state)
+ # break callback recursion for large transfers by using fireEventually
+ return fireEventually(state)
def notifyOnDisconnect(self, cb):
pass
|
{"golden_diff": "diff --git a/master/buildbot/worker/protocols/null.py b/master/buildbot/worker/protocols/null.py\n--- a/master/buildbot/worker/protocols/null.py\n+++ b/master/buildbot/worker/protocols/null.py\n@@ -20,6 +20,7 @@\n from twisted.internet import defer\n from twisted.python import log\n \n+from buildbot.util.eventual import fireEventually\n from buildbot.worker.protocols import base\n \n \n@@ -43,7 +44,8 @@\n except TypeError:\n log.msg(\"%s didn't accept %s and %s\" % (method, args, kw))\n raise\n- return defer.maybeDeferred(lambda: state)\n+ # break callback recursion for large transfers by using fireEventually\n+ return fireEventually(state)\n \n def notifyOnDisconnect(self, cb):\n pass\n", "issue": "Transferring Files with LocalWorker seems not working\nThis ticket is a migrated Trac ticket [3628](http://trac.buildbot.net/ticket/3628)\n\nPeople contributed to the original ticket: @unknown_contributor, @sa2ajj\nTicket created on: `Oct 20 2016`\nTicket last modified on: `Oct 27 2016`\n\n---\n\nHi,\n\nI just update my buildbot to 0.9. And i convert my old Slave to [[LocalWorker]].\n\nBut now my builds are stuck during the copy process \"DirectoryUpload\" or \"!Multiple[[FileUpload]]\"\n\nI check the `twistd.log` and it looks like the first light files are copied but then the copy is stuck during a transfer of 5MB. The file has always a size of 3227648o.\n\nThanks for your help\n\n\n---\n\n\n\n", "before_files": [{"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS\n# FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more\n# details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n#\n# Copyright Buildbot Team Members\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom twisted.internet import defer\nfrom twisted.python import log\n\nfrom buildbot.worker.protocols import base\n\n\nclass Listener(base.Listener):\n pass\n\n\nclass ProxyMixin():\n\n def __init__(self, impl):\n assert isinstance(impl, self.ImplClass)\n self.impl = impl\n self._disconnect_listeners = []\n\n def callRemote(self, message, *args, **kw):\n method = getattr(self.impl, \"remote_%s\" % message, None)\n if method is None:\n raise AttributeError(\"No such method: remote_%s\" % (message,))\n try:\n state = method(*args, **kw)\n except TypeError:\n log.msg(\"%s didn't accept %s and %s\" % (method, args, kw))\n raise\n return defer.maybeDeferred(lambda: state)\n\n def notifyOnDisconnect(self, cb):\n pass\n\n def dontNotifyOnDisconnect(self, cb):\n pass\n\n\n# just add ProxyMixin capability to the RemoteCommandProxy\n# so that callers of callRemote actually directly call the proper method\nclass RemoteCommandProxy(ProxyMixin):\n ImplClass = base.RemoteCommandImpl\n\n\nclass FileReaderProxy(ProxyMixin):\n ImplClass = base.FileReaderImpl\n\n\nclass FileWriterProxy(ProxyMixin):\n ImplClass = base.FileWriterImpl\n\n\nclass Connection(base.Connection):\n proxies = {base.FileWriterImpl: FileWriterProxy,\n base.FileReaderImpl: FileReaderProxy}\n\n def loseConnection(self):\n pass\n\n def remotePrint(self, message):\n return defer.maybeDeferred(self.worker.bot.remote_print, message)\n\n def remoteGetWorkerInfo(self):\n return defer.maybeDeferred(self.worker.bot.remote_getWorkerInfo)\n\n def remoteSetBuilderList(self, builders):\n return defer.maybeDeferred(self.worker.bot.remote_setBuilderList, builders)\n\n def remoteStartCommand(self, remoteCommand, builderName, commandId, commandName, args):\n remoteCommand = RemoteCommandProxy(remoteCommand)\n args = self.createArgsProxies(args)\n workerforbuilder = self.worker.bot.builders[builderName]\n return defer.maybeDeferred(workerforbuilder.remote_startCommand, remoteCommand,\n commandId, commandName, args)\n\n def remoteShutdown(self):\n return defer.maybeDeferred(self.worker.stopService)\n\n def remoteStartBuild(self, builderName):\n return defer.succeed(self.worker.bot.builders[builderName].remote_startBuild())\n\n def remoteInterruptCommand(self, builderName, commandId, why):\n workerforbuilder = self.worker.bot.builders[builderName]\n return defer.maybeDeferred(workerforbuilder.remote_interruptCommand, commandId, why)\n", "path": "master/buildbot/worker/protocols/null.py"}], "after_files": [{"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS\n# FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more\n# details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n#\n# Copyright Buildbot Team Members\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom twisted.internet import defer\nfrom twisted.python import log\n\nfrom buildbot.util.eventual import fireEventually\nfrom buildbot.worker.protocols import base\n\n\nclass Listener(base.Listener):\n pass\n\n\nclass ProxyMixin():\n\n def __init__(self, impl):\n assert isinstance(impl, self.ImplClass)\n self.impl = impl\n self._disconnect_listeners = []\n\n def callRemote(self, message, *args, **kw):\n method = getattr(self.impl, \"remote_%s\" % message, None)\n if method is None:\n raise AttributeError(\"No such method: remote_%s\" % (message,))\n try:\n state = method(*args, **kw)\n except TypeError:\n log.msg(\"%s didn't accept %s and %s\" % (method, args, kw))\n raise\n # break callback recursion for large transfers by using fireEventually\n return fireEventually(state)\n\n def notifyOnDisconnect(self, cb):\n pass\n\n def dontNotifyOnDisconnect(self, cb):\n pass\n\n\n# just add ProxyMixin capability to the RemoteCommandProxy\n# so that callers of callRemote actually directly call the proper method\nclass RemoteCommandProxy(ProxyMixin):\n ImplClass = base.RemoteCommandImpl\n\n\nclass FileReaderProxy(ProxyMixin):\n ImplClass = base.FileReaderImpl\n\n\nclass FileWriterProxy(ProxyMixin):\n ImplClass = base.FileWriterImpl\n\n\nclass Connection(base.Connection):\n proxies = {base.FileWriterImpl: FileWriterProxy,\n base.FileReaderImpl: FileReaderProxy}\n\n def loseConnection(self):\n pass\n\n def remotePrint(self, message):\n return defer.maybeDeferred(self.worker.bot.remote_print, message)\n\n def remoteGetWorkerInfo(self):\n return defer.maybeDeferred(self.worker.bot.remote_getWorkerInfo)\n\n def remoteSetBuilderList(self, builders):\n return defer.maybeDeferred(self.worker.bot.remote_setBuilderList, builders)\n\n def remoteStartCommand(self, remoteCommand, builderName, commandId, commandName, args):\n remoteCommand = RemoteCommandProxy(remoteCommand)\n args = self.createArgsProxies(args)\n workerforbuilder = self.worker.bot.builders[builderName]\n return defer.maybeDeferred(workerforbuilder.remote_startCommand, remoteCommand,\n commandId, commandName, args)\n\n def remoteShutdown(self):\n return defer.maybeDeferred(self.worker.stopService)\n\n def remoteStartBuild(self, builderName):\n return defer.succeed(self.worker.bot.builders[builderName].remote_startBuild())\n\n def remoteInterruptCommand(self, builderName, commandId, why):\n workerforbuilder = self.worker.bot.builders[builderName]\n return defer.maybeDeferred(workerforbuilder.remote_interruptCommand, commandId, why)\n", "path": "master/buildbot/worker/protocols/null.py"}]}
| 1,413 | 179 |
gh_patches_debug_8612
|
rasdani/github-patches
|
git_diff
|
pyqtgraph__pyqtgraph-2934
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
import pyqtgraph.canvas fails.
<!-- In the following, please describe your issue in detail! -->
<!-- If some sections do not apply, just remove them. -->
### Short description
Importing pyqtgraph.canvas causes the program to abort and exit with an error code..
### Code to reproduce
<!-- Please provide a minimal working example that reproduces the issue in the code block below.
Ideally, this should be a full example someone else could run without additional setup. -->
```python
print("Mark 0", flush=True)
import time
print("Mark 1", flush=True)
import pyqtgraph.canvas
print("Mark 2", flush=True)
time.sleep(10)
print("Mark 3", flush=True)
```
### Expected behavior
1. The program should print out all the markers four ,0 to 3, with a 10 seconds delay between marker 2 and 3.
2. The program should exit with status code 0 (ok)
### Real behavior
1. The program prints only markers 0 and 1.
2. The program exist with status code 127 (error)
### An error occurred?
No error messages were noticed. The program just exits when it tries to import pyqtgraph.canvas.
### Tested environment(s)
* PyQtGraph version: <!-- output of pyqtgraph.__version__ --> 0.13.3
* Qt Python binding: <!-- output of pyqtgraph.Qt.VERSION_INFO --> PyQt6 6.5.2 Qt 6.5.2
* Python version: 3.12.0
* NumPy version: <!-- output of numpy.__version__ --> 1.26.0
* Operating system: Windows 10, Ryzen
* Installation method: <!-- e.g. pip, conda, system packages, ... --> pip
### Additional context
The issue is discussed here in the pyinstaller repository https://github.com/pyinstaller/pyinstaller/issues/7991#issuecomment-1752032919
Swapping the two lines of CanvasManager as described the link above seems to fix the problem.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyqtgraph/canvas/CanvasManager.py`
Content:
```
1 from ..Qt import QtCore, QtWidgets
2
3 if not hasattr(QtCore, 'Signal'):
4 QtCore.Signal = QtCore.pyqtSignal
5 import weakref
6
7
8 class CanvasManager(QtCore.QObject):
9 SINGLETON = None
10
11 sigCanvasListChanged = QtCore.Signal()
12
13 def __init__(self):
14 if CanvasManager.SINGLETON is not None:
15 raise Exception("Can only create one canvas manager.")
16 CanvasManager.SINGLETON = self
17 QtCore.QObject.__init__(self)
18 self.canvases = weakref.WeakValueDictionary()
19
20 @classmethod
21 def instance(cls):
22 return CanvasManager.SINGLETON
23
24 def registerCanvas(self, canvas, name):
25 n2 = name
26 i = 0
27 while n2 in self.canvases:
28 n2 = "%s_%03d" % (name, i)
29 i += 1
30 self.canvases[n2] = canvas
31 self.sigCanvasListChanged.emit()
32 return n2
33
34 def unregisterCanvas(self, name):
35 c = self.canvases[name]
36 del self.canvases[name]
37 self.sigCanvasListChanged.emit()
38
39 def listCanvases(self):
40 return list(self.canvases.keys())
41
42 def getCanvas(self, name):
43 return self.canvases[name]
44
45
46 manager = CanvasManager()
47
48
49 class CanvasCombo(QtWidgets.QComboBox):
50 def __init__(self, parent=None):
51 QtWidgets.QComboBox.__init__(self, parent)
52 man = CanvasManager.instance()
53 man.sigCanvasListChanged.connect(self.updateCanvasList)
54 self.hostName = None
55 self.updateCanvasList()
56
57 def updateCanvasList(self):
58 canvases = CanvasManager.instance().listCanvases()
59 canvases.insert(0, "")
60 if self.hostName in canvases:
61 canvases.remove(self.hostName)
62
63 sel = self.currentText()
64 if sel in canvases:
65 self.blockSignals(True) ## change does not affect current selection; block signals during update
66 self.clear()
67 for i in canvases:
68 self.addItem(i)
69 if i == sel:
70 self.setCurrentIndex(self.count())
71
72 self.blockSignals(False)
73
74 def setHostName(self, name):
75 self.hostName = name
76 self.updateCanvasList()
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pyqtgraph/canvas/CanvasManager.py b/pyqtgraph/canvas/CanvasManager.py
--- a/pyqtgraph/canvas/CanvasManager.py
+++ b/pyqtgraph/canvas/CanvasManager.py
@@ -13,8 +13,9 @@
def __init__(self):
if CanvasManager.SINGLETON is not None:
raise Exception("Can only create one canvas manager.")
- CanvasManager.SINGLETON = self
+ # It is important to save SINGLETON *after* the call to QObject.__init__, see #2838.
QtCore.QObject.__init__(self)
+ CanvasManager.SINGLETON = self
self.canvases = weakref.WeakValueDictionary()
@classmethod
|
{"golden_diff": "diff --git a/pyqtgraph/canvas/CanvasManager.py b/pyqtgraph/canvas/CanvasManager.py\n--- a/pyqtgraph/canvas/CanvasManager.py\n+++ b/pyqtgraph/canvas/CanvasManager.py\n@@ -13,8 +13,9 @@\n def __init__(self):\n if CanvasManager.SINGLETON is not None:\n raise Exception(\"Can only create one canvas manager.\")\n- CanvasManager.SINGLETON = self\n+ # It is important to save SINGLETON *after* the call to QObject.__init__, see #2838.\n QtCore.QObject.__init__(self)\n+ CanvasManager.SINGLETON = self\n self.canvases = weakref.WeakValueDictionary()\n \n @classmethod\n", "issue": "import pyqtgraph.canvas fails.\n<!-- In the following, please describe your issue in detail! -->\r\n<!-- If some sections do not apply, just remove them. -->\r\n\r\n### Short description\r\nImporting pyqtgraph.canvas causes the program to abort and exit with an error code..\r\n\r\n### Code to reproduce\r\n<!-- Please provide a minimal working example that reproduces the issue in the code block below.\r\n Ideally, this should be a full example someone else could run without additional setup. -->\r\n\r\n```python\r\nprint(\"Mark 0\", flush=True)\r\nimport time\r\nprint(\"Mark 1\", flush=True)\r\nimport pyqtgraph.canvas\r\nprint(\"Mark 2\", flush=True)\r\ntime.sleep(10)\r\nprint(\"Mark 3\", flush=True)\r\n\r\n```\r\n\r\n### Expected behavior\r\n1. The program should print out all the markers four ,0 to 3, with a 10 seconds delay between marker 2 and 3.\r\n2. The program should exit with status code 0 (ok)\r\n\r\n\r\n### Real behavior\r\n1. The program prints only markers 0 and 1.\r\n2. The program exist with status code 127 (error)\r\n\r\n### An error occurred?\r\nNo error messages were noticed. The program just exits when it tries to import pyqtgraph.canvas.\r\n\r\n### Tested environment(s)\r\n\r\n * PyQtGraph version: <!-- output of pyqtgraph.__version__ --> 0.13.3\r\n * Qt Python binding: <!-- output of pyqtgraph.Qt.VERSION_INFO --> PyQt6 6.5.2 Qt 6.5.2\r\n * Python version: 3.12.0\r\n * NumPy version: <!-- output of numpy.__version__ --> 1.26.0\r\n * Operating system: Windows 10, Ryzen\r\n * Installation method: <!-- e.g. pip, conda, system packages, ... 
--> pip\r\n\r\n### Additional context\r\n\r\nThe issue is discussed here in the pyinstaller repository https://github.com/pyinstaller/pyinstaller/issues/7991#issuecomment-1752032919\r\n\r\nSwapping the two lines of CanvasManager as described the link above seems to fix the problem.\r\n\n", "before_files": [{"content": "from ..Qt import QtCore, QtWidgets\n\nif not hasattr(QtCore, 'Signal'):\n QtCore.Signal = QtCore.pyqtSignal\nimport weakref\n\n\nclass CanvasManager(QtCore.QObject):\n SINGLETON = None\n \n sigCanvasListChanged = QtCore.Signal()\n \n def __init__(self):\n if CanvasManager.SINGLETON is not None:\n raise Exception(\"Can only create one canvas manager.\")\n CanvasManager.SINGLETON = self\n QtCore.QObject.__init__(self)\n self.canvases = weakref.WeakValueDictionary()\n\n @classmethod\n def instance(cls):\n return CanvasManager.SINGLETON\n \n def registerCanvas(self, canvas, name):\n n2 = name\n i = 0\n while n2 in self.canvases:\n n2 = \"%s_%03d\" % (name, i)\n i += 1\n self.canvases[n2] = canvas\n self.sigCanvasListChanged.emit()\n return n2\n \n def unregisterCanvas(self, name):\n c = self.canvases[name]\n del self.canvases[name]\n self.sigCanvasListChanged.emit()\n \n def listCanvases(self):\n return list(self.canvases.keys())\n \n def getCanvas(self, name):\n return self.canvases[name]\n \n \nmanager = CanvasManager()\n\n\nclass CanvasCombo(QtWidgets.QComboBox):\n def __init__(self, parent=None):\n QtWidgets.QComboBox.__init__(self, parent)\n man = CanvasManager.instance()\n man.sigCanvasListChanged.connect(self.updateCanvasList)\n self.hostName = None\n self.updateCanvasList()\n \n def updateCanvasList(self):\n canvases = CanvasManager.instance().listCanvases()\n canvases.insert(0, \"\")\n if self.hostName in canvases:\n canvases.remove(self.hostName)\n \n sel = self.currentText()\n if sel in canvases:\n self.blockSignals(True) ## change does not affect current selection; block signals during update\n self.clear()\n for i in canvases:\n self.addItem(i)\n if i == sel:\n self.setCurrentIndex(self.count())\n \n self.blockSignals(False)\n \n def setHostName(self, name):\n self.hostName = name\n self.updateCanvasList()\n", "path": "pyqtgraph/canvas/CanvasManager.py"}], "after_files": [{"content": "from ..Qt import QtCore, QtWidgets\n\nif not hasattr(QtCore, 'Signal'):\n QtCore.Signal = QtCore.pyqtSignal\nimport weakref\n\n\nclass CanvasManager(QtCore.QObject):\n SINGLETON = None\n \n sigCanvasListChanged = QtCore.Signal()\n \n def __init__(self):\n if CanvasManager.SINGLETON is not None:\n raise Exception(\"Can only create one canvas manager.\")\n # It is important to save SINGLETON *after* the call to QObject.__init__, see #2838.\n QtCore.QObject.__init__(self)\n CanvasManager.SINGLETON = self\n self.canvases = weakref.WeakValueDictionary()\n\n @classmethod\n def instance(cls):\n return CanvasManager.SINGLETON\n \n def registerCanvas(self, canvas, name):\n n2 = name\n i = 0\n while n2 in self.canvases:\n n2 = \"%s_%03d\" % (name, i)\n i += 1\n self.canvases[n2] = canvas\n self.sigCanvasListChanged.emit()\n return n2\n \n def unregisterCanvas(self, name):\n c = self.canvases[name]\n del self.canvases[name]\n self.sigCanvasListChanged.emit()\n \n def listCanvases(self):\n return list(self.canvases.keys())\n \n def getCanvas(self, name):\n return self.canvases[name]\n \n \nmanager = CanvasManager()\n\n\nclass CanvasCombo(QtWidgets.QComboBox):\n def __init__(self, parent=None):\n QtWidgets.QComboBox.__init__(self, parent)\n man = CanvasManager.instance()\n 
man.sigCanvasListChanged.connect(self.updateCanvasList)\n self.hostName = None\n self.updateCanvasList()\n \n def updateCanvasList(self):\n canvases = CanvasManager.instance().listCanvases()\n canvases.insert(0, \"\")\n if self.hostName in canvases:\n canvases.remove(self.hostName)\n \n sel = self.currentText()\n if sel in canvases:\n self.blockSignals(True) ## change does not affect current selection; block signals during update\n self.clear()\n for i in canvases:\n self.addItem(i)\n if i == sel:\n self.setCurrentIndex(self.count())\n \n self.blockSignals(False)\n \n def setHostName(self, name):\n self.hostName = name\n self.updateCanvasList()\n", "path": "pyqtgraph/canvas/CanvasManager.py"}]}
| 1,350 | 163 |
gh_patches_debug_30736
|
rasdani/github-patches
|
git_diff
|
Zeroto521__my-data-toolkit-705
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ENH: New geoaccessor to generate great circle distances matrix
<!--
Thanks for contributing a pull request!
Please follow these standard acronyms to start the commit message:
- ENH: enhancement
- BUG: bug fix
- DOC: documentation
- TYP: type annotations
- TST: addition or modification of tests
- MAINT: maintenance commit (refactoring, typos, etc.)
- BLD: change related to building
- REL: related to releasing
- API: an (incompatible) API change
- DEP: deprecate something, or remove a deprecated object
- DEV: development tool or utility
- REV: revert an earlier commit
- PERF: performance improvement
- BOT: always commit via a bot
- CI: related to CI or CD
- CLN: Code cleanup
-->
- [x] closes #699
- [ ] whatsnew entry
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dtoolkit/geoaccessor/geoseries/geodistance_matrix.py`
Content:
```
1 from __future__ import annotations
2
3 import geopandas as gpd
4 import numpy as np
5 import pandas as pd
6
7 from dtoolkit.geoaccessor.register import register_geoseries_method
8
9
10 @register_geoseries_method
11 def geodistance_matrix(
12 s: gpd.GeoSeries,
13 /,
14 other: gpd.GeoSeries | gpd.GeoDataFrame | None = None,
15 radius: float = 6371008.7714150598,
16 ) -> pd.DataFrame:
17 """
18 Returns a ``DataFrame`` containing the `great-circle`__ distances matrix between in
19 ``s`` and ``other`` via haversine formula.
20
21 __ https://en.wikipedia.org/wiki/Great-circle_distance
22
23 .. math::
24
25 D(x, y) = 2 \\arcsin [
26 \\sqrt{
27 \\sin^2 ((x_1 - y_1) / 2)
28 + \\cos(x_1) \\cos(y_1) \\sin^2 ((x_2 - y_2) / 2)
29 }
30 ]
31
32 Parameters
33 ----------
34 other : GeoSeries, or GeoDataFrame, default None
35 If None, uses ``other=s``.
36
37 radius : float, default 6371008.7714150598
38 Great-circle distance uses a spherical model of the earth, using the mean earth
39 radius as defined by the International Union of Geodesy and Geophysics,
40 (2\\ *a* + *b*)/3 = 6371008.7714150598 meters for WGS-84.
41
42 Returns
43 -------
44 DataFrame
45 - The index and columns are the same as the index of ``s`` and ``other``.
46 - The values are the great-circle distances and its unit is meters.
47
48 Raises
49 ------
50 ModuleNotFoundError
51 If don't have module named 'sklearn'.
52
53 ValueError
54 If the CRS is not ``ESGP:4326``.
55
56 See Also
57 --------
58 sklearn.metrics.pairwise.haversine_distances
59 dtoolkit.geoaccessor.geoseries.geodistance
60 dtoolkit.geoaccessor.geoseries.geodistance_matrix
61 dtoolkit.geoaccessor.geodataframe.geodistance
62 dtoolkit.geoaccessor.geodataframe.geodistance_matrix
63
64 Notes
65 -----
66 - Currently, only supports Point geometry.
67 - The great-circle distance is the angular distance between two points on the
68 surface of a sphere. As the Earth is nearly spherical, the haversine formula
69 provides a good approximation of the distance between two points of the Earth
70 surface, with a less than 1% error on average.
71
72 Examples
73 --------
74 >>> import dtoolkit.geoaccessor
75 >>> df = pd.DataFrame(
76 ... {
77 ... "x": [120, 122, 100],
78 ... "y":[30, 55, 1],
79 ... },
80 ... ).from_xy("x", "y", crs=4326)
81 >>> df
82 x y geometry
83 0 120 30 POINT (120.00000 30.00000)
84 1 122 55 POINT (122.00000 55.00000)
85 2 100 1 POINT (100.00000 1.00000)
86 >>> other = pd.DataFrame(
87 ... {
88 ... "x": [120, 110],
89 ... "y":[30, 40],
90 ... },
91 ... ).from_xy("x", "y", crs=4326)
92 >>> other
93 x y geometry
94 0 120 30 POINT (120.00000 30.00000)
95 1 110 40 POINT (110.00000 40.00000)
96 >>> df.geodistance_matrix(other)
97 0 1
98 0 0.000000e+00 1.435335e+06
99 1 2.784435e+06 1.889892e+06
100 2 3.855604e+06 4.453100e+06
101 """
102 from sklearn.metrics.pairwise import haversine_distances
103
104 if s.crs != 4326:
105 raise ValueError(f"Only support 'EPSG:4326' CRS, but got {s.crs!r}.")
106
107 if isinstance(other, gpd.base.GeoPandasBase):
108 if other.crs != 4326:
109 raise ValueError(f"Only support 'EPSG:4326' CRS, but got {other.crs!r}.")
110
111 # Force convert to GeoSeries
112 other = other.geometry
113
114 X = np.radians(np.stack((s.y, s.x), axis=1))
115 Y = np.radians(np.stack((other.y, other.x), axis=1)) if other is not None else other
116 return pd.DataFrame(
117 radius * haversine_distances(X, Y),
118 index=s.index,
119 columns=other.index,
120 )
121
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py b/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py
--- a/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py
+++ b/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py
@@ -53,6 +53,9 @@
ValueError
If the CRS is not ``ESGP:4326``.
+ TypeError
+ If the other is not a GeoSeries, GeoDataFrame, or None type.
+
See Also
--------
sklearn.metrics.pairwise.haversine_distances
@@ -72,6 +75,7 @@
Examples
--------
>>> import dtoolkit.geoaccessor
+ >>> import pandas as pd
>>> df = pd.DataFrame(
... {
... "x": [120, 122, 100],
@@ -104,17 +108,19 @@
if s.crs != 4326:
raise ValueError(f"Only support 'EPSG:4326' CRS, but got {s.crs!r}.")
- if isinstance(other, gpd.base.GeoPandasBase):
+ if other is None:
+ Y = None
+ elif isinstance(other, gpd.base.GeoPandasBase):
if other.crs != 4326:
raise ValueError(f"Only support 'EPSG:4326' CRS, but got {other.crs!r}.")
- # Force convert to GeoSeries
- other = other.geometry
+ Y = np.radians(np.stack((other.geometry.y, other.geometry.x), axis=1))
+ else:
+ raise TypeError(f"Unknown type: {type(other).__name__!r}.")
X = np.radians(np.stack((s.y, s.x), axis=1))
- Y = np.radians(np.stack((other.y, other.x), axis=1)) if other is not None else other
return pd.DataFrame(
radius * haversine_distances(X, Y),
index=s.index,
- columns=other.index,
+ columns=other.index if other is not None else s.index,
)
|
{"golden_diff": "diff --git a/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py b/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py\n--- a/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py\n+++ b/dtoolkit/geoaccessor/geoseries/geodistance_matrix.py\n@@ -53,6 +53,9 @@\n ValueError\n If the CRS is not ``ESGP:4326``.\n \n+ TypeError\n+ If the other is not a GeoSeries, GeoDataFrame, or None type.\n+\n See Also\n --------\n sklearn.metrics.pairwise.haversine_distances\n@@ -72,6 +75,7 @@\n Examples\n --------\n >>> import dtoolkit.geoaccessor\n+ >>> import pandas as pd\n >>> df = pd.DataFrame(\n ... {\n ... \"x\": [120, 122, 100],\n@@ -104,17 +108,19 @@\n if s.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {s.crs!r}.\")\n \n- if isinstance(other, gpd.base.GeoPandasBase):\n+ if other is None:\n+ Y = None\n+ elif isinstance(other, gpd.base.GeoPandasBase):\n if other.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {other.crs!r}.\")\n \n- # Force convert to GeoSeries\n- other = other.geometry\n+ Y = np.radians(np.stack((other.geometry.y, other.geometry.x), axis=1))\n+ else:\n+ raise TypeError(f\"Unknown type: {type(other).__name__!r}.\")\n \n X = np.radians(np.stack((s.y, s.x), axis=1))\n- Y = np.radians(np.stack((other.y, other.x), axis=1)) if other is not None else other\n return pd.DataFrame(\n radius * haversine_distances(X, Y),\n index=s.index,\n- columns=other.index,\n+ columns=other.index if other is not None else s.index,\n )\n", "issue": "ENH: New geoaccessor to generate great circle distances matrix\n<!--\r\nThanks for contributing a pull request!\r\n\r\nPlease follow these standard acronyms to start the commit message:\r\n\r\n- ENH: enhancement\r\n- BUG: bug fix\r\n- DOC: documentation\r\n- TYP: type annotations\r\n- TST: addition or modification of tests\r\n- MAINT: maintenance commit (refactoring, typos, etc.)\r\n- BLD: change related to building\r\n- REL: related to releasing\r\n- API: an (incompatible) API change\r\n- DEP: deprecate something, or remove a deprecated object\r\n- DEV: development tool or utility\r\n- REV: revert an earlier commit\r\n- PERF: performance improvement\r\n- BOT: always commit via a bot\r\n- CI: related to CI or CD\r\n- CLN: Code cleanup\r\n-->\r\n\r\n- [x] closes #699\r\n- [ ] whatsnew entry\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nimport geopandas as gpd\nimport numpy as np\nimport pandas as pd\n\nfrom dtoolkit.geoaccessor.register import register_geoseries_method\n\n\n@register_geoseries_method\ndef geodistance_matrix(\n s: gpd.GeoSeries,\n /,\n other: gpd.GeoSeries | gpd.GeoDataFrame | None = None,\n radius: float = 6371008.7714150598,\n) -> pd.DataFrame:\n \"\"\"\n Returns a ``DataFrame`` containing the `great-circle`__ distances matrix between in\n ``s`` and ``other`` via haversine formula.\n\n __ https://en.wikipedia.org/wiki/Great-circle_distance\n\n .. 
math::\n\n D(x, y) = 2 \\\\arcsin [\n \\\\sqrt{\n \\\\sin^2 ((x_1 - y_1) / 2)\n + \\\\cos(x_1) \\\\cos(y_1) \\\\sin^2 ((x_2 - y_2) / 2)\n }\n ]\n\n Parameters\n ----------\n other : GeoSeries, or GeoDataFrame, default None\n If None, uses ``other=s``.\n\n radius : float, default 6371008.7714150598\n Great-circle distance uses a spherical model of the earth, using the mean earth\n radius as defined by the International Union of Geodesy and Geophysics,\n (2\\\\ *a* + *b*)/3 = 6371008.7714150598 meters for WGS-84.\n\n Returns\n -------\n DataFrame\n - The index and columns are the same as the index of ``s`` and ``other``.\n - The values are the great-circle distances and its unit is meters.\n\n Raises\n ------\n ModuleNotFoundError\n If don't have module named 'sklearn'.\n\n ValueError\n If the CRS is not ``ESGP:4326``.\n\n See Also\n --------\n sklearn.metrics.pairwise.haversine_distances\n dtoolkit.geoaccessor.geoseries.geodistance\n dtoolkit.geoaccessor.geoseries.geodistance_matrix\n dtoolkit.geoaccessor.geodataframe.geodistance\n dtoolkit.geoaccessor.geodataframe.geodistance_matrix\n\n Notes\n -----\n - Currently, only supports Point geometry.\n - The great-circle distance is the angular distance between two points on the\n surface of a sphere. As the Earth is nearly spherical, the haversine formula\n provides a good approximation of the distance between two points of the Earth\n surface, with a less than 1% error on average.\n\n Examples\n --------\n >>> import dtoolkit.geoaccessor\n >>> df = pd.DataFrame(\n ... {\n ... \"x\": [120, 122, 100],\n ... \"y\":[30, 55, 1],\n ... },\n ... ).from_xy(\"x\", \"y\", crs=4326)\n >>> df\n x y geometry\n 0 120 30 POINT (120.00000 30.00000)\n 1 122 55 POINT (122.00000 55.00000)\n 2 100 1 POINT (100.00000 1.00000)\n >>> other = pd.DataFrame(\n ... {\n ... \"x\": [120, 110],\n ... \"y\":[30, 40],\n ... },\n ... ).from_xy(\"x\", \"y\", crs=4326)\n >>> other\n x y geometry\n 0 120 30 POINT (120.00000 30.00000)\n 1 110 40 POINT (110.00000 40.00000)\n >>> df.geodistance_matrix(other)\n 0 1\n 0 0.000000e+00 1.435335e+06\n 1 2.784435e+06 1.889892e+06\n 2 3.855604e+06 4.453100e+06\n \"\"\"\n from sklearn.metrics.pairwise import haversine_distances\n\n if s.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {s.crs!r}.\")\n\n if isinstance(other, gpd.base.GeoPandasBase):\n if other.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {other.crs!r}.\")\n\n # Force convert to GeoSeries\n other = other.geometry\n\n X = np.radians(np.stack((s.y, s.x), axis=1))\n Y = np.radians(np.stack((other.y, other.x), axis=1)) if other is not None else other\n return pd.DataFrame(\n radius * haversine_distances(X, Y),\n index=s.index,\n columns=other.index,\n )\n", "path": "dtoolkit/geoaccessor/geoseries/geodistance_matrix.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport geopandas as gpd\nimport numpy as np\nimport pandas as pd\n\nfrom dtoolkit.geoaccessor.register import register_geoseries_method\n\n\n@register_geoseries_method\ndef geodistance_matrix(\n s: gpd.GeoSeries,\n /,\n other: gpd.GeoSeries | gpd.GeoDataFrame | None = None,\n radius: float = 6371008.7714150598,\n) -> pd.DataFrame:\n \"\"\"\n Returns a ``DataFrame`` containing the `great-circle`__ distances matrix between in\n ``s`` and ``other`` via haversine formula.\n\n __ https://en.wikipedia.org/wiki/Great-circle_distance\n\n .. 
math::\n\n D(x, y) = 2 \\\\arcsin [\n \\\\sqrt{\n \\\\sin^2 ((x_1 - y_1) / 2)\n + \\\\cos(x_1) \\\\cos(y_1) \\\\sin^2 ((x_2 - y_2) / 2)\n }\n ]\n\n Parameters\n ----------\n other : GeoSeries, or GeoDataFrame, default None\n If None, uses ``other=s``.\n\n radius : float, default 6371008.7714150598\n Great-circle distance uses a spherical model of the earth, using the mean earth\n radius as defined by the International Union of Geodesy and Geophysics,\n (2\\\\ *a* + *b*)/3 = 6371008.7714150598 meters for WGS-84.\n\n Returns\n -------\n DataFrame\n - The index and columns are the same as the index of ``s`` and ``other``.\n - The values are the great-circle distances and its unit is meters.\n\n Raises\n ------\n ModuleNotFoundError\n If don't have module named 'sklearn'.\n\n ValueError\n If the CRS is not ``ESGP:4326``.\n\n TypeError\n If the other is not a GeoSeries, GeoDataFrame, or None type.\n\n See Also\n --------\n sklearn.metrics.pairwise.haversine_distances\n dtoolkit.geoaccessor.geoseries.geodistance\n dtoolkit.geoaccessor.geoseries.geodistance_matrix\n dtoolkit.geoaccessor.geodataframe.geodistance\n dtoolkit.geoaccessor.geodataframe.geodistance_matrix\n\n Notes\n -----\n - Currently, only supports Point geometry.\n - The great-circle distance is the angular distance between two points on the\n surface of a sphere. As the Earth is nearly spherical, the haversine formula\n provides a good approximation of the distance between two points of the Earth\n surface, with a less than 1% error on average.\n\n Examples\n --------\n >>> import dtoolkit.geoaccessor\n >>> import pandas as pd\n >>> df = pd.DataFrame(\n ... {\n ... \"x\": [120, 122, 100],\n ... \"y\":[30, 55, 1],\n ... },\n ... ).from_xy(\"x\", \"y\", crs=4326)\n >>> df\n x y geometry\n 0 120 30 POINT (120.00000 30.00000)\n 1 122 55 POINT (122.00000 55.00000)\n 2 100 1 POINT (100.00000 1.00000)\n >>> other = pd.DataFrame(\n ... {\n ... \"x\": [120, 110],\n ... \"y\":[30, 40],\n ... },\n ... ).from_xy(\"x\", \"y\", crs=4326)\n >>> other\n x y geometry\n 0 120 30 POINT (120.00000 30.00000)\n 1 110 40 POINT (110.00000 40.00000)\n >>> df.geodistance_matrix(other)\n 0 1\n 0 0.000000e+00 1.435335e+06\n 1 2.784435e+06 1.889892e+06\n 2 3.855604e+06 4.453100e+06\n \"\"\"\n from sklearn.metrics.pairwise import haversine_distances\n\n if s.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {s.crs!r}.\")\n\n if other is None:\n Y = None\n elif isinstance(other, gpd.base.GeoPandasBase):\n if other.crs != 4326:\n raise ValueError(f\"Only support 'EPSG:4326' CRS, but got {other.crs!r}.\")\n\n Y = np.radians(np.stack((other.geometry.y, other.geometry.x), axis=1))\n else:\n raise TypeError(f\"Unknown type: {type(other).__name__!r}.\")\n\n X = np.radians(np.stack((s.y, s.x), axis=1))\n return pd.DataFrame(\n radius * haversine_distances(X, Y),\n index=s.index,\n columns=other.index if other is not None else s.index,\n )\n", "path": "dtoolkit/geoaccessor/geoseries/geodistance_matrix.py"}]}
| 1,986 | 502 |
gh_patches_debug_7900
|
rasdani/github-patches
|
git_diff
|
vyperlang__vyper-3338
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
unable to use vyper-serve 0.3.6
### Version Information
* vyper Version (output of `vyper --version`): 0.3.6
* OS: ubutu
* Python Version 3.8
### What's your issue about?
unable to use the vyper-serve api endpoint, the call always fail with an error:
`Unable to json serialize IRNode`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `vyper/cli/vyper_serve.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import argparse
4 import json
5 import sys
6 from http.server import BaseHTTPRequestHandler, HTTPServer
7 from socketserver import ThreadingMixIn
8
9 import vyper
10 from vyper.codegen import ir_node
11 from vyper.evm.opcodes import DEFAULT_EVM_VERSION
12 from vyper.exceptions import VyperException
13
14
15 def _parse_cli_args():
16 return _parse_args(sys.argv[1:])
17
18
19 def _parse_args(argv):
20 parser = argparse.ArgumentParser(description="Serve Vyper compiler as an HTTP Service")
21 parser.add_argument(
22 "--version", action="version", version=f"{vyper.__version__}+commit{vyper.__commit__}"
23 )
24 parser.add_argument(
25 "-b",
26 help="Address to bind JSON server on, default: localhost:8000",
27 default="localhost:8000",
28 dest="bind_address",
29 )
30
31 args = parser.parse_args(argv)
32
33 if ":" in args.bind_address:
34 ir_node.VYPER_COLOR_OUTPUT = False
35 runserver(*args.bind_address.split(":"))
36 else:
37 print('Provide bind address in "{address}:{port}" format')
38
39
40 class VyperRequestHandler(BaseHTTPRequestHandler):
41 def send_404(self):
42 self.send_response(404)
43 self.end_headers()
44 return
45
46 def send_cors_all(self):
47 self.send_header("Access-Control-Allow-Origin", "*")
48 self.send_header("Access-Control-Allow-Headers", "X-Requested-With, Content-type")
49
50 def do_OPTIONS(self):
51 self.send_response(200)
52 self.send_cors_all()
53 self.end_headers()
54
55 def do_GET(self):
56 if self.path == "/":
57 self.send_response(200)
58 self.send_cors_all()
59 self.end_headers()
60 self.wfile.write(f"Vyper Compiler. Version: {vyper.__version__}\n".encode())
61 else:
62 self.send_404()
63
64 return
65
66 def do_POST(self):
67 if self.path == "/compile":
68 content_len = int(self.headers.get("content-length"))
69 post_body = self.rfile.read(content_len)
70 data = json.loads(post_body)
71
72 response, status_code = self._compile(data)
73
74 self.send_response(status_code)
75 self.send_header("Content-type", "application/json")
76 self.send_cors_all()
77 self.end_headers()
78 self.wfile.write(json.dumps(response).encode())
79
80 else:
81 self.send_404()
82
83 return
84
85 def _compile(self, data):
86 code = data.get("code")
87 if not code:
88 return {"status": "failed", "message": 'No "code" key supplied'}, 400
89 if not isinstance(code, str):
90 return {"status": "failed", "message": '"code" must be a non-empty string'}, 400
91
92 try:
93 code = data["code"]
94 out_dict = vyper.compile_codes(
95 {"": code},
96 list(vyper.compiler.OUTPUT_FORMATS.keys()),
97 evm_version=data.get("evm_version", DEFAULT_EVM_VERSION),
98 )[""]
99 out_dict["ir"] = str(out_dict["ir"])
100 except VyperException as e:
101 return (
102 {"status": "failed", "message": str(e), "column": e.col_offset, "line": e.lineno},
103 400,
104 )
105 except SyntaxError as e:
106 return (
107 {"status": "failed", "message": str(e), "column": e.offset, "line": e.lineno},
108 400,
109 )
110
111 out_dict.update({"status": "success"})
112
113 return out_dict, 200
114
115
116 class VyperHTTPServer(ThreadingMixIn, HTTPServer):
117 """Handle requests in a separate thread."""
118
119 pass
120
121
122 def runserver(host="", port=8000):
123 server_address = (host, int(port))
124 httpd = VyperHTTPServer(server_address, VyperRequestHandler)
125 print(f"Listening on http://{host}:{port}")
126 httpd.serve_forever()
127
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/vyper/cli/vyper_serve.py b/vyper/cli/vyper_serve.py
--- a/vyper/cli/vyper_serve.py
+++ b/vyper/cli/vyper_serve.py
@@ -97,6 +97,7 @@
evm_version=data.get("evm_version", DEFAULT_EVM_VERSION),
)[""]
out_dict["ir"] = str(out_dict["ir"])
+ out_dict["ir_runtime"] = str(out_dict["ir_runtime"])
except VyperException as e:
return (
{"status": "failed", "message": str(e), "column": e.col_offset, "line": e.lineno},
|
{"golden_diff": "diff --git a/vyper/cli/vyper_serve.py b/vyper/cli/vyper_serve.py\n--- a/vyper/cli/vyper_serve.py\n+++ b/vyper/cli/vyper_serve.py\n@@ -97,6 +97,7 @@\n evm_version=data.get(\"evm_version\", DEFAULT_EVM_VERSION),\n )[\"\"]\n out_dict[\"ir\"] = str(out_dict[\"ir\"])\n+ out_dict[\"ir_runtime\"] = str(out_dict[\"ir_runtime\"])\n except VyperException as e:\n return (\n {\"status\": \"failed\", \"message\": str(e), \"column\": e.col_offset, \"line\": e.lineno},\n", "issue": "unable to use vyper-serve 0.3.6\n### Version Information\r\n\r\n* vyper Version (output of `vyper --version`): 0.3.6\r\n* OS: ubutu\r\n* Python Version 3.8\r\n\r\n### What's your issue about?\r\n\r\nunable to use the vyper-serve api endpoint, the call always fail with an error:\r\n`Unable to json serialize IRNode`\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport argparse\nimport json\nimport sys\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\nfrom socketserver import ThreadingMixIn\n\nimport vyper\nfrom vyper.codegen import ir_node\nfrom vyper.evm.opcodes import DEFAULT_EVM_VERSION\nfrom vyper.exceptions import VyperException\n\n\ndef _parse_cli_args():\n return _parse_args(sys.argv[1:])\n\n\ndef _parse_args(argv):\n parser = argparse.ArgumentParser(description=\"Serve Vyper compiler as an HTTP Service\")\n parser.add_argument(\n \"--version\", action=\"version\", version=f\"{vyper.__version__}+commit{vyper.__commit__}\"\n )\n parser.add_argument(\n \"-b\",\n help=\"Address to bind JSON server on, default: localhost:8000\",\n default=\"localhost:8000\",\n dest=\"bind_address\",\n )\n\n args = parser.parse_args(argv)\n\n if \":\" in args.bind_address:\n ir_node.VYPER_COLOR_OUTPUT = False\n runserver(*args.bind_address.split(\":\"))\n else:\n print('Provide bind address in \"{address}:{port}\" format')\n\n\nclass VyperRequestHandler(BaseHTTPRequestHandler):\n def send_404(self):\n self.send_response(404)\n self.end_headers()\n return\n\n def send_cors_all(self):\n self.send_header(\"Access-Control-Allow-Origin\", \"*\")\n self.send_header(\"Access-Control-Allow-Headers\", \"X-Requested-With, Content-type\")\n\n def do_OPTIONS(self):\n self.send_response(200)\n self.send_cors_all()\n self.end_headers()\n\n def do_GET(self):\n if self.path == \"/\":\n self.send_response(200)\n self.send_cors_all()\n self.end_headers()\n self.wfile.write(f\"Vyper Compiler. 
Version: {vyper.__version__}\\n\".encode())\n else:\n self.send_404()\n\n return\n\n def do_POST(self):\n if self.path == \"/compile\":\n content_len = int(self.headers.get(\"content-length\"))\n post_body = self.rfile.read(content_len)\n data = json.loads(post_body)\n\n response, status_code = self._compile(data)\n\n self.send_response(status_code)\n self.send_header(\"Content-type\", \"application/json\")\n self.send_cors_all()\n self.end_headers()\n self.wfile.write(json.dumps(response).encode())\n\n else:\n self.send_404()\n\n return\n\n def _compile(self, data):\n code = data.get(\"code\")\n if not code:\n return {\"status\": \"failed\", \"message\": 'No \"code\" key supplied'}, 400\n if not isinstance(code, str):\n return {\"status\": \"failed\", \"message\": '\"code\" must be a non-empty string'}, 400\n\n try:\n code = data[\"code\"]\n out_dict = vyper.compile_codes(\n {\"\": code},\n list(vyper.compiler.OUTPUT_FORMATS.keys()),\n evm_version=data.get(\"evm_version\", DEFAULT_EVM_VERSION),\n )[\"\"]\n out_dict[\"ir\"] = str(out_dict[\"ir\"])\n except VyperException as e:\n return (\n {\"status\": \"failed\", \"message\": str(e), \"column\": e.col_offset, \"line\": e.lineno},\n 400,\n )\n except SyntaxError as e:\n return (\n {\"status\": \"failed\", \"message\": str(e), \"column\": e.offset, \"line\": e.lineno},\n 400,\n )\n\n out_dict.update({\"status\": \"success\"})\n\n return out_dict, 200\n\n\nclass VyperHTTPServer(ThreadingMixIn, HTTPServer):\n \"\"\"Handle requests in a separate thread.\"\"\"\n\n pass\n\n\ndef runserver(host=\"\", port=8000):\n server_address = (host, int(port))\n httpd = VyperHTTPServer(server_address, VyperRequestHandler)\n print(f\"Listening on http://{host}:{port}\")\n httpd.serve_forever()\n", "path": "vyper/cli/vyper_serve.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport argparse\nimport json\nimport sys\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\nfrom socketserver import ThreadingMixIn\n\nimport vyper\nfrom vyper.codegen import ir_node\nfrom vyper.evm.opcodes import DEFAULT_EVM_VERSION\nfrom vyper.exceptions import VyperException\n\n\ndef _parse_cli_args():\n return _parse_args(sys.argv[1:])\n\n\ndef _parse_args(argv):\n parser = argparse.ArgumentParser(description=\"Serve Vyper compiler as an HTTP Service\")\n parser.add_argument(\n \"--version\", action=\"version\", version=f\"{vyper.__version__}+commit{vyper.__commit__}\"\n )\n parser.add_argument(\n \"-b\",\n help=\"Address to bind JSON server on, default: localhost:8000\",\n default=\"localhost:8000\",\n dest=\"bind_address\",\n )\n\n args = parser.parse_args(argv)\n\n if \":\" in args.bind_address:\n ir_node.VYPER_COLOR_OUTPUT = False\n runserver(*args.bind_address.split(\":\"))\n else:\n print('Provide bind address in \"{address}:{port}\" format')\n\n\nclass VyperRequestHandler(BaseHTTPRequestHandler):\n def send_404(self):\n self.send_response(404)\n self.end_headers()\n return\n\n def send_cors_all(self):\n self.send_header(\"Access-Control-Allow-Origin\", \"*\")\n self.send_header(\"Access-Control-Allow-Headers\", \"X-Requested-With, Content-type\")\n\n def do_OPTIONS(self):\n self.send_response(200)\n self.send_cors_all()\n self.end_headers()\n\n def do_GET(self):\n if self.path == \"/\":\n self.send_response(200)\n self.send_cors_all()\n self.end_headers()\n self.wfile.write(f\"Vyper Compiler. 
Version: {vyper.__version__}\\n\".encode())\n else:\n self.send_404()\n\n return\n\n def do_POST(self):\n if self.path == \"/compile\":\n content_len = int(self.headers.get(\"content-length\"))\n post_body = self.rfile.read(content_len)\n data = json.loads(post_body)\n\n response, status_code = self._compile(data)\n\n self.send_response(status_code)\n self.send_header(\"Content-type\", \"application/json\")\n self.send_cors_all()\n self.end_headers()\n self.wfile.write(json.dumps(response).encode())\n\n else:\n self.send_404()\n\n return\n\n def _compile(self, data):\n code = data.get(\"code\")\n if not code:\n return {\"status\": \"failed\", \"message\": 'No \"code\" key supplied'}, 400\n if not isinstance(code, str):\n return {\"status\": \"failed\", \"message\": '\"code\" must be a non-empty string'}, 400\n\n try:\n code = data[\"code\"]\n out_dict = vyper.compile_codes(\n {\"\": code},\n list(vyper.compiler.OUTPUT_FORMATS.keys()),\n evm_version=data.get(\"evm_version\", DEFAULT_EVM_VERSION),\n )[\"\"]\n out_dict[\"ir\"] = str(out_dict[\"ir\"])\n out_dict[\"ir_runtime\"] = str(out_dict[\"ir_runtime\"])\n except VyperException as e:\n return (\n {\"status\": \"failed\", \"message\": str(e), \"column\": e.col_offset, \"line\": e.lineno},\n 400,\n )\n except SyntaxError as e:\n return (\n {\"status\": \"failed\", \"message\": str(e), \"column\": e.offset, \"line\": e.lineno},\n 400,\n )\n\n out_dict.update({\"status\": \"success\"})\n\n return out_dict, 200\n\n\nclass VyperHTTPServer(ThreadingMixIn, HTTPServer):\n \"\"\"Handle requests in a separate thread.\"\"\"\n\n pass\n\n\ndef runserver(host=\"\", port=8000):\n server_address = (host, int(port))\n httpd = VyperHTTPServer(server_address, VyperRequestHandler)\n print(f\"Listening on http://{host}:{port}\")\n httpd.serve_forever()\n", "path": "vyper/cli/vyper_serve.py"}]}
| 1,520 | 143 |
gh_patches_debug_530
|
rasdani/github-patches
|
git_diff
|
craiga__will-of-the-prophets-35
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Mobile-friendly layout
Sidebar should drop below the board once "mornhammered" becomes too wide for the window.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `will_of_the_prophets/views.py`
Content:
```
1 """Views."""
2
3 from django.shortcuts import render
4 from django.views.generic.edit import CreateView
5 from django.contrib.auth.mixins import LoginRequiredMixin
6 from django.urls import reverse
7
8 from will_of_the_prophets import board, forms, models
9
10
11 def public_board(request):
12 """
13 Board for the public.
14
15 Does not take embargoed rolls into account.
16 """
17 special_square_types = models.SpecialSquareType.objects.all()
18 return render(request, 'will_of_the_prophets/public_board.html',
19 {'board': board.Board(),
20 'special_square_types': special_square_types})
21
22
23 class RollView(LoginRequiredMixin, CreateView):
24 """View for rolling the die."""
25
26 form_class = forms.RollForm
27 template_name = 'will_of_the_prophets/roll.html'
28
29 def get_context_data(self, **kwargs):
30 last_roll = models.Roll.objects.order_by('-embargo').first()
31 return super().get_context_data(
32 **kwargs,
33 last_roll=last_roll,
34 board=board.Board(now=last_roll.embargo),
35 special_square_types=models.SpecialSquareType.objects.all())
36
37 def get_success_url(self):
38 return reverse('roll')
39
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/will_of_the_prophets/views.py b/will_of_the_prophets/views.py
--- a/will_of_the_prophets/views.py
+++ b/will_of_the_prophets/views.py
@@ -35,4 +35,4 @@
special_square_types=models.SpecialSquareType.objects.all())
def get_success_url(self):
- return reverse('roll')
+ return reverse('roll') + "#chula"
|
{"golden_diff": "diff --git a/will_of_the_prophets/views.py b/will_of_the_prophets/views.py\n--- a/will_of_the_prophets/views.py\n+++ b/will_of_the_prophets/views.py\n@@ -35,4 +35,4 @@\n special_square_types=models.SpecialSquareType.objects.all())\n \n def get_success_url(self):\n- return reverse('roll')\n+ return reverse('roll') + \"#chula\"\n", "issue": "Mobile-friendly layout\nSidebar should drop below the board once \"mornhammered\" becomes too wide for the window.\n", "before_files": [{"content": "\"\"\"Views.\"\"\"\n\nfrom django.shortcuts import render\nfrom django.views.generic.edit import CreateView\nfrom django.contrib.auth.mixins import LoginRequiredMixin\nfrom django.urls import reverse\n\nfrom will_of_the_prophets import board, forms, models\n\n\ndef public_board(request):\n \"\"\"\n Board for the public.\n\n Does not take embargoed rolls into account.\n \"\"\"\n special_square_types = models.SpecialSquareType.objects.all()\n return render(request, 'will_of_the_prophets/public_board.html',\n {'board': board.Board(),\n 'special_square_types': special_square_types})\n\n\nclass RollView(LoginRequiredMixin, CreateView):\n \"\"\"View for rolling the die.\"\"\"\n\n form_class = forms.RollForm\n template_name = 'will_of_the_prophets/roll.html'\n\n def get_context_data(self, **kwargs):\n last_roll = models.Roll.objects.order_by('-embargo').first()\n return super().get_context_data(\n **kwargs,\n last_roll=last_roll,\n board=board.Board(now=last_roll.embargo),\n special_square_types=models.SpecialSquareType.objects.all())\n\n def get_success_url(self):\n return reverse('roll')\n", "path": "will_of_the_prophets/views.py"}], "after_files": [{"content": "\"\"\"Views.\"\"\"\n\nfrom django.shortcuts import render\nfrom django.views.generic.edit import CreateView\nfrom django.contrib.auth.mixins import LoginRequiredMixin\nfrom django.urls import reverse\n\nfrom will_of_the_prophets import board, forms, models\n\n\ndef public_board(request):\n \"\"\"\n Board for the public.\n\n Does not take embargoed rolls into account.\n \"\"\"\n special_square_types = models.SpecialSquareType.objects.all()\n return render(request, 'will_of_the_prophets/public_board.html',\n {'board': board.Board(),\n 'special_square_types': special_square_types})\n\n\nclass RollView(LoginRequiredMixin, CreateView):\n \"\"\"View for rolling the die.\"\"\"\n\n form_class = forms.RollForm\n template_name = 'will_of_the_prophets/roll.html'\n\n def get_context_data(self, **kwargs):\n last_roll = models.Roll.objects.order_by('-embargo').first()\n return super().get_context_data(\n **kwargs,\n last_roll=last_roll,\n board=board.Board(now=last_roll.embargo),\n special_square_types=models.SpecialSquareType.objects.all())\n\n def get_success_url(self):\n return reverse('roll') + \"#chula\"\n", "path": "will_of_the_prophets/views.py"}]}
| 604 | 98 |
gh_patches_debug_23496
|
rasdani/github-patches
|
git_diff
|
pypa__setuptools-2858
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[FR] distutils.errors.* is not fully re-exported
### What's the problem this feature will solve?
The [Migration Advice in PEP 0632](https://www.python.org/dev/peps/pep-0632/#migration-advice) says that `setuptools` is a sufficient replacement for `distutils.errors`. However, only `DistutilsError` and `RemovedCommandError` exist there. In `distutils`, there are:
```
In [7]: [e for e in dir(distutils.errors) if e.endswith('Error')]
Out[7]:
['CCompilerError',
'CompileError',
'DistutilsArgError',
'DistutilsByteCompileError',
'DistutilsClassError',
'DistutilsError',
'DistutilsExecError',
'DistutilsFileError',
'DistutilsGetoptError',
'DistutilsInternalError',
'DistutilsModuleError',
'DistutilsOptionError',
'DistutilsPlatformError',
'DistutilsSetupError',
'DistutilsTemplateError',
'LibError',
'LinkError',
'PreprocessError',
'UnknownFileError']
```
While all of the `Distutils`-prefixed ones derive from `DistutilsError`, the ones without the prefix are derived from `CCompilerError`, and so cannot be caught using anything exported by `setuptools`. Concretely, this makes it difficult to check for compiler flags, as [done here](https://github.com/matplotlib/matplotlib/blob/69395749a4e588eade55b90dbf095549258f4b16/setup.py#L62-L72).
### Describe the solution you'd like
Re-export all `distutils.errors.*` in `setuptools.errors`.
### Alternative Solutions
Currently, we use `distutils.errors`, but won't be able to after Python 3.12. One can use a generic `Exception` too, but that's not recommended.
Or as a complete alternative, if `setuptools` had a way to check whether a compiler supported a flag in a builtin way, then we wouldn't need to catch the `CompileError` ourselves.
### Additional context
_No response_
### Code of Conduct
- [X] I agree to follow the PSF Code of Conduct
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setuptools/errors.py`
Content:
```
1 """setuptools.errors
2
3 Provides exceptions used by setuptools modules.
4 """
5
6 from distutils.errors import DistutilsError
7
8
9 class RemovedCommandError(DistutilsError, RuntimeError):
10 """Error used for commands that have been removed in setuptools.
11
12 Since ``setuptools`` is built on ``distutils``, simply removing a command
13 from ``setuptools`` will make the behavior fall back to ``distutils``; this
14 error is raised if a command exists in ``distutils`` but has been actively
15 removed in ``setuptools``.
16 """
17
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setuptools/errors.py b/setuptools/errors.py
--- a/setuptools/errors.py
+++ b/setuptools/errors.py
@@ -3,6 +3,7 @@
Provides exceptions used by setuptools modules.
"""
+from distutils import errors as _distutils_errors
from distutils.errors import DistutilsError
@@ -14,3 +15,26 @@
error is raised if a command exists in ``distutils`` but has been actively
removed in ``setuptools``.
"""
+
+
+# Re-export errors from distutils to facilitate the migration to PEP632
+
+ByteCompileError = _distutils_errors.DistutilsByteCompileError
+CCompilerError = _distutils_errors.CCompilerError
+ClassError = _distutils_errors.DistutilsClassError
+CompileError = _distutils_errors.CompileError
+ExecError = _distutils_errors.DistutilsExecError
+FileError = _distutils_errors.DistutilsFileError
+InternalError = _distutils_errors.DistutilsInternalError
+LibError = _distutils_errors.LibError
+LinkError = _distutils_errors.LinkError
+ModuleError = _distutils_errors.DistutilsModuleError
+OptionError = _distutils_errors.DistutilsOptionError
+PlatformError = _distutils_errors.DistutilsPlatformError
+PreprocessError = _distutils_errors.PreprocessError
+SetupError = _distutils_errors.DistutilsSetupError
+TemplateError = _distutils_errors.DistutilsTemplateError
+UnknownFileError = _distutils_errors.UnknownFileError
+
+# The root error class in the hierarchy
+BaseError = _distutils_errors.DistutilsError
|
{"golden_diff": "diff --git a/setuptools/errors.py b/setuptools/errors.py\n--- a/setuptools/errors.py\n+++ b/setuptools/errors.py\n@@ -3,6 +3,7 @@\n Provides exceptions used by setuptools modules.\n \"\"\"\n \n+from distutils import errors as _distutils_errors\n from distutils.errors import DistutilsError\n \n \n@@ -14,3 +15,26 @@\n error is raised if a command exists in ``distutils`` but has been actively\n removed in ``setuptools``.\n \"\"\"\n+\n+\n+# Re-export errors from distutils to facilitate the migration to PEP632\n+\n+ByteCompileError = _distutils_errors.DistutilsByteCompileError\n+CCompilerError = _distutils_errors.CCompilerError\n+ClassError = _distutils_errors.DistutilsClassError\n+CompileError = _distutils_errors.CompileError\n+ExecError = _distutils_errors.DistutilsExecError\n+FileError = _distutils_errors.DistutilsFileError\n+InternalError = _distutils_errors.DistutilsInternalError\n+LibError = _distutils_errors.LibError\n+LinkError = _distutils_errors.LinkError\n+ModuleError = _distutils_errors.DistutilsModuleError\n+OptionError = _distutils_errors.DistutilsOptionError\n+PlatformError = _distutils_errors.DistutilsPlatformError\n+PreprocessError = _distutils_errors.PreprocessError\n+SetupError = _distutils_errors.DistutilsSetupError\n+TemplateError = _distutils_errors.DistutilsTemplateError\n+UnknownFileError = _distutils_errors.UnknownFileError\n+\n+# The root error class in the hierarchy\n+BaseError = _distutils_errors.DistutilsError\n", "issue": "[FR] distutils.errors.* is not fully re-exported\n### What's the problem this feature will solve?\n\nThe [Migration Advice in PEP 0632](https://www.python.org/dev/peps/pep-0632/#migration-advice) says that `setuptools` is a sufficient replacement for `distutils.errors`. However, only `DistutilsError` and `RemovedCommandError` exist there. In `distutils`, there are:\r\n```\r\nIn [7]: [e for e in dir(distutils.errors) if e.endswith('Error')]\r\nOut[7]: \r\n['CCompilerError',\r\n 'CompileError',\r\n 'DistutilsArgError',\r\n 'DistutilsByteCompileError',\r\n 'DistutilsClassError',\r\n 'DistutilsError',\r\n 'DistutilsExecError',\r\n 'DistutilsFileError',\r\n 'DistutilsGetoptError',\r\n 'DistutilsInternalError',\r\n 'DistutilsModuleError',\r\n 'DistutilsOptionError',\r\n 'DistutilsPlatformError',\r\n 'DistutilsSetupError',\r\n 'DistutilsTemplateError',\r\n 'LibError',\r\n 'LinkError',\r\n 'PreprocessError',\r\n 'UnknownFileError']\r\n```\r\nWhile all of the `Distutils`-prefixed ones derive from `DistutilsError`, the ones without the prefix are derived from `CCompilerError`, and so cannot be caught using anything exported by `setuptools`. Concretely, this makes it difficult to check for compiler flags, as [done here](https://github.com/matplotlib/matplotlib/blob/69395749a4e588eade55b90dbf095549258f4b16/setup.py#L62-L72).\n\n### Describe the solution you'd like\n\nRe-export all `distutils.errors.*` in `setuptools.errors`.\n\n### Alternative Solutions\n\nCurrently, we use `distutils.errors`, but won't be able to after Python 3.12. 
One can use a generic `Exception` too, but that's not recommended.\r\n\r\nOr as a complete alternative, if `setuptools` had a way to check whether a compiler supported a flag in a builtin way, then we wouldn't need to catch the `CompileError` ourselves.\n\n### Additional context\n\n_No response_\n\n### Code of Conduct\n\n- [X] I agree to follow the PSF Code of Conduct\n", "before_files": [{"content": "\"\"\"setuptools.errors\n\nProvides exceptions used by setuptools modules.\n\"\"\"\n\nfrom distutils.errors import DistutilsError\n\n\nclass RemovedCommandError(DistutilsError, RuntimeError):\n \"\"\"Error used for commands that have been removed in setuptools.\n\n Since ``setuptools`` is built on ``distutils``, simply removing a command\n from ``setuptools`` will make the behavior fall back to ``distutils``; this\n error is raised if a command exists in ``distutils`` but has been actively\n removed in ``setuptools``.\n \"\"\"\n", "path": "setuptools/errors.py"}], "after_files": [{"content": "\"\"\"setuptools.errors\n\nProvides exceptions used by setuptools modules.\n\"\"\"\n\nfrom distutils import errors as _distutils_errors\nfrom distutils.errors import DistutilsError\n\n\nclass RemovedCommandError(DistutilsError, RuntimeError):\n \"\"\"Error used for commands that have been removed in setuptools.\n\n Since ``setuptools`` is built on ``distutils``, simply removing a command\n from ``setuptools`` will make the behavior fall back to ``distutils``; this\n error is raised if a command exists in ``distutils`` but has been actively\n removed in ``setuptools``.\n \"\"\"\n\n\n# Re-export errors from distutils to facilitate the migration to PEP632\n\nByteCompileError = _distutils_errors.DistutilsByteCompileError\nCCompilerError = _distutils_errors.CCompilerError\nClassError = _distutils_errors.DistutilsClassError\nCompileError = _distutils_errors.CompileError\nExecError = _distutils_errors.DistutilsExecError\nFileError = _distutils_errors.DistutilsFileError\nInternalError = _distutils_errors.DistutilsInternalError\nLibError = _distutils_errors.LibError\nLinkError = _distutils_errors.LinkError\nModuleError = _distutils_errors.DistutilsModuleError\nOptionError = _distutils_errors.DistutilsOptionError\nPlatformError = _distutils_errors.DistutilsPlatformError\nPreprocessError = _distutils_errors.PreprocessError\nSetupError = _distutils_errors.DistutilsSetupError\nTemplateError = _distutils_errors.DistutilsTemplateError\nUnknownFileError = _distutils_errors.UnknownFileError\n\n# The root error class in the hierarchy\nBaseError = _distutils_errors.DistutilsError\n", "path": "setuptools/errors.py"}]}
| 891 | 367 |
gh_patches_debug_1250
|
rasdani/github-patches
|
git_diff
|
numba__numba-941
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update README
Two issues with our README file:
- it is not up-to-date (e.g. it mentions Cython, which we don't use anymore)
- it uses Markdown rather than reST, and therefore is badly formatted when used for the PyPI long description: https://pypi.python.org/pypi/numba
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 try:
2 # Try to use setuptools so as to enable support of the special
3 # "Microsoft Visual C++ Compiler for Python 2.7" (http://aka.ms/vcpython27)
4 # for building under Windows.
5 # Note setuptools >= 6.0 is required for this.
6 from setuptools import setup, Extension
7 except ImportError:
8 from distutils.core import setup, Extension
9
10 import sys
11 import os
12 import numpy
13 import numpy.distutils.misc_util as np_misc
14 import versioneer
15
16 versioneer.versionfile_source = 'numba/_version.py'
17 versioneer.versionfile_build = 'numba/_version.py'
18 versioneer.tag_prefix = ''
19 versioneer.parentdir_prefix = 'numba-'
20
21 cmdclass = versioneer.get_cmdclass()
22
23 setup_args = {
24 'long_description': open('README.md').read(),
25 }
26
27 GCCFLAGS = ["-std=c89", "-Wdeclaration-after-statement", "-Werror"]
28
29 if os.environ.get("NUMBA_GCC_FLAGS"):
30 CFLAGS = GCCFLAGS
31 else:
32 CFLAGS = []
33
34
35 if sys.platform == 'darwin' and sys.version_info[:2] == (2, 6):
36 cpp_link_args = ['-lstdc++']
37 else:
38 cpp_link_args = []
39
40 npymath_info = np_misc.get_info('npymath')
41
42 ext_dynfunc = Extension(name='numba._dynfunc', sources=['numba/_dynfunc.c'],
43 extra_compile_args=CFLAGS,
44 depends=["numba/_pymodule.h"])
45
46 ext_npymath_exports = Extension(name='numba._npymath_exports',
47 sources=['numba/_npymath_exports.c'],
48 include_dirs=npymath_info['include_dirs'],
49 libraries=npymath_info['libraries'],
50 library_dirs=npymath_info['library_dirs'],
51 define_macros=npymath_info['define_macros'])
52
53
54 ext_dispatcher = Extension(name="numba._dispatcher",
55 include_dirs=[numpy.get_include()],
56 sources=['numba/_dispatcher.c',
57 'numba/_dispatcherimpl.cpp',
58 'numba/typeconv/typeconv.cpp'],
59 depends=["numba/_pymodule.h",
60 "numba/_dispatcher.h"],
61 extra_link_args=cpp_link_args)
62
63 ext_helperlib = Extension(name="numba._helperlib",
64 include_dirs=[numpy.get_include()],
65 sources=["numba/_helperlib.c", "numba/_math_c99.c"],
66 extra_compile_args=CFLAGS,
67 depends=["numba/_pymodule.h",
68 "numba/_math_c99.h",
69 "numba/mathnames.inc"])
70
71 ext_typeconv = Extension(name="numba.typeconv._typeconv",
72 sources=["numba/typeconv/typeconv.cpp",
73 "numba/typeconv/_typeconv.cpp"],
74 depends=["numba/_pymodule.h"],
75 extra_link_args=cpp_link_args)
76
77 ext_npyufunc_ufunc = Extension(name="numba.npyufunc._internal",
78 sources=["numba/npyufunc/_internal.c"],
79 include_dirs=[numpy.get_include()],
80 depends=["numba/npyufunc/_ufunc.c",
81 "numba/npyufunc/_internal.h",
82 "numba/_pymodule.h"])
83
84 ext_mviewbuf = Extension(name='numba.mviewbuf',
85 sources=['numba/mviewbuf.c'])
86
87 ext_modules = [ext_dynfunc, ext_npymath_exports, ext_dispatcher,
88 ext_helperlib, ext_typeconv, ext_npyufunc_ufunc, ext_mviewbuf]
89
90 packages = [
91 "numba",
92 "numba.targets",
93 "numba.tests",
94 "numba.typing",
95 "numba.typeconv",
96 "numba.npyufunc",
97 "numba.pycc",
98 "numba.servicelib",
99 "numba.cuda",
100 "numba.cuda.cudadrv",
101 "numba.cuda.tests",
102 "numba.cuda.tests.cudadrv",
103 "numba.cuda.tests.cudadrv.data",
104 "numba.cuda.tests.cudapy",
105 ]
106
107 setup(name='numba',
108 description="compiling Python code using LLVM",
109 version=versioneer.get_version(),
110
111 classifiers=[
112 "Development Status :: 4 - Beta",
113 "Intended Audience :: Developers",
114 "License :: OSI Approved :: BSD License",
115 "Operating System :: OS Independent",
116 "Programming Language :: Python",
117 "Programming Language :: Python :: 2.6",
118 "Programming Language :: Python :: 2.7",
119 "Programming Language :: Python :: 3.3",
120 "Programming Language :: Python :: 3.4",
121 "Topic :: Software Development :: Compilers",
122 ],
123 package_data={
124 "numba": ["*.c", "*.h", "*.cpp", "*.inc"],
125 "numba.npyufunc": ["*.c", "*.h"],
126 "numba.typeconv": ["*.cpp", "*.hpp"],
127 "numba.cuda.tests.cudadrv.data": ["*.ptx"],
128 },
129 scripts=["numba/pycc/pycc", "bin/numba"],
130 author="Continuum Analytics, Inc.",
131 author_email="[email protected]",
132 url="http://numba.github.com",
133 ext_modules=ext_modules,
134 packages=packages,
135 license="BSD",
136 cmdclass=cmdclass,
137 **setup_args)
138
139
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -21,7 +21,7 @@
cmdclass = versioneer.get_cmdclass()
setup_args = {
- 'long_description': open('README.md').read(),
+ 'long_description': open('README.rst').read(),
}
GCCFLAGS = ["-std=c89", "-Wdeclaration-after-statement", "-Werror"]
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -21,7 +21,7 @@\n cmdclass = versioneer.get_cmdclass()\n \n setup_args = {\n- 'long_description': open('README.md').read(),\n+ 'long_description': open('README.rst').read(),\n }\n \n GCCFLAGS = [\"-std=c89\", \"-Wdeclaration-after-statement\", \"-Werror\"]\n", "issue": "Update README\nTwo issues with our README file:\n- it is not up-to-date (e.g. it mentions Cython, which we don't use anymore)\n- it uses Markdown rather than reST, and therefore is badly formatted when used for the PyPI long description: https://pypi.python.org/pypi/numba\n\n", "before_files": [{"content": "try:\n # Try to use setuptools so as to enable support of the special\n # \"Microsoft Visual C++ Compiler for Python 2.7\" (http://aka.ms/vcpython27)\n # for building under Windows.\n # Note setuptools >= 6.0 is required for this.\n from setuptools import setup, Extension\nexcept ImportError:\n from distutils.core import setup, Extension\n\nimport sys\nimport os\nimport numpy\nimport numpy.distutils.misc_util as np_misc\nimport versioneer\n\nversioneer.versionfile_source = 'numba/_version.py'\nversioneer.versionfile_build = 'numba/_version.py'\nversioneer.tag_prefix = ''\nversioneer.parentdir_prefix = 'numba-'\n\ncmdclass = versioneer.get_cmdclass()\n\nsetup_args = {\n 'long_description': open('README.md').read(),\n}\n\nGCCFLAGS = [\"-std=c89\", \"-Wdeclaration-after-statement\", \"-Werror\"]\n\nif os.environ.get(\"NUMBA_GCC_FLAGS\"):\n CFLAGS = GCCFLAGS\nelse:\n CFLAGS = []\n\n\nif sys.platform == 'darwin' and sys.version_info[:2] == (2, 6):\n cpp_link_args = ['-lstdc++']\nelse:\n cpp_link_args = []\n\nnpymath_info = np_misc.get_info('npymath')\n\next_dynfunc = Extension(name='numba._dynfunc', sources=['numba/_dynfunc.c'],\n extra_compile_args=CFLAGS,\n depends=[\"numba/_pymodule.h\"])\n\next_npymath_exports = Extension(name='numba._npymath_exports',\n sources=['numba/_npymath_exports.c'],\n include_dirs=npymath_info['include_dirs'],\n libraries=npymath_info['libraries'],\n library_dirs=npymath_info['library_dirs'],\n define_macros=npymath_info['define_macros'])\n\n\next_dispatcher = Extension(name=\"numba._dispatcher\",\n include_dirs=[numpy.get_include()],\n sources=['numba/_dispatcher.c',\n 'numba/_dispatcherimpl.cpp',\n 'numba/typeconv/typeconv.cpp'],\n depends=[\"numba/_pymodule.h\",\n \"numba/_dispatcher.h\"],\n extra_link_args=cpp_link_args)\n\next_helperlib = Extension(name=\"numba._helperlib\",\n include_dirs=[numpy.get_include()],\n sources=[\"numba/_helperlib.c\", \"numba/_math_c99.c\"],\n extra_compile_args=CFLAGS,\n depends=[\"numba/_pymodule.h\",\n \"numba/_math_c99.h\",\n \"numba/mathnames.inc\"])\n\next_typeconv = Extension(name=\"numba.typeconv._typeconv\",\n sources=[\"numba/typeconv/typeconv.cpp\",\n \"numba/typeconv/_typeconv.cpp\"],\n depends=[\"numba/_pymodule.h\"],\n extra_link_args=cpp_link_args)\n\next_npyufunc_ufunc = Extension(name=\"numba.npyufunc._internal\",\n sources=[\"numba/npyufunc/_internal.c\"],\n include_dirs=[numpy.get_include()],\n depends=[\"numba/npyufunc/_ufunc.c\",\n \"numba/npyufunc/_internal.h\",\n \"numba/_pymodule.h\"])\n\next_mviewbuf = Extension(name='numba.mviewbuf',\n sources=['numba/mviewbuf.c'])\n\next_modules = [ext_dynfunc, ext_npymath_exports, ext_dispatcher,\n ext_helperlib, ext_typeconv, ext_npyufunc_ufunc, ext_mviewbuf]\n\npackages = [\n \"numba\",\n \"numba.targets\",\n \"numba.tests\",\n \"numba.typing\",\n \"numba.typeconv\",\n \"numba.npyufunc\",\n \"numba.pycc\",\n 
\"numba.servicelib\",\n \"numba.cuda\",\n \"numba.cuda.cudadrv\",\n \"numba.cuda.tests\",\n \"numba.cuda.tests.cudadrv\",\n \"numba.cuda.tests.cudadrv.data\",\n \"numba.cuda.tests.cudapy\",\n]\n\nsetup(name='numba',\n description=\"compiling Python code using LLVM\",\n version=versioneer.get_version(),\n\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2.6\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3.3\",\n \"Programming Language :: Python :: 3.4\",\n \"Topic :: Software Development :: Compilers\",\n ],\n package_data={\n \"numba\": [\"*.c\", \"*.h\", \"*.cpp\", \"*.inc\"],\n \"numba.npyufunc\": [\"*.c\", \"*.h\"],\n \"numba.typeconv\": [\"*.cpp\", \"*.hpp\"],\n \"numba.cuda.tests.cudadrv.data\": [\"*.ptx\"],\n },\n scripts=[\"numba/pycc/pycc\", \"bin/numba\"],\n author=\"Continuum Analytics, Inc.\",\n author_email=\"[email protected]\",\n url=\"http://numba.github.com\",\n ext_modules=ext_modules,\n packages=packages,\n license=\"BSD\",\n cmdclass=cmdclass,\n **setup_args)\n\n", "path": "setup.py"}], "after_files": [{"content": "try:\n # Try to use setuptools so as to enable support of the special\n # \"Microsoft Visual C++ Compiler for Python 2.7\" (http://aka.ms/vcpython27)\n # for building under Windows.\n # Note setuptools >= 6.0 is required for this.\n from setuptools import setup, Extension\nexcept ImportError:\n from distutils.core import setup, Extension\n\nimport sys\nimport os\nimport numpy\nimport numpy.distutils.misc_util as np_misc\nimport versioneer\n\nversioneer.versionfile_source = 'numba/_version.py'\nversioneer.versionfile_build = 'numba/_version.py'\nversioneer.tag_prefix = ''\nversioneer.parentdir_prefix = 'numba-'\n\ncmdclass = versioneer.get_cmdclass()\n\nsetup_args = {\n 'long_description': open('README.rst').read(),\n}\n\nGCCFLAGS = [\"-std=c89\", \"-Wdeclaration-after-statement\", \"-Werror\"]\n\nif os.environ.get(\"NUMBA_GCC_FLAGS\"):\n CFLAGS = GCCFLAGS\nelse:\n CFLAGS = []\n\n\nif sys.platform == 'darwin' and sys.version_info[:2] == (2, 6):\n cpp_link_args = ['-lstdc++']\nelse:\n cpp_link_args = []\n\nnpymath_info = np_misc.get_info('npymath')\n\next_dynfunc = Extension(name='numba._dynfunc', sources=['numba/_dynfunc.c'],\n extra_compile_args=CFLAGS,\n depends=[\"numba/_pymodule.h\"])\n\next_npymath_exports = Extension(name='numba._npymath_exports',\n sources=['numba/_npymath_exports.c'],\n include_dirs=npymath_info['include_dirs'],\n libraries=npymath_info['libraries'],\n library_dirs=npymath_info['library_dirs'],\n define_macros=npymath_info['define_macros'])\n\n\next_dispatcher = Extension(name=\"numba._dispatcher\",\n include_dirs=[numpy.get_include()],\n sources=['numba/_dispatcher.c',\n 'numba/_dispatcherimpl.cpp',\n 'numba/typeconv/typeconv.cpp'],\n depends=[\"numba/_pymodule.h\",\n \"numba/_dispatcher.h\"],\n extra_link_args=cpp_link_args)\n\next_helperlib = Extension(name=\"numba._helperlib\",\n include_dirs=[numpy.get_include()],\n sources=[\"numba/_helperlib.c\", \"numba/_math_c99.c\"],\n extra_compile_args=CFLAGS,\n depends=[\"numba/_pymodule.h\",\n \"numba/_math_c99.h\",\n \"numba/mathnames.inc\"])\n\next_typeconv = Extension(name=\"numba.typeconv._typeconv\",\n sources=[\"numba/typeconv/typeconv.cpp\",\n \"numba/typeconv/_typeconv.cpp\"],\n depends=[\"numba/_pymodule.h\"],\n 
extra_link_args=cpp_link_args)\n\next_npyufunc_ufunc = Extension(name=\"numba.npyufunc._internal\",\n sources=[\"numba/npyufunc/_internal.c\"],\n include_dirs=[numpy.get_include()],\n depends=[\"numba/npyufunc/_ufunc.c\",\n \"numba/npyufunc/_internal.h\",\n \"numba/_pymodule.h\"])\n\next_mviewbuf = Extension(name='numba.mviewbuf',\n sources=['numba/mviewbuf.c'])\n\next_modules = [ext_dynfunc, ext_npymath_exports, ext_dispatcher,\n ext_helperlib, ext_typeconv, ext_npyufunc_ufunc, ext_mviewbuf]\n\npackages = [\n \"numba\",\n \"numba.targets\",\n \"numba.tests\",\n \"numba.typing\",\n \"numba.typeconv\",\n \"numba.npyufunc\",\n \"numba.pycc\",\n \"numba.servicelib\",\n \"numba.cuda\",\n \"numba.cuda.cudadrv\",\n \"numba.cuda.tests\",\n \"numba.cuda.tests.cudadrv\",\n \"numba.cuda.tests.cudadrv.data\",\n \"numba.cuda.tests.cudapy\",\n]\n\nsetup(name='numba',\n description=\"compiling Python code using LLVM\",\n version=versioneer.get_version(),\n\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2.6\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3.3\",\n \"Programming Language :: Python :: 3.4\",\n \"Topic :: Software Development :: Compilers\",\n ],\n package_data={\n \"numba\": [\"*.c\", \"*.h\", \"*.cpp\", \"*.inc\"],\n \"numba.npyufunc\": [\"*.c\", \"*.h\"],\n \"numba.typeconv\": [\"*.cpp\", \"*.hpp\"],\n \"numba.cuda.tests.cudadrv.data\": [\"*.ptx\"],\n },\n scripts=[\"numba/pycc/pycc\", \"bin/numba\"],\n author=\"Continuum Analytics, Inc.\",\n author_email=\"[email protected]\",\n url=\"http://numba.github.com\",\n ext_modules=ext_modules,\n packages=packages,\n license=\"BSD\",\n cmdclass=cmdclass,\n **setup_args)\n\n", "path": "setup.py"}]}
| 1,780 | 96 |
gh_patches_debug_1141
|
rasdani/github-patches
|
git_diff
|
paperless-ngx__paperless-ngx-2280
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] cannot save Mail Rule with "mail and attachment as seperate documents" in 1.11.1
Maybe it's just me, but I cannot save Mail Rule with "mail and attachment as seperate documents".
_Originally posted by @Limerick-gh in https://github.com/paperless-ngx/paperless-ngx/discussions/2265#discussioncomment-4557234_
[Bug] Missing consumption scope options in frontend
### Discussed in https://github.com/paperless-ngx/paperless-ngx/discussions/2265
<div type='discussions-op-text'>
<sup>Originally posted by **morremeyer** December 30, 2022</sup>
With #2000, frontend configuration for mail consumption was added.
With #848, at about the same time, email body & .eml file consumption was added.
#848 added the **consumption scope** for email consumption (see https://github.com/p-h-a-i-l/paperless-ngx/blob/0fda35723d62275a5beb783cbf9061d4d4a15703/src/paperless_mail/models.py#L59-L65) to decide between consuming:
* only the attachments
* the full email as .eml
* the full email as .eml **and** the attachments
The **consumption scope** is not yet configurable on the frontend. I'd be really happy if it were configurable in the frontend in a future version.
I'm pretty sure someone already has that planned, but I couldn't find an issue or discussion for it, so I'm opening this one to track this request.</div>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/paperless_mail/serialisers.py`
Content:
```
1 from documents.serialisers import CorrespondentField
2 from documents.serialisers import DocumentTypeField
3 from documents.serialisers import TagsField
4 from paperless_mail.models import MailAccount
5 from paperless_mail.models import MailRule
6 from rest_framework import serializers
7
8
9 class ObfuscatedPasswordField(serializers.Field):
10 """
11 Sends *** string instead of password in the clear
12 """
13
14 def to_representation(self, value):
15 return "*" * len(value)
16
17 def to_internal_value(self, data):
18 return data
19
20
21 class MailAccountSerializer(serializers.ModelSerializer):
22 password = ObfuscatedPasswordField()
23
24 class Meta:
25 model = MailAccount
26 depth = 1
27 fields = [
28 "id",
29 "name",
30 "imap_server",
31 "imap_port",
32 "imap_security",
33 "username",
34 "password",
35 "character_set",
36 ]
37
38 def update(self, instance, validated_data):
39 if "password" in validated_data:
40 if len(validated_data.get("password").replace("*", "")) == 0:
41 validated_data.pop("password")
42 super().update(instance, validated_data)
43 return instance
44
45 def create(self, validated_data):
46 mail_account = MailAccount.objects.create(**validated_data)
47 return mail_account
48
49
50 class AccountField(serializers.PrimaryKeyRelatedField):
51 def get_queryset(self):
52 return MailAccount.objects.all().order_by("-id")
53
54
55 class MailRuleSerializer(serializers.ModelSerializer):
56 account = AccountField(required=True)
57 action_parameter = serializers.CharField(
58 allow_null=True,
59 required=False,
60 default="",
61 )
62 assign_correspondent = CorrespondentField(allow_null=True, required=False)
63 assign_tags = TagsField(many=True, allow_null=True, required=False)
64 assign_document_type = DocumentTypeField(allow_null=True, required=False)
65 order = serializers.IntegerField(required=False)
66
67 class Meta:
68 model = MailRule
69 depth = 1
70 fields = [
71 "id",
72 "name",
73 "account",
74 "folder",
75 "filter_from",
76 "filter_subject",
77 "filter_body",
78 "filter_attachment_filename",
79 "maximum_age",
80 "action",
81 "action_parameter",
82 "assign_title_from",
83 "assign_tags",
84 "assign_correspondent_from",
85 "assign_correspondent",
86 "assign_document_type",
87 "order",
88 "attachment_type",
89 ]
90
91 def update(self, instance, validated_data):
92 super().update(instance, validated_data)
93 return instance
94
95 def create(self, validated_data):
96 if "assign_tags" in validated_data:
97 assign_tags = validated_data.pop("assign_tags")
98 mail_rule = MailRule.objects.create(**validated_data)
99 if assign_tags:
100 mail_rule.assign_tags.set(assign_tags)
101 return mail_rule
102
103 def validate(self, attrs):
104 if (
105 attrs["action"] == MailRule.MailAction.TAG
106 or attrs["action"] == MailRule.MailAction.MOVE
107 ) and attrs["action_parameter"] is None:
108 raise serializers.ValidationError("An action parameter is required.")
109
110 return attrs
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/paperless_mail/serialisers.py b/src/paperless_mail/serialisers.py
--- a/src/paperless_mail/serialisers.py
+++ b/src/paperless_mail/serialisers.py
@@ -86,6 +86,7 @@
"assign_document_type",
"order",
"attachment_type",
+ "consumption_scope",
]
def update(self, instance, validated_data):
|
{"golden_diff": "diff --git a/src/paperless_mail/serialisers.py b/src/paperless_mail/serialisers.py\n--- a/src/paperless_mail/serialisers.py\n+++ b/src/paperless_mail/serialisers.py\n@@ -86,6 +86,7 @@\n \"assign_document_type\",\n \"order\",\n \"attachment_type\",\n+ \"consumption_scope\",\n ]\n \n def update(self, instance, validated_data):\n", "issue": "[Bug] cannot save Mail Rule with \"mail and attachment as seperate documents\" in 1.11.1\n Maybe it's just me, but I cannot save Mail Rule with \"mail and attachment as seperate documents\".\n\n_Originally posted by @Limerick-gh in https://github.com/paperless-ngx/paperless-ngx/discussions/2265#discussioncomment-4557234_\n \n[Bug] Missing consumption scope options in frontend\n### Discussed in https://github.com/paperless-ngx/paperless-ngx/discussions/2265\r\n\r\n<div type='discussions-op-text'>\r\n\r\n<sup>Originally posted by **morremeyer** December 30, 2022</sup>\r\nWith #2000, frontend configuration for mail consumption was added.\r\nWith #848, at about the same time, email body & .eml file consumption was added.\r\n\r\n#848 added the **consumption scope** for email consumption (see https://github.com/p-h-a-i-l/paperless-ngx/blob/0fda35723d62275a5beb783cbf9061d4d4a15703/src/paperless_mail/models.py#L59-L65) to decide between consuming:\r\n\r\n* only the attachments\r\n* the full email as .eml\r\n* the full email as .eml **and** the attachments\r\n\r\nThe **consumption scope** is not yet configurable on the frontend. I'd be really happy if it were configurable in the frontend in a future version.\r\n\r\nI'm pretty sure someone already has that planned, but I couldn't find an issue or discussion for it, so I'm opening this one to track this request.</div>\n", "before_files": [{"content": "from documents.serialisers import CorrespondentField\nfrom documents.serialisers import DocumentTypeField\nfrom documents.serialisers import TagsField\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\nfrom rest_framework import serializers\n\n\nclass ObfuscatedPasswordField(serializers.Field):\n \"\"\"\n Sends *** string instead of password in the clear\n \"\"\"\n\n def to_representation(self, value):\n return \"*\" * len(value)\n\n def to_internal_value(self, data):\n return data\n\n\nclass MailAccountSerializer(serializers.ModelSerializer):\n password = ObfuscatedPasswordField()\n\n class Meta:\n model = MailAccount\n depth = 1\n fields = [\n \"id\",\n \"name\",\n \"imap_server\",\n \"imap_port\",\n \"imap_security\",\n \"username\",\n \"password\",\n \"character_set\",\n ]\n\n def update(self, instance, validated_data):\n if \"password\" in validated_data:\n if len(validated_data.get(\"password\").replace(\"*\", \"\")) == 0:\n validated_data.pop(\"password\")\n super().update(instance, validated_data)\n return instance\n\n def create(self, validated_data):\n mail_account = MailAccount.objects.create(**validated_data)\n return mail_account\n\n\nclass AccountField(serializers.PrimaryKeyRelatedField):\n def get_queryset(self):\n return MailAccount.objects.all().order_by(\"-id\")\n\n\nclass MailRuleSerializer(serializers.ModelSerializer):\n account = AccountField(required=True)\n action_parameter = serializers.CharField(\n allow_null=True,\n required=False,\n default=\"\",\n )\n assign_correspondent = CorrespondentField(allow_null=True, required=False)\n assign_tags = TagsField(many=True, allow_null=True, required=False)\n assign_document_type = DocumentTypeField(allow_null=True, required=False)\n order = 
serializers.IntegerField(required=False)\n\n class Meta:\n model = MailRule\n depth = 1\n fields = [\n \"id\",\n \"name\",\n \"account\",\n \"folder\",\n \"filter_from\",\n \"filter_subject\",\n \"filter_body\",\n \"filter_attachment_filename\",\n \"maximum_age\",\n \"action\",\n \"action_parameter\",\n \"assign_title_from\",\n \"assign_tags\",\n \"assign_correspondent_from\",\n \"assign_correspondent\",\n \"assign_document_type\",\n \"order\",\n \"attachment_type\",\n ]\n\n def update(self, instance, validated_data):\n super().update(instance, validated_data)\n return instance\n\n def create(self, validated_data):\n if \"assign_tags\" in validated_data:\n assign_tags = validated_data.pop(\"assign_tags\")\n mail_rule = MailRule.objects.create(**validated_data)\n if assign_tags:\n mail_rule.assign_tags.set(assign_tags)\n return mail_rule\n\n def validate(self, attrs):\n if (\n attrs[\"action\"] == MailRule.MailAction.TAG\n or attrs[\"action\"] == MailRule.MailAction.MOVE\n ) and attrs[\"action_parameter\"] is None:\n raise serializers.ValidationError(\"An action parameter is required.\")\n\n return attrs\n", "path": "src/paperless_mail/serialisers.py"}], "after_files": [{"content": "from documents.serialisers import CorrespondentField\nfrom documents.serialisers import DocumentTypeField\nfrom documents.serialisers import TagsField\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\nfrom rest_framework import serializers\n\n\nclass ObfuscatedPasswordField(serializers.Field):\n \"\"\"\n Sends *** string instead of password in the clear\n \"\"\"\n\n def to_representation(self, value):\n return \"*\" * len(value)\n\n def to_internal_value(self, data):\n return data\n\n\nclass MailAccountSerializer(serializers.ModelSerializer):\n password = ObfuscatedPasswordField()\n\n class Meta:\n model = MailAccount\n depth = 1\n fields = [\n \"id\",\n \"name\",\n \"imap_server\",\n \"imap_port\",\n \"imap_security\",\n \"username\",\n \"password\",\n \"character_set\",\n ]\n\n def update(self, instance, validated_data):\n if \"password\" in validated_data:\n if len(validated_data.get(\"password\").replace(\"*\", \"\")) == 0:\n validated_data.pop(\"password\")\n super().update(instance, validated_data)\n return instance\n\n def create(self, validated_data):\n mail_account = MailAccount.objects.create(**validated_data)\n return mail_account\n\n\nclass AccountField(serializers.PrimaryKeyRelatedField):\n def get_queryset(self):\n return MailAccount.objects.all().order_by(\"-id\")\n\n\nclass MailRuleSerializer(serializers.ModelSerializer):\n account = AccountField(required=True)\n action_parameter = serializers.CharField(\n allow_null=True,\n required=False,\n default=\"\",\n )\n assign_correspondent = CorrespondentField(allow_null=True, required=False)\n assign_tags = TagsField(many=True, allow_null=True, required=False)\n assign_document_type = DocumentTypeField(allow_null=True, required=False)\n order = serializers.IntegerField(required=False)\n\n class Meta:\n model = MailRule\n depth = 1\n fields = [\n \"id\",\n \"name\",\n \"account\",\n \"folder\",\n \"filter_from\",\n \"filter_subject\",\n \"filter_body\",\n \"filter_attachment_filename\",\n \"maximum_age\",\n \"action\",\n \"action_parameter\",\n \"assign_title_from\",\n \"assign_tags\",\n \"assign_correspondent_from\",\n \"assign_correspondent\",\n \"assign_document_type\",\n \"order\",\n \"attachment_type\",\n \"consumption_scope\",\n ]\n\n def update(self, instance, validated_data):\n super().update(instance, 
validated_data)\n return instance\n\n def create(self, validated_data):\n if \"assign_tags\" in validated_data:\n assign_tags = validated_data.pop(\"assign_tags\")\n mail_rule = MailRule.objects.create(**validated_data)\n if assign_tags:\n mail_rule.assign_tags.set(assign_tags)\n return mail_rule\n\n def validate(self, attrs):\n if (\n attrs[\"action\"] == MailRule.MailAction.TAG\n or attrs[\"action\"] == MailRule.MailAction.MOVE\n ) and attrs[\"action_parameter\"] is None:\n raise serializers.ValidationError(\"An action parameter is required.\")\n\n return attrs\n", "path": "src/paperless_mail/serialisers.py"}]}
| 1,522 | 95 |
gh_patches_debug_17612
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-6739
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Poundland spider returning Pep&Co only stores as both Poundland and Pep&Co
Poundland has a clothing brand called "Pep&Co", and typically their larger stores have a "Pep&Co" area inside, looking a bit like a concession in a department store. In their storefinder data, these stores are indicated by a "Pep Shop" flag. The ATP poundland.py spider looks for this, and when it's found generates an additional "Pep&Co" store item in addition to the "Poundland" one.
So far so good. However, it appears that there are also a few (about 14) "Pep&Co" stores that aren't also regular Poundland stores. For example, these can occur in shopping centres, where Poundland is renting two nearby units and using one as a regular Poundland store (without Pep&Co) and the other as a Pep&Co only store. Currently ATP is returning a "Poundland" entry for the "Pep&Co" only store. Since there's a separate entry in the storefinder for the actual Poundland store, this leads to duplicates.
Here's an example mapped in OSM:
Poundland - https://www.openstreetmap.org/node/11293224534 - CV21 2JT
Pep&Co - https://www.openstreetmap.org/node/11293224520 - CV21 2JS
Another (unmapped example) is two stores with postcode TS8 0TJ.
I think these "Pep&Co" only stores can be detected by the branch name (stored in item["branch"]) starting with "Pep & Co". I guess we should test for this, and then not yield the final item if it's found. (In case there's any inconsistency in their data, it would probably be good to add the same test to the` if "Pep Shop" in ` line as an alternative. This would ensure that we don't completely drop a branch who's name starts with "Pep&Co" but doesn't have the "Pep shop" flag set.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/poundland.py`
Content:
```
1 from locations.categories import Extras, apply_yes_no
2 from locations.items import Feature
3 from locations.storefinders.woosmap import WoosmapSpider
4
5
6 class PoundlandSpider(WoosmapSpider):
7 name = "poundland"
8 item_attributes = {"brand": "Poundland", "brand_wikidata": "Q1434528"}
9 key = "woos-4108db5c-39f8-360b-9b7e-102c38034b94"
10 origin = "https://www.poundland.co.uk"
11
12 def parse_item(self, item: Feature, feature: dict, **kwargs):
13 item["branch"] = item.pop("name")
14
15 if "Pep Shop" in feature["properties"]["tags"]:
16 pep = item.deepcopy()
17
18 pep["ref"] = pep["ref"] + "_pep"
19
20 pep["brand"] = "Pep&Co"
21 pep["brand_wikidata"] = "Q24908166"
22
23 pep["located_in"] = self.item_attributes["brand"]
24 pep["located_in_wikidata"] = self.item_attributes["brand_wikidata"]
25
26 yield pep
27
28 apply_yes_no(Extras.ATM, item, "ATM" in feature["properties"]["tags"])
29 item["extras"]["icestore"] = "yes" if "Ice Store" in feature["properties"]["tags"] else "no"
30
31 yield item
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/poundland.py b/locations/spiders/poundland.py
--- a/locations/spiders/poundland.py
+++ b/locations/spiders/poundland.py
@@ -12,7 +12,7 @@
def parse_item(self, item: Feature, feature: dict, **kwargs):
item["branch"] = item.pop("name")
- if "Pep Shop" in feature["properties"]["tags"]:
+ if "Pep Shop" in feature["properties"]["tags"] or item["branch"].startswith("Pep & Co "):
pep = item.deepcopy()
pep["ref"] = pep["ref"] + "_pep"
@@ -25,6 +25,9 @@
yield pep
+ if item["branch"].startswith("Pep & Co "):
+ return
+
apply_yes_no(Extras.ATM, item, "ATM" in feature["properties"]["tags"])
item["extras"]["icestore"] = "yes" if "Ice Store" in feature["properties"]["tags"] else "no"
|
{"golden_diff": "diff --git a/locations/spiders/poundland.py b/locations/spiders/poundland.py\n--- a/locations/spiders/poundland.py\n+++ b/locations/spiders/poundland.py\n@@ -12,7 +12,7 @@\n def parse_item(self, item: Feature, feature: dict, **kwargs):\n item[\"branch\"] = item.pop(\"name\")\n \n- if \"Pep Shop\" in feature[\"properties\"][\"tags\"]:\n+ if \"Pep Shop\" in feature[\"properties\"][\"tags\"] or item[\"branch\"].startswith(\"Pep & Co \"):\n pep = item.deepcopy()\n \n pep[\"ref\"] = pep[\"ref\"] + \"_pep\"\n@@ -25,6 +25,9 @@\n \n yield pep\n \n+ if item[\"branch\"].startswith(\"Pep & Co \"):\n+ return\n+\n apply_yes_no(Extras.ATM, item, \"ATM\" in feature[\"properties\"][\"tags\"])\n item[\"extras\"][\"icestore\"] = \"yes\" if \"Ice Store\" in feature[\"properties\"][\"tags\"] else \"no\"\n", "issue": "Poundland spider returning Pep&Co only stores as both Poundland and Pep&Co\nPoundland has a clothing brand called \"Pep&Co\", and typically their larger stores have a \"Pep&Co\" area inside, looking a bit like a concession in a department store. In their storefinder data, these stores are indicated by a \"Pep Shop\" flag. The ATP poundland.py spider looks for this, and when it's found generates an additional \"Pep&Co\" store item in addition to the \"Poundland\" one.\r\n\r\nSo far so good. However, it appears that there are also a few (about 14) \"Pep&Co\" stores that aren't also regular Poundland stores. For example, these can occur in shopping centres, where Poundland is renting two nearby units and using one as a regular Poundland store (without Pep&Co) and the other as a Pep&Co only store. Currently ATP is returning a \"Poundland\" entry for the \"Pep&Co\" only store. Since there's a separate entry in the storefinder for the actual Poundland store, this leads to duplicates.\r\n\r\nHere's an example mapped in OSM:\r\nPoundland - https://www.openstreetmap.org/node/11293224534 - CV21 2JT\r\nPep&Co - https://www.openstreetmap.org/node/11293224520 - CV21 2JS\r\n\r\nAnother (unmapped example) is two stores with postcode TS8 0TJ.\r\n\r\nI think these \"Pep&Co\" only stores can be detected by the branch name (stored in item[\"branch\"]) starting with \"Pep & Co\". I guess we should test for this, and then not yield the final item if it's found. (In case there's any inconsistency in their data, it would probably be good to add the same test to the` if \"Pep Shop\" in ` line as an alternative. 
This would ensure that we don't completely drop a branch who's name starts with \"Pep&Co\" but doesn't have the \"Pep shop\" flag set.\n", "before_files": [{"content": "from locations.categories import Extras, apply_yes_no\nfrom locations.items import Feature\nfrom locations.storefinders.woosmap import WoosmapSpider\n\n\nclass PoundlandSpider(WoosmapSpider):\n name = \"poundland\"\n item_attributes = {\"brand\": \"Poundland\", \"brand_wikidata\": \"Q1434528\"}\n key = \"woos-4108db5c-39f8-360b-9b7e-102c38034b94\"\n origin = \"https://www.poundland.co.uk\"\n\n def parse_item(self, item: Feature, feature: dict, **kwargs):\n item[\"branch\"] = item.pop(\"name\")\n\n if \"Pep Shop\" in feature[\"properties\"][\"tags\"]:\n pep = item.deepcopy()\n\n pep[\"ref\"] = pep[\"ref\"] + \"_pep\"\n\n pep[\"brand\"] = \"Pep&Co\"\n pep[\"brand_wikidata\"] = \"Q24908166\"\n\n pep[\"located_in\"] = self.item_attributes[\"brand\"]\n pep[\"located_in_wikidata\"] = self.item_attributes[\"brand_wikidata\"]\n\n yield pep\n\n apply_yes_no(Extras.ATM, item, \"ATM\" in feature[\"properties\"][\"tags\"])\n item[\"extras\"][\"icestore\"] = \"yes\" if \"Ice Store\" in feature[\"properties\"][\"tags\"] else \"no\"\n\n yield item\n", "path": "locations/spiders/poundland.py"}], "after_files": [{"content": "from locations.categories import Extras, apply_yes_no\nfrom locations.items import Feature\nfrom locations.storefinders.woosmap import WoosmapSpider\n\n\nclass PoundlandSpider(WoosmapSpider):\n name = \"poundland\"\n item_attributes = {\"brand\": \"Poundland\", \"brand_wikidata\": \"Q1434528\"}\n key = \"woos-4108db5c-39f8-360b-9b7e-102c38034b94\"\n origin = \"https://www.poundland.co.uk\"\n\n def parse_item(self, item: Feature, feature: dict, **kwargs):\n item[\"branch\"] = item.pop(\"name\")\n\n if \"Pep Shop\" in feature[\"properties\"][\"tags\"] or item[\"branch\"].startswith(\"Pep & Co \"):\n pep = item.deepcopy()\n\n pep[\"ref\"] = pep[\"ref\"] + \"_pep\"\n\n pep[\"brand\"] = \"Pep&Co\"\n pep[\"brand_wikidata\"] = \"Q24908166\"\n\n pep[\"located_in\"] = self.item_attributes[\"brand\"]\n pep[\"located_in_wikidata\"] = self.item_attributes[\"brand_wikidata\"]\n\n yield pep\n\n if item[\"branch\"].startswith(\"Pep & Co \"):\n return\n\n apply_yes_no(Extras.ATM, item, \"ATM\" in feature[\"properties\"][\"tags\"])\n item[\"extras\"][\"icestore\"] = \"yes\" if \"Ice Store\" in feature[\"properties\"][\"tags\"] else \"no\"\n\n yield item\n", "path": "locations/spiders/poundland.py"}]}
| 1,106 | 239 |
gh_patches_debug_8907
|
rasdani/github-patches
|
git_diff
|
pymedusa__Medusa-1035
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
FINDSUBTITLES :: [3824246] Failed to refine video Traceback (most recent call last)
### Before submitting your issue:
Enable debug logging in SickRage settings, reproduce the error (be sure to disable after the bug is fixed)
Branch/Commit: develop/3824246
OS: Linux
What you did: nothing
What happened: ~~rato appeared~~ warning
What you expected: no warning
Logs:
```
2016-09-05 17:55:06 WARNING FINDSUBTITLES :: [3824246] Failed to refine video
Traceback (most recent call last):
File "/home/**********/.sickrage/lib/subliminal/core.py", line 543, in refine
refiner_manager[refiner].plugin(video, **kwargs)
File "/home/**********/.sickrage/sickbeard/refiners/tvepisode.py", line 62, in refine
enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)
KeyError: u'screen_size'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sickbeard/refiners/tvepisode.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """TVEpisode refiner."""
3 from __future__ import unicode_literals
4
5 import logging
6 import re
7
8 from subliminal.video import Episode
9
10 from ..common import Quality
11
12 logger = logging.getLogger(__name__)
13
14 SHOW_MAPPING = {
15 'series_tvdb_id': 'tvdb_id',
16 'series_imdb_id': 'imdbid',
17 'year': 'startyear'
18 }
19
20 EPISODE_MAPPING = {
21 'tvdb_id': 'tvdb_id',
22 'episode': 'episode',
23 'season': 'season',
24 'size': 'file_size',
25 'title': 'name',
26 }
27
28 series_re = re.compile(r'^(?P<series>.*?)(?: \((?:(?P<year>\d{4})|(?P<country>[A-Z]{2}))\))?$')
29
30
31 def refine(video, tv_episode=None, **kwargs):
32 """Refine a video by using TVEpisode information.
33
34 :param video: the video to refine.
35 :type video: Episode
36 :param tv_episode: the TVEpisode to be used.
37 :type tv_episode: sickbeard.tv.TVEpisode
38 :param kwargs:
39 """
40 if video.series_tvdb_id and video.tvdb_id:
41 logger.debug('No need to refine with TVEpisode')
42 return
43
44 if not tv_episode:
45 logger.debug('No TVEpisode to be used to refine')
46 return
47
48 if not isinstance(video, Episode):
49 logger.debug('Video {name} is not an episode. Skipping refiner...', name=video.name)
50 return
51
52 if tv_episode.show:
53 logger.debug('Refining using TVShow information.')
54 series, year, country = series_re.match(tv_episode.show.name).groups()
55 enrich({'series': series, 'year': int(year) if year else None}, video)
56 enrich(SHOW_MAPPING, video, tv_episode.show)
57
58 logger.debug('Refining using TVEpisode information.')
59 enrich(EPISODE_MAPPING, video, tv_episode)
60 enrich({'release_group': tv_episode.release_group}, video, overwrite=False)
61 guess = Quality.to_guessit(tv_episode.status)
62 enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)
63
64
65 def enrich(attributes, target, source=None, overwrite=True):
66 """Copy attributes from source to target.
67
68 :param attributes: the attributes mapping
69 :type attributes: dict(str -> str)
70 :param target: the target object
71 :param source: the source object. If None, the value in attributes dict will be used as new_value
72 :param overwrite: if source field should be overwritten if not already set
73 :type overwrite: bool
74 """
75 for key, value in attributes.items():
76 old_value = getattr(target, key)
77 if old_value and not overwrite:
78 continue
79
80 new_value = getattr(source, value) if source else value
81
82 if new_value and old_value != new_value:
83 setattr(target, key, new_value)
84 logger.debug('Attribute {key} changed from {old} to {new}', key=key, old=old_value, new=new_value)
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sickbeard/refiners/tvepisode.py b/sickbeard/refiners/tvepisode.py
--- a/sickbeard/refiners/tvepisode.py
+++ b/sickbeard/refiners/tvepisode.py
@@ -59,7 +59,7 @@
enrich(EPISODE_MAPPING, video, tv_episode)
enrich({'release_group': tv_episode.release_group}, video, overwrite=False)
guess = Quality.to_guessit(tv_episode.status)
- enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)
+ enrich({'resolution': guess.get('screen_size'), 'format': guess.get('format')}, video, overwrite=False)
def enrich(attributes, target, source=None, overwrite=True):
|
{"golden_diff": "diff --git a/sickbeard/refiners/tvepisode.py b/sickbeard/refiners/tvepisode.py\n--- a/sickbeard/refiners/tvepisode.py\n+++ b/sickbeard/refiners/tvepisode.py\n@@ -59,7 +59,7 @@\n enrich(EPISODE_MAPPING, video, tv_episode)\n enrich({'release_group': tv_episode.release_group}, video, overwrite=False)\n guess = Quality.to_guessit(tv_episode.status)\n- enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)\n+ enrich({'resolution': guess.get('screen_size'), 'format': guess.get('format')}, video, overwrite=False)\n \n \n def enrich(attributes, target, source=None, overwrite=True):\n", "issue": "FINDSUBTITLES :: [3824246] Failed to refine video Traceback (most recent call last)\n### Before submitting your issue:\n\nEnable debug logging in SickRage settings, reproduce the error (be sure to disable after the bug is fixed)\n\nBranch/Commit: develop/3824246\nOS: Linux\nWhat you did: nothing\nWhat happened: ~~rato appeared~~ warning\nWhat you expected: no warning\nLogs:\n\n```\n2016-09-05 17:55:06 WARNING FINDSUBTITLES :: [3824246] Failed to refine video\nTraceback (most recent call last):\n File \"/home/**********/.sickrage/lib/subliminal/core.py\", line 543, in refine\n refiner_manager[refiner].plugin(video, **kwargs)\n File \"/home/**********/.sickrage/sickbeard/refiners/tvepisode.py\", line 62, in refine\n enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)\nKeyError: u'screen_size'\n```\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"TVEpisode refiner.\"\"\"\nfrom __future__ import unicode_literals\n\nimport logging\nimport re\n\nfrom subliminal.video import Episode\n\nfrom ..common import Quality\n\nlogger = logging.getLogger(__name__)\n\nSHOW_MAPPING = {\n 'series_tvdb_id': 'tvdb_id',\n 'series_imdb_id': 'imdbid',\n 'year': 'startyear'\n}\n\nEPISODE_MAPPING = {\n 'tvdb_id': 'tvdb_id',\n 'episode': 'episode',\n 'season': 'season',\n 'size': 'file_size',\n 'title': 'name',\n}\n\nseries_re = re.compile(r'^(?P<series>.*?)(?: \\((?:(?P<year>\\d{4})|(?P<country>[A-Z]{2}))\\))?$')\n\n\ndef refine(video, tv_episode=None, **kwargs):\n \"\"\"Refine a video by using TVEpisode information.\n\n :param video: the video to refine.\n :type video: Episode\n :param tv_episode: the TVEpisode to be used.\n :type tv_episode: sickbeard.tv.TVEpisode\n :param kwargs:\n \"\"\"\n if video.series_tvdb_id and video.tvdb_id:\n logger.debug('No need to refine with TVEpisode')\n return\n\n if not tv_episode:\n logger.debug('No TVEpisode to be used to refine')\n return\n\n if not isinstance(video, Episode):\n logger.debug('Video {name} is not an episode. 
Skipping refiner...', name=video.name)\n return\n\n if tv_episode.show:\n logger.debug('Refining using TVShow information.')\n series, year, country = series_re.match(tv_episode.show.name).groups()\n enrich({'series': series, 'year': int(year) if year else None}, video)\n enrich(SHOW_MAPPING, video, tv_episode.show)\n\n logger.debug('Refining using TVEpisode information.')\n enrich(EPISODE_MAPPING, video, tv_episode)\n enrich({'release_group': tv_episode.release_group}, video, overwrite=False)\n guess = Quality.to_guessit(tv_episode.status)\n enrich({'resolution': guess['screen_size'], 'format': guess['format']}, video, overwrite=False)\n\n\ndef enrich(attributes, target, source=None, overwrite=True):\n \"\"\"Copy attributes from source to target.\n\n :param attributes: the attributes mapping\n :type attributes: dict(str -> str)\n :param target: the target object\n :param source: the source object. If None, the value in attributes dict will be used as new_value\n :param overwrite: if source field should be overwritten if not already set\n :type overwrite: bool\n \"\"\"\n for key, value in attributes.items():\n old_value = getattr(target, key)\n if old_value and not overwrite:\n continue\n\n new_value = getattr(source, value) if source else value\n\n if new_value and old_value != new_value:\n setattr(target, key, new_value)\n logger.debug('Attribute {key} changed from {old} to {new}', key=key, old=old_value, new=new_value)\n", "path": "sickbeard/refiners/tvepisode.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"TVEpisode refiner.\"\"\"\nfrom __future__ import unicode_literals\n\nimport logging\nimport re\n\nfrom subliminal.video import Episode\n\nfrom ..common import Quality\n\nlogger = logging.getLogger(__name__)\n\nSHOW_MAPPING = {\n 'series_tvdb_id': 'tvdb_id',\n 'series_imdb_id': 'imdbid',\n 'year': 'startyear'\n}\n\nEPISODE_MAPPING = {\n 'tvdb_id': 'tvdb_id',\n 'episode': 'episode',\n 'season': 'season',\n 'size': 'file_size',\n 'title': 'name',\n}\n\nseries_re = re.compile(r'^(?P<series>.*?)(?: \\((?:(?P<year>\\d{4})|(?P<country>[A-Z]{2}))\\))?$')\n\n\ndef refine(video, tv_episode=None, **kwargs):\n \"\"\"Refine a video by using TVEpisode information.\n\n :param video: the video to refine.\n :type video: Episode\n :param tv_episode: the TVEpisode to be used.\n :type tv_episode: sickbeard.tv.TVEpisode\n :param kwargs:\n \"\"\"\n if video.series_tvdb_id and video.tvdb_id:\n logger.debug('No need to refine with TVEpisode')\n return\n\n if not tv_episode:\n logger.debug('No TVEpisode to be used to refine')\n return\n\n if not isinstance(video, Episode):\n logger.debug('Video {name} is not an episode. 
Skipping refiner...', name=video.name)\n return\n\n if tv_episode.show:\n logger.debug('Refining using TVShow information.')\n series, year, country = series_re.match(tv_episode.show.name).groups()\n enrich({'series': series, 'year': int(year) if year else None}, video)\n enrich(SHOW_MAPPING, video, tv_episode.show)\n\n logger.debug('Refining using TVEpisode information.')\n enrich(EPISODE_MAPPING, video, tv_episode)\n enrich({'release_group': tv_episode.release_group}, video, overwrite=False)\n guess = Quality.to_guessit(tv_episode.status)\n enrich({'resolution': guess.get('screen_size'), 'format': guess.get('format')}, video, overwrite=False)\n\n\ndef enrich(attributes, target, source=None, overwrite=True):\n \"\"\"Copy attributes from source to target.\n\n :param attributes: the attributes mapping\n :type attributes: dict(str -> str)\n :param target: the target object\n :param source: the source object. If None, the value in attributes dict will be used as new_value\n :param overwrite: if source field should be overwritten if not already set\n :type overwrite: bool\n \"\"\"\n for key, value in attributes.items():\n old_value = getattr(target, key)\n if old_value and not overwrite:\n continue\n\n new_value = getattr(source, value) if source else value\n\n if new_value and old_value != new_value:\n setattr(target, key, new_value)\n logger.debug('Attribute {key} changed from {old} to {new}', key=key, old=old_value, new=new_value)\n", "path": "sickbeard/refiners/tvepisode.py"}]}
| 1,375 | 170 |
gh_patches_debug_29166
|
rasdani/github-patches
|
git_diff
|
svthalia__concrexit-1870
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add cancelled registrations to event in api v2
### Is your feature request related to a problem? Please describe.
API v2 doesn't return a cancelled registration when you've cancelled a registration (in the event serializer). So we cannot display stuff like 'Your registration is cancelled (after the deadline)'.
### Describe the solution you'd like
Also return a registration if it's cancelled, and include fields like is_cancelled, is_late_cancellation, etc.
### Motivation
This makes it possible to show nice status texts like in the old app and the website.
### Describe alternatives you've considered
Leaving out the messages for cancelled registrations. In that case, you can only see that there is no register button, but not why.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/events/api/v2/serializers/event_registration.py`
Content:
```
1 from rest_framework import serializers
2
3 from events.models import EventRegistration
4 from members.api.v2.serializers.member import MemberSerializer
5 from payments.api.v2.serializers import PaymentSerializer
6
7
8 class EventRegistrationSerializer(serializers.ModelSerializer):
9 """Serializer for event registrations."""
10
11 def __init__(self, *args, **kwargs):
12 # Don't pass the 'fields' arg up to the superclass
13 fields = kwargs.pop("fields", {"pk", "member", "name"})
14
15 # Instantiate the superclass normally
16 super().__init__(*args, **kwargs)
17
18 allowed = set(fields)
19 existing = set(self.fields.keys())
20 for field_name in existing - allowed:
21 self.fields.pop(field_name)
22
23 class Meta:
24 model = EventRegistration
25 fields = (
26 "pk",
27 "present",
28 "queue_position",
29 "date",
30 "payment",
31 "member",
32 "name",
33 )
34
35 payment = PaymentSerializer()
36 member = MemberSerializer(detailed=False, read_only=True)
37
```
Path: `website/events/api/v2/serializers/event.py`
Content:
```
1 from rest_framework import serializers
2
3 from activemembers.api.v2.serializers.member_group import MemberGroupSerializer
4 from announcements.api.v2.serializers import SlideSerializer
5 from documents.api.v2.serializers.document import DocumentSerializer
6 from events import services
7 from events.api.v2.serializers.event_registration import EventRegistrationSerializer
8 from events.models import Event, EventRegistration
9 from thaliawebsite.api.v2.serializers import CleanedHTMLSerializer
10 from utils.snippets import create_google_maps_url
11
12
13 class EventSerializer(serializers.ModelSerializer):
14 """Serializer for events."""
15
16 class Meta:
17 model = Event
18 fields = (
19 "pk",
20 "title",
21 "description",
22 "start",
23 "end",
24 "category",
25 "registration_start",
26 "registration_end",
27 "cancel_deadline",
28 "optional_registrations",
29 "location",
30 "price",
31 "fine",
32 "num_participants",
33 "max_participants",
34 "no_registration_message",
35 "cancel_too_late_message",
36 "has_fields",
37 "food_event",
38 "maps_url",
39 "user_permissions",
40 "user_registration",
41 "organiser",
42 "slide",
43 "documents",
44 )
45
46 description = CleanedHTMLSerializer()
47 organiser = MemberGroupSerializer()
48 user_registration = serializers.SerializerMethodField("_user_registration")
49 num_participants = serializers.SerializerMethodField("_num_participants")
50 maps_url = serializers.SerializerMethodField("_maps_url")
51 price = serializers.DecimalField(max_digits=5, decimal_places=2)
52 fine = serializers.DecimalField(max_digits=5, decimal_places=2)
53 slide = SlideSerializer()
54 documents = DocumentSerializer(many=True)
55 user_permissions = serializers.SerializerMethodField("_user_permissions")
56
57 def _user_registration(self, instance):
58 try:
59 if self.context["request"].member:
60 reg = instance.eventregistration_set.get(
61 member=self.context["request"].member, date_cancelled=None
62 )
63 return EventRegistrationSerializer(
64 reg,
65 context=self.context,
66 fields=("pk", "present", "queue_position", "date", "payment"),
67 ).data
68 except EventRegistration.DoesNotExist:
69 pass
70 return None
71
72 def _num_participants(self, instance):
73 if (
74 instance.max_participants
75 and instance.participants.count() > instance.max_participants
76 ):
77 return instance.max_participants
78 return instance.participants.count()
79
80 def _user_permissions(self, instance):
81 member = self.context["request"].member
82 return services.event_permissions(member, instance)
83
84 def _maps_url(self, instance):
85 return create_google_maps_url(instance.map_location, zoom=13, size="450x250")
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/website/events/api/v2/serializers/event.py b/website/events/api/v2/serializers/event.py
--- a/website/events/api/v2/serializers/event.py
+++ b/website/events/api/v2/serializers/event.py
@@ -58,12 +58,20 @@
try:
if self.context["request"].member:
reg = instance.eventregistration_set.get(
- member=self.context["request"].member, date_cancelled=None
+ member=self.context["request"].member
)
return EventRegistrationSerializer(
reg,
context=self.context,
- fields=("pk", "present", "queue_position", "date", "payment"),
+ fields=(
+ "pk",
+ "present",
+ "queue_position",
+ "is_cancelled",
+ "is_late_cancellation",
+ "date",
+ "payment",
+ ),
).data
except EventRegistration.DoesNotExist:
pass
diff --git a/website/events/api/v2/serializers/event_registration.py b/website/events/api/v2/serializers/event_registration.py
--- a/website/events/api/v2/serializers/event_registration.py
+++ b/website/events/api/v2/serializers/event_registration.py
@@ -34,3 +34,16 @@
payment = PaymentSerializer()
member = MemberSerializer(detailed=False, read_only=True)
+ is_cancelled = serializers.SerializerMethodField("_is_cancelled")
+ is_late_cancellation = serializers.SerializerMethodField("_is_late_cancellation")
+ queue_position = serializers.SerializerMethodField("_queue_position")
+
+ def _is_late_cancellation(self, instance):
+ return instance.is_late_cancellation()
+
+ def _queue_position(self, instance):
+ pos = instance.queue_position
+ return pos if pos and pos > 0 else None
+
+ def _is_cancelled(self, instance):
+ return instance.date_cancelled is not None
|
{"golden_diff": "diff --git a/website/events/api/v2/serializers/event.py b/website/events/api/v2/serializers/event.py\n--- a/website/events/api/v2/serializers/event.py\n+++ b/website/events/api/v2/serializers/event.py\n@@ -58,12 +58,20 @@\n try:\n if self.context[\"request\"].member:\n reg = instance.eventregistration_set.get(\n- member=self.context[\"request\"].member, date_cancelled=None\n+ member=self.context[\"request\"].member\n )\n return EventRegistrationSerializer(\n reg,\n context=self.context,\n- fields=(\"pk\", \"present\", \"queue_position\", \"date\", \"payment\"),\n+ fields=(\n+ \"pk\",\n+ \"present\",\n+ \"queue_position\",\n+ \"is_cancelled\",\n+ \"is_late_cancellation\",\n+ \"date\",\n+ \"payment\",\n+ ),\n ).data\n except EventRegistration.DoesNotExist:\n pass\ndiff --git a/website/events/api/v2/serializers/event_registration.py b/website/events/api/v2/serializers/event_registration.py\n--- a/website/events/api/v2/serializers/event_registration.py\n+++ b/website/events/api/v2/serializers/event_registration.py\n@@ -34,3 +34,16 @@\n \n payment = PaymentSerializer()\n member = MemberSerializer(detailed=False, read_only=True)\n+ is_cancelled = serializers.SerializerMethodField(\"_is_cancelled\")\n+ is_late_cancellation = serializers.SerializerMethodField(\"_is_late_cancellation\")\n+ queue_position = serializers.SerializerMethodField(\"_queue_position\")\n+\n+ def _is_late_cancellation(self, instance):\n+ return instance.is_late_cancellation()\n+\n+ def _queue_position(self, instance):\n+ pos = instance.queue_position\n+ return pos if pos and pos > 0 else None\n+\n+ def _is_cancelled(self, instance):\n+ return instance.date_cancelled is not None\n", "issue": "Add cancelled registrations to event in api v2\n### Is your feature request related to a problem? Please describe.\r\nAPI v2 doesn't return a cancelled registration when you've cancelled a registration (in the event serializer). So we cannot display stuff like 'Your registration is cancelled (after the deadline)'.\r\n\r\n### Describe the solution you'd like\r\nAlso return a registration if it's cancelled, and include fields like is_cancelled, is_late_cancellation, etc.\r\n\r\n### Motivation\r\nThis makes it possible to show nice status texts like in the old app and the website.\r\n\r\n### Describe alternatives you've considered\r\nLeaving out the messages for cancelled registrations. 
In that case, you can only see that there is no register button, but not why.\r\n\n", "before_files": [{"content": "from rest_framework import serializers\n\nfrom events.models import EventRegistration\nfrom members.api.v2.serializers.member import MemberSerializer\nfrom payments.api.v2.serializers import PaymentSerializer\n\n\nclass EventRegistrationSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for event registrations.\"\"\"\n\n def __init__(self, *args, **kwargs):\n # Don't pass the 'fields' arg up to the superclass\n fields = kwargs.pop(\"fields\", {\"pk\", \"member\", \"name\"})\n\n # Instantiate the superclass normally\n super().__init__(*args, **kwargs)\n\n allowed = set(fields)\n existing = set(self.fields.keys())\n for field_name in existing - allowed:\n self.fields.pop(field_name)\n\n class Meta:\n model = EventRegistration\n fields = (\n \"pk\",\n \"present\",\n \"queue_position\",\n \"date\",\n \"payment\",\n \"member\",\n \"name\",\n )\n\n payment = PaymentSerializer()\n member = MemberSerializer(detailed=False, read_only=True)\n", "path": "website/events/api/v2/serializers/event_registration.py"}, {"content": "from rest_framework import serializers\n\nfrom activemembers.api.v2.serializers.member_group import MemberGroupSerializer\nfrom announcements.api.v2.serializers import SlideSerializer\nfrom documents.api.v2.serializers.document import DocumentSerializer\nfrom events import services\nfrom events.api.v2.serializers.event_registration import EventRegistrationSerializer\nfrom events.models import Event, EventRegistration\nfrom thaliawebsite.api.v2.serializers import CleanedHTMLSerializer\nfrom utils.snippets import create_google_maps_url\n\n\nclass EventSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for events.\"\"\"\n\n class Meta:\n model = Event\n fields = (\n \"pk\",\n \"title\",\n \"description\",\n \"start\",\n \"end\",\n \"category\",\n \"registration_start\",\n \"registration_end\",\n \"cancel_deadline\",\n \"optional_registrations\",\n \"location\",\n \"price\",\n \"fine\",\n \"num_participants\",\n \"max_participants\",\n \"no_registration_message\",\n \"cancel_too_late_message\",\n \"has_fields\",\n \"food_event\",\n \"maps_url\",\n \"user_permissions\",\n \"user_registration\",\n \"organiser\",\n \"slide\",\n \"documents\",\n )\n\n description = CleanedHTMLSerializer()\n organiser = MemberGroupSerializer()\n user_registration = serializers.SerializerMethodField(\"_user_registration\")\n num_participants = serializers.SerializerMethodField(\"_num_participants\")\n maps_url = serializers.SerializerMethodField(\"_maps_url\")\n price = serializers.DecimalField(max_digits=5, decimal_places=2)\n fine = serializers.DecimalField(max_digits=5, decimal_places=2)\n slide = SlideSerializer()\n documents = DocumentSerializer(many=True)\n user_permissions = serializers.SerializerMethodField(\"_user_permissions\")\n\n def _user_registration(self, instance):\n try:\n if self.context[\"request\"].member:\n reg = instance.eventregistration_set.get(\n member=self.context[\"request\"].member, date_cancelled=None\n )\n return EventRegistrationSerializer(\n reg,\n context=self.context,\n fields=(\"pk\", \"present\", \"queue_position\", \"date\", \"payment\"),\n ).data\n except EventRegistration.DoesNotExist:\n pass\n return None\n\n def _num_participants(self, instance):\n if (\n instance.max_participants\n and instance.participants.count() > instance.max_participants\n ):\n return instance.max_participants\n return instance.participants.count()\n\n def 
_user_permissions(self, instance):\n member = self.context[\"request\"].member\n return services.event_permissions(member, instance)\n\n def _maps_url(self, instance):\n return create_google_maps_url(instance.map_location, zoom=13, size=\"450x250\")\n", "path": "website/events/api/v2/serializers/event.py"}], "after_files": [{"content": "from rest_framework import serializers\n\nfrom events.models import EventRegistration\nfrom members.api.v2.serializers.member import MemberSerializer\nfrom payments.api.v2.serializers import PaymentSerializer\n\n\nclass EventRegistrationSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for event registrations.\"\"\"\n\n def __init__(self, *args, **kwargs):\n # Don't pass the 'fields' arg up to the superclass\n fields = kwargs.pop(\"fields\", {\"pk\", \"member\", \"name\"})\n\n # Instantiate the superclass normally\n super().__init__(*args, **kwargs)\n\n allowed = set(fields)\n existing = set(self.fields.keys())\n for field_name in existing - allowed:\n self.fields.pop(field_name)\n\n class Meta:\n model = EventRegistration\n fields = (\n \"pk\",\n \"present\",\n \"queue_position\",\n \"date\",\n \"payment\",\n \"member\",\n \"name\",\n )\n\n payment = PaymentSerializer()\n member = MemberSerializer(detailed=False, read_only=True)\n is_cancelled = serializers.SerializerMethodField(\"_is_cancelled\")\n is_late_cancellation = serializers.SerializerMethodField(\"_is_late_cancellation\")\n queue_position = serializers.SerializerMethodField(\"_queue_position\")\n\n def _is_late_cancellation(self, instance):\n return instance.is_late_cancellation()\n\n def _queue_position(self, instance):\n pos = instance.queue_position\n return pos if pos and pos > 0 else None\n\n def _is_cancelled(self, instance):\n return instance.date_cancelled is not None\n", "path": "website/events/api/v2/serializers/event_registration.py"}, {"content": "from rest_framework import serializers\n\nfrom activemembers.api.v2.serializers.member_group import MemberGroupSerializer\nfrom announcements.api.v2.serializers import SlideSerializer\nfrom documents.api.v2.serializers.document import DocumentSerializer\nfrom events import services\nfrom events.api.v2.serializers.event_registration import EventRegistrationSerializer\nfrom events.models import Event, EventRegistration\nfrom thaliawebsite.api.v2.serializers import CleanedHTMLSerializer\nfrom utils.snippets import create_google_maps_url\n\n\nclass EventSerializer(serializers.ModelSerializer):\n \"\"\"Serializer for events.\"\"\"\n\n class Meta:\n model = Event\n fields = (\n \"pk\",\n \"title\",\n \"description\",\n \"start\",\n \"end\",\n \"category\",\n \"registration_start\",\n \"registration_end\",\n \"cancel_deadline\",\n \"optional_registrations\",\n \"location\",\n \"price\",\n \"fine\",\n \"num_participants\",\n \"max_participants\",\n \"no_registration_message\",\n \"cancel_too_late_message\",\n \"has_fields\",\n \"food_event\",\n \"maps_url\",\n \"user_permissions\",\n \"user_registration\",\n \"organiser\",\n \"slide\",\n \"documents\",\n )\n\n description = CleanedHTMLSerializer()\n organiser = MemberGroupSerializer()\n user_registration = serializers.SerializerMethodField(\"_user_registration\")\n num_participants = serializers.SerializerMethodField(\"_num_participants\")\n maps_url = serializers.SerializerMethodField(\"_maps_url\")\n price = serializers.DecimalField(max_digits=5, decimal_places=2)\n fine = serializers.DecimalField(max_digits=5, decimal_places=2)\n slide = SlideSerializer()\n documents = 
DocumentSerializer(many=True)\n user_permissions = serializers.SerializerMethodField(\"_user_permissions\")\n\n def _user_registration(self, instance):\n try:\n if self.context[\"request\"].member:\n reg = instance.eventregistration_set.get(\n member=self.context[\"request\"].member\n )\n return EventRegistrationSerializer(\n reg,\n context=self.context,\n fields=(\n \"pk\",\n \"present\",\n \"queue_position\",\n \"is_cancelled\",\n \"is_late_cancellation\",\n \"date\",\n \"payment\",\n ),\n ).data\n except EventRegistration.DoesNotExist:\n pass\n return None\n\n def _num_participants(self, instance):\n if (\n instance.max_participants\n and instance.participants.count() > instance.max_participants\n ):\n return instance.max_participants\n return instance.participants.count()\n\n def _user_permissions(self, instance):\n member = self.context[\"request\"].member\n return services.event_permissions(member, instance)\n\n def _maps_url(self, instance):\n return create_google_maps_url(instance.map_location, zoom=13, size=\"450x250\")\n", "path": "website/events/api/v2/serializers/event.py"}]}
| 1,457 | 433 |
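The golden diff above exposes the new `is_cancelled` and `is_late_cancellation` fields through Django REST Framework's `SerializerMethodField`, meaning each field is computed by a serializer method rather than read straight off the model. A stripped-down sketch of that derivation with a plain dataclass standing in for `EventRegistration` (the class, the fixed deadline and the dates are illustrative assumptions, not concrexit code):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Registration:
    """Toy stand-in for events.models.EventRegistration."""
    date: datetime
    date_cancelled: Optional[datetime] = None
    queue_position: Optional[int] = None

    def is_late_cancellation(self) -> bool:
        # The real model checks the event's cancel deadline; a fixed
        # cutoff is used here purely for illustration.
        deadline = datetime(2021, 8, 1)
        return self.date_cancelled is not None and self.date_cancelled > deadline


def serialize(reg: Registration) -> dict:
    """Mirrors the derived fields added in the golden diff."""
    pos = reg.queue_position
    return {
        "is_cancelled": reg.date_cancelled is not None,
        "is_late_cancellation": reg.is_late_cancellation(),
        # Positions <= 0 are normalised to None, as in _queue_position().
        "queue_position": pos if pos and pos > 0 else None,
    }


print(serialize(Registration(date=datetime(2021, 7, 1))))
print(serialize(Registration(date=datetime(2021, 7, 1),
                             date_cancelled=datetime(2021, 8, 5))))
```

With the registration no longer filtered on `date_cancelled=None`, a client can tell a cancelled (or late-cancelled) registration apart from "never registered", which is exactly the status text the issue asks for.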
gh_patches_debug_58117
|
rasdani/github-patches
|
git_diff
|
weni-ai__bothub-engine-76
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Next link in pagination broken in production
The links start with ```https://bothub/```, but the correct prefix is ```https://bothub.it/```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3 with open('requirements.txt') as fp:
4 install_requires = fp.read()
5 install_requires = list(
6 filter(lambda x: len(x) > 0, install_requires.split('\n')))
7
8 setup(
9 name='bothub',
10 version='1.7.1',
11 description='bothub',
12 packages=find_packages(),
13 install_requires=install_requires,
14 python_requires='>=3.6',
15 )
16
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -7,7 +7,7 @@
setup(
name='bothub',
- version='1.7.1',
+ version='1.7.2',
description='bothub',
packages=find_packages(),
install_requires=install_requires,
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -7,7 +7,7 @@\n \n setup(\n name='bothub',\n- version='1.7.1',\n+ version='1.7.2',\n description='bothub',\n packages=find_packages(),\n install_requires=install_requires,\n", "issue": "Next link in pagination broken in production\nThe links starts with ```https://bothub/```, correct is ```https://bothub.it/```\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\nwith open('requirements.txt') as fp:\n install_requires = fp.read()\ninstall_requires = list(\n filter(lambda x: len(x) > 0, install_requires.split('\\n')))\n\nsetup(\n name='bothub',\n version='1.7.1',\n description='bothub',\n packages=find_packages(),\n install_requires=install_requires,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\nwith open('requirements.txt') as fp:\n install_requires = fp.read()\ninstall_requires = list(\n filter(lambda x: len(x) > 0, install_requires.split('\\n')))\n\nsetup(\n name='bothub',\n version='1.7.2',\n description='bothub',\n packages=find_packages(),\n install_requires=install_requires,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}]}
| 404 | 77 |
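The rendered diff for this record only bumps the package version; the malformed link itself (`https://bothub/...` instead of `https://bothub.it/...`) is characteristic of a paginator that assembles absolute URLs from the host the application sees on the incoming request, so the prefix is decided by deployment configuration (proxy headers, allowed hosts) rather than by the pagination code. A small, stack-agnostic illustration of that dependency (the path and page parameter below are hypothetical):

```python
from urllib.parse import urlencode, urlunsplit


def next_link(request_host: str, path: str, page: int = 2, scheme: str = "https") -> str:
    """Build an absolute 'next' URL the way typical API paginators do:
    scheme plus host taken from the incoming request, plus the current path."""
    return urlunsplit((scheme, request_host, path, urlencode({"page": page}), ""))


# If a proxy hands the app an internal host name, the link comes out wrong:
print(next_link("bothub", "/api/repositories/"))     # https://bothub/api/repositories/?page=2
print(next_link("bothub.it", "/api/repositories/"))  # https://bothub.it/api/repositories/?page=2
```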
gh_patches_debug_40165
|
rasdani/github-patches
|
git_diff
|
learningequality__kolibri-5274
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
MultipleObjectsReturned: get() returned more than one Lesson -- it returned 2!
Sentry Issue: [KOLIBRI-BACKEND-7](https://sentry.io/learningequality/kolibri-backend/issues/877453921/?referrer=github_integration)
```
MultipleObjectsReturned: get() returned more than one Lesson -- it returned 2!
(8 additional frame(s) were not displayed)
...
File "rest_framework/mixins.py", line 56, in retrieve
instance = self.get_object()
File "rest_framework/generics.py", line 98, in get_object
obj = get_object_or_404(queryset, **filter_kwargs)
File "rest_framework/generics.py", line 21, in get_object_or_404
return _get_object_or_404(queryset, *filter_args, **filter_kwargs)
File "django/shortcuts.py", line 85, in get_object_or_404
return queryset.get(*args, **kwargs)
File "django/db/models/query.py", line 384, in get
(self.model._meta.object_name, num)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kolibri/plugins/learn/viewsets.py`
Content:
```
1 from django.db.models.query import F
2 from rest_framework.permissions import IsAuthenticated
3 from rest_framework.viewsets import ReadOnlyModelViewSet
4
5 from .serializers import LearnerClassroomSerializer
6 from kolibri.core.auth.api import KolibriAuthPermissionsFilter
7 from kolibri.core.auth.filters import HierarchyRelationsFilter
8 from kolibri.core.auth.models import Classroom
9 from kolibri.core.lessons.models import Lesson
10 from kolibri.core.lessons.models import LessonAssignment
11 from kolibri.core.lessons.serializers import LessonSerializer
12
13
14 class LearnerClassroomViewset(ReadOnlyModelViewSet):
15 """
16 Returns all Classrooms for which the requesting User is a member,
17 along with all associated assignments.
18 """
19 filter_backends = (KolibriAuthPermissionsFilter,)
20 permission_classes = (IsAuthenticated,)
21 serializer_class = LearnerClassroomSerializer
22
23 def get_queryset(self):
24 current_user = self.request.user
25 memberships = current_user.memberships.filter(
26 collection__kind='classroom',
27 ).values('collection_id')
28 return Classroom.objects.filter(id__in=memberships)
29
30
31 class LearnerLessonViewset(ReadOnlyModelViewSet):
32 """
33 Special Viewset for Learners to view Lessons to which they are assigned.
34 The core Lesson Viewset is locked down to Admin users only.
35 """
36 serializer_class = LessonSerializer
37 permission_classes = (IsAuthenticated,)
38
39 def get_queryset(self):
40 assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \
41 .filter_by_hierarchy(
42 target_user=self.request.user,
43 ancestor_collection=F('collection')
44 )
45 return Lesson.objects.filter(
46 lesson_assignments__in=assignments,
47 is_active=True
48 )
49
```
Path: `kolibri/plugins/learn/serializers.py`
Content:
```
1 from django.db.models import Q
2 from django.db.models import Sum
3 from rest_framework.serializers import JSONField
4 from rest_framework.serializers import ModelSerializer
5 from rest_framework.serializers import SerializerMethodField
6
7 from kolibri.core.auth.models import Classroom
8 from kolibri.core.exams.models import Exam
9 from kolibri.core.lessons.models import Lesson
10 from kolibri.core.logger.models import ContentSummaryLog
11 from kolibri.core.logger.models import ExamLog
12
13
14 class ExamProgressSerializer(ModelSerializer):
15 """
16 Annotates an Exam with progress information based on logs generated
17 by the requesting User
18 """
19 class Meta:
20 model = Exam
21 fields = (
22 'active',
23 'id',
24 'progress',
25 'question_count',
26 'title',
27 )
28
29 progress = SerializerMethodField()
30
31 # Mostly copied from UserExamSerializer.to_representation, but working directly
32 # from Exam Model instead of ExamAssignment
33 def get_progress(self, instance):
34 try:
35 examlogs = instance.examlogs.get(user=self.context['user'])
36 return {
37 'score': examlogs.attemptlogs.aggregate(Sum('correct')).get('correct__sum'),
38 'answer_count': examlogs.attemptlogs.count(),
39 'closed': examlogs.closed,
40 'started': True,
41 }
42 except ExamLog.DoesNotExist:
43 return {
44 'score': None,
45 'answer_count': None,
46 'closed': None,
47 'started': False,
48 }
49
50
51 class LessonProgressSerializer(ModelSerializer):
52 """
53 Annotates a Lesson with progress information based on logs generated
54 by the requesting User
55 """
56 progress = SerializerMethodField()
57 resources = JSONField(default='[]')
58
59 class Meta:
60 model = Lesson
61 fields = (
62 'description',
63 'id',
64 'is_active',
65 'title',
66 'progress',
67 'resources',
68 )
69
70 def get_progress(self, instance):
71 content_ids = [resource['content_id'] for resource in instance.resources]
72 resource_progress = ContentSummaryLog.objects \
73 .filter(
74 user=self.context['user'],
75 content_id__in=content_ids
76 ) \
77 .aggregate(Sum('progress')).get('progress__sum')
78 return {
79 'resource_progress': resource_progress,
80 'total_resources': len(instance.resources),
81 }
82
83
84 class LearnerClassroomSerializer(ModelSerializer):
85 assignments = SerializerMethodField()
86
87 class Meta:
88 model = Classroom
89 fields = (
90 'id',
91 'name',
92 'assignments',
93 )
94
95 def get_assignments(self, instance):
96 """
97 Returns all Exams and Lessons (and progress) assigned to the requesting User
98 """
99 current_user = self.context['request'].user
100 memberships = current_user.memberships.all()
101 learner_groups = [m.collection for m in memberships]
102
103 # Return only active Lessons that are assigned to the requesting user's groups
104 # TODO move this to a permission_class on Lesson
105 filtered_lessons = Lesson.objects.filter(
106 lesson_assignments__collection__in=learner_groups,
107 collection=instance,
108 is_active=True,
109 ).distinct()
110
111 filtered_exams = Exam.objects.filter(
112 assignments__collection__in=learner_groups,
113 collection=instance,
114 ).filter(Q(active=True) | Q(examlogs__user=current_user)).distinct()
115
116 return {
117 'lessons': LessonProgressSerializer(
118 filtered_lessons,
119 many=True,
120 context={'user': current_user},
121 ).data,
122 'exams': ExamProgressSerializer(
123 filtered_exams,
124 many=True,
125 context={'user': current_user},
126 ).data,
127 }
128
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kolibri/plugins/learn/serializers.py b/kolibri/plugins/learn/serializers.py
--- a/kolibri/plugins/learn/serializers.py
+++ b/kolibri/plugins/learn/serializers.py
@@ -4,9 +4,12 @@
from rest_framework.serializers import ModelSerializer
from rest_framework.serializers import SerializerMethodField
+from kolibri.core.auth.filters import HierarchyRelationsFilter
from kolibri.core.auth.models import Classroom
from kolibri.core.exams.models import Exam
+from kolibri.core.exams.models import ExamAssignment
from kolibri.core.lessons.models import Lesson
+from kolibri.core.lessons.models import LessonAssignment
from kolibri.core.logger.models import ContentSummaryLog
from kolibri.core.logger.models import ExamLog
@@ -97,19 +100,27 @@
Returns all Exams and Lessons (and progress) assigned to the requesting User
"""
current_user = self.context['request'].user
- memberships = current_user.memberships.all()
- learner_groups = [m.collection for m in memberships]
# Return only active Lessons that are assigned to the requesting user's groups
# TODO move this to a permission_class on Lesson
+ lesson_assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \
+ .filter_by_hierarchy(
+ target_user=current_user,
+ ancestor_collection=instance
+ )
filtered_lessons = Lesson.objects.filter(
- lesson_assignments__collection__in=learner_groups,
- collection=instance,
- is_active=True,
+ lesson_assignments__in=lesson_assignments,
+ is_active=True
).distinct()
+ exam_assignments = HierarchyRelationsFilter(ExamAssignment.objects.all()) \
+ .filter_by_hierarchy(
+ target_user=current_user,
+ ancestor_collection=instance
+ )
+
filtered_exams = Exam.objects.filter(
- assignments__collection__in=learner_groups,
+ assignments__in=exam_assignments,
collection=instance,
).filter(Q(active=True) | Q(examlogs__user=current_user)).distinct()
diff --git a/kolibri/plugins/learn/viewsets.py b/kolibri/plugins/learn/viewsets.py
--- a/kolibri/plugins/learn/viewsets.py
+++ b/kolibri/plugins/learn/viewsets.py
@@ -21,11 +21,10 @@
serializer_class = LearnerClassroomSerializer
def get_queryset(self):
- current_user = self.request.user
- memberships = current_user.memberships.filter(
- collection__kind='classroom',
- ).values('collection_id')
- return Classroom.objects.filter(id__in=memberships)
+ return HierarchyRelationsFilter(Classroom.objects.all()).filter_by_hierarchy(
+ target_user=self.request.user,
+ ancestor_collection=F('id')
+ )
class LearnerLessonViewset(ReadOnlyModelViewSet):
@@ -45,4 +44,4 @@
return Lesson.objects.filter(
lesson_assignments__in=assignments,
is_active=True
- )
+ ).distinct()
|
{"golden_diff": "diff --git a/kolibri/plugins/learn/serializers.py b/kolibri/plugins/learn/serializers.py\n--- a/kolibri/plugins/learn/serializers.py\n+++ b/kolibri/plugins/learn/serializers.py\n@@ -4,9 +4,12 @@\n from rest_framework.serializers import ModelSerializer\n from rest_framework.serializers import SerializerMethodField\n \n+from kolibri.core.auth.filters import HierarchyRelationsFilter\n from kolibri.core.auth.models import Classroom\n from kolibri.core.exams.models import Exam\n+from kolibri.core.exams.models import ExamAssignment\n from kolibri.core.lessons.models import Lesson\n+from kolibri.core.lessons.models import LessonAssignment\n from kolibri.core.logger.models import ContentSummaryLog\n from kolibri.core.logger.models import ExamLog\n \n@@ -97,19 +100,27 @@\n Returns all Exams and Lessons (and progress) assigned to the requesting User\n \"\"\"\n current_user = self.context['request'].user\n- memberships = current_user.memberships.all()\n- learner_groups = [m.collection for m in memberships]\n \n # Return only active Lessons that are assigned to the requesting user's groups\n # TODO move this to a permission_class on Lesson\n+ lesson_assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \\\n+ .filter_by_hierarchy(\n+ target_user=current_user,\n+ ancestor_collection=instance\n+ )\n filtered_lessons = Lesson.objects.filter(\n- lesson_assignments__collection__in=learner_groups,\n- collection=instance,\n- is_active=True,\n+ lesson_assignments__in=lesson_assignments,\n+ is_active=True\n ).distinct()\n \n+ exam_assignments = HierarchyRelationsFilter(ExamAssignment.objects.all()) \\\n+ .filter_by_hierarchy(\n+ target_user=current_user,\n+ ancestor_collection=instance\n+ )\n+\n filtered_exams = Exam.objects.filter(\n- assignments__collection__in=learner_groups,\n+ assignments__in=exam_assignments,\n collection=instance,\n ).filter(Q(active=True) | Q(examlogs__user=current_user)).distinct()\n \ndiff --git a/kolibri/plugins/learn/viewsets.py b/kolibri/plugins/learn/viewsets.py\n--- a/kolibri/plugins/learn/viewsets.py\n+++ b/kolibri/plugins/learn/viewsets.py\n@@ -21,11 +21,10 @@\n serializer_class = LearnerClassroomSerializer\n \n def get_queryset(self):\n- current_user = self.request.user\n- memberships = current_user.memberships.filter(\n- collection__kind='classroom',\n- ).values('collection_id')\n- return Classroom.objects.filter(id__in=memberships)\n+ return HierarchyRelationsFilter(Classroom.objects.all()).filter_by_hierarchy(\n+ target_user=self.request.user,\n+ ancestor_collection=F('id')\n+ )\n \n \n class LearnerLessonViewset(ReadOnlyModelViewSet):\n@@ -45,4 +44,4 @@\n return Lesson.objects.filter(\n lesson_assignments__in=assignments,\n is_active=True\n- )\n+ ).distinct()\n", "issue": "MultipleObjectsReturned: get() returned more than one Lesson -- it returned 2!\nSentry Issue: [KOLIBRI-BACKEND-7](https://sentry.io/learningequality/kolibri-backend/issues/877453921/?referrer=github_integration)\n\n```\nMultipleObjectsReturned: get() returned more than one Lesson -- it returned 2!\n(8 additional frame(s) were not displayed)\n...\n File \"rest_framework/mixins.py\", line 56, in retrieve\n instance = self.get_object()\n File \"rest_framework/generics.py\", line 98, in get_object\n obj = get_object_or_404(queryset, **filter_kwargs)\n File \"rest_framework/generics.py\", line 21, in get_object_or_404\n return _get_object_or_404(queryset, *filter_args, **filter_kwargs)\n File \"django/shortcuts.py\", line 85, in get_object_or_404\n return queryset.get(*args, 
**kwargs)\n File \"django/db/models/query.py\", line 384, in get\n (self.model._meta.object_name, num)\n```\n", "before_files": [{"content": "from django.db.models.query import F\nfrom rest_framework.permissions import IsAuthenticated\nfrom rest_framework.viewsets import ReadOnlyModelViewSet\n\nfrom .serializers import LearnerClassroomSerializer\nfrom kolibri.core.auth.api import KolibriAuthPermissionsFilter\nfrom kolibri.core.auth.filters import HierarchyRelationsFilter\nfrom kolibri.core.auth.models import Classroom\nfrom kolibri.core.lessons.models import Lesson\nfrom kolibri.core.lessons.models import LessonAssignment\nfrom kolibri.core.lessons.serializers import LessonSerializer\n\n\nclass LearnerClassroomViewset(ReadOnlyModelViewSet):\n \"\"\"\n Returns all Classrooms for which the requesting User is a member,\n along with all associated assignments.\n \"\"\"\n filter_backends = (KolibriAuthPermissionsFilter,)\n permission_classes = (IsAuthenticated,)\n serializer_class = LearnerClassroomSerializer\n\n def get_queryset(self):\n current_user = self.request.user\n memberships = current_user.memberships.filter(\n collection__kind='classroom',\n ).values('collection_id')\n return Classroom.objects.filter(id__in=memberships)\n\n\nclass LearnerLessonViewset(ReadOnlyModelViewSet):\n \"\"\"\n Special Viewset for Learners to view Lessons to which they are assigned.\n The core Lesson Viewset is locked down to Admin users only.\n \"\"\"\n serializer_class = LessonSerializer\n permission_classes = (IsAuthenticated,)\n\n def get_queryset(self):\n assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \\\n .filter_by_hierarchy(\n target_user=self.request.user,\n ancestor_collection=F('collection')\n )\n return Lesson.objects.filter(\n lesson_assignments__in=assignments,\n is_active=True\n )\n", "path": "kolibri/plugins/learn/viewsets.py"}, {"content": "from django.db.models import Q\nfrom django.db.models import Sum\nfrom rest_framework.serializers import JSONField\nfrom rest_framework.serializers import ModelSerializer\nfrom rest_framework.serializers import SerializerMethodField\n\nfrom kolibri.core.auth.models import Classroom\nfrom kolibri.core.exams.models import Exam\nfrom kolibri.core.lessons.models import Lesson\nfrom kolibri.core.logger.models import ContentSummaryLog\nfrom kolibri.core.logger.models import ExamLog\n\n\nclass ExamProgressSerializer(ModelSerializer):\n \"\"\"\n Annotates an Exam with progress information based on logs generated\n by the requesting User\n \"\"\"\n class Meta:\n model = Exam\n fields = (\n 'active',\n 'id',\n 'progress',\n 'question_count',\n 'title',\n )\n\n progress = SerializerMethodField()\n\n # Mostly copied from UserExamSerializer.to_representation, but working directly\n # from Exam Model instead of ExamAssignment\n def get_progress(self, instance):\n try:\n examlogs = instance.examlogs.get(user=self.context['user'])\n return {\n 'score': examlogs.attemptlogs.aggregate(Sum('correct')).get('correct__sum'),\n 'answer_count': examlogs.attemptlogs.count(),\n 'closed': examlogs.closed,\n 'started': True,\n }\n except ExamLog.DoesNotExist:\n return {\n 'score': None,\n 'answer_count': None,\n 'closed': None,\n 'started': False,\n }\n\n\nclass LessonProgressSerializer(ModelSerializer):\n \"\"\"\n Annotates a Lesson with progress information based on logs generated\n by the requesting User\n \"\"\"\n progress = SerializerMethodField()\n resources = JSONField(default='[]')\n\n class Meta:\n model = Lesson\n fields = (\n 'description',\n 'id',\n 
'is_active',\n 'title',\n 'progress',\n 'resources',\n )\n\n def get_progress(self, instance):\n content_ids = [resource['content_id'] for resource in instance.resources]\n resource_progress = ContentSummaryLog.objects \\\n .filter(\n user=self.context['user'],\n content_id__in=content_ids\n ) \\\n .aggregate(Sum('progress')).get('progress__sum')\n return {\n 'resource_progress': resource_progress,\n 'total_resources': len(instance.resources),\n }\n\n\nclass LearnerClassroomSerializer(ModelSerializer):\n assignments = SerializerMethodField()\n\n class Meta:\n model = Classroom\n fields = (\n 'id',\n 'name',\n 'assignments',\n )\n\n def get_assignments(self, instance):\n \"\"\"\n Returns all Exams and Lessons (and progress) assigned to the requesting User\n \"\"\"\n current_user = self.context['request'].user\n memberships = current_user.memberships.all()\n learner_groups = [m.collection for m in memberships]\n\n # Return only active Lessons that are assigned to the requesting user's groups\n # TODO move this to a permission_class on Lesson\n filtered_lessons = Lesson.objects.filter(\n lesson_assignments__collection__in=learner_groups,\n collection=instance,\n is_active=True,\n ).distinct()\n\n filtered_exams = Exam.objects.filter(\n assignments__collection__in=learner_groups,\n collection=instance,\n ).filter(Q(active=True) | Q(examlogs__user=current_user)).distinct()\n\n return {\n 'lessons': LessonProgressSerializer(\n filtered_lessons,\n many=True,\n context={'user': current_user},\n ).data,\n 'exams': ExamProgressSerializer(\n filtered_exams,\n many=True,\n context={'user': current_user},\n ).data,\n }\n", "path": "kolibri/plugins/learn/serializers.py"}], "after_files": [{"content": "from django.db.models.query import F\nfrom rest_framework.permissions import IsAuthenticated\nfrom rest_framework.viewsets import ReadOnlyModelViewSet\n\nfrom .serializers import LearnerClassroomSerializer\nfrom kolibri.core.auth.api import KolibriAuthPermissionsFilter\nfrom kolibri.core.auth.filters import HierarchyRelationsFilter\nfrom kolibri.core.auth.models import Classroom\nfrom kolibri.core.lessons.models import Lesson\nfrom kolibri.core.lessons.models import LessonAssignment\nfrom kolibri.core.lessons.serializers import LessonSerializer\n\n\nclass LearnerClassroomViewset(ReadOnlyModelViewSet):\n \"\"\"\n Returns all Classrooms for which the requesting User is a member,\n along with all associated assignments.\n \"\"\"\n filter_backends = (KolibriAuthPermissionsFilter,)\n permission_classes = (IsAuthenticated,)\n serializer_class = LearnerClassroomSerializer\n\n def get_queryset(self):\n return HierarchyRelationsFilter(Classroom.objects.all()).filter_by_hierarchy(\n target_user=self.request.user,\n ancestor_collection=F('id')\n )\n\n\nclass LearnerLessonViewset(ReadOnlyModelViewSet):\n \"\"\"\n Special Viewset for Learners to view Lessons to which they are assigned.\n The core Lesson Viewset is locked down to Admin users only.\n \"\"\"\n serializer_class = LessonSerializer\n permission_classes = (IsAuthenticated,)\n\n def get_queryset(self):\n assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \\\n .filter_by_hierarchy(\n target_user=self.request.user,\n ancestor_collection=F('collection')\n )\n return Lesson.objects.filter(\n lesson_assignments__in=assignments,\n is_active=True\n ).distinct()\n", "path": "kolibri/plugins/learn/viewsets.py"}, {"content": "from django.db.models import Q\nfrom django.db.models import Sum\nfrom rest_framework.serializers import JSONField\nfrom 
rest_framework.serializers import ModelSerializer\nfrom rest_framework.serializers import SerializerMethodField\n\nfrom kolibri.core.auth.filters import HierarchyRelationsFilter\nfrom kolibri.core.auth.models import Classroom\nfrom kolibri.core.exams.models import Exam\nfrom kolibri.core.exams.models import ExamAssignment\nfrom kolibri.core.lessons.models import Lesson\nfrom kolibri.core.lessons.models import LessonAssignment\nfrom kolibri.core.logger.models import ContentSummaryLog\nfrom kolibri.core.logger.models import ExamLog\n\n\nclass ExamProgressSerializer(ModelSerializer):\n \"\"\"\n Annotates an Exam with progress information based on logs generated\n by the requesting User\n \"\"\"\n class Meta:\n model = Exam\n fields = (\n 'active',\n 'id',\n 'progress',\n 'question_count',\n 'title',\n )\n\n progress = SerializerMethodField()\n\n # Mostly copied from UserExamSerializer.to_representation, but working directly\n # from Exam Model instead of ExamAssignment\n def get_progress(self, instance):\n try:\n examlogs = instance.examlogs.get(user=self.context['user'])\n return {\n 'score': examlogs.attemptlogs.aggregate(Sum('correct')).get('correct__sum'),\n 'answer_count': examlogs.attemptlogs.count(),\n 'closed': examlogs.closed,\n 'started': True,\n }\n except ExamLog.DoesNotExist:\n return {\n 'score': None,\n 'answer_count': None,\n 'closed': None,\n 'started': False,\n }\n\n\nclass LessonProgressSerializer(ModelSerializer):\n \"\"\"\n Annotates a Lesson with progress information based on logs generated\n by the requesting User\n \"\"\"\n progress = SerializerMethodField()\n resources = JSONField(default='[]')\n\n class Meta:\n model = Lesson\n fields = (\n 'description',\n 'id',\n 'is_active',\n 'title',\n 'progress',\n 'resources',\n )\n\n def get_progress(self, instance):\n content_ids = [resource['content_id'] for resource in instance.resources]\n resource_progress = ContentSummaryLog.objects \\\n .filter(\n user=self.context['user'],\n content_id__in=content_ids\n ) \\\n .aggregate(Sum('progress')).get('progress__sum')\n return {\n 'resource_progress': resource_progress,\n 'total_resources': len(instance.resources),\n }\n\n\nclass LearnerClassroomSerializer(ModelSerializer):\n assignments = SerializerMethodField()\n\n class Meta:\n model = Classroom\n fields = (\n 'id',\n 'name',\n 'assignments',\n )\n\n def get_assignments(self, instance):\n \"\"\"\n Returns all Exams and Lessons (and progress) assigned to the requesting User\n \"\"\"\n current_user = self.context['request'].user\n\n # Return only active Lessons that are assigned to the requesting user's groups\n # TODO move this to a permission_class on Lesson\n lesson_assignments = HierarchyRelationsFilter(LessonAssignment.objects.all()) \\\n .filter_by_hierarchy(\n target_user=current_user,\n ancestor_collection=instance\n )\n filtered_lessons = Lesson.objects.filter(\n lesson_assignments__in=lesson_assignments,\n is_active=True\n ).distinct()\n\n exam_assignments = HierarchyRelationsFilter(ExamAssignment.objects.all()) \\\n .filter_by_hierarchy(\n target_user=current_user,\n ancestor_collection=instance\n )\n\n filtered_exams = Exam.objects.filter(\n assignments__in=exam_assignments,\n collection=instance,\n ).filter(Q(active=True) | Q(examlogs__user=current_user)).distinct()\n\n return {\n 'lessons': LessonProgressSerializer(\n filtered_lessons,\n many=True,\n context={'user': current_user},\n ).data,\n 'exams': ExamProgressSerializer(\n filtered_exams,\n many=True,\n context={'user': current_user},\n ).data,\n }\n", 
"path": "kolibri/plugins/learn/serializers.py"}]}
| 2,035 | 675 |
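The `MultipleObjectsReturned` traceback in this record is the classic symptom of filtering across a join: `Lesson.objects.filter(lesson_assignments__in=...)` yields one row per matching assignment, so a lesson reachable through more than one of the learner's collections shows up twice and `get_object_or_404()` refuses to pick one. The `.distinct()` added in the golden diff collapses those duplicates. The effect is easy to reproduce with nothing but the standard library, since it is essentially the SQL the ORM emits (toy schema, made-up rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE lesson (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE assignment (id INTEGER PRIMARY KEY, lesson_id INTEGER, collection_id INTEGER);
    INSERT INTO lesson VALUES (1, 'Fractions');
    -- The same lesson assigned through two collections the learner belongs to.
    INSERT INTO assignment VALUES (10, 1, 100), (11, 1, 200);
""")

# Without DISTINCT the join returns the lesson once per matching assignment,
# which is what makes Django's get() raise MultipleObjectsReturned.
plain = conn.execute(
    "SELECT lesson.id FROM lesson JOIN assignment ON assignment.lesson_id = lesson.id"
).fetchall()

# DISTINCT, the equivalent of the .distinct() added in the golden diff,
# collapses the duplicates back to a single row per lesson.
deduped = conn.execute(
    "SELECT DISTINCT lesson.id FROM lesson JOIN assignment ON assignment.lesson_id = lesson.id"
).fetchall()

print(plain)    # [(1,), (1,)]
print(deduped)  # [(1,)]
```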
gh_patches_debug_9419
|
rasdani/github-patches
|
git_diff
|
privacyidea__privacyidea-904
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add VASCO token type to web UI
As we cannot generate new secrets by ourselves, we just add a possibility to enter a hex-encoded blob directly.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `privacyidea/lib/tokens/vascotoken.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # License: AGPLv3
4 # contact: http://www.privacyidea.org
5 #
6 # 2018-01-15 Friedrich Weber <[email protected]>
7 # Initial version of the VASCO token
8 #
9 # This code is free software; you can redistribute it and/or
10 # modify it under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE
11 # License as published by the Free Software Foundation; either
12 # version 3 of the License, or any later version.
13 #
14 # This code is distributed in the hope that it will be useful,
15 # but WITHOUT ANY WARRANTY; without even the implied warranty of
16 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
17 # GNU AFFERO GENERAL PUBLIC LICENSE for more details.
18 #
19 # You should have received a copy of the GNU Affero General Public
20 # License along with this program. If not, see <http://www.gnu.org/licenses/>.
21 #
22 #
23 import binascii
24
25 __doc__ = """This is the implementation of the VASCO token"""
26
27 import logging
28 from privacyidea.api.lib.utils import getParam
29 from privacyidea.lib.utils import is_true
30 from privacyidea.lib.decorators import check_token_locked
31 from privacyidea.lib.error import ParameterError
32 from privacyidea.lib.log import log_with
33 from privacyidea.lib.tokenclass import TokenClass
34 from privacyidea.lib.tokens.vasco import vasco_otp_check
35 from privacyidea.lib import _
36
37 optional = True
38 required = False
39
40 log = logging.getLogger(__name__)
41
42
43 class VascoTokenClass(TokenClass):
44 """
45 Token class for VASCO Digipass tokens. Relies on vendor-specific
46 shared library, whose location needs to be set in the PI_VASCO_LIBRARY
47 config option.
48
49 VASCO Tokens can be read from a CSV file which is structured as follows::
50
51 <serial1>,<hexlify(blob1)>,vasco
52 <serial2>,<hexlify(blob2)>,vasco
53 ...
54
55 whereas blobX is the 248-byte blob holding the token information.
56 Consequently, hexlify(blobX) is a 496-character hex string.
57
58 The CSV file can be imported by using the "Import Tokens" feature of the Web UI,
59 where "OATH CSV" needs to be chosen as the file type.
60 """
61
62 def __init__(self, db_token):
63 """
64 constructor - create a token class object with its db token binding
65
66 :param aToken: the db bound token
67 """
68 TokenClass.__init__(self, db_token)
69 self.set_type(u"vasco")
70 self.hKeyRequired = True
71
72 @staticmethod
73 def get_class_type():
74 """
75 return the class type identifier
76 """
77 return "vasco"
78
79 @staticmethod
80 def get_class_prefix():
81 """
82 return the token type prefix
83 """
84 # TODO: Revisit token type?
85 return "VASC"
86
87 @staticmethod
88 @log_with(log)
89 def get_class_info(key=None, ret='all'):
90 """
91 :param key: subsection identifier
92 :type key: string
93 :param ret: default return value, if nothing is found
94 :type ret: user defined
95 :return: subsection if key exists or user defined
96 :rtype: dict or string
97 """
98 res = {'type': 'vasco',
99 'title': 'VASCO Token',
100 'description': _('VASCO Token: Authentication using VASCO tokens'),
101 'user': ["enroll"],
102 'policy': {},
103 }
104
105 if key:
106 ret = res.get(key, {})
107 else:
108 if ret == 'all':
109 ret = res
110 return ret
111
112 @log_with(log)
113 def update(self, param, reset_failcount=True):
114 """
115 update - process initialization parameters
116
117 :param param: dict of initialization parameters
118 :type param: dict
119
120 :return: nothing
121 """
122 if is_true(getParam(param, 'genkey', optional)):
123 raise ParameterError("Generating OTP keys is not supported")
124
125 upd_param = param.copy()
126
127 # If the OTP key is given, it is given as a 496-character hex string which
128 # encodes a 248-byte blob. As we want to set a 248-byte OTPKey (= Blob),
129 # we unhexlify the OTP key
130 if 'otpkey' in param:
131 if len(param['otpkey']) != 496:
132 raise ParameterError('Expected OTP key as 496-character hex string, but length is {!s}'.format(
133 len(param['otpkey'])
134 ))
135 upd_param['otpkey'] = binascii.unhexlify(upd_param['otpkey'])
136
137 TokenClass.update(self, upd_param, reset_failcount)
138
139 @check_token_locked
140 def check_otp(self, otpval, counter=None, window=None, options=None):
141 secret = self.token.get_otpkey().getKey()
142 result, new_secret = vasco_otp_check(secret, otpval)
143 # By default, setting a new OTP key resets the failcounter. In case of the VASCO token,
144 # this would mean that the failcounter is reset at every authentication attempt
145 # (regardless of success or failure), which must be avoided.
146 self.token.set_otpkey(new_secret, reset_failcount=False)
147 self.save()
148
149 if result == 0:
150 # Successful authentication
151 return 0
152 else:
153 if result == 1:
154 # wrong OTP value, no log message
155 pass
156 elif result == 201:
157 log.warning("A previous OTP value was used again!")
158 elif result == 202:
159 log.warning("Token-internal fail counter reached its maximum!")
160 else:
161 log.warning("VASCO token failed to authenticate, result: {!r}".format(result))
162 return -1
163
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/privacyidea/lib/tokens/vascotoken.py b/privacyidea/lib/tokens/vascotoken.py
--- a/privacyidea/lib/tokens/vascotoken.py
+++ b/privacyidea/lib/tokens/vascotoken.py
@@ -98,7 +98,10 @@
res = {'type': 'vasco',
'title': 'VASCO Token',
'description': _('VASCO Token: Authentication using VASCO tokens'),
- 'user': ["enroll"],
+ # If this was set, the user could enroll a Vasco token via the API
+ #'user': ["enroll"],
+ # only administrators can enroll the token in the UI
+ 'ui_enroll': ["admin"],
'policy': {},
}
|
{"golden_diff": "diff --git a/privacyidea/lib/tokens/vascotoken.py b/privacyidea/lib/tokens/vascotoken.py\n--- a/privacyidea/lib/tokens/vascotoken.py\n+++ b/privacyidea/lib/tokens/vascotoken.py\n@@ -98,7 +98,10 @@\n res = {'type': 'vasco',\n 'title': 'VASCO Token',\n 'description': _('VASCO Token: Authentication using VASCO tokens'),\n- 'user': [\"enroll\"],\n+ # If this was set, the user could enroll a Vasco token via the API\n+ #'user': [\"enroll\"],\n+ # only administrators can enroll the token in the UI\n+ 'ui_enroll': [\"admin\"],\n 'policy': {},\n }\n", "issue": "Add VASCO token type to web UI\nAs we cannot generate new secrets by ourselves, we just add a possibility to enter a hex-encoded blob directly.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# License: AGPLv3\n# contact: http://www.privacyidea.org\n#\n# 2018-01-15 Friedrich Weber <[email protected]>\n# Initial version of the VASCO token\n#\n# This code is free software; you can redistribute it and/or\n# modify it under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE\n# License as published by the Free Software Foundation; either\n# version 3 of the License, or any later version.\n#\n# This code is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU AFFERO GENERAL PUBLIC LICENSE for more details.\n#\n# You should have received a copy of the GNU Affero General Public\n# License along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n#\nimport binascii\n\n__doc__ = \"\"\"This is the implementation of the VASCO token\"\"\"\n\nimport logging\nfrom privacyidea.api.lib.utils import getParam\nfrom privacyidea.lib.utils import is_true\nfrom privacyidea.lib.decorators import check_token_locked\nfrom privacyidea.lib.error import ParameterError\nfrom privacyidea.lib.log import log_with\nfrom privacyidea.lib.tokenclass import TokenClass\nfrom privacyidea.lib.tokens.vasco import vasco_otp_check\nfrom privacyidea.lib import _\n\noptional = True\nrequired = False\n\nlog = logging.getLogger(__name__)\n\n\nclass VascoTokenClass(TokenClass):\n \"\"\"\n Token class for VASCO Digipass tokens. 
Relies on vendor-specific\n shared library, whose location needs to be set in the PI_VASCO_LIBRARY\n config option.\n\n VASCO Tokens can be read from a CSV file which is structured as follows::\n\n <serial1>,<hexlify(blob1)>,vasco\n <serial2>,<hexlify(blob2)>,vasco\n ...\n\n whereas blobX is the 248-byte blob holding the token information.\n Consequently, hexlify(blobX) is a 496-character hex string.\n\n The CSV file can be imported by using the \"Import Tokens\" feature of the Web UI,\n where \"OATH CSV\" needs to be chosen as the file type.\n \"\"\"\n\n def __init__(self, db_token):\n \"\"\"\n constructor - create a token class object with its db token binding\n\n :param aToken: the db bound token\n \"\"\"\n TokenClass.__init__(self, db_token)\n self.set_type(u\"vasco\")\n self.hKeyRequired = True\n\n @staticmethod\n def get_class_type():\n \"\"\"\n return the class type identifier\n \"\"\"\n return \"vasco\"\n\n @staticmethod\n def get_class_prefix():\n \"\"\"\n return the token type prefix\n \"\"\"\n # TODO: Revisit token type?\n return \"VASC\"\n\n @staticmethod\n @log_with(log)\n def get_class_info(key=None, ret='all'):\n \"\"\"\n :param key: subsection identifier\n :type key: string\n :param ret: default return value, if nothing is found\n :type ret: user defined\n :return: subsection if key exists or user defined\n :rtype: dict or string\n \"\"\"\n res = {'type': 'vasco',\n 'title': 'VASCO Token',\n 'description': _('VASCO Token: Authentication using VASCO tokens'),\n 'user': [\"enroll\"],\n 'policy': {},\n }\n\n if key:\n ret = res.get(key, {})\n else:\n if ret == 'all':\n ret = res\n return ret\n\n @log_with(log)\n def update(self, param, reset_failcount=True):\n \"\"\"\n update - process initialization parameters\n\n :param param: dict of initialization parameters\n :type param: dict\n\n :return: nothing\n \"\"\"\n if is_true(getParam(param, 'genkey', optional)):\n raise ParameterError(\"Generating OTP keys is not supported\")\n\n upd_param = param.copy()\n\n # If the OTP key is given, it is given as a 496-character hex string which\n # encodes a 248-byte blob. As we want to set a 248-byte OTPKey (= Blob),\n # we unhexlify the OTP key\n if 'otpkey' in param:\n if len(param['otpkey']) != 496:\n raise ParameterError('Expected OTP key as 496-character hex string, but length is {!s}'.format(\n len(param['otpkey'])\n ))\n upd_param['otpkey'] = binascii.unhexlify(upd_param['otpkey'])\n\n TokenClass.update(self, upd_param, reset_failcount)\n\n @check_token_locked\n def check_otp(self, otpval, counter=None, window=None, options=None):\n secret = self.token.get_otpkey().getKey()\n result, new_secret = vasco_otp_check(secret, otpval)\n # By default, setting a new OTP key resets the failcounter. 
In case of the VASCO token,\n # this would mean that the failcounter is reset at every authentication attempt\n # (regardless of success or failure), which must be avoided.\n self.token.set_otpkey(new_secret, reset_failcount=False)\n self.save()\n\n if result == 0:\n # Successful authentication\n return 0\n else:\n if result == 1:\n # wrong OTP value, no log message\n pass\n elif result == 201:\n log.warning(\"A previous OTP value was used again!\")\n elif result == 202:\n log.warning(\"Token-internal fail counter reached its maximum!\")\n else:\n log.warning(\"VASCO token failed to authenticate, result: {!r}\".format(result))\n return -1\n", "path": "privacyidea/lib/tokens/vascotoken.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# License: AGPLv3\n# contact: http://www.privacyidea.org\n#\n# 2018-01-15 Friedrich Weber <[email protected]>\n# Initial version of the VASCO token\n#\n# This code is free software; you can redistribute it and/or\n# modify it under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE\n# License as published by the Free Software Foundation; either\n# version 3 of the License, or any later version.\n#\n# This code is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU AFFERO GENERAL PUBLIC LICENSE for more details.\n#\n# You should have received a copy of the GNU Affero General Public\n# License along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n#\nimport binascii\n\n__doc__ = \"\"\"This is the implementation of the VASCO token\"\"\"\n\nimport logging\nfrom privacyidea.api.lib.utils import getParam\nfrom privacyidea.lib.utils import is_true\nfrom privacyidea.lib.decorators import check_token_locked\nfrom privacyidea.lib.error import ParameterError\nfrom privacyidea.lib.log import log_with\nfrom privacyidea.lib.tokenclass import TokenClass\nfrom privacyidea.lib.tokens.vasco import vasco_otp_check\nfrom privacyidea.lib import _\n\noptional = True\nrequired = False\n\nlog = logging.getLogger(__name__)\n\n\nclass VascoTokenClass(TokenClass):\n \"\"\"\n Token class for VASCO Digipass tokens. 
Relies on vendor-specific\n shared library, whose location needs to be set in the PI_VASCO_LIBRARY\n config option.\n\n VASCO Tokens can be read from a CSV file which is structured as follows::\n\n <serial1>,<hexlify(blob1)>,vasco\n <serial2>,<hexlify(blob2)>,vasco\n ...\n\n whereas blobX is the 248-byte blob holding the token information.\n Consequently, hexlify(blobX) is a 496-character hex string.\n\n The CSV file can be imported by using the \"Import Tokens\" feature of the Web UI,\n where \"OATH CSV\" needs to be chosen as the file type.\n \"\"\"\n\n def __init__(self, db_token):\n \"\"\"\n constructor - create a token class object with its db token binding\n\n :param aToken: the db bound token\n \"\"\"\n TokenClass.__init__(self, db_token)\n self.set_type(u\"vasco\")\n self.hKeyRequired = True\n\n @staticmethod\n def get_class_type():\n \"\"\"\n return the class type identifier\n \"\"\"\n return \"vasco\"\n\n @staticmethod\n def get_class_prefix():\n \"\"\"\n return the token type prefix\n \"\"\"\n # TODO: Revisit token type?\n return \"VASC\"\n\n @staticmethod\n @log_with(log)\n def get_class_info(key=None, ret='all'):\n \"\"\"\n :param key: subsection identifier\n :type key: string\n :param ret: default return value, if nothing is found\n :type ret: user defined\n :return: subsection if key exists or user defined\n :rtype: dict or string\n \"\"\"\n res = {'type': 'vasco',\n 'title': 'VASCO Token',\n 'description': _('VASCO Token: Authentication using VASCO tokens'),\n # If this was set, the user could enroll a Vasco token via the API\n #'user': [\"enroll\"],\n # only administrators can enroll the token in the UI\n 'ui_enroll': [\"admin\"],\n 'policy': {},\n }\n\n if key:\n ret = res.get(key, {})\n else:\n if ret == 'all':\n ret = res\n return ret\n\n @log_with(log)\n def update(self, param, reset_failcount=True):\n \"\"\"\n update - process initialization parameters\n\n :param param: dict of initialization parameters\n :type param: dict\n\n :return: nothing\n \"\"\"\n if is_true(getParam(param, 'genkey', optional)):\n raise ParameterError(\"Generating OTP keys is not supported\")\n\n upd_param = param.copy()\n\n # If the OTP key is given, it is given as a 496-character hex string which\n # encodes a 248-byte blob. As we want to set a 248-byte OTPKey (= Blob),\n # we unhexlify the OTP key\n if 'otpkey' in param:\n if len(param['otpkey']) != 496:\n raise ParameterError('Expected OTP key as 496-character hex string, but length is {!s}'.format(\n len(param['otpkey'])\n ))\n upd_param['otpkey'] = binascii.unhexlify(upd_param['otpkey'])\n\n TokenClass.update(self, upd_param, reset_failcount)\n\n @check_token_locked\n def check_otp(self, otpval, counter=None, window=None, options=None):\n secret = self.token.get_otpkey().getKey()\n result, new_secret = vasco_otp_check(secret, otpval)\n # By default, setting a new OTP key resets the failcounter. 
In case of the VASCO token,\n # this would mean that the failcounter is reset at every authentication attempt\n # (regardless of success or failure), which must be avoided.\n self.token.set_otpkey(new_secret, reset_failcount=False)\n self.save()\n\n if result == 0:\n # Successful authentication\n return 0\n else:\n if result == 1:\n # wrong OTP value, no log message\n pass\n elif result == 201:\n log.warning(\"A previous OTP value was used again!\")\n elif result == 202:\n log.warning(\"Token-internal fail counter reached its maximum!\")\n else:\n log.warning(\"VASCO token failed to authenticate, result: {!r}\".format(result))\n return -1\n", "path": "privacyidea/lib/tokens/vascotoken.py"}]}
| 1,973 | 172 |
gh_patches_debug_18546
|
rasdani/github-patches
|
git_diff
|
ethereum__web3.py-803
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve warning on failed address checksum
* Version: 4
### What was wrong?
Error message about checksum failure isn't clear enough, especially when the address is supplied as all-lower-case.
### How can it be fixed?
Special-case the warning, so that an all-lower-case address gives a specific warning, along the lines of: "web3py only accepts checksummed addresses. Please report it as a bug on any project that does not provide you checksummed addresses. In the meantime, you can force an address into checksummed state with `valid_address = w3.toChecksumAddress(questionable_address)` after double-checking that there are no errors in the address. Alternatively, you can use an ENS name in its place."
But, you know, not a paragraph...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `web3/utils/validation.py`
Content:
```
1 import itertools
2
3 from eth_utils import (
4 is_0x_prefixed,
5 is_boolean,
6 is_bytes,
7 is_checksum_address,
8 is_dict,
9 is_hex_address,
10 is_integer,
11 is_list_like,
12 is_string,
13 )
14
15 from web3.exceptions import (
16 InvalidAddress,
17 )
18 from web3.utils.abi import (
19 is_address_type,
20 is_array_type,
21 is_bool_type,
22 is_bytes_type,
23 is_int_type,
24 is_recognized_type,
25 is_string_type,
26 is_uint_type,
27 length_of_array_type,
28 sub_type_of_array_type,
29 )
30
31
32 def validate_abi(abi):
33 """
34 Helper function for validating an ABI
35 """
36 if not is_list_like(abi):
37 raise ValueError("'abi' is not a list")
38 for e in abi:
39 if not is_dict(e):
40 raise ValueError("The elements of 'abi' are not all dictionaries")
41
42
43 def validate_abi_type(abi_type):
44 """
45 Helper function for validating an abi_type
46 """
47 if not is_recognized_type(abi_type):
48 raise ValueError("Unrecognized abi_type: {abi_type}".format(abi_type=abi_type))
49
50
51 def validate_abi_value(abi_type, value):
52 """
53 Helper function for validating a value against the expected abi_type
54 Note: abi_type 'bytes' must either be python3 'bytes' object or ''
55 """
56 if is_array_type(abi_type) and is_list_like(value):
57 # validate length
58 specified_length = length_of_array_type(abi_type)
59 if specified_length is not None:
60 if specified_length < 1:
61 raise TypeError(
62 "Invalid abi-type: {abi_type}. Length of fixed sized arrays"
63 "must be greater than 0."
64 .format(abi_type=abi_type)
65 )
66 if specified_length != len(value):
67 raise TypeError(
68 "The following array length does not the length specified"
69 "by the abi-type, {abi_type}: {value}"
70 .format(abi_type=abi_type, value=value)
71 )
72
73 # validate sub_types
74 sub_type = sub_type_of_array_type(abi_type)
75 for v in value:
76 validate_abi_value(sub_type, v)
77 return
78 elif is_bool_type(abi_type) and is_boolean(value):
79 return
80 elif is_uint_type(abi_type) and is_integer(value) and value >= 0:
81 return
82 elif is_int_type(abi_type) and is_integer(value):
83 return
84 elif is_address_type(abi_type):
85 validate_address(value)
86 return
87 elif is_bytes_type(abi_type):
88 if is_bytes(value):
89 return
90 elif is_string(value):
91 if is_0x_prefixed(value):
92 return
93 else:
94 raise TypeError(
95 "ABI values of abi-type 'bytes' must be either"
96 "a python3 'bytes' object or an '0x' prefixed string."
97 )
98 elif is_string_type(abi_type) and is_string(value):
99 return
100
101 raise TypeError(
102 "The following abi value is not a '{abi_type}': {value}"
103 .format(abi_type=abi_type, value=value)
104 )
105
106
107 def validate_address(value):
108 """
109 Helper function for validating an address
110 """
111 if not isinstance(value, str):
112 raise TypeError('Address {} must be provided as a string'.format(value))
113 if not is_hex_address(value):
114 raise InvalidAddress("Address must be 20 bytes, as a hex string with a 0x prefix", value)
115 if not is_checksum_address(value):
116 raise InvalidAddress("Address has an invalid EIP checksum", value)
117
118
119 def has_one_val(*args, **kwargs):
120 vals = itertools.chain(args, kwargs.values())
121 not_nones = list(filter(lambda val: val is not None, vals))
122 return len(not_nones) == 1
123
124
125 def assert_one_val(*args, **kwargs):
126 if not has_one_val(*args, **kwargs):
127 raise TypeError(
128 "Exactly one of the passed values can be specified. "
129 "Instead, values were: %r, %r" % (args, kwargs)
130 )
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/web3/utils/validation.py b/web3/utils/validation.py
--- a/web3/utils/validation.py
+++ b/web3/utils/validation.py
@@ -113,7 +113,21 @@
if not is_hex_address(value):
raise InvalidAddress("Address must be 20 bytes, as a hex string with a 0x prefix", value)
if not is_checksum_address(value):
- raise InvalidAddress("Address has an invalid EIP checksum", value)
+ if value == value.lower():
+ raise InvalidAddress(
+ "Web3.py only accepts checksum addresses. "
+ "The software that gave you this non-checksum address should be considered unsafe, "
+ "please file it as a bug on their platform. "
+ "Try using an ENS name instead. Or, if you must accept lower safety, "
+ "use Web3.toChecksumAddress(lower_case_address).",
+ value,
+ )
+ else:
+ raise InvalidAddress(
+ "Address has an invalid EIP-55 checksum. "
+ "After looking up the address from the original source, try again.",
+ value,
+ )
def has_one_val(*args, **kwargs):
|
{"golden_diff": "diff --git a/web3/utils/validation.py b/web3/utils/validation.py\n--- a/web3/utils/validation.py\n+++ b/web3/utils/validation.py\n@@ -113,7 +113,21 @@\n if not is_hex_address(value):\n raise InvalidAddress(\"Address must be 20 bytes, as a hex string with a 0x prefix\", value)\n if not is_checksum_address(value):\n- raise InvalidAddress(\"Address has an invalid EIP checksum\", value)\n+ if value == value.lower():\n+ raise InvalidAddress(\n+ \"Web3.py only accepts checksum addresses. \"\n+ \"The software that gave you this non-checksum address should be considered unsafe, \"\n+ \"please file it as a bug on their platform. \"\n+ \"Try using an ENS name instead. Or, if you must accept lower safety, \"\n+ \"use Web3.toChecksumAddress(lower_case_address).\",\n+ value,\n+ )\n+ else:\n+ raise InvalidAddress(\n+ \"Address has an invalid EIP-55 checksum. \"\n+ \"After looking up the address from the original source, try again.\",\n+ value,\n+ )\n \n \n def has_one_val(*args, **kwargs):\n", "issue": "Improve warning on failed address checksum\n* Version: 4\r\n\r\n### What was wrong?\r\n\r\nError message about checksum failure isn't clear enough, especially when the address is supplied as all-lower-case.\r\n\r\n### How can it be fixed?\r\n\r\nSpecial-case the warning, so that an all-lower-case address gives a specific warning, along the lines of: \"web3py only accepts checksummed addresses. Please report it as a bug on any project that does not provide you checksummed addresses. In the meantime, you can force an address into checksummed state with `valid_address = w3.toChecksumAddress(questionable_address)` after double-checking that there are no errors in the address. Alternatively, you can use an ENS name in its place.\"\r\n\r\nBut, you know, not a paragraph...\n", "before_files": [{"content": "import itertools\n\nfrom eth_utils import (\n is_0x_prefixed,\n is_boolean,\n is_bytes,\n is_checksum_address,\n is_dict,\n is_hex_address,\n is_integer,\n is_list_like,\n is_string,\n)\n\nfrom web3.exceptions import (\n InvalidAddress,\n)\nfrom web3.utils.abi import (\n is_address_type,\n is_array_type,\n is_bool_type,\n is_bytes_type,\n is_int_type,\n is_recognized_type,\n is_string_type,\n is_uint_type,\n length_of_array_type,\n sub_type_of_array_type,\n)\n\n\ndef validate_abi(abi):\n \"\"\"\n Helper function for validating an ABI\n \"\"\"\n if not is_list_like(abi):\n raise ValueError(\"'abi' is not a list\")\n for e in abi:\n if not is_dict(e):\n raise ValueError(\"The elements of 'abi' are not all dictionaries\")\n\n\ndef validate_abi_type(abi_type):\n \"\"\"\n Helper function for validating an abi_type\n \"\"\"\n if not is_recognized_type(abi_type):\n raise ValueError(\"Unrecognized abi_type: {abi_type}\".format(abi_type=abi_type))\n\n\ndef validate_abi_value(abi_type, value):\n \"\"\"\n Helper function for validating a value against the expected abi_type\n Note: abi_type 'bytes' must either be python3 'bytes' object or ''\n \"\"\"\n if is_array_type(abi_type) and is_list_like(value):\n # validate length\n specified_length = length_of_array_type(abi_type)\n if specified_length is not None:\n if specified_length < 1:\n raise TypeError(\n \"Invalid abi-type: {abi_type}. 
Length of fixed sized arrays\"\n \"must be greater than 0.\"\n .format(abi_type=abi_type)\n )\n if specified_length != len(value):\n raise TypeError(\n \"The following array length does not the length specified\"\n \"by the abi-type, {abi_type}: {value}\"\n .format(abi_type=abi_type, value=value)\n )\n\n # validate sub_types\n sub_type = sub_type_of_array_type(abi_type)\n for v in value:\n validate_abi_value(sub_type, v)\n return\n elif is_bool_type(abi_type) and is_boolean(value):\n return\n elif is_uint_type(abi_type) and is_integer(value) and value >= 0:\n return\n elif is_int_type(abi_type) and is_integer(value):\n return\n elif is_address_type(abi_type):\n validate_address(value)\n return\n elif is_bytes_type(abi_type):\n if is_bytes(value):\n return\n elif is_string(value):\n if is_0x_prefixed(value):\n return\n else:\n raise TypeError(\n \"ABI values of abi-type 'bytes' must be either\"\n \"a python3 'bytes' object or an '0x' prefixed string.\"\n )\n elif is_string_type(abi_type) and is_string(value):\n return\n\n raise TypeError(\n \"The following abi value is not a '{abi_type}': {value}\"\n .format(abi_type=abi_type, value=value)\n )\n\n\ndef validate_address(value):\n \"\"\"\n Helper function for validating an address\n \"\"\"\n if not isinstance(value, str):\n raise TypeError('Address {} must be provided as a string'.format(value))\n if not is_hex_address(value):\n raise InvalidAddress(\"Address must be 20 bytes, as a hex string with a 0x prefix\", value)\n if not is_checksum_address(value):\n raise InvalidAddress(\"Address has an invalid EIP checksum\", value)\n\n\ndef has_one_val(*args, **kwargs):\n vals = itertools.chain(args, kwargs.values())\n not_nones = list(filter(lambda val: val is not None, vals))\n return len(not_nones) == 1\n\n\ndef assert_one_val(*args, **kwargs):\n if not has_one_val(*args, **kwargs):\n raise TypeError(\n \"Exactly one of the passed values can be specified. \"\n \"Instead, values were: %r, %r\" % (args, kwargs)\n )\n", "path": "web3/utils/validation.py"}], "after_files": [{"content": "import itertools\n\nfrom eth_utils import (\n is_0x_prefixed,\n is_boolean,\n is_bytes,\n is_checksum_address,\n is_dict,\n is_hex_address,\n is_integer,\n is_list_like,\n is_string,\n)\n\nfrom web3.exceptions import (\n InvalidAddress,\n)\nfrom web3.utils.abi import (\n is_address_type,\n is_array_type,\n is_bool_type,\n is_bytes_type,\n is_int_type,\n is_recognized_type,\n is_string_type,\n is_uint_type,\n length_of_array_type,\n sub_type_of_array_type,\n)\n\n\ndef validate_abi(abi):\n \"\"\"\n Helper function for validating an ABI\n \"\"\"\n if not is_list_like(abi):\n raise ValueError(\"'abi' is not a list\")\n for e in abi:\n if not is_dict(e):\n raise ValueError(\"The elements of 'abi' are not all dictionaries\")\n\n\ndef validate_abi_type(abi_type):\n \"\"\"\n Helper function for validating an abi_type\n \"\"\"\n if not is_recognized_type(abi_type):\n raise ValueError(\"Unrecognized abi_type: {abi_type}\".format(abi_type=abi_type))\n\n\ndef validate_abi_value(abi_type, value):\n \"\"\"\n Helper function for validating a value against the expected abi_type\n Note: abi_type 'bytes' must either be python3 'bytes' object or ''\n \"\"\"\n if is_array_type(abi_type) and is_list_like(value):\n # validate length\n specified_length = length_of_array_type(abi_type)\n if specified_length is not None:\n if specified_length < 1:\n raise TypeError(\n \"Invalid abi-type: {abi_type}. 
Length of fixed sized arrays\"\n \"must be greater than 0.\"\n .format(abi_type=abi_type)\n )\n if specified_length != len(value):\n raise TypeError(\n \"The following array length does not the length specified\"\n \"by the abi-type, {abi_type}: {value}\"\n .format(abi_type=abi_type, value=value)\n )\n\n # validate sub_types\n sub_type = sub_type_of_array_type(abi_type)\n for v in value:\n validate_abi_value(sub_type, v)\n return\n elif is_bool_type(abi_type) and is_boolean(value):\n return\n elif is_uint_type(abi_type) and is_integer(value) and value >= 0:\n return\n elif is_int_type(abi_type) and is_integer(value):\n return\n elif is_address_type(abi_type):\n validate_address(value)\n return\n elif is_bytes_type(abi_type):\n if is_bytes(value):\n return\n elif is_string(value):\n if is_0x_prefixed(value):\n return\n else:\n raise TypeError(\n \"ABI values of abi-type 'bytes' must be either\"\n \"a python3 'bytes' object or an '0x' prefixed string.\"\n )\n elif is_string_type(abi_type) and is_string(value):\n return\n\n raise TypeError(\n \"The following abi value is not a '{abi_type}': {value}\"\n .format(abi_type=abi_type, value=value)\n )\n\n\ndef validate_address(value):\n \"\"\"\n Helper function for validating an address\n \"\"\"\n if not isinstance(value, str):\n raise TypeError('Address {} must be provided as a string'.format(value))\n if not is_hex_address(value):\n raise InvalidAddress(\"Address must be 20 bytes, as a hex string with a 0x prefix\", value)\n if not is_checksum_address(value):\n if value == value.lower():\n raise InvalidAddress(\n \"Web3.py only accepts checksum addresses. \"\n \"The software that gave you this non-checksum address should be considered unsafe, \"\n \"please file it as a bug on their platform. \"\n \"Try using an ENS name instead. Or, if you must accept lower safety, \"\n \"use Web3.toChecksumAddress(lower_case_address).\",\n value,\n )\n else:\n raise InvalidAddress(\n \"Address has an invalid EIP-55 checksum. \"\n \"After looking up the address from the original source, try again.\",\n value,\n )\n\n\ndef has_one_val(*args, **kwargs):\n vals = itertools.chain(args, kwargs.values())\n not_nones = list(filter(lambda val: val is not None, vals))\n return len(not_nones) == 1\n\n\ndef assert_one_val(*args, **kwargs):\n if not has_one_val(*args, **kwargs):\n raise TypeError(\n \"Exactly one of the passed values can be specified. \"\n \"Instead, values were: %r, %r\" % (args, kwargs)\n )\n", "path": "web3/utils/validation.py"}]}
| 1,611 | 265 |
gh_patches_debug_37334
|
rasdani/github-patches
|
git_diff
|
networkx__networkx-4160
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
steiner_tree should accept MultiGraph
I'm using `steiner_tree` on a road network which may have multiple edges between nodes.
It looks like `steiner_tree` will fail if passed a `MultiGraph`:
- as a next-to-last step, edges are generated as `(u, v)` tuples pairwise [here](https://github.com/networkx/networkx/blob/master/networkx/algorithms/approximation/steinertree.py#L87)
- before being passed to `G.edge_subgraph` which raises a `ValueError` from `nx.filter.show_multiedges`
This should reproduce the issue:
```python
import networkx as nx
import networkx.algorithms.approximation as nxa
def test_simple_steiner_tree():
G = nx.Graph()
G.add_weighted_edges_from([
(1, 2, 1),
(2, 3, 1),
(3, 4, 1),
(3, 5, 1)
])
terminal_nodes = [2, 4, 5]
expected_edges = [
(2, 3),
(3, 4),
(3, 5)
]
T = nxa.steiner_tree(G, terminal_nodes)
assert list(T.edges) == expected_edges
def test_multi_steiner_tree():
G = nx.MultiGraph()
G.add_weighted_edges_from([
(1, 2, 1),
(2, 3, 1),
(2, 3, 999),
(3, 4, 1),
(3, 5, 1)
])
terminal_nodes = [2, 4, 5]
expected_edges = [
(2, 3, 0),
(3, 4, 0), # first edge has weight one
(3, 5, 0)
]
T = nxa.steiner_tree(G, terminal_nodes)
test_simple_steiner_tree() # passes
test_multi_steiner_tree() # throws ValueError
```
The quick fix might be to add `@not_implemented_for('multigraph')`.
For my current purposes, the following does the trick to handle the `MultiGraph` case:
```python
# get unique links pairwise (u, v)
links = set(chain.from_iterable(
pairwise(d['path'])
for u, v, d in mst_edges
))
# for each link in the chain
multi_edges = []
for u, v in links:
# consider each edge between the pair of nodes,
# keeping track of the one with the minimum weight
# (there may be a better way - convenience functions/accessors?)
num_edges = G.number_of_edges(u, v)
min_k = 0
min_weight = None
for k in range(num_edges):
curr_weight = G.edges[u, v, k][weight]
if min_weight is None:
min_weight = curr_weight
elif curr_weight < min_weight:
min_weight = curr_weight
min_k = k
multi_edges.append((u, v, min_k))
# create subgraph from multi edges - list of (u, v, k)
T = G.edge_subgraph(multi_edges)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `networkx/algorithms/approximation/steinertree.py`
Content:
```
1 from itertools import chain
2
3 from networkx.utils import pairwise, not_implemented_for
4 import networkx as nx
5
6 __all__ = ["metric_closure", "steiner_tree"]
7
8
9 @not_implemented_for("directed")
10 def metric_closure(G, weight="weight"):
11 """ Return the metric closure of a graph.
12
13 The metric closure of a graph *G* is the complete graph in which each edge
14 is weighted by the shortest path distance between the nodes in *G* .
15
16 Parameters
17 ----------
18 G : NetworkX graph
19
20 Returns
21 -------
22 NetworkX graph
23 Metric closure of the graph `G`.
24
25 """
26 M = nx.Graph()
27
28 Gnodes = set(G)
29
30 # check for connected graph while processing first node
31 all_paths_iter = nx.all_pairs_dijkstra(G, weight=weight)
32 u, (distance, path) = next(all_paths_iter)
33 if Gnodes - set(distance):
34 msg = "G is not a connected graph. metric_closure is not defined."
35 raise nx.NetworkXError(msg)
36 Gnodes.remove(u)
37 for v in Gnodes:
38 M.add_edge(u, v, distance=distance[v], path=path[v])
39
40 # first node done -- now process the rest
41 for u, (distance, path) in all_paths_iter:
42 Gnodes.remove(u)
43 for v in Gnodes:
44 M.add_edge(u, v, distance=distance[v], path=path[v])
45
46 return M
47
48
49 @not_implemented_for("multigraph")
50 @not_implemented_for("directed")
51 def steiner_tree(G, terminal_nodes, weight="weight"):
52 """ Return an approximation to the minimum Steiner tree of a graph.
53
54 Parameters
55 ----------
56 G : NetworkX graph
57
58 terminal_nodes : list
59 A list of terminal nodes for which minimum steiner tree is
60 to be found.
61
62 Returns
63 -------
64 NetworkX graph
65 Approximation to the minimum steiner tree of `G` induced by
66 `terminal_nodes` .
67
68 Notes
69 -----
70 Steiner tree can be approximated by computing the minimum spanning
71 tree of the subgraph of the metric closure of the graph induced by the
72 terminal nodes, where the metric closure of *G* is the complete graph in
73 which each edge is weighted by the shortest path distance between the
74 nodes in *G* .
75 This algorithm produces a tree whose weight is within a (2 - (2 / t))
76 factor of the weight of the optimal Steiner tree where *t* is number of
77 terminal nodes.
78
79 """
80 # M is the subgraph of the metric closure induced by the terminal nodes of
81 # G.
82 M = metric_closure(G, weight=weight)
83 # Use the 'distance' attribute of each edge provided by the metric closure
84 # graph.
85 H = M.subgraph(terminal_nodes)
86 mst_edges = nx.minimum_spanning_edges(H, weight="distance", data=True)
87 # Create an iterator over each edge in each shortest path; repeats are okay
88 edges = chain.from_iterable(pairwise(d["path"]) for u, v, d in mst_edges)
89 T = G.edge_subgraph(edges)
90 return T
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/networkx/algorithms/approximation/steinertree.py b/networkx/algorithms/approximation/steinertree.py
--- a/networkx/algorithms/approximation/steinertree.py
+++ b/networkx/algorithms/approximation/steinertree.py
@@ -46,11 +46,23 @@
return M
-@not_implemented_for("multigraph")
@not_implemented_for("directed")
def steiner_tree(G, terminal_nodes, weight="weight"):
""" Return an approximation to the minimum Steiner tree of a graph.
+ The minimum Steiner tree of `G` w.r.t a set of `terminal_nodes`
+ is a tree within `G` that spans those nodes and has minimum size
+ (sum of edge weights) among all such trees.
+
+ The minimum Steiner tree can be approximated by computing the minimum
+ spanning tree of the subgraph of the metric closure of *G* induced by the
+ terminal nodes, where the metric closure of *G* is the complete graph in
+ which each edge is weighted by the shortest path distance between the
+ nodes in *G* .
+ This algorithm produces a tree whose weight is within a (2 - (2 / t))
+ factor of the weight of the optimal Steiner tree where *t* is number of
+ terminal nodes.
+
Parameters
----------
G : NetworkX graph
@@ -67,24 +79,26 @@
Notes
-----
- Steiner tree can be approximated by computing the minimum spanning
- tree of the subgraph of the metric closure of the graph induced by the
- terminal nodes, where the metric closure of *G* is the complete graph in
- which each edge is weighted by the shortest path distance between the
- nodes in *G* .
- This algorithm produces a tree whose weight is within a (2 - (2 / t))
- factor of the weight of the optimal Steiner tree where *t* is number of
- terminal nodes.
+ For multigraphs, the edge between two nodes with minimum weight is the
+ edge put into the Steiner tree.
+
+ References
+ ----------
+ .. [1] Steiner_tree_problem on Wikipedia.
+ https://en.wikipedia.org/wiki/Steiner_tree_problem
"""
- # M is the subgraph of the metric closure induced by the terminal nodes of
- # G.
+ # H is the subgraph induced by terminal_nodes in the metric closure M of G.
M = metric_closure(G, weight=weight)
- # Use the 'distance' attribute of each edge provided by the metric closure
- # graph.
H = M.subgraph(terminal_nodes)
+ # Use the 'distance' attribute of each edge provided by M.
mst_edges = nx.minimum_spanning_edges(H, weight="distance", data=True)
# Create an iterator over each edge in each shortest path; repeats are okay
edges = chain.from_iterable(pairwise(d["path"]) for u, v, d in mst_edges)
+ # For multigraph we should add the minimal weight edge keys
+ if G.is_multigraph():
+ edges = (
+ (u, v, min(G[u][v], key=lambda k: G[u][v][k][weight])) for u, v in edges
+ )
T = G.edge_subgraph(edges)
return T
|
{"golden_diff": "diff --git a/networkx/algorithms/approximation/steinertree.py b/networkx/algorithms/approximation/steinertree.py\n--- a/networkx/algorithms/approximation/steinertree.py\n+++ b/networkx/algorithms/approximation/steinertree.py\n@@ -46,11 +46,23 @@\n return M\n \n \n-@not_implemented_for(\"multigraph\")\n @not_implemented_for(\"directed\")\n def steiner_tree(G, terminal_nodes, weight=\"weight\"):\n \"\"\" Return an approximation to the minimum Steiner tree of a graph.\n \n+ The minimum Steiner tree of `G` w.r.t a set of `terminal_nodes`\n+ is a tree within `G` that spans those nodes and has minimum size\n+ (sum of edge weights) among all such trees.\n+\n+ The minimum Steiner tree can be approximated by computing the minimum\n+ spanning tree of the subgraph of the metric closure of *G* induced by the\n+ terminal nodes, where the metric closure of *G* is the complete graph in\n+ which each edge is weighted by the shortest path distance between the\n+ nodes in *G* .\n+ This algorithm produces a tree whose weight is within a (2 - (2 / t))\n+ factor of the weight of the optimal Steiner tree where *t* is number of\n+ terminal nodes.\n+\n Parameters\n ----------\n G : NetworkX graph\n@@ -67,24 +79,26 @@\n \n Notes\n -----\n- Steiner tree can be approximated by computing the minimum spanning\n- tree of the subgraph of the metric closure of the graph induced by the\n- terminal nodes, where the metric closure of *G* is the complete graph in\n- which each edge is weighted by the shortest path distance between the\n- nodes in *G* .\n- This algorithm produces a tree whose weight is within a (2 - (2 / t))\n- factor of the weight of the optimal Steiner tree where *t* is number of\n- terminal nodes.\n+ For multigraphs, the edge between two nodes with minimum weight is the\n+ edge put into the Steiner tree.\n \n+\n+ References\n+ ----------\n+ .. [1] Steiner_tree_problem on Wikipedia.\n+ https://en.wikipedia.org/wiki/Steiner_tree_problem\n \"\"\"\n- # M is the subgraph of the metric closure induced by the terminal nodes of\n- # G.\n+ # H is the subgraph induced by terminal_nodes in the metric closure M of G.\n M = metric_closure(G, weight=weight)\n- # Use the 'distance' attribute of each edge provided by the metric closure\n- # graph.\n H = M.subgraph(terminal_nodes)\n+ # Use the 'distance' attribute of each edge provided by M.\n mst_edges = nx.minimum_spanning_edges(H, weight=\"distance\", data=True)\n # Create an iterator over each edge in each shortest path; repeats are okay\n edges = chain.from_iterable(pairwise(d[\"path\"]) for u, v, d in mst_edges)\n+ # For multigraph we should add the minimal weight edge keys\n+ if G.is_multigraph():\n+ edges = (\n+ (u, v, min(G[u][v], key=lambda k: G[u][v][k][weight])) for u, v in edges\n+ )\n T = G.edge_subgraph(edges)\n return T\n", "issue": "steiner_tree should accept MultiGraph\nI'm using `steiner_tree` on a road network which may have multiple edges between nodes. 
\r\n\r\nIt looks like `steiner_tree` will fail if passed a `MultiGraph`:\r\n- as a next-to-last step, edges are generated as `(u, v)` tuples pairwise [here](https://github.com/networkx/networkx/blob/master/networkx/algorithms/approximation/steinertree.py#L87)\r\n- before being passed to `G.edge_subgraph` which raises a `ValueError` from `nx.filter.show_multiedges`\r\n\r\nThis should reproduce the issue:\r\n\r\n```python\r\nimport networkx as nx\r\nimport networkx.algorithms.approximation as nxa\r\n\r\ndef test_simple_steiner_tree():\r\n G = nx.Graph()\r\n G.add_weighted_edges_from([\r\n (1, 2, 1),\r\n (2, 3, 1),\r\n (3, 4, 1),\r\n (3, 5, 1)\r\n ])\r\n terminal_nodes = [2, 4, 5]\r\n expected_edges = [\r\n (2, 3),\r\n (3, 4),\r\n (3, 5)\r\n ]\r\n T = nxa.steiner_tree(G, terminal_nodes)\r\n assert list(T.edges) == expected_edges\r\n\r\ndef test_multi_steiner_tree():\r\n G = nx.MultiGraph()\r\n G.add_weighted_edges_from([\r\n (1, 2, 1),\r\n (2, 3, 1),\r\n (2, 3, 999),\r\n (3, 4, 1),\r\n (3, 5, 1)\r\n ])\r\n terminal_nodes = [2, 4, 5]\r\n expected_edges = [\r\n (2, 3, 0),\r\n (3, 4, 0), # first edge has weight one\r\n (3, 5, 0)\r\n ]\r\n T = nxa.steiner_tree(G, terminal_nodes)\r\n\r\ntest_simple_steiner_tree() # passes\r\ntest_multi_steiner_tree() # throws ValueError \r\n```\r\n\r\nThe quick fix might be to add `@not_implemented_for('multigraph')`. \r\n\r\nFor my current purposes, the following does the trick to handle the `MultiGraph` case:\r\n\r\n```python\r\n# get unique links pairwise (u, v)\r\nlinks = set(chain.from_iterable(\r\n pairwise(d['path'])\r\n for u, v, d in mst_edges\r\n))\r\n\r\n# for each link in the chain\r\nmulti_edges = []\r\nfor u, v in links: \r\n # consider each edge between the pair of nodes,\r\n # keeping track of the one with the minimum weight\r\n # (there may be a better way - convenience functions/accessors?)\r\n num_edges = G.number_of_edges(u, v)\r\n min_k = 0\r\n min_weight = None\r\n for k in range(num_edges):\r\n curr_weight = G.edges[u, v, k][weight]\r\n if min_weight is None:\r\n min_weight = curr_weight\r\n elif curr_weight < min_weight:\r\n min_weight = curr_weight\r\n min_k = k\r\n multi_edges.append((u, v, min_k))\r\n\r\n# create subgraph from multi edges - list of (u, v, k)\r\nT = G.edge_subgraph(multi_edges)\r\n```\r\n\n", "before_files": [{"content": "from itertools import chain\n\nfrom networkx.utils import pairwise, not_implemented_for\nimport networkx as nx\n\n__all__ = [\"metric_closure\", \"steiner_tree\"]\n\n\n@not_implemented_for(\"directed\")\ndef metric_closure(G, weight=\"weight\"):\n \"\"\" Return the metric closure of a graph.\n\n The metric closure of a graph *G* is the complete graph in which each edge\n is weighted by the shortest path distance between the nodes in *G* .\n\n Parameters\n ----------\n G : NetworkX graph\n\n Returns\n -------\n NetworkX graph\n Metric closure of the graph `G`.\n\n \"\"\"\n M = nx.Graph()\n\n Gnodes = set(G)\n\n # check for connected graph while processing first node\n all_paths_iter = nx.all_pairs_dijkstra(G, weight=weight)\n u, (distance, path) = next(all_paths_iter)\n if Gnodes - set(distance):\n msg = \"G is not a connected graph. 
metric_closure is not defined.\"\n raise nx.NetworkXError(msg)\n Gnodes.remove(u)\n for v in Gnodes:\n M.add_edge(u, v, distance=distance[v], path=path[v])\n\n # first node done -- now process the rest\n for u, (distance, path) in all_paths_iter:\n Gnodes.remove(u)\n for v in Gnodes:\n M.add_edge(u, v, distance=distance[v], path=path[v])\n\n return M\n\n\n@not_implemented_for(\"multigraph\")\n@not_implemented_for(\"directed\")\ndef steiner_tree(G, terminal_nodes, weight=\"weight\"):\n \"\"\" Return an approximation to the minimum Steiner tree of a graph.\n\n Parameters\n ----------\n G : NetworkX graph\n\n terminal_nodes : list\n A list of terminal nodes for which minimum steiner tree is\n to be found.\n\n Returns\n -------\n NetworkX graph\n Approximation to the minimum steiner tree of `G` induced by\n `terminal_nodes` .\n\n Notes\n -----\n Steiner tree can be approximated by computing the minimum spanning\n tree of the subgraph of the metric closure of the graph induced by the\n terminal nodes, where the metric closure of *G* is the complete graph in\n which each edge is weighted by the shortest path distance between the\n nodes in *G* .\n This algorithm produces a tree whose weight is within a (2 - (2 / t))\n factor of the weight of the optimal Steiner tree where *t* is number of\n terminal nodes.\n\n \"\"\"\n # M is the subgraph of the metric closure induced by the terminal nodes of\n # G.\n M = metric_closure(G, weight=weight)\n # Use the 'distance' attribute of each edge provided by the metric closure\n # graph.\n H = M.subgraph(terminal_nodes)\n mst_edges = nx.minimum_spanning_edges(H, weight=\"distance\", data=True)\n # Create an iterator over each edge in each shortest path; repeats are okay\n edges = chain.from_iterable(pairwise(d[\"path\"]) for u, v, d in mst_edges)\n T = G.edge_subgraph(edges)\n return T\n", "path": "networkx/algorithms/approximation/steinertree.py"}], "after_files": [{"content": "from itertools import chain\n\nfrom networkx.utils import pairwise, not_implemented_for\nimport networkx as nx\n\n__all__ = [\"metric_closure\", \"steiner_tree\"]\n\n\n@not_implemented_for(\"directed\")\ndef metric_closure(G, weight=\"weight\"):\n \"\"\" Return the metric closure of a graph.\n\n The metric closure of a graph *G* is the complete graph in which each edge\n is weighted by the shortest path distance between the nodes in *G* .\n\n Parameters\n ----------\n G : NetworkX graph\n\n Returns\n -------\n NetworkX graph\n Metric closure of the graph `G`.\n\n \"\"\"\n M = nx.Graph()\n\n Gnodes = set(G)\n\n # check for connected graph while processing first node\n all_paths_iter = nx.all_pairs_dijkstra(G, weight=weight)\n u, (distance, path) = next(all_paths_iter)\n if Gnodes - set(distance):\n msg = \"G is not a connected graph. 
metric_closure is not defined.\"\n raise nx.NetworkXError(msg)\n Gnodes.remove(u)\n for v in Gnodes:\n M.add_edge(u, v, distance=distance[v], path=path[v])\n\n # first node done -- now process the rest\n for u, (distance, path) in all_paths_iter:\n Gnodes.remove(u)\n for v in Gnodes:\n M.add_edge(u, v, distance=distance[v], path=path[v])\n\n return M\n\n\n@not_implemented_for(\"directed\")\ndef steiner_tree(G, terminal_nodes, weight=\"weight\"):\n \"\"\" Return an approximation to the minimum Steiner tree of a graph.\n\n The minimum Steiner tree of `G` w.r.t a set of `terminal_nodes`\n is a tree within `G` that spans those nodes and has minimum size\n (sum of edge weights) among all such trees.\n\n The minimum Steiner tree can be approximated by computing the minimum\n spanning tree of the subgraph of the metric closure of *G* induced by the\n terminal nodes, where the metric closure of *G* is the complete graph in\n which each edge is weighted by the shortest path distance between the\n nodes in *G* .\n This algorithm produces a tree whose weight is within a (2 - (2 / t))\n factor of the weight of the optimal Steiner tree where *t* is number of\n terminal nodes.\n\n Parameters\n ----------\n G : NetworkX graph\n\n terminal_nodes : list\n A list of terminal nodes for which minimum steiner tree is\n to be found.\n\n Returns\n -------\n NetworkX graph\n Approximation to the minimum steiner tree of `G` induced by\n `terminal_nodes` .\n\n Notes\n -----\n For multigraphs, the edge between two nodes with minimum weight is the\n edge put into the Steiner tree.\n\n\n References\n ----------\n .. [1] Steiner_tree_problem on Wikipedia.\n https://en.wikipedia.org/wiki/Steiner_tree_problem\n \"\"\"\n # H is the subgraph induced by terminal_nodes in the metric closure M of G.\n M = metric_closure(G, weight=weight)\n H = M.subgraph(terminal_nodes)\n # Use the 'distance' attribute of each edge provided by M.\n mst_edges = nx.minimum_spanning_edges(H, weight=\"distance\", data=True)\n # Create an iterator over each edge in each shortest path; repeats are okay\n edges = chain.from_iterable(pairwise(d[\"path\"]) for u, v, d in mst_edges)\n # For multigraph we should add the minimal weight edge keys\n if G.is_multigraph():\n edges = (\n (u, v, min(G[u][v], key=lambda k: G[u][v][k][weight])) for u, v in edges\n )\n T = G.edge_subgraph(edges)\n return T\n", "path": "networkx/algorithms/approximation/steinertree.py"}]}
| 1,852 | 763 |