problem_id (stringlengths 18-22) | source (stringclasses 1 value) | task_type (stringclasses 1 value) | in_source_id (stringlengths 13-58) | prompt (stringlengths 1.1k-10.2k) | golden_diff (stringlengths 151-4.94k) | verification_info (stringlengths 582-21k) | num_tokens (int64 271-2.05k) | num_tokens_diff (int64 47-1.02k)
---|---|---|---|---|---|---|---|---|
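Each record below is rendered with these fields in order, delimited by lines containing `|`. For quick inspection it can be easier to load the data programmatically; the sketch below is an assumption-laden example, not part of the dump: it assumes the Hugging Face `datasets` library, that the rows are published under the id shown in the `source` column (`rasdani/github-patches`), and that a `train` split exists.

```python
from datasets import load_dataset

# Assumed dataset id (taken from the `source` column) and assumed split name.
ds = load_dataset("rasdani/github-patches", split="train")

row = ds[0]
print(row["problem_id"], row["in_source_id"], row["num_tokens"], row["num_tokens_diff"])
print(row["prompt"][:300])       # issue text plus the relevant repository file contents
print(row["golden_diff"][:300])  # reference patch in `git diff` format
```

The `verification_info` field is a JSON string that repeats the issue text alongside `before_files` and `after_files`, so it can be parsed with `json.loads` to recover the full file contents on either side of the patch.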
gh_patches_debug_11468
|
rasdani/github-patches
|
git_diff
|
getredash__redash-4582
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
TreasureData getSchema fails when setting non-default region
<!--
#####################################################################
#
# Need support? USE THE FORUM! https://discuss.redash.io/c/support.
#
# Don't have steps to reproduce and actually not sure it's a bug?
# Use the forum! https://discuss.redash.io/c/support.
#
#####################################################################
**Got an idea for a new feature?** Check if it isn't on the roadmap already: https://bit.ly/redash-roadmap and start a new discussion in the features category: https://discuss.redash.io/c/feature-requests 🌟.
Found a bug? Please fill out the sections below... thank you 👍
Found a security vulnerability? Please email [email protected] to report any security vulnerabilities. We will acknowledge receipt of your vulnerability and strive to send you regular updates about our progress. If you're curious about the status of your disclosure please feel free to email us again. If you want to encrypt your disclosure email, you can use this PGP key.
-->
### Issue Summary
There are some regions in Treasure Data, but getSchema alsways fails when setting non-default region.
### Steps to Reproduce
1. Set datasource using non-default region (e.g. Tokyo region)
2. Push schema refresh then "Schema refresh failed" error occurs
### Technical details:
* Redash Version: confirmed v5.0.2
* Browser/OS: any Browsers/OSs
* How did you install Redash: from Amazon AMI
### Details
When accessing Treasure Data to get schema, always default region will be set because the parameter is not prepared.
https://github.com/getredash/redash/blob/6c364369bb0eb98e2191c2e502fed72abe5a74c7/redash/query_runner/treasuredata.py#L82
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redash/query_runner/treasuredata.py`
Content:
```
1 import logging
2
3 from redash.query_runner import *
4 from redash.utils import json_dumps
5
6 logger = logging.getLogger(__name__)
7
8 try:
9 import tdclient
10 from tdclient import errors
11
12 enabled = True
13
14 except ImportError:
15 enabled = False
16
17 TD_TYPES_MAPPING = {
18 "bigint": TYPE_INTEGER,
19 "tinyint": TYPE_INTEGER,
20 "smallint": TYPE_INTEGER,
21 "int": TYPE_INTEGER,
22 "integer": TYPE_INTEGER,
23 "long": TYPE_INTEGER,
24 "double": TYPE_FLOAT,
25 "decimal": TYPE_FLOAT,
26 "float": TYPE_FLOAT,
27 "real": TYPE_FLOAT,
28 "boolean": TYPE_BOOLEAN,
29 "timestamp": TYPE_DATETIME,
30 "date": TYPE_DATETIME,
31 "char": TYPE_STRING,
32 "string": TYPE_STRING,
33 "varchar": TYPE_STRING,
34 }
35
36
37 class TreasureData(BaseQueryRunner):
38 should_annotate_query = False
39 noop_query = "SELECT 1"
40
41 @classmethod
42 def configuration_schema(cls):
43 return {
44 "type": "object",
45 "properties": {
46 "endpoint": {"type": "string"},
47 "apikey": {"type": "string"},
48 "type": {"type": "string"},
49 "db": {"type": "string", "title": "Database Name"},
50 "get_schema": {
51 "type": "boolean",
52 "title": "Auto Schema Retrieval",
53 "default": False,
54 },
55 },
56 "required": ["apikey", "db"],
57 }
58
59 @classmethod
60 def enabled(cls):
61 return enabled
62
63 @classmethod
64 def type(cls):
65 return "treasuredata"
66
67 def get_schema(self, get_stats=False):
68 schema = {}
69 if self.configuration.get("get_schema", False):
70 try:
71 with tdclient.Client(self.configuration.get("apikey")) as client:
72 for table in client.tables(self.configuration.get("db")):
73 table_name = "{}.{}".format(
74 self.configuration.get("db"), table.name
75 )
76 for table_schema in table.schema:
77 schema[table_name] = {
78 "name": table_name,
79 "columns": [column[0] for column in table.schema],
80 }
81 except Exception as ex:
82 raise Exception("Failed getting schema")
83 return list(schema.values())
84
85 def run_query(self, query, user):
86 connection = tdclient.connect(
87 endpoint=self.configuration.get("endpoint", "https://api.treasuredata.com"),
88 apikey=self.configuration.get("apikey"),
89 type=self.configuration.get("type", "hive").lower(),
90 db=self.configuration.get("db"),
91 )
92
93 cursor = connection.cursor()
94 try:
95 cursor.execute(query)
96 columns_tuples = [
97 (i[0], TD_TYPES_MAPPING.get(i[1], None))
98 for i in cursor.show_job()["hive_result_schema"]
99 ]
100 columns = self.fetch_columns(columns_tuples)
101
102 if cursor.rowcount == 0:
103 rows = []
104 else:
105 rows = [
106 dict(zip(([column["name"] for column in columns]), r))
107 for r in cursor.fetchall()
108 ]
109 data = {"columns": columns, "rows": rows}
110 json_data = json_dumps(data)
111 error = None
112 except errors.InternalError as e:
113 json_data = None
114 error = "%s: %s" % (
115 str(e),
116 cursor.show_job()
117 .get("debug", {})
118 .get("stderr", "No stderr message in the response"),
119 )
120 return json_data, error
121
122
123 register(TreasureData)
124
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/redash/query_runner/treasuredata.py b/redash/query_runner/treasuredata.py
--- a/redash/query_runner/treasuredata.py
+++ b/redash/query_runner/treasuredata.py
@@ -68,7 +68,7 @@
schema = {}
if self.configuration.get("get_schema", False):
try:
- with tdclient.Client(self.configuration.get("apikey")) as client:
+ with tdclient.Client(self.configuration.get("apikey"),endpoint=self.configuration.get("endpoint")) as client:
for table in client.tables(self.configuration.get("db")):
table_name = "{}.{}".format(
self.configuration.get("db"), table.name
|
{"golden_diff": "diff --git a/redash/query_runner/treasuredata.py b/redash/query_runner/treasuredata.py\n--- a/redash/query_runner/treasuredata.py\n+++ b/redash/query_runner/treasuredata.py\n@@ -68,7 +68,7 @@\n schema = {}\n if self.configuration.get(\"get_schema\", False):\n try:\n- with tdclient.Client(self.configuration.get(\"apikey\")) as client:\n+ with tdclient.Client(self.configuration.get(\"apikey\"),endpoint=self.configuration.get(\"endpoint\")) as client:\n for table in client.tables(self.configuration.get(\"db\")):\n table_name = \"{}.{}\".format(\n self.configuration.get(\"db\"), table.name\n", "issue": "TreasureData getSchema fails when setting non-default region\n<!--\r\n#####################################################################\r\n#\r\n# Need support? USE THE FORUM! https://discuss.redash.io/c/support.\r\n#\r\n# Don't have steps to reproduce and actually not sure it's a bug?\r\n# Use the forum! https://discuss.redash.io/c/support.\r\n#\r\n#####################################################################\r\n\r\n**Got an idea for a new feature?** Check if it isn't on the roadmap already: https://bit.ly/redash-roadmap and start a new discussion in the features category: https://discuss.redash.io/c/feature-requests \ud83c\udf1f.\r\n\r\nFound a bug? Please fill out the sections below... thank you \ud83d\udc4d\r\n\r\nFound a security vulnerability? Please email [email protected] to report any security vulnerabilities. We will acknowledge receipt of your vulnerability and strive to send you regular updates about our progress. If you're curious about the status of your disclosure please feel free to email us again. If you want to encrypt your disclosure email, you can use this PGP key.\r\n\r\n-->\r\n\r\n### Issue Summary\r\n\r\nThere are some regions in Treasure Data, but getSchema alsways fails when setting non-default region.\r\n\r\n### Steps to Reproduce\r\n\r\n1. Set datasource using non-default region (e.g. Tokyo region)\r\n2. 
Push schema refresh then \"Schema refresh failed\" error occurs\r\n\r\n### Technical details:\r\n\r\n* Redash Version: confirmed v5.0.2\r\n* Browser/OS: any Browsers/OSs\r\n* How did you install Redash: from Amazon AMI\r\n\r\n### Details\r\n\r\nWhen accessing Treasure Data to get schema, always default region will be set because the parameter is not prepared.\r\nhttps://github.com/getredash/redash/blob/6c364369bb0eb98e2191c2e502fed72abe5a74c7/redash/query_runner/treasuredata.py#L82\n", "before_files": [{"content": "import logging\n\nfrom redash.query_runner import *\nfrom redash.utils import json_dumps\n\nlogger = logging.getLogger(__name__)\n\ntry:\n import tdclient\n from tdclient import errors\n\n enabled = True\n\nexcept ImportError:\n enabled = False\n\nTD_TYPES_MAPPING = {\n \"bigint\": TYPE_INTEGER,\n \"tinyint\": TYPE_INTEGER,\n \"smallint\": TYPE_INTEGER,\n \"int\": TYPE_INTEGER,\n \"integer\": TYPE_INTEGER,\n \"long\": TYPE_INTEGER,\n \"double\": TYPE_FLOAT,\n \"decimal\": TYPE_FLOAT,\n \"float\": TYPE_FLOAT,\n \"real\": TYPE_FLOAT,\n \"boolean\": TYPE_BOOLEAN,\n \"timestamp\": TYPE_DATETIME,\n \"date\": TYPE_DATETIME,\n \"char\": TYPE_STRING,\n \"string\": TYPE_STRING,\n \"varchar\": TYPE_STRING,\n}\n\n\nclass TreasureData(BaseQueryRunner):\n should_annotate_query = False\n noop_query = \"SELECT 1\"\n\n @classmethod\n def configuration_schema(cls):\n return {\n \"type\": \"object\",\n \"properties\": {\n \"endpoint\": {\"type\": \"string\"},\n \"apikey\": {\"type\": \"string\"},\n \"type\": {\"type\": \"string\"},\n \"db\": {\"type\": \"string\", \"title\": \"Database Name\"},\n \"get_schema\": {\n \"type\": \"boolean\",\n \"title\": \"Auto Schema Retrieval\",\n \"default\": False,\n },\n },\n \"required\": [\"apikey\", \"db\"],\n }\n\n @classmethod\n def enabled(cls):\n return enabled\n\n @classmethod\n def type(cls):\n return \"treasuredata\"\n\n def get_schema(self, get_stats=False):\n schema = {}\n if self.configuration.get(\"get_schema\", False):\n try:\n with tdclient.Client(self.configuration.get(\"apikey\")) as client:\n for table in client.tables(self.configuration.get(\"db\")):\n table_name = \"{}.{}\".format(\n self.configuration.get(\"db\"), table.name\n )\n for table_schema in table.schema:\n schema[table_name] = {\n \"name\": table_name,\n \"columns\": [column[0] for column in table.schema],\n }\n except Exception as ex:\n raise Exception(\"Failed getting schema\")\n return list(schema.values())\n\n def run_query(self, query, user):\n connection = tdclient.connect(\n endpoint=self.configuration.get(\"endpoint\", \"https://api.treasuredata.com\"),\n apikey=self.configuration.get(\"apikey\"),\n type=self.configuration.get(\"type\", \"hive\").lower(),\n db=self.configuration.get(\"db\"),\n )\n\n cursor = connection.cursor()\n try:\n cursor.execute(query)\n columns_tuples = [\n (i[0], TD_TYPES_MAPPING.get(i[1], None))\n for i in cursor.show_job()[\"hive_result_schema\"]\n ]\n columns = self.fetch_columns(columns_tuples)\n\n if cursor.rowcount == 0:\n rows = []\n else:\n rows = [\n dict(zip(([column[\"name\"] for column in columns]), r))\n for r in cursor.fetchall()\n ]\n data = {\"columns\": columns, \"rows\": rows}\n json_data = json_dumps(data)\n error = None\n except errors.InternalError as e:\n json_data = None\n error = \"%s: %s\" % (\n str(e),\n cursor.show_job()\n .get(\"debug\", {})\n .get(\"stderr\", \"No stderr message in the response\"),\n )\n return json_data, error\n\n\nregister(TreasureData)\n", "path": "redash/query_runner/treasuredata.py"}], 
"after_files": [{"content": "import logging\n\nfrom redash.query_runner import *\nfrom redash.utils import json_dumps\n\nlogger = logging.getLogger(__name__)\n\ntry:\n import tdclient\n from tdclient import errors\n\n enabled = True\n\nexcept ImportError:\n enabled = False\n\nTD_TYPES_MAPPING = {\n \"bigint\": TYPE_INTEGER,\n \"tinyint\": TYPE_INTEGER,\n \"smallint\": TYPE_INTEGER,\n \"int\": TYPE_INTEGER,\n \"integer\": TYPE_INTEGER,\n \"long\": TYPE_INTEGER,\n \"double\": TYPE_FLOAT,\n \"decimal\": TYPE_FLOAT,\n \"float\": TYPE_FLOAT,\n \"real\": TYPE_FLOAT,\n \"boolean\": TYPE_BOOLEAN,\n \"timestamp\": TYPE_DATETIME,\n \"date\": TYPE_DATETIME,\n \"char\": TYPE_STRING,\n \"string\": TYPE_STRING,\n \"varchar\": TYPE_STRING,\n}\n\n\nclass TreasureData(BaseQueryRunner):\n should_annotate_query = False\n noop_query = \"SELECT 1\"\n\n @classmethod\n def configuration_schema(cls):\n return {\n \"type\": \"object\",\n \"properties\": {\n \"endpoint\": {\"type\": \"string\"},\n \"apikey\": {\"type\": \"string\"},\n \"type\": {\"type\": \"string\"},\n \"db\": {\"type\": \"string\", \"title\": \"Database Name\"},\n \"get_schema\": {\n \"type\": \"boolean\",\n \"title\": \"Auto Schema Retrieval\",\n \"default\": False,\n },\n },\n \"required\": [\"apikey\", \"db\"],\n }\n\n @classmethod\n def enabled(cls):\n return enabled\n\n @classmethod\n def type(cls):\n return \"treasuredata\"\n\n def get_schema(self, get_stats=False):\n schema = {}\n if self.configuration.get(\"get_schema\", False):\n try:\n with tdclient.Client(self.configuration.get(\"apikey\"),endpoint=self.configuration.get(\"endpoint\")) as client:\n for table in client.tables(self.configuration.get(\"db\")):\n table_name = \"{}.{}\".format(\n self.configuration.get(\"db\"), table.name\n )\n for table_schema in table.schema:\n schema[table_name] = {\n \"name\": table_name,\n \"columns\": [column[0] for column in table.schema],\n }\n except Exception as ex:\n raise Exception(\"Failed getting schema\")\n return list(schema.values())\n\n def run_query(self, query, user):\n connection = tdclient.connect(\n endpoint=self.configuration.get(\"endpoint\", \"https://api.treasuredata.com\"),\n apikey=self.configuration.get(\"apikey\"),\n type=self.configuration.get(\"type\", \"hive\").lower(),\n db=self.configuration.get(\"db\"),\n )\n\n cursor = connection.cursor()\n try:\n cursor.execute(query)\n columns_tuples = [\n (i[0], TD_TYPES_MAPPING.get(i[1], None))\n for i in cursor.show_job()[\"hive_result_schema\"]\n ]\n columns = self.fetch_columns(columns_tuples)\n\n if cursor.rowcount == 0:\n rows = []\n else:\n rows = [\n dict(zip(([column[\"name\"] for column in columns]), r))\n for r in cursor.fetchall()\n ]\n data = {\"columns\": columns, \"rows\": rows}\n json_data = json_dumps(data)\n error = None\n except errors.InternalError as e:\n json_data = None\n error = \"%s: %s\" % (\n str(e),\n cursor.show_job()\n .get(\"debug\", {})\n .get(\"stderr\", \"No stderr message in the response\"),\n )\n return json_data, error\n\n\nregister(TreasureData)\n", "path": "redash/query_runner/treasuredata.py"}]}
| 1,691 | 147 |
gh_patches_debug_34692
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-3747
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Checkov failing to perform a check against a Terraform Plan when skipping a specific AWS check
**Describe the issue**
Checkov failing to perform a check against a Terraform Plan when skipping a specific AWS check
**Examples**
```
resource "aws_iam_role" "backend" {
#checkov:skip=CKV_AWS_274:TODO Generate a policy from CloudTrail later
name = "${var.repo}-foo"
assume_role_policy = data.aws_iam_policy_document.backend-assume.json
managed_policy_arns = ["arn:aws:iam::aws:policy/AdministratorAccess"]
inline_policy {
name = "foo"
policy = data.aws_iam_policy_document.backend-permissions.json
}
tags = {
component = var.foo
}
}
```
**Exception Trace**
```
Traceback (most recent call last):
File "/usr/local/bin/checkov", line 9, in <module>
sys.exit(run())
File "/usr/local/lib/python3.9/site-packages/checkov/main.py", line 355, in run
scan_reports = runner_registry.run(external_checks_dir=external_checks_dir, files=config.file,
File "/usr/local/lib/python3.9/site-packages/checkov/common/runners/runner_registry.py", line 79, in run
self.runners[0].run(root_folder, external_checks_dir=external_checks_dir, files=files,
File "/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py", line 81, in run
self.check_tf_definition(report, root_folder, runner_filter)
File "/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py", line 99, in check_tf_definition
self.run_block(definition[block_type], None, full_file_path, root_folder, report, scanned_file,
File "/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py", line 119, in run_block
results = registry.scan(scanned_file, entity, [], runner_filter, report_type=CheckType.TERRAFORM_PLAN)
File "/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check_registry.py", line 127, in scan
result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)
File "/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check_registry.py", line 141, in run_check
result = check.run(
File "/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check.py", line 70, in run
check_result["result"] = self.scan_entity_conf(entity_configuration, entity_type)
File "/usr/local/lib/python3.9/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 43, in scan_entity_conf
return self.scan_resource_conf(conf)
File "/usr/local/lib/python3.9/site-packages/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py", line 42, in scan_resource_conf
if conf.get("policy_arn")[0] == ADMIN_POLICY_ARN:
TypeError: 'NoneType' object is not subscriptable
```
**Desktop (please complete the following information):**
- MacOS 11.7
- Checkov Version 2.2.0
**Additional context**
This fails as of whateverv version CKV_AWS_274 was added. Last time a build didn't crash I was using 2.1.294 and it worked.
Also if I skip it with a command-line switch then this crash does not happen (which is going to be my temp workaround)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py`
Content:
```
1 from checkov.common.models.enums import CheckResult, CheckCategories
2 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
3
4
5 ADMIN_POLICY_NAME = "AdministratorAccess"
6 ADMIN_POLICY_ARN = f"arn:aws:iam::aws:policy/{ADMIN_POLICY_NAME}"
7
8
9 class IAMManagedAdminPolicy(BaseResourceCheck):
10 def __init__(self):
11 # This is the full description of your check
12 description = "Disallow IAM roles, users, and groups from using the AWS AdministratorAccess policy"
13
14 # This is the Unique ID for your check
15 id = "CKV_AWS_274"
16
17 # These are the terraform objects supported by this check (ex: aws_iam_policy_document)
18 supported_resources = (
19 "aws_iam_role",
20 "aws_iam_policy_attachment",
21 "aws_iam_role_policy_attachment",
22 "aws_iam_user_policy_attachment",
23 "aws_iam_group_policy_attachment",
24 )
25
26 # Valid CheckCategories are defined in checkov/common/models/enums.py
27 categories = (CheckCategories.IAM,)
28 super().__init__(name=description, id=id, categories=categories, supported_resources=supported_resources)
29
30 def scan_resource_conf(self, conf):
31 if self.entity_type == "aws_iam_role":
32 if "managed_policy_arns" in conf.keys():
33 if ADMIN_POLICY_ARN in conf.get("managed_policy_arns")[0]:
34 return CheckResult.FAILED
35
36 elif self.entity_type in (
37 "aws_iam_policy_attachment",
38 "aws_iam_role_policy_attachment",
39 "aws_iam_user_policy_attachment",
40 "aws_iam_group_policy_attachment",
41 ):
42 if conf.get("policy_arn")[0] == ADMIN_POLICY_ARN:
43 return CheckResult.FAILED
44
45 return CheckResult.PASSED
46
47
48 check = IAMManagedAdminPolicy()
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py b/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py
--- a/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py
+++ b/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py
@@ -1,3 +1,7 @@
+from __future__ import annotations
+
+from typing import Any
+
from checkov.common.models.enums import CheckResult, CheckCategories
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
@@ -7,7 +11,7 @@
class IAMManagedAdminPolicy(BaseResourceCheck):
- def __init__(self):
+ def __init__(self) -> None:
# This is the full description of your check
description = "Disallow IAM roles, users, and groups from using the AWS AdministratorAccess policy"
@@ -27,10 +31,10 @@
categories = (CheckCategories.IAM,)
super().__init__(name=description, id=id, categories=categories, supported_resources=supported_resources)
- def scan_resource_conf(self, conf):
+ def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:
if self.entity_type == "aws_iam_role":
if "managed_policy_arns" in conf.keys():
- if ADMIN_POLICY_ARN in conf.get("managed_policy_arns")[0]:
+ if ADMIN_POLICY_ARN in conf["managed_policy_arns"][0]:
return CheckResult.FAILED
elif self.entity_type in (
@@ -39,10 +43,11 @@
"aws_iam_user_policy_attachment",
"aws_iam_group_policy_attachment",
):
- if conf.get("policy_arn")[0] == ADMIN_POLICY_ARN:
+ policy_arn = conf.get("policy_arn")
+ if policy_arn and policy_arn[0] == ADMIN_POLICY_ARN:
return CheckResult.FAILED
return CheckResult.PASSED
-check = IAMManagedAdminPolicy()
\ No newline at end of file
+check = IAMManagedAdminPolicy()
|
{"golden_diff": "diff --git a/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py b/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py\n--- a/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py\n+++ b/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py\n@@ -1,3 +1,7 @@\n+from __future__ import annotations\n+\n+from typing import Any\n+\n from checkov.common.models.enums import CheckResult, CheckCategories\n from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n \n@@ -7,7 +11,7 @@\n \n \n class IAMManagedAdminPolicy(BaseResourceCheck):\n- def __init__(self):\n+ def __init__(self) -> None:\n # This is the full description of your check\n description = \"Disallow IAM roles, users, and groups from using the AWS AdministratorAccess policy\"\n \n@@ -27,10 +31,10 @@\n categories = (CheckCategories.IAM,)\n super().__init__(name=description, id=id, categories=categories, supported_resources=supported_resources)\n \n- def scan_resource_conf(self, conf):\n+ def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:\n if self.entity_type == \"aws_iam_role\":\n if \"managed_policy_arns\" in conf.keys():\n- if ADMIN_POLICY_ARN in conf.get(\"managed_policy_arns\")[0]:\n+ if ADMIN_POLICY_ARN in conf[\"managed_policy_arns\"][0]:\n return CheckResult.FAILED\n \n elif self.entity_type in (\n@@ -39,10 +43,11 @@\n \"aws_iam_user_policy_attachment\",\n \"aws_iam_group_policy_attachment\",\n ):\n- if conf.get(\"policy_arn\")[0] == ADMIN_POLICY_ARN:\n+ policy_arn = conf.get(\"policy_arn\")\n+ if policy_arn and policy_arn[0] == ADMIN_POLICY_ARN:\n return CheckResult.FAILED\n \n return CheckResult.PASSED\n \n \n-check = IAMManagedAdminPolicy()\n\\ No newline at end of file\n+check = IAMManagedAdminPolicy()\n", "issue": "Checkov failing to perform a check against a Terraform Plan when skipping a specific AWS check\n**Describe the issue**\r\nCheckov failing to perform a check against a Terraform Plan when skipping a specific AWS check\r\n\r\n**Examples**\r\n```\r\nresource \"aws_iam_role\" \"backend\" {\r\n #checkov:skip=CKV_AWS_274:TODO Generate a policy from CloudTrail later\r\n name = \"${var.repo}-foo\"\r\n assume_role_policy = data.aws_iam_policy_document.backend-assume.json\r\n managed_policy_arns = [\"arn:aws:iam::aws:policy/AdministratorAccess\"]\r\n inline_policy {\r\n name = \"foo\"\r\n policy = data.aws_iam_policy_document.backend-permissions.json\r\n }\r\n tags = {\r\n component = var.foo\r\n }\r\n}\r\n```\r\n\r\n**Exception Trace**\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/checkov\", line 9, in <module>\r\n sys.exit(run())\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/main.py\", line 355, in run\r\n scan_reports = runner_registry.run(external_checks_dir=external_checks_dir, files=config.file,\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/common/runners/runner_registry.py\", line 79, in run\r\n self.runners[0].run(root_folder, external_checks_dir=external_checks_dir, files=files,\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py\", line 81, in run\r\n self.check_tf_definition(report, root_folder, runner_filter)\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py\", line 99, in check_tf_definition\r\n self.run_block(definition[block_type], None, full_file_path, root_folder, report, scanned_file,\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/terraform/plan_runner.py\", line 119, in 
run_block\r\n results = registry.scan(scanned_file, entity, [], runner_filter, report_type=CheckType.TERRAFORM_PLAN)\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check_registry.py\", line 127, in scan\r\n result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check_registry.py\", line 141, in run_check\r\n result = check.run(\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/common/checks/base_check.py\", line 70, in run\r\n check_result[\"result\"] = self.scan_entity_conf(entity_configuration, entity_type)\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 43, in scan_entity_conf\r\n return self.scan_resource_conf(conf)\r\n File \"/usr/local/lib/python3.9/site-packages/checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py\", line 42, in scan_resource_conf\r\n if conf.get(\"policy_arn\")[0] == ADMIN_POLICY_ARN:\r\nTypeError: 'NoneType' object is not subscriptable\r\n```\r\n\r\n**Desktop (please complete the following information):**\r\n - MacOS 11.7\r\n - Checkov Version 2.2.0\r\n\r\n**Additional context**\r\nThis fails as of whateverv version CKV_AWS_274 was added. Last time a build didn't crash I was using 2.1.294 and it worked.\r\nAlso if I skip it with a command-line switch then this crash does not happen (which is going to be my temp workaround)\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nADMIN_POLICY_NAME = \"AdministratorAccess\"\nADMIN_POLICY_ARN = f\"arn:aws:iam::aws:policy/{ADMIN_POLICY_NAME}\"\n\n\nclass IAMManagedAdminPolicy(BaseResourceCheck):\n def __init__(self):\n # This is the full description of your check\n description = \"Disallow IAM roles, users, and groups from using the AWS AdministratorAccess policy\"\n\n # This is the Unique ID for your check\n id = \"CKV_AWS_274\"\n\n # These are the terraform objects supported by this check (ex: aws_iam_policy_document)\n supported_resources = (\n \"aws_iam_role\",\n \"aws_iam_policy_attachment\",\n \"aws_iam_role_policy_attachment\",\n \"aws_iam_user_policy_attachment\",\n \"aws_iam_group_policy_attachment\",\n )\n\n # Valid CheckCategories are defined in checkov/common/models/enums.py\n categories = (CheckCategories.IAM,)\n super().__init__(name=description, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if self.entity_type == \"aws_iam_role\":\n if \"managed_policy_arns\" in conf.keys():\n if ADMIN_POLICY_ARN in conf.get(\"managed_policy_arns\")[0]:\n return CheckResult.FAILED\n\n elif self.entity_type in (\n \"aws_iam_policy_attachment\",\n \"aws_iam_role_policy_attachment\",\n \"aws_iam_user_policy_attachment\",\n \"aws_iam_group_policy_attachment\",\n ):\n if conf.get(\"policy_arn\")[0] == ADMIN_POLICY_ARN:\n return CheckResult.FAILED\n\n return CheckResult.PASSED\n\n\ncheck = IAMManagedAdminPolicy()", "path": "checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any\n\nfrom checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nADMIN_POLICY_NAME = 
\"AdministratorAccess\"\nADMIN_POLICY_ARN = f\"arn:aws:iam::aws:policy/{ADMIN_POLICY_NAME}\"\n\n\nclass IAMManagedAdminPolicy(BaseResourceCheck):\n def __init__(self) -> None:\n # This is the full description of your check\n description = \"Disallow IAM roles, users, and groups from using the AWS AdministratorAccess policy\"\n\n # This is the Unique ID for your check\n id = \"CKV_AWS_274\"\n\n # These are the terraform objects supported by this check (ex: aws_iam_policy_document)\n supported_resources = (\n \"aws_iam_role\",\n \"aws_iam_policy_attachment\",\n \"aws_iam_role_policy_attachment\",\n \"aws_iam_user_policy_attachment\",\n \"aws_iam_group_policy_attachment\",\n )\n\n # Valid CheckCategories are defined in checkov/common/models/enums.py\n categories = (CheckCategories.IAM,)\n super().__init__(name=description, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:\n if self.entity_type == \"aws_iam_role\":\n if \"managed_policy_arns\" in conf.keys():\n if ADMIN_POLICY_ARN in conf[\"managed_policy_arns\"][0]:\n return CheckResult.FAILED\n\n elif self.entity_type in (\n \"aws_iam_policy_attachment\",\n \"aws_iam_role_policy_attachment\",\n \"aws_iam_user_policy_attachment\",\n \"aws_iam_group_policy_attachment\",\n ):\n policy_arn = conf.get(\"policy_arn\")\n if policy_arn and policy_arn[0] == ADMIN_POLICY_ARN:\n return CheckResult.FAILED\n\n return CheckResult.PASSED\n\n\ncheck = IAMManagedAdminPolicy()\n", "path": "checkov/terraform/checks/resource/aws/IAMManagedAdminPolicy.py"}]}
| 1,566 | 472 |
gh_patches_debug_40754
|
rasdani/github-patches
|
git_diff
|
qtile__qtile-4716
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Clock and tzupdate
### The issue:
qtile version:
0.21.0
These days I need to travel with my system and have the need to update my timezone in order to work with my calendar. I'm using `tzupdate` which easily updates my timezone with one command.
I'm using the Clock widget in the qtile bar as so:
``` python
widget.Clock(format="%A %d %b %Y %H:%M:%S %z"),
```
Updating the timezone with `tzupdate` however does not change the timezone on the Clock widget. It requires restarting qtile in order to get this done. I would expect qtile to poll the current system timezone at each tik.
### Required:
- [X] I have searched past issues to see if this bug has already been reported.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/clock.py`
Content:
```
1 # Copyright (c) 2010 Aldo Cortesi
2 # Copyright (c) 2012 Andrew Grigorev
3 # Copyright (c) 2014 Sean Vig
4 # Copyright (c) 2014 Tycho Andersen
5 #
6 # Permission is hereby granted, free of charge, to any person obtaining a copy
7 # of this software and associated documentation files (the "Software"), to deal
8 # in the Software without restriction, including without limitation the rights
9 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
10 # copies of the Software, and to permit persons to whom the Software is
11 # furnished to do so, subject to the following conditions:
12 #
13 # The above copyright notice and this permission notice shall be included in
14 # all copies or substantial portions of the Software.
15 #
16 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
17 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
18 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
19 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
20 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
21 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
22 # SOFTWARE.
23
24 import sys
25 import time
26 from datetime import datetime, timedelta, timezone
27
28 from libqtile.log_utils import logger
29 from libqtile.widget import base
30
31 try:
32 import pytz
33 except ImportError:
34 pass
35
36 try:
37 import dateutil.tz
38 except ImportError:
39 pass
40
41
42 class Clock(base.InLoopPollText):
43 """A simple but flexible text-based clock"""
44
45 defaults = [
46 ("format", "%H:%M", "A Python datetime format string"),
47 ("update_interval", 1.0, "Update interval for the clock"),
48 (
49 "timezone",
50 None,
51 "The timezone to use for this clock, either as"
52 ' string if pytz or dateutil is installed (e.g. "US/Central" or'
53 " anything in /usr/share/zoneinfo), or as tzinfo (e.g."
54 " datetime.timezone.utc). None means the system local timezone and is"
55 " the default.",
56 ),
57 ]
58 DELTA = timedelta(seconds=0.5)
59
60 def __init__(self, **config):
61 base.InLoopPollText.__init__(self, **config)
62 self.add_defaults(Clock.defaults)
63 if isinstance(self.timezone, str):
64 if "pytz" in sys.modules:
65 self.timezone = pytz.timezone(self.timezone)
66 elif "dateutil" in sys.modules:
67 self.timezone = dateutil.tz.gettz(self.timezone)
68 else:
69 logger.warning(
70 "Clock widget can not infer its timezone from a"
71 " string without pytz or dateutil. Install one"
72 " of these libraries, or give it a"
73 " datetime.tzinfo instance."
74 )
75 if self.timezone is None:
76 logger.debug("Defaulting to the system local timezone.")
77
78 def tick(self):
79 self.update(self.poll())
80 return self.update_interval - time.time() % self.update_interval
81
82 # adding .5 to get a proper seconds value because glib could
83 # theoreticaly call our method too early and we could get something
84 # like (x-1).999 instead of x.000
85 def poll(self):
86 if self.timezone:
87 now = datetime.now(timezone.utc).astimezone(self.timezone)
88 else:
89 now = datetime.now(timezone.utc).astimezone()
90 return (now + self.DELTA).strftime(self.format)
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/libqtile/widget/clock.py b/libqtile/widget/clock.py
--- a/libqtile/widget/clock.py
+++ b/libqtile/widget/clock.py
@@ -20,11 +20,13 @@
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
+from __future__ import annotations
import sys
import time
-from datetime import datetime, timedelta, timezone
+from datetime import datetime, timedelta, timezone, tzinfo
+from libqtile.command.base import expose_command
from libqtile.log_utils import logger
from libqtile.widget import base
@@ -60,11 +62,24 @@
def __init__(self, **config):
base.InLoopPollText.__init__(self, **config)
self.add_defaults(Clock.defaults)
- if isinstance(self.timezone, str):
+ self.timezone = self._lift_timezone(self.timezone)
+
+ if self.timezone is None:
+ logger.debug("Defaulting to the system local timezone.")
+
+ def _lift_timezone(self, timezone):
+ if isinstance(timezone, tzinfo):
+ return timezone
+ elif isinstance(timezone, str):
+ # Empty string can be used to force use of system time
+ if not timezone:
+ return None
+
+ # A string timezone needs to be converted to a tzinfo object
if "pytz" in sys.modules:
- self.timezone = pytz.timezone(self.timezone)
+ return pytz.timezone(timezone)
elif "dateutil" in sys.modules:
- self.timezone = dateutil.tz.gettz(self.timezone)
+ return dateutil.tz.gettz(timezone)
else:
logger.warning(
"Clock widget can not infer its timezone from a"
@@ -72,8 +87,12 @@
" of these libraries, or give it a"
" datetime.tzinfo instance."
)
- if self.timezone is None:
- logger.debug("Defaulting to the system local timezone.")
+ elif timezone is None:
+ pass
+ else:
+ logger.warning("Invalid timezone value %s.", timezone)
+
+ return None
def tick(self):
self.update(self.poll())
@@ -88,3 +107,27 @@
else:
now = datetime.now(timezone.utc).astimezone()
return (now + self.DELTA).strftime(self.format)
+
+ @expose_command
+ def update_timezone(self, timezone: str | tzinfo | None = None):
+ """
+ Force the clock to update timezone information.
+
+ If the method is called with no arguments then the widget will reload
+ the timzeone set on the computer (e.g. via ``timedatectl set-timezone ..``).
+ This will have no effect if you have previously set a ``timezone`` value.
+
+ Alternatively, you can pass a timezone string (e.g. ``"Europe/Lisbon"``) to change
+ the specified timezone. Setting this to an empty string will cause the clock
+ to rely on the system timezone.
+ """
+ self.timezone = self._lift_timezone(timezone)
+
+ # Force python to update timezone info (e.g. if system time has changed)
+ time.tzset()
+ self.update(self.poll())
+
+ @expose_command
+ def use_system_timezone(self):
+ """Force clock to use system timezone."""
+ self.update_timezone("")
|
{"golden_diff": "diff --git a/libqtile/widget/clock.py b/libqtile/widget/clock.py\n--- a/libqtile/widget/clock.py\n+++ b/libqtile/widget/clock.py\n@@ -20,11 +20,13 @@\n # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n # SOFTWARE.\n+from __future__ import annotations\n \n import sys\n import time\n-from datetime import datetime, timedelta, timezone\n+from datetime import datetime, timedelta, timezone, tzinfo\n \n+from libqtile.command.base import expose_command\n from libqtile.log_utils import logger\n from libqtile.widget import base\n \n@@ -60,11 +62,24 @@\n def __init__(self, **config):\n base.InLoopPollText.__init__(self, **config)\n self.add_defaults(Clock.defaults)\n- if isinstance(self.timezone, str):\n+ self.timezone = self._lift_timezone(self.timezone)\n+\n+ if self.timezone is None:\n+ logger.debug(\"Defaulting to the system local timezone.\")\n+\n+ def _lift_timezone(self, timezone):\n+ if isinstance(timezone, tzinfo):\n+ return timezone\n+ elif isinstance(timezone, str):\n+ # Empty string can be used to force use of system time\n+ if not timezone:\n+ return None\n+\n+ # A string timezone needs to be converted to a tzinfo object\n if \"pytz\" in sys.modules:\n- self.timezone = pytz.timezone(self.timezone)\n+ return pytz.timezone(timezone)\n elif \"dateutil\" in sys.modules:\n- self.timezone = dateutil.tz.gettz(self.timezone)\n+ return dateutil.tz.gettz(timezone)\n else:\n logger.warning(\n \"Clock widget can not infer its timezone from a\"\n@@ -72,8 +87,12 @@\n \" of these libraries, or give it a\"\n \" datetime.tzinfo instance.\"\n )\n- if self.timezone is None:\n- logger.debug(\"Defaulting to the system local timezone.\")\n+ elif timezone is None:\n+ pass\n+ else:\n+ logger.warning(\"Invalid timezone value %s.\", timezone)\n+\n+ return None\n \n def tick(self):\n self.update(self.poll())\n@@ -88,3 +107,27 @@\n else:\n now = datetime.now(timezone.utc).astimezone()\n return (now + self.DELTA).strftime(self.format)\n+\n+ @expose_command\n+ def update_timezone(self, timezone: str | tzinfo | None = None):\n+ \"\"\"\n+ Force the clock to update timezone information.\n+\n+ If the method is called with no arguments then the widget will reload\n+ the timzeone set on the computer (e.g. via ``timedatectl set-timezone ..``).\n+ This will have no effect if you have previously set a ``timezone`` value.\n+\n+ Alternatively, you can pass a timezone string (e.g. ``\"Europe/Lisbon\"``) to change\n+ the specified timezone. Setting this to an empty string will cause the clock\n+ to rely on the system timezone.\n+ \"\"\"\n+ self.timezone = self._lift_timezone(timezone)\n+\n+ # Force python to update timezone info (e.g. if system time has changed)\n+ time.tzset()\n+ self.update(self.poll())\n+\n+ @expose_command\n+ def use_system_timezone(self):\n+ \"\"\"Force clock to use system timezone.\"\"\"\n+ self.update_timezone(\"\")\n", "issue": "Clock and tzupdate \n### The issue:\n\nqtile version:\r\n0.21.0\r\n\r\nThese days I need to travel with my system and have the need to update my timezone in order to work with my calendar. I'm using `tzupdate` which easily updates my timezone with one command. \r\n\r\nI'm using the Clock widget in the qtile bar as so:\r\n``` python \r\n widget.Clock(format=\"%A %d %b %Y %H:%M:%S %z\"),\r\n```\r\n\r\nUpdating the timezone with `tzupdate` however does not change the timezone on the Clock widget. It requires restarting qtile in order to get this done. 
I would expect qtile to poll the current system timezone at each tik.\n\n### Required:\n\n- [X] I have searched past issues to see if this bug has already been reported.\n", "before_files": [{"content": "# Copyright (c) 2010 Aldo Cortesi\n# Copyright (c) 2012 Andrew Grigorev\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Tycho Andersen\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nimport sys\nimport time\nfrom datetime import datetime, timedelta, timezone\n\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\ntry:\n import pytz\nexcept ImportError:\n pass\n\ntry:\n import dateutil.tz\nexcept ImportError:\n pass\n\n\nclass Clock(base.InLoopPollText):\n \"\"\"A simple but flexible text-based clock\"\"\"\n\n defaults = [\n (\"format\", \"%H:%M\", \"A Python datetime format string\"),\n (\"update_interval\", 1.0, \"Update interval for the clock\"),\n (\n \"timezone\",\n None,\n \"The timezone to use for this clock, either as\"\n ' string if pytz or dateutil is installed (e.g. \"US/Central\" or'\n \" anything in /usr/share/zoneinfo), or as tzinfo (e.g.\"\n \" datetime.timezone.utc). None means the system local timezone and is\"\n \" the default.\",\n ),\n ]\n DELTA = timedelta(seconds=0.5)\n\n def __init__(self, **config):\n base.InLoopPollText.__init__(self, **config)\n self.add_defaults(Clock.defaults)\n if isinstance(self.timezone, str):\n if \"pytz\" in sys.modules:\n self.timezone = pytz.timezone(self.timezone)\n elif \"dateutil\" in sys.modules:\n self.timezone = dateutil.tz.gettz(self.timezone)\n else:\n logger.warning(\n \"Clock widget can not infer its timezone from a\"\n \" string without pytz or dateutil. 
Install one\"\n \" of these libraries, or give it a\"\n \" datetime.tzinfo instance.\"\n )\n if self.timezone is None:\n logger.debug(\"Defaulting to the system local timezone.\")\n\n def tick(self):\n self.update(self.poll())\n return self.update_interval - time.time() % self.update_interval\n\n # adding .5 to get a proper seconds value because glib could\n # theoreticaly call our method too early and we could get something\n # like (x-1).999 instead of x.000\n def poll(self):\n if self.timezone:\n now = datetime.now(timezone.utc).astimezone(self.timezone)\n else:\n now = datetime.now(timezone.utc).astimezone()\n return (now + self.DELTA).strftime(self.format)\n", "path": "libqtile/widget/clock.py"}], "after_files": [{"content": "# Copyright (c) 2010 Aldo Cortesi\n# Copyright (c) 2012 Andrew Grigorev\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Tycho Andersen\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\nfrom __future__ import annotations\n\nimport sys\nimport time\nfrom datetime import datetime, timedelta, timezone, tzinfo\n\nfrom libqtile.command.base import expose_command\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\ntry:\n import pytz\nexcept ImportError:\n pass\n\ntry:\n import dateutil.tz\nexcept ImportError:\n pass\n\n\nclass Clock(base.InLoopPollText):\n \"\"\"A simple but flexible text-based clock\"\"\"\n\n defaults = [\n (\"format\", \"%H:%M\", \"A Python datetime format string\"),\n (\"update_interval\", 1.0, \"Update interval for the clock\"),\n (\n \"timezone\",\n None,\n \"The timezone to use for this clock, either as\"\n ' string if pytz or dateutil is installed (e.g. \"US/Central\" or'\n \" anything in /usr/share/zoneinfo), or as tzinfo (e.g.\"\n \" datetime.timezone.utc). 
None means the system local timezone and is\"\n \" the default.\",\n ),\n ]\n DELTA = timedelta(seconds=0.5)\n\n def __init__(self, **config):\n base.InLoopPollText.__init__(self, **config)\n self.add_defaults(Clock.defaults)\n self.timezone = self._lift_timezone(self.timezone)\n\n if self.timezone is None:\n logger.debug(\"Defaulting to the system local timezone.\")\n\n def _lift_timezone(self, timezone):\n if isinstance(timezone, tzinfo):\n return timezone\n elif isinstance(timezone, str):\n # Empty string can be used to force use of system time\n if not timezone:\n return None\n\n # A string timezone needs to be converted to a tzinfo object\n if \"pytz\" in sys.modules:\n return pytz.timezone(timezone)\n elif \"dateutil\" in sys.modules:\n return dateutil.tz.gettz(timezone)\n else:\n logger.warning(\n \"Clock widget can not infer its timezone from a\"\n \" string without pytz or dateutil. Install one\"\n \" of these libraries, or give it a\"\n \" datetime.tzinfo instance.\"\n )\n elif timezone is None:\n pass\n else:\n logger.warning(\"Invalid timezone value %s.\", timezone)\n\n return None\n\n def tick(self):\n self.update(self.poll())\n return self.update_interval - time.time() % self.update_interval\n\n # adding .5 to get a proper seconds value because glib could\n # theoreticaly call our method too early and we could get something\n # like (x-1).999 instead of x.000\n def poll(self):\n if self.timezone:\n now = datetime.now(timezone.utc).astimezone(self.timezone)\n else:\n now = datetime.now(timezone.utc).astimezone()\n return (now + self.DELTA).strftime(self.format)\n\n @expose_command\n def update_timezone(self, timezone: str | tzinfo | None = None):\n \"\"\"\n Force the clock to update timezone information.\n\n If the method is called with no arguments then the widget will reload\n the timzeone set on the computer (e.g. via ``timedatectl set-timezone ..``).\n This will have no effect if you have previously set a ``timezone`` value.\n\n Alternatively, you can pass a timezone string (e.g. ``\"Europe/Lisbon\"``) to change\n the specified timezone. Setting this to an empty string will cause the clock\n to rely on the system timezone.\n \"\"\"\n self.timezone = self._lift_timezone(timezone)\n\n # Force python to update timezone info (e.g. if system time has changed)\n time.tzset()\n self.update(self.poll())\n\n @expose_command\n def use_system_timezone(self):\n \"\"\"Force clock to use system timezone.\"\"\"\n self.update_timezone(\"\")\n", "path": "libqtile/widget/clock.py"}]}
| 1,390 | 786 |
gh_patches_debug_11504
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-3347
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider subway is broken
During the global build at 2021-09-15-14-42-44, spider **subway** failed with **31396 features** and **22 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/logs/subway.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/output/subway.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/output/subway.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/subway.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import scrapy
3 from locations.items import GeojsonPointItem
4 from locations.hours import OpeningHours
5
6 from urllib.parse import urlparse
7 import json
8
9
10 DAY_MAPPING = {
11 "MONDAY": "Mo",
12 "TUESDAY": "Tu",
13 "WEDNESDAY": "We",
14 "THURSDAY": "Th",
15 "FRIDAY": "Fr",
16 "SATURDAY": "Sa",
17 "SUNDAY": "Su",
18 }
19
20
21 class SubwaySpider(scrapy.Spider):
22 name = "subway"
23 item_attributes = {"name": "Subway", "brand": "Subway", "brand_wikidata": "Q244457"}
24 allowed_domains = ["restaurants.subway.com"]
25 start_urls = ["https://restaurants.subway.com/"]
26
27 link_extractor = scrapy.linkextractors.LinkExtractor(
28 restrict_css=".Directory-listLinks, .Directory-listTeasers"
29 )
30
31 def parse(self, response):
32 for link in self.link_extractor.extract_links(response):
33 yield scrapy.Request(link.url)
34
35 js = response.xpath('//script[@class="js-hours-config"]/text()').get()
36 if js:
37 yield from self.parse_restaurant(json.loads(js))
38
39 def parse_restaurant(self, js):
40 # Note: Of the five different coordinate fields, this is the one that always exists
41 lat_long = js["profile"]["yextDisplayCoordinate"]
42 website = urlparse(js["profile"]["websiteUrl"])._replace(query="").geturl()
43 properties = {
44 "lat": lat_long["lat"],
45 "lon": lat_long["long"],
46 "ref": js["profile"]["meta"]["id"],
47 "addr_full": js["profile"]["address"]["line1"],
48 "extras": {
49 "addr:unit": js["profile"]["address"]["line2"],
50 # Note: line3 is always null
51 "loc_name": js["profile"]["address"]["extraDescription"],
52 },
53 "city": js["profile"]["address"]["city"],
54 "state": js["profile"]["address"]["region"],
55 "postcode": js["profile"]["address"]["postalCode"],
56 "country": js["profile"]["address"]["countryCode"],
57 "phone": js["profile"].get("mainPhone", {}).get("number"),
58 "opening_hours": self.parse_hours(js["profile"]["hours"]["normalHours"]),
59 "website": website,
60 }
61 yield GeojsonPointItem(**properties)
62
63 def parse_hours(self, hours_json):
64 opening_hours = OpeningHours()
65 for date in hours_json:
66 day = DAY_MAPPING[date["day"]]
67 for interval in date["intervals"]:
68 start_hr, start_min = divmod(interval["start"], 100)
69 end_hr, end_min = divmod(interval["end"], 100)
70 opening_hours.add_range(
71 day, f"{start_hr}:{start_min}", f"{end_hr}:{end_min}"
72 )
73 return opening_hours.as_opening_hours()
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/subway.py b/locations/spiders/subway.py
--- a/locations/spiders/subway.py
+++ b/locations/spiders/subway.py
@@ -39,7 +39,9 @@
def parse_restaurant(self, js):
# Note: Of the five different coordinate fields, this is the one that always exists
lat_long = js["profile"]["yextDisplayCoordinate"]
- website = urlparse(js["profile"]["websiteUrl"])._replace(query="").geturl()
+ website = ""
+ if 'websiteUrl' in js["profile"]:
+ website = urlparse(js["profile"]["websiteUrl"])._replace(query="").geturl()
properties = {
"lat": lat_long["lat"],
"lon": lat_long["long"],
|
{"golden_diff": "diff --git a/locations/spiders/subway.py b/locations/spiders/subway.py\n--- a/locations/spiders/subway.py\n+++ b/locations/spiders/subway.py\n@@ -39,7 +39,9 @@\n def parse_restaurant(self, js):\n # Note: Of the five different coordinate fields, this is the one that always exists\n lat_long = js[\"profile\"][\"yextDisplayCoordinate\"]\n- website = urlparse(js[\"profile\"][\"websiteUrl\"])._replace(query=\"\").geturl()\n+ website = \"\"\n+ if 'websiteUrl' in js[\"profile\"]:\n+ website = urlparse(js[\"profile\"][\"websiteUrl\"])._replace(query=\"\").geturl()\n properties = {\n \"lat\": lat_long[\"lat\"],\n \"lon\": lat_long[\"long\"],\n", "issue": "Spider subway is broken\nDuring the global build at 2021-09-15-14-42-44, spider **subway** failed with **31396 features** and **22 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/logs/subway.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/output/subway.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-09-15-14-42-44/output/subway.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\nfrom urllib.parse import urlparse\nimport json\n\n\nDAY_MAPPING = {\n \"MONDAY\": \"Mo\",\n \"TUESDAY\": \"Tu\",\n \"WEDNESDAY\": \"We\",\n \"THURSDAY\": \"Th\",\n \"FRIDAY\": \"Fr\",\n \"SATURDAY\": \"Sa\",\n \"SUNDAY\": \"Su\",\n}\n\n\nclass SubwaySpider(scrapy.Spider):\n name = \"subway\"\n item_attributes = {\"name\": \"Subway\", \"brand\": \"Subway\", \"brand_wikidata\": \"Q244457\"}\n allowed_domains = [\"restaurants.subway.com\"]\n start_urls = [\"https://restaurants.subway.com/\"]\n\n link_extractor = scrapy.linkextractors.LinkExtractor(\n restrict_css=\".Directory-listLinks, .Directory-listTeasers\"\n )\n\n def parse(self, response):\n for link in self.link_extractor.extract_links(response):\n yield scrapy.Request(link.url)\n\n js = response.xpath('//script[@class=\"js-hours-config\"]/text()').get()\n if js:\n yield from self.parse_restaurant(json.loads(js))\n\n def parse_restaurant(self, js):\n # Note: Of the five different coordinate fields, this is the one that always exists\n lat_long = js[\"profile\"][\"yextDisplayCoordinate\"]\n website = urlparse(js[\"profile\"][\"websiteUrl\"])._replace(query=\"\").geturl()\n properties = {\n \"lat\": lat_long[\"lat\"],\n \"lon\": lat_long[\"long\"],\n \"ref\": js[\"profile\"][\"meta\"][\"id\"],\n \"addr_full\": js[\"profile\"][\"address\"][\"line1\"],\n \"extras\": {\n \"addr:unit\": js[\"profile\"][\"address\"][\"line2\"],\n # Note: line3 is always null\n \"loc_name\": js[\"profile\"][\"address\"][\"extraDescription\"],\n },\n \"city\": js[\"profile\"][\"address\"][\"city\"],\n \"state\": js[\"profile\"][\"address\"][\"region\"],\n \"postcode\": js[\"profile\"][\"address\"][\"postalCode\"],\n \"country\": js[\"profile\"][\"address\"][\"countryCode\"],\n \"phone\": js[\"profile\"].get(\"mainPhone\", {}).get(\"number\"),\n \"opening_hours\": self.parse_hours(js[\"profile\"][\"hours\"][\"normalHours\"]),\n \"website\": website,\n }\n yield GeojsonPointItem(**properties)\n\n def parse_hours(self, hours_json):\n opening_hours = OpeningHours()\n for date in hours_json:\n day = DAY_MAPPING[date[\"day\"]]\n for interval in date[\"intervals\"]:\n start_hr, start_min = divmod(interval[\"start\"], 100)\n end_hr, end_min = divmod(interval[\"end\"], 100)\n 
opening_hours.add_range(\n day, f\"{start_hr}:{start_min}\", f\"{end_hr}:{end_min}\"\n )\n return opening_hours.as_opening_hours()\n", "path": "locations/spiders/subway.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\nfrom urllib.parse import urlparse\nimport json\n\n\nDAY_MAPPING = {\n \"MONDAY\": \"Mo\",\n \"TUESDAY\": \"Tu\",\n \"WEDNESDAY\": \"We\",\n \"THURSDAY\": \"Th\",\n \"FRIDAY\": \"Fr\",\n \"SATURDAY\": \"Sa\",\n \"SUNDAY\": \"Su\",\n}\n\n\nclass SubwaySpider(scrapy.Spider):\n name = \"subway\"\n item_attributes = {\"name\": \"Subway\", \"brand\": \"Subway\", \"brand_wikidata\": \"Q244457\"}\n allowed_domains = [\"restaurants.subway.com\"]\n start_urls = [\"https://restaurants.subway.com/\"]\n\n link_extractor = scrapy.linkextractors.LinkExtractor(\n restrict_css=\".Directory-listLinks, .Directory-listTeasers\"\n )\n\n def parse(self, response):\n for link in self.link_extractor.extract_links(response):\n yield scrapy.Request(link.url)\n\n js = response.xpath('//script[@class=\"js-hours-config\"]/text()').get()\n if js:\n yield from self.parse_restaurant(json.loads(js))\n\n def parse_restaurant(self, js):\n # Note: Of the five different coordinate fields, this is the one that always exists\n lat_long = js[\"profile\"][\"yextDisplayCoordinate\"]\n website = \"\"\n if 'websiteUrl' in js[\"profile\"]:\n website = urlparse(js[\"profile\"][\"websiteUrl\"])._replace(query=\"\").geturl()\n properties = {\n \"lat\": lat_long[\"lat\"],\n \"lon\": lat_long[\"long\"],\n \"ref\": js[\"profile\"][\"meta\"][\"id\"],\n \"addr_full\": js[\"profile\"][\"address\"][\"line1\"],\n \"extras\": {\n \"addr:unit\": js[\"profile\"][\"address\"][\"line2\"],\n # Note: line3 is always null\n \"loc_name\": js[\"profile\"][\"address\"][\"extraDescription\"],\n },\n \"city\": js[\"profile\"][\"address\"][\"city\"],\n \"state\": js[\"profile\"][\"address\"][\"region\"],\n \"postcode\": js[\"profile\"][\"address\"][\"postalCode\"],\n \"country\": js[\"profile\"][\"address\"][\"countryCode\"],\n \"phone\": js[\"profile\"].get(\"mainPhone\", {}).get(\"number\"),\n \"opening_hours\": self.parse_hours(js[\"profile\"][\"hours\"][\"normalHours\"]),\n \"website\": website,\n }\n yield GeojsonPointItem(**properties)\n\n def parse_hours(self, hours_json):\n opening_hours = OpeningHours()\n for date in hours_json:\n day = DAY_MAPPING[date[\"day\"]]\n for interval in date[\"intervals\"]:\n start_hr, start_min = divmod(interval[\"start\"], 100)\n end_hr, end_min = divmod(interval[\"end\"], 100)\n opening_hours.add_range(\n day, f\"{start_hr}:{start_min}\", f\"{end_hr}:{end_min}\"\n )\n return opening_hours.as_opening_hours()\n", "path": "locations/spiders/subway.py"}]}
| 1,224 | 172 |
gh_patches_debug_1207
|
rasdani/github-patches
|
git_diff
|
pytorch__vision-2933
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Change default value of eps in FrozenBatchNorm to match BatchNorm
## ❓ Questions and Help
Hello
A "Loss is nan" error occurs when I train Faster R-CNN with a resnext101 backbone
My code is as follows
```python
backbone = resnet_fpn_backbone('resnext101_32x8d', pretrained=True)
model = FasterRCNN(backbone, num_classes)
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
```
error message
```
Epoch: [0] [ 0/7208] eta: 1:27:42 lr: 0.000040 loss: 40613806080.0000 (40613806080.0000) loss_box_reg: 7979147264.0000 (7979147264.0000) loss_classifier: 11993160704.0000 (11993160704.0000) loss_objectness: 9486380032.0000 (9486380032.0000) loss_rpn_box_reg: 11155118080.0000 (11155118080.0000) time: 0.7301 data: 0.4106 max mem: 1241
Loss is nan, stopping training
```
When I change the backbone to resnet50 or resnet152, no error occurs.
### Please note that this issue tracker is not a help form and this issue will be closed.
We have a set of [listed resources available on the website](https://pytorch.org/resources). Our primary means of support is our discussion forum:
- [Discussion Forum](https://discuss.pytorch.org/)
--- END ISSUE ---
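A minimal sketch (not from the issue, plain PyTorch only) of why a zero `eps` can explain the NaNs above: the frozen scale is computed as `w * (running_var + eps).rsqrt()`, so a channel whose frozen variance is exactly zero gets an infinite scale when `eps` is 0, while BatchNorm's usual `1e-5` keeps it finite.
```python
# Sketch: effect of eps on the frozen-BN scale when a running variance is zero.
import torch

running_var = torch.tensor([0.0, 1.0])   # one channel with zero frozen variance
weight = torch.ones(2)

scale_eps_zero = weight * (running_var + 0.0).rsqrt()   # tensor([inf, 1.])
scale_eps_1e5 = weight * (running_var + 1e-5).rsqrt()   # finite for both channels

print(scale_eps_zero)   # the inf propagates into activations and then the loss
print(scale_eps_1e5)
```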
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/ops/misc.py`
Content:
```
1 """
2 helper class that supports empty tensors on some nn functions.
3
4 Ideally, add support directly in PyTorch to empty tensors in
5 those functions.
6
7 This can be removed once https://github.com/pytorch/pytorch/issues/12013
8 is implemented
9 """
10
11 import warnings
12 import torch
13 from torch import Tensor, Size
14 from torch.jit.annotations import List, Optional, Tuple
15
16
17 class Conv2d(torch.nn.Conv2d):
18 def __init__(self, *args, **kwargs):
19 super().__init__(*args, **kwargs)
20 warnings.warn(
21 "torchvision.ops.misc.Conv2d is deprecated and will be "
22 "removed in future versions, use torch.nn.Conv2d instead.", FutureWarning)
23
24
25 class ConvTranspose2d(torch.nn.ConvTranspose2d):
26 def __init__(self, *args, **kwargs):
27 super().__init__(*args, **kwargs)
28 warnings.warn(
29 "torchvision.ops.misc.ConvTranspose2d is deprecated and will be "
30 "removed in future versions, use torch.nn.ConvTranspose2d instead.", FutureWarning)
31
32
33 class BatchNorm2d(torch.nn.BatchNorm2d):
34 def __init__(self, *args, **kwargs):
35 super().__init__(*args, **kwargs)
36 warnings.warn(
37 "torchvision.ops.misc.BatchNorm2d is deprecated and will be "
38 "removed in future versions, use torch.nn.BatchNorm2d instead.", FutureWarning)
39
40
41 interpolate = torch.nn.functional.interpolate
42
43
44 # This is not in nn
45 class FrozenBatchNorm2d(torch.nn.Module):
46 """
47 BatchNorm2d where the batch statistics and the affine parameters
48 are fixed
49 """
50
51 def __init__(
52 self,
53 num_features: int,
54 eps: float = 0.,
55 n: Optional[int] = None,
56 ):
57 # n=None for backward-compatibility
58 if n is not None:
59 warnings.warn("`n` argument is deprecated and has been renamed `num_features`",
60 DeprecationWarning)
61 num_features = n
62 super(FrozenBatchNorm2d, self).__init__()
63 self.eps = eps
64 self.register_buffer("weight", torch.ones(num_features))
65 self.register_buffer("bias", torch.zeros(num_features))
66 self.register_buffer("running_mean", torch.zeros(num_features))
67 self.register_buffer("running_var", torch.ones(num_features))
68
69 def _load_from_state_dict(
70 self,
71 state_dict: dict,
72 prefix: str,
73 local_metadata: dict,
74 strict: bool,
75 missing_keys: List[str],
76 unexpected_keys: List[str],
77 error_msgs: List[str],
78 ):
79 num_batches_tracked_key = prefix + 'num_batches_tracked'
80 if num_batches_tracked_key in state_dict:
81 del state_dict[num_batches_tracked_key]
82
83 super(FrozenBatchNorm2d, self)._load_from_state_dict(
84 state_dict, prefix, local_metadata, strict,
85 missing_keys, unexpected_keys, error_msgs)
86
87 def forward(self, x: Tensor) -> Tensor:
88 # move reshapes to the beginning
89 # to make it fuser-friendly
90 w = self.weight.reshape(1, -1, 1, 1)
91 b = self.bias.reshape(1, -1, 1, 1)
92 rv = self.running_var.reshape(1, -1, 1, 1)
93 rm = self.running_mean.reshape(1, -1, 1, 1)
94 scale = w * (rv + self.eps).rsqrt()
95 bias = b - rm * scale
96 return x * scale + bias
97
98 def __repr__(self) -> str:
99 return f"{self.__class__.__name__}({self.weight.shape[0]}, eps={self.eps})"
100
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/torchvision/ops/misc.py b/torchvision/ops/misc.py
--- a/torchvision/ops/misc.py
+++ b/torchvision/ops/misc.py
@@ -51,7 +51,7 @@
def __init__(
self,
num_features: int,
- eps: float = 0.,
+ eps: float = 1e-5,
n: Optional[int] = None,
):
# n=None for backward-compatibility
|
{"golden_diff": "diff --git a/torchvision/ops/misc.py b/torchvision/ops/misc.py\n--- a/torchvision/ops/misc.py\n+++ b/torchvision/ops/misc.py\n@@ -51,7 +51,7 @@\n def __init__(\n self,\n num_features: int,\n- eps: float = 0.,\n+ eps: float = 1e-5,\n n: Optional[int] = None,\n ):\n # n=None for backward-compatibility\n", "issue": "Change default value of eps in FrozenBatchNorm to match BatchNorm\n## \u2753 Questions and Help\r\nHello\r\nLoss is nan error occurs when I learn fast rcnn with resnext101 backbone\r\nMy code is as follows\r\n```python\r\nbackbone = resnet_fpn_backbone('resnext101_32x8d', pretrained=True)\r\nmodel = FasterRCNN(backbone, num_classes)\r\nin_features = model.roi_heads.box_predictor.cls_score.in_features\r\nmodel.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)\r\n```\r\n\r\nerror message\r\n```\r\nEpoch: [0] [ 0/7208] eta: 1:27:42 lr: 0.000040 loss: 40613806080.0000 (40613806080.0000) loss_box_reg: 7979147264.0000 (7979147264.0000) loss_classifier: 11993160704.0000 (11993160704.0000) loss_objectness: 9486380032.0000 (9486380032.0000) loss_rpn_box_reg: 11155118080.0000 (11155118080.0000) time: 0.7301 data: 0.4106 max mem: 1241\r\nLoss is nan, stopping training\r\n```\r\n\r\nWhen i change the backbone to resnet50 and resnet152, no error occrus.\r\n### Please note that this issue tracker is not a help form and this issue will be closed.\r\n\r\nWe have a set of [listed resources available on the website](https://pytorch.org/resources). Our primary means of support is our discussion forum:\r\n\r\n- [Discussion Forum](https://discuss.pytorch.org/)\r\n\n", "before_files": [{"content": "\"\"\"\nhelper class that supports empty tensors on some nn functions.\n\nIdeally, add support directly in PyTorch to empty tensors in\nthose functions.\n\nThis can be removed once https://github.com/pytorch/pytorch/issues/12013\nis implemented\n\"\"\"\n\nimport warnings\nimport torch\nfrom torch import Tensor, Size\nfrom torch.jit.annotations import List, Optional, Tuple\n\n\nclass Conv2d(torch.nn.Conv2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.Conv2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.Conv2d instead.\", FutureWarning)\n\n\nclass ConvTranspose2d(torch.nn.ConvTranspose2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.ConvTranspose2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.ConvTranspose2d instead.\", FutureWarning)\n\n\nclass BatchNorm2d(torch.nn.BatchNorm2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.BatchNorm2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.BatchNorm2d instead.\", FutureWarning)\n\n\ninterpolate = torch.nn.functional.interpolate\n\n\n# This is not in nn\nclass FrozenBatchNorm2d(torch.nn.Module):\n \"\"\"\n BatchNorm2d where the batch statistics and the affine parameters\n are fixed\n \"\"\"\n\n def __init__(\n self,\n num_features: int,\n eps: float = 0.,\n n: Optional[int] = None,\n ):\n # n=None for backward-compatibility\n if n is not None:\n warnings.warn(\"`n` argument is deprecated and has been renamed `num_features`\",\n DeprecationWarning)\n num_features = n\n super(FrozenBatchNorm2d, self).__init__()\n self.eps = eps\n self.register_buffer(\"weight\", torch.ones(num_features))\n self.register_buffer(\"bias\", 
torch.zeros(num_features))\n self.register_buffer(\"running_mean\", torch.zeros(num_features))\n self.register_buffer(\"running_var\", torch.ones(num_features))\n\n def _load_from_state_dict(\n self,\n state_dict: dict,\n prefix: str,\n local_metadata: dict,\n strict: bool,\n missing_keys: List[str],\n unexpected_keys: List[str],\n error_msgs: List[str],\n ):\n num_batches_tracked_key = prefix + 'num_batches_tracked'\n if num_batches_tracked_key in state_dict:\n del state_dict[num_batches_tracked_key]\n\n super(FrozenBatchNorm2d, self)._load_from_state_dict(\n state_dict, prefix, local_metadata, strict,\n missing_keys, unexpected_keys, error_msgs)\n\n def forward(self, x: Tensor) -> Tensor:\n # move reshapes to the beginning\n # to make it fuser-friendly\n w = self.weight.reshape(1, -1, 1, 1)\n b = self.bias.reshape(1, -1, 1, 1)\n rv = self.running_var.reshape(1, -1, 1, 1)\n rm = self.running_mean.reshape(1, -1, 1, 1)\n scale = w * (rv + self.eps).rsqrt()\n bias = b - rm * scale\n return x * scale + bias\n\n def __repr__(self) -> str:\n return f\"{self.__class__.__name__}({self.weight.shape[0]}, eps={self.eps})\"\n", "path": "torchvision/ops/misc.py"}], "after_files": [{"content": "\"\"\"\nhelper class that supports empty tensors on some nn functions.\n\nIdeally, add support directly in PyTorch to empty tensors in\nthose functions.\n\nThis can be removed once https://github.com/pytorch/pytorch/issues/12013\nis implemented\n\"\"\"\n\nimport warnings\nimport torch\nfrom torch import Tensor, Size\nfrom torch.jit.annotations import List, Optional, Tuple\n\n\nclass Conv2d(torch.nn.Conv2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.Conv2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.Conv2d instead.\", FutureWarning)\n\n\nclass ConvTranspose2d(torch.nn.ConvTranspose2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.ConvTranspose2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.ConvTranspose2d instead.\", FutureWarning)\n\n\nclass BatchNorm2d(torch.nn.BatchNorm2d):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n warnings.warn(\n \"torchvision.ops.misc.BatchNorm2d is deprecated and will be \"\n \"removed in future versions, use torch.nn.BatchNorm2d instead.\", FutureWarning)\n\n\ninterpolate = torch.nn.functional.interpolate\n\n\n# This is not in nn\nclass FrozenBatchNorm2d(torch.nn.Module):\n \"\"\"\n BatchNorm2d where the batch statistics and the affine parameters\n are fixed\n \"\"\"\n\n def __init__(\n self,\n num_features: int,\n eps: float = 1e-5,\n n: Optional[int] = None,\n ):\n # n=None for backward-compatibility\n if n is not None:\n warnings.warn(\"`n` argument is deprecated and has been renamed `num_features`\",\n DeprecationWarning)\n num_features = n\n super(FrozenBatchNorm2d, self).__init__()\n self.eps = eps\n self.register_buffer(\"weight\", torch.ones(num_features))\n self.register_buffer(\"bias\", torch.zeros(num_features))\n self.register_buffer(\"running_mean\", torch.zeros(num_features))\n self.register_buffer(\"running_var\", torch.ones(num_features))\n\n def _load_from_state_dict(\n self,\n state_dict: dict,\n prefix: str,\n local_metadata: dict,\n strict: bool,\n missing_keys: List[str],\n unexpected_keys: List[str],\n error_msgs: List[str],\n ):\n num_batches_tracked_key = prefix + 'num_batches_tracked'\n if num_batches_tracked_key in 
state_dict:\n del state_dict[num_batches_tracked_key]\n\n super(FrozenBatchNorm2d, self)._load_from_state_dict(\n state_dict, prefix, local_metadata, strict,\n missing_keys, unexpected_keys, error_msgs)\n\n def forward(self, x: Tensor) -> Tensor:\n # move reshapes to the beginning\n # to make it fuser-friendly\n w = self.weight.reshape(1, -1, 1, 1)\n b = self.bias.reshape(1, -1, 1, 1)\n rv = self.running_var.reshape(1, -1, 1, 1)\n rm = self.running_mean.reshape(1, -1, 1, 1)\n scale = w * (rv + self.eps).rsqrt()\n bias = b - rm * scale\n return x * scale + bias\n\n def __repr__(self) -> str:\n return f\"{self.__class__.__name__}({self.weight.shape[0]}, eps={self.eps})\"\n", "path": "torchvision/ops/misc.py"}]}
| 1,742 | 107 |
gh_patches_debug_17358
|
rasdani/github-patches
|
git_diff
|
weecology__retriever-548
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
download_only w/path fails to use path argument when checking for file
When `download_only` checks to see if the file already exists before copying it, it ignores the path argument. This means that:
```
retriever download MoM2003 -p testdata
```
will keep overwriting the file in `testdata` if it exists, and it will not copy the file to `testdata` if the file exists in `.`.
Fixing this is probably just a little logic improvement in the `final_cleanup` function of `download_only`.
--- END ISSUE ---
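A minimal sketch (an assumed shape of the fix, reusing the variable names from `engines/download_only.py`) of the missing check: look for the file under the destination built from `-p` before deciding to copy or skip.
```python
# Sketch: consult the destination path derived from --path/-p, not only
# the working directory, when deciding whether the file already exists.
import os

def already_at_destination(dest_path, file_name_nopath):
    """True when the downloaded file is already present at the requested path."""
    return os.path.isfile(os.path.join(dest_path, file_name_nopath))
```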
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `engines/download_only.py`
Content:
```
1 from __future__ import print_function
2 from builtins import object
3 import os
4 import platform
5 import shutil
6 import inspect
7
8 from retriever.lib.engine import filename_from_url
9 from retriever.lib.models import Engine, no_cleanup
10 from retriever import DATA_DIR, HOME_DIR
11
12
13 class DummyConnection(object):
14
15 def cursor(self):
16 pass
17
18 def commit(self):
19 pass
20
21 def rollback(self):
22 pass
23
24 def close(self):
25 pass
26
27
28 class DummyCursor(DummyConnection):
29 pass
30
31
32 class engine(Engine):
33 """Engine instance for writing data to a CSV file."""
34 name = "Download Only"
35 abbreviation = "download"
36 required_opts = [("path",
37 "File path to copy data files",
38 "./"),
39 ("subdir",
40 "Keep the subdirectories for archived files",
41 False)
42 ]
43
44 def table_exists(self, dbname, tablename):
45 """Checks if the file to be downloaded already exists"""
46 try:
47 tablename = self.table_name(name=tablename, dbname=dbname)
48 return os.path.exists(tablename)
49 except:
50 return False
51
52 def get_connection(self):
53 """Gets the db connection."""
54 self.get_input()
55 return DummyConnection()
56
57 def final_cleanup(self):
58 """Copies downloaded files to desired directory
59
60 Copies the downloaded files into the chosen directory unless files with the same
61 name already exist in the directory.
62
63 """
64 if hasattr(self, "all_files"):
65 for file_name in self.all_files:
66 file_path, file_name_nopath = os.path.split(file_name)
67 subdir = os.path.split(file_path)[1] if self.opts['subdir'] else ''
68 dest_path = os.path.join(self.opts['path'], subdir)
69 if os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):
70 print ("%s is already in the working directory" %
71 file_name_nopath)
72 print("Keeping existing copy.")
73 else:
74 print("Copying %s from %s" % (file_name_nopath, file_path))
75 if os.path.isdir(dest_path):
76 try:
77 shutil.copy(file_name, dest_path)
78 except:
79 print("Couldn't copy file to %s" % dest_path)
80 else:
81 try:
82 print("Creating directory %s" % dest_path)
83 os.makedirs(dest_path)
84 shutil.copy(file_name, dest_path)
85 except:
86 print("Couldn't create directory %s" % dest_path)
87 self.all_files = set()
88
89 def auto_create_table(self, table, url=None, filename=None, pk=None):
90 """Download the file if it doesn't exist"""
91 if url and not filename:
92 filename = filename_from_url(url)
93
94 if url and not self.find_file(filename):
95 # If the file doesn't exist, download it
96 self.download_file(url, filename)
97
98 def insert_data_from_url(self, url):
99 """Insert data from a web resource"""
100 filename = filename_from_url(url)
101 find = self.find_file(filename)
102 if not find:
103 self.create_raw_data_dir()
104 self.download_file(url, filename)
105
106 def find_file(self, filename):
107 """Checks for the given file and adds it to the list of all files"""
108 result = Engine.find_file(self, filename)
109 if not hasattr(self, "all_files"):
110 self.all_files = set()
111 if result:
112 self.all_files.add(result)
113 return result
114
115 def register_files(self, filenames):
116 """Identify a list of files to be moved by the download
117
118 When downloading archives with multiple files the engine needs to be
119 informed of all of the file names so that it can move them.
120
121 """
122 full_filenames = {self.find_file(filename) for filename in filenames
123 if self.find_file(filename)}
124 self.all_files = self.all_files.union(full_filenames)
125
126
127 # replace all other methods with a function that does nothing
128 def dummy_method(self, *args, **kwargs):
129 pass
130
131
132 methods = inspect.getmembers(engine, predicate=inspect.ismethod)
133 keep_methods = {'table_exists',
134 'get_connection',
135 'final_cleanup',
136 'auto_create_table',
137 'insert_data_from_url',
138 }
139 remove_methods = ['insert_data_from_file']
140 for name, method in methods:
141 if (name not in keep_methods and
142 'download' not in name and
143 'file' not in name and
144 'dir' not in name):
145 setattr(engine, name, dummy_method)
146 for name in remove_methods:
147 setattr(engine, name, dummy_method)
148
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/engines/download_only.py b/engines/download_only.py
--- a/engines/download_only.py
+++ b/engines/download_only.py
@@ -66,7 +66,9 @@
file_path, file_name_nopath = os.path.split(file_name)
subdir = os.path.split(file_path)[1] if self.opts['subdir'] else ''
dest_path = os.path.join(self.opts['path'], subdir)
- if os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):
+ if os.path.isfile(os.path.join(dest_path, file_name_nopath)):
+ print ("File already exists at specified location")
+ elif os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):
print ("%s is already in the working directory" %
file_name_nopath)
print("Keeping existing copy.")
|
{"golden_diff": "diff --git a/engines/download_only.py b/engines/download_only.py\n--- a/engines/download_only.py\n+++ b/engines/download_only.py\n@@ -66,7 +66,9 @@\n file_path, file_name_nopath = os.path.split(file_name)\n subdir = os.path.split(file_path)[1] if self.opts['subdir'] else ''\n dest_path = os.path.join(self.opts['path'], subdir)\n- if os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):\n+ if os.path.isfile(os.path.join(dest_path, file_name_nopath)):\n+ print (\"File already exists at specified location\")\n+ elif os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):\n print (\"%s is already in the working directory\" %\n file_name_nopath)\n print(\"Keeping existing copy.\")\n", "issue": "download_only w/path fails to use path argument when checking for file\nWhen `download_only` checks to see if the file already exists before copying it, it ignores the path argument. This means that:\n\n```\nretriever download MoM2003 -p testdata\n```\n\nwill keep overwriting the file in `testdata` if it exists, and it will not copy the file to `testdata` if the file exists in `.`.\n\nFixes this is probably just a little logic improvement in the `final_cleanup` function of `download_only`.\n\n", "before_files": [{"content": "from __future__ import print_function\nfrom builtins import object\nimport os\nimport platform\nimport shutil\nimport inspect\n\nfrom retriever.lib.engine import filename_from_url\nfrom retriever.lib.models import Engine, no_cleanup\nfrom retriever import DATA_DIR, HOME_DIR\n\n\nclass DummyConnection(object):\n\n def cursor(self):\n pass\n\n def commit(self):\n pass\n\n def rollback(self):\n pass\n\n def close(self):\n pass\n\n\nclass DummyCursor(DummyConnection):\n pass\n\n\nclass engine(Engine):\n \"\"\"Engine instance for writing data to a CSV file.\"\"\"\n name = \"Download Only\"\n abbreviation = \"download\"\n required_opts = [(\"path\",\n \"File path to copy data files\",\n \"./\"),\n (\"subdir\",\n \"Keep the subdirectories for archived files\",\n False)\n ]\n\n def table_exists(self, dbname, tablename):\n \"\"\"Checks if the file to be downloaded already exists\"\"\"\n try:\n tablename = self.table_name(name=tablename, dbname=dbname)\n return os.path.exists(tablename)\n except:\n return False\n\n def get_connection(self):\n \"\"\"Gets the db connection.\"\"\"\n self.get_input()\n return DummyConnection()\n\n def final_cleanup(self):\n \"\"\"Copies downloaded files to desired directory\n\n Copies the downloaded files into the chosen directory unless files with the same\n name already exist in the directory.\n\n \"\"\"\n if hasattr(self, \"all_files\"):\n for file_name in self.all_files:\n file_path, file_name_nopath = os.path.split(file_name)\n subdir = os.path.split(file_path)[1] if self.opts['subdir'] else ''\n dest_path = os.path.join(self.opts['path'], subdir)\n if os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):\n print (\"%s is already in the working directory\" %\n file_name_nopath)\n print(\"Keeping existing copy.\")\n else:\n print(\"Copying %s from %s\" % (file_name_nopath, file_path))\n if os.path.isdir(dest_path):\n try:\n shutil.copy(file_name, dest_path)\n except:\n print(\"Couldn't copy file to %s\" % dest_path)\n else:\n try:\n print(\"Creating directory %s\" % dest_path)\n os.makedirs(dest_path)\n shutil.copy(file_name, dest_path)\n except:\n print(\"Couldn't create directory %s\" % dest_path)\n self.all_files = set()\n\n def auto_create_table(self, table, url=None, 
filename=None, pk=None):\n \"\"\"Download the file if it doesn't exist\"\"\"\n if url and not filename:\n filename = filename_from_url(url)\n\n if url and not self.find_file(filename):\n # If the file doesn't exist, download it\n self.download_file(url, filename)\n\n def insert_data_from_url(self, url):\n \"\"\"Insert data from a web resource\"\"\"\n filename = filename_from_url(url)\n find = self.find_file(filename)\n if not find:\n self.create_raw_data_dir()\n self.download_file(url, filename)\n\n def find_file(self, filename):\n \"\"\"Checks for the given file and adds it to the list of all files\"\"\"\n result = Engine.find_file(self, filename)\n if not hasattr(self, \"all_files\"):\n self.all_files = set()\n if result:\n self.all_files.add(result)\n return result\n\n def register_files(self, filenames):\n \"\"\"Identify a list of files to be moved by the download\n\n When downloading archives with multiple files the engine needs to be\n informed of all of the file names so that it can move them.\n\n \"\"\"\n full_filenames = {self.find_file(filename) for filename in filenames\n if self.find_file(filename)}\n self.all_files = self.all_files.union(full_filenames)\n\n\n# replace all other methods with a function that does nothing\ndef dummy_method(self, *args, **kwargs):\n pass\n\n\nmethods = inspect.getmembers(engine, predicate=inspect.ismethod)\nkeep_methods = {'table_exists',\n 'get_connection',\n 'final_cleanup',\n 'auto_create_table',\n 'insert_data_from_url',\n }\nremove_methods = ['insert_data_from_file']\nfor name, method in methods:\n if (name not in keep_methods and\n 'download' not in name and\n 'file' not in name and\n 'dir' not in name):\n setattr(engine, name, dummy_method)\nfor name in remove_methods:\n setattr(engine, name, dummy_method)\n", "path": "engines/download_only.py"}], "after_files": [{"content": "from __future__ import print_function\nfrom builtins import object\nimport os\nimport platform\nimport shutil\nimport inspect\n\nfrom retriever.lib.engine import filename_from_url\nfrom retriever.lib.models import Engine, no_cleanup\nfrom retriever import DATA_DIR, HOME_DIR\n\n\nclass DummyConnection(object):\n\n def cursor(self):\n pass\n\n def commit(self):\n pass\n\n def rollback(self):\n pass\n\n def close(self):\n pass\n\n\nclass DummyCursor(DummyConnection):\n pass\n\n\nclass engine(Engine):\n \"\"\"Engine instance for writing data to a CSV file.\"\"\"\n name = \"Download Only\"\n abbreviation = \"download\"\n required_opts = [(\"path\",\n \"File path to copy data files\",\n \"./\"),\n (\"subdir\",\n \"Keep the subdirectories for archived files\",\n False)\n ]\n\n def table_exists(self, dbname, tablename):\n \"\"\"Checks if the file to be downloaded already exists\"\"\"\n try:\n tablename = self.table_name(name=tablename, dbname=dbname)\n return os.path.exists(tablename)\n except:\n return False\n\n def get_connection(self):\n \"\"\"Gets the db connection.\"\"\"\n self.get_input()\n return DummyConnection()\n\n def final_cleanup(self):\n \"\"\"Copies downloaded files to desired directory\n\n Copies the downloaded files into the chosen directory unless files with the same\n name already exist in the directory.\n\n \"\"\"\n if hasattr(self, \"all_files\"):\n for file_name in self.all_files:\n file_path, file_name_nopath = os.path.split(file_name)\n subdir = os.path.split(file_path)[1] if self.opts['subdir'] else ''\n dest_path = os.path.join(self.opts['path'], subdir)\n if os.path.isfile(os.path.join(dest_path, file_name_nopath)):\n print (\"File already exists at 
specified location\")\n elif os.path.abspath(file_path) == os.path.abspath(os.path.join(DATA_DIR, subdir)):\n print (\"%s is already in the working directory\" %\n file_name_nopath)\n print(\"Keeping existing copy.\")\n else:\n print(\"Copying %s from %s\" % (file_name_nopath, file_path))\n if os.path.isdir(dest_path):\n try:\n shutil.copy(file_name, dest_path)\n except:\n print(\"Couldn't copy file to %s\" % dest_path)\n else:\n try:\n print(\"Creating directory %s\" % dest_path)\n os.makedirs(dest_path)\n shutil.copy(file_name, dest_path)\n except:\n print(\"Couldn't create directory %s\" % dest_path)\n self.all_files = set()\n\n def auto_create_table(self, table, url=None, filename=None, pk=None):\n \"\"\"Download the file if it doesn't exist\"\"\"\n if url and not filename:\n filename = filename_from_url(url)\n\n if url and not self.find_file(filename):\n # If the file doesn't exist, download it\n self.download_file(url, filename)\n\n def insert_data_from_url(self, url):\n \"\"\"Insert data from a web resource\"\"\"\n filename = filename_from_url(url)\n find = self.find_file(filename)\n if not find:\n self.create_raw_data_dir()\n self.download_file(url, filename)\n\n def find_file(self, filename):\n \"\"\"Checks for the given file and adds it to the list of all files\"\"\"\n result = Engine.find_file(self, filename)\n if not hasattr(self, \"all_files\"):\n self.all_files = set()\n if result:\n self.all_files.add(result)\n return result\n\n def register_files(self, filenames):\n \"\"\"Identify a list of files to be moved by the download\n\n When downloading archives with multiple files the engine needs to be\n informed of all of the file names so that it can move them.\n\n \"\"\"\n full_filenames = {self.find_file(filename) for filename in filenames\n if self.find_file(filename)}\n self.all_files = self.all_files.union(full_filenames)\n\n\n# replace all other methods with a function that does nothing\ndef dummy_method(self, *args, **kwargs):\n pass\n\n\nmethods = inspect.getmembers(engine, predicate=inspect.ismethod)\nkeep_methods = {'table_exists',\n 'get_connection',\n 'final_cleanup',\n 'auto_create_table',\n 'insert_data_from_url',\n }\nremove_methods = ['insert_data_from_file']\nfor name, method in methods:\n if (name not in keep_methods and\n 'download' not in name and\n 'file' not in name and\n 'dir' not in name):\n setattr(engine, name, dummy_method)\nfor name in remove_methods:\n setattr(engine, name, dummy_method)\n", "path": "engines/download_only.py"}]}
| 1,689 | 192 |
gh_patches_debug_6830
|
rasdani/github-patches
|
git_diff
|
platformsh__platformsh-docs-2432
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
🐛 Allow searching for code strings with separators
### Where on docs.platform.sh should be changed?
The search
### What exactly should be updated?
Searching for strings with separators like `X-Frame-Options` and `memory_ratio` doesn't show the results for pages that have those strings directly. Putting quotes around the strings doesn't help.
We'd like people to be able to get info on specific properties and strings, so the search should return these results.
### Additional context
_No response_
--- END ISSUE ---
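A minimal sketch (assuming a local Meilisearch instance and the `meilisearch` Python client the docs search already uses; host, key, and index name are placeholders) of the kind of settings change involved: promote `exactness` in the ranking rules so exact tokens such as `memory_ratio` outrank fuzzier matches.
```python
# Sketch: reorder ranking rules so exact matches win for separator-heavy
# queries like "X-Frame-Options" or "memory_ratio".
import meilisearch

client = meilisearch.Client("http://127.0.0.1:7700", "masterKey")
client.index("docs").update_settings({
    "rankingRules": ["rank:asc", "exactness", "attribute", "proximity", "typo", "words"],
})
```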
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `search/main.py`
Content:
```
1 import os
2 import glob
3 import json
4 import meilisearch
5 from platformshconfig import Config
6
7 class Search:
8 def __init__(self):
9 self.default = {
10 "host": "http://127.0.0.1",
11 "key": None,
12 "port": 7700
13 }
14
15 self.scrape_dir = "output"
16 self.scrape_config = "config/scrape.json"
17 self.docs_index = "docs"
18 self.primaryKey = "documentId"
19 self.index_name = "Docs"
20
21 # Below are Platform.sh custom settings for how the search engine functions.
22
23 # Data available to the dropdown React app in docs, used to fill out autocomplete results.
24 self.displayed_attributes = ['keywords', 'title', 'text', 'url', 'site', 'section']
25 # Data actually searchable by our queries.
26 self.searchable_attributes = ['keywords', 'title', 'pageUrl', 'section', 'text', 'url']
27
28 # Show results for one query with the listed pages, when they by default would not show up as best results.
29 # Note: these aren't automatically two-way, which is why they're all defined twice.
30 self.synonyms = {
31 "cron": ["crons"],
32 "crons": ["cron tasks", "cron jobs"],
33 "e-mail": ["email"],
34 "routes.yaml": ["routes"],
35 "routes": ["routes.yaml"],
36 "services": ["services.yaml"],
37 "services.yaml": ["services"],
38 "application": [".platform.app.yaml", "app.yaml", "applications.yaml"],
39 ".platform.app.yaml": ["application"],
40 "app.yaml": ["application"],
41 "applications.yaml": ["application", "multi-app"],
42 "multi-app": ["applications.yaml"],
43 "regions": ["public ip addresses"],
44 "public ip addresses": ["regions"],
45 "ssl": ["https", "tls"],
46 "https": ["ssl"],
47 "auth": ["authentication", "access control"], # Only needs to be one way since we don't use "auth" in the docs
48 }
49
50 # Ranking rules:
51 #
52 # - Default order: ["words", "typo", "proximity", "attribute", "sort", "exactness"]
53 #
54 # - words: number of times query is in document (greater number gets priority)
55 # - typo: fewer typos > more typos
56 # - proximity: smaller distance between multiple occurences of query in same document > larger distances
57 # - attribute: sorted according to order of importance of attributes (searchable_attributes). terms in
58 # more important attributes first.
59 # - sort: queries are sorted at query time
60 # - exactness: similarity of matched words in document with query
61
62 self.ranking_rules = ["rank:asc", "attribute", "typo", "words", "proximity", "exactness"]
63
64 self.updated_settings = {
65 "rankingRules": self.ranking_rules,
66 "searchableAttributes": self.searchable_attributes,
67 "displayedAttributes": self.displayed_attributes
68 }
69
70 # Group results by page
71 self.distinct_attribute = "pageUrl"
72
73 def getConnectionString(self):
74 """
75 Sets the Meilisearch host string, depending on the environment.
76
77 Returns:
78 string: Meilisearch host string.
79 """
80 if os.environ.get('PORT'):
81 return "{}:{}".format(self.default["host"], os.environ['PORT'])
82 else:
83 return "{}:{}".format(self.default["host"], self.default["port"])
84
85 def getMasterKey(self):
86 """
87 Retrieves the Meilisearch master key, either from the Platform.sh environment or locally.
88 """
89 config = Config()
90 if config.is_valid_platform():
91 return config.projectEntropy
92 elif os.environ.get("MEILI_MASTER_KEY"):
93 return os.environ["MEILI_MASTER_KEY"]
94 else:
95 return self.default["key"]
96
97 def add_documents(self, index):
98 """
99 Cycle through the individual site indexes in /outputs so their individual documents can be added to Meilisearch.
100 """
101 documents = [f for f in glob.glob("{}/*.json".format(self.scrape_dir))]
102 for doc in documents:
103 self.add(doc, index)
104
105 def add(self, doc, index):
106 """
107 Add an individual site's index to the Meilisearch service.
108 """
109 with open(doc) as scraped_index:
110 data = json.load(scraped_index)
111 index.add_documents(data)
112
113 def update(self):
114 """
115 Updates the Meilisearch index.
116 """
117 # Create a Meilisearch client.
118 client = meilisearch.Client(self.getConnectionString(), self.getMasterKey())
119
120 # Delete previous index
121 if len(client.get_indexes()):
122 client.index(self.docs_index).delete()
123
124 # Create a new index
125 create_index_task = client.create_index(uid=self.docs_index, options={'primaryKey': self.primaryKey, 'uid': self.index_name})
126
127 client.wait_for_task(create_index_task['uid'])
128
129 index = client.get_index(create_index_task['indexUid'])
130
131 # Add synonyms for the index
132 index.update_synonyms(self.synonyms)
133
134 # Update its settings: what can be searched, what's displayable, and how results should be ranked.
135 index.update_settings(self.updated_settings)
136
137 # Update distinct attribute.
138 index.update_distinct_attribute(self.distinct_attribute)
139
140 # Add documents to the index
141 self.add_documents(index)
142
143 if __name__ == "__main__":
144 meili = Search()
145 meili.update()
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/search/main.py b/search/main.py
--- a/search/main.py
+++ b/search/main.py
@@ -59,7 +59,7 @@
# - sort: queries are sorted at query time
# - exactness: similarity of matched words in document with query
- self.ranking_rules = ["rank:asc", "attribute", "typo", "words", "proximity", "exactness"]
+ self.ranking_rules = ["rank:asc", "exactness", "attribute", "proximity", "typo", "words"]
self.updated_settings = {
"rankingRules": self.ranking_rules,
|
{"golden_diff": "diff --git a/search/main.py b/search/main.py\n--- a/search/main.py\n+++ b/search/main.py\n@@ -59,7 +59,7 @@\n # - sort: queries are sorted at query time\n # - exactness: similarity of matched words in document with query\n \n- self.ranking_rules = [\"rank:asc\", \"attribute\", \"typo\", \"words\", \"proximity\", \"exactness\"]\n+ self.ranking_rules = [\"rank:asc\", \"exactness\", \"attribute\", \"proximity\", \"typo\", \"words\"]\n \n self.updated_settings = {\n \"rankingRules\": self.ranking_rules,\n", "issue": "\ud83d\udc1b Allow searching for code strings with separators\n### Where on docs.platform.sh should be changed?\r\n\r\nThe search\r\n\r\n### What exactly should be updated?\r\n\r\nSearching for strings with separators like `X-Frame-Options` and `memory_ratio` don't show the results for pages that have those strings directly. Putting quotes around the strings doesn't help.\r\n\r\nWe'd like people to be able to get info on specific properties and strings, so the search should return these results.\r\n\r\n### Additional context\r\n\r\n_No response_\n", "before_files": [{"content": "import os\nimport glob\nimport json\nimport meilisearch\nfrom platformshconfig import Config\n\nclass Search:\n def __init__(self):\n self.default = {\n \"host\": \"http://127.0.0.1\",\n \"key\": None,\n \"port\": 7700\n }\n\n self.scrape_dir = \"output\"\n self.scrape_config = \"config/scrape.json\"\n self.docs_index = \"docs\"\n self.primaryKey = \"documentId\"\n self.index_name = \"Docs\"\n\n # Below are Platform.sh custom settings for how the search engine functions.\n\n # Data available to the dropdown React app in docs, used to fill out autocomplete results.\n self.displayed_attributes = ['keywords', 'title', 'text', 'url', 'site', 'section']\n # Data actually searchable by our queries.\n self.searchable_attributes = ['keywords', 'title', 'pageUrl', 'section', 'text', 'url']\n\n # Show results for one query with the listed pages, when they by default would not show up as best results.\n # Note: these aren't automatically two-way, which is why they're all defined twice.\n self.synonyms = {\n \"cron\": [\"crons\"],\n \"crons\": [\"cron tasks\", \"cron jobs\"],\n \"e-mail\": [\"email\"],\n \"routes.yaml\": [\"routes\"],\n \"routes\": [\"routes.yaml\"],\n \"services\": [\"services.yaml\"],\n \"services.yaml\": [\"services\"],\n \"application\": [\".platform.app.yaml\", \"app.yaml\", \"applications.yaml\"],\n \".platform.app.yaml\": [\"application\"],\n \"app.yaml\": [\"application\"],\n \"applications.yaml\": [\"application\", \"multi-app\"],\n \"multi-app\": [\"applications.yaml\"],\n \"regions\": [\"public ip addresses\"],\n \"public ip addresses\": [\"regions\"],\n \"ssl\": [\"https\", \"tls\"],\n \"https\": [\"ssl\"],\n \"auth\": [\"authentication\", \"access control\"], # Only needs to be one way since we don't use \"auth\" in the docs\n }\n\n # Ranking rules:\n #\n # - Default order: [\"words\", \"typo\", \"proximity\", \"attribute\", \"sort\", \"exactness\"]\n #\n # - words: number of times query is in document (greater number gets priority)\n # - typo: fewer typos > more typos\n # - proximity: smaller distance between multiple occurences of query in same document > larger distances\n # - attribute: sorted according to order of importance of attributes (searchable_attributes). 
terms in\n # more important attributes first.\n # - sort: queries are sorted at query time\n # - exactness: similarity of matched words in document with query\n\n self.ranking_rules = [\"rank:asc\", \"attribute\", \"typo\", \"words\", \"proximity\", \"exactness\"]\n\n self.updated_settings = {\n \"rankingRules\": self.ranking_rules,\n \"searchableAttributes\": self.searchable_attributes,\n \"displayedAttributes\": self.displayed_attributes\n }\n\n # Group results by page\n self.distinct_attribute = \"pageUrl\"\n\n def getConnectionString(self):\n \"\"\"\n Sets the Meilisearch host string, depending on the environment.\n\n Returns:\n string: Meilisearch host string.\n \"\"\"\n if os.environ.get('PORT'):\n return \"{}:{}\".format(self.default[\"host\"], os.environ['PORT'])\n else:\n return \"{}:{}\".format(self.default[\"host\"], self.default[\"port\"])\n\n def getMasterKey(self):\n \"\"\"\n Retrieves the Meilisearch master key, either from the Platform.sh environment or locally.\n \"\"\"\n config = Config()\n if config.is_valid_platform():\n return config.projectEntropy\n elif os.environ.get(\"MEILI_MASTER_KEY\"):\n return os.environ[\"MEILI_MASTER_KEY\"]\n else:\n return self.default[\"key\"]\n\n def add_documents(self, index):\n \"\"\"\n Cycle through the individual site indexes in /outputs so their individual documents can be added to Meilisearch.\n \"\"\"\n documents = [f for f in glob.glob(\"{}/*.json\".format(self.scrape_dir))]\n for doc in documents:\n self.add(doc, index)\n\n def add(self, doc, index):\n \"\"\"\n Add an individual site's index to the Meilisearch service.\n \"\"\"\n with open(doc) as scraped_index:\n data = json.load(scraped_index)\n index.add_documents(data)\n\n def update(self):\n \"\"\"\n Updates the Meilisearch index.\n \"\"\"\n # Create a Meilisearch client.\n client = meilisearch.Client(self.getConnectionString(), self.getMasterKey())\n\n # Delete previous index\n if len(client.get_indexes()):\n client.index(self.docs_index).delete()\n\n # Create a new index\n create_index_task = client.create_index(uid=self.docs_index, options={'primaryKey': self.primaryKey, 'uid': self.index_name})\n\n client.wait_for_task(create_index_task['uid'])\n\n index = client.get_index(create_index_task['indexUid'])\n\n # Add synonyms for the index\n index.update_synonyms(self.synonyms)\n\n # Update its settings: what can be searched, what's displayable, and how results should be ranked.\n index.update_settings(self.updated_settings)\n\n # Update distinct attribute.\n index.update_distinct_attribute(self.distinct_attribute)\n\n # Add documents to the index\n self.add_documents(index)\n\nif __name__ == \"__main__\":\n meili = Search()\n meili.update()\n", "path": "search/main.py"}], "after_files": [{"content": "import os\nimport glob\nimport json\nimport meilisearch\nfrom platformshconfig import Config\n\nclass Search:\n def __init__(self):\n self.default = {\n \"host\": \"http://127.0.0.1\",\n \"key\": None,\n \"port\": 7700\n }\n\n self.scrape_dir = \"output\"\n self.scrape_config = \"config/scrape.json\"\n self.docs_index = \"docs\"\n self.primaryKey = \"documentId\"\n self.index_name = \"Docs\"\n\n # Below are Platform.sh custom settings for how the search engine functions.\n\n # Data available to the dropdown React app in docs, used to fill out autocomplete results.\n self.displayed_attributes = ['keywords', 'title', 'text', 'url', 'site', 'section']\n # Data actually searchable by our queries.\n self.searchable_attributes = ['keywords', 'title', 'pageUrl', 'section', 'text', 
'url']\n\n # Show results for one query with the listed pages, when they by default would not show up as best results.\n # Note: these aren't automatically two-way, which is why they're all defined twice.\n self.synonyms = {\n \"cron\": [\"crons\"],\n \"crons\": [\"cron tasks\", \"cron jobs\"],\n \"e-mail\": [\"email\"],\n \"routes.yaml\": [\"routes\"],\n \"routes\": [\"routes.yaml\"],\n \"services\": [\"services.yaml\"],\n \"services.yaml\": [\"services\"],\n \"application\": [\".platform.app.yaml\", \"app.yaml\", \"applications.yaml\"],\n \".platform.app.yaml\": [\"application\"],\n \"app.yaml\": [\"application\"],\n \"applications.yaml\": [\"application\", \"multi-app\"],\n \"multi-app\": [\"applications.yaml\"],\n \"regions\": [\"public ip addresses\"],\n \"public ip addresses\": [\"regions\"],\n \"ssl\": [\"https\", \"tls\"],\n \"https\": [\"ssl\"],\n \"auth\": [\"authentication\", \"access control\"], # Only needs to be one way since we don't use \"auth\" in the docs\n }\n\n # Ranking rules:\n #\n # - Default order: [\"words\", \"typo\", \"proximity\", \"attribute\", \"sort\", \"exactness\"]\n #\n # - words: number of times query is in document (greater number gets priority)\n # - typo: fewer typos > more typos\n # - proximity: smaller distance between multiple occurences of query in same document > larger distances\n # - attribute: sorted according to order of importance of attributes (searchable_attributes). terms in\n # more important attributes first.\n # - sort: queries are sorted at query time\n # - exactness: similarity of matched words in document with query\n\n self.ranking_rules = [\"rank:asc\", \"exactness\", \"attribute\", \"proximity\", \"typo\", \"words\"]\n\n self.updated_settings = {\n \"rankingRules\": self.ranking_rules,\n \"searchableAttributes\": self.searchable_attributes,\n \"displayedAttributes\": self.displayed_attributes\n }\n\n # Group results by page\n self.distinct_attribute = \"pageUrl\"\n\n def getConnectionString(self):\n \"\"\"\n Sets the Meilisearch host string, depending on the environment.\n\n Returns:\n string: Meilisearch host string.\n \"\"\"\n if os.environ.get('PORT'):\n return \"{}:{}\".format(self.default[\"host\"], os.environ['PORT'])\n else:\n return \"{}:{}\".format(self.default[\"host\"], self.default[\"port\"])\n\n def getMasterKey(self):\n \"\"\"\n Retrieves the Meilisearch master key, either from the Platform.sh environment or locally.\n \"\"\"\n config = Config()\n if config.is_valid_platform():\n return config.projectEntropy\n elif os.environ.get(\"MEILI_MASTER_KEY\"):\n return os.environ[\"MEILI_MASTER_KEY\"]\n else:\n return self.default[\"key\"]\n\n def add_documents(self, index):\n \"\"\"\n Cycle through the individual site indexes in /outputs so their individual documents can be added to Meilisearch.\n \"\"\"\n documents = [f for f in glob.glob(\"{}/*.json\".format(self.scrape_dir))]\n for doc in documents:\n self.add(doc, index)\n\n def add(self, doc, index):\n \"\"\"\n Add an individual site's index to the Meilisearch service.\n \"\"\"\n with open(doc) as scraped_index:\n data = json.load(scraped_index)\n index.add_documents(data)\n\n def update(self):\n \"\"\"\n Updates the Meilisearch index.\n \"\"\"\n # Create a Meilisearch client.\n client = meilisearch.Client(self.getConnectionString(), self.getMasterKey())\n\n # Delete previous index\n if len(client.get_indexes()):\n client.index(self.docs_index).delete()\n\n # Create a new index\n create_index_task = client.create_index(uid=self.docs_index, options={'primaryKey': 
self.primaryKey, 'uid': self.index_name})\n\n client.wait_for_task(create_index_task['uid'])\n\n index = client.get_index(create_index_task['indexUid'])\n\n # Add synonyms for the index\n index.update_synonyms(self.synonyms)\n\n # Update its settings: what can be searched, what's displayable, and how results should be ranked.\n index.update_settings(self.updated_settings)\n\n # Update distinct attribute.\n index.update_distinct_attribute(self.distinct_attribute)\n\n # Add documents to the index\n self.add_documents(index)\n\nif __name__ == \"__main__\":\n meili = Search()\n meili.update()\n", "path": "search/main.py"}]}
| 1,899 | 142 |
gh_patches_debug_7495
|
rasdani/github-patches
|
git_diff
|
kymatio__kymatio-890
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make NumPy the default frontend
Since we promised earlier:
```
/home/jenkins/workspace/kymatio_dev/kymatio/frontend/entry.py:20: DeprecationWarning: Torch frontend is currently the default, but NumPy will become the default in the next version.
warnings.warn("Torch frontend is currently the default, but NumPy will become the default in the next"
```
--- END ISSUE ---
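A minimal sketch of what the change means for a caller (constructor parameters are illustrative): omitting `frontend` should quietly select NumPy instead of warning and selecting Torch.
```python
# Sketch: no frontend argument -> NumPy scattering object, no DeprecationWarning;
# Torch stays available as an explicit opt-in.
import numpy as np
from kymatio import Scattering1D

x = np.random.randn(2 ** 13).astype(np.float32)

S = Scattering1D(J=6, shape=(2 ** 13,))                           # NumPy by default
Sx = S(x)

S_torch = Scattering1D(J=6, shape=(2 ** 13,), frontend='torch')   # explicit Torch
```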
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kymatio/frontend/entry.py`
Content:
```
1 import logging
2 import warnings
3 import importlib
4
5
6 class ScatteringEntry(object):
7 def __init__(self, *args, **kwargs):
8 self.name = kwargs['name']
9 self.class_name = kwargs['class_name']
10 kwargs.pop('name')
11 kwargs.pop('class_name')
12
13 frontend_suffixes = {'torch' : 'Torch',
14 'numpy' : 'NumPy',
15 'tensorflow' : 'TensorFlow',
16 'keras': 'Keras',
17 'sklearn': 'Transformer'}
18
19 if 'frontend' not in kwargs:
20 warnings.warn("Torch frontend is currently the default, but NumPy will become the default in the next"
21 " version.", DeprecationWarning)
22 frontend = 'torch'
23 else:
24 frontend = kwargs['frontend'].lower()
25 kwargs.pop('frontend')
26
27 frontends = list(frontend_suffixes.keys())
28
29 if frontend not in frontends:
30 raise RuntimeError('The frontend \'%s\" is not valid. Must be '
31 'one of \'%s\', or \'%s\'.' %
32 (frontend, '\', \''.join(frontends[:-1]),
33 frontends[-1]))
34
35 try:
36 module = importlib.import_module('kymatio.' + self.class_name + '.frontend.' + frontend + '_frontend')
37
38 # Create frontend-specific class name by inserting frontend name
39 # after `Scattering`.
40 frontend = frontend_suffixes[frontend]
41
42 class_name = self.__class__.__name__
43
44 base_name = class_name[:-len('Entry*D')]
45 dim_suffix = class_name[-len('*D'):]
46
47 class_name = base_name + frontend + dim_suffix
48
49 self.__class__ = getattr(module, class_name)
50 self.__init__(*args, **kwargs)
51 except Exception as e:
52 raise e from RuntimeError('\nThe frontend \'' + frontend + '\' could not be correctly imported.')
53
54 logging.info('The ' + self.name + ' frontend ' + frontend + ' was imported.')
55
56
57 __all__ = ['ScatteringEntry']
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kymatio/frontend/entry.py b/kymatio/frontend/entry.py
--- a/kymatio/frontend/entry.py
+++ b/kymatio/frontend/entry.py
@@ -17,9 +17,7 @@
'sklearn': 'Transformer'}
if 'frontend' not in kwargs:
- warnings.warn("Torch frontend is currently the default, but NumPy will become the default in the next"
- " version.", DeprecationWarning)
- frontend = 'torch'
+ frontend = 'numpy'
else:
frontend = kwargs['frontend'].lower()
kwargs.pop('frontend')
|
{"golden_diff": "diff --git a/kymatio/frontend/entry.py b/kymatio/frontend/entry.py\n--- a/kymatio/frontend/entry.py\n+++ b/kymatio/frontend/entry.py\n@@ -17,9 +17,7 @@\n 'sklearn': 'Transformer'}\n \n if 'frontend' not in kwargs:\n- warnings.warn(\"Torch frontend is currently the default, but NumPy will become the default in the next\"\n- \" version.\", DeprecationWarning)\n- frontend = 'torch'\n+ frontend = 'numpy'\n else:\n frontend = kwargs['frontend'].lower()\n kwargs.pop('frontend')\n", "issue": "Make NumPy the default frontend\nSince we promised earlier:\r\n\r\n```\r\n /home/jenkins/workspace/kymatio_dev/kymatio/frontend/entry.py:20: DeprecationWarning: Torch frontend is currently the default, but NumPy will become the default in the next version.\r\n warnings.warn(\"Torch frontend is currently the default, but NumPy will become the default in the next\"\r\n```\n", "before_files": [{"content": "import logging\nimport warnings\nimport importlib\n\n\nclass ScatteringEntry(object):\n def __init__(self, *args, **kwargs):\n self.name = kwargs['name']\n self.class_name = kwargs['class_name']\n kwargs.pop('name')\n kwargs.pop('class_name')\n\n frontend_suffixes = {'torch' : 'Torch',\n 'numpy' : 'NumPy',\n 'tensorflow' : 'TensorFlow',\n 'keras': 'Keras',\n 'sklearn': 'Transformer'}\n\n if 'frontend' not in kwargs:\n warnings.warn(\"Torch frontend is currently the default, but NumPy will become the default in the next\"\n \" version.\", DeprecationWarning)\n frontend = 'torch'\n else:\n frontend = kwargs['frontend'].lower()\n kwargs.pop('frontend')\n\n frontends = list(frontend_suffixes.keys())\n\n if frontend not in frontends:\n raise RuntimeError('The frontend \\'%s\\\" is not valid. Must be '\n 'one of \\'%s\\', or \\'%s\\'.' %\n (frontend, '\\', \\''.join(frontends[:-1]),\n frontends[-1]))\n\n try:\n module = importlib.import_module('kymatio.' + self.class_name + '.frontend.' + frontend + '_frontend')\n\n # Create frontend-specific class name by inserting frontend name\n # after `Scattering`.\n frontend = frontend_suffixes[frontend]\n\n class_name = self.__class__.__name__\n\n base_name = class_name[:-len('Entry*D')]\n dim_suffix = class_name[-len('*D'):]\n\n class_name = base_name + frontend + dim_suffix\n\n self.__class__ = getattr(module, class_name)\n self.__init__(*args, **kwargs)\n except Exception as e:\n raise e from RuntimeError('\\nThe frontend \\'' + frontend + '\\' could not be correctly imported.')\n\n logging.info('The ' + self.name + ' frontend ' + frontend + ' was imported.')\n\n\n__all__ = ['ScatteringEntry']\n", "path": "kymatio/frontend/entry.py"}], "after_files": [{"content": "import logging\nimport warnings\nimport importlib\n\n\nclass ScatteringEntry(object):\n def __init__(self, *args, **kwargs):\n self.name = kwargs['name']\n self.class_name = kwargs['class_name']\n kwargs.pop('name')\n kwargs.pop('class_name')\n\n frontend_suffixes = {'torch' : 'Torch',\n 'numpy' : 'NumPy',\n 'tensorflow' : 'TensorFlow',\n 'keras': 'Keras',\n 'sklearn': 'Transformer'}\n\n if 'frontend' not in kwargs:\n frontend = 'numpy'\n else:\n frontend = kwargs['frontend'].lower()\n kwargs.pop('frontend')\n\n frontends = list(frontend_suffixes.keys())\n\n if frontend not in frontends:\n raise RuntimeError('The frontend \\'%s\\\" is not valid. Must be '\n 'one of \\'%s\\', or \\'%s\\'.' %\n (frontend, '\\', \\''.join(frontends[:-1]),\n frontends[-1]))\n\n try:\n module = importlib.import_module('kymatio.' + self.class_name + '.frontend.' 
+ frontend + '_frontend')\n\n # Create frontend-specific class name by inserting frontend name\n # after `Scattering`.\n frontend = frontend_suffixes[frontend]\n\n class_name = self.__class__.__name__\n\n base_name = class_name[:-len('Entry*D')]\n dim_suffix = class_name[-len('*D'):]\n\n class_name = base_name + frontend + dim_suffix\n\n self.__class__ = getattr(module, class_name)\n self.__init__(*args, **kwargs)\n except Exception as e:\n raise e from RuntimeError('\\nThe frontend \\'' + frontend + '\\' could not be correctly imported.')\n\n logging.info('The ' + self.name + ' frontend ' + frontend + ' was imported.')\n\n\n__all__ = ['ScatteringEntry']\n", "path": "kymatio/frontend/entry.py"}]}
| 890 | 136 |
gh_patches_debug_14010
|
rasdani/github-patches
|
git_diff
|
vaexio__vaex-404
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Failing to open arrow file format & misleading error message
Vaex fails to open the Arrow file format. It confusingly expects a file written in the Arrow stream format rather than the Arrow file format.
If this is a non-negotiable vaex constraint, a less misleading error message might help very confused novices...
```
import pandas as pd
import pyarrow as pa
import numpy as np
import vaex
df = pd.DataFrame(
{
'col1': range(5)
}
)
table = pa.Table.from_pandas(df)
with pa.OSFile('test2.arrow', 'wb') as sink:
with pa.RecordBatchFileWriter(sink, table.schema) as writer:
writer.write_table(table)
with pa.OSFile('test2.arrow', 'rb') as source:
df = pa.ipc.open_file(source).read_pandas()
df = vaex.open('test2.arrow')
```
Error messages:
```
ERROR:MainThread:vaex:error opening 'test2.arrow'
---------------------------------------------------------------------------
ArrowInvalid Traceback (most recent call last)
in
16 with pa.OSFile('test2.arrow', 'rb') as source:
17 df = pa.ipc.open_file(source).read_pandas()
---> 18 df = vaex.open('test2.arrow')
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\vaex\__init__.py in open(path, convert, shuffle, copy_index, *args, **kwargs)
189 ds = from_csv(path, copy_index=copy_index, **kwargs)
190 else:
--> 191 ds = vaex.file.open(path, *args, **kwargs)
192 if convert and ds:
193 ds.export_hdf5(filename_hdf5, shuffle=shuffle)
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\vaex\file\__init__.py in open(path, *args, **kwargs)
28 for opener in opener_classes:
29 if opener.can_open(path, *args, **kwargs):
---> 30 return opener.open(path, *args, **kwargs)
31 if hdf5:
32 openers.extend(hdf5.dataset.dataset_type_map.items())
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\vaex_arrow\opener.py in open(path, *args, **kwargs)
9 def open(path, *args, **kwargs):
10 from .dataset import DatasetArrow
---> 11 return DatasetArrow(path, *args, **kwargs)
12
13 class ParquetOpener:
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\vaex_arrow\dataset.py in __init__(self, filename, table, write)
18 self._write = write
19 if table is None:
---> 20 self._load()
21 else:
22 self._load_table(table)
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\vaex_arrow\dataset.py in _load(self)
24 def _load(self):
25 source = pa.memory_map(self.path)
---> 26 reader = pa.ipc.open_stream(source)
27 table = pa.Table.from_batches([b for b in reader])
28 self._load_table(table)
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\pyarrow\ipc.py in open_stream(source)
123 reader : RecordBatchStreamReader
124 """
--> 125 return RecordBatchStreamReader(source)
126
127
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\pyarrow\ipc.py in __init__(self, source)
58 """
59 def __init__(self, source):
---> 60 self._open(source)
61
62
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\pyarrow\ipc.pxi in pyarrow.lib._RecordBatchStreamReader._open()
Z:\Systemdateien\Miniconda3\envs\finance\lib\site-packages\pyarrow\error.pxi in pyarrow.lib.check_status()
ArrowInvalid: Expected to read 1330795073 metadata bytes, but only read 1474
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `packages/vaex-arrow/vaex_arrow/dataset.py`
Content:
```
1 __author__ = 'maartenbreddels'
2 import logging
3
4 import pyarrow as pa
5 import pyarrow.parquet as pq
6
7 import vaex.dataset
8 import vaex.file.other
9 from .convert import column_from_arrow_array
10 logger = logging.getLogger("vaex_arrow")
11
12
13 class DatasetArrow(vaex.dataset.DatasetLocal):
14 """Implements storage using arrow"""
15
16 def __init__(self, filename=None, table=None, write=False):
17 super(DatasetArrow, self).__init__(name=filename, path=filename, column_names=[])
18 self._write = write
19 if table is None:
20 self._load()
21 else:
22 self._load_table(table)
23
24 def _load(self):
25 source = pa.memory_map(self.path)
26 reader = pa.ipc.open_stream(source)
27 table = pa.Table.from_batches([b for b in reader])
28 self._load_table(table)
29
30 def _load_table(self, table):
31 self._length_unfiltered = self._length_original = table.num_rows
32 self._index_end = self._length_original = table.num_rows
33 for col in table.columns:
34 name = col.name
35 # TODO: keep the arrow columns, and support and test chunks
36 arrow_array = col.data.chunks[0]
37 column = column_from_arrow_array(arrow_array)
38
39 self.columns[name] = column
40 self.column_names.append(name)
41 self._save_assign_expression(name, vaex.expression.Expression(self, name))
42
43
44 @classmethod
45 def can_open(cls, path, *args, **kwargs):
46 return path.rpartition('.')[2] == 'arrow'
47
48 @classmethod
49 def get_options(cls, path):
50 return []
51
52 @classmethod
53 def option_to_args(cls, option):
54 return []
55
56 class DatasetParquet(DatasetArrow):
57 def _load(self):
58 # might not be optimal, but it works, we can always see if we can
59 # do mmapping later on
60 table = pq.read_table(self.path)
61 self._load_table(table)
62
63 vaex.file.other.dataset_type_map["arrow"] = DatasetArrow
64 vaex.file.other.dataset_type_map["parquet"] = DatasetParquet
65
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/packages/vaex-arrow/vaex_arrow/dataset.py b/packages/vaex-arrow/vaex_arrow/dataset.py
--- a/packages/vaex-arrow/vaex_arrow/dataset.py
+++ b/packages/vaex-arrow/vaex_arrow/dataset.py
@@ -23,8 +23,18 @@
def _load(self):
source = pa.memory_map(self.path)
- reader = pa.ipc.open_stream(source)
- table = pa.Table.from_batches([b for b in reader])
+ try:
+ # first we try if it opens as stream
+ reader = pa.ipc.open_stream(source)
+ except pa.lib.ArrowInvalid:
+ # if not, we open as file
+ reader = pa.ipc.open_file(source)
+ # for some reason this reader is not iterable
+ batches = [reader.get_batch(i) for i in range(reader.num_record_batches)]
+ else:
+ # if a stream, we're good
+ batches = reader # this reader is iterable
+ table = pa.Table.from_batches(batches)
self._load_table(table)
def _load_table(self, table):
|
{"golden_diff": "diff --git a/packages/vaex-arrow/vaex_arrow/dataset.py b/packages/vaex-arrow/vaex_arrow/dataset.py\n--- a/packages/vaex-arrow/vaex_arrow/dataset.py\n+++ b/packages/vaex-arrow/vaex_arrow/dataset.py\n@@ -23,8 +23,18 @@\n \n def _load(self):\n source = pa.memory_map(self.path)\n- reader = pa.ipc.open_stream(source)\n- table = pa.Table.from_batches([b for b in reader])\n+ try:\n+ # first we try if it opens as stream\n+ reader = pa.ipc.open_stream(source)\n+ except pa.lib.ArrowInvalid:\n+ # if not, we open as file\n+ reader = pa.ipc.open_file(source)\n+ # for some reason this reader is not iterable\n+ batches = [reader.get_batch(i) for i in range(reader.num_record_batches)]\n+ else:\n+ # if a stream, we're good\n+ batches = reader # this reader is iterable\n+ table = pa.Table.from_batches(batches)\n self._load_table(table)\n \n def _load_table(self, table):\n", "issue": "Failing to open arrow file format & misleading error message\nVaex fails to open arrow file format. It confusingly expects a file written not in the arrow file-format but in arrow stream-format.\r\n\r\nIf this is a non-negotiable vaex constraint, a less misleading error message might help very confused novices...\r\n\r\n```\r\n\r\nimport pandas as pd\r\nimport pyarrow as pa\r\nimport numpy as np\r\nimport vaex\r\ndf = pd.DataFrame(\r\n {\r\n 'col1': range(5)\r\n }\r\n)\r\ntable = pa.Table.from_pandas(df)\r\n\r\nwith pa.OSFile('test2.arrow', 'wb') as sink:\r\n with pa.RecordBatchFileWriter(sink, table.schema) as writer:\r\n writer.write_table(table)\r\n\r\nwith pa.OSFile('test2.arrow', 'rb') as source:\r\n df = pa.ipc.open_file(source).read_pandas()\r\ndf = vaex.open('test2.arrow')\r\n```\r\n\r\nError messages:\r\n```\r\nERROR:MainThread:vaex:error opening 'test2.arrow'\r\n---------------------------------------------------------------------------\r\nArrowInvalid Traceback (most recent call last)\r\n in \r\n 16 with pa.OSFile('test2.arrow', 'rb') as source:\r\n 17 df = pa.ipc.open_file(source).read_pandas()\r\n---> 18 df = vaex.open('test2.arrow')\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\vaex\\__init__.py in open(path, convert, shuffle, copy_index, *args, **kwargs)\r\n 189 ds = from_csv(path, copy_index=copy_index, **kwargs)\r\n 190 else:\r\n--> 191 ds = vaex.file.open(path, *args, **kwargs)\r\n 192 if convert and ds:\r\n 193 ds.export_hdf5(filename_hdf5, shuffle=shuffle)\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\vaex\\file\\__init__.py in open(path, *args, **kwargs)\r\n 28 for opener in opener_classes:\r\n 29 if opener.can_open(path, *args, **kwargs):\r\n---> 30 return opener.open(path, *args, **kwargs)\r\n 31 if hdf5:\r\n 32 openers.extend(hdf5.dataset.dataset_type_map.items())\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\vaex_arrow\\opener.py in open(path, *args, **kwargs)\r\n 9 def open(path, *args, **kwargs):\r\n 10 from .dataset import DatasetArrow\r\n---> 11 return DatasetArrow(path, *args, **kwargs)\r\n 12 \r\n 13 class ParquetOpener:\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\vaex_arrow\\dataset.py in __init__(self, filename, table, write)\r\n 18 self._write = write\r\n 19 if table is None:\r\n---> 20 self._load()\r\n 21 else:\r\n 22 self._load_table(table)\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\vaex_arrow\\dataset.py in _load(self)\r\n 24 def _load(self):\r\n 25 source = pa.memory_map(self.path)\r\n---> 26 reader = pa.ipc.open_stream(source)\r\n 27 
table = pa.Table.from_batches([b for b in reader])\r\n 28 self._load_table(table)\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\pyarrow\\ipc.py in open_stream(source)\r\n 123 reader : RecordBatchStreamReader\r\n 124 \"\"\"\r\n--> 125 return RecordBatchStreamReader(source)\r\n 126 \r\n 127 \r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\pyarrow\\ipc.py in __init__(self, source)\r\n 58 \"\"\"\r\n 59 def __init__(self, source):\r\n---> 60 self._open(source)\r\n 61 \r\n 62 \r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\pyarrow\\ipc.pxi in pyarrow.lib._RecordBatchStreamReader._open()\r\n\r\nZ:\\Systemdateien\\Miniconda3\\envs\\finance\\lib\\site-packages\\pyarrow\\error.pxi in pyarrow.lib.check_status()\r\n\r\nArrowInvalid: Expected to read 1330795073 metadata bytes, but only read 1474\r\n```\n", "before_files": [{"content": "__author__ = 'maartenbreddels'\nimport logging\n\nimport pyarrow as pa\nimport pyarrow.parquet as pq\n\nimport vaex.dataset\nimport vaex.file.other\nfrom .convert import column_from_arrow_array\nlogger = logging.getLogger(\"vaex_arrow\")\n\n\nclass DatasetArrow(vaex.dataset.DatasetLocal):\n \"\"\"Implements storage using arrow\"\"\"\n\n def __init__(self, filename=None, table=None, write=False):\n super(DatasetArrow, self).__init__(name=filename, path=filename, column_names=[])\n self._write = write\n if table is None:\n self._load()\n else:\n self._load_table(table)\n\n def _load(self):\n source = pa.memory_map(self.path)\n reader = pa.ipc.open_stream(source)\n table = pa.Table.from_batches([b for b in reader])\n self._load_table(table)\n \n def _load_table(self, table):\n self._length_unfiltered = self._length_original = table.num_rows\n self._index_end = self._length_original = table.num_rows\n for col in table.columns:\n name = col.name\n # TODO: keep the arrow columns, and support and test chunks\n arrow_array = col.data.chunks[0]\n column = column_from_arrow_array(arrow_array)\n\n self.columns[name] = column\n self.column_names.append(name)\n self._save_assign_expression(name, vaex.expression.Expression(self, name))\n\n\n @classmethod\n def can_open(cls, path, *args, **kwargs):\n return path.rpartition('.')[2] == 'arrow'\n\n @classmethod\n def get_options(cls, path):\n return []\n\n @classmethod\n def option_to_args(cls, option):\n return []\n\nclass DatasetParquet(DatasetArrow):\n def _load(self):\n # might not be optimal, but it works, we can always see if we can\n # do mmapping later on\n table = pq.read_table(self.path)\n self._load_table(table)\n\nvaex.file.other.dataset_type_map[\"arrow\"] = DatasetArrow\nvaex.file.other.dataset_type_map[\"parquet\"] = DatasetParquet\n\n", "path": "packages/vaex-arrow/vaex_arrow/dataset.py"}], "after_files": [{"content": "__author__ = 'maartenbreddels'\nimport logging\n\nimport pyarrow as pa\nimport pyarrow.parquet as pq\n\nimport vaex.dataset\nimport vaex.file.other\nfrom .convert import column_from_arrow_array\nlogger = logging.getLogger(\"vaex_arrow\")\n\n\nclass DatasetArrow(vaex.dataset.DatasetLocal):\n \"\"\"Implements storage using arrow\"\"\"\n\n def __init__(self, filename=None, table=None, write=False):\n super(DatasetArrow, self).__init__(name=filename, path=filename, column_names=[])\n self._write = write\n if table is None:\n self._load()\n else:\n self._load_table(table)\n\n def _load(self):\n source = pa.memory_map(self.path)\n try:\n # first we try if it opens as stream\n reader = pa.ipc.open_stream(source)\n except 
pa.lib.ArrowInvalid:\n # if not, we open as file\n reader = pa.ipc.open_file(source)\n # for some reason this reader is not iterable\n batches = [reader.get_batch(i) for i in range(reader.num_record_batches)]\n else:\n # if a stream, we're good\n batches = reader # this reader is iterable\n table = pa.Table.from_batches(batches)\n self._load_table(table)\n \n def _load_table(self, table):\n self._length_unfiltered = self._length_original = table.num_rows\n self._index_end = self._length_original = table.num_rows\n for col in table.columns:\n name = col.name\n # TODO: keep the arrow columns, and support and test chunks\n arrow_array = col.data.chunks[0]\n column = column_from_arrow_array(arrow_array)\n\n self.columns[name] = column\n self.column_names.append(name)\n self._save_assign_expression(name, vaex.expression.Expression(self, name))\n\n\n @classmethod\n def can_open(cls, path, *args, **kwargs):\n return path.rpartition('.')[2] == 'arrow'\n\n @classmethod\n def get_options(cls, path):\n return []\n\n @classmethod\n def option_to_args(cls, option):\n return []\n\nclass DatasetParquet(DatasetArrow):\n def _load(self):\n # might not be optimal, but it works, we can always see if we can\n # do mmapping later on\n table = pq.read_table(self.path)\n self._load_table(table)\n\nvaex.file.other.dataset_type_map[\"arrow\"] = DatasetArrow\nvaex.file.other.dataset_type_map[\"parquet\"] = DatasetParquet\n\n", "path": "packages/vaex-arrow/vaex_arrow/dataset.py"}]}
| 1,909 | 259 |
gh_patches_debug_30712
|
rasdani/github-patches
|
git_diff
|
nvaccess__nvda-11841
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Visual Studio: IntelliSense tooltips reported twice
### Steps to reproduce:
1. Open Visual Studio 2019
2. Open a C# project
3. Enable reporting of Tooltips
4. Trigger an IntelliSense autocomplete suggestion by typing something.
5. Arrow through the suggestions
### Actual behavior:
The selected item is announced, followed by twice the tooltip.
### Expected behavior:
The selected item is announced, followed by once the tooltip.
### System configuration
#### NVDA installed/portable/running from source:
Installed
#### NVDA version:
alpha-20957
#### Windows version:
Windows 10 2004
#### Name and version of other software in use when reproducing the issue:
Visual Studio 2019 16.7.3 Enterprise
### Other questions
#### Does the issue still occur after restarting your computer?
Yes
#### Have you tried any other versions of NVDA? If so, please report their behaviors.
No
#### If addons are disabled, is your problem still occurring?
Yes
#### Did you try to run the COM registry fixing tool in NVDA menu / tools?
Yes
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `source/NVDAObjects/UIA/VisualStudio.py`
Content:
```
1 # This file is covered by the GNU General Public License.
2 # See the file COPYING for more details.
3 # Copyright (C) 2020 NV Access Limited, Leonard de Ruijter
4
5 """
6 Object overlay classes for Visual Studio components
7 available in Visual Studio and SQL Server Management Studio.
8 """
9
10 from . import UIA
11 import speech
12 import braille
13 import api
14
15
16 class IntelliSenseItem(UIA):
17
18 def _get_name(self):
19 return self.UIAElement.cachedAutomationID
20
21 def event_UIA_elementSelected(self):
22 # Cancel speech to have speech announce the selection as soon as possible.
23 # This is needed because L{reportFocus} does not cancel speech.
24 # Therefore, if speech wouldn't be cancelled,
25 # selection announcements would queue up when changing selection rapidly.
26 speech.cancelSpeech()
27 api.setNavigatorObject(self, isFocus=True)
28 self.reportFocus()
29 # Display results as flash messages.
30 braille.handler.message(braille.getPropertiesBraille(
31 name=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description
32 ))
33
34
35 class IntelliSenseList(UIA):
36 ...
37
38
39 class IntelliSenseLiveRegion(UIA):
40 """
41 Visual Studio uses both Intellisense menu item objects and a live region
42 to communicate Intellisense selections.
43 NVDA uses the menu item approach and therefore the live region provides doubled information
44 and is disabled.
45 """
46
47 _shouldAllowUIALiveRegionChangeEvent = False
48
49
50 _INTELLISENSE_LIST_AUTOMATION_IDS = {
51 "listBoxCompletions",
52 "CompletionList"
53 }
54
55
56 def findExtraOverlayClasses(obj, clsList):
57 if obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:
58 clsList.insert(0, IntelliSenseList)
59 elif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == "IntellisenseMenuItem":
60 clsList.insert(0, IntelliSenseItem)
61 elif (
62 obj.UIAElement.cachedClassName == "LiveTextBlock"
63 and obj.previous
64 and isinstance(obj.previous.previous, IntelliSenseList)
65 ):
66 clsList.insert(0, IntelliSenseLiveRegion)
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/source/NVDAObjects/UIA/VisualStudio.py b/source/NVDAObjects/UIA/VisualStudio.py
--- a/source/NVDAObjects/UIA/VisualStudio.py
+++ b/source/NVDAObjects/UIA/VisualStudio.py
@@ -7,10 +7,11 @@
available in Visual Studio and SQL Server Management Studio.
"""
-from . import UIA
+from . import UIA, ToolTip
import speech
import braille
import api
+import time
class IntelliSenseItem(UIA):
@@ -53,6 +54,34 @@
}
+class CompletionToolTip(ToolTip):
+ """ A tool tip for which duplicate open events can be fired.
+ """
+
+ #: Keeps track of the last ToolTipOpened event (text, time)
+ _lastToolTipOpenedInfo = (None, None)
+ #: The duplicate tooltip events will be dropped within this time window
+ _preventDuplicateToolTipSeconds = 0.2
+
+ def event_UIA_toolTipOpened(self):
+ oldText, oldTime = self._lastToolTipOpenedInfo
+ newText = self.name
+ newTime = time.time()
+ self.__class__._lastToolTipOpenedInfo = (newText, newTime)
+ withinPossibleDupToolTipTimeWindow = (
+ oldTime is not None
+ and (newTime - oldTime) < self._preventDuplicateToolTipSeconds
+ )
+ if newText == oldText and withinPossibleDupToolTipTimeWindow:
+ # Tool-tip event suspected to be a duplicate, drop the event.
+ # - Users attempting to rapidly re-announce tool-tips may
+ # have the announcement erroneously suppressed
+ # - Users on slower systems (or systems under load) may still
+ # receive duplicate announcements
+ return
+ super().event_UIA_toolTipOpened()
+
+
def findExtraOverlayClasses(obj, clsList):
if obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:
clsList.insert(0, IntelliSenseList)
@@ -64,3 +93,5 @@
and isinstance(obj.previous.previous, IntelliSenseList)
):
clsList.insert(0, IntelliSenseLiveRegion)
+ elif obj.UIAAutomationId == "completion tooltip":
+ clsList.insert(0, CompletionToolTip)
|
{"golden_diff": "diff --git a/source/NVDAObjects/UIA/VisualStudio.py b/source/NVDAObjects/UIA/VisualStudio.py\n--- a/source/NVDAObjects/UIA/VisualStudio.py\n+++ b/source/NVDAObjects/UIA/VisualStudio.py\n@@ -7,10 +7,11 @@\n available in Visual Studio and SQL Server Management Studio.\n \"\"\"\n \n-from . import UIA\n+from . import UIA, ToolTip\n import speech\n import braille\n import api\n+import time\n \n \n class IntelliSenseItem(UIA):\n@@ -53,6 +54,34 @@\n }\n \n \n+class CompletionToolTip(ToolTip):\n+\t\"\"\" A tool tip for which duplicate open events can be fired.\n+\t\"\"\"\n+\n+\t#: Keeps track of the last ToolTipOpened event (text, time)\n+\t_lastToolTipOpenedInfo = (None, None)\n+\t#: The duplicate tooltip events will be dropped within this time window\n+\t_preventDuplicateToolTipSeconds = 0.2\n+\n+\tdef event_UIA_toolTipOpened(self):\n+\t\toldText, oldTime = self._lastToolTipOpenedInfo\n+\t\tnewText = self.name\n+\t\tnewTime = time.time()\n+\t\tself.__class__._lastToolTipOpenedInfo = (newText, newTime)\n+\t\twithinPossibleDupToolTipTimeWindow = (\n+\t\t\toldTime is not None\n+\t\t\tand (newTime - oldTime) < self._preventDuplicateToolTipSeconds\n+\t\t)\n+\t\tif newText == oldText and withinPossibleDupToolTipTimeWindow:\n+\t\t\t# Tool-tip event suspected to be a duplicate, drop the event.\n+\t\t\t# - Users attempting to rapidly re-announce tool-tips may\n+\t\t\t# have the announcement erroneously suppressed\n+\t\t\t# - Users on slower systems (or systems under load) may still\n+\t\t\t# receive duplicate announcements\n+\t\t\treturn\n+\t\tsuper().event_UIA_toolTipOpened()\n+\n+\n def findExtraOverlayClasses(obj, clsList):\n \tif obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:\n \t\tclsList.insert(0, IntelliSenseList)\n@@ -64,3 +93,5 @@\n \t\tand isinstance(obj.previous.previous, IntelliSenseList)\n \t):\n \t\tclsList.insert(0, IntelliSenseLiveRegion)\n+\telif obj.UIAAutomationId == \"completion tooltip\":\n+\t\tclsList.insert(0, CompletionToolTip)\n", "issue": "Visual Studio: IntelliSense tooltips reported twice\n### Steps to reproduce:\r\n1. Open Visual Studio 2019\r\n2. Open a C# project\r\n3. Enable reporting of Tooltips\r\n4. Trigger an IntelliSense autocomplete suggestion by typing something.\r\n5. Arrow through the suggestions\r\n\r\n### Actual behavior:\r\nThe selected item is announced, followed by twice the tooltip.\r\n\r\n### Expected behavior:\r\nThe selected item is announced, followed by once the tooltip.\r\n\r\n### System configuration\r\n#### NVDA installed/portable/running from source:\r\nInstalled\r\n\r\n#### NVDA version:\r\nalpha-20957\r\n\r\n#### Windows version:\r\nWindows 10 2004\r\n\r\n#### Name and version of other software in use when reproducing the issue:\r\nVisual Studio 2019 16.7.3 Enterprise\r\n\r\n### Other questions\r\n#### Does the issue still occur after restarting your computer?\r\nYes\r\n\r\n#### Have you tried any other versions of NVDA? If so, please report their behaviors.\r\nNo\r\n\r\n#### If addons are disabled, is your problem still occuring?\r\nYes\r\n\r\n#### Did you try to run the COM registry fixing tool in NVDA menu / tools?\r\nYes\n", "before_files": [{"content": "# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n# Copyright (C) 2020 NV Access Limited, Leonard de Ruijter\n\n\"\"\"\nObject overlay classes for Visual Studio components\navailable in Visual Studio and SQL Server Management Studio.\n\"\"\"\n\nfrom . 
import UIA\nimport speech\nimport braille\nimport api\n\n\nclass IntelliSenseItem(UIA):\n\n\tdef _get_name(self):\n\t\treturn self.UIAElement.cachedAutomationID\n\n\tdef event_UIA_elementSelected(self):\n\t\t# Cancel speech to have speech announce the selection as soon as possible.\n\t\t# This is needed because L{reportFocus} does not cancel speech.\n\t\t# Therefore, if speech wouldn't be cancelled,\n\t\t# selection announcements would queue up when changing selection rapidly.\n\t\tspeech.cancelSpeech()\n\t\tapi.setNavigatorObject(self, isFocus=True)\n\t\tself.reportFocus()\n\t\t# Display results as flash messages.\n\t\tbraille.handler.message(braille.getPropertiesBraille(\n\t\t\tname=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description\n\t\t))\n\n\nclass IntelliSenseList(UIA):\n\t...\n\n\nclass IntelliSenseLiveRegion(UIA):\n\t\"\"\"\n\tVisual Studio uses both Intellisense menu item objects and a live region\n\tto communicate Intellisense selections.\n\tNVDA uses the menu item approach and therefore the live region provides doubled information\n\tand is disabled.\n\t\"\"\"\n\n\t_shouldAllowUIALiveRegionChangeEvent = False\n\n\n_INTELLISENSE_LIST_AUTOMATION_IDS = {\n\t\"listBoxCompletions\",\n\t\"CompletionList\"\n}\n\n\ndef findExtraOverlayClasses(obj, clsList):\n\tif obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:\n\t\tclsList.insert(0, IntelliSenseList)\n\telif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == \"IntellisenseMenuItem\":\n\t\tclsList.insert(0, IntelliSenseItem)\n\telif (\n\t\tobj.UIAElement.cachedClassName == \"LiveTextBlock\"\n\t\tand obj.previous\n\t\tand isinstance(obj.previous.previous, IntelliSenseList)\n\t):\n\t\tclsList.insert(0, IntelliSenseLiveRegion)\n", "path": "source/NVDAObjects/UIA/VisualStudio.py"}], "after_files": [{"content": "# This file is covered by the GNU General Public License.\n# See the file COPYING for more details.\n# Copyright (C) 2020 NV Access Limited, Leonard de Ruijter\n\n\"\"\"\nObject overlay classes for Visual Studio components\navailable in Visual Studio and SQL Server Management Studio.\n\"\"\"\n\nfrom . 
import UIA, ToolTip\nimport speech\nimport braille\nimport api\nimport time\n\n\nclass IntelliSenseItem(UIA):\n\n\tdef _get_name(self):\n\t\treturn self.UIAElement.cachedAutomationID\n\n\tdef event_UIA_elementSelected(self):\n\t\t# Cancel speech to have speech announce the selection as soon as possible.\n\t\t# This is needed because L{reportFocus} does not cancel speech.\n\t\t# Therefore, if speech wouldn't be cancelled,\n\t\t# selection announcements would queue up when changing selection rapidly.\n\t\tspeech.cancelSpeech()\n\t\tapi.setNavigatorObject(self, isFocus=True)\n\t\tself.reportFocus()\n\t\t# Display results as flash messages.\n\t\tbraille.handler.message(braille.getPropertiesBraille(\n\t\t\tname=self.name, role=self.role, positionInfo=self.positionInfo, description=self.description\n\t\t))\n\n\nclass IntelliSenseList(UIA):\n\t...\n\n\nclass IntelliSenseLiveRegion(UIA):\n\t\"\"\"\n\tVisual Studio uses both Intellisense menu item objects and a live region\n\tto communicate Intellisense selections.\n\tNVDA uses the menu item approach and therefore the live region provides doubled information\n\tand is disabled.\n\t\"\"\"\n\n\t_shouldAllowUIALiveRegionChangeEvent = False\n\n\n_INTELLISENSE_LIST_AUTOMATION_IDS = {\n\t\"listBoxCompletions\",\n\t\"CompletionList\"\n}\n\n\nclass CompletionToolTip(ToolTip):\n\t\"\"\" A tool tip for which duplicate open events can be fired.\n\t\"\"\"\n\n\t#: Keeps track of the last ToolTipOpened event (text, time)\n\t_lastToolTipOpenedInfo = (None, None)\n\t#: The duplicate tooltip events will be dropped within this time window\n\t_preventDuplicateToolTipSeconds = 0.2\n\n\tdef event_UIA_toolTipOpened(self):\n\t\toldText, oldTime = self._lastToolTipOpenedInfo\n\t\tnewText = self.name\n\t\tnewTime = time.time()\n\t\tself.__class__._lastToolTipOpenedInfo = (newText, newTime)\n\t\twithinPossibleDupToolTipTimeWindow = (\n\t\t\toldTime is not None\n\t\t\tand (newTime - oldTime) < self._preventDuplicateToolTipSeconds\n\t\t)\n\t\tif newText == oldText and withinPossibleDupToolTipTimeWindow:\n\t\t\t# Tool-tip event suspected to be a duplicate, drop the event.\n\t\t\t# - Users attempting to rapidly re-announce tool-tips may\n\t\t\t# have the announcement erroneously suppressed\n\t\t\t# - Users on slower systems (or systems under load) may still\n\t\t\t# receive duplicate announcements\n\t\t\treturn\n\t\tsuper().event_UIA_toolTipOpened()\n\n\ndef findExtraOverlayClasses(obj, clsList):\n\tif obj.UIAAutomationId in _INTELLISENSE_LIST_AUTOMATION_IDS:\n\t\tclsList.insert(0, IntelliSenseList)\n\telif isinstance(obj.parent, IntelliSenseList) and obj.UIAElement.cachedClassName == \"IntellisenseMenuItem\":\n\t\tclsList.insert(0, IntelliSenseItem)\n\telif (\n\t\tobj.UIAElement.cachedClassName == \"LiveTextBlock\"\n\t\tand obj.previous\n\t\tand isinstance(obj.previous.previous, IntelliSenseList)\n\t):\n\t\tclsList.insert(0, IntelliSenseLiveRegion)\n\telif obj.UIAAutomationId == \"completion tooltip\":\n\t\tclsList.insert(0, CompletionToolTip)\n", "path": "source/NVDAObjects/UIA/VisualStudio.py"}]}
| 1,123 | 529 |
gh_patches_debug_1740
|
rasdani/github-patches
|
git_diff
|
flairNLP__flair-239
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug in tokenizer?
Here's a minimum viable code to reproduce:
```
from flair.data import Sentence
from flair.models import SequenceTagger
model = SequenceTagger.load("ner-ontonotes-fast")
full_text = "\"In the 1960s and 1970s...\" Then came Thierry Mugler and Gianni Versace."
sentence = Sentence(full_text, use_tokenizer=True)
model.predict(sentence)
print(f"full text : {full_text}")
print(f"text length: {len(full_text)}")
print("tag\tstart\tend\tto_original_text()")
for entity in sentence.get_spans('ner'):
print(f"{entity.tag}\t{entity.start_pos}\t{entity.end_pos}\t{entity.to_original_text()}")
```
Output:
```
$ python predict.py
full text : "In the 1960s and 1970s..." Then came Thierry Mugler and Gianni Versace.
text length: 72
tag start end to_original_text()
DATE 8 13 1960s
DATE 18 23 1970s
PERSON 81 94 ThierryMugler
PERSON 97 110 GianniVersace
```
It seems the resulting tokens have start_pos and end_pos indexes larger than the real text length. Note also that the method to_original_text() is eating the spaces, so I suppose the two issues are related.
Any ideas about what is causing the trouble?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3 setup(
4 name='flair',
5 version='0.3.2',
6 description='A very simple framework for state-of-the-art NLP',
7 long_description=open("README.md", encoding='utf-8').read(),
8 long_description_content_type="text/markdown",
9 author='Alan Akbik',
10 author_email='[email protected]',
11 url='https://github.com/zalandoresearch/flair',
12 packages=find_packages(exclude='test'), # same as name
13 license='MIT',
14 install_requires=[
15 'torch==0.4.1',
16 'gensim==3.4.0',
17 'typing==3.6.4',
18 'tqdm==4.23.4',
19 'segtok==1.5.6',
20 'matplotlib==3.0.0',
21 'mpld3==0.3',
22 'sklearn',
23 'sqlitedict==1.6.0',
24 'deprecated==1.2.4',
25 ],
26 include_package_data=True,
27 python_requires='>=3.6',
28 )
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -15,8 +15,8 @@
'torch==0.4.1',
'gensim==3.4.0',
'typing==3.6.4',
- 'tqdm==4.23.4',
- 'segtok==1.5.6',
+ 'tqdm==4.26.0',
+ 'segtok==1.5.7',
'matplotlib==3.0.0',
'mpld3==0.3',
'sklearn',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -15,8 +15,8 @@\n 'torch==0.4.1',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n- 'tqdm==4.23.4',\n- 'segtok==1.5.6',\n+ 'tqdm==4.26.0',\n+ 'segtok==1.5.7',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n", "issue": "Bug in tokenizer?\nHere's a minimum viable code to reproduce:\r\n\r\n```\r\nfrom flair.data import Sentence\r\nfrom flair.models import SequenceTagger\r\n\r\nmodel = SequenceTagger.load(\"ner-ontonotes-fast\")\r\nfull_text = \"\\\"In the 1960s and 1970s...\\\" Then came Thierry Mugler and Gianni Versace.\"\r\nsentence = Sentence(full_text, use_tokenizer=True)\r\nmodel.predict(sentence)\r\nprint(f\"full text : {full_text}\")\r\nprint(f\"text length: {len(full_text)}\")\r\nprint(\"tag\\tstart\\tend\\tto_original_text()\")\r\nfor entity in sentence.get_spans('ner'):\r\n print(f\"{entity.tag}\\t{entity.start_pos}\\t{entity.end_pos}\\t{entity.to_original_text()}\")\r\n```\r\n\r\nOutput:\r\n\r\n``` $ python predict.py \r\nfull text : \"In the 1960s and 1970s...\" Then came Thierry Mugler and Gianni Versace.\r\ntext length: 72\r\ntag\tstart\tend\tto_original_text()\r\nDATE\t8\t13\t1960s\r\nDATE\t18\t23\t1970s\r\nPERSON\t81\t94\tThierryMugler\r\nPERSON\t97\t110\tGianniVersace\r\n```\r\nSeems the resulting tokens have start_pos and end_pos indexes larger than the real text length. Note also that the method to_original_text() is eating the spaces, so I suppose it is related.\r\n\r\nAny ideas about what is causing the trouble?\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.3.2',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch==0.4.1',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.23.4',\n 'segtok==1.5.6',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n 'sqlitedict==1.6.0',\n 'deprecated==1.2.4',\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.3.2',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch==0.4.1',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.26.0',\n 'segtok==1.5.7',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n 'sqlitedict==1.6.0',\n 'deprecated==1.2.4',\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}]}
| 881 | 142 |
gh_patches_debug_18422
|
rasdani/github-patches
|
git_diff
|
ansible__awx-7280
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Mattermost Notification fails on latest release
##### ISSUE TYPE
- Bug Report
##### SUMMARY
Trying to send a (test) notification to a Mattermost Channel fails with
```
mattermostinfo: Notification failed.
Error sending notification mattermost: {"id":"api.webhook.incoming.error","message":"Could not decode the multipart payload of incoming webhook.","detailed_error":"","request_id":"<request ID>","status_code":400}
```
##### ENVIRONMENT
* AWX version: 11.2.0
* AWX install method: docker on linux
* Ansible version: 2.9.7
* Operating System: CentOS 7.8
* Web Browser: Chrome,Chromium,Firefox
* Mattermost Server Version: 5.22.1
##### STEPS TO REPRODUCE
- Create an incoming webhook
- Create a mattermost notification
- Send a test notification
##### EXPECTED RESULTS
Having a notification in the Channel
##### ACTUAL RESULTS
Sending failed with above error message
##### ADDITIONAL INFORMATION
The error message in the mattermost log shows
```
{"level":"error","ts":1591342011.6592789,"caller":"mlog/log.go:175","msg":"Could not decode the multipart payload of incoming webhook.","path":"/
hooks/<hook ID>","request_id":"<request ID>","ip_addr":"<IP Address>","user_id":"","method":"POST","err_where":"
incomingWebhook","http_code":400,"err_details":"mime: no media type"}
```
---
edit: some IDs removed from the log sample; Mattermost server version added
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awx/main/notifications/mattermost_backend.py`
Content:
```
1 # Copyright (c) 2016 Ansible, Inc.
2 # All Rights Reserved.
3
4 import logging
5 import requests
6 import json
7
8 from django.utils.encoding import smart_text
9 from django.utils.translation import ugettext_lazy as _
10
11 from awx.main.notifications.base import AWXBaseEmailBackend
12 from awx.main.notifications.custom_notification_base import CustomNotificationBase
13
14 logger = logging.getLogger('awx.main.notifications.mattermost_backend')
15
16
17 class MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):
18
19 init_parameters = {"mattermost_url": {"label": "Target URL", "type": "string"},
20 "mattermost_no_verify_ssl": {"label": "Verify SSL", "type": "bool"}}
21 recipient_parameter = "mattermost_url"
22 sender_parameter = None
23
24 def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,
25 mattermost_icon_url=None, fail_silently=False, **kwargs):
26 super(MattermostBackend, self).__init__(fail_silently=fail_silently)
27 self.mattermost_channel = mattermost_channel
28 self.mattermost_username = mattermost_username
29 self.mattermost_icon_url = mattermost_icon_url
30 self.mattermost_no_verify_ssl = mattermost_no_verify_ssl
31
32 def format_body(self, body):
33 return body
34
35 def send_messages(self, messages):
36 sent_messages = 0
37 for m in messages:
38 payload = {}
39 for opt, optval in {'mattermost_icon_url':'icon_url',
40 'mattermost_channel': 'channel', 'mattermost_username': 'username'}.items():
41 optvalue = getattr(self, opt)
42 if optvalue is not None:
43 payload[optval] = optvalue.strip()
44
45 payload['text'] = m.subject
46
47 r = requests.post("{}".format(m.recipients()[0]),
48 data=json.dumps(payload), verify=(not self.mattermost_no_verify_ssl))
49 if r.status_code >= 400:
50 logger.error(smart_text(_("Error sending notification mattermost: {}").format(r.text)))
51 if not self.fail_silently:
52 raise Exception(smart_text(_("Error sending notification mattermost: {}").format(r.text)))
53 sent_messages += 1
54 return sent_messages
55
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/awx/main/notifications/mattermost_backend.py b/awx/main/notifications/mattermost_backend.py
--- a/awx/main/notifications/mattermost_backend.py
+++ b/awx/main/notifications/mattermost_backend.py
@@ -3,7 +3,6 @@
import logging
import requests
-import json
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
@@ -45,7 +44,7 @@
payload['text'] = m.subject
r = requests.post("{}".format(m.recipients()[0]),
- data=json.dumps(payload), verify=(not self.mattermost_no_verify_ssl))
+ json=payload, verify=(not self.mattermost_no_verify_ssl))
if r.status_code >= 400:
logger.error(smart_text(_("Error sending notification mattermost: {}").format(r.text)))
if not self.fail_silently:
|
{"golden_diff": "diff --git a/awx/main/notifications/mattermost_backend.py b/awx/main/notifications/mattermost_backend.py\n--- a/awx/main/notifications/mattermost_backend.py\n+++ b/awx/main/notifications/mattermost_backend.py\n@@ -3,7 +3,6 @@\n \n import logging\n import requests\n-import json\n \n from django.utils.encoding import smart_text\n from django.utils.translation import ugettext_lazy as _\n@@ -45,7 +44,7 @@\n payload['text'] = m.subject\n \n r = requests.post(\"{}\".format(m.recipients()[0]),\n- data=json.dumps(payload), verify=(not self.mattermost_no_verify_ssl))\n+ json=payload, verify=(not self.mattermost_no_verify_ssl))\n if r.status_code >= 400:\n logger.error(smart_text(_(\"Error sending notification mattermost: {}\").format(r.text)))\n if not self.fail_silently:\n", "issue": "Mattermost Notification fails on latest release\n##### ISSUE TYPE\r\n - Bug Report\r\n\r\n##### SUMMARY\r\nTrying to send a (test) notification to a Mattermost Channel fails with\r\n```\r\n mattermostinfo: Notification failed.\r\nError sending notification mattermost: {\"id\":\"api.webhook.incoming.error\",\"message\":\"Could not decode the multipart payload of incoming webhook.\",\"detailed_error\":\"\",\"request_id\":\"<request ID>\",\"status_code\":400}\r\n```\r\n##### ENVIRONMENT\r\n* AWX version: 11.2.0\r\n* AWX install method: docker on linux\r\n* Ansible version: 2.9.7\r\n* Operating System: CentOS 7.8\r\n* Web Browser: Chrome,Chromium,Firefox\r\n* Mattermost Server Version: 5.22.1\r\n\r\n##### STEPS TO REPRODUCE\r\n- Create an incomming webhook\r\n- Create a mattermost notification\r\n- Send a test notification\r\n\r\n\r\n##### EXPECTED RESULTS\r\nHaving a notification in the Channel\r\n\r\n\r\n##### ACTUAL RESULTS\r\n\r\nSending failed with above error message\r\n\r\n##### ADDITIONAL INFORMATION\r\n\r\nThe error message in the mattermost log shows\r\n```\r\n{\"level\":\"error\",\"ts\":1591342011.6592789,\"caller\":\"mlog/log.go:175\",\"msg\":\"Could not decode the multipart payload of incoming webhook.\",\"path\":\"/\r\nhooks/<hook ID>\",\"request_id\":\"<request ID>\",\"ip_addr\":\"<IP Address>\",\"user_id\":\"\",\"method\":\"POST\",\"err_where\":\"\r\nincomingWebhook\",\"http_code\":400,\"err_details\":\"mime: no media type\"}\r\n```\r\n---\r\nedit: some ID removed in the log sample, mattermost server version added\n", "before_files": [{"content": "# Copyright (c) 2016 Ansible, Inc.\n# All Rights Reserved.\n\nimport logging\nimport requests\nimport json\n\nfrom django.utils.encoding import smart_text\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom awx.main.notifications.base import AWXBaseEmailBackend\nfrom awx.main.notifications.custom_notification_base import CustomNotificationBase\n\nlogger = logging.getLogger('awx.main.notifications.mattermost_backend')\n\n\nclass MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):\n\n init_parameters = {\"mattermost_url\": {\"label\": \"Target URL\", \"type\": \"string\"},\n \"mattermost_no_verify_ssl\": {\"label\": \"Verify SSL\", \"type\": \"bool\"}}\n recipient_parameter = \"mattermost_url\"\n sender_parameter = None\n\n def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,\n mattermost_icon_url=None, fail_silently=False, **kwargs):\n super(MattermostBackend, self).__init__(fail_silently=fail_silently)\n self.mattermost_channel = mattermost_channel\n self.mattermost_username = mattermost_username\n self.mattermost_icon_url = mattermost_icon_url\n 
self.mattermost_no_verify_ssl = mattermost_no_verify_ssl\n\n def format_body(self, body):\n return body\n\n def send_messages(self, messages):\n sent_messages = 0\n for m in messages:\n payload = {}\n for opt, optval in {'mattermost_icon_url':'icon_url',\n 'mattermost_channel': 'channel', 'mattermost_username': 'username'}.items():\n optvalue = getattr(self, opt)\n if optvalue is not None:\n payload[optval] = optvalue.strip()\n\n payload['text'] = m.subject\n\n r = requests.post(\"{}\".format(m.recipients()[0]),\n data=json.dumps(payload), verify=(not self.mattermost_no_verify_ssl))\n if r.status_code >= 400:\n logger.error(smart_text(_(\"Error sending notification mattermost: {}\").format(r.text)))\n if not self.fail_silently:\n raise Exception(smart_text(_(\"Error sending notification mattermost: {}\").format(r.text)))\n sent_messages += 1\n return sent_messages\n", "path": "awx/main/notifications/mattermost_backend.py"}], "after_files": [{"content": "# Copyright (c) 2016 Ansible, Inc.\n# All Rights Reserved.\n\nimport logging\nimport requests\n\nfrom django.utils.encoding import smart_text\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom awx.main.notifications.base import AWXBaseEmailBackend\nfrom awx.main.notifications.custom_notification_base import CustomNotificationBase\n\nlogger = logging.getLogger('awx.main.notifications.mattermost_backend')\n\n\nclass MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):\n\n init_parameters = {\"mattermost_url\": {\"label\": \"Target URL\", \"type\": \"string\"},\n \"mattermost_no_verify_ssl\": {\"label\": \"Verify SSL\", \"type\": \"bool\"}}\n recipient_parameter = \"mattermost_url\"\n sender_parameter = None\n\n def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,\n mattermost_icon_url=None, fail_silently=False, **kwargs):\n super(MattermostBackend, self).__init__(fail_silently=fail_silently)\n self.mattermost_channel = mattermost_channel\n self.mattermost_username = mattermost_username\n self.mattermost_icon_url = mattermost_icon_url\n self.mattermost_no_verify_ssl = mattermost_no_verify_ssl\n\n def format_body(self, body):\n return body\n\n def send_messages(self, messages):\n sent_messages = 0\n for m in messages:\n payload = {}\n for opt, optval in {'mattermost_icon_url':'icon_url',\n 'mattermost_channel': 'channel', 'mattermost_username': 'username'}.items():\n optvalue = getattr(self, opt)\n if optvalue is not None:\n payload[optval] = optvalue.strip()\n\n payload['text'] = m.subject\n\n r = requests.post(\"{}\".format(m.recipients()[0]),\n json=payload, verify=(not self.mattermost_no_verify_ssl))\n if r.status_code >= 400:\n logger.error(smart_text(_(\"Error sending notification mattermost: {}\").format(r.text)))\n if not self.fail_silently:\n raise Exception(smart_text(_(\"Error sending notification mattermost: {}\").format(r.text)))\n sent_messages += 1\n return sent_messages\n", "path": "awx/main/notifications/mattermost_backend.py"}]}
| 1,222 | 203 |
gh_patches_debug_10731
|
rasdani/github-patches
|
git_diff
|
litestar-org__litestar-2982
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: openapi schema generation fails for Union of/in msgspec.Struct models
### Description
Hello!
In the latest version(s) (I think this originates from the changes regarding nested models in openapi generation) we cannot use Unions of `msgspec.Struct`s anymore, neither as direct return types for routes nor nested within return types.
The result is a 500 Error. The MCVE below raises `'types.UnionType' object has no attribute '__qualname__'` internally. In our production app I get `typing.Union is not a module, class, method, or function.` instead.
Cheers
### URL to code causing the issue
_No response_
### MCVE
```python
import msgspec
import uvicorn
from litestar import Litestar, get
class SubStructA(msgspec.Struct):
a: int
class SubStructB(msgspec.Struct):
a: int
class StructyStruct(msgspec.Struct):
sub: SubStructA | SubStructB
@get("/subunion")
async def testSubUnion() -> StructyStruct:
return StructyStruct(SubStructA(0))
@get("/union")
async def testUnion() -> SubStructA | SubStructB:
return SubStructA(0)
app = Litestar(route_handlers=[test2]) # or test
uvicorn.run(app)
```
### Steps to reproduce
```bash
Run the example and browse to `localhost:8000/schema`
```
### Screenshots
_No response_
### Logs
_No response_
### Litestar Version
2.5.0
### Platform
- [X] Linux
- [ ] Mac
- [ ] Windows
- [ ] Other (Please specify in the description above)
<!-- POLAR PLEDGE BADGE START -->
---
> [!NOTE]
> While we are open for sponsoring on [GitHub Sponsors](https://github.com/sponsors/litestar-org/) and
> [OpenCollective](https://opencollective.com/litestar), we also utilize [Polar.sh](https://polar.sh/) to engage in pledge-based sponsorship.
>
> Check out all issues funded or available for funding [on our Polar.sh dashboard](https://polar.sh/litestar-org)
> * If you would like to see an issue prioritized, make a pledge towards it!
> * We receive the pledge once the issue is completed & verified
> * This, along with engagement in the community, helps us know which features are a priority to our users.
<a href="https://polar.sh/litestar-org/litestar/issues/2971">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://polar.sh/api/github/litestar-org/litestar/issues/2971/pledge.svg?darkmode=1">
<img alt="Fund with Polar" src="https://polar.sh/api/github/litestar-org/litestar/issues/2971/pledge.svg">
</picture>
</a>
<!-- POLAR PLEDGE BADGE END -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `litestar/_openapi/schema_generation/plugins/struct.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import TYPE_CHECKING
4
5 from msgspec import Struct
6 from msgspec.structs import fields
7
8 from litestar.plugins import OpenAPISchemaPlugin
9 from litestar.types.empty import Empty
10 from litestar.typing import FieldDefinition
11 from litestar.utils.predicates import is_optional_union
12
13 if TYPE_CHECKING:
14 from msgspec.structs import FieldInfo
15
16 from litestar._openapi.schema_generation import SchemaCreator
17 from litestar.openapi.spec import Schema
18
19
20 class StructSchemaPlugin(OpenAPISchemaPlugin):
21 def is_plugin_supported_field(self, field_definition: FieldDefinition) -> bool:
22 return field_definition.is_subclass_of(Struct)
23
24 def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema:
25 def is_field_required(field: FieldInfo) -> bool:
26 return field.required or field.default_factory is Empty
27
28 type_hints = field_definition.get_type_hints(include_extras=True, resolve_generics=True)
29 struct_fields = fields(field_definition.type_)
30
31 return schema_creator.create_component_schema(
32 field_definition,
33 required=sorted(
34 [
35 field.encode_name
36 for field in struct_fields
37 if is_field_required(field=field) and not is_optional_union(type_hints[field.name])
38 ]
39 ),
40 property_fields={
41 field.encode_name: FieldDefinition.from_kwarg(type_hints[field.name], field.encode_name)
42 for field in struct_fields
43 },
44 )
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/litestar/_openapi/schema_generation/plugins/struct.py b/litestar/_openapi/schema_generation/plugins/struct.py
--- a/litestar/_openapi/schema_generation/plugins/struct.py
+++ b/litestar/_openapi/schema_generation/plugins/struct.py
@@ -19,7 +19,7 @@
class StructSchemaPlugin(OpenAPISchemaPlugin):
def is_plugin_supported_field(self, field_definition: FieldDefinition) -> bool:
- return field_definition.is_subclass_of(Struct)
+ return not field_definition.is_union and field_definition.is_subclass_of(Struct)
def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema:
def is_field_required(field: FieldInfo) -> bool:
|
{"golden_diff": "diff --git a/litestar/_openapi/schema_generation/plugins/struct.py b/litestar/_openapi/schema_generation/plugins/struct.py\n--- a/litestar/_openapi/schema_generation/plugins/struct.py\n+++ b/litestar/_openapi/schema_generation/plugins/struct.py\n@@ -19,7 +19,7 @@\n \n class StructSchemaPlugin(OpenAPISchemaPlugin):\n def is_plugin_supported_field(self, field_definition: FieldDefinition) -> bool:\n- return field_definition.is_subclass_of(Struct)\n+ return not field_definition.is_union and field_definition.is_subclass_of(Struct)\n \n def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema:\n def is_field_required(field: FieldInfo) -> bool:\n", "issue": "Bug: openapi schema generation fails for Union of/in msgspec.Struct models\n### Description\r\n\r\nHello!\r\n\r\nIn the latest versions(s) (I think this originates from the changes regarding nested models in openapi generation) we cannot use Unions of `msgspec.Struct`s anymore. Neither as direct return types for routes nor nested within return types. \r\n\r\nThe result is a 500 Error. The MCVE below raises `'types.UnionType' object has no attribute '__qualname__'` internally. In our production app I get `typing.Union is not a module, class, method, or function.` instead.\r\n\r\nCheers\r\n\r\n### URL to code causing the issue\r\n\r\n_No response_\r\n\r\n### MCVE\r\n\r\n```python\r\nimport msgspec\r\nimport uvicorn\r\nfrom litestar import Litestar, get\r\n\r\n\r\nclass SubStructA(msgspec.Struct):\r\n a: int\r\n\r\n\r\nclass SubStructB(msgspec.Struct):\r\n a: int\r\n\r\n\r\nclass StructyStruct(msgspec.Struct):\r\n sub: SubStructA | SubStructB\r\n\r\n\r\n@get(\"/subunion\")\r\nasync def testSubUnion() -> StructyStruct:\r\n return StructyStruct(SubStructA(0))\r\n\r\n\r\n@get(\"/union\")\r\nasync def testUnion() -> SubStructA | SubStructB:\r\n return SubStructA(0)\r\n\r\n\r\napp = Litestar(route_handlers=[test2]) # or test\r\nuvicorn.run(app)\r\n```\r\n\r\n\r\n### Steps to reproduce\r\n\r\n```bash\r\nRun the example and browse to `localhost:8000/schema`\r\n```\r\n\r\n\r\n### Screenshots\r\n\r\n_No response_\r\n\r\n### Logs\r\n\r\n_No response_\r\n\r\n### Litestar Version\r\n\r\n2.5.0\r\n\r\n### Platform\r\n\r\n- [X] Linux\r\n- [ ] Mac\r\n- [ ] Windows\r\n- [ ] Other (Please specify in the description above)\r\n\r\n<!-- POLAR PLEDGE BADGE START -->\r\n---\r\n> [!NOTE] \r\n> While we are open for sponsoring on [GitHub Sponsors](https://github.com/sponsors/litestar-org/) and \r\n> [OpenCollective](https://opencollective.com/litestar), we also utilize [Polar.sh](https://polar.sh/) to engage in pledge-based sponsorship.\r\n>\r\n> Check out all issues funded or available for funding [on our Polar.sh dashboard](https://polar.sh/litestar-org)\r\n> * If you would like to see an issue prioritized, make a pledge towards it!\r\n> * We receive the pledge once the issue is completed & verified\r\n> * This, along with engagement in the community, helps us know which features are a priority to our users.\r\n\r\n<a href=\"https://polar.sh/litestar-org/litestar/issues/2971\">\r\n<picture>\r\n <source media=\"(prefers-color-scheme: dark)\" srcset=\"https://polar.sh/api/github/litestar-org/litestar/issues/2971/pledge.svg?darkmode=1\">\r\n <img alt=\"Fund with Polar\" src=\"https://polar.sh/api/github/litestar-org/litestar/issues/2971/pledge.svg\">\r\n</picture>\r\n</a>\r\n<!-- POLAR PLEDGE BADGE END -->\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import 
TYPE_CHECKING\n\nfrom msgspec import Struct\nfrom msgspec.structs import fields\n\nfrom litestar.plugins import OpenAPISchemaPlugin\nfrom litestar.types.empty import Empty\nfrom litestar.typing import FieldDefinition\nfrom litestar.utils.predicates import is_optional_union\n\nif TYPE_CHECKING:\n from msgspec.structs import FieldInfo\n\n from litestar._openapi.schema_generation import SchemaCreator\n from litestar.openapi.spec import Schema\n\n\nclass StructSchemaPlugin(OpenAPISchemaPlugin):\n def is_plugin_supported_field(self, field_definition: FieldDefinition) -> bool:\n return field_definition.is_subclass_of(Struct)\n\n def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema:\n def is_field_required(field: FieldInfo) -> bool:\n return field.required or field.default_factory is Empty\n\n type_hints = field_definition.get_type_hints(include_extras=True, resolve_generics=True)\n struct_fields = fields(field_definition.type_)\n\n return schema_creator.create_component_schema(\n field_definition,\n required=sorted(\n [\n field.encode_name\n for field in struct_fields\n if is_field_required(field=field) and not is_optional_union(type_hints[field.name])\n ]\n ),\n property_fields={\n field.encode_name: FieldDefinition.from_kwarg(type_hints[field.name], field.encode_name)\n for field in struct_fields\n },\n )\n", "path": "litestar/_openapi/schema_generation/plugins/struct.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import TYPE_CHECKING\n\nfrom msgspec import Struct\nfrom msgspec.structs import fields\n\nfrom litestar.plugins import OpenAPISchemaPlugin\nfrom litestar.types.empty import Empty\nfrom litestar.typing import FieldDefinition\nfrom litestar.utils.predicates import is_optional_union\n\nif TYPE_CHECKING:\n from msgspec.structs import FieldInfo\n\n from litestar._openapi.schema_generation import SchemaCreator\n from litestar.openapi.spec import Schema\n\n\nclass StructSchemaPlugin(OpenAPISchemaPlugin):\n def is_plugin_supported_field(self, field_definition: FieldDefinition) -> bool:\n return not field_definition.is_union and field_definition.is_subclass_of(Struct)\n\n def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema:\n def is_field_required(field: FieldInfo) -> bool:\n return field.required or field.default_factory is Empty\n\n type_hints = field_definition.get_type_hints(include_extras=True, resolve_generics=True)\n struct_fields = fields(field_definition.type_)\n\n return schema_creator.create_component_schema(\n field_definition,\n required=sorted(\n [\n field.encode_name\n for field in struct_fields\n if is_field_required(field=field) and not is_optional_union(type_hints[field.name])\n ]\n ),\n property_fields={\n field.encode_name: FieldDefinition.from_kwarg(type_hints[field.name], field.encode_name)\n for field in struct_fields\n },\n )\n", "path": "litestar/_openapi/schema_generation/plugins/struct.py"}]}
| 1,323 | 168 |
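For context on the union failure described in the record above, here is a minimal, hedged illustration (independent of litestar internals) of why a plain subclass check cannot be applied to a union of msgspec Structs. The class names mirror the MCVE in the issue; the printed values are what CPython's typing module reports.

```python
# Minimal sketch: a Union of msgspec Structs is a typing construct, not a class,
# so any schema plugin that answers "is this a Struct subclass?" must first rule
# out unions, which is what the one-line fix in the golden diff does.
from typing import Union, get_origin

import msgspec


class SubStructA(msgspec.Struct):
    a: int


class SubStructB(msgspec.Struct):
    a: int


SubUnion = Union[SubStructA, SubStructB]

print(get_origin(SubUnion))        # typing.Union: the annotation is a union, not a Struct
print(isinstance(SubUnion, type))  # False: issubclass(SubUnion, msgspec.Struct) would raise TypeError
```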
gh_patches_debug_6714
|
rasdani/github-patches
|
git_diff
|
open-mmlab__mmocr-570
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Write image name to pickle file
Hi MMOCR team,
Thank you for this awesome framework. I have a task to get the coordinates of bounding boxes from the Textsnake model, so I use the --out argument in test.py to export to a pickle file. But when I load this pickle, I just get ‘boundary_result’ and don't know which image each ‘boundary_result’ belongs to. How can I get the image name written to the pickle file? Thank you.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mmocr/models/textdet/dense_heads/head_mixin.py`
Content:
```
1 # Copyright (c) OpenMMLab. All rights reserved.
2 import numpy as np
3
4 from mmocr.models.builder import HEADS
5 from mmocr.models.textdet.postprocess import decode
6 from mmocr.utils import check_argument
7
8
9 @HEADS.register_module()
10 class HeadMixin:
11 """The head minxin for dbnet and pannet heads."""
12
13 def resize_boundary(self, boundaries, scale_factor):
14 """Rescale boundaries via scale_factor.
15
16 Args:
17 boundaries (list[list[float]]): The boundary list. Each boundary
18 with size 2k+1 with k>=4.
19 scale_factor(ndarray): The scale factor of size (4,).
20
21 Returns:
22 boundaries (list[list[float]]): The scaled boundaries.
23 """
24 assert check_argument.is_2dlist(boundaries)
25 assert isinstance(scale_factor, np.ndarray)
26 assert scale_factor.shape[0] == 4
27
28 for b in boundaries:
29 sz = len(b)
30 check_argument.valid_boundary(b, True)
31 b[:sz -
32 1] = (np.array(b[:sz - 1]) *
33 (np.tile(scale_factor[:2], int(
34 (sz - 1) / 2)).reshape(1, sz - 1))).flatten().tolist()
35 return boundaries
36
37 def get_boundary(self, score_maps, img_metas, rescale):
38 """Compute text boundaries via post processing.
39
40 Args:
41 score_maps (Tensor): The text score map.
42 img_metas (dict): The image meta info.
43 rescale (bool): Rescale boundaries to the original image resolution
44 if true, and keep the score_maps resolution if false.
45
46 Returns:
47 results (dict): The result dict.
48 """
49
50 assert check_argument.is_type_list(img_metas, dict)
51 assert isinstance(rescale, bool)
52
53 score_maps = score_maps.squeeze()
54 boundaries = decode(
55 decoding_type=self.decoding_type,
56 preds=score_maps,
57 text_repr_type=self.text_repr_type)
58 if rescale:
59 boundaries = self.resize_boundary(
60 boundaries,
61 1.0 / self.downsample_ratio / img_metas[0]['scale_factor'])
62 results = dict(boundary_result=boundaries)
63 return results
64
65 def loss(self, pred_maps, **kwargs):
66 """Compute the loss for text detection.
67
68 Args:
69 pred_maps (tensor): The input score maps of NxCxHxW.
70
71 Returns:
72 losses (dict): The dict for losses.
73 """
74 losses = self.loss_module(pred_maps, self.downsample_ratio, **kwargs)
75 return losses
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mmocr/models/textdet/dense_heads/head_mixin.py b/mmocr/models/textdet/dense_heads/head_mixin.py
--- a/mmocr/models/textdet/dense_heads/head_mixin.py
+++ b/mmocr/models/textdet/dense_heads/head_mixin.py
@@ -59,7 +59,9 @@
boundaries = self.resize_boundary(
boundaries,
1.0 / self.downsample_ratio / img_metas[0]['scale_factor'])
- results = dict(boundary_result=boundaries)
+ results = dict(
+ boundary_result=boundaries, filename=img_metas[0]['filename'])
+
return results
def loss(self, pred_maps, **kwargs):
|
{"golden_diff": "diff --git a/mmocr/models/textdet/dense_heads/head_mixin.py b/mmocr/models/textdet/dense_heads/head_mixin.py\n--- a/mmocr/models/textdet/dense_heads/head_mixin.py\n+++ b/mmocr/models/textdet/dense_heads/head_mixin.py\n@@ -59,7 +59,9 @@\n boundaries = self.resize_boundary(\n boundaries,\n 1.0 / self.downsample_ratio / img_metas[0]['scale_factor'])\n- results = dict(boundary_result=boundaries)\n+ results = dict(\n+ boundary_result=boundaries, filename=img_metas[0]['filename'])\n+\n return results\n \n def loss(self, pred_maps, **kwargs):\n", "issue": "Write image name to pickle file\nHi MMOCR team,\nThank you for this awesome framework. I have a task to get coordinate of bounding box from Textsnake model, so I use --out argument in test.py to export to a pickle file. But when I load this pickle, I just got \u2018boundary_result\u2019 and don't know this \u2018boundary_result\u2019 belongs to which image. How can I get the image to write to the pickle file? Thank you.\n", "before_files": [{"content": "# Copyright (c) OpenMMLab. All rights reserved.\nimport numpy as np\n\nfrom mmocr.models.builder import HEADS\nfrom mmocr.models.textdet.postprocess import decode\nfrom mmocr.utils import check_argument\n\n\[email protected]_module()\nclass HeadMixin:\n \"\"\"The head minxin for dbnet and pannet heads.\"\"\"\n\n def resize_boundary(self, boundaries, scale_factor):\n \"\"\"Rescale boundaries via scale_factor.\n\n Args:\n boundaries (list[list[float]]): The boundary list. Each boundary\n with size 2k+1 with k>=4.\n scale_factor(ndarray): The scale factor of size (4,).\n\n Returns:\n boundaries (list[list[float]]): The scaled boundaries.\n \"\"\"\n assert check_argument.is_2dlist(boundaries)\n assert isinstance(scale_factor, np.ndarray)\n assert scale_factor.shape[0] == 4\n\n for b in boundaries:\n sz = len(b)\n check_argument.valid_boundary(b, True)\n b[:sz -\n 1] = (np.array(b[:sz - 1]) *\n (np.tile(scale_factor[:2], int(\n (sz - 1) / 2)).reshape(1, sz - 1))).flatten().tolist()\n return boundaries\n\n def get_boundary(self, score_maps, img_metas, rescale):\n \"\"\"Compute text boundaries via post processing.\n\n Args:\n score_maps (Tensor): The text score map.\n img_metas (dict): The image meta info.\n rescale (bool): Rescale boundaries to the original image resolution\n if true, and keep the score_maps resolution if false.\n\n Returns:\n results (dict): The result dict.\n \"\"\"\n\n assert check_argument.is_type_list(img_metas, dict)\n assert isinstance(rescale, bool)\n\n score_maps = score_maps.squeeze()\n boundaries = decode(\n decoding_type=self.decoding_type,\n preds=score_maps,\n text_repr_type=self.text_repr_type)\n if rescale:\n boundaries = self.resize_boundary(\n boundaries,\n 1.0 / self.downsample_ratio / img_metas[0]['scale_factor'])\n results = dict(boundary_result=boundaries)\n return results\n\n def loss(self, pred_maps, **kwargs):\n \"\"\"Compute the loss for text detection.\n\n Args:\n pred_maps (tensor): The input score maps of NxCxHxW.\n\n Returns:\n losses (dict): The dict for losses.\n \"\"\"\n losses = self.loss_module(pred_maps, self.downsample_ratio, **kwargs)\n return losses\n", "path": "mmocr/models/textdet/dense_heads/head_mixin.py"}], "after_files": [{"content": "# Copyright (c) OpenMMLab. 
All rights reserved.\nimport numpy as np\n\nfrom mmocr.models.builder import HEADS\nfrom mmocr.models.textdet.postprocess import decode\nfrom mmocr.utils import check_argument\n\n\[email protected]_module()\nclass HeadMixin:\n \"\"\"The head minxin for dbnet and pannet heads.\"\"\"\n\n def resize_boundary(self, boundaries, scale_factor):\n \"\"\"Rescale boundaries via scale_factor.\n\n Args:\n boundaries (list[list[float]]): The boundary list. Each boundary\n with size 2k+1 with k>=4.\n scale_factor(ndarray): The scale factor of size (4,).\n\n Returns:\n boundaries (list[list[float]]): The scaled boundaries.\n \"\"\"\n assert check_argument.is_2dlist(boundaries)\n assert isinstance(scale_factor, np.ndarray)\n assert scale_factor.shape[0] == 4\n\n for b in boundaries:\n sz = len(b)\n check_argument.valid_boundary(b, True)\n b[:sz -\n 1] = (np.array(b[:sz - 1]) *\n (np.tile(scale_factor[:2], int(\n (sz - 1) / 2)).reshape(1, sz - 1))).flatten().tolist()\n return boundaries\n\n def get_boundary(self, score_maps, img_metas, rescale):\n \"\"\"Compute text boundaries via post processing.\n\n Args:\n score_maps (Tensor): The text score map.\n img_metas (dict): The image meta info.\n rescale (bool): Rescale boundaries to the original image resolution\n if true, and keep the score_maps resolution if false.\n\n Returns:\n results (dict): The result dict.\n \"\"\"\n\n assert check_argument.is_type_list(img_metas, dict)\n assert isinstance(rescale, bool)\n\n score_maps = score_maps.squeeze()\n boundaries = decode(\n decoding_type=self.decoding_type,\n preds=score_maps,\n text_repr_type=self.text_repr_type)\n if rescale:\n boundaries = self.resize_boundary(\n boundaries,\n 1.0 / self.downsample_ratio / img_metas[0]['scale_factor'])\n results = dict(\n boundary_result=boundaries, filename=img_metas[0]['filename'])\n\n return results\n\n def loss(self, pred_maps, **kwargs):\n \"\"\"Compute the loss for text detection.\n\n Args:\n pred_maps (tensor): The input score maps of NxCxHxW.\n\n Returns:\n losses (dict): The dict for losses.\n \"\"\"\n losses = self.loss_module(pred_maps, self.downsample_ratio, **kwargs)\n return losses\n", "path": "mmocr/models/textdet/dense_heads/head_mixin.py"}]}
| 1,065 | 153 |
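To make the value of the one-line patch above concrete, here is a hedged sketch of consuming the `--out` pickle once `filename` is included. The path `results.pkl` is hypothetical and the exact container written by mmocr's test script may differ (a list of per-image dicts is assumed); the 2k-coordinates-plus-score layout of each boundary follows the docstring in `head_mixin.py`.

```python
# Sketch only: assumes the --out file is a pickled list of per-image dicts that,
# after the patch, carry both 'filename' and 'boundary_result'.
import pickle

with open("results.pkl", "rb") as f:      # hypothetical output path
    results = pickle.load(f)

for res in results:
    filename = res.get("filename", "<unknown image>")
    boundaries = res["boundary_result"]
    print(f"{filename}: {len(boundaries)} detected text regions")
    for b in boundaries:
        *coords, score = b                # each boundary is 2k polygon coords followed by a confidence score
        points = list(zip(coords[0::2], coords[1::2]))
        print(f"  score={score:.3f}, first points={points[:3]}")
```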
gh_patches_debug_29188
|
rasdani/github-patches
|
git_diff
|
nilearn__nilearn-2670
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
check_paradigm should check for invalid keys in passed dict
Using the old `nipy` user logic, I passed `amplitude=somethx` instead of `modulation=somethx` to `make_design_matrix`. It didn't crash and there was no "unknown param" error or similar; the values were simply ignored and a default value of 1 was forced...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nilearn/glm/first_level/experimental_paradigm.py`
Content:
```
1 """
2 An experimental protocol is handled as a pandas DataFrame
3 that includes an 'onset' field.
4
5 This yields the onset time of the events in the experimental paradigm.
6 It can also contain:
7
8 * a 'trial_type' field that yields the condition identifier.
9 * a 'duration' field that yields event duration (for so-called block
10 paradigms).
11 * a 'modulation' field that associated a scalar value to each event.
12
13 Author: Bertrand Thirion, 2015
14
15 """
16 import warnings
17
18 import numpy as np
19
20
21 def check_events(events):
22 """Test that the events data describes a valid experimental paradigm
23
24 It is valid if the events data has an 'onset' key.
25
26 Parameters
27 ----------
28 events : pandas DataFrame
29 Events data that describes a functional experimental paradigm.
30
31 Returns
32 -------
33 trial_type : array of shape (n_events,), dtype='s'
34 Per-event experimental conditions identifier.
35 Defaults to np.repeat('dummy', len(onsets)).
36
37 onset : array of shape (n_events,), dtype='f'
38 Per-event onset time (in seconds)
39
40 duration : array of shape (n_events,), dtype='f'
41 Per-event durantion, (in seconds)
42 defaults to zeros(n_events) when no duration is provided
43
44 modulation : array of shape (n_events,), dtype='f'
45 Per-event modulation, (in seconds)
46 defaults to ones(n_events) when no duration is provided.
47
48 """
49 if 'onset' not in events.keys():
50 raise ValueError('The provided events data has no onset column.')
51 if 'duration' not in events.keys():
52 raise ValueError('The provided events data has no duration column.')
53
54 onset = np.array(events['onset'])
55 duration = np.array(events['duration']).astype(np.float)
56 n_events = len(onset)
57 trial_type = np.array(events['trial_type'])
58 modulation = np.ones(n_events)
59 if 'trial_type' not in events.keys():
60 warnings.warn("'trial_type' column not found "
61 "in the given events data.")
62 trial_type = np.repeat('dummy', n_events)
63 if 'modulation' in events.keys():
64 warnings.warn("'modulation' column found in the given events data.")
65 modulation = np.array(events['modulation']).astype(np.float)
66 return trial_type, onset, duration, modulation
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/nilearn/glm/first_level/experimental_paradigm.py b/nilearn/glm/first_level/experimental_paradigm.py
--- a/nilearn/glm/first_level/experimental_paradigm.py
+++ b/nilearn/glm/first_level/experimental_paradigm.py
@@ -17,6 +17,11 @@
import numpy as np
+VALID_FIELDS = set(["onset",
+ "duration",
+ "trial_type",
+ "modulation",
+ ])
def check_events(events):
"""Test that the events data describes a valid experimental paradigm
@@ -54,13 +59,19 @@
onset = np.array(events['onset'])
duration = np.array(events['duration']).astype(np.float)
n_events = len(onset)
- trial_type = np.array(events['trial_type'])
modulation = np.ones(n_events)
if 'trial_type' not in events.keys():
warnings.warn("'trial_type' column not found "
"in the given events data.")
trial_type = np.repeat('dummy', n_events)
+ else:
+ trial_type = np.array(events['trial_type'])
if 'modulation' in events.keys():
warnings.warn("'modulation' column found in the given events data.")
modulation = np.array(events['modulation']).astype(np.float)
+ for event,_ in events.items():
+ if event not in VALID_FIELDS:
+ warnings.warn("Unexpected key `{}` in events "
+ "will be ignored.".format(
+ event))
return trial_type, onset, duration, modulation
|
{"golden_diff": "diff --git a/nilearn/glm/first_level/experimental_paradigm.py b/nilearn/glm/first_level/experimental_paradigm.py\n--- a/nilearn/glm/first_level/experimental_paradigm.py\n+++ b/nilearn/glm/first_level/experimental_paradigm.py\n@@ -17,6 +17,11 @@\n \n import numpy as np\n \n+VALID_FIELDS = set([\"onset\",\n+ \"duration\",\n+ \"trial_type\",\n+ \"modulation\",\n+ ])\n \n def check_events(events):\n \"\"\"Test that the events data describes a valid experimental paradigm\n@@ -54,13 +59,19 @@\n onset = np.array(events['onset'])\n duration = np.array(events['duration']).astype(np.float)\n n_events = len(onset)\n- trial_type = np.array(events['trial_type'])\n modulation = np.ones(n_events)\n if 'trial_type' not in events.keys():\n warnings.warn(\"'trial_type' column not found \"\n \"in the given events data.\")\n trial_type = np.repeat('dummy', n_events)\n+ else:\n+ trial_type = np.array(events['trial_type'])\n if 'modulation' in events.keys():\n warnings.warn(\"'modulation' column found in the given events data.\")\n modulation = np.array(events['modulation']).astype(np.float)\n+ for event,_ in events.items():\n+ if event not in VALID_FIELDS:\n+ warnings.warn(\"Unexpected key `{}` in events \"\n+ \"will be ignored.\".format(\n+ event))\n return trial_type, onset, duration, modulation\n", "issue": "check_paradigm should check for invalid keys in passed dict\nUsing the old `nipy` user logic, I passed `amplitude=somethx` instead of `modulation=somethx` in the `make_design_matrix`. I didn't crash but the values where ignored (e.g Error: unknown param, etc.). A default value of 1 was forced...\n\n", "before_files": [{"content": "\"\"\"\nAn experimental protocol is handled as a pandas DataFrame\nthat includes an 'onset' field.\n\nThis yields the onset time of the events in the experimental paradigm.\nIt can also contain:\n\n * a 'trial_type' field that yields the condition identifier.\n * a 'duration' field that yields event duration (for so-called block\n paradigms).\n * a 'modulation' field that associated a scalar value to each event.\n\nAuthor: Bertrand Thirion, 2015\n\n\"\"\"\nimport warnings\n\nimport numpy as np\n\n\ndef check_events(events):\n \"\"\"Test that the events data describes a valid experimental paradigm\n\n It is valid if the events data has an 'onset' key.\n\n Parameters\n ----------\n events : pandas DataFrame\n Events data that describes a functional experimental paradigm.\n\n Returns\n -------\n trial_type : array of shape (n_events,), dtype='s'\n Per-event experimental conditions identifier.\n Defaults to np.repeat('dummy', len(onsets)).\n\n onset : array of shape (n_events,), dtype='f'\n Per-event onset time (in seconds)\n\n duration : array of shape (n_events,), dtype='f'\n Per-event durantion, (in seconds)\n defaults to zeros(n_events) when no duration is provided\n\n modulation : array of shape (n_events,), dtype='f'\n Per-event modulation, (in seconds)\n defaults to ones(n_events) when no duration is provided.\n\n \"\"\"\n if 'onset' not in events.keys():\n raise ValueError('The provided events data has no onset column.')\n if 'duration' not in events.keys():\n raise ValueError('The provided events data has no duration column.')\n\n onset = np.array(events['onset'])\n duration = np.array(events['duration']).astype(np.float)\n n_events = len(onset)\n trial_type = np.array(events['trial_type'])\n modulation = np.ones(n_events)\n if 'trial_type' not in events.keys():\n warnings.warn(\"'trial_type' column not found \"\n \"in the given events data.\")\n 
trial_type = np.repeat('dummy', n_events)\n if 'modulation' in events.keys():\n warnings.warn(\"'modulation' column found in the given events data.\")\n modulation = np.array(events['modulation']).astype(np.float)\n return trial_type, onset, duration, modulation\n", "path": "nilearn/glm/first_level/experimental_paradigm.py"}], "after_files": [{"content": "\"\"\"\nAn experimental protocol is handled as a pandas DataFrame\nthat includes an 'onset' field.\n\nThis yields the onset time of the events in the experimental paradigm.\nIt can also contain:\n\n * a 'trial_type' field that yields the condition identifier.\n * a 'duration' field that yields event duration (for so-called block\n paradigms).\n * a 'modulation' field that associated a scalar value to each event.\n\nAuthor: Bertrand Thirion, 2015\n\n\"\"\"\nimport warnings\n\nimport numpy as np\n\nVALID_FIELDS = set([\"onset\",\n \"duration\",\n \"trial_type\",\n \"modulation\",\n ])\n\ndef check_events(events):\n \"\"\"Test that the events data describes a valid experimental paradigm\n\n It is valid if the events data has an 'onset' key.\n\n Parameters\n ----------\n events : pandas DataFrame\n Events data that describes a functional experimental paradigm.\n\n Returns\n -------\n trial_type : array of shape (n_events,), dtype='s'\n Per-event experimental conditions identifier.\n Defaults to np.repeat('dummy', len(onsets)).\n\n onset : array of shape (n_events,), dtype='f'\n Per-event onset time (in seconds)\n\n duration : array of shape (n_events,), dtype='f'\n Per-event durantion, (in seconds)\n defaults to zeros(n_events) when no duration is provided\n\n modulation : array of shape (n_events,), dtype='f'\n Per-event modulation, (in seconds)\n defaults to ones(n_events) when no duration is provided.\n\n \"\"\"\n if 'onset' not in events.keys():\n raise ValueError('The provided events data has no onset column.')\n if 'duration' not in events.keys():\n raise ValueError('The provided events data has no duration column.')\n\n onset = np.array(events['onset'])\n duration = np.array(events['duration']).astype(np.float)\n n_events = len(onset)\n modulation = np.ones(n_events)\n if 'trial_type' not in events.keys():\n warnings.warn(\"'trial_type' column not found \"\n \"in the given events data.\")\n trial_type = np.repeat('dummy', n_events)\n else:\n trial_type = np.array(events['trial_type'])\n if 'modulation' in events.keys():\n warnings.warn(\"'modulation' column found in the given events data.\")\n modulation = np.array(events['modulation']).astype(np.float)\n for event,_ in events.items():\n if event not in VALID_FIELDS:\n warnings.warn(\"Unexpected key `{}` in events \"\n \"will be ignored.\".format(\n event))\n return trial_type, onset, duration, modulation\n", "path": "nilearn/glm/first_level/experimental_paradigm.py"}]}
| 983 | 351 |
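As a usage illustration of the patched `check_events` above, the following hedged sketch builds a small events table with a stray `amplitude` column (mirroring the issue) and shows the warning the new `VALID_FIELDS` check emits. It assumes the patched module is importable as `nilearn.glm.first_level.experimental_paradigm` and that pandas is available.

```python
# Sketch: with the patch applied, an unknown events column such as 'amplitude'
# is no longer silently dropped; a warning is emitted instead.
import warnings

import pandas as pd

from nilearn.glm.first_level.experimental_paradigm import check_events

events = pd.DataFrame({
    "onset": [0.0, 10.0, 20.0],
    "duration": [1.0, 1.0, 1.0],
    "trial_type": ["a", "b", "a"],
    "amplitude": [1.0, 0.5, 2.0],   # should have been 'modulation'
})

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    trial_type, onset, duration, modulation = check_events(events)

print([str(w.message) for w in caught])  # includes: Unexpected key `amplitude` in events will be ignored.
print(modulation)                        # all ones, because 'modulation' was never provided
```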
gh_patches_debug_25318
|
rasdani/github-patches
|
git_diff
|
getsentry__sentry-python-484
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Celery - Queue object has no attribute 'all_tasks_done'
Hi all,
I'm integrating Sentry in a Python project that uses Celery. I'm getting this error when shutting down the worker:
```
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py", line 84, in flush
self._wait_flush(timeout, callback)
File "/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py", line 90, in _wait_flush
if not self._timed_queue_join(initial_timeout):
File "/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py", line 48, in _timed_queue_join
queue.all_tasks_done.acquire() # type: ignore
AttributeError: 'Queue' object has no attribute 'all_tasks_done'
```
I'm using:
- Python 3.6
- Celery 4.3.0
- OSX Mojave
Any thoughts?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sentry_sdk/worker.py`
Content:
```
1 import os
2
3 from threading import Thread, Lock
4 from time import sleep, time
5 from sentry_sdk._compat import queue, check_thread_support
6 from sentry_sdk.utils import logger
7
8
9 from sentry_sdk._types import MYPY
10
11 if MYPY:
12 from queue import Queue
13 from typing import Any
14 from typing import Optional
15 from typing import Callable
16
17
18 _TERMINATOR = object()
19
20
21 class BackgroundWorker(object):
22 def __init__(self):
23 # type: () -> None
24 check_thread_support()
25 self._queue = queue.Queue(-1) # type: Queue[Any]
26 self._lock = Lock()
27 self._thread = None # type: Optional[Thread]
28 self._thread_for_pid = None # type: Optional[int]
29
30 @property
31 def is_alive(self):
32 # type: () -> bool
33 if self._thread_for_pid != os.getpid():
34 return False
35 if not self._thread:
36 return False
37 return self._thread.is_alive()
38
39 def _ensure_thread(self):
40 # type: () -> None
41 if not self.is_alive:
42 self.start()
43
44 def _timed_queue_join(self, timeout):
45 # type: (float) -> bool
46 deadline = time() + timeout
47 queue = self._queue
48 queue.all_tasks_done.acquire() # type: ignore
49 try:
50 while queue.unfinished_tasks: # type: ignore
51 delay = deadline - time()
52 if delay <= 0:
53 return False
54 queue.all_tasks_done.wait(timeout=delay) # type: ignore
55 return True
56 finally:
57 queue.all_tasks_done.release() # type: ignore
58
59 def start(self):
60 # type: () -> None
61 with self._lock:
62 if not self.is_alive:
63 self._thread = Thread(
64 target=self._target, name="raven-sentry.BackgroundWorker"
65 )
66 self._thread.setDaemon(True)
67 self._thread.start()
68 self._thread_for_pid = os.getpid()
69
70 def kill(self):
71 # type: () -> None
72 logger.debug("background worker got kill request")
73 with self._lock:
74 if self._thread:
75 self._queue.put_nowait(_TERMINATOR)
76 self._thread = None
77 self._thread_for_pid = None
78
79 def flush(self, timeout, callback=None):
80 # type: (float, Optional[Any]) -> None
81 logger.debug("background worker got flush request")
82 with self._lock:
83 if self.is_alive and timeout > 0.0:
84 self._wait_flush(timeout, callback)
85 logger.debug("background worker flushed")
86
87 def _wait_flush(self, timeout, callback):
88 # type: (float, Optional[Any]) -> None
89 initial_timeout = min(0.1, timeout)
90 if not self._timed_queue_join(initial_timeout):
91 pending = self._queue.qsize()
92 logger.debug("%d event(s) pending on flush", pending)
93 if callback is not None:
94 callback(pending, timeout)
95 self._timed_queue_join(timeout - initial_timeout)
96
97 def submit(self, callback):
98 # type: (Callable[[], None]) -> None
99 self._ensure_thread()
100 self._queue.put_nowait(callback)
101
102 def _target(self):
103 # type: () -> None
104 while True:
105 callback = self._queue.get()
106 try:
107 if callback is _TERMINATOR:
108 break
109 try:
110 callback()
111 except Exception:
112 logger.error("Failed processing job", exc_info=True)
113 finally:
114 self._queue.task_done()
115 sleep(0)
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sentry_sdk/worker.py b/sentry_sdk/worker.py
--- a/sentry_sdk/worker.py
+++ b/sentry_sdk/worker.py
@@ -45,16 +45,33 @@
# type: (float) -> bool
deadline = time() + timeout
queue = self._queue
- queue.all_tasks_done.acquire() # type: ignore
+
+ real_all_tasks_done = getattr(
+ queue, "all_tasks_done", None
+ ) # type: Optional[Any]
+ if real_all_tasks_done is not None:
+ real_all_tasks_done.acquire()
+ all_tasks_done = real_all_tasks_done # type: Optional[Any]
+ elif queue.__module__.startswith("eventlet."):
+ all_tasks_done = getattr(queue, "_cond", None)
+ else:
+ all_tasks_done = None
+
try:
while queue.unfinished_tasks: # type: ignore
delay = deadline - time()
if delay <= 0:
return False
- queue.all_tasks_done.wait(timeout=delay) # type: ignore
+ if all_tasks_done is not None:
+ all_tasks_done.wait(timeout=delay)
+ else:
+ # worst case, we just poll the number of remaining tasks
+ sleep(0.1)
+
return True
finally:
- queue.all_tasks_done.release() # type: ignore
+ if real_all_tasks_done is not None:
+ real_all_tasks_done.release() # type: ignore
def start(self):
# type: () -> None
|
{"golden_diff": "diff --git a/sentry_sdk/worker.py b/sentry_sdk/worker.py\n--- a/sentry_sdk/worker.py\n+++ b/sentry_sdk/worker.py\n@@ -45,16 +45,33 @@\n # type: (float) -> bool\n deadline = time() + timeout\n queue = self._queue\n- queue.all_tasks_done.acquire() # type: ignore\n+\n+ real_all_tasks_done = getattr(\n+ queue, \"all_tasks_done\", None\n+ ) # type: Optional[Any]\n+ if real_all_tasks_done is not None:\n+ real_all_tasks_done.acquire()\n+ all_tasks_done = real_all_tasks_done # type: Optional[Any]\n+ elif queue.__module__.startswith(\"eventlet.\"):\n+ all_tasks_done = getattr(queue, \"_cond\", None)\n+ else:\n+ all_tasks_done = None\n+\n try:\n while queue.unfinished_tasks: # type: ignore\n delay = deadline - time()\n if delay <= 0:\n return False\n- queue.all_tasks_done.wait(timeout=delay) # type: ignore\n+ if all_tasks_done is not None:\n+ all_tasks_done.wait(timeout=delay)\n+ else:\n+ # worst case, we just poll the number of remaining tasks\n+ sleep(0.1)\n+\n return True\n finally:\n- queue.all_tasks_done.release() # type: ignore\n+ if real_all_tasks_done is not None:\n+ real_all_tasks_done.release() # type: ignore\n \n def start(self):\n # type: () -> None\n", "issue": "Celery - Queue object has no attribute 'all_tasks_done'\nHi all, \r\n\r\nI'm integrating Sentry on a project in python that uses Celery. I'm getting this error when shutting down the worker: \r\n\r\n```\r\nError in atexit._run_exitfuncs:\r\nTraceback (most recent call last):\r\n File \"/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py\", line 84, in flush\r\n self._wait_flush(timeout, callback)\r\n File \"/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py\", line 90, in _wait_flush\r\n if not self._timed_queue_join(initial_timeout):\r\n File \"/Users/jibanez/API/.conda/envs/cimrender/lib/python3.6/site-packages/sentry_sdk/worker.py\", line 48, in _timed_queue_join\r\n queue.all_tasks_done.acquire() # type: ignore\r\nAttributeError: 'Queue' object has no attribute 'all_tasks_done'\r\n```\r\n\r\nI'm using: \r\n- Python 3.6\r\n- Celery 4.3.0\r\n- OSX Mojave\r\n\r\nAny thoughts? 
\n", "before_files": [{"content": "import os\n\nfrom threading import Thread, Lock\nfrom time import sleep, time\nfrom sentry_sdk._compat import queue, check_thread_support\nfrom sentry_sdk.utils import logger\n\n\nfrom sentry_sdk._types import MYPY\n\nif MYPY:\n from queue import Queue\n from typing import Any\n from typing import Optional\n from typing import Callable\n\n\n_TERMINATOR = object()\n\n\nclass BackgroundWorker(object):\n def __init__(self):\n # type: () -> None\n check_thread_support()\n self._queue = queue.Queue(-1) # type: Queue[Any]\n self._lock = Lock()\n self._thread = None # type: Optional[Thread]\n self._thread_for_pid = None # type: Optional[int]\n\n @property\n def is_alive(self):\n # type: () -> bool\n if self._thread_for_pid != os.getpid():\n return False\n if not self._thread:\n return False\n return self._thread.is_alive()\n\n def _ensure_thread(self):\n # type: () -> None\n if not self.is_alive:\n self.start()\n\n def _timed_queue_join(self, timeout):\n # type: (float) -> bool\n deadline = time() + timeout\n queue = self._queue\n queue.all_tasks_done.acquire() # type: ignore\n try:\n while queue.unfinished_tasks: # type: ignore\n delay = deadline - time()\n if delay <= 0:\n return False\n queue.all_tasks_done.wait(timeout=delay) # type: ignore\n return True\n finally:\n queue.all_tasks_done.release() # type: ignore\n\n def start(self):\n # type: () -> None\n with self._lock:\n if not self.is_alive:\n self._thread = Thread(\n target=self._target, name=\"raven-sentry.BackgroundWorker\"\n )\n self._thread.setDaemon(True)\n self._thread.start()\n self._thread_for_pid = os.getpid()\n\n def kill(self):\n # type: () -> None\n logger.debug(\"background worker got kill request\")\n with self._lock:\n if self._thread:\n self._queue.put_nowait(_TERMINATOR)\n self._thread = None\n self._thread_for_pid = None\n\n def flush(self, timeout, callback=None):\n # type: (float, Optional[Any]) -> None\n logger.debug(\"background worker got flush request\")\n with self._lock:\n if self.is_alive and timeout > 0.0:\n self._wait_flush(timeout, callback)\n logger.debug(\"background worker flushed\")\n\n def _wait_flush(self, timeout, callback):\n # type: (float, Optional[Any]) -> None\n initial_timeout = min(0.1, timeout)\n if not self._timed_queue_join(initial_timeout):\n pending = self._queue.qsize()\n logger.debug(\"%d event(s) pending on flush\", pending)\n if callback is not None:\n callback(pending, timeout)\n self._timed_queue_join(timeout - initial_timeout)\n\n def submit(self, callback):\n # type: (Callable[[], None]) -> None\n self._ensure_thread()\n self._queue.put_nowait(callback)\n\n def _target(self):\n # type: () -> None\n while True:\n callback = self._queue.get()\n try:\n if callback is _TERMINATOR:\n break\n try:\n callback()\n except Exception:\n logger.error(\"Failed processing job\", exc_info=True)\n finally:\n self._queue.task_done()\n sleep(0)\n", "path": "sentry_sdk/worker.py"}], "after_files": [{"content": "import os\n\nfrom threading import Thread, Lock\nfrom time import sleep, time\nfrom sentry_sdk._compat import queue, check_thread_support\nfrom sentry_sdk.utils import logger\n\n\nfrom sentry_sdk._types import MYPY\n\nif MYPY:\n from queue import Queue\n from typing import Any\n from typing import Optional\n from typing import Callable\n\n\n_TERMINATOR = object()\n\n\nclass BackgroundWorker(object):\n def __init__(self):\n # type: () -> None\n check_thread_support()\n self._queue = queue.Queue(-1) # type: Queue[Any]\n self._lock = Lock()\n self._thread = None 
# type: Optional[Thread]\n self._thread_for_pid = None # type: Optional[int]\n\n @property\n def is_alive(self):\n # type: () -> bool\n if self._thread_for_pid != os.getpid():\n return False\n if not self._thread:\n return False\n return self._thread.is_alive()\n\n def _ensure_thread(self):\n # type: () -> None\n if not self.is_alive:\n self.start()\n\n def _timed_queue_join(self, timeout):\n # type: (float) -> bool\n deadline = time() + timeout\n queue = self._queue\n\n real_all_tasks_done = getattr(\n queue, \"all_tasks_done\", None\n ) # type: Optional[Any]\n if real_all_tasks_done is not None:\n real_all_tasks_done.acquire()\n all_tasks_done = real_all_tasks_done # type: Optional[Any]\n elif queue.__module__.startswith(\"eventlet.\"):\n all_tasks_done = getattr(queue, \"_cond\", None)\n else:\n all_tasks_done = None\n\n try:\n while queue.unfinished_tasks: # type: ignore\n delay = deadline - time()\n if delay <= 0:\n return False\n if all_tasks_done is not None:\n all_tasks_done.wait(timeout=delay)\n else:\n # worst case, we just poll the number of remaining tasks\n sleep(0.1)\n\n return True\n finally:\n if real_all_tasks_done is not None:\n real_all_tasks_done.release() # type: ignore\n\n def start(self):\n # type: () -> None\n with self._lock:\n if not self.is_alive:\n self._thread = Thread(\n target=self._target, name=\"raven-sentry.BackgroundWorker\"\n )\n self._thread.setDaemon(True)\n self._thread.start()\n self._thread_for_pid = os.getpid()\n\n def kill(self):\n # type: () -> None\n logger.debug(\"background worker got kill request\")\n with self._lock:\n if self._thread:\n self._queue.put_nowait(_TERMINATOR)\n self._thread = None\n self._thread_for_pid = None\n\n def flush(self, timeout, callback=None):\n # type: (float, Optional[Any]) -> None\n logger.debug(\"background worker got flush request\")\n with self._lock:\n if self.is_alive and timeout > 0.0:\n self._wait_flush(timeout, callback)\n logger.debug(\"background worker flushed\")\n\n def _wait_flush(self, timeout, callback):\n # type: (float, Optional[Any]) -> None\n initial_timeout = min(0.1, timeout)\n if not self._timed_queue_join(initial_timeout):\n pending = self._queue.qsize()\n logger.debug(\"%d event(s) pending on flush\", pending)\n if callback is not None:\n callback(pending, timeout)\n self._timed_queue_join(timeout - initial_timeout)\n\n def submit(self, callback):\n # type: (Callable[[], None]) -> None\n self._ensure_thread()\n self._queue.put_nowait(callback)\n\n def _target(self):\n # type: () -> None\n while True:\n callback = self._queue.get()\n try:\n if callback is _TERMINATOR:\n break\n try:\n callback()\n except Exception:\n logger.error(\"Failed processing job\", exc_info=True)\n finally:\n self._queue.task_done()\n sleep(0)\n", "path": "sentry_sdk/worker.py"}]}
| 1,560 | 354 |
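The essence of the fix above is easier to see outside the SDK. Below is a distilled, hedged sketch of the same fallback (the real patch additionally special-cases eventlet queues via their private `_cond`); it works with any queue-like object that exposes `unfinished_tasks`, including monkeypatched queues that lack `all_tasks_done`.

```python
# Distilled sketch of the patched _timed_queue_join: only use Queue.all_tasks_done
# when the (possibly monkeypatched) queue actually has it; otherwise poll until
# the deadline expires.
from time import sleep, time


def timed_queue_join(queue, timeout):
    deadline = time() + timeout
    all_tasks_done = getattr(queue, "all_tasks_done", None)
    if all_tasks_done is not None:
        all_tasks_done.acquire()
    try:
        while queue.unfinished_tasks:
            delay = deadline - time()
            if delay <= 0:
                return False
            if all_tasks_done is not None:
                all_tasks_done.wait(timeout=delay)
            else:
                sleep(0.1)  # worst case: poll the remaining-task counter
        return True
    finally:
        if all_tasks_done is not None:
            all_tasks_done.release()
```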
gh_patches_debug_12533
|
rasdani/github-patches
|
git_diff
|
getnikola__nikola-2108
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
output/assets/css/code.css is orphaned?
```
~/blog$ nikola build
Scanning posts................done!
copy_assets:output/assets/css/base.css
Scanning posts................done!
~/blog$
~/blog$ nikola build
Scanning posts................done!
~/blog$ nikola check -f
Scanning posts................done!
WARNING: check: Files from unknown origins (orphans):
WARNING: check: output/assets/css/code.css
~/blog$ nikola build
Scanning posts................done!
copy_assets:output/assets/css/base.css
~/blog$ nikola check -f
Scanning posts................done!
WARNING: check: Files from unknown origins (orphans):
WARNING: check: output/assets/css/code.css
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nikola/plugins/task/copy_assets.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright © 2012-2015 Roberto Alsina and others.
4
5 # Permission is hereby granted, free of charge, to any
6 # person obtaining a copy of this software and associated
7 # documentation files (the "Software"), to deal in the
8 # Software without restriction, including without limitation
9 # the rights to use, copy, modify, merge, publish,
10 # distribute, sublicense, and/or sell copies of the
11 # Software, and to permit persons to whom the Software is
12 # furnished to do so, subject to the following conditions:
13 #
14 # The above copyright notice and this permission notice
15 # shall be included in all copies or substantial portions of
16 # the Software.
17 #
18 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
19 # KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
20 # WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
21 # PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
22 # OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
23 # OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
24 # OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
25 # SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
26
27 """Copy theme assets into output."""
28
29 from __future__ import unicode_literals
30
31 import io
32 import os
33
34 from nikola.plugin_categories import Task
35 from nikola import utils
36
37
38 class CopyAssets(Task):
39
40 """Copy theme assets into output."""
41
42 name = "copy_assets"
43
44 def gen_tasks(self):
45 """Create tasks to copy the assets of the whole theme chain.
46
47 If a file is present on two themes, use the version
48 from the "youngest" theme.
49 """
50 kw = {
51 "themes": self.site.THEMES,
52 "files_folders": self.site.config['FILES_FOLDERS'],
53 "output_folder": self.site.config['OUTPUT_FOLDER'],
54 "filters": self.site.config['FILTERS'],
55 "code_color_scheme": self.site.config['CODE_COLOR_SCHEME'],
56 "code.css_selectors": 'pre.code',
57 "code.css_head": '/* code.css file generated by Nikola */\n',
58 "code.css_close": "\ntable.codetable { width: 100%;} td.linenos {text-align: right; width: 4em;}\n",
59 }
60 tasks = {}
61 code_css_path = os.path.join(kw['output_folder'], 'assets', 'css', 'code.css')
62 code_css_input = utils.get_asset_path('assets/css/code.css',
63 themes=kw['themes'],
64 files_folders=kw['files_folders'])
65
66 kw["code.css_input"] = code_css_input
67
68 yield self.group_task()
69
70 for theme_name in kw['themes']:
71 src = os.path.join(utils.get_theme_path(theme_name), 'assets')
72 dst = os.path.join(kw['output_folder'], 'assets')
73 for task in utils.copy_tree(src, dst):
74 if task['name'] in tasks:
75 continue
76 tasks[task['name']] = task
77 task['uptodate'] = [utils.config_changed(kw, 'nikola.plugins.task.copy_assets')]
78 task['basename'] = self.name
79 if code_css_input:
80 if 'file_dep' not in task:
81 task['file_dep'] = []
82 task['file_dep'].append(code_css_input)
83 yield utils.apply_filters(task, kw['filters'])
84
85 # Check whether or not there is a code.css file around.
86 if not code_css_input:
87 def create_code_css():
88 from pygments.formatters import get_formatter_by_name
89 formatter = get_formatter_by_name('html', style=kw["code_color_scheme"])
90 utils.makedirs(os.path.dirname(code_css_path))
91 with io.open(code_css_path, 'w+', encoding='utf8') as outf:
92 outf.write(kw["code.css_head"])
93 outf.write(formatter.get_style_defs(kw["code.css_selectors"]))
94 outf.write(kw["code.css_close"])
95
96 if os.path.exists(code_css_path):
97 with io.open(code_css_path, 'r', encoding='utf-8') as fh:
98 testcontents = fh.read(len(kw["code.css_head"])) == kw["code.css_head"]
99 else:
100 testcontents = False
101
102 task = {
103 'basename': self.name,
104 'name': code_css_path,
105 'targets': [code_css_path],
106 'uptodate': [utils.config_changed(kw, 'nikola.plugins.task.copy_assets'), testcontents],
107 'actions': [(create_code_css, [])],
108 'clean': True,
109 }
110 yield utils.apply_filters(task, kw['filters'])
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/nikola/plugins/task/copy_assets.py b/nikola/plugins/task/copy_assets.py
--- a/nikola/plugins/task/copy_assets.py
+++ b/nikola/plugins/task/copy_assets.py
@@ -61,10 +61,7 @@
code_css_path = os.path.join(kw['output_folder'], 'assets', 'css', 'code.css')
code_css_input = utils.get_asset_path('assets/css/code.css',
themes=kw['themes'],
- files_folders=kw['files_folders'])
-
- kw["code.css_input"] = code_css_input
-
+ files_folders=kw['files_folders'], output_dir=None)
yield self.group_task()
for theme_name in kw['themes']:
|
{"golden_diff": "diff --git a/nikola/plugins/task/copy_assets.py b/nikola/plugins/task/copy_assets.py\n--- a/nikola/plugins/task/copy_assets.py\n+++ b/nikola/plugins/task/copy_assets.py\n@@ -61,10 +61,7 @@\n code_css_path = os.path.join(kw['output_folder'], 'assets', 'css', 'code.css')\n code_css_input = utils.get_asset_path('assets/css/code.css',\n themes=kw['themes'],\n- files_folders=kw['files_folders'])\n-\n- kw[\"code.css_input\"] = code_css_input\n-\n+ files_folders=kw['files_folders'], output_dir=None)\n yield self.group_task()\n \n for theme_name in kw['themes']:\n", "issue": "output/assets/css/code.css is orphaned?\n```\n~/blog$ nikola build\nScanning posts................done!\ncopy_assets:output/assets/css/base.css\nScanning posts................done!\n~/blog$ \n~/blog$ nikola build\nScanning posts................done!\n~/blog$ nikola check -f\nScanning posts................done!\nWARNING: check: Files from unknown origins (orphans):\nWARNING: check: output/assets/css/code.css\n~/blog$ nikola build\nScanning posts................done!\ncopy_assets:output/assets/css/base.css\n~/blog$ nikola check -f\nScanning posts................done!\nWARNING: check: Files from unknown origins (orphans):\nWARNING: check: output/assets/css/code.css\n```\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2012-2015 Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\n\"\"\"Copy theme assets into output.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport io\nimport os\n\nfrom nikola.plugin_categories import Task\nfrom nikola import utils\n\n\nclass CopyAssets(Task):\n\n \"\"\"Copy theme assets into output.\"\"\"\n\n name = \"copy_assets\"\n\n def gen_tasks(self):\n \"\"\"Create tasks to copy the assets of the whole theme chain.\n\n If a file is present on two themes, use the version\n from the \"youngest\" theme.\n \"\"\"\n kw = {\n \"themes\": self.site.THEMES,\n \"files_folders\": self.site.config['FILES_FOLDERS'],\n \"output_folder\": self.site.config['OUTPUT_FOLDER'],\n \"filters\": self.site.config['FILTERS'],\n \"code_color_scheme\": self.site.config['CODE_COLOR_SCHEME'],\n \"code.css_selectors\": 'pre.code',\n \"code.css_head\": '/* code.css file generated by Nikola */\\n',\n \"code.css_close\": \"\\ntable.codetable { width: 100%;} td.linenos {text-align: right; width: 4em;}\\n\",\n }\n tasks = {}\n code_css_path = os.path.join(kw['output_folder'], 'assets', 'css', 'code.css')\n code_css_input = utils.get_asset_path('assets/css/code.css',\n themes=kw['themes'],\n files_folders=kw['files_folders'])\n\n kw[\"code.css_input\"] = code_css_input\n\n yield self.group_task()\n\n for theme_name in kw['themes']:\n src = os.path.join(utils.get_theme_path(theme_name), 'assets')\n dst = os.path.join(kw['output_folder'], 'assets')\n for task in utils.copy_tree(src, dst):\n if task['name'] in tasks:\n continue\n tasks[task['name']] = task\n task['uptodate'] = [utils.config_changed(kw, 'nikola.plugins.task.copy_assets')]\n task['basename'] = self.name\n if code_css_input:\n if 'file_dep' not in task:\n task['file_dep'] = []\n task['file_dep'].append(code_css_input)\n yield utils.apply_filters(task, kw['filters'])\n\n # Check whether or not there is a code.css file around.\n if not code_css_input:\n def create_code_css():\n from pygments.formatters import get_formatter_by_name\n formatter = get_formatter_by_name('html', style=kw[\"code_color_scheme\"])\n utils.makedirs(os.path.dirname(code_css_path))\n with io.open(code_css_path, 'w+', encoding='utf8') as outf:\n outf.write(kw[\"code.css_head\"])\n outf.write(formatter.get_style_defs(kw[\"code.css_selectors\"]))\n outf.write(kw[\"code.css_close\"])\n\n if os.path.exists(code_css_path):\n with io.open(code_css_path, 'r', encoding='utf-8') as fh:\n testcontents = fh.read(len(kw[\"code.css_head\"])) == kw[\"code.css_head\"]\n else:\n testcontents = False\n\n task = {\n 'basename': self.name,\n 'name': code_css_path,\n 'targets': [code_css_path],\n 'uptodate': [utils.config_changed(kw, 'nikola.plugins.task.copy_assets'), testcontents],\n 'actions': [(create_code_css, [])],\n 'clean': True,\n }\n yield utils.apply_filters(task, kw['filters'])\n", "path": "nikola/plugins/task/copy_assets.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright \u00a9 2012-2015 Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish,\n# distribute, sublicense, and/or sell copies of 
the\n# Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice\n# shall be included in all copies or substantial portions of\n# the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\n# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\n# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\n# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR\n# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\n\"\"\"Copy theme assets into output.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport io\nimport os\n\nfrom nikola.plugin_categories import Task\nfrom nikola import utils\n\n\nclass CopyAssets(Task):\n\n \"\"\"Copy theme assets into output.\"\"\"\n\n name = \"copy_assets\"\n\n def gen_tasks(self):\n \"\"\"Create tasks to copy the assets of the whole theme chain.\n\n If a file is present on two themes, use the version\n from the \"youngest\" theme.\n \"\"\"\n kw = {\n \"themes\": self.site.THEMES,\n \"files_folders\": self.site.config['FILES_FOLDERS'],\n \"output_folder\": self.site.config['OUTPUT_FOLDER'],\n \"filters\": self.site.config['FILTERS'],\n \"code_color_scheme\": self.site.config['CODE_COLOR_SCHEME'],\n \"code.css_selectors\": 'pre.code',\n \"code.css_head\": '/* code.css file generated by Nikola */\\n',\n \"code.css_close\": \"\\ntable.codetable { width: 100%;} td.linenos {text-align: right; width: 4em;}\\n\",\n }\n tasks = {}\n code_css_path = os.path.join(kw['output_folder'], 'assets', 'css', 'code.css')\n code_css_input = utils.get_asset_path('assets/css/code.css',\n themes=kw['themes'],\n files_folders=kw['files_folders'], output_dir=None)\n yield self.group_task()\n\n for theme_name in kw['themes']:\n src = os.path.join(utils.get_theme_path(theme_name), 'assets')\n dst = os.path.join(kw['output_folder'], 'assets')\n for task in utils.copy_tree(src, dst):\n if task['name'] in tasks:\n continue\n tasks[task['name']] = task\n task['uptodate'] = [utils.config_changed(kw, 'nikola.plugins.task.copy_assets')]\n task['basename'] = self.name\n if code_css_input:\n if 'file_dep' not in task:\n task['file_dep'] = []\n task['file_dep'].append(code_css_input)\n yield utils.apply_filters(task, kw['filters'])\n\n # Check whether or not there is a code.css file around.\n if not code_css_input:\n def create_code_css():\n from pygments.formatters import get_formatter_by_name\n formatter = get_formatter_by_name('html', style=kw[\"code_color_scheme\"])\n utils.makedirs(os.path.dirname(code_css_path))\n with io.open(code_css_path, 'w+', encoding='utf8') as outf:\n outf.write(kw[\"code.css_head\"])\n outf.write(formatter.get_style_defs(kw[\"code.css_selectors\"]))\n outf.write(kw[\"code.css_close\"])\n\n if os.path.exists(code_css_path):\n with io.open(code_css_path, 'r', encoding='utf-8') as fh:\n testcontents = fh.read(len(kw[\"code.css_head\"])) == kw[\"code.css_head\"]\n else:\n testcontents = False\n\n task = {\n 'basename': self.name,\n 'name': code_css_path,\n 'targets': [code_css_path],\n 'uptodate': [utils.config_changed(kw, 'nikola.plugins.task.copy_assets'), testcontents],\n 'actions': [(create_code_css, [])],\n 'clean': True,\n }\n yield utils.apply_filters(task, kw['filters'])\n", "path": "nikola/plugins/task/copy_assets.py"}]}
| 1,638 | 162 |
gh_patches_debug_26319
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-center-index-5256
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[package] fontconfig/2.13.93: Please make gettext dependency optional
### Package and Environment Details (include every applicable attribute)
* Package Name/Version: **fontconfig/2.13.93**
* Operating System+version: **MacOs**
The current recipe adds an unconditional dependency on libgettext/0.20.1 on MacOs.
Since libgettext is licensed under GPLv3, it places an additional license restriction on an otherwise more liberally licensed library.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/fontconfig/all/conanfile.py`
Content:
```
1 import os
2 import glob
3
4 from conans import ConanFile, tools, AutoToolsBuildEnvironment
5 from conans.errors import ConanInvalidConfiguration
6
7
8 class FontconfigConan(ConanFile):
9 name = "fontconfig"
10 license = "MIT"
11 url = "https://github.com/conan-io/conan-center-index"
12 description = "Fontconfig is a library for configuring and customizing font access"
13 homepage = "https://gitlab.freedesktop.org/fontconfig/fontconfig"
14 topics = ("conan", "fontconfig", "fonts", "freedesktop")
15 settings = "os", "compiler", "build_type", "arch"
16 options = {"shared": [True, False], "fPIC": [True, False]}
17 default_options = {"shared": False, "fPIC": True}
18 generators = "pkg_config"
19
20 _autotools = None
21
22 @property
23 def _source_subfolder(self):
24 return "source_subfolder"
25
26 def config_options(self):
27 if self.settings.os == "Windows":
28 del self.options.fPIC
29
30 def configure(self):
31 if self.settings.compiler == "Visual Studio":
32 raise ConanInvalidConfiguration("Visual Studio builds are not supported.")
33 if self.options.shared:
34 del self.options.fPIC
35 del self.settings.compiler.libcxx
36 del self.settings.compiler.cppstd
37
38 def requirements(self):
39 self.requires("freetype/2.10.4")
40 self.requires("expat/2.2.10")
41 if self.settings.os == "Linux":
42 self.requires("libuuid/1.0.3")
43 elif self.settings.os == "Macos":
44 self.requires("libgettext/0.20.1")
45
46 def build_requirements(self):
47 self.build_requires("gperf/3.1")
48 self.build_requires("pkgconf/1.7.3")
49 if tools.os_info.is_windows and not tools.get_env("CONAN_BASH_PATH"):
50 self.build_requires("msys2/20200517")
51
52 def source(self):
53 tools.get(**self.conan_data["sources"][self.version])
54 extrated_dir = self.name + "-" + self.version
55 os.rename(extrated_dir, self._source_subfolder)
56
57 def _configure_autotools(self):
58 if not self._autotools:
59 args = ["--enable-static=%s" % ("no" if self.options.shared else "yes"),
60 "--enable-shared=%s" % ("yes" if self.options.shared else "no"),
61 "--disable-docs"]
62 args.append("--sysconfdir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "etc")))
63 args.append("--datadir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "share")))
64 args.append("--datarootdir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "share")))
65 args.append("--localstatedir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "var")))
66 self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)
67 self._autotools.libs = []
68 self._autotools.configure(configure_dir=self._source_subfolder, args=args)
69 tools.replace_in_file("Makefile", "po-conf test", "po-conf")
70 return self._autotools
71
72 def _patch_files(self):
73 # - fontconfig requires libtool version number, change it for the corresponding freetype one
74 tools.replace_in_file(os.path.join(self._source_subfolder, 'configure'), '21.0.15', '2.8.1')
75
76 def build(self):
77 # Patch files from dependencies
78 self._patch_files()
79 with tools.run_environment(self):
80 autotools = self._configure_autotools()
81 autotools.make()
82
83 def package(self):
84 self.copy("COPYING", dst="licenses", src=self._source_subfolder)
85 with tools.run_environment(self):
86 autotools = self._configure_autotools()
87 autotools.install()
88 os.unlink(os.path.join(self.package_folder, "lib", "libfontconfig.la"))
89 tools.rmdir(os.path.join(self.package_folder, "lib", "pkgconfig"))
90 for f in glob.glob(os.path.join(self.package_folder, "bin", "etc", "fonts", "conf.d", "*.conf")):
91 if os.path.islink(f):
92 os.unlink(f)
93 for def_file in glob.glob(os.path.join(self.package_folder, "lib", "*.def")):
94 os.remove(def_file)
95
96 def package_info(self):
97 self.cpp_info.libs = ["fontconfig"]
98 if self.settings.os in ["Linux", "FreeBSD"]:
99 self.cpp_info.system_libs.extend(["m", "pthread"])
100 self.cpp_info.names["cmake_find_package"] = "Fontconfig"
101 self.cpp_info.names["cmake_find_package_multi"] = "Fontconfig"
102
103 fontconfig_file = os.path.join(self.package_folder, "bin", "etc", "fonts", "fonts.conf")
104 self.output.info("Creating FONTCONFIG_FILE environment variable: {}".format(fontconfig_file))
105 self.env_info.FONTCONFIG_FILE = fontconfig_file
106 fontconfig_path = os.path.join(self.package_folder, "bin", "etc", "fonts")
107 self.output.info("Creating FONTCONFIG_PATH environment variable: {}".format(fontconfig_path))
108 self.env_info.FONTCONFIG_PATH = fontconfig_path
109
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/recipes/fontconfig/all/conanfile.py b/recipes/fontconfig/all/conanfile.py
--- a/recipes/fontconfig/all/conanfile.py
+++ b/recipes/fontconfig/all/conanfile.py
@@ -40,8 +40,6 @@
self.requires("expat/2.2.10")
if self.settings.os == "Linux":
self.requires("libuuid/1.0.3")
- elif self.settings.os == "Macos":
- self.requires("libgettext/0.20.1")
def build_requirements(self):
self.build_requires("gperf/3.1")
@@ -58,7 +56,9 @@
if not self._autotools:
args = ["--enable-static=%s" % ("no" if self.options.shared else "yes"),
"--enable-shared=%s" % ("yes" if self.options.shared else "no"),
- "--disable-docs"]
+ "--disable-docs",
+ "--disable-nls",
+ ]
args.append("--sysconfdir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "etc")))
args.append("--datadir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "share")))
args.append("--datarootdir=%s" % tools.unix_path(os.path.join(self.package_folder, "bin", "share")))
|
{"golden_diff": "diff --git a/recipes/fontconfig/all/conanfile.py b/recipes/fontconfig/all/conanfile.py\n--- a/recipes/fontconfig/all/conanfile.py\n+++ b/recipes/fontconfig/all/conanfile.py\n@@ -40,8 +40,6 @@\n self.requires(\"expat/2.2.10\")\n if self.settings.os == \"Linux\":\n self.requires(\"libuuid/1.0.3\")\n- elif self.settings.os == \"Macos\":\n- self.requires(\"libgettext/0.20.1\")\n \n def build_requirements(self):\n self.build_requires(\"gperf/3.1\")\n@@ -58,7 +56,9 @@\n if not self._autotools:\n args = [\"--enable-static=%s\" % (\"no\" if self.options.shared else \"yes\"),\n \"--enable-shared=%s\" % (\"yes\" if self.options.shared else \"no\"),\n- \"--disable-docs\"]\n+ \"--disable-docs\",\n+ \"--disable-nls\",\n+ ]\n args.append(\"--sysconfdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"etc\")))\n args.append(\"--datadir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n args.append(\"--datarootdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n", "issue": "[package] fontconfig/2.13.93: Please make gettext dependency optional\n### Package and Environment Details (include every applicable attribute)\r\n * Package Name/Version: **fontconfig/2.13.93**\r\n * Operating System+version: **MacOs**\r\n\r\nThe current recipe adds an unconditional dependency on libgettext/0.20.1 on MacOs.\r\n\r\nSince libgettext is licensed under GPLv3, it places an additional license restriction to an otherwise more liberally licensed library.\r\n\r\n</details>\r\n\n", "before_files": [{"content": "import os\nimport glob\n\nfrom conans import ConanFile, tools, AutoToolsBuildEnvironment\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass FontconfigConan(ConanFile):\n name = \"fontconfig\"\n license = \"MIT\"\n url = \"https://github.com/conan-io/conan-center-index\"\n description = \"Fontconfig is a library for configuring and customizing font access\"\n homepage = \"https://gitlab.freedesktop.org/fontconfig/fontconfig\"\n topics = (\"conan\", \"fontconfig\", \"fonts\", \"freedesktop\")\n settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\"shared\": [True, False], \"fPIC\": [True, False]}\n default_options = {\"shared\": False, \"fPIC\": True}\n generators = \"pkg_config\"\n\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.settings.compiler == \"Visual Studio\":\n raise ConanInvalidConfiguration(\"Visual Studio builds are not supported.\")\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n\n def requirements(self):\n self.requires(\"freetype/2.10.4\")\n self.requires(\"expat/2.2.10\")\n if self.settings.os == \"Linux\":\n self.requires(\"libuuid/1.0.3\")\n elif self.settings.os == \"Macos\":\n self.requires(\"libgettext/0.20.1\")\n\n def build_requirements(self):\n self.build_requires(\"gperf/3.1\")\n self.build_requires(\"pkgconf/1.7.3\")\n if tools.os_info.is_windows and not tools.get_env(\"CONAN_BASH_PATH\"):\n self.build_requires(\"msys2/20200517\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extrated_dir = self.name + \"-\" + self.version\n os.rename(extrated_dir, self._source_subfolder)\n\n def _configure_autotools(self):\n if not self._autotools:\n args = [\"--enable-static=%s\" % (\"no\" if 
self.options.shared else \"yes\"),\n \"--enable-shared=%s\" % (\"yes\" if self.options.shared else \"no\"),\n \"--disable-docs\"]\n args.append(\"--sysconfdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"etc\")))\n args.append(\"--datadir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n args.append(\"--datarootdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n args.append(\"--localstatedir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"var\")))\n self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)\n self._autotools.libs = []\n self._autotools.configure(configure_dir=self._source_subfolder, args=args)\n tools.replace_in_file(\"Makefile\", \"po-conf test\", \"po-conf\")\n return self._autotools\n\n def _patch_files(self):\n # - fontconfig requires libtool version number, change it for the corresponding freetype one\n tools.replace_in_file(os.path.join(self._source_subfolder, 'configure'), '21.0.15', '2.8.1')\n\n def build(self):\n # Patch files from dependencies\n self._patch_files()\n with tools.run_environment(self):\n autotools = self._configure_autotools()\n autotools.make()\n\n def package(self):\n self.copy(\"COPYING\", dst=\"licenses\", src=self._source_subfolder)\n with tools.run_environment(self):\n autotools = self._configure_autotools()\n autotools.install()\n os.unlink(os.path.join(self.package_folder, \"lib\", \"libfontconfig.la\"))\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n for f in glob.glob(os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\", \"conf.d\", \"*.conf\")):\n if os.path.islink(f):\n os.unlink(f)\n for def_file in glob.glob(os.path.join(self.package_folder, \"lib\", \"*.def\")):\n os.remove(def_file)\n\n def package_info(self):\n self.cpp_info.libs = [\"fontconfig\"]\n if self.settings.os in [\"Linux\", \"FreeBSD\"]:\n self.cpp_info.system_libs.extend([\"m\", \"pthread\"])\n self.cpp_info.names[\"cmake_find_package\"] = \"Fontconfig\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"Fontconfig\"\n\n fontconfig_file = os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\", \"fonts.conf\")\n self.output.info(\"Creating FONTCONFIG_FILE environment variable: {}\".format(fontconfig_file))\n self.env_info.FONTCONFIG_FILE = fontconfig_file\n fontconfig_path = os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\")\n self.output.info(\"Creating FONTCONFIG_PATH environment variable: {}\".format(fontconfig_path))\n self.env_info.FONTCONFIG_PATH = fontconfig_path\n", "path": "recipes/fontconfig/all/conanfile.py"}], "after_files": [{"content": "import os\nimport glob\n\nfrom conans import ConanFile, tools, AutoToolsBuildEnvironment\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass FontconfigConan(ConanFile):\n name = \"fontconfig\"\n license = \"MIT\"\n url = \"https://github.com/conan-io/conan-center-index\"\n description = \"Fontconfig is a library for configuring and customizing font access\"\n homepage = \"https://gitlab.freedesktop.org/fontconfig/fontconfig\"\n topics = (\"conan\", \"fontconfig\", \"fonts\", \"freedesktop\")\n settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\"shared\": [True, False], \"fPIC\": [True, False]}\n default_options = {\"shared\": False, \"fPIC\": True}\n generators = \"pkg_config\"\n\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def 
config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.settings.compiler == \"Visual Studio\":\n raise ConanInvalidConfiguration(\"Visual Studio builds are not supported.\")\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n\n def requirements(self):\n self.requires(\"freetype/2.10.4\")\n self.requires(\"expat/2.2.10\")\n if self.settings.os == \"Linux\":\n self.requires(\"libuuid/1.0.3\")\n\n def build_requirements(self):\n self.build_requires(\"gperf/3.1\")\n self.build_requires(\"pkgconf/1.7.3\")\n if tools.os_info.is_windows and not tools.get_env(\"CONAN_BASH_PATH\"):\n self.build_requires(\"msys2/20200517\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extrated_dir = self.name + \"-\" + self.version\n os.rename(extrated_dir, self._source_subfolder)\n\n def _configure_autotools(self):\n if not self._autotools:\n args = [\"--enable-static=%s\" % (\"no\" if self.options.shared else \"yes\"),\n \"--enable-shared=%s\" % (\"yes\" if self.options.shared else \"no\"),\n \"--disable-docs\",\n \"--disable-nls\",\n ]\n args.append(\"--sysconfdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"etc\")))\n args.append(\"--datadir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n args.append(\"--datarootdir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"share\")))\n args.append(\"--localstatedir=%s\" % tools.unix_path(os.path.join(self.package_folder, \"bin\", \"var\")))\n self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)\n self._autotools.libs = []\n self._autotools.configure(configure_dir=self._source_subfolder, args=args)\n tools.replace_in_file(\"Makefile\", \"po-conf test\", \"po-conf\")\n return self._autotools\n\n def _patch_files(self):\n # - fontconfig requires libtool version number, change it for the corresponding freetype one\n tools.replace_in_file(os.path.join(self._source_subfolder, 'configure'), '21.0.15', '2.8.1')\n\n def build(self):\n # Patch files from dependencies\n self._patch_files()\n with tools.run_environment(self):\n autotools = self._configure_autotools()\n autotools.make()\n\n def package(self):\n self.copy(\"COPYING\", dst=\"licenses\", src=self._source_subfolder)\n with tools.run_environment(self):\n autotools = self._configure_autotools()\n autotools.install()\n os.unlink(os.path.join(self.package_folder, \"lib\", \"libfontconfig.la\"))\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n for f in glob.glob(os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\", \"conf.d\", \"*.conf\")):\n if os.path.islink(f):\n os.unlink(f)\n for def_file in glob.glob(os.path.join(self.package_folder, \"lib\", \"*.def\")):\n os.remove(def_file)\n\n def package_info(self):\n self.cpp_info.libs = [\"fontconfig\"]\n if self.settings.os in [\"Linux\", \"FreeBSD\"]:\n self.cpp_info.system_libs.extend([\"m\", \"pthread\"])\n self.cpp_info.names[\"cmake_find_package\"] = \"Fontconfig\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"Fontconfig\"\n\n fontconfig_file = os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\", \"fonts.conf\")\n self.output.info(\"Creating FONTCONFIG_FILE environment variable: {}\".format(fontconfig_file))\n self.env_info.FONTCONFIG_FILE = fontconfig_file\n fontconfig_path = os.path.join(self.package_folder, \"bin\", \"etc\", \"fonts\")\n 
self.output.info(\"Creating FONTCONFIG_PATH environment variable: {}\".format(fontconfig_path))\n self.env_info.FONTCONFIG_PATH = fontconfig_path\n", "path": "recipes/fontconfig/all/conanfile.py"}]}
| 1,786 | 311 |
gh_patches_debug_37168
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-center-index-2696
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[request] sentry-native/0.4.1
### Package Details
* Package Name/Version: **sentry-native/0.4.1**
* Changelog: **https://github.com/getsentry/sentry-native/blob/0.4.1/CHANGELOG.md**
https://github.com/getsentry/sentry-native/tree/0.4.1
The above mentioned version is newly released by the upstream project and not yet available as a recipe. Please add this version.
Also, **please add windows support.**
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/sentry-native/all/conanfile.py`
Content:
```
1 import os
2 from conans import ConanFile, CMake, tools
3 from conans.errors import ConanInvalidConfiguration
4
5
6 class SentryNativeConan(ConanFile):
7 name = "sentry-native"
8 description = "The Sentry Native SDK is an error and crash reporting client for native applications,\n" \
9 "optimized for C and C++. Sentry allows to add tags,\n" \
10 "breadcrumbs and arbitrary custom context to enrich error reports."
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "https://github.com/getsentry/sentry-native"
13 license = "MIT"
14 topics = ("conan", "breakpad", "crashpad",
15 "error-reporting", "crash-reporting")
16 exports_sources = ["CMakeLists.txt"]
17 generators = "cmake", "cmake_find_package"
18 settings = "os", "arch", "compiler", "build_type"
19 options = {
20 "shared": [True, False],
21 "fPIC": [True, False],
22 "backend": ["none", "inproc", "crashpad", "breakpad"],
23 "transport": ["none", "curl", "winhttp"],
24 }
25 default_options = {
26 "shared": False,
27 "fPIC": True,
28 "backend": "inproc",
29 "transport": "curl"
30 }
31
32 @property
33 def _source_subfolder(self):
34 return "source_subfolder"
35
36 _cmake = None
37
38 def requirements(self):
39 if self.options.transport == "curl":
40 self.requires("libcurl/7.68.0")
41
42 if self.options.backend == "crashpad":
43 raise ConanInvalidConfiguration("crashpad not available yet in CCI")
44 if self.options.backend == "breakpad":
45 raise ConanInvalidConfiguration("breakpad not available yet in CCI")
46
47 def config_options(self):
48 if self.settings.os == "Windows":
49 del self.options.fPIC
50
51 def source(self):
52 tools.get(**self.conan_data["sources"][self.version])
53 extracted_dir = self.name + "-" + self.version
54 os.rename(extracted_dir, self._source_subfolder)
55
56 def configure(self):
57 if self.options.backend == "inproc" and self.settings.os == "Windows":
58 raise ConanInvalidConfiguration("The in-process backend is not supported on Windows")
59
60 def _configure_cmake(self):
61 if self._cmake:
62 return self._cmake
63 self._cmake = CMake(self)
64 self._cmake.definitions["SENTRY_BACKEND"] = self.options.backend
65 self._cmake.definitions["SENTRY_ENABLE_INSTALL"] = True
66 self._cmake.definitions["SENTRY_TRANSPORT"] = self.options.transport
67 self._cmake.definitions["SENTRY_PIC"] = self.options.get_safe("fPIC", False)
68 self._cmake.configure()
69 return self._cmake
70
71 def build(self):
72 cmake = self._configure_cmake()
73 cmake.build()
74
75 def package(self):
76 self.copy("LICENSE", dst="licenses", src=self._source_subfolder)
77 cmake = self._configure_cmake()
78 cmake.install()
79 tools.rmdir(os.path.join(self.package_folder, "lib", "cmake"))
80
81 def package_info(self):
82 self.cpp_info.libs = ["sentry"]
83 if self.settings.os in ("Android", "Windows"):
84 self.cpp_info.exelinkflags= ["--build-id=sha1"]
85 self.cpp_info.sharedlinkflags = ["--build-id=sha1"]
86 if self.settings.os == "Linux":
87 self.cpp_info.system_libs = ["pthread", "dl"]
88 elif self.settings.os == "Windows":
89 self.cpp_info.system_libs = ["winhttp", "dbghelp", "pathcch"]
90
91 if not self.options.shared:
92 self.cpp_info.defines = ["SENTRY_BUILD_STATIC"]
93
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/recipes/sentry-native/all/conanfile.py b/recipes/sentry-native/all/conanfile.py
--- a/recipes/sentry-native/all/conanfile.py
+++ b/recipes/sentry-native/all/conanfile.py
@@ -1,4 +1,5 @@
import os
+import glob
from conans import ConanFile, CMake, tools
from conans.errors import ConanInvalidConfiguration
@@ -37,8 +38,8 @@
def requirements(self):
if self.options.transport == "curl":
- self.requires("libcurl/7.68.0")
-
+ self.requires("libcurl/7.71.0")
+
if self.options.backend == "crashpad":
raise ConanInvalidConfiguration("crashpad not available yet in CCI")
if self.options.backend == "breakpad":
@@ -54,7 +55,7 @@
os.rename(extracted_dir, self._source_subfolder)
def configure(self):
- if self.options.backend == "inproc" and self.settings.os == "Windows":
+ if self.options.backend == "inproc" and self.settings.os == "Windows" and tools.Version(self.version) < "0.4":
raise ConanInvalidConfiguration("The in-process backend is not supported on Windows")
def _configure_cmake(self):
@@ -77,16 +78,18 @@
cmake = self._configure_cmake()
cmake.install()
tools.rmdir(os.path.join(self.package_folder, "lib", "cmake"))
+ for pdb in glob.glob(os.path.join(self.package_folder, "bin", "*.pdb")):
+ os.unlink(pdb)
def package_info(self):
self.cpp_info.libs = ["sentry"]
- if self.settings.os in ("Android", "Windows"):
- self.cpp_info.exelinkflags= ["--build-id=sha1"]
- self.cpp_info.sharedlinkflags = ["--build-id=sha1"]
+ if self.settings.os in ("Android", "Linux"):
+ self.cpp_info.exelinkflags = ["-Wl,-E,--build-id=sha1"]
+ self.cpp_info.sharedlinkflags = ["-Wl,-E,--build-id=sha1"]
if self.settings.os == "Linux":
self.cpp_info.system_libs = ["pthread", "dl"]
elif self.settings.os == "Windows":
- self.cpp_info.system_libs = ["winhttp", "dbghelp", "pathcch"]
+ self.cpp_info.system_libs = ["winhttp", "dbghelp", "pathcch", "shlwapi"]
if not self.options.shared:
self.cpp_info.defines = ["SENTRY_BUILD_STATIC"]
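For context on this hunk: `--build-id=sha1` is a GNU linker option, so it needs the `-Wl,` prefix when passed through the compiler driver and only applies to ELF targets, hence the switch from ("Android", "Windows") to ("Android", "Linux"); `-E` (`--export-dynamic`) additionally keeps symbols in the dynamic symbol table, presumably so that sentry's in-process backtraces remain symbolizable.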
|
{"golden_diff": "diff --git a/recipes/sentry-native/all/conanfile.py b/recipes/sentry-native/all/conanfile.py\n--- a/recipes/sentry-native/all/conanfile.py\n+++ b/recipes/sentry-native/all/conanfile.py\n@@ -1,4 +1,5 @@\n import os\n+import glob\n from conans import ConanFile, CMake, tools\n from conans.errors import ConanInvalidConfiguration\n \n@@ -37,8 +38,8 @@\n \n def requirements(self):\n if self.options.transport == \"curl\":\n- self.requires(\"libcurl/7.68.0\")\n- \n+ self.requires(\"libcurl/7.71.0\")\n+\n if self.options.backend == \"crashpad\":\n raise ConanInvalidConfiguration(\"crashpad not available yet in CCI\")\n if self.options.backend == \"breakpad\":\n@@ -54,7 +55,7 @@\n os.rename(extracted_dir, self._source_subfolder)\n \n def configure(self):\n- if self.options.backend == \"inproc\" and self.settings.os == \"Windows\":\n+ if self.options.backend == \"inproc\" and self.settings.os == \"Windows\" and tools.Version(self.version) < \"0.4\":\n raise ConanInvalidConfiguration(\"The in-process backend is not supported on Windows\")\n \n def _configure_cmake(self):\n@@ -77,16 +78,18 @@\n cmake = self._configure_cmake()\n cmake.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"cmake\"))\n+ for pdb in glob.glob(os.path.join(self.package_folder, \"bin\", \"*.pdb\")):\n+ os.unlink(pdb)\n \n def package_info(self):\n self.cpp_info.libs = [\"sentry\"]\n- if self.settings.os in (\"Android\", \"Windows\"):\n- self.cpp_info.exelinkflags= [\"--build-id=sha1\"]\n- self.cpp_info.sharedlinkflags = [\"--build-id=sha1\"]\n+ if self.settings.os in (\"Android\", \"Linux\"):\n+ self.cpp_info.exelinkflags = [\"-Wl,-E,--build-id=sha1\"]\n+ self.cpp_info.sharedlinkflags = [\"-Wl,-E,--build-id=sha1\"]\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs = [\"pthread\", \"dl\"]\n elif self.settings.os == \"Windows\":\n- self.cpp_info.system_libs = [\"winhttp\", \"dbghelp\", \"pathcch\"]\n+ self.cpp_info.system_libs = [\"winhttp\", \"dbghelp\", \"pathcch\", \"shlwapi\"]\n \n if not self.options.shared:\n self.cpp_info.defines = [\"SENTRY_BUILD_STATIC\"]\n", "issue": "[request] sentry-native/0.4.1\n### Package Details\r\n * Package Name/Version: **sentry-native/0.4.1**\r\n * Changelog: **https://github.com/getsentry/sentry-native/blob/0.4.1/CHANGELOG.md**\r\n\r\nhttps://github.com/getsentry/sentry-native/tree/0.4.1\r\n\r\nThe above mentioned version is newly released by the upstream project and not yet available as a recipe. Please add this version.\r\nAlso, **please add windows support.**\n", "before_files": [{"content": "import os\nfrom conans import ConanFile, CMake, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass SentryNativeConan(ConanFile):\n name = \"sentry-native\"\n description = \"The Sentry Native SDK is an error and crash reporting client for native applications,\\n\" \\\n \"optimized for C and C++. 
Sentry allows to add tags,\\n\" \\\n \"breadcrumbs and arbitrary custom context to enrich error reports.\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/getsentry/sentry-native\"\n license = \"MIT\"\n topics = (\"conan\", \"breakpad\", \"crashpad\",\n \"error-reporting\", \"crash-reporting\")\n exports_sources = [\"CMakeLists.txt\"]\n generators = \"cmake\", \"cmake_find_package\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"backend\": [\"none\", \"inproc\", \"crashpad\", \"breakpad\"],\n \"transport\": [\"none\", \"curl\", \"winhttp\"],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"backend\": \"inproc\",\n \"transport\": \"curl\"\n }\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n _cmake = None\n\n def requirements(self):\n if self.options.transport == \"curl\":\n self.requires(\"libcurl/7.68.0\")\n \n if self.options.backend == \"crashpad\":\n raise ConanInvalidConfiguration(\"crashpad not available yet in CCI\")\n if self.options.backend == \"breakpad\":\n raise ConanInvalidConfiguration(\"breakpad not available yet in CCI\")\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def configure(self):\n if self.options.backend == \"inproc\" and self.settings.os == \"Windows\":\n raise ConanInvalidConfiguration(\"The in-process backend is not supported on Windows\")\n\n def _configure_cmake(self):\n if self._cmake:\n return self._cmake\n self._cmake = CMake(self)\n self._cmake.definitions[\"SENTRY_BACKEND\"] = self.options.backend\n self._cmake.definitions[\"SENTRY_ENABLE_INSTALL\"] = True\n self._cmake.definitions[\"SENTRY_TRANSPORT\"] = self.options.transport\n self._cmake.definitions[\"SENTRY_PIC\"] = self.options.get_safe(\"fPIC\", False)\n self._cmake.configure()\n return self._cmake\n\n def build(self):\n cmake = self._configure_cmake()\n cmake.build()\n\n def package(self):\n self.copy(\"LICENSE\", dst=\"licenses\", src=self._source_subfolder)\n cmake = self._configure_cmake()\n cmake.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"cmake\"))\n\n def package_info(self):\n self.cpp_info.libs = [\"sentry\"]\n if self.settings.os in (\"Android\", \"Windows\"):\n self.cpp_info.exelinkflags= [\"--build-id=sha1\"]\n self.cpp_info.sharedlinkflags = [\"--build-id=sha1\"]\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs = [\"pthread\", \"dl\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"winhttp\", \"dbghelp\", \"pathcch\"]\n\n if not self.options.shared:\n self.cpp_info.defines = [\"SENTRY_BUILD_STATIC\"]\n", "path": "recipes/sentry-native/all/conanfile.py"}], "after_files": [{"content": "import os\nimport glob\nfrom conans import ConanFile, CMake, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass SentryNativeConan(ConanFile):\n name = \"sentry-native\"\n description = \"The Sentry Native SDK is an error and crash reporting client for native applications,\\n\" \\\n \"optimized for C and C++. 
Sentry allows to add tags,\\n\" \\\n \"breadcrumbs and arbitrary custom context to enrich error reports.\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/getsentry/sentry-native\"\n license = \"MIT\"\n topics = (\"conan\", \"breakpad\", \"crashpad\",\n \"error-reporting\", \"crash-reporting\")\n exports_sources = [\"CMakeLists.txt\"]\n generators = \"cmake\", \"cmake_find_package\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"backend\": [\"none\", \"inproc\", \"crashpad\", \"breakpad\"],\n \"transport\": [\"none\", \"curl\", \"winhttp\"],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"backend\": \"inproc\",\n \"transport\": \"curl\"\n }\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n _cmake = None\n\n def requirements(self):\n if self.options.transport == \"curl\":\n self.requires(\"libcurl/7.71.0\")\n\n if self.options.backend == \"crashpad\":\n raise ConanInvalidConfiguration(\"crashpad not available yet in CCI\")\n if self.options.backend == \"breakpad\":\n raise ConanInvalidConfiguration(\"breakpad not available yet in CCI\")\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def configure(self):\n if self.options.backend == \"inproc\" and self.settings.os == \"Windows\" and tools.Version(self.version) < \"0.4\":\n raise ConanInvalidConfiguration(\"The in-process backend is not supported on Windows\")\n\n def _configure_cmake(self):\n if self._cmake:\n return self._cmake\n self._cmake = CMake(self)\n self._cmake.definitions[\"SENTRY_BACKEND\"] = self.options.backend\n self._cmake.definitions[\"SENTRY_ENABLE_INSTALL\"] = True\n self._cmake.definitions[\"SENTRY_TRANSPORT\"] = self.options.transport\n self._cmake.definitions[\"SENTRY_PIC\"] = self.options.get_safe(\"fPIC\", False)\n self._cmake.configure()\n return self._cmake\n\n def build(self):\n cmake = self._configure_cmake()\n cmake.build()\n\n def package(self):\n self.copy(\"LICENSE\", dst=\"licenses\", src=self._source_subfolder)\n cmake = self._configure_cmake()\n cmake.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"cmake\"))\n for pdb in glob.glob(os.path.join(self.package_folder, \"bin\", \"*.pdb\")):\n os.unlink(pdb)\n\n def package_info(self):\n self.cpp_info.libs = [\"sentry\"]\n if self.settings.os in (\"Android\", \"Linux\"):\n self.cpp_info.exelinkflags = [\"-Wl,-E,--build-id=sha1\"]\n self.cpp_info.sharedlinkflags = [\"-Wl,-E,--build-id=sha1\"]\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs = [\"pthread\", \"dl\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"winhttp\", \"dbghelp\", \"pathcch\", \"shlwapi\"]\n\n if not self.options.shared:\n self.cpp_info.defines = [\"SENTRY_BUILD_STATIC\"]\n", "path": "recipes/sentry-native/all/conanfile.py"}]}
| 1,398 | 595 |
gh_patches_debug_4349
|
rasdani/github-patches
|
git_diff
|
dbt-labs__dbt-core-4890
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[CT-381] [HOTFIX] Homebrew Incident Resolution
### What Happened
dbt-core depends on the dbt-extractor package, and the dbt-extractor package depends on tree-sitter-jinja2. dbt-extractor specifies tree-sitter-jinja2 via a github link using the git protocol. Github security rules changed to require this link to use https which caused cargo to fail to build the dbt-extractor.
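(For illustration only; the exact manifest line is an assumption, not quoted from dbt-extractor: a Cargo git dependency written as `tree-sitter-jinja2 = { git = "git://github.com/..." }` stops resolving once GitHub disables the unauthenticated git protocol, and the fix is to point the same entry at an `https://github.com/...` URL.)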
### Who Is Affected
Everyone attempting to build dbt-core from source after the github security rules took effect. This primarily affects homebrew users since homebrew builds dbt from source locally.
### Solution:
- release new dbt-extractor (0.4.1). The fix is already in main
- dbt-labs/dbt-extractor#51
- release new dbt-core patch from branch [1.0.4-hotfix](https://github.com/dbt-labs/dbt-core/tree/1.0.4-hotfix) which depends on this new version and accepts all future patch releases so we can skip this step in the future. This branch is only the 3 necessary commits ahead of 1.0.3 to fix this incident.
- main: #4890
- backport is directly on branch [1.0.4-hotfix](https://github.com/dbt-labs/dbt-core/tree/1.0.4-hotfix) because of complications with running the bump-version workflow for a hotfix branch.
Getting the release out has been delayed by complications with github actions during an [ongoing GitHub incident](https://www.githubstatus.com/incidents/dcnvr6zym66r).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/setup.py`
Content:
```
1 #!/usr/bin/env python
2 import os
3 import sys
4
5 if sys.version_info < (3, 7, 2):
6 print("Error: dbt does not support this version of Python.")
7 print("Please upgrade to Python 3.7.2 or higher.")
8 sys.exit(1)
9
10
11 from setuptools import setup
12
13 try:
14 from setuptools import find_namespace_packages
15 except ImportError:
16 # the user has a downlevel version of setuptools.
17 print("Error: dbt requires setuptools v40.1.0 or higher.")
18 print('Please upgrade setuptools with "pip install --upgrade setuptools" ' "and try again")
19 sys.exit(1)
20
21
22 this_directory = os.path.abspath(os.path.dirname(__file__))
23 with open(os.path.join(this_directory, "README.md")) as f:
24 long_description = f.read()
25
26
27 package_name = "dbt-core"
28 package_version = "1.0.1"
29 description = """With dbt, data analysts and engineers can build analytics \
30 the way engineers build applications."""
31
32
33 setup(
34 name=package_name,
35 version=package_version,
36 description=description,
37 long_description=long_description,
38 long_description_content_type="text/markdown",
39 author="dbt Labs",
40 author_email="[email protected]",
41 url="https://github.com/dbt-labs/dbt-core",
42 packages=find_namespace_packages(include=["dbt", "dbt.*"]),
43 include_package_data=True,
44 test_suite="test",
45 entry_points={
46 "console_scripts": [
47 "dbt = dbt.main:main",
48 ],
49 },
50 scripts=[
51 "scripts/dbt",
52 ],
53 install_requires=[
54 "Jinja2==2.11.3",
55 "MarkupSafe==2.0.1",
56 "agate>=1.6,<1.6.4",
57 "click>=7.0,<9",
58 "colorama>=0.3.9,<0.4.5",
59 "hologram==0.0.14",
60 "isodate>=0.6,<0.7",
61 "logbook>=1.5,<1.6",
62 "mashumaro==2.9",
63 "minimal-snowplow-tracker==0.0.2",
64 "networkx>=2.3,<3",
65 "packaging>=20.9,<22.0",
66 "sqlparse>=0.2.3,<0.5",
67 "dbt-extractor==0.4.0",
68 "typing-extensions>=3.7.4,<4.2",
69 "werkzeug>=1,<3",
70 # the following are all to match snowflake-connector-python
71 "requests<3.0.0",
72 "idna>=2.5,<4",
73 "cffi>=1.9,<2.0.0",
74 ],
75 zip_safe=False,
76 classifiers=[
77 "Development Status :: 5 - Production/Stable",
78 "License :: OSI Approved :: Apache Software License",
79 "Operating System :: Microsoft :: Windows",
80 "Operating System :: MacOS :: MacOS X",
81 "Operating System :: POSIX :: Linux",
82 "Programming Language :: Python :: 3.7",
83 "Programming Language :: Python :: 3.8",
84 "Programming Language :: Python :: 3.9",
85 ],
86 python_requires=">=3.7.2",
87 )
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/core/setup.py b/core/setup.py
--- a/core/setup.py
+++ b/core/setup.py
@@ -64,7 +64,7 @@
"networkx>=2.3,<3",
"packaging>=20.9,<22.0",
"sqlparse>=0.2.3,<0.5",
- "dbt-extractor==0.4.0",
+ "dbt-extractor~=0.4.1",
"typing-extensions>=3.7.4,<4.2",
"werkzeug>=1,<3",
# the following are all to match snowflake-connector-python
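Note that `dbt-extractor~=0.4.1` uses PEP 440's compatible-release operator, equivalent to `>=0.4.1, <0.5`, so future dbt-extractor patch releases are accepted without another change to this pin, matching the issue's goal of skipping that step next time.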
|
{"golden_diff": "diff --git a/core/setup.py b/core/setup.py\n--- a/core/setup.py\n+++ b/core/setup.py\n@@ -64,7 +64,7 @@\n \"networkx>=2.3,<3\",\n \"packaging>=20.9,<22.0\",\n \"sqlparse>=0.2.3,<0.5\",\n- \"dbt-extractor==0.4.0\",\n+ \"dbt-extractor~=0.4.1\",\n \"typing-extensions>=3.7.4,<4.2\",\n \"werkzeug>=1,<3\",\n # the following are all to match snowflake-connector-python\n", "issue": "[CT-381] [HOTFIX] Homebrew Incident Resolution\n### What Happened\r\n\r\ndbt-core depends on the dbt-extractor package, and the dbt-extractor package depends on tree-sitter-jinja2. dbt-extractor specifies tree-sitter-jinja2 via a github link using the git protocol. Github security rules changed to require this link to use https which caused cargo to fail to build the dbt-extractor.\r\n\r\n### Who Is Affected\r\n\r\nEveryone attempting to build dbt-core from source after the github security rules took affect. This primarily affects homebrew users since homebrew builds dbt from source locally.\r\n\r\n### Solution:\r\n- release new dbt-extractor (0.4.1). The fix is already in main\r\n - dbt-labs/dbt-extractor#51\r\n- release new dbt-core patch from branch [1.0.4-hotfix](https://github.com/dbt-labs/dbt-core/tree/1.0.4-hotfix) which depends on this new version and accepts all future patch releases so we can skip this step in the future. This branch is only the 3 necessary commits ahead of 1.0.3 to fix this incident.\r\n - main: #4890\r\n - backport is directly on branch [1.0.4-hotfix](https://github.com/dbt-labs/dbt-core/tree/1.0.4-hotfix) because of complications with running the bump-version workflow for a hotfix branch.\r\n \r\nGetting the release out has been delayed due to complications with github actions due to an [ongoing GitHub incident](https://www.githubstatus.com/incidents/dcnvr6zym66r).\r\n \n", "before_files": [{"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 7, 2):\n print(\"Error: dbt does not support this version of Python.\")\n print(\"Please upgrade to Python 3.7.2 or higher.\")\n sys.exit(1)\n\n\nfrom setuptools import setup\n\ntry:\n from setuptools import find_namespace_packages\nexcept ImportError:\n # the user has a downlevel version of setuptools.\n print(\"Error: dbt requires setuptools v40.1.0 or higher.\")\n print('Please upgrade setuptools with \"pip install --upgrade setuptools\" ' \"and try again\")\n sys.exit(1)\n\n\nthis_directory = os.path.abspath(os.path.dirname(__file__))\nwith open(os.path.join(this_directory, \"README.md\")) as f:\n long_description = f.read()\n\n\npackage_name = \"dbt-core\"\npackage_version = \"1.0.1\"\ndescription = \"\"\"With dbt, data analysts and engineers can build analytics \\\nthe way engineers build applications.\"\"\"\n\n\nsetup(\n name=package_name,\n version=package_version,\n description=description,\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n author=\"dbt Labs\",\n author_email=\"[email protected]\",\n url=\"https://github.com/dbt-labs/dbt-core\",\n packages=find_namespace_packages(include=[\"dbt\", \"dbt.*\"]),\n include_package_data=True,\n test_suite=\"test\",\n entry_points={\n \"console_scripts\": [\n \"dbt = dbt.main:main\",\n ],\n },\n scripts=[\n \"scripts/dbt\",\n ],\n install_requires=[\n \"Jinja2==2.11.3\",\n \"MarkupSafe==2.0.1\",\n \"agate>=1.6,<1.6.4\",\n \"click>=7.0,<9\",\n \"colorama>=0.3.9,<0.4.5\",\n \"hologram==0.0.14\",\n \"isodate>=0.6,<0.7\",\n \"logbook>=1.5,<1.6\",\n \"mashumaro==2.9\",\n \"minimal-snowplow-tracker==0.0.2\",\n 
\"networkx>=2.3,<3\",\n \"packaging>=20.9,<22.0\",\n \"sqlparse>=0.2.3,<0.5\",\n \"dbt-extractor==0.4.0\",\n \"typing-extensions>=3.7.4,<4.2\",\n \"werkzeug>=1,<3\",\n # the following are all to match snowflake-connector-python\n \"requests<3.0.0\",\n \"idna>=2.5,<4\",\n \"cffi>=1.9,<2.0.0\",\n ],\n zip_safe=False,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX :: Linux\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n python_requires=\">=3.7.2\",\n)\n", "path": "core/setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 7, 2):\n print(\"Error: dbt does not support this version of Python.\")\n print(\"Please upgrade to Python 3.7.2 or higher.\")\n sys.exit(1)\n\n\nfrom setuptools import setup\n\ntry:\n from setuptools import find_namespace_packages\nexcept ImportError:\n # the user has a downlevel version of setuptools.\n print(\"Error: dbt requires setuptools v40.1.0 or higher.\")\n print('Please upgrade setuptools with \"pip install --upgrade setuptools\" ' \"and try again\")\n sys.exit(1)\n\n\nthis_directory = os.path.abspath(os.path.dirname(__file__))\nwith open(os.path.join(this_directory, \"README.md\")) as f:\n long_description = f.read()\n\n\npackage_name = \"dbt-core\"\npackage_version = \"1.0.1\"\ndescription = \"\"\"With dbt, data analysts and engineers can build analytics \\\nthe way engineers build applications.\"\"\"\n\n\nsetup(\n name=package_name,\n version=package_version,\n description=description,\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n author=\"dbt Labs\",\n author_email=\"[email protected]\",\n url=\"https://github.com/dbt-labs/dbt-core\",\n packages=find_namespace_packages(include=[\"dbt\", \"dbt.*\"]),\n include_package_data=True,\n test_suite=\"test\",\n entry_points={\n \"console_scripts\": [\n \"dbt = dbt.main:main\",\n ],\n },\n scripts=[\n \"scripts/dbt\",\n ],\n install_requires=[\n \"Jinja2==2.11.3\",\n \"MarkupSafe==2.0.1\",\n \"agate>=1.6,<1.6.4\",\n \"click>=7.0,<9\",\n \"colorama>=0.3.9,<0.4.5\",\n \"hologram==0.0.14\",\n \"isodate>=0.6,<0.7\",\n \"logbook>=1.5,<1.6\",\n \"mashumaro==2.9\",\n \"minimal-snowplow-tracker==0.0.2\",\n \"networkx>=2.3,<3\",\n \"packaging>=20.9,<22.0\",\n \"sqlparse>=0.2.3,<0.5\",\n \"dbt-extractor~=0.4.1\",\n \"typing-extensions>=3.7.4,<4.2\",\n \"werkzeug>=1,<3\",\n # the following are all to match snowflake-connector-python\n \"requests<3.0.0\",\n \"idna>=2.5,<4\",\n \"cffi>=1.9,<2.0.0\",\n ],\n zip_safe=False,\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: Microsoft :: Windows\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX :: Linux\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n python_requires=\">=3.7.2\",\n)\n", "path": "core/setup.py"}]}
| 1,511 | 144 |
gh_patches_debug_36232
|
rasdani/github-patches
|
git_diff
|
pymeasure__pymeasure-350
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error in examples/Notebook Experiments/script2.ipynb
script.ipynb runs fine but in script2.ipynb I hit the following error at `experiment = Experiment('test', procedure, analyse)`:
```python
C:\ProgramData\Anaconda3\lib\site-packages\matplotlib\__init__.py in __setitem__(self, key, val)
927 raise KeyError(
928 '%s is not a valid rc parameter. See rcParams.keys() for a '
--> 929 'list of valid parameters.' % (key,))
930
931 def __getitem__(self, key):
KeyError: 'axes.color_cycle is not a valid rc parameter. See rcParams.keys() for a list of valid parameters.'
```
Error in examples/Notebook Experiments/script2.ipynb
script.ipynb runs fine but in script2.ipynb I hit the following error at `experiment = Experiment('test', procedure, analyse)`:
```python
C:\ProgramData\Anaconda3\lib\site-packages\matplotlib\__init__.py in __setitem__(self, key, val)
927 raise KeyError(
928 '%s is not a valid rc parameter. See rcParams.keys() for a '
--> 929 'list of valid parameters.' % (key,))
930
931 def __getitem__(self, key):
KeyError: 'axes.color_cycle is not a valid rc parameter. See rcParams.keys() for a list of valid parameters.'
```
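In case the traceback is unfamiliar: `axes.color_cycle` was removed from Matplotlib's rcParams in favour of `axes.prop_cycle`, whose values are built with the `cycler` package; that is also why the `set_mpl_rcparams()` helper shown below needs `cycler` importable when it `eval()`s config values. A minimal sketch of the modern equivalent (the colour list is illustrative, not taken from the PyMeasure examples):

```python
import matplotlib
from cycler import cycler  # must be in scope if rcParams values are eval()'d from a config file

# Old, removed form:  matplotlib.rcParams['axes.color_cycle'] = ['r', 'g', 'b']
# Current form:
matplotlib.rcParams['axes.prop_cycle'] = cycler(color=['r', 'g', 'b'])
```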
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pymeasure/experiment/config.py`
Content:
```
1 #
2 # This file is part of the PyMeasure package.
3 #
4 # Copyright (c) 2013-2020 PyMeasure Developers
5 #
6 # Permission is hereby granted, free of charge, to any person obtaining a copy
7 # of this software and associated documentation files (the "Software"), to deal
8 # in the Software without restriction, including without limitation the rights
9 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
10 # copies of the Software, and to permit persons to whom the Software is
11 # furnished to do so, subject to the following conditions:
12 #
13 # The above copyright notice and this permission notice shall be included in
14 # all copies or substantial portions of the Software.
15 #
16 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
17 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
18 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
19 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
20 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
21 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
22 # THE SOFTWARE.
23 #
24
25 import configparser
26 import logging
27 import os
28
29 log = logging.getLogger(__name__)
30 log.addHandler(logging.NullHandler())
31
32
33 def set_file(filename):
34 os.environ['CONFIG'] = filename
35
36
37 def get_config(filename='default_config.ini'):
38 if 'CONFIG' in os.environ.keys():
39 filename = os.environ['CONFIG']
40 config = configparser.ConfigParser()
41 config.read(filename)
42 return config
43
44
45 # noinspection PyProtectedMember
46 def set_mpl_rcparams(config):
47 if 'matplotlib.rcParams' in config._sections.keys():
48 import matplotlib
49 for key in config._sections['matplotlib.rcParams']:
50 matplotlib.rcParams[key] = eval(config._sections['matplotlib.rcParams'][key])
51
```
Path: `examples/Notebook Experiments/procedures.py`
Content:
```
1 #
2 # This file is part of the PyMeasure package.
3 #
4 # Copyright (c) 2013-2016 PyMeasure Developers
5 #
6 # Permission is hereby granted, free of charge, to any person obtaining a copy
7 # of this software and associated documentation files (the "Software"), to deal
8 # in the Software without restriction, including without limitation the rights
9 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
10 # copies of the Software, and to permit persons to whom the Software is
11 # furnished to do so, subject to the following conditions:
12 #
13 # The above copyright notice and this permission notice shall be included in
14 # all copies or substantial portions of the Software.
15 #
16 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
17 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
18 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
19 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
20 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
21 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
22 # THE SOFTWARE.
23 #
24
25 import random
26 from time import sleep
27 from pymeasure.experiment import Procedure, IntegerParameter, Parameter, FloatParameter
28 import logging
29 log = logging.getLogger(__name__)
30 log.addHandler(logging.NullHandler())
31
32 class TestProcedure(Procedure):
33
34 iterations = IntegerParameter('Loop Iterations', default=100)
35 delay = FloatParameter('Delay Time', units='s', default=0.2)
36 seed = Parameter('Random Seed', default='12345')
37
38 DATA_COLUMNS = ['Iteration', 'Random Number']
39
40 def startup(self):
41 log.info("Setting up random number generator")
42 random.seed(self.seed)
43
44 def execute(self):
45 log.info("Starting to generate numbers")
46 for i in range(self.iterations):
47 data = {
48 'Iteration': i,
49 'Random Number': random.random()
50 }
51 log.debug("Produced numbers: %s" % data)
52 self.emit('results', data)
53 self.emit('progress', 100.*i/self.iterations)
54 sleep(self.delay)
55 if self.should_stop():
56 log.warning("Catch stop command in procedure")
57 break
58
59 def shutdown(self):
60 log.info("Finished")
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/Notebook Experiments/procedures.py b/examples/Notebook Experiments/procedures.py
deleted file mode 100644
--- a/examples/Notebook Experiments/procedures.py
+++ /dev/null
@@ -1,60 +0,0 @@
-#
-# This file is part of the PyMeasure package.
-#
-# Copyright (c) 2013-2016 PyMeasure Developers
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in
-# all copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-# THE SOFTWARE.
-#
-
-import random
-from time import sleep
-from pymeasure.experiment import Procedure, IntegerParameter, Parameter, FloatParameter
-import logging
-log = logging.getLogger(__name__)
-log.addHandler(logging.NullHandler())
-
-class TestProcedure(Procedure):
-
- iterations = IntegerParameter('Loop Iterations', default=100)
- delay = FloatParameter('Delay Time', units='s', default=0.2)
- seed = Parameter('Random Seed', default='12345')
-
- DATA_COLUMNS = ['Iteration', 'Random Number']
-
- def startup(self):
- log.info("Setting up random number generator")
- random.seed(self.seed)
-
- def execute(self):
- log.info("Starting to generate numbers")
- for i in range(self.iterations):
- data = {
- 'Iteration': i,
- 'Random Number': random.random()
- }
- log.debug("Produced numbers: %s" % data)
- self.emit('results', data)
- self.emit('progress', 100.*i/self.iterations)
- sleep(self.delay)
- if self.should_stop():
- log.warning("Catch stop command in procedure")
- break
-
- def shutdown(self):
- log.info("Finished")
\ No newline at end of file
diff --git a/pymeasure/experiment/config.py b/pymeasure/experiment/config.py
--- a/pymeasure/experiment/config.py
+++ b/pymeasure/experiment/config.py
@@ -46,5 +46,6 @@
def set_mpl_rcparams(config):
if 'matplotlib.rcParams' in config._sections.keys():
import matplotlib
+ from cycler import cycler
for key in config._sections['matplotlib.rcParams']:
matplotlib.rcParams[key] = eval(config._sections['matplotlib.rcParams'][key])
|
{"golden_diff": "diff --git a/examples/Notebook Experiments/procedures.py b/examples/Notebook Experiments/procedures.py\ndeleted file mode 100644\n--- a/examples/Notebook Experiments/procedures.py\t\n+++ /dev/null\n@@ -1,60 +0,0 @@\n-#\n-# This file is part of the PyMeasure package.\n-#\n-# Copyright (c) 2013-2016 PyMeasure Developers\n-#\n-# Permission is hereby granted, free of charge, to any person obtaining a copy\n-# of this software and associated documentation files (the \"Software\"), to deal\n-# in the Software without restriction, including without limitation the rights\n-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n-# copies of the Software, and to permit persons to whom the Software is\n-# furnished to do so, subject to the following conditions:\n-#\n-# The above copyright notice and this permission notice shall be included in\n-# all copies or substantial portions of the Software.\n-#\n-# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n-# THE SOFTWARE.\n-#\n-\n-import random\n-from time import sleep\n-from pymeasure.experiment import Procedure, IntegerParameter, Parameter, FloatParameter\n-import logging\n-log = logging.getLogger(__name__)\n-log.addHandler(logging.NullHandler())\n-\n-class TestProcedure(Procedure):\n-\n- iterations = IntegerParameter('Loop Iterations', default=100)\n- delay = FloatParameter('Delay Time', units='s', default=0.2)\n- seed = Parameter('Random Seed', default='12345')\n- \n- DATA_COLUMNS = ['Iteration', 'Random Number']\n-\n- def startup(self):\n- log.info(\"Setting up random number generator\")\n- random.seed(self.seed)\n-\n- def execute(self):\n- log.info(\"Starting to generate numbers\")\n- for i in range(self.iterations):\n- data = {\n- 'Iteration': i,\n- 'Random Number': random.random()\n- }\n- log.debug(\"Produced numbers: %s\" % data)\n- self.emit('results', data)\n- self.emit('progress', 100.*i/self.iterations)\n- sleep(self.delay)\n- if self.should_stop():\n- log.warning(\"Catch stop command in procedure\")\n- break\n-\n- def shutdown(self):\n- log.info(\"Finished\")\n\\ No newline at end of file\ndiff --git a/pymeasure/experiment/config.py b/pymeasure/experiment/config.py\n--- a/pymeasure/experiment/config.py\n+++ b/pymeasure/experiment/config.py\n@@ -46,5 +46,6 @@\n def set_mpl_rcparams(config):\n if 'matplotlib.rcParams' in config._sections.keys():\n import matplotlib\n+ from cycler import cycler\n for key in config._sections['matplotlib.rcParams']:\n matplotlib.rcParams[key] = eval(config._sections['matplotlib.rcParams'][key])\n", "issue": "Error in examples/Notebook Experiments/script2.ipynb\nscript.ipynb runs fine but in script2.ipynb I hit the following error at `experiment = Experiment('test', procedure, analyse)`:\r\n\r\n```python\r\n\r\nC:\\ProgramData\\Anaconda3\\lib\\site-packages\\matplotlib\\__init__.py in __setitem__(self, key, val)\r\n 927 raise KeyError(\r\n 928 '%s is not a valid rc parameter. See rcParams.keys() for a '\r\n--> 929 'list of valid parameters.' % (key,))\r\n 930 \r\n 931 def __getitem__(self, key):\r\n\r\nKeyError: 'axes.color_cycle is not a valid rc parameter. 
See rcParams.keys() for a list of valid parameters.'\r\n```\r\n\nError in examples/Notebook Experiments/script2.ipynb\nscript.ipynb runs fine but in script2.ipynb I hit the following error at `experiment = Experiment('test', procedure, analyse)`:\r\n\r\n```python\r\n\r\nC:\\ProgramData\\Anaconda3\\lib\\site-packages\\matplotlib\\__init__.py in __setitem__(self, key, val)\r\n 927 raise KeyError(\r\n 928 '%s is not a valid rc parameter. See rcParams.keys() for a '\r\n--> 929 'list of valid parameters.' % (key,))\r\n 930 \r\n 931 def __getitem__(self, key):\r\n\r\nKeyError: 'axes.color_cycle is not a valid rc parameter. See rcParams.keys() for a list of valid parameters.'\r\n```\r\n\n", "before_files": [{"content": "#\n# This file is part of the PyMeasure package.\n#\n# Copyright (c) 2013-2020 PyMeasure Developers\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n# THE SOFTWARE.\n#\n\nimport configparser\nimport logging\nimport os\n\nlog = logging.getLogger(__name__)\nlog.addHandler(logging.NullHandler())\n\n\ndef set_file(filename):\n os.environ['CONFIG'] = filename\n\n\ndef get_config(filename='default_config.ini'):\n if 'CONFIG' in os.environ.keys():\n filename = os.environ['CONFIG']\n config = configparser.ConfigParser()\n config.read(filename)\n return config\n\n\n# noinspection PyProtectedMember\ndef set_mpl_rcparams(config):\n if 'matplotlib.rcParams' in config._sections.keys():\n import matplotlib\n for key in config._sections['matplotlib.rcParams']:\n matplotlib.rcParams[key] = eval(config._sections['matplotlib.rcParams'][key])\n", "path": "pymeasure/experiment/config.py"}, {"content": "#\n# This file is part of the PyMeasure package.\n#\n# Copyright (c) 2013-2016 PyMeasure Developers\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n# THE SOFTWARE.\n#\n\nimport random\nfrom time import sleep\nfrom pymeasure.experiment import Procedure, IntegerParameter, Parameter, FloatParameter\nimport logging\nlog = logging.getLogger(__name__)\nlog.addHandler(logging.NullHandler())\n\nclass TestProcedure(Procedure):\n\n iterations = IntegerParameter('Loop Iterations', default=100)\n delay = FloatParameter('Delay Time', units='s', default=0.2)\n seed = Parameter('Random Seed', default='12345')\n \n DATA_COLUMNS = ['Iteration', 'Random Number']\n\n def startup(self):\n log.info(\"Setting up random number generator\")\n random.seed(self.seed)\n\n def execute(self):\n log.info(\"Starting to generate numbers\")\n for i in range(self.iterations):\n data = {\n 'Iteration': i,\n 'Random Number': random.random()\n }\n log.debug(\"Produced numbers: %s\" % data)\n self.emit('results', data)\n self.emit('progress', 100.*i/self.iterations)\n sleep(self.delay)\n if self.should_stop():\n log.warning(\"Catch stop command in procedure\")\n break\n\n def shutdown(self):\n log.info(\"Finished\")", "path": "examples/Notebook Experiments/procedures.py"}], "after_files": [{"content": "#\n# This file is part of the PyMeasure package.\n#\n# Copyright (c) 2013-2020 PyMeasure Developers\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n# THE SOFTWARE.\n#\n\nimport configparser\nimport logging\nimport os\n\nlog = logging.getLogger(__name__)\nlog.addHandler(logging.NullHandler())\n\n\ndef set_file(filename):\n os.environ['CONFIG'] = filename\n\n\ndef get_config(filename='default_config.ini'):\n if 'CONFIG' in os.environ.keys():\n filename = os.environ['CONFIG']\n config = configparser.ConfigParser()\n config.read(filename)\n return config\n\n\n# noinspection PyProtectedMember\ndef set_mpl_rcparams(config):\n if 'matplotlib.rcParams' in config._sections.keys():\n import matplotlib\n from cycler import cycler\n for key in config._sections['matplotlib.rcParams']:\n matplotlib.rcParams[key] = eval(config._sections['matplotlib.rcParams'][key])\n", "path": "pymeasure/experiment/config.py"}, {"content": null, "path": "examples/Notebook Experiments/procedures.py"}]}
| 1,743 | 736 |
gh_patches_debug_57667
|
rasdani/github-patches
|
git_diff
|
evennia__evennia-3042
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] "evennia xyzgrid help" causes TypeError: NoneType takes no arguments
#### Describe the bug
Fresh migration from git master to main and then installing xyzgrid prevents evennia xyzgrid commands from working. For example, "evennia xyzgrid help" causes TypeError: NoneType takes no arguments
#### To Reproduce
1. Migrated from git master branch to main branch for 1.x release of Evennia.
2. Installed [extra](used to be in requirements_extra). 
At this point, I can run the server and log in.
3. Added the xyzgrid command set and restarted.
'path', 'goto', 'map' are seen in the command list. The Limbo room does not have a map. Everything appears to work fine.
4. Modify the server/conf/settings.py.
xyzgrid is now available.
When I use xyzgrid, such as 'evennia xyzgrid help', or any other xyzgrid command:
TypeError: NoneType takes no arguments
#### Expected behavior
'evennia xyzgrid <command>' should call the xyzgrid command.
#### Environment, Evennia version, OS etc
Evennia 1.0.1 (rev 38011cc48d)
OS: nt
Python: 3.11.1
Twisted: 22.10.0
Django: 4.1.4
#### Additional context
This is based off helix4's message in #general on discord. I added my current steps that seem to reproduce the same issue down below. Here is the original message from helix4, with steps for reproducing on the older version of the code.
I am trying to test XYZGrid on a brand new install but failing. 1. cloned the single branch of evennia-develop, and initiated an evennia game. 2. installed requirements_extra, and migrated. I can run the server and log in. 3. i added the command set and reloadead, i see path, goto, map ingame. the Limbo room does not have a map. seems to work well. 4. modify the server/conf/settings.py, xyzgrid is now available.
When I use xyzgrid, such as evennia xyzgrid help, or any other xyzgrid command:
from evennia.utils.eveditor import EvEditor
File "/home/ubuntu/3ProjectMUD/evennia/evennia/utils/eveditor.py", line 201, in <module>
class SaveYesNoCmdSet(CmdSet):
TypeError: NoneType takes no arguments
Original message
https://discord.com/channels/246323978879107073/246323978879107073/937578545704730624
Griatch's response
https://discord.com/channels/246323978879107073/246323978879107073/937610453184561183
Steps:
1. Migrated from git master branch to main branch for 1.x release of Evennia.
2. Installed [extra](used to be in requirements_extra). 
At this point, I can run the server and log in.
3. Added the xyzgrid command set and restarted.
'path', 'goto', 'map' are seen in the command list. The Limbo room does not have a map. Everything appears to work fine.
4. Modify the server/conf/settings.py.
xyzgrid is now available.
When I use xyzgrid, such as 'evennia xyzgrid help', or any other xyzgrid command:
Traceback (most recent call last):
File "C:\muddev\evenv\Scripts\evennia_launcher.py", line 18, in <module>
main()
File "C:\muddev\evennia\evennia\server\evennia_launcher.py", line 2422, in main
if run_custom_commands(option, *unknown_args):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\muddev\evennia\evennia\server\evennia_launcher.py", line 2023, in run_custom_commands
mod = importlib.import_module(modpath)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1128, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "C:\muddev\evennia\evennia\contrib\grid\xyzgrid\__init__.py", line 6, in <module>
from . import commands # noqa
^^^^^^^^^^^^^^^^^^^^^^
File "C:\muddev\evennia\evennia\contrib\grid\xyzgrid\commands.py", line 15, in <module>
from evennia.commands.default import building
File "C:\muddev\evennia\evennia\commands\default\building.py", line 14, in <module>
from evennia.prototypes import menus as olc_menus
File "C:\muddev\evennia\evennia\prototypes\menus.py", line 20, in <module>
from evennia.utils.evmenu import EvMenu, list_node
File "C:\muddev\evennia\evennia\utils\evmenu.py", line 350, in <module>
class CmdEvMenuNode(Command):
TypeError: NoneType takes no arguments
--- END ISSUE ---
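For context, the traceback above fails at import time: the launcher imports `evennia.contrib.grid.xyzgrid`, whose `__init__` immediately pulls in `commands` and, through it, Django-backed command classes, before the base `Command` class has been populated, so the class definition sees `None`. The patch further down in this entry simply reorders the package imports so that `commands` is imported last. A more general defensive pattern is to defer such an import until it is actually needed; the sketch below is a hypothetical illustration of that pattern, not Evennia's actual code.

```
# Hypothetical sketch (not the actual patch): defer importing a submodule that
# defines command classes, so importing the package from the launcher does not
# require a fully initialized server.
import importlib


def load_xyzgrid_commands():
    """Import the command-set module only when it is first needed."""
    return importlib.import_module("evennia.contrib.grid.xyzgrid.commands")
```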
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `evennia/contrib/grid/xyzgrid/__init__.py`
Content:
```
1 """
2 XYZGrid - Griatch 2021
3
4 """
5
6 from . import commands # noqa
7 from . import example # noqa
8 from . import launchcmd # noqa
9 from . import prototypes # noqa
10 from . import tests # noqa
11 from . import utils # noqa
12 from . import xymap # noqa
13 from . import xymap_legend # noqa
14 from . import xyzgrid # noqa
15 from . import xyzroom # noqa
16
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/evennia/contrib/grid/xyzgrid/__init__.py b/evennia/contrib/grid/xyzgrid/__init__.py
--- a/evennia/contrib/grid/xyzgrid/__init__.py
+++ b/evennia/contrib/grid/xyzgrid/__init__.py
@@ -2,14 +2,15 @@
XYZGrid - Griatch 2021
"""
-
-from . import commands # noqa
-from . import example # noqa
-from . import launchcmd # noqa
-from . import prototypes # noqa
-from . import tests # noqa
-from . import utils # noqa
-from . import xymap # noqa
-from . import xymap_legend # noqa
-from . import xyzgrid # noqa
-from . import xyzroom # noqa
+from . import (
+ example,
+ launchcmd,
+ prototypes,
+ tests,
+ utils,
+ xymap,
+ xymap_legend,
+ xyzgrid,
+ xyzroom,
+ commands,
+)
|
{"golden_diff": "diff --git a/evennia/contrib/grid/xyzgrid/__init__.py b/evennia/contrib/grid/xyzgrid/__init__.py\n--- a/evennia/contrib/grid/xyzgrid/__init__.py\n+++ b/evennia/contrib/grid/xyzgrid/__init__.py\n@@ -2,14 +2,15 @@\n XYZGrid - Griatch 2021\n \n \"\"\"\n-\n-from . import commands # noqa\n-from . import example # noqa\n-from . import launchcmd # noqa\n-from . import prototypes # noqa\n-from . import tests # noqa\n-from . import utils # noqa\n-from . import xymap # noqa\n-from . import xymap_legend # noqa\n-from . import xyzgrid # noqa\n-from . import xyzroom # noqa\n+from . import (\n+ example,\n+ launchcmd,\n+ prototypes,\n+ tests,\n+ utils,\n+ xymap,\n+ xymap_legend,\n+ xyzgrid,\n+ xyzroom,\n+ commands,\n+)\n", "issue": "[BUG] \"evennia xyzgrid help\" causes TypeError: NoneType takes no arguments\n#### Describe the bug\r\nFresh migration from git master to main and then installing xyzgrid prevents evennia xyzgrid commands from working. For example, \"evennia xyzgrid help\" causes TypeError: NoneType takes no arguments\r\n\r\n#### To Reproduce\r\n1. Migrated from git master branch to main branch for 1.x release of Evennia.\r\n2. Installed [extra](use to be in requirements_extra). \r\n\r\nAt this point, I can run the server and log in.\r\n\r\n3. Added the xyzgrid command set and restarted. \r\n\r\n'path', 'goto', 'map' are seen in the command list. The Limbo room does not have a map. Everything appears to work fine.\r\n\r\n4. Modify the server/conf/settings.py.\r\n\r\nxyzgrid is now available.\r\n\r\nWhen I use xyzgrid, such as 'evennia xyzgrid help', or any other xyzgrid command:\r\nTypeError: NoneType takes no arguments\r\n\r\n#### Expected behavior\r\n'evennia xyzgrid <command>' should call the xyzgrid command.\r\n\r\n#### Environment, Evennia version, OS etc\r\n\r\n Evennia 1.0.1 (rev 38011cc48d)\r\n OS: nt\r\n Python: 3.11.1\r\n Twisted: 22.10.0\r\n Django: 4.1.4\r\n\r\n#### Additional context\r\n\r\nThis is based off helix4's message in #general on discord. I added my current steps that seem to reproduce the same issue down below. Here is the original message from helix4, with steps for reproducing on the older version of the code.\r\n\r\nI am trying to test XYZGrid on a brand new install but failing. 1. cloned the single branch of evennia-develop, and initiated an evennia game. 2. installed requirements_extra, and migrated. I can run the server and log in. 3. i added the command set and reloadead, i see path, goto, map ingame. the Limbo room does not have a map. seems to work well. 4. modify the server/conf/settings.py, xyzgrid is now available.\r\n\r\nWhen I use xyzgrid, such as evennia xyzgrid help, or any other xyzgrid command:\r\n from evennia.utils.eveditor import EvEditor\r\n File \"/home/ubuntu/3ProjectMUD/evennia/evennia/utils/eveditor.py\", line 201, in <module>\r\n class SaveYesNoCmdSet(CmdSet):\r\nTypeError: NoneType takes no arguments\r\n\r\nOriginal message\r\n\r\nhttps://discord.com/channels/246323978879107073/246323978879107073/937578545704730624\r\n\r\nGriatch's response\r\n\r\nhttps://discord.com/channels/246323978879107073/246323978879107073/937610453184561183\r\n\r\nSteps:\r\n\r\n1. Migrated from git master branch to main branch for 1.x release of Evennia.\r\n2. Installed [extra](use to be in requirements_extra). \r\n\r\nAt this point, I can run the server and log in.\r\n\r\n3. Added the xyzgrid command set and restarted. \r\n\r\n'path', 'goto', 'map' are seen in the command list. The Limbo room does not have a map. 
Everything appears to work fine.\r\n\r\n4. Modify the server/conf/settings.py.\r\n\r\nxyzgrid is now available.\r\n\r\nWhen I use xyzgrid, such as 'evennia xyzgrid help', or any other xyzgrid command:\r\n\r\nTraceback (most recent call last):\r\n File \"C:\\muddev\\evenv\\Scripts\\evennia_launcher.py\", line 18, in <module>\r\n main()\r\n File \"C:\\muddev\\evennia\\evennia\\server\\evennia_launcher.py\", line 2422, in main\r\n if run_custom_commands(option, *unknown_args):\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"C:\\muddev\\evennia\\evennia\\server\\evennia_launcher.py\", line 2023, in run_custom_commands\r\n mod = importlib.import_module(modpath)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n...\r\n\r\n File \"<frozen importlib._bootstrap>\", line 1206, in _gcd_import\r\n File \"<frozen importlib._bootstrap>\", line 1178, in _find_and_load\r\n File \"<frozen importlib._bootstrap>\", line 1128, in _find_and_load_unlocked\r\n File \"<frozen importlib._bootstrap>\", line 241, in _call_with_frames_removed\r\n File \"<frozen importlib._bootstrap>\", line 1206, in _gcd_import\r\n File \"<frozen importlib._bootstrap>\", line 1178, in _find_and_load\r\n File \"<frozen importlib._bootstrap>\", line 1149, in _find_and_load_unlocked\r\n File \"<frozen importlib._bootstrap>\", line 690, in _load_unlocked\r\n File \"<frozen importlib._bootstrap_external>\", line 940, in exec_module\r\n File \"<frozen importlib._bootstrap>\", line 241, in _call_with_frames_removed\r\n File \"C:\\muddev\\evennia\\evennia\\contrib\\grid\\xyzgrid\\__init__.py\", line 6, in <module>\r\n from . import commands # noqa\r\n ^^^^^^^^^^^^^^^^^^^^^^\r\n File \"C:\\muddev\\evennia\\evennia\\contrib\\grid\\xyzgrid\\commands.py\", line 15, in <module>\r\n from evennia.commands.default import building\r\n File \"C:\\muddev\\evennia\\evennia\\commands\\default\\building.py\", line 14, in <module>\r\n from evennia.prototypes import menus as olc_menus\r\n File \"C:\\muddev\\evennia\\evennia\\prototypes\\menus.py\", line 20, in <module>\r\n from evennia.utils.evmenu import EvMenu, list_node\r\n File \"C:\\muddev\\evennia\\evennia\\utils\\evmenu.py\", line 350, in <module>\r\n class CmdEvMenuNode(Command):\r\nTypeError: NoneType takes no arguments\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nXYZGrid - Griatch 2021\n\n\"\"\"\n\nfrom . import commands # noqa\nfrom . import example # noqa\nfrom . import launchcmd # noqa\nfrom . import prototypes # noqa\nfrom . import tests # noqa\nfrom . import utils # noqa\nfrom . import xymap # noqa\nfrom . import xymap_legend # noqa\nfrom . import xyzgrid # noqa\nfrom . import xyzroom # noqa\n", "path": "evennia/contrib/grid/xyzgrid/__init__.py"}], "after_files": [{"content": "\"\"\"\nXYZGrid - Griatch 2021\n\n\"\"\"\nfrom . import (\n example,\n launchcmd,\n prototypes,\n tests,\n utils,\n xymap,\n xymap_legend,\n xyzgrid,\n xyzroom,\n commands,\n)\n", "path": "evennia/contrib/grid/xyzgrid/__init__.py"}]}
| 1,817 | 232 |
gh_patches_debug_34336
|
rasdani/github-patches
|
git_diff
|
CTFd__CTFd-461
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Optimize top10 call
https://github.com/CTFd/CTFd/blob/master/CTFd/scoreboard.py#L125-L127
This code seems pretty wasteful and is likely getting hit fairly often. Optimizing this to be a single database query is likely a good idea.
--- END ISSUE ---
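The loop flagged in the issue runs one `Solves` query and one `Awards` query per team, so a top-10 request costs about twenty queries on top of the standings query. The patch later in this entry collapses that to two queries by filtering on all team ids at once; the sketch below restates the idea in isolation. It is a simplified, hypothetical fragment that assumes the `Solves`/`Awards` models and the `standings` rows shown in the file below.

```
# Hypothetical sketch of the batched lookup: two queries instead of two per team.
team_ids = [team.teamid for team in standings]

solves = Solves.query.filter(Solves.teamid.in_(team_ids)).all()
awards = Awards.query.filter(Awards.teamid.in_(team_ids)).all()

# Regroup in Python so each team's rows can still be emitted separately.
solves_by_team = {}
for solve in solves:
    solves_by_team.setdefault(solve.teamid, []).append(solve)
awards_by_team = {}
for award in awards:
    awards_by_team.setdefault(award.teamid, []).append(award)
```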
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `CTFd/scoreboard.py`
Content:
```
1 from flask import render_template, jsonify, Blueprint, redirect, url_for, request
2 from sqlalchemy.sql.expression import union_all
3
4 from CTFd.models import db, Teams, Solves, Awards, Challenges
5
6 from CTFd import utils
7
8 scoreboard = Blueprint('scoreboard', __name__)
9
10
11 def get_standings(admin=False, count=None):
12 scores = db.session.query(
13 Solves.teamid.label('teamid'),
14 db.func.sum(Challenges.value).label('score'),
15 db.func.max(Solves.id).label('id'),
16 db.func.max(Solves.date).label('date')
17 ).join(Challenges).group_by(Solves.teamid)
18
19 awards = db.session.query(
20 Awards.teamid.label('teamid'),
21 db.func.sum(Awards.value).label('score'),
22 db.func.max(Awards.id).label('id'),
23 db.func.max(Awards.date).label('date')
24 ).group_by(Awards.teamid)
25
26 """
27 Filter out solves and awards that are before a specific time point.
28 """
29 freeze = utils.get_config('freeze')
30 if not admin and freeze:
31 scores = scores.filter(Solves.date < utils.unix_time_to_utc(freeze))
32 awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))
33
34 """
35 Combine awards and solves with a union. They should have the same amount of columns
36 """
37 results = union_all(scores, awards).alias('results')
38
39 """
40 Sum each of the results by the team id to get their score.
41 """
42 sumscores = db.session.query(
43 results.columns.teamid,
44 db.func.sum(results.columns.score).label('score'),
45 db.func.max(results.columns.id).label('id'),
46 db.func.max(results.columns.date).label('date')
47 ).group_by(results.columns.teamid).subquery()
48
49 """
50 Admins can see scores for all users but the public cannot see banned users.
51
52 Filters out banned users.
53 Properly resolves value ties by ID.
54
55 Different databases treat time precision differently so resolve by the row ID instead.
56 """
57 if admin:
58 standings_query = db.session.query(
59 Teams.id.label('teamid'),
60 Teams.name.label('name'),
61 Teams.banned, sumscores.columns.score
62 )\
63 .join(sumscores, Teams.id == sumscores.columns.teamid) \
64 .order_by(sumscores.columns.score.desc(), sumscores.columns.id)
65 else:
66 standings_query = db.session.query(
67 Teams.id.label('teamid'),
68 Teams.name.label('name'),
69 sumscores.columns.score
70 )\
71 .join(sumscores, Teams.id == sumscores.columns.teamid) \
72 .filter(Teams.banned == False) \
73 .order_by(sumscores.columns.score.desc(), sumscores.columns.id)
74
75 """
76 Only select a certain amount of users if asked.
77 """
78 if count is None:
79 standings = standings_query.all()
80 else:
81 standings = standings_query.limit(count).all()
82 db.session.close()
83
84 return standings
85
86
87 @scoreboard.route('/scoreboard')
88 def scoreboard_view():
89 if utils.get_config('view_scoreboard_if_authed') and not utils.authed():
90 return redirect(url_for('auth.login', next=request.path))
91 if utils.hide_scores():
92 return render_template('scoreboard.html', errors=['Scores are currently hidden'])
93 standings = get_standings()
94 return render_template('scoreboard.html', teams=standings, score_frozen=utils.is_scoreboard_frozen())
95
96
97 @scoreboard.route('/scores')
98 def scores():
99 json = {'standings': []}
100 if utils.get_config('view_scoreboard_if_authed') and not utils.authed():
101 return redirect(url_for('auth.login', next=request.path))
102 if utils.hide_scores():
103 return jsonify(json)
104
105 standings = get_standings()
106
107 for i, x in enumerate(standings):
108 json['standings'].append({'pos': i + 1, 'id': x.teamid, 'team': x.name, 'score': int(x.score)})
109 return jsonify(json)
110
111
112 @scoreboard.route('/top/<int:count>')
113 def topteams(count):
114 json = {'places': {}}
115 if utils.get_config('view_scoreboard_if_authed') and not utils.authed():
116 return redirect(url_for('auth.login', next=request.path))
117 if utils.hide_scores():
118 return jsonify(json)
119
120 if count > 20 or count < 0:
121 count = 10
122
123 standings = get_standings(count=count)
124
125 for i, team in enumerate(standings):
126 solves = Solves.query.filter_by(teamid=team.teamid)
127 awards = Awards.query.filter_by(teamid=team.teamid)
128
129 freeze = utils.get_config('freeze')
130
131 if freeze:
132 solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))
133 awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))
134
135 solves = solves.all()
136 awards = awards.all()
137
138 json['places'][i + 1] = {
139 'id': team.teamid,
140 'name': team.name,
141 'solves': []
142 }
143 for x in solves:
144 json['places'][i + 1]['solves'].append({
145 'chal': x.chalid,
146 'team': x.teamid,
147 'value': x.chal.value,
148 'time': utils.unix_time(x.date)
149 })
150 for award in awards:
151 json['places'][i + 1]['solves'].append({
152 'chal': None,
153 'team': award.teamid,
154 'value': award.value,
155 'time': utils.unix_time(award.date)
156 })
157 json['places'][i + 1]['solves'] = sorted(json['places'][i + 1]['solves'], key=lambda k: k['time'])
158 return jsonify(json)
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/CTFd/scoreboard.py b/CTFd/scoreboard.py
--- a/CTFd/scoreboard.py
+++ b/CTFd/scoreboard.py
@@ -122,37 +122,42 @@
standings = get_standings(count=count)
- for i, team in enumerate(standings):
- solves = Solves.query.filter_by(teamid=team.teamid)
- awards = Awards.query.filter_by(teamid=team.teamid)
+ team_ids = [team.teamid for team in standings]
- freeze = utils.get_config('freeze')
+ solves = Solves.query.filter(Solves.teamid.in_(team_ids))
+ awards = Awards.query.filter(Awards.teamid.in_(team_ids))
- if freeze:
- solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))
- awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))
+ freeze = utils.get_config('freeze')
+
+ if freeze:
+ solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))
+ awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))
- solves = solves.all()
- awards = awards.all()
+ solves = solves.all()
+ awards = awards.all()
+ for i, team in enumerate(team_ids):
json['places'][i + 1] = {
- 'id': team.teamid,
- 'name': team.name,
+ 'id': standings[i].teamid,
+ 'name': standings[i].name,
'solves': []
}
- for x in solves:
- json['places'][i + 1]['solves'].append({
- 'chal': x.chalid,
- 'team': x.teamid,
- 'value': x.chal.value,
- 'time': utils.unix_time(x.date)
- })
+ for solve in solves:
+ if solve.teamid == team:
+ json['places'][i + 1]['solves'].append({
+ 'chal': solve.chalid,
+ 'team': solve.teamid,
+ 'value': solve.chal.value,
+ 'time': utils.unix_time(solve.date)
+ })
for award in awards:
- json['places'][i + 1]['solves'].append({
- 'chal': None,
- 'team': award.teamid,
- 'value': award.value,
- 'time': utils.unix_time(award.date)
- })
+ if award.teamid == team:
+ json['places'][i + 1]['solves'].append({
+ 'chal': None,
+ 'team': award.teamid,
+ 'value': award.value,
+ 'time': utils.unix_time(award.date)
+ })
json['places'][i + 1]['solves'] = sorted(json['places'][i + 1]['solves'], key=lambda k: k['time'])
+
return jsonify(json)
|
{"golden_diff": "diff --git a/CTFd/scoreboard.py b/CTFd/scoreboard.py\n--- a/CTFd/scoreboard.py\n+++ b/CTFd/scoreboard.py\n@@ -122,37 +122,42 @@\n \n standings = get_standings(count=count)\n \n- for i, team in enumerate(standings):\n- solves = Solves.query.filter_by(teamid=team.teamid)\n- awards = Awards.query.filter_by(teamid=team.teamid)\n+ team_ids = [team.teamid for team in standings]\n \n- freeze = utils.get_config('freeze')\n+ solves = Solves.query.filter(Solves.teamid.in_(team_ids))\n+ awards = Awards.query.filter(Awards.teamid.in_(team_ids))\n \n- if freeze:\n- solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))\n- awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n+ freeze = utils.get_config('freeze')\n+\n+ if freeze:\n+ solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))\n+ awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n \n- solves = solves.all()\n- awards = awards.all()\n+ solves = solves.all()\n+ awards = awards.all()\n \n+ for i, team in enumerate(team_ids):\n json['places'][i + 1] = {\n- 'id': team.teamid,\n- 'name': team.name,\n+ 'id': standings[i].teamid,\n+ 'name': standings[i].name,\n 'solves': []\n }\n- for x in solves:\n- json['places'][i + 1]['solves'].append({\n- 'chal': x.chalid,\n- 'team': x.teamid,\n- 'value': x.chal.value,\n- 'time': utils.unix_time(x.date)\n- })\n+ for solve in solves:\n+ if solve.teamid == team:\n+ json['places'][i + 1]['solves'].append({\n+ 'chal': solve.chalid,\n+ 'team': solve.teamid,\n+ 'value': solve.chal.value,\n+ 'time': utils.unix_time(solve.date)\n+ })\n for award in awards:\n- json['places'][i + 1]['solves'].append({\n- 'chal': None,\n- 'team': award.teamid,\n- 'value': award.value,\n- 'time': utils.unix_time(award.date)\n- })\n+ if award.teamid == team:\n+ json['places'][i + 1]['solves'].append({\n+ 'chal': None,\n+ 'team': award.teamid,\n+ 'value': award.value,\n+ 'time': utils.unix_time(award.date)\n+ })\n json['places'][i + 1]['solves'] = sorted(json['places'][i + 1]['solves'], key=lambda k: k['time'])\n+\n return jsonify(json)\n", "issue": "Optimize top10 call\nhttps://github.com/CTFd/CTFd/blob/master/CTFd/scoreboard.py#L125-L127\r\n\r\nThis code seems pretty wasteful and is likely getting hit fairly often. Optimizing this to be a single database query is likely a good idea. \n", "before_files": [{"content": "from flask import render_template, jsonify, Blueprint, redirect, url_for, request\nfrom sqlalchemy.sql.expression import union_all\n\nfrom CTFd.models import db, Teams, Solves, Awards, Challenges\n\nfrom CTFd import utils\n\nscoreboard = Blueprint('scoreboard', __name__)\n\n\ndef get_standings(admin=False, count=None):\n scores = db.session.query(\n Solves.teamid.label('teamid'),\n db.func.sum(Challenges.value).label('score'),\n db.func.max(Solves.id).label('id'),\n db.func.max(Solves.date).label('date')\n ).join(Challenges).group_by(Solves.teamid)\n\n awards = db.session.query(\n Awards.teamid.label('teamid'),\n db.func.sum(Awards.value).label('score'),\n db.func.max(Awards.id).label('id'),\n db.func.max(Awards.date).label('date')\n ).group_by(Awards.teamid)\n\n \"\"\"\n Filter out solves and awards that are before a specific time point.\n \"\"\"\n freeze = utils.get_config('freeze')\n if not admin and freeze:\n scores = scores.filter(Solves.date < utils.unix_time_to_utc(freeze))\n awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n\n \"\"\"\n Combine awards and solves with a union. 
They should have the same amount of columns\n \"\"\"\n results = union_all(scores, awards).alias('results')\n\n \"\"\"\n Sum each of the results by the team id to get their score.\n \"\"\"\n sumscores = db.session.query(\n results.columns.teamid,\n db.func.sum(results.columns.score).label('score'),\n db.func.max(results.columns.id).label('id'),\n db.func.max(results.columns.date).label('date')\n ).group_by(results.columns.teamid).subquery()\n\n \"\"\"\n Admins can see scores for all users but the public cannot see banned users.\n\n Filters out banned users.\n Properly resolves value ties by ID.\n\n Different databases treat time precision differently so resolve by the row ID instead.\n \"\"\"\n if admin:\n standings_query = db.session.query(\n Teams.id.label('teamid'),\n Teams.name.label('name'),\n Teams.banned, sumscores.columns.score\n )\\\n .join(sumscores, Teams.id == sumscores.columns.teamid) \\\n .order_by(sumscores.columns.score.desc(), sumscores.columns.id)\n else:\n standings_query = db.session.query(\n Teams.id.label('teamid'),\n Teams.name.label('name'),\n sumscores.columns.score\n )\\\n .join(sumscores, Teams.id == sumscores.columns.teamid) \\\n .filter(Teams.banned == False) \\\n .order_by(sumscores.columns.score.desc(), sumscores.columns.id)\n\n \"\"\"\n Only select a certain amount of users if asked.\n \"\"\"\n if count is None:\n standings = standings_query.all()\n else:\n standings = standings_query.limit(count).all()\n db.session.close()\n\n return standings\n\n\[email protected]('/scoreboard')\ndef scoreboard_view():\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return render_template('scoreboard.html', errors=['Scores are currently hidden'])\n standings = get_standings()\n return render_template('scoreboard.html', teams=standings, score_frozen=utils.is_scoreboard_frozen())\n\n\[email protected]('/scores')\ndef scores():\n json = {'standings': []}\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return jsonify(json)\n\n standings = get_standings()\n\n for i, x in enumerate(standings):\n json['standings'].append({'pos': i + 1, 'id': x.teamid, 'team': x.name, 'score': int(x.score)})\n return jsonify(json)\n\n\[email protected]('/top/<int:count>')\ndef topteams(count):\n json = {'places': {}}\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return jsonify(json)\n\n if count > 20 or count < 0:\n count = 10\n\n standings = get_standings(count=count)\n\n for i, team in enumerate(standings):\n solves = Solves.query.filter_by(teamid=team.teamid)\n awards = Awards.query.filter_by(teamid=team.teamid)\n\n freeze = utils.get_config('freeze')\n\n if freeze:\n solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))\n awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n\n solves = solves.all()\n awards = awards.all()\n\n json['places'][i + 1] = {\n 'id': team.teamid,\n 'name': team.name,\n 'solves': []\n }\n for x in solves:\n json['places'][i + 1]['solves'].append({\n 'chal': x.chalid,\n 'team': x.teamid,\n 'value': x.chal.value,\n 'time': utils.unix_time(x.date)\n })\n for award in awards:\n json['places'][i + 1]['solves'].append({\n 'chal': None,\n 'team': award.teamid,\n 'value': award.value,\n 'time': 
utils.unix_time(award.date)\n })\n json['places'][i + 1]['solves'] = sorted(json['places'][i + 1]['solves'], key=lambda k: k['time'])\n return jsonify(json)\n", "path": "CTFd/scoreboard.py"}], "after_files": [{"content": "from flask import render_template, jsonify, Blueprint, redirect, url_for, request\nfrom sqlalchemy.sql.expression import union_all\n\nfrom CTFd.models import db, Teams, Solves, Awards, Challenges\n\nfrom CTFd import utils\n\nscoreboard = Blueprint('scoreboard', __name__)\n\n\ndef get_standings(admin=False, count=None):\n scores = db.session.query(\n Solves.teamid.label('teamid'),\n db.func.sum(Challenges.value).label('score'),\n db.func.max(Solves.id).label('id'),\n db.func.max(Solves.date).label('date')\n ).join(Challenges).group_by(Solves.teamid)\n\n awards = db.session.query(\n Awards.teamid.label('teamid'),\n db.func.sum(Awards.value).label('score'),\n db.func.max(Awards.id).label('id'),\n db.func.max(Awards.date).label('date')\n ).group_by(Awards.teamid)\n\n \"\"\"\n Filter out solves and awards that are before a specific time point.\n \"\"\"\n freeze = utils.get_config('freeze')\n if not admin and freeze:\n scores = scores.filter(Solves.date < utils.unix_time_to_utc(freeze))\n awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n\n \"\"\"\n Combine awards and solves with a union. They should have the same amount of columns\n \"\"\"\n results = union_all(scores, awards).alias('results')\n\n \"\"\"\n Sum each of the results by the team id to get their score.\n \"\"\"\n sumscores = db.session.query(\n results.columns.teamid,\n db.func.sum(results.columns.score).label('score'),\n db.func.max(results.columns.id).label('id'),\n db.func.max(results.columns.date).label('date')\n ).group_by(results.columns.teamid).subquery()\n\n \"\"\"\n Admins can see scores for all users but the public cannot see banned users.\n\n Filters out banned users.\n Properly resolves value ties by ID.\n\n Different databases treat time precision differently so resolve by the row ID instead.\n \"\"\"\n if admin:\n standings_query = db.session.query(\n Teams.id.label('teamid'),\n Teams.name.label('name'),\n Teams.banned, sumscores.columns.score\n )\\\n .join(sumscores, Teams.id == sumscores.columns.teamid) \\\n .order_by(sumscores.columns.score.desc(), sumscores.columns.id)\n else:\n standings_query = db.session.query(\n Teams.id.label('teamid'),\n Teams.name.label('name'),\n sumscores.columns.score\n )\\\n .join(sumscores, Teams.id == sumscores.columns.teamid) \\\n .filter(Teams.banned == False) \\\n .order_by(sumscores.columns.score.desc(), sumscores.columns.id)\n\n \"\"\"\n Only select a certain amount of users if asked.\n \"\"\"\n if count is None:\n standings = standings_query.all()\n else:\n standings = standings_query.limit(count).all()\n db.session.close()\n\n return standings\n\n\[email protected]('/scoreboard')\ndef scoreboard_view():\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return render_template('scoreboard.html', errors=['Scores are currently hidden'])\n standings = get_standings()\n return render_template('scoreboard.html', teams=standings, score_frozen=utils.is_scoreboard_frozen())\n\n\[email protected]('/scores')\ndef scores():\n json = {'standings': []}\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return jsonify(json)\n\n 
standings = get_standings()\n\n for i, x in enumerate(standings):\n json['standings'].append({'pos': i + 1, 'id': x.teamid, 'team': x.name, 'score': int(x.score)})\n return jsonify(json)\n\n\[email protected]('/top/<int:count>')\ndef topteams(count):\n json = {'places': {}}\n if utils.get_config('view_scoreboard_if_authed') and not utils.authed():\n return redirect(url_for('auth.login', next=request.path))\n if utils.hide_scores():\n return jsonify(json)\n\n if count > 20 or count < 0:\n count = 10\n\n standings = get_standings(count=count)\n\n team_ids = [team.teamid for team in standings]\n\n solves = Solves.query.filter(Solves.teamid.in_(team_ids))\n awards = Awards.query.filter(Awards.teamid.in_(team_ids))\n\n freeze = utils.get_config('freeze')\n\n if freeze:\n solves = solves.filter(Solves.date < utils.unix_time_to_utc(freeze))\n awards = awards.filter(Awards.date < utils.unix_time_to_utc(freeze))\n\n solves = solves.all()\n awards = awards.all()\n\n for i, team in enumerate(team_ids):\n json['places'][i + 1] = {\n 'id': standings[i].teamid,\n 'name': standings[i].name,\n 'solves': []\n }\n for solve in solves:\n if solve.teamid == team:\n json['places'][i + 1]['solves'].append({\n 'chal': solve.chalid,\n 'team': solve.teamid,\n 'value': solve.chal.value,\n 'time': utils.unix_time(solve.date)\n })\n for award in awards:\n if award.teamid == team:\n json['places'][i + 1]['solves'].append({\n 'chal': None,\n 'team': award.teamid,\n 'value': award.value,\n 'time': utils.unix_time(award.date)\n })\n json['places'][i + 1]['solves'] = sorted(json['places'][i + 1]['solves'], key=lambda k: k['time'])\n\n return jsonify(json)\n", "path": "CTFd/scoreboard.py"}]}
| 1,969 | 677 |
gh_patches_debug_25120
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-936
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ERROR:checkov.terraform.checks.resource.gcp.CloudStorageSelfLogging:Failed to run check: Bucket should not log to itself for configuration
**Describe the bug**
Error during check with rule CKV_GCP_63 (not clearly identified)
Here is the result of my terraform plan for my GCS resource using [terraform google storage module](https://github.com/terraform-google-modules/terraform-google-cloud-storage)
```
resource "google_storage_bucket" "buckets" {
bucket_policy_only = (known after apply)
force_destroy = false
id = (known after apply)
labels = {
"name" = "xxxxxxxx-bucket"
}
location = "EU"
name = "xxxxxxxx-bucket"
project = "xxxxxxxx-project"
self_link = (known after apply)
storage_class = "STANDARD"
uniform_bucket_level_access = true
url = (known after apply)
versioning {
enabled = true
}
}
```
```
ERROR:checkov.terraform.checks.resource.gcp.CloudStorageSelfLogging:Failed to run check: Bucket should not log to itself for configuration
[[]], 'default_event_based_hold': [None], 'encryption': [[]], 'force_destroy': [False], 'labels': [{'name': ['xxxxxxxx-bucket'], 'start_line': [158], 'end_line': [160]}], 'lifecycle_rule': [[]], 'location': ['EU'], 'logging': [[]], 'name': ['xxxxxxxx-bucket'], 'project': ['xxxxxxxx-project'], 'requester_pays': [None], 'retention_policy': [[]], 'storage_class': ['STANDARD'], 'uniform_bucket_level_access': [True], 'versioning': [{'enabled': [True], 'start_line': [171], 'end_line': [173]}], 'website': [[]], 'start_line': [153], 'end_line': [176]} at file: /checkov.tfplan.json
Traceback (most recent call last):
File "/usr/local/bin/checkov", line 5, in <module>
run()
File "/usr/local/lib/python3.8/site-packages/checkov/main.py", line 96, in run
scan_reports = runner_registry.run(external_checks_dir=external_checks_dir, files=args.file,
File "/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py", line 34, in run
scan_report = runner.run(root_folder, external_checks_dir=external_checks_dir, files=files,
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 65, in run
self.check_tf_definition(report, runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 79, in check_tf_definition
self.run_block(definition[block_type], full_file_path, report, scanned_file,
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 95, in run_block
results = registry.scan(scanned_file, entity, [], runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 109, in scan
result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 115, in run_check
result = check.run(scanned_file=scanned_file, entity_configuration=entity_configuration,
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py", line 62, in run
raise e
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py", line 42, in run
check_result['result'] = self.scan_entity_conf(entity_configuration, entity_type)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 17, in scan_entity_conf
return self.scan_resource_conf(conf, entity_type)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 33, in wrapper
return wrapped(self, conf)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py", line 17, in scan_resource_conf
if conf['logging'][0]['log_bucket']:
TypeError: list indices must be integers or slices, not str
```
**To Reproduce**
Steps to reproduce the behavior:
1. Using [terraform GCS module](https://github.com/terraform-google-modules/terraform-google-cloud-storage) in your terraform
2. Run checkov scan
3. See error
**Information**
- Checkov Version 1.0.799 (from docker image)
--- END ISSUE ---
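The key detail in the traceback is that the plan file renders an absent `logging` block as `'logging': [[]]`, so `conf['logging'][0]` is an empty list and indexing it with the string `'log_bucket'` raises the TypeError. The guard therefore has to confirm that the first element is a populated dict before reading `log_bucket`. The standalone sketch below is a hypothetical, slightly more defensive variant of the check shown later in this entry.

```
# Hypothetical guard: 'logging' may arrive as [[]] (an empty block from a plan
# file) instead of [{'log_bucket': ['...']}].
def get_log_bucket(conf):
    logging_blocks = conf.get('logging') or []
    if logging_blocks and isinstance(logging_blocks[0], dict):
        return logging_blocks[0].get('log_bucket')
    return None
```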
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/gcp/CloudStorageLogging.py`
Content:
```
1 from checkov.common.models.enums import CheckResult, CheckCategories
2 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
3
4
5 class CloudStorageLogging(BaseResourceCheck):
6 def __init__(self):
7 name = "Bucket should log access"
8 id = "CKV_GCP_62"
9 supported_resources = ['google_storage_bucket']
10 categories = [CheckCategories.LOGGING]
11 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
12
13 def scan_resource_conf(self, conf):
14 #check fot logging
15 if 'logging' in conf.keys():
16 if conf['logging'][0]['log_bucket']:
17 log_bucket_name = conf['logging'][0]['log_bucket']
18 if log_bucket_name != None:
19 return CheckResult.PASSED
20 else:
21 return CheckResult.FAILED
22 else:
23 return CheckResult.FAILED
24 return CheckResult.FAILED
25 else:
26 return CheckResult.FAILED
27 return CheckResult.FAILED
28
29 check = CloudStorageLogging()
30
```
Path: `checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py`
Content:
```
1 from checkov.common.models.enums import CheckResult, CheckCategories
2 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
3
4
5 class CloudStorageSelfLogging(BaseResourceCheck):
6 def __init__(self):
7 name = "Bucket should not log to itself"
8 id = "CKV_GCP_63"
9 supported_resources = ['google_storage_bucket']
10 categories = [CheckCategories.LOGGING]
11 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
12
13 def scan_resource_conf(self, conf):
14 bucket_name = conf['name']
15 #check fot logging
16 if 'logging' in conf.keys():
17 if conf['logging'][0]['log_bucket']:
18 log_bucket_name = conf['logging'][0]['log_bucket']
19 if log_bucket_name != bucket_name:
20 return CheckResult.PASSED
21 else:
22 return CheckResult.FAILED
23 else:
24 return CheckResult.FAILED
25 return CheckResult.FAILED
26 return CheckResult.UNKNOWN
27
28 check = CloudStorageSelfLogging()
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py b/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py
--- a/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py
+++ b/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py
@@ -11,9 +11,9 @@
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
def scan_resource_conf(self, conf):
- #check fot logging
- if 'logging' in conf.keys():
- if conf['logging'][0]['log_bucket']:
+ #check for logging
+ if 'logging' in conf:
+ if conf['logging'][0]:
log_bucket_name = conf['logging'][0]['log_bucket']
if log_bucket_name != None:
return CheckResult.PASSED
diff --git a/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py b/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py
--- a/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py
+++ b/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py
@@ -12,9 +12,9 @@
def scan_resource_conf(self, conf):
bucket_name = conf['name']
- #check fot logging
- if 'logging' in conf.keys():
- if conf['logging'][0]['log_bucket']:
+ #check for logging
+ if 'logging' in conf:
+ if conf['logging'][0]:
log_bucket_name = conf['logging'][0]['log_bucket']
if log_bucket_name != bucket_name:
return CheckResult.PASSED
|
{"golden_diff": "diff --git a/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py b/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py\n--- a/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py\n+++ b/checkov/terraform/checks/resource/gcp/CloudStorageLogging.py\n@@ -11,9 +11,9 @@\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n def scan_resource_conf(self, conf):\n- #check fot logging\n- if 'logging' in conf.keys():\n- if conf['logging'][0]['log_bucket']:\n+ #check for logging\n+ if 'logging' in conf:\n+ if conf['logging'][0]:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != None:\n return CheckResult.PASSED\ndiff --git a/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py b/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py\n--- a/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py\n+++ b/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py\n@@ -12,9 +12,9 @@\n \n def scan_resource_conf(self, conf):\n bucket_name = conf['name']\n- #check fot logging\n- if 'logging' in conf.keys():\n- if conf['logging'][0]['log_bucket']:\n+ #check for logging\n+ if 'logging' in conf:\n+ if conf['logging'][0]:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != bucket_name:\n return CheckResult.PASSED\n", "issue": "ERROR:checkov.terraform.checks.resource.gcp.CloudStorageSelfLogging:Failed to run check: Bucket should not log to itself for configuration\n**Describe the bug**\r\nError during check with rule CKV_GCP_63 (not clearly identify)\r\n\r\nHere is the result of my terraform plan for my GCS resource using [terraform google storage module](https://github.com/terraform-google-modules/terraform-google-cloud-storage)\r\n```\r\nresource \"google_storage_bucket\" \"buckets\" {\r\n bucket_policy_only = (known after apply)\r\n force_destroy = false\r\n id = (known after apply)\r\n labels = {\r\n \"name\" = \"xxxxxxxx-bucket\"\r\n }\r\n location = \"EU\"\r\n name = \"xxxxxxxx-bucket\"\r\n project = \"xxxxxxxx-project\"\r\n self_link = (known after apply)\r\n storage_class = \"STANDARD\"\r\n uniform_bucket_level_access = true\r\n url = (known after apply)\r\n versioning {\r\n enabled = true\r\n }\r\n }\r\n```\r\n\r\n```\r\nERROR:checkov.terraform.checks.resource.gcp.CloudStorageSelfLogging:Failed to run check: Bucket should not log to itself for configuration\r\n[[]], 'default_event_based_hold': [None], 'encryption': [[]], 'force_destroy': [False], 'labels': [{'name': ['xxxxxxxx-bucket'], 'start_line': [158], 'end_line': [160]}], 'lifecycle_rule': [[]], 'location': ['EU'], 'logging': [[]], 'name': ['xxxxxxxx-bucket'], 'project': ['xxxxxxxx-project'], 'requester_pays': [None], 'retention_policy': [[]], 'storage_class': ['STANDARD'], 'uniform_bucket_level_access': [True], 'versioning': [{'enabled': [True], 'start_line': [171], 'end_line': [173]}], 'website': [[]], 'start_line': [153], 'end_line': [176]} at file: /checkov.tfplan.json\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/checkov\", line 5, in <module>\r\n run()\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/main.py\", line 96, in run\r\n scan_reports = runner_registry.run(external_checks_dir=external_checks_dir, files=args.file,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py\", line 34, in run\r\n scan_report = runner.run(root_folder, external_checks_dir=external_checks_dir, files=files,\r\n File 
\"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 65, in run\r\n self.check_tf_definition(report, runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 79, in check_tf_definition\r\n self.run_block(definition[block_type], full_file_path, report, scanned_file,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 95, in run_block\r\n results = registry.scan(scanned_file, entity, [], runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 109, in scan\r\n result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 115, in run_check\r\n result = check.run(scanned_file=scanned_file, entity_configuration=entity_configuration,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py\", line 62, in run\r\n raise e\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py\", line 42, in run\r\n check_result['result'] = self.scan_entity_conf(entity_configuration, entity_type)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 17, in scan_entity_conf\r\n return self.scan_resource_conf(conf, entity_type)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 33, in wrapper\r\n return wrapped(self, conf)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py\", line 17, in scan_resource_conf\r\n if conf['logging'][0]['log_bucket']:\r\nTypeError: list indices must be integers or slices, not str\r\n```\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Using [terraform GCS module](https://github.com/terraform-google-modules/terraform-google-cloud-storage) in your terraform\r\n2. Run checkov scan\r\n3. 
See error\r\n\r\n**Informations**\r\n - Checkov Version 1.0.799 (from docker image)\n", "before_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass CloudStorageLogging(BaseResourceCheck):\n def __init__(self):\n name = \"Bucket should log access\"\n id = \"CKV_GCP_62\"\n supported_resources = ['google_storage_bucket']\n categories = [CheckCategories.LOGGING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n #check fot logging\n if 'logging' in conf.keys():\n if conf['logging'][0]['log_bucket']:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != None:\n return CheckResult.PASSED\n else:\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n\ncheck = CloudStorageLogging()\n", "path": "checkov/terraform/checks/resource/gcp/CloudStorageLogging.py"}, {"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass CloudStorageSelfLogging(BaseResourceCheck):\n def __init__(self):\n name = \"Bucket should not log to itself\"\n id = \"CKV_GCP_63\"\n supported_resources = ['google_storage_bucket']\n categories = [CheckCategories.LOGGING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n bucket_name = conf['name']\n #check fot logging\n if 'logging' in conf.keys():\n if conf['logging'][0]['log_bucket']:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != bucket_name:\n return CheckResult.PASSED\n else:\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n return CheckResult.UNKNOWN\n\ncheck = CloudStorageSelfLogging()\n", "path": "checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py"}], "after_files": [{"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass CloudStorageLogging(BaseResourceCheck):\n def __init__(self):\n name = \"Bucket should log access\"\n id = \"CKV_GCP_62\"\n supported_resources = ['google_storage_bucket']\n categories = [CheckCategories.LOGGING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n #check for logging\n if 'logging' in conf:\n if conf['logging'][0]:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != None:\n return CheckResult.PASSED\n else:\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n\ncheck = CloudStorageLogging()\n", "path": "checkov/terraform/checks/resource/gcp/CloudStorageLogging.py"}, {"content": "from checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass CloudStorageSelfLogging(BaseResourceCheck):\n def __init__(self):\n name = \"Bucket should not log to itself\"\n id = \"CKV_GCP_63\"\n supported_resources = ['google_storage_bucket']\n categories = [CheckCategories.LOGGING]\n 
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n bucket_name = conf['name']\n #check for logging\n if 'logging' in conf:\n if conf['logging'][0]:\n log_bucket_name = conf['logging'][0]['log_bucket']\n if log_bucket_name != bucket_name:\n return CheckResult.PASSED\n else:\n return CheckResult.FAILED\n else:\n return CheckResult.FAILED\n return CheckResult.FAILED\n return CheckResult.UNKNOWN\n\ncheck = CloudStorageSelfLogging()\n", "path": "checkov/terraform/checks/resource/gcp/CloudStorageSelfLogging.py"}]}
| 1,943 | 382 |
gh_patches_debug_15645
|
rasdani/github-patches
|
git_diff
|
netbox-community__netbox-7928
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Don't fetch LDAP user and groups on all API request when FIND_GROUP_PERMS is disabled
### NetBox version
V3.0.9
### Feature type
Change to existing functionality
### Proposed functionality
Currently, when using the LDAP backend for authentication, the AD is queried on every API request, regardless of other settings and regardless of whether the user is local or was created by the LDAP backend. Additionally, the LDAP cache built into django-auth-ldap does not function when using populate_user.
As the user is not actually authenticated against the AD when using the API (the token is used), I propose that the local user and its group assignments be used when FIND_GROUP_PERMISSIONS is disabled.
I have a change ready for pull request if the issue is accepted.
For more info, please see the discussion I created: https://github.com/netbox-community/netbox/discussions/7708
This issue would also partly fix #6926 - it will not fix the caching, but the user who reported the issue is not using FIND_GROUP_PERMISSIONS.
### Use case
The end goal is vastly improved API performance when using the LDAP backend in most cases.
The above changes will result in the following changes for users:
**Not using the LDAP backend:**
No changes
**FIND_GROUP_PERMS = True:**
No changes
**MIRROR_GROUPS = True and FIND_GROUP_PERMS = True:**
No changes
**MIRROR_GROUPS = True and FIND_GROUP_PERMS = False:**
Local user and group assignments will be used when calling the API and the user and groups are never reloaded from the LDAP server during API calls. This means that LDAP users utilizing the API will have to log in to the web UI to update group memberships. The change also allows one to use locally created users to call the API without querying the LDAP server.
**MIRROR_GROUPS = False and FIND_GROUP_PERMS = False:**
The user performing the API request has to be locally assigned groups or have local user object permissions.
### Database changes
No database changes
### External dependencies
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `netbox/netbox/api/authentication.py`
Content:
```
1 from django.conf import settings
2 from rest_framework import authentication, exceptions
3 from rest_framework.permissions import BasePermission, DjangoObjectPermissions, SAFE_METHODS
4
5 from users.models import Token
6
7
8 class TokenAuthentication(authentication.TokenAuthentication):
9 """
10 A custom authentication scheme which enforces Token expiration times.
11 """
12 model = Token
13
14 def authenticate_credentials(self, key):
15 model = self.get_model()
16 try:
17 token = model.objects.prefetch_related('user').get(key=key)
18 except model.DoesNotExist:
19 raise exceptions.AuthenticationFailed("Invalid token")
20
21 # Enforce the Token's expiration time, if one has been set.
22 if token.is_expired:
23 raise exceptions.AuthenticationFailed("Token expired")
24
25 if not token.user.is_active:
26 raise exceptions.AuthenticationFailed("User inactive")
27
28 # When LDAP authentication is active try to load user data from LDAP directory
29 if settings.REMOTE_AUTH_BACKEND == 'netbox.authentication.LDAPBackend':
30 from netbox.authentication import LDAPBackend
31 ldap_backend = LDAPBackend()
32 user = ldap_backend.populate_user(token.user.username)
33 # If the user is found in the LDAP directory use it, if not fallback to the local user
34 if user:
35 return user, token
36
37 return token.user, token
38
39
40 class TokenPermissions(DjangoObjectPermissions):
41 """
42 Custom permissions handler which extends the built-in DjangoModelPermissions to validate a Token's write ability
43 for unsafe requests (POST/PUT/PATCH/DELETE).
44 """
45 # Override the stock perm_map to enforce view permissions
46 perms_map = {
47 'GET': ['%(app_label)s.view_%(model_name)s'],
48 'OPTIONS': [],
49 'HEAD': ['%(app_label)s.view_%(model_name)s'],
50 'POST': ['%(app_label)s.add_%(model_name)s'],
51 'PUT': ['%(app_label)s.change_%(model_name)s'],
52 'PATCH': ['%(app_label)s.change_%(model_name)s'],
53 'DELETE': ['%(app_label)s.delete_%(model_name)s'],
54 }
55
56 def __init__(self):
57
58 # LOGIN_REQUIRED determines whether read-only access is provided to anonymous users.
59 self.authenticated_users_only = settings.LOGIN_REQUIRED
60
61 super().__init__()
62
63 def _verify_write_permission(self, request):
64
65 # If token authentication is in use, verify that the token allows write operations (for unsafe methods).
66 if request.method in SAFE_METHODS or request.auth.write_enabled:
67 return True
68
69 def has_permission(self, request, view):
70
71 # Enforce Token write ability
72 if isinstance(request.auth, Token) and not self._verify_write_permission(request):
73 return False
74
75 return super().has_permission(request, view)
76
77 def has_object_permission(self, request, view, obj):
78
79 # Enforce Token write ability
80 if isinstance(request.auth, Token) and not self._verify_write_permission(request):
81 return False
82
83 return super().has_object_permission(request, view, obj)
84
85
86 class IsAuthenticatedOrLoginNotRequired(BasePermission):
87 """
88 Returns True if the user is authenticated or LOGIN_REQUIRED is False.
89 """
90 def has_permission(self, request, view):
91 if not settings.LOGIN_REQUIRED:
92 return True
93 return request.user.is_authenticated
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/netbox/netbox/api/authentication.py b/netbox/netbox/api/authentication.py
--- a/netbox/netbox/api/authentication.py
+++ b/netbox/netbox/api/authentication.py
@@ -29,10 +29,13 @@
if settings.REMOTE_AUTH_BACKEND == 'netbox.authentication.LDAPBackend':
from netbox.authentication import LDAPBackend
ldap_backend = LDAPBackend()
- user = ldap_backend.populate_user(token.user.username)
- # If the user is found in the LDAP directory use it, if not fallback to the local user
- if user:
- return user, token
+
+ # Load from LDAP if FIND_GROUP_PERMS is active
+ if ldap_backend.settings.FIND_GROUP_PERMS:
+ user = ldap_backend.populate_user(token.user.username)
+ # If the user is found in the LDAP directory use it, if not fallback to the local user
+ if user:
+ return user, token
return token.user, token
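
As a rough, runnable sketch of what the gated lookup buys on a single API request — `StubLDAPBackend` and `resolve_user` are invented for illustration and are not NetBox or django-auth-ldap code, they only mirror the shape of the patched branch:

```python
from types import SimpleNamespace

class StubLDAPBackend:
    """Invented stand-in for the LDAP backend; counts directory lookups."""
    def __init__(self, find_group_perms):
        self.settings = SimpleNamespace(FIND_GROUP_PERMS=find_group_perms)
        self.lookups = 0

    def populate_user(self, username):
        self.lookups += 1   # each call stands for one LDAP round trip
        return None         # pretend the user is not found in the directory

def resolve_user(local_user, backend):
    """Same shape as the patched branch: only hit LDAP when FIND_GROUP_PERMS is on."""
    if backend.settings.FIND_GROUP_PERMS:
        ldap_user = backend.populate_user(local_user)
        if ldap_user:
            return ldap_user
    return local_user

fast = StubLDAPBackend(find_group_perms=False)
slow = StubLDAPBackend(find_group_perms=True)
resolve_user("api-user", fast)
resolve_user("api-user", slow)
print(fast.lookups, slow.lookups)  # 0 vs. 1 directory queries for one API request
```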
|
{"golden_diff": "diff --git a/netbox/netbox/api/authentication.py b/netbox/netbox/api/authentication.py\n--- a/netbox/netbox/api/authentication.py\n+++ b/netbox/netbox/api/authentication.py\n@@ -29,10 +29,13 @@\n if settings.REMOTE_AUTH_BACKEND == 'netbox.authentication.LDAPBackend':\n from netbox.authentication import LDAPBackend\n ldap_backend = LDAPBackend()\n- user = ldap_backend.populate_user(token.user.username)\n- # If the user is found in the LDAP directory use it, if not fallback to the local user\n- if user:\n- return user, token\n+\n+ # Load from LDAP if FIND_GROUP_PERMS is active\n+ if ldap_backend.settings.FIND_GROUP_PERMS:\n+ user = ldap_backend.populate_user(token.user.username)\n+ # If the user is found in the LDAP directory use it, if not fallback to the local user\n+ if user:\n+ return user, token\n \n return token.user, token\n", "issue": "Don't fetch LDAP user and groups on all API request when FIND_GROUP_PERMS is disabled\n### NetBox version\n\nV3.0.9\n\n### Feature type\n\nChange to existing functionality\n\n### Proposed functionality\n\nCurrently when using the LDAP backend for authentication, the AD is queried on every API request, regardless of other settings and regardless if the user is local or has been created by the LDAP backend. Additionally the LDAP cache built into django-auth-ldap does not function when using populate_user.\r\n\r\nAs the user is not actually authenticated against the AD when using the API (the token is used), I propose that the local user and it's group assignments are used when FIND_GROUP_PERMISSIONS is disabled.\r\n\r\nI have a change ready for pull request if the issue is accepted.\r\n\r\nFor more info, please see the discussion I created: https://github.com/netbox-community/netbox/discussions/7708\r\n\r\nThis issue would also partly fix #6926 - it will not fix the caching, but the user who reported the issue is not using FIND_GROUP_PERMISSIONS.\n\n### Use case\n\nThe end goal is vastly improved API performance when using the LDAP backend in most cases.\r\n\r\nThe above changes will result in the following changes for users:\r\n\r\n**Not using the LDAP backend:**\r\n\r\nNo changes\r\n\r\n**FIND_GROUP_PERMS = True:**\r\n\r\nNo changes\r\n\r\n**MIRROR_GROUPS = True and FIND_GROUP_PERMS = True:**\r\n\r\nNo changes\r\n\r\n**MIRROR_GROUPS = True and FIND_GROUP_PERMS = False:**\r\n\r\nLocal user and group assignments will be used when calling the API and the user and groups are never reloaded from the LDAP server during API calls. This means that LDAP users utilizing the API will have to login to the web ui to update group memberships. 
The change also allows one to use locally created users to call the API with querying the LDAP server.\r\n\r\n**MIRROR_GROUPS = False and FIND_GROUP_PERMS = False:**\r\n\r\nThe user performing the API request has to be locally assigned groups or have local user object permissions.\n\n### Database changes\n\nNo database changes\n\n### External dependencies\n\n_No response_\n", "before_files": [{"content": "from django.conf import settings\nfrom rest_framework import authentication, exceptions\nfrom rest_framework.permissions import BasePermission, DjangoObjectPermissions, SAFE_METHODS\n\nfrom users.models import Token\n\n\nclass TokenAuthentication(authentication.TokenAuthentication):\n \"\"\"\n A custom authentication scheme which enforces Token expiration times.\n \"\"\"\n model = Token\n\n def authenticate_credentials(self, key):\n model = self.get_model()\n try:\n token = model.objects.prefetch_related('user').get(key=key)\n except model.DoesNotExist:\n raise exceptions.AuthenticationFailed(\"Invalid token\")\n\n # Enforce the Token's expiration time, if one has been set.\n if token.is_expired:\n raise exceptions.AuthenticationFailed(\"Token expired\")\n\n if not token.user.is_active:\n raise exceptions.AuthenticationFailed(\"User inactive\")\n\n # When LDAP authentication is active try to load user data from LDAP directory\n if settings.REMOTE_AUTH_BACKEND == 'netbox.authentication.LDAPBackend':\n from netbox.authentication import LDAPBackend\n ldap_backend = LDAPBackend()\n user = ldap_backend.populate_user(token.user.username)\n # If the user is found in the LDAP directory use it, if not fallback to the local user\n if user:\n return user, token\n\n return token.user, token\n\n\nclass TokenPermissions(DjangoObjectPermissions):\n \"\"\"\n Custom permissions handler which extends the built-in DjangoModelPermissions to validate a Token's write ability\n for unsafe requests (POST/PUT/PATCH/DELETE).\n \"\"\"\n # Override the stock perm_map to enforce view permissions\n perms_map = {\n 'GET': ['%(app_label)s.view_%(model_name)s'],\n 'OPTIONS': [],\n 'HEAD': ['%(app_label)s.view_%(model_name)s'],\n 'POST': ['%(app_label)s.add_%(model_name)s'],\n 'PUT': ['%(app_label)s.change_%(model_name)s'],\n 'PATCH': ['%(app_label)s.change_%(model_name)s'],\n 'DELETE': ['%(app_label)s.delete_%(model_name)s'],\n }\n\n def __init__(self):\n\n # LOGIN_REQUIRED determines whether read-only access is provided to anonymous users.\n self.authenticated_users_only = settings.LOGIN_REQUIRED\n\n super().__init__()\n\n def _verify_write_permission(self, request):\n\n # If token authentication is in use, verify that the token allows write operations (for unsafe methods).\n if request.method in SAFE_METHODS or request.auth.write_enabled:\n return True\n\n def has_permission(self, request, view):\n\n # Enforce Token write ability\n if isinstance(request.auth, Token) and not self._verify_write_permission(request):\n return False\n\n return super().has_permission(request, view)\n\n def has_object_permission(self, request, view, obj):\n\n # Enforce Token write ability\n if isinstance(request.auth, Token) and not self._verify_write_permission(request):\n return False\n\n return super().has_object_permission(request, view, obj)\n\n\nclass IsAuthenticatedOrLoginNotRequired(BasePermission):\n \"\"\"\n Returns True if the user is authenticated or LOGIN_REQUIRED is False.\n \"\"\"\n def has_permission(self, request, view):\n if not settings.LOGIN_REQUIRED:\n return True\n return request.user.is_authenticated\n", "path": 
"netbox/netbox/api/authentication.py"}], "after_files": [{"content": "from django.conf import settings\nfrom rest_framework import authentication, exceptions\nfrom rest_framework.permissions import BasePermission, DjangoObjectPermissions, SAFE_METHODS\n\nfrom users.models import Token\n\n\nclass TokenAuthentication(authentication.TokenAuthentication):\n \"\"\"\n A custom authentication scheme which enforces Token expiration times.\n \"\"\"\n model = Token\n\n def authenticate_credentials(self, key):\n model = self.get_model()\n try:\n token = model.objects.prefetch_related('user').get(key=key)\n except model.DoesNotExist:\n raise exceptions.AuthenticationFailed(\"Invalid token\")\n\n # Enforce the Token's expiration time, if one has been set.\n if token.is_expired:\n raise exceptions.AuthenticationFailed(\"Token expired\")\n\n if not token.user.is_active:\n raise exceptions.AuthenticationFailed(\"User inactive\")\n\n # When LDAP authentication is active try to load user data from LDAP directory\n if settings.REMOTE_AUTH_BACKEND == 'netbox.authentication.LDAPBackend':\n from netbox.authentication import LDAPBackend\n ldap_backend = LDAPBackend()\n\n # Load from LDAP if FIND_GROUP_PERMS is active\n if ldap_backend.settings.FIND_GROUP_PERMS:\n user = ldap_backend.populate_user(token.user.username)\n # If the user is found in the LDAP directory use it, if not fallback to the local user\n if user:\n return user, token\n\n return token.user, token\n\n\nclass TokenPermissions(DjangoObjectPermissions):\n \"\"\"\n Custom permissions handler which extends the built-in DjangoModelPermissions to validate a Token's write ability\n for unsafe requests (POST/PUT/PATCH/DELETE).\n \"\"\"\n # Override the stock perm_map to enforce view permissions\n perms_map = {\n 'GET': ['%(app_label)s.view_%(model_name)s'],\n 'OPTIONS': [],\n 'HEAD': ['%(app_label)s.view_%(model_name)s'],\n 'POST': ['%(app_label)s.add_%(model_name)s'],\n 'PUT': ['%(app_label)s.change_%(model_name)s'],\n 'PATCH': ['%(app_label)s.change_%(model_name)s'],\n 'DELETE': ['%(app_label)s.delete_%(model_name)s'],\n }\n\n def __init__(self):\n\n # LOGIN_REQUIRED determines whether read-only access is provided to anonymous users.\n self.authenticated_users_only = settings.LOGIN_REQUIRED\n\n super().__init__()\n\n def _verify_write_permission(self, request):\n\n # If token authentication is in use, verify that the token allows write operations (for unsafe methods).\n if request.method in SAFE_METHODS or request.auth.write_enabled:\n return True\n\n def has_permission(self, request, view):\n\n # Enforce Token write ability\n if isinstance(request.auth, Token) and not self._verify_write_permission(request):\n return False\n\n return super().has_permission(request, view)\n\n def has_object_permission(self, request, view, obj):\n\n # Enforce Token write ability\n if isinstance(request.auth, Token) and not self._verify_write_permission(request):\n return False\n\n return super().has_object_permission(request, view, obj)\n\n\nclass IsAuthenticatedOrLoginNotRequired(BasePermission):\n \"\"\"\n Returns True if the user is authenticated or LOGIN_REQUIRED is False.\n \"\"\"\n def has_permission(self, request, view):\n if not settings.LOGIN_REQUIRED:\n return True\n return request.user.is_authenticated\n", "path": "netbox/netbox/api/authentication.py"}]}
| 1,561 | 215 |
gh_patches_debug_31857
|
rasdani/github-patches
|
git_diff
|
projectmesa__mesa-301
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Documentation is not reflecting latest changes wrt width-height argument order in Grid()
As many people start by reading mesa on readthedocs, the documentation should be in line with the code changes wrt width-height argument order in Grid functions. This is not yet reflected.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mesa/visualization/modules/CanvasGridVisualization.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """
3 Modular Canvas Rendering
4 ========================
5
6 Module for visualizing model objects in grid cells.
7
8 """
9 from collections import defaultdict
10 from mesa.visualization.ModularVisualization import VisualizationElement
11
12
13 class CanvasGrid(VisualizationElement):
14 """ A CanvasGrid object uses a user-provided portrayal method to generate a
15 portrayal for each object. A portrayal is a JSON-ready dictionary which
16 tells the relevant JavaScript code (GridDraw.js) where to draw what shape.
17
18 The render method returns a dictionary, keyed on layers, with values as
19 lists of portrayals to draw. Portrayals themselves are generated by the
20 user-provided portrayal_method, which accepts an object as an input and
21 produces a portrayal of it.
22
23 A portrayal as a dictionary with the following structure:
24 "x", "y": Coordinates for the cell in which the object is placed.
25 "Shape": Can be either "circle" or "rect"
26 For Circles:
27 "r": The radius, defined as a fraction of cell size. r=1 will
28 fill the entire cell.
29 For rectangles:
30 "w", "h": The width and height of the rectangle, which are in
31 fractions of cell width and height.
32 "Color": The color to draw the shape in; needs to be a valid HTML
33 color, e.g."Red" or "#AA08F8"
34 "Filled": either "true" or "false", and determines whether the shape is
35 filled or not.
36 "Layer": Layer number of 0 or above; higher-numbered layers are drawn
37 above lower-numbered layers.
38 "text": The text to be inscribed inside the Shape. Normally useful for
39 showing the unique_id of the agent.
40 "text_color": The color to draw the inscribed text. Should be given in
41 conjunction of "text" property.
42
43
44 Attributes:
45 portrayal_method: Function which generates portrayals from objects, as
46 described above.
47 grid_height, grid_width: Size of the grid to visualize, in cells.
48 canvas_height, canvas_width: Size, in pixels, of the grid visualization
49 to draw on the client.
50 template: "canvas_module.html" stores the module's HTML template.
51
52 """
53 package_includes = ["GridDraw.js", "CanvasModule.js"]
54 portrayal_method = None # Portrayal function
55 canvas_width = 500
56 canvas_height = 500
57
58 def __init__(self, portrayal_method, grid_width, grid_height,
59 canvas_width=500, canvas_height=500):
60 """ Instantiate a new CanvasGrid.
61
62 Args:
63 portrayal_method: function to convert each object on the grid to
64 a portrayal, as described above.
65 grid_width, grid_height: Size of the grid, in cells.
66 canvas_height, canvas_width: Size of the canvas to draw in the
67 client, in pixels. (default: 500x500)
68
69 """
70 self.portrayal_method = portrayal_method
71 self.grid_width = grid_width
72 self.grid_height = grid_height
73 self.canvas_width = canvas_width
74 self.canvas_height = canvas_height
75
76 new_element = ("new CanvasModule({}, {}, {}, {})"
77 .format(self.canvas_width, self.canvas_height,
78 self.grid_width, self.grid_height))
79
80 self.js_code = "elements.push(" + new_element + ");"
81
82 def render(self, model):
83 grid_state = defaultdict(list)
84 for x in range(model.grid.width):
85 for y in range(model.grid.height):
86 cell_objects = model.grid.get_cell_list_contents([(x, y)])
87 for obj in cell_objects:
88 portrayal = self.portrayal_method(obj)
89 if portrayal:
90 portrayal["x"] = x
91 portrayal["y"] = y
92 grid_state[portrayal["Layer"]].append(portrayal)
93
94 return grid_state
95
```
Path: `examples/Basic/basic/server.py`
Content:
```
1 import random
2
3 from mesa.visualization.modules import CanvasGrid
4 from mesa.visualization.ModularVisualization import ModularServer
5
6 from basic.model import Walker, ShapesModel
7
8
9 def agent_draw(agent):
10 portrayal = None
11 if agent is None:
12 # Actually this if part is unnecessary, but still keeping it for
13 # aesthetics
14 pass
15 elif isinstance(agent, Walker):
16 print("Uid: {0}, Heading: {1}".format(agent.unique_id, agent.heading))
17 portrayal = {"Shape": "arrowHead",
18 "Filled": "true",
19 "Layer": 2,
20 "Color": "green",
21 "Filled": "true",
22 "heading0": agent.heading[0],
23 "heading1": agent.heading[1],
24 "text": agent.unique_id,
25 "text_color": "white",
26 "scale": 0.8,
27 }
28 return portrayal
29
30
31 def launch_basic():
32 width = 15
33 height = 10
34 num_agents = 2
35 pixel_ratio = 50
36 grid = CanvasGrid(agent_draw, width, height,
37 width * pixel_ratio, height * pixel_ratio)
38 server = ModularServer(ShapesModel, [grid], "Basic Example",
39 num_agents, width, height)
40 server.max_steps = 0
41 server.port = 8888
42 server.launch()
43
44 if __name__ == "__main__":
45 random.seed(3)
46 launch_basic()
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/Basic/basic/server.py b/examples/Basic/basic/server.py
--- a/examples/Basic/basic/server.py
+++ b/examples/Basic/basic/server.py
@@ -19,8 +19,8 @@
"Layer": 2,
"Color": "green",
"Filled": "true",
- "heading0": agent.heading[0],
- "heading1": agent.heading[1],
+ "heading_x": agent.heading[0],
+ "heading_y": agent.heading[1],
"text": agent.unique_id,
"text_color": "white",
"scale": 0.8,
diff --git a/mesa/visualization/modules/CanvasGridVisualization.py b/mesa/visualization/modules/CanvasGridVisualization.py
--- a/mesa/visualization/modules/CanvasGridVisualization.py
+++ b/mesa/visualization/modules/CanvasGridVisualization.py
@@ -22,13 +22,17 @@
A portrayal as a dictionary with the following structure:
"x", "y": Coordinates for the cell in which the object is placed.
- "Shape": Can be either "circle" or "rect"
+ "Shape": Can be either "circle", "rect" or "arrowHead"
For Circles:
"r": The radius, defined as a fraction of cell size. r=1 will
fill the entire cell.
- For rectangles:
+ For Rectangles:
"w", "h": The width and height of the rectangle, which are in
fractions of cell width and height.
+ For arrowHead:
+ "scale": Proportion scaling as a fraction of cell size.
+ "heading_x": represents x direction unit vector.
+ "heading_y": represents y direction unit vector.
"Color": The color to draw the shape in; needs to be a valid HTML
color, e.g."Red" or "#AA08F8"
"Filled": either "true" or "false", and determines whether the shape is
|
{"golden_diff": "diff --git a/examples/Basic/basic/server.py b/examples/Basic/basic/server.py\n--- a/examples/Basic/basic/server.py\n+++ b/examples/Basic/basic/server.py\n@@ -19,8 +19,8 @@\n \"Layer\": 2,\n \"Color\": \"green\",\n \"Filled\": \"true\",\n- \"heading0\": agent.heading[0],\n- \"heading1\": agent.heading[1],\n+ \"heading_x\": agent.heading[0],\n+ \"heading_y\": agent.heading[1],\n \"text\": agent.unique_id,\n \"text_color\": \"white\",\n \"scale\": 0.8,\ndiff --git a/mesa/visualization/modules/CanvasGridVisualization.py b/mesa/visualization/modules/CanvasGridVisualization.py\n--- a/mesa/visualization/modules/CanvasGridVisualization.py\n+++ b/mesa/visualization/modules/CanvasGridVisualization.py\n@@ -22,13 +22,17 @@\n \n A portrayal as a dictionary with the following structure:\n \"x\", \"y\": Coordinates for the cell in which the object is placed.\n- \"Shape\": Can be either \"circle\" or \"rect\"\n+ \"Shape\": Can be either \"circle\", \"rect\" or \"arrowHead\"\n For Circles:\n \"r\": The radius, defined as a fraction of cell size. r=1 will\n fill the entire cell.\n- For rectangles:\n+ For Rectangles:\n \"w\", \"h\": The width and height of the rectangle, which are in\n fractions of cell width and height.\n+ For arrowHead:\n+ \"scale\": Proportion scaling as a fraction of cell size.\n+ \"heading_x\": represents x direction unit vector.\n+ \"heading_y\": represents y direction unit vector.\n \"Color\": The color to draw the shape in; needs to be a valid HTML\n color, e.g.\"Red\" or \"#AA08F8\"\n \"Filled\": either \"true\" or \"false\", and determines whether the shape is\n", "issue": "Documentation is not reflecting latest changes wrt width-height argument order in Grid()\nAs many people start with reading mesa on readthedocs, the documentation should be inline with the code changes wrt width-height argument order in Grid functions.This is not yet reflected.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\nModular Canvas Rendering\n========================\n\nModule for visualizing model objects in grid cells.\n\n\"\"\"\nfrom collections import defaultdict\nfrom mesa.visualization.ModularVisualization import VisualizationElement\n\n\nclass CanvasGrid(VisualizationElement):\n \"\"\" A CanvasGrid object uses a user-provided portrayal method to generate a\n portrayal for each object. A portrayal is a JSON-ready dictionary which\n tells the relevant JavaScript code (GridDraw.js) where to draw what shape.\n\n The render method returns a dictionary, keyed on layers, with values as\n lists of portrayals to draw. Portrayals themselves are generated by the\n user-provided portrayal_method, which accepts an object as an input and\n produces a portrayal of it.\n\n A portrayal as a dictionary with the following structure:\n \"x\", \"y\": Coordinates for the cell in which the object is placed.\n \"Shape\": Can be either \"circle\" or \"rect\"\n For Circles:\n \"r\": The radius, defined as a fraction of cell size. r=1 will\n fill the entire cell.\n For rectangles:\n \"w\", \"h\": The width and height of the rectangle, which are in\n fractions of cell width and height.\n \"Color\": The color to draw the shape in; needs to be a valid HTML\n color, e.g.\"Red\" or \"#AA08F8\"\n \"Filled\": either \"true\" or \"false\", and determines whether the shape is\n filled or not.\n \"Layer\": Layer number of 0 or above; higher-numbered layers are drawn\n above lower-numbered layers.\n \"text\": The text to be inscribed inside the Shape. 
Normally useful for\n showing the unique_id of the agent.\n \"text_color\": The color to draw the inscribed text. Should be given in\n conjunction of \"text\" property.\n\n\n Attributes:\n portrayal_method: Function which generates portrayals from objects, as\n described above.\n grid_height, grid_width: Size of the grid to visualize, in cells.\n canvas_height, canvas_width: Size, in pixels, of the grid visualization\n to draw on the client.\n template: \"canvas_module.html\" stores the module's HTML template.\n\n \"\"\"\n package_includes = [\"GridDraw.js\", \"CanvasModule.js\"]\n portrayal_method = None # Portrayal function\n canvas_width = 500\n canvas_height = 500\n\n def __init__(self, portrayal_method, grid_width, grid_height,\n canvas_width=500, canvas_height=500):\n \"\"\" Instantiate a new CanvasGrid.\n\n Args:\n portrayal_method: function to convert each object on the grid to\n a portrayal, as described above.\n grid_width, grid_height: Size of the grid, in cells.\n canvas_height, canvas_width: Size of the canvas to draw in the\n client, in pixels. (default: 500x500)\n\n \"\"\"\n self.portrayal_method = portrayal_method\n self.grid_width = grid_width\n self.grid_height = grid_height\n self.canvas_width = canvas_width\n self.canvas_height = canvas_height\n\n new_element = (\"new CanvasModule({}, {}, {}, {})\"\n .format(self.canvas_width, self.canvas_height,\n self.grid_width, self.grid_height))\n\n self.js_code = \"elements.push(\" + new_element + \");\"\n\n def render(self, model):\n grid_state = defaultdict(list)\n for x in range(model.grid.width):\n for y in range(model.grid.height):\n cell_objects = model.grid.get_cell_list_contents([(x, y)])\n for obj in cell_objects:\n portrayal = self.portrayal_method(obj)\n if portrayal:\n portrayal[\"x\"] = x\n portrayal[\"y\"] = y\n grid_state[portrayal[\"Layer\"]].append(portrayal)\n\n return grid_state\n", "path": "mesa/visualization/modules/CanvasGridVisualization.py"}, {"content": "import random\n\nfrom mesa.visualization.modules import CanvasGrid\nfrom mesa.visualization.ModularVisualization import ModularServer\n\nfrom basic.model import Walker, ShapesModel\n\n\ndef agent_draw(agent):\n portrayal = None\n if agent is None:\n # Actually this if part is unnecessary, but still keeping it for\n # aesthetics\n pass\n elif isinstance(agent, Walker):\n print(\"Uid: {0}, Heading: {1}\".format(agent.unique_id, agent.heading))\n portrayal = {\"Shape\": \"arrowHead\",\n \"Filled\": \"true\",\n \"Layer\": 2,\n \"Color\": \"green\",\n \"Filled\": \"true\",\n \"heading0\": agent.heading[0],\n \"heading1\": agent.heading[1],\n \"text\": agent.unique_id,\n \"text_color\": \"white\",\n \"scale\": 0.8,\n }\n return portrayal\n\n\ndef launch_basic():\n width = 15\n height = 10\n num_agents = 2\n pixel_ratio = 50\n grid = CanvasGrid(agent_draw, width, height,\n width * pixel_ratio, height * pixel_ratio)\n server = ModularServer(ShapesModel, [grid], \"Basic Example\",\n num_agents, width, height)\n server.max_steps = 0\n server.port = 8888\n server.launch()\n\nif __name__ == \"__main__\":\n random.seed(3)\n launch_basic()\n", "path": "examples/Basic/basic/server.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\nModular Canvas Rendering\n========================\n\nModule for visualizing model objects in grid cells.\n\n\"\"\"\nfrom collections import defaultdict\nfrom mesa.visualization.ModularVisualization import VisualizationElement\n\n\nclass CanvasGrid(VisualizationElement):\n \"\"\" A CanvasGrid object uses a user-provided 
portrayal method to generate a\n portrayal for each object. A portrayal is a JSON-ready dictionary which\n tells the relevant JavaScript code (GridDraw.js) where to draw what shape.\n\n The render method returns a dictionary, keyed on layers, with values as\n lists of portrayals to draw. Portrayals themselves are generated by the\n user-provided portrayal_method, which accepts an object as an input and\n produces a portrayal of it.\n\n A portrayal as a dictionary with the following structure:\n \"x\", \"y\": Coordinates for the cell in which the object is placed.\n \"Shape\": Can be either \"circle\", \"rect\" or \"arrowHead\"\n For Circles:\n \"r\": The radius, defined as a fraction of cell size. r=1 will\n fill the entire cell.\n For Rectangles:\n \"w\", \"h\": The width and height of the rectangle, which are in\n fractions of cell width and height.\n For arrowHead:\n \"scale\": Proportion scaling as a fraction of cell size.\n \"heading_x\": represents x direction unit vector.\n \"heading_y\": represents y direction unit vector.\n \"Color\": The color to draw the shape in; needs to be a valid HTML\n color, e.g.\"Red\" or \"#AA08F8\"\n \"Filled\": either \"true\" or \"false\", and determines whether the shape is\n filled or not.\n \"Layer\": Layer number of 0 or above; higher-numbered layers are drawn\n above lower-numbered layers.\n \"text\": The text to be inscribed inside the Shape. Normally useful for\n showing the unique_id of the agent.\n \"text_color\": The color to draw the inscribed text. Should be given in\n conjunction of \"text\" property.\n\n\n Attributes:\n portrayal_method: Function which generates portrayals from objects, as\n described above.\n grid_height, grid_width: Size of the grid to visualize, in cells.\n canvas_height, canvas_width: Size, in pixels, of the grid visualization\n to draw on the client.\n template: \"canvas_module.html\" stores the module's HTML template.\n\n \"\"\"\n package_includes = [\"GridDraw.js\", \"CanvasModule.js\"]\n portrayal_method = None # Portrayal function\n canvas_width = 500\n canvas_height = 500\n\n def __init__(self, portrayal_method, grid_width, grid_height,\n canvas_width=500, canvas_height=500):\n \"\"\" Instantiate a new CanvasGrid.\n\n Args:\n portrayal_method: function to convert each object on the grid to\n a portrayal, as described above.\n grid_width, grid_height: Size of the grid, in cells.\n canvas_height, canvas_width: Size of the canvas to draw in the\n client, in pixels. 
(default: 500x500)\n\n \"\"\"\n self.portrayal_method = portrayal_method\n self.grid_width = grid_width\n self.grid_height = grid_height\n self.canvas_width = canvas_width\n self.canvas_height = canvas_height\n\n new_element = (\"new CanvasModule({}, {}, {}, {})\"\n .format(self.canvas_width, self.canvas_height,\n self.grid_width, self.grid_height))\n\n self.js_code = \"elements.push(\" + new_element + \");\"\n\n def render(self, model):\n grid_state = defaultdict(list)\n for x in range(model.grid.width):\n for y in range(model.grid.height):\n cell_objects = model.grid.get_cell_list_contents([(x, y)])\n for obj in cell_objects:\n portrayal = self.portrayal_method(obj)\n if portrayal:\n portrayal[\"x\"] = x\n portrayal[\"y\"] = y\n grid_state[portrayal[\"Layer\"]].append(portrayal)\n\n return grid_state\n", "path": "mesa/visualization/modules/CanvasGridVisualization.py"}, {"content": "import random\n\nfrom mesa.visualization.modules import CanvasGrid\nfrom mesa.visualization.ModularVisualization import ModularServer\n\nfrom basic.model import Walker, ShapesModel\n\n\ndef agent_draw(agent):\n portrayal = None\n if agent is None:\n # Actually this if part is unnecessary, but still keeping it for\n # aesthetics\n pass\n elif isinstance(agent, Walker):\n print(\"Uid: {0}, Heading: {1}\".format(agent.unique_id, agent.heading))\n portrayal = {\"Shape\": \"arrowHead\",\n \"Filled\": \"true\",\n \"Layer\": 2,\n \"Color\": \"green\",\n \"Filled\": \"true\",\n \"heading_x\": agent.heading[0],\n \"heading_y\": agent.heading[1],\n \"text\": agent.unique_id,\n \"text_color\": \"white\",\n \"scale\": 0.8,\n }\n return portrayal\n\n\ndef launch_basic():\n width = 15\n height = 10\n num_agents = 2\n pixel_ratio = 50\n grid = CanvasGrid(agent_draw, width, height,\n width * pixel_ratio, height * pixel_ratio)\n server = ModularServer(ShapesModel, [grid], \"Basic Example\",\n num_agents, width, height)\n server.max_steps = 0\n server.port = 8888\n server.launch()\n\nif __name__ == \"__main__\":\n random.seed(3)\n launch_basic()\n", "path": "examples/Basic/basic/server.py"}]}
| 1,757 | 436 |
gh_patches_debug_15
|
rasdani/github-patches
|
git_diff
|
OCHA-DAP__hdx-ckan-1748
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Shrink the spacing on the top line numbers
Proposed spacings shown here:

modified css:
.item-info {
border-top: 1px solid #cccccc;
border-bottom: 1px solid #cccccc;
padding: 20px 0;
margin-top: -1px;
color: #333333;
}
.item-info .item-info-title {
font-family: 'Gotham-Bold', sans-serif;
font-weight: 400;
font-size: 16px;
letter-spacing: 0.01em;
margin-bottom: 20px;
}
.item-info .item-info-number {
font-family: 'Gotham-Light', sans-serif;
font-size: 74px;
line-height: 1;
letter-spacing: 0.01em;
margin-bottom: 20px;
}
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ckanext-hdx_theme/ckanext/hdx_theme/version.py`
Content:
```
1 hdx_version = 'v0.4.9'
2
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ckanext-hdx_theme/ckanext/hdx_theme/version.py b/ckanext-hdx_theme/ckanext/hdx_theme/version.py
--- a/ckanext-hdx_theme/ckanext/hdx_theme/version.py
+++ b/ckanext-hdx_theme/ckanext/hdx_theme/version.py
@@ -1 +1 @@
-hdx_version = 'v0.4.9'
+hdx_version = 'v0.4.10'
|
{"golden_diff": "diff --git a/ckanext-hdx_theme/ckanext/hdx_theme/version.py b/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n--- a/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n+++ b/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n@@ -1 +1 @@\n-hdx_version = 'v0.4.9'\n+hdx_version = 'v0.4.10'\n", "issue": "Shrink the spacing on the top line numbers\nProposed spacings shown here:\n\n\n\nmodified css:\n\n.item-info {\nborder-top: 1px solid #cccccc;\nborder-bottom: 1px solid #cccccc;\npadding: 20px 0;\nmargin-top: -1px;\ncolor: #333333;\n}\n\n.item-info .item-info-title {\nfont-family: 'Gotham-Bold', sans-serif;\nfont-weight: 400;\nfont-size: 16px;\nletter-spacing: 0.01em;\nmargin-bottom: 20px;\n}\n\n.item-info .item-info-number {\nfont-family: 'Gotham-Light', sans-serif;\nfont-size: 74px;\nline-height: 1;\nletter-spacing: 0.01em;\nmargin-bottom: 20px;\n}\n\n", "before_files": [{"content": "hdx_version = 'v0.4.9'\n", "path": "ckanext-hdx_theme/ckanext/hdx_theme/version.py"}], "after_files": [{"content": "hdx_version = 'v0.4.10'\n", "path": "ckanext-hdx_theme/ckanext/hdx_theme/version.py"}]}
| 526 | 107 |
gh_patches_debug_24875
|
rasdani/github-patches
|
git_diff
|
coreproject-moe__CoreProject-Monorepo-3167
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[`Frontend`] : Move code to specific `web-component`
https://github.com/baseplate-admin/CoreProject/blob/cd436b876f4936b61397a0cc838aa88125527a78/backend/django_core/templates/anime/index.html#L123-L205
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `backend/django_core/apps/pages/views/anime.py`
Content:
```
1 from typing import TYPE_CHECKING
2
3 from django.http import HttpResponse
4 from django.shortcuts import render
5
6 from ..data.anime import (
7 anime,
8 anime_episode,
9 icons,
10 latest_animes,
11 latest_episodes,
12 my_list,
13 )
14
15 if TYPE_CHECKING:
16 from ..request import HtmxHttpRequest
17
18
19 async def anime_home_view_partial_slider_view(
20 request: "HtmxHttpRequest",
21 pk: int,
22 ) -> HttpResponse:
23 anime = latest_animes[pk]
24 next_index = (pk + 1) % len(latest_animes)
25 previous_index = (pk - 1) % len(latest_animes)
26
27 return render(
28 request,
29 "anime/_slider.html",
30 context={
31 "anime": anime,
32 "next_index": next_index,
33 "previous_index": previous_index,
34 "current_index": pk,
35 },
36 )
37
38
39 async def anime_home_view(request: "HtmxHttpRequest") -> HttpResponse:
40 if request.htmx:
41 return render(
42 request,
43 "anime/index.html",
44 context={
45 "latest_animes": latest_animes,
46 "my_list": my_list,
47 "latest_episodes": latest_episodes,
48 },
49 )
50
51 return render(
52 request,
53 "anime/_layout.html",
54 context={
55 "icons": icons,
56 "latest_animes": latest_animes,
57 "my_list": my_list,
58 "latest_episodes": latest_episodes,
59 },
60 )
61
62
63 async def anime_explore_view(request: "HtmxHttpRequest") -> HttpResponse:
64 if request.htmx:
65 return render(request, "anime/explore/index.html")
66
67 return render(request, "anime/_layout.html", context={"icons": icons})
68
69
70 async def anime_info_view(
71 request: "HtmxHttpRequest",
72 platform: str,
73 pk: int,
74 ) -> HttpResponse:
75 if request.htmx:
76 return render(
77 request,
78 "anime/info/index.html",
79 context={"anime": anime, "episode": anime_episode},
80 )
81
82 return render(request, "anime/_layout.html", context={"icons": icons})
83
84
85 async def anime_episode_view(
86 request: "HtmxHttpRequest", platform: str, mal_id: int, pk: int
87 ) -> HttpResponse:
88 if request.htmx:
89 return render(
90 request,
91 "anime/episode/index.html",
92 context={},
93 )
94
95 return render(request, "anime/_layout.html", context={"icons": icons})
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/backend/django_core/apps/pages/views/anime.py b/backend/django_core/apps/pages/views/anime.py
--- a/backend/django_core/apps/pages/views/anime.py
+++ b/backend/django_core/apps/pages/views/anime.py
@@ -1,3 +1,4 @@
+import json
from typing import TYPE_CHECKING
from django.http import HttpResponse
@@ -37,6 +38,9 @@
async def anime_home_view(request: "HtmxHttpRequest") -> HttpResponse:
+ # cant parse single quoted string
+ latest_episodes_json = json.dumps(latest_episodes)
+
if request.htmx:
return render(
request,
@@ -44,7 +48,7 @@
context={
"latest_animes": latest_animes,
"my_list": my_list,
- "latest_episodes": latest_episodes,
+ "latest_episodes": latest_episodes_json,
},
)
@@ -55,7 +59,7 @@
"icons": icons,
"latest_animes": latest_animes,
"my_list": my_list,
- "latest_episodes": latest_episodes,
+ "latest_episodes": latest_episodes_json,
},
)
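
A small, self-contained sketch of why the single-quoted `str()` form fails where `json.dumps` succeeds; the sample `latest_episodes` list is invented for illustration:

```python
import json

# Invented sample payload standing in for the template's latest_episodes context value
latest_episodes = [{"title": "Episode 1", "number": 1}]

as_repr = str(latest_episodes)         # "[{'title': 'Episode 1', 'number': 1}]" -- single quotes
as_json = json.dumps(latest_episodes)  # '[{"title": "Episode 1", "number": 1}]' -- valid JSON

try:
    json.loads(as_repr)                # JSON (like JS JSON.parse) rejects single-quoted strings
except json.JSONDecodeError as exc:
    print("repr is not parseable:", exc)

print("json.dumps round-trips:", json.loads(as_json) == latest_episodes)
```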
|
{"golden_diff": "diff --git a/backend/django_core/apps/pages/views/anime.py b/backend/django_core/apps/pages/views/anime.py\n--- a/backend/django_core/apps/pages/views/anime.py\n+++ b/backend/django_core/apps/pages/views/anime.py\n@@ -1,3 +1,4 @@\n+import json\n from typing import TYPE_CHECKING\n \n from django.http import HttpResponse\n@@ -37,6 +38,9 @@\n \n \n async def anime_home_view(request: \"HtmxHttpRequest\") -> HttpResponse:\n+ # cant parse single quoted string\n+ latest_episodes_json = json.dumps(latest_episodes)\n+\n if request.htmx:\n return render(\n request,\n@@ -44,7 +48,7 @@\n context={\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n- \"latest_episodes\": latest_episodes,\n+ \"latest_episodes\": latest_episodes_json,\n },\n )\n \n@@ -55,7 +59,7 @@\n \"icons\": icons,\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n- \"latest_episodes\": latest_episodes,\n+ \"latest_episodes\": latest_episodes_json,\n },\n )\n", "issue": "[`Frontend`] : Move code to specific `web-component`\nhttps://github.com/baseplate-admin/CoreProject/blob/cd436b876f4936b61397a0cc838aa88125527a78/backend/django_core/templates/anime/index.html#L123-L205\n", "before_files": [{"content": "from typing import TYPE_CHECKING\n\nfrom django.http import HttpResponse\nfrom django.shortcuts import render\n\nfrom ..data.anime import (\n anime,\n anime_episode,\n icons,\n latest_animes,\n latest_episodes,\n my_list,\n)\n\nif TYPE_CHECKING:\n from ..request import HtmxHttpRequest\n\n\nasync def anime_home_view_partial_slider_view(\n request: \"HtmxHttpRequest\",\n pk: int,\n) -> HttpResponse:\n anime = latest_animes[pk]\n next_index = (pk + 1) % len(latest_animes)\n previous_index = (pk - 1) % len(latest_animes)\n\n return render(\n request,\n \"anime/_slider.html\",\n context={\n \"anime\": anime,\n \"next_index\": next_index,\n \"previous_index\": previous_index,\n \"current_index\": pk,\n },\n )\n\n\nasync def anime_home_view(request: \"HtmxHttpRequest\") -> HttpResponse:\n if request.htmx:\n return render(\n request,\n \"anime/index.html\",\n context={\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n \"latest_episodes\": latest_episodes,\n },\n )\n\n return render(\n request,\n \"anime/_layout.html\",\n context={\n \"icons\": icons,\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n \"latest_episodes\": latest_episodes,\n },\n )\n\n\nasync def anime_explore_view(request: \"HtmxHttpRequest\") -> HttpResponse:\n if request.htmx:\n return render(request, \"anime/explore/index.html\")\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n\n\nasync def anime_info_view(\n request: \"HtmxHttpRequest\",\n platform: str,\n pk: int,\n) -> HttpResponse:\n if request.htmx:\n return render(\n request,\n \"anime/info/index.html\",\n context={\"anime\": anime, \"episode\": anime_episode},\n )\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n\n\nasync def anime_episode_view(\n request: \"HtmxHttpRequest\", platform: str, mal_id: int, pk: int\n) -> HttpResponse:\n if request.htmx:\n return render(\n request,\n \"anime/episode/index.html\",\n context={},\n )\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n", "path": "backend/django_core/apps/pages/views/anime.py"}], "after_files": [{"content": "import json\nfrom typing import TYPE_CHECKING\n\nfrom django.http import HttpResponse\nfrom django.shortcuts import render\n\nfrom ..data.anime import (\n anime,\n anime_episode,\n icons,\n 
latest_animes,\n latest_episodes,\n my_list,\n)\n\nif TYPE_CHECKING:\n from ..request import HtmxHttpRequest\n\n\nasync def anime_home_view_partial_slider_view(\n request: \"HtmxHttpRequest\",\n pk: int,\n) -> HttpResponse:\n anime = latest_animes[pk]\n next_index = (pk + 1) % len(latest_animes)\n previous_index = (pk - 1) % len(latest_animes)\n\n return render(\n request,\n \"anime/_slider.html\",\n context={\n \"anime\": anime,\n \"next_index\": next_index,\n \"previous_index\": previous_index,\n \"current_index\": pk,\n },\n )\n\n\nasync def anime_home_view(request: \"HtmxHttpRequest\") -> HttpResponse:\n # cant parse single quoted string\n latest_episodes_json = json.dumps(latest_episodes)\n\n if request.htmx:\n return render(\n request,\n \"anime/index.html\",\n context={\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n \"latest_episodes\": latest_episodes_json,\n },\n )\n\n return render(\n request,\n \"anime/_layout.html\",\n context={\n \"icons\": icons,\n \"latest_animes\": latest_animes,\n \"my_list\": my_list,\n \"latest_episodes\": latest_episodes_json,\n },\n )\n\n\nasync def anime_explore_view(request: \"HtmxHttpRequest\") -> HttpResponse:\n if request.htmx:\n return render(request, \"anime/explore/index.html\")\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n\n\nasync def anime_info_view(\n request: \"HtmxHttpRequest\",\n platform: str,\n pk: int,\n) -> HttpResponse:\n if request.htmx:\n return render(\n request,\n \"anime/info/index.html\",\n context={\"anime\": anime, \"episode\": anime_episode},\n )\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n\n\nasync def anime_episode_view(\n request: \"HtmxHttpRequest\", platform: str, mal_id: int, pk: int\n) -> HttpResponse:\n if request.htmx:\n return render(\n request,\n \"anime/episode/index.html\",\n context={},\n )\n\n return render(request, \"anime/_layout.html\", context={\"icons\": icons})\n", "path": "backend/django_core/apps/pages/views/anime.py"}]}
| 1,061 | 259 |
gh_patches_debug_21401
|
rasdani/github-patches
|
git_diff
|
ultrabug__py3status-551
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Runtime error (BrokenPipeError) helpers.py line 11
When restarting i3 using `i3 restart`, an error bar pops up with the message `py3status: Runtime error (BrokenPipeError) helpers.py line 11. Please try to fix this and reload i3wm (Mod+Shift+R)`
Everything appears to be functioning and the bar still shows.
Running Ubuntu 16.04
py3status 3.1
python 3.5.2
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `py3status/__init__.py`
Content:
```
1 import locale
2 import sys
3
4 from py3status.core import Py3statusWrapper
5
6 try:
7 from setproctitle import setproctitle
8 setproctitle('py3status')
9 except ImportError:
10 pass
11
12
13 def main():
14 try:
15 locale.setlocale(locale.LC_ALL, '')
16 except locale.Error:
17 print('No locale available')
18 sys.exit(2)
19
20 py3 = None
21 try:
22 py3 = Py3statusWrapper()
23 py3.setup()
24 except KeyboardInterrupt:
25 if py3:
26 py3.notify_user('Setup interrupted (KeyboardInterrupt).')
27 sys.exit(0)
28 except Exception as e:
29 if py3:
30 py3.report_exception('Setup error')
31 else:
32 # we cannot report this Exception
33 raise e
34 sys.exit(2)
35
36 try:
37 py3.run()
38 except Exception:
39 py3.report_exception('Runtime error')
40 sys.exit(3)
41 except KeyboardInterrupt:
42 pass
43 finally:
44 py3.stop()
45 sys.exit(0)
46
47
48 if __name__ == '__main__':
49 main()
50
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/py3status/__init__.py b/py3status/__init__.py
--- a/py3status/__init__.py
+++ b/py3status/__init__.py
@@ -9,6 +9,13 @@
except ImportError:
pass
+try:
+ # python3
+ IOPipeError = BrokenPipeError
+except NameError:
+ # python2
+ IOPipeError = IOError
+
def main():
try:
@@ -21,9 +28,9 @@
try:
py3 = Py3statusWrapper()
py3.setup()
- except KeyboardInterrupt:
+ except (IOPipeError, KeyboardInterrupt):
if py3:
- py3.notify_user('Setup interrupted (KeyboardInterrupt).')
+ py3.notify_user('Setup interrupted')
sys.exit(0)
except Exception as e:
if py3:
@@ -35,11 +42,11 @@
try:
py3.run()
+ except (IOPipeError, KeyboardInterrupt):
+ pass
except Exception:
py3.report_exception('Runtime error')
sys.exit(3)
- except KeyboardInterrupt:
- pass
finally:
py3.stop()
sys.exit(0)
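
The compatibility alias can be exercised on its own; `write_status_line` below is a hypothetical helper, not py3status code, and only shows how a single `except` clause covers both interpreter versions:

```python
import sys

try:
    # Python 3: a closed pipe raises the builtin BrokenPipeError
    IOPipeError = BrokenPipeError
except NameError:
    # Python 2: the same condition surfaces as IOError
    IOPipeError = IOError

def write_status_line(stream, line):
    """Hypothetical helper: write one i3bar line, treating a vanished reader as a clean stop."""
    try:
        stream.write(line + "\n")
        stream.flush()
    except IOPipeError:
        # i3 closed the pipe (e.g. during `i3 restart`); exit quietly instead of reporting an error
        sys.exit(0)

write_status_line(sys.stdout, '[{"full_text": "ok"}]')
```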
|
{"golden_diff": "diff --git a/py3status/__init__.py b/py3status/__init__.py\n--- a/py3status/__init__.py\n+++ b/py3status/__init__.py\n@@ -9,6 +9,13 @@\n except ImportError:\n pass\n \n+try:\n+ # python3\n+ IOPipeError = BrokenPipeError\n+except NameError:\n+ # python2\n+ IOPipeError = IOError\n+\n \n def main():\n try:\n@@ -21,9 +28,9 @@\n try:\n py3 = Py3statusWrapper()\n py3.setup()\n- except KeyboardInterrupt:\n+ except (IOPipeError, KeyboardInterrupt):\n if py3:\n- py3.notify_user('Setup interrupted (KeyboardInterrupt).')\n+ py3.notify_user('Setup interrupted')\n sys.exit(0)\n except Exception as e:\n if py3:\n@@ -35,11 +42,11 @@\n \n try:\n py3.run()\n+ except (IOPipeError, KeyboardInterrupt):\n+ pass\n except Exception:\n py3.report_exception('Runtime error')\n sys.exit(3)\n- except KeyboardInterrupt:\n- pass\n finally:\n py3.stop()\n sys.exit(0)\n", "issue": "Runtime error (BrokenPipeError) helpers.py line 11\nWhen restarting i3 using `i3 restart`, error bar pops up with message `py3status: Runtime error (BrokenPipeError) helpers.py line 11. Please try to fix this and reload i3wm (Mod+Shift+R)`\n\nEverything appears to be functioning and the bar still shows.\n\nRunning Ubuntu 16.04\npy3status 3.1\npython 3.5.2\n\n", "before_files": [{"content": "import locale\nimport sys\n\nfrom py3status.core import Py3statusWrapper\n\ntry:\n from setproctitle import setproctitle\n setproctitle('py3status')\nexcept ImportError:\n pass\n\n\ndef main():\n try:\n locale.setlocale(locale.LC_ALL, '')\n except locale.Error:\n print('No locale available')\n sys.exit(2)\n\n py3 = None\n try:\n py3 = Py3statusWrapper()\n py3.setup()\n except KeyboardInterrupt:\n if py3:\n py3.notify_user('Setup interrupted (KeyboardInterrupt).')\n sys.exit(0)\n except Exception as e:\n if py3:\n py3.report_exception('Setup error')\n else:\n # we cannot report this Exception\n raise e\n sys.exit(2)\n\n try:\n py3.run()\n except Exception:\n py3.report_exception('Runtime error')\n sys.exit(3)\n except KeyboardInterrupt:\n pass\n finally:\n py3.stop()\n sys.exit(0)\n\n\nif __name__ == '__main__':\n main()\n", "path": "py3status/__init__.py"}], "after_files": [{"content": "import locale\nimport sys\n\nfrom py3status.core import Py3statusWrapper\n\ntry:\n from setproctitle import setproctitle\n setproctitle('py3status')\nexcept ImportError:\n pass\n\ntry:\n # python3\n IOPipeError = BrokenPipeError\nexcept NameError:\n # python2\n IOPipeError = IOError\n\n\ndef main():\n try:\n locale.setlocale(locale.LC_ALL, '')\n except locale.Error:\n print('No locale available')\n sys.exit(2)\n\n py3 = None\n try:\n py3 = Py3statusWrapper()\n py3.setup()\n except (IOPipeError, KeyboardInterrupt):\n if py3:\n py3.notify_user('Setup interrupted')\n sys.exit(0)\n except Exception as e:\n if py3:\n py3.report_exception('Setup error')\n else:\n # we cannot report this Exception\n raise e\n sys.exit(2)\n\n try:\n py3.run()\n except (IOPipeError, KeyboardInterrupt):\n pass\n except Exception:\n py3.report_exception('Runtime error')\n sys.exit(3)\n finally:\n py3.stop()\n sys.exit(0)\n\n\nif __name__ == '__main__':\n main()\n", "path": "py3status/__init__.py"}]}
| 695 | 276 |
gh_patches_debug_4747
|
rasdani/github-patches
|
git_diff
|
scrapy__scrapy-5006
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove the use of parsel.Selector._default_type
Used at https://github.com/scrapy/scrapy/blob/58ca8bbf6d1589bd0c8cc1ebda52299346f55e8a/scrapy/selector/unified.py#L72
We should stop relying on this private class variable unless there’s a good reason for it.
[Noticed](https://github.com/scrapy/parsel/pull/181/files#r562118000) while trying out [JMESPath support for Parsel](https://github.com/scrapy/parsel/pull/181) in a real life project.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/selector/unified.py`
Content:
```
1 """
2 XPath selectors based on lxml
3 """
4
5 from parsel import Selector as _ParselSelector
6 from scrapy.utils.trackref import object_ref
7 from scrapy.utils.python import to_bytes
8 from scrapy.http import HtmlResponse, XmlResponse
9
10
11 __all__ = ['Selector', 'SelectorList']
12
13
14 def _st(response, st):
15 if st is None:
16 return 'xml' if isinstance(response, XmlResponse) else 'html'
17 return st
18
19
20 def _response_from_text(text, st):
21 rt = XmlResponse if st == 'xml' else HtmlResponse
22 return rt(url='about:blank', encoding='utf-8',
23 body=to_bytes(text, 'utf-8'))
24
25
26 class SelectorList(_ParselSelector.selectorlist_cls, object_ref):
27 """
28 The :class:`SelectorList` class is a subclass of the builtin ``list``
29 class, which provides a few additional methods.
30 """
31
32
33 class Selector(_ParselSelector, object_ref):
34 """
35 An instance of :class:`Selector` is a wrapper over response to select
36 certain parts of its content.
37
38 ``response`` is an :class:`~scrapy.http.HtmlResponse` or an
39 :class:`~scrapy.http.XmlResponse` object that will be used for selecting
40 and extracting data.
41
42 ``text`` is a unicode string or utf-8 encoded text for cases when a
43 ``response`` isn't available. Using ``text`` and ``response`` together is
44 undefined behavior.
45
46 ``type`` defines the selector type, it can be ``"html"``, ``"xml"``
47 or ``None`` (default).
48
49 If ``type`` is ``None``, the selector automatically chooses the best type
50 based on ``response`` type (see below), or defaults to ``"html"`` in case it
51 is used together with ``text``.
52
53 If ``type`` is ``None`` and a ``response`` is passed, the selector type is
54 inferred from the response type as follows:
55
56 * ``"html"`` for :class:`~scrapy.http.HtmlResponse` type
57 * ``"xml"`` for :class:`~scrapy.http.XmlResponse` type
58 * ``"html"`` for anything else
59
60 Otherwise, if ``type`` is set, the selector type will be forced and no
61 detection will occur.
62 """
63
64 __slots__ = ['response']
65 selectorlist_cls = SelectorList
66
67 def __init__(self, response=None, text=None, type=None, root=None, **kwargs):
68 if response is not None and text is not None:
69 raise ValueError(f'{self.__class__.__name__}.__init__() received '
70 'both response and text')
71
72 st = _st(response, type or self._default_type)
73
74 if text is not None:
75 response = _response_from_text(text, st)
76
77 if response is not None:
78 text = response.text
79 kwargs.setdefault('base_url', response.url)
80
81 self.response = response
82 super().__init__(text=text, type=st, root=root, **kwargs)
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/scrapy/selector/unified.py b/scrapy/selector/unified.py
--- a/scrapy/selector/unified.py
+++ b/scrapy/selector/unified.py
@@ -69,7 +69,7 @@
raise ValueError(f'{self.__class__.__name__}.__init__() received '
'both response and text')
- st = _st(response, type or self._default_type)
+ st = _st(response, type)
if text is not None:
response = _response_from_text(text, st)
|
{"golden_diff": "diff --git a/scrapy/selector/unified.py b/scrapy/selector/unified.py\n--- a/scrapy/selector/unified.py\n+++ b/scrapy/selector/unified.py\n@@ -69,7 +69,7 @@\n raise ValueError(f'{self.__class__.__name__}.__init__() received '\n 'both response and text')\n \n- st = _st(response, type or self._default_type)\n+ st = _st(response, type)\n \n if text is not None:\n response = _response_from_text(text, st)\n", "issue": "Remove the use of parsel.Selector._default_type\nUsed at https://github.com/scrapy/scrapy/blob/58ca8bbf6d1589bd0c8cc1ebda52299346f55e8a/scrapy/selector/unified.py#L72\r\n\r\nWe should stop relying on this private class variable unless there\u2019s a good reason for it.\r\n\r\n[Noticed](https://github.com/scrapy/parsel/pull/181/files#r562118000) while trying out [JMESPath support for Parsel](https://github.com/scrapy/parsel/pull/181) in a real life project.\n", "before_files": [{"content": "\"\"\"\nXPath selectors based on lxml\n\"\"\"\n\nfrom parsel import Selector as _ParselSelector\nfrom scrapy.utils.trackref import object_ref\nfrom scrapy.utils.python import to_bytes\nfrom scrapy.http import HtmlResponse, XmlResponse\n\n\n__all__ = ['Selector', 'SelectorList']\n\n\ndef _st(response, st):\n if st is None:\n return 'xml' if isinstance(response, XmlResponse) else 'html'\n return st\n\n\ndef _response_from_text(text, st):\n rt = XmlResponse if st == 'xml' else HtmlResponse\n return rt(url='about:blank', encoding='utf-8',\n body=to_bytes(text, 'utf-8'))\n\n\nclass SelectorList(_ParselSelector.selectorlist_cls, object_ref):\n \"\"\"\n The :class:`SelectorList` class is a subclass of the builtin ``list``\n class, which provides a few additional methods.\n \"\"\"\n\n\nclass Selector(_ParselSelector, object_ref):\n \"\"\"\n An instance of :class:`Selector` is a wrapper over response to select\n certain parts of its content.\n\n ``response`` is an :class:`~scrapy.http.HtmlResponse` or an\n :class:`~scrapy.http.XmlResponse` object that will be used for selecting\n and extracting data.\n\n ``text`` is a unicode string or utf-8 encoded text for cases when a\n ``response`` isn't available. 
Using ``text`` and ``response`` together is\n undefined behavior.\n\n ``type`` defines the selector type, it can be ``\"html\"``, ``\"xml\"``\n or ``None`` (default).\n\n If ``type`` is ``None``, the selector automatically chooses the best type\n based on ``response`` type (see below), or defaults to ``\"html\"`` in case it\n is used together with ``text``.\n\n If ``type`` is ``None`` and a ``response`` is passed, the selector type is\n inferred from the response type as follows:\n\n * ``\"html\"`` for :class:`~scrapy.http.HtmlResponse` type\n * ``\"xml\"`` for :class:`~scrapy.http.XmlResponse` type\n * ``\"html\"`` for anything else\n\n Otherwise, if ``type`` is set, the selector type will be forced and no\n detection will occur.\n \"\"\"\n\n __slots__ = ['response']\n selectorlist_cls = SelectorList\n\n def __init__(self, response=None, text=None, type=None, root=None, **kwargs):\n if response is not None and text is not None:\n raise ValueError(f'{self.__class__.__name__}.__init__() received '\n 'both response and text')\n\n st = _st(response, type or self._default_type)\n\n if text is not None:\n response = _response_from_text(text, st)\n\n if response is not None:\n text = response.text\n kwargs.setdefault('base_url', response.url)\n\n self.response = response\n super().__init__(text=text, type=st, root=root, **kwargs)\n", "path": "scrapy/selector/unified.py"}], "after_files": [{"content": "\"\"\"\nXPath selectors based on lxml\n\"\"\"\n\nfrom parsel import Selector as _ParselSelector\nfrom scrapy.utils.trackref import object_ref\nfrom scrapy.utils.python import to_bytes\nfrom scrapy.http import HtmlResponse, XmlResponse\n\n\n__all__ = ['Selector', 'SelectorList']\n\n\ndef _st(response, st):\n if st is None:\n return 'xml' if isinstance(response, XmlResponse) else 'html'\n return st\n\n\ndef _response_from_text(text, st):\n rt = XmlResponse if st == 'xml' else HtmlResponse\n return rt(url='about:blank', encoding='utf-8',\n body=to_bytes(text, 'utf-8'))\n\n\nclass SelectorList(_ParselSelector.selectorlist_cls, object_ref):\n \"\"\"\n The :class:`SelectorList` class is a subclass of the builtin ``list``\n class, which provides a few additional methods.\n \"\"\"\n\n\nclass Selector(_ParselSelector, object_ref):\n \"\"\"\n An instance of :class:`Selector` is a wrapper over response to select\n certain parts of its content.\n\n ``response`` is an :class:`~scrapy.http.HtmlResponse` or an\n :class:`~scrapy.http.XmlResponse` object that will be used for selecting\n and extracting data.\n\n ``text`` is a unicode string or utf-8 encoded text for cases when a\n ``response`` isn't available. 
Using ``text`` and ``response`` together is\n undefined behavior.\n\n ``type`` defines the selector type, it can be ``\"html\"``, ``\"xml\"``\n or ``None`` (default).\n\n If ``type`` is ``None``, the selector automatically chooses the best type\n based on ``response`` type (see below), or defaults to ``\"html\"`` in case it\n is used together with ``text``.\n\n If ``type`` is ``None`` and a ``response`` is passed, the selector type is\n inferred from the response type as follows:\n\n * ``\"html\"`` for :class:`~scrapy.http.HtmlResponse` type\n * ``\"xml\"`` for :class:`~scrapy.http.XmlResponse` type\n * ``\"html\"`` for anything else\n\n Otherwise, if ``type`` is set, the selector type will be forced and no\n detection will occur.\n \"\"\"\n\n __slots__ = ['response']\n selectorlist_cls = SelectorList\n\n def __init__(self, response=None, text=None, type=None, root=None, **kwargs):\n if response is not None and text is not None:\n raise ValueError(f'{self.__class__.__name__}.__init__() received '\n 'both response and text')\n\n st = _st(response, type)\n\n if text is not None:\n response = _response_from_text(text, st)\n\n if response is not None:\n text = response.text\n kwargs.setdefault('base_url', response.url)\n\n self.response = response\n super().__init__(text=text, type=st, root=root, **kwargs)\n", "path": "scrapy/selector/unified.py"}]}
| 1,243 | 121 |
gh_patches_debug_2707
|
rasdani/github-patches
|
git_diff
|
DataDog__dd-trace-py-1582
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ddtrace.Pin() for multiple grpc channels doesn't work
Thanks for taking the time for reporting an issue!
Before reporting an issue on dd-trace-py, please be sure to provide all
necessary information.
If you're hitting a bug, make sure that you're using the latest version of this
library.
### Which version of dd-trace-py are you using?
0.38.2
I didn't find anything related to this issue in the release notes of the releases after this version.
### Which version of the libraries are you using?
datadog==0.36.0
### How can we reproduce your problem?
Approach 1:
servers is a list of grpc server addresses
```
for server in servers:
channel = grpc.insecure_channel(server)
Pin.override(channel, service=server)
# Do something with the channel
```
Since `Pin.override(grpc.Channel, service=server)` worked with one server, I also tried the following to see how it looks
Approach 2:
servers is a list of grpc server addresses
```
for server in servers:
Pin.override(grpc.Channel, service=server)
channel = grpc.insecure_channel(server)
# Do something with the channel
```
### What is the result that you get?
In Approach 1, Pin.override did not set the service name correctly. Everywhere in Datadog, I could see it as `grpc-client` which is the default value.
In Approach 2, since I I don't pass the channels corresponding to each server, all servers are overriden by Pin to the final server (probably because it's the last one in the loop)
### What is the result that you expected?
ddtrace.Pin() onto multiple grpc channels should work and I should be able to see the correct `service` in Datadog APM traces and Service Map
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ddtrace/contrib/grpc/patch.py`
Content:
```
1 import os
2
3 import grpc
4
5 from ddtrace.vendor.wrapt import wrap_function_wrapper as _w
6 from ddtrace import config, Pin
7
8 from ...utils.wrappers import unwrap as _u
9
10 from . import constants
11 from .client_interceptor import create_client_interceptor, intercept_channel
12 from .server_interceptor import create_server_interceptor
13
14
15 config._add('grpc_server', dict(
16 service_name=config._get_service(default=constants.GRPC_SERVICE_SERVER),
17 distributed_tracing_enabled=True,
18 ))
19
20
21 # Precedence for the service name:
22 # 1) DD_GRPC_SERVICE if defined; or
23 # 2) For compatibility, the globally set service + "-grpc-client"; or
24 # 3) The fall-back "grpc-client"
25 if "DD_GRPC_SERVICE" in os.environ:
26 service = os.getenv("DD_GRPC_SERVICE")
27 elif config._get_service():
28 service = "{}-{}".format(config._get_service(), constants.GRPC_SERVICE_CLIENT)
29 else:
30 service = constants.GRPC_SERVICE_CLIENT
31
32
33 # TODO[tbutt]: keeping name for client config unchanged to maintain backwards
34 # compatibility but should change in future
35 config._add('grpc', dict(
36 service_name=service,
37 distributed_tracing_enabled=True,
38 ))
39
40
41 def patch():
42 _patch_client()
43 _patch_server()
44
45
46 def unpatch():
47 _unpatch_client()
48 _unpatch_server()
49
50
51 def _patch_client():
52 if getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):
53 return
54 setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', True)
55
56 Pin().onto(constants.GRPC_PIN_MODULE_CLIENT)
57
58 _w('grpc', 'insecure_channel', _client_channel_interceptor)
59 _w('grpc', 'secure_channel', _client_channel_interceptor)
60 _w('grpc', 'intercept_channel', intercept_channel)
61
62
63 def _unpatch_client():
64 if not getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):
65 return
66 setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False)
67
68 pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)
69 if pin:
70 pin.remove_from(constants.GRPC_PIN_MODULE_CLIENT)
71
72 _u(grpc, 'secure_channel')
73 _u(grpc, 'insecure_channel')
74
75
76 def _patch_server():
77 if getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):
78 return
79 setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', True)
80
81 Pin().onto(constants.GRPC_PIN_MODULE_SERVER)
82
83 _w('grpc', 'server', _server_constructor_interceptor)
84
85
86 def _unpatch_server():
87 if not getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):
88 return
89 setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False)
90
91 pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)
92 if pin:
93 pin.remove_from(constants.GRPC_PIN_MODULE_SERVER)
94
95 _u(grpc, 'server')
96
97
98 def _client_channel_interceptor(wrapped, instance, args, kwargs):
99 channel = wrapped(*args, **kwargs)
100
101 pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)
102 if not pin or not pin.enabled():
103 return channel
104
105 (host, port) = _parse_target_from_arguments(args, kwargs)
106
107 interceptor_function = create_client_interceptor(pin, host, port)
108 return grpc.intercept_channel(channel, interceptor_function)
109
110
111 def _server_constructor_interceptor(wrapped, instance, args, kwargs):
112 # DEV: we clone the pin on the grpc module and configure it for the server
113 # interceptor
114
115 pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)
116 if not pin or not pin.enabled():
117 return wrapped(*args, **kwargs)
118
119 interceptor = create_server_interceptor(pin)
120
121 # DEV: Inject our tracing interceptor first in the list of interceptors
122 if 'interceptors' in kwargs:
123 kwargs['interceptors'] = (interceptor,) + tuple(kwargs['interceptors'])
124 else:
125 kwargs['interceptors'] = (interceptor,)
126
127 return wrapped(*args, **kwargs)
128
129
130 def _parse_target_from_arguments(args, kwargs):
131 if 'target' in kwargs:
132 target = kwargs['target']
133 else:
134 target = args[0]
135
136 split = target.rsplit(':', 2)
137
138 return (split[0], split[1] if len(split) > 1 else None)
139
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ddtrace/contrib/grpc/patch.py b/ddtrace/contrib/grpc/patch.py
--- a/ddtrace/contrib/grpc/patch.py
+++ b/ddtrace/contrib/grpc/patch.py
@@ -98,7 +98,7 @@
def _client_channel_interceptor(wrapped, instance, args, kwargs):
channel = wrapped(*args, **kwargs)
- pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)
+ pin = Pin.get_from(channel)
if not pin or not pin.enabled():
return channel
|
{"golden_diff": "diff --git a/ddtrace/contrib/grpc/patch.py b/ddtrace/contrib/grpc/patch.py\n--- a/ddtrace/contrib/grpc/patch.py\n+++ b/ddtrace/contrib/grpc/patch.py\n@@ -98,7 +98,7 @@\n def _client_channel_interceptor(wrapped, instance, args, kwargs):\n channel = wrapped(*args, **kwargs)\n \n- pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)\n+ pin = Pin.get_from(channel)\n if not pin or not pin.enabled():\n return channel\n", "issue": "ddtrace.Pin() for multiple grpc channels doesn't work\nThanks for taking the time for reporting an issue!\r\n\r\nBefore reporting an issue on dd-trace-py, please be sure to provide all\r\nnecessary information.\r\n\r\nIf you're hitting a bug, make sure that you're using the latest version of this\r\nlibrary.\r\n\r\n### Which version of dd-trace-py are you using?\r\n0.38.2\r\nI didn't find anything related to this issue in the release notes of the releases after this version.\r\n\r\n### Which version of the libraries are you using?\r\ndatadog==0.36.0\r\n\r\n### How can we reproduce your problem?\r\nApproach 1:\r\nservers is a list of grpc server addresses\r\n```\r\nfor server in servers:\r\n channel = grpc.insecure_channel(server)\r\n Pin.override(channel, service=server)\r\n # Do something with the channel\r\n```\r\n\r\nSince `Pin.override(grpc.Channel, service=server)` worked with one server, I also tried the following to see how it looks\r\nApproach 2:\r\nservers is a list of grpc server addresses\r\n```\r\nfor server in servers:\r\n Pin.override(grpc.Channel, service=server)\r\n channel = grpc.insecure_channel(server)\r\n # Do something with the channel\r\n```\r\n\r\n### What is the result that you get?\r\nIn Approach 1, Pin.override did not set the service name correctly. Everywhere in Datadog, I could see it as `grpc-client` which is the default value.\r\nIn Approach 2, since I I don't pass the channels corresponding to each server, all servers are overriden by Pin to the final server (probably because it's the last one in the loop)\r\n\r\n### What is the result that you expected?\r\nddtrace.Pin() onto multiple grpc channels should work and I should be able to see the correct `service` in Datadog APM traces and Service Map\n", "before_files": [{"content": "import os\n\nimport grpc\n\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\nfrom ddtrace import config, Pin\n\nfrom ...utils.wrappers import unwrap as _u\n\nfrom . 
import constants\nfrom .client_interceptor import create_client_interceptor, intercept_channel\nfrom .server_interceptor import create_server_interceptor\n\n\nconfig._add('grpc_server', dict(\n service_name=config._get_service(default=constants.GRPC_SERVICE_SERVER),\n distributed_tracing_enabled=True,\n))\n\n\n# Precedence for the service name:\n# 1) DD_GRPC_SERVICE if defined; or\n# 2) For compatibility, the globally set service + \"-grpc-client\"; or\n# 3) The fall-back \"grpc-client\"\nif \"DD_GRPC_SERVICE\" in os.environ:\n service = os.getenv(\"DD_GRPC_SERVICE\")\nelif config._get_service():\n service = \"{}-{}\".format(config._get_service(), constants.GRPC_SERVICE_CLIENT)\nelse:\n service = constants.GRPC_SERVICE_CLIENT\n\n\n# TODO[tbutt]: keeping name for client config unchanged to maintain backwards\n# compatibility but should change in future\nconfig._add('grpc', dict(\n service_name=service,\n distributed_tracing_enabled=True,\n))\n\n\ndef patch():\n _patch_client()\n _patch_server()\n\n\ndef unpatch():\n _unpatch_client()\n _unpatch_server()\n\n\ndef _patch_client():\n if getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', True)\n\n Pin().onto(constants.GRPC_PIN_MODULE_CLIENT)\n\n _w('grpc', 'insecure_channel', _client_channel_interceptor)\n _w('grpc', 'secure_channel', _client_channel_interceptor)\n _w('grpc', 'intercept_channel', intercept_channel)\n\n\ndef _unpatch_client():\n if not getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False)\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)\n if pin:\n pin.remove_from(constants.GRPC_PIN_MODULE_CLIENT)\n\n _u(grpc, 'secure_channel')\n _u(grpc, 'insecure_channel')\n\n\ndef _patch_server():\n if getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', True)\n\n Pin().onto(constants.GRPC_PIN_MODULE_SERVER)\n\n _w('grpc', 'server', _server_constructor_interceptor)\n\n\ndef _unpatch_server():\n if not getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False)\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)\n if pin:\n pin.remove_from(constants.GRPC_PIN_MODULE_SERVER)\n\n _u(grpc, 'server')\n\n\ndef _client_channel_interceptor(wrapped, instance, args, kwargs):\n channel = wrapped(*args, **kwargs)\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)\n if not pin or not pin.enabled():\n return channel\n\n (host, port) = _parse_target_from_arguments(args, kwargs)\n\n interceptor_function = create_client_interceptor(pin, host, port)\n return grpc.intercept_channel(channel, interceptor_function)\n\n\ndef _server_constructor_interceptor(wrapped, instance, args, kwargs):\n # DEV: we clone the pin on the grpc module and configure it for the server\n # interceptor\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n interceptor = create_server_interceptor(pin)\n\n # DEV: Inject our tracing interceptor first in the list of interceptors\n if 'interceptors' in kwargs:\n kwargs['interceptors'] = (interceptor,) + tuple(kwargs['interceptors'])\n else:\n kwargs['interceptors'] = (interceptor,)\n\n return wrapped(*args, **kwargs)\n\n\ndef _parse_target_from_arguments(args, kwargs):\n if 'target' in kwargs:\n 
target = kwargs['target']\n else:\n target = args[0]\n\n split = target.rsplit(':', 2)\n\n return (split[0], split[1] if len(split) > 1 else None)\n", "path": "ddtrace/contrib/grpc/patch.py"}], "after_files": [{"content": "import os\n\nimport grpc\n\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\nfrom ddtrace import config, Pin\n\nfrom ...utils.wrappers import unwrap as _u\n\nfrom . import constants\nfrom .client_interceptor import create_client_interceptor, intercept_channel\nfrom .server_interceptor import create_server_interceptor\n\n\nconfig._add('grpc_server', dict(\n service_name=config._get_service(default=constants.GRPC_SERVICE_SERVER),\n distributed_tracing_enabled=True,\n))\n\n\n# Precedence for the service name:\n# 1) DD_GRPC_SERVICE if defined; or\n# 2) For compatibility, the globally set service + \"-grpc-client\"; or\n# 3) The fall-back \"grpc-client\"\nif \"DD_GRPC_SERVICE\" in os.environ:\n service = os.getenv(\"DD_GRPC_SERVICE\")\nelif config._get_service():\n service = \"{}-{}\".format(config._get_service(), constants.GRPC_SERVICE_CLIENT)\nelse:\n service = constants.GRPC_SERVICE_CLIENT\n\n\n# TODO[tbutt]: keeping name for client config unchanged to maintain backwards\n# compatibility but should change in future\nconfig._add('grpc', dict(\n service_name=service,\n distributed_tracing_enabled=True,\n))\n\n\ndef patch():\n _patch_client()\n _patch_server()\n\n\ndef unpatch():\n _unpatch_client()\n _unpatch_server()\n\n\ndef _patch_client():\n if getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', True)\n\n Pin().onto(constants.GRPC_PIN_MODULE_CLIENT)\n\n _w('grpc', 'insecure_channel', _client_channel_interceptor)\n _w('grpc', 'secure_channel', _client_channel_interceptor)\n _w('grpc', 'intercept_channel', intercept_channel)\n\n\ndef _unpatch_client():\n if not getattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_CLIENT, '__datadog_patch', False)\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)\n if pin:\n pin.remove_from(constants.GRPC_PIN_MODULE_CLIENT)\n\n _u(grpc, 'secure_channel')\n _u(grpc, 'insecure_channel')\n\n\ndef _patch_server():\n if getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', True)\n\n Pin().onto(constants.GRPC_PIN_MODULE_SERVER)\n\n _w('grpc', 'server', _server_constructor_interceptor)\n\n\ndef _unpatch_server():\n if not getattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False):\n return\n setattr(constants.GRPC_PIN_MODULE_SERVER, '__datadog_patch', False)\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)\n if pin:\n pin.remove_from(constants.GRPC_PIN_MODULE_SERVER)\n\n _u(grpc, 'server')\n\n\ndef _client_channel_interceptor(wrapped, instance, args, kwargs):\n channel = wrapped(*args, **kwargs)\n\n pin = Pin.get_from(channel)\n if not pin or not pin.enabled():\n return channel\n\n (host, port) = _parse_target_from_arguments(args, kwargs)\n\n interceptor_function = create_client_interceptor(pin, host, port)\n return grpc.intercept_channel(channel, interceptor_function)\n\n\ndef _server_constructor_interceptor(wrapped, instance, args, kwargs):\n # DEV: we clone the pin on the grpc module and configure it for the server\n # interceptor\n\n pin = Pin.get_from(constants.GRPC_PIN_MODULE_SERVER)\n if not pin or not pin.enabled():\n return wrapped(*args, **kwargs)\n\n interceptor = 
create_server_interceptor(pin)\n\n # DEV: Inject our tracing interceptor first in the list of interceptors\n if 'interceptors' in kwargs:\n kwargs['interceptors'] = (interceptor,) + tuple(kwargs['interceptors'])\n else:\n kwargs['interceptors'] = (interceptor,)\n\n return wrapped(*args, **kwargs)\n\n\ndef _parse_target_from_arguments(args, kwargs):\n if 'target' in kwargs:\n target = kwargs['target']\n else:\n target = args[0]\n\n split = target.rsplit(':', 2)\n\n return (split[0], split[1] if len(split) > 1 else None)\n", "path": "ddtrace/contrib/grpc/patch.py"}]}
| 1,938 | 118 |
gh_patches_debug_19885
|
rasdani/github-patches
|
git_diff
|
mlcommons__GaNDLF-498
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Per-label accuracy does not work for multiple batches
**Describe the bug**
When `batch_size > 1`, `per_label_accuracy` computation fails.
**To Reproduce**
Steps to reproduce the behavior:
1. Set `batch_size = 4` in any classification unit test
2. See error
**Expected behavior**
The function should compute multiple batches of accuracies.
**Screenshots**
N.A.
**GaNDLF Version**
<!-- Put the output of the following command:
python -c 'import GANDLF as g;print(g.__version__)'
-->
0.0.15-dev
**Desktop (please complete the following information):**
N.A.
**Additional context**
Reported by @brandon-edwards
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `GANDLF/metrics/regression.py`
Content:
```
1 """
2 All the metrics are to be called from here
3 """
4 import torch
5 from sklearn.metrics import balanced_accuracy_score
6 import numpy as np
7
8
9 def classification_accuracy(output, label, params):
10 """
11 This function computes the classification accuracy.
12
13 Args:
14 output (torch.Tensor): The output of the model.
15 label (torch.Tensor): The ground truth labels.
16 params (dict): The parameter dictionary containing training and data information.
17
18 Returns:
19 torch.Tensor: The classification accuracy.
20 """
21 if params["problem_type"] == "classification":
22 predicted_classes = torch.argmax(output, 1)
23 else:
24 predicted_classes = output
25
26 acc = torch.sum(predicted_classes == label.squeeze()) / len(label)
27 return acc
28
29
30 def balanced_acc_score(output, label, params):
31 """
32 This function computes the balanced accuracy.
33
34 Args:
35 output (torch.Tensor): The output of the model.
36 label (torch.Tensor): The ground truth labels.
37 params (dict): The parameter dictionary containing training and data information.
38
39 Returns:
40 torch.Tensor: The balanced accuracy.
41 """
42 if params["problem_type"] == "classification":
43 predicted_classes = torch.argmax(output, 1)
44 else:
45 predicted_classes = output
46
47 return torch.from_numpy(
48 np.array(balanced_accuracy_score(predicted_classes.cpu(), label.cpu()))
49 )
50
51
52 def per_label_accuracy(output, label, params):
53 """
54 This function computes the per class accuracy.
55
56 Args:
57 output (torch.Tensor): The output of the model.
58 label (torch.Tensor): The ground truth labels.
59 params (dict): The parameter dictionary containing training and data information.
60
61 Returns:
62 torch.Tensor: The per class accuracy.
63 """
64 if params["problem_type"] == "classification":
65 predicted_classes = np.array([0] * len(params["model"]["class_list"]))
66 label_cpu = np.array([0] * len(params["model"]["class_list"]))
67 predicted_classes[torch.argmax(output, 1).cpu().item()] = 1
68 label_cpu[label.cpu().item()] = 1
69 return torch.from_numpy((predicted_classes == label_cpu).astype(float))
70 else:
71 return balanced_acc_score(output, label, params)
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/GANDLF/metrics/regression.py b/GANDLF/metrics/regression.py
--- a/GANDLF/metrics/regression.py
+++ b/GANDLF/metrics/regression.py
@@ -62,10 +62,14 @@
torch.Tensor: The per class accuracy.
"""
if params["problem_type"] == "classification":
- predicted_classes = np.array([0] * len(params["model"]["class_list"]))
- label_cpu = np.array([0] * len(params["model"]["class_list"]))
- predicted_classes[torch.argmax(output, 1).cpu().item()] = 1
- label_cpu[label.cpu().item()] = 1
- return torch.from_numpy((predicted_classes == label_cpu).astype(float))
+ # ensure this works for multiple batches
+ output_accuracy = torch.zeros(len(params["model"]["class_list"]))
+ for output_batch, label_batch in zip(output, label):
+ predicted_classes = torch.Tensor([0] * len(params["model"]["class_list"]))
+ label_cpu = torch.Tensor([0] * len(params["model"]["class_list"]))
+ predicted_classes[torch.argmax(output_batch, 0).cpu().item()] = 1
+ label_cpu[label_batch.cpu().item()] = 1
+ output_accuracy += (predicted_classes == label_cpu).type(torch.float)
+ return output_accuracy / len(output)
else:
return balanced_acc_score(output, label, params)
|
{"golden_diff": "diff --git a/GANDLF/metrics/regression.py b/GANDLF/metrics/regression.py\n--- a/GANDLF/metrics/regression.py\n+++ b/GANDLF/metrics/regression.py\n@@ -62,10 +62,14 @@\n torch.Tensor: The per class accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n- predicted_classes = np.array([0] * len(params[\"model\"][\"class_list\"]))\n- label_cpu = np.array([0] * len(params[\"model\"][\"class_list\"]))\n- predicted_classes[torch.argmax(output, 1).cpu().item()] = 1\n- label_cpu[label.cpu().item()] = 1\n- return torch.from_numpy((predicted_classes == label_cpu).astype(float))\n+ # ensure this works for multiple batches\n+ output_accuracy = torch.zeros(len(params[\"model\"][\"class_list\"]))\n+ for output_batch, label_batch in zip(output, label):\n+ predicted_classes = torch.Tensor([0] * len(params[\"model\"][\"class_list\"]))\n+ label_cpu = torch.Tensor([0] * len(params[\"model\"][\"class_list\"]))\n+ predicted_classes[torch.argmax(output_batch, 0).cpu().item()] = 1\n+ label_cpu[label_batch.cpu().item()] = 1\n+ output_accuracy += (predicted_classes == label_cpu).type(torch.float)\n+ return output_accuracy / len(output)\n else:\n return balanced_acc_score(output, label, params)\n", "issue": "Per-label accuracy does not work for multiple batches\n**Describe the bug**\r\nWhen `batch_size > 1`, `per_label_accuracy` computation fails.\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Set `batch_size = 4` in any classification unit test\r\n2. See error\r\n\r\n**Expected behavior**\r\nThe function should compute multiple batches of accuracies.\r\n\r\n**Screenshots**\r\nN.A.\r\n\r\n**GaNDLF Version**\r\n<!-- Put the output of the following command:\r\npython -c 'import GANDLF as g;print(g.__version__)'\r\n-->\r\n0.0.15-dev\r\n\r\n**Desktop (please complete the following information):**\r\nN.A.\r\n\r\n**Additional context**\r\nReported by @brandon-edwards\n", "before_files": [{"content": "\"\"\"\nAll the metrics are to be called from here\n\"\"\"\nimport torch\nfrom sklearn.metrics import balanced_accuracy_score\nimport numpy as np\n\n\ndef classification_accuracy(output, label, params):\n \"\"\"\n This function computes the classification accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The classification accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n predicted_classes = torch.argmax(output, 1)\n else:\n predicted_classes = output\n\n acc = torch.sum(predicted_classes == label.squeeze()) / len(label)\n return acc\n\n\ndef balanced_acc_score(output, label, params):\n \"\"\"\n This function computes the balanced accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The balanced accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n predicted_classes = torch.argmax(output, 1)\n else:\n predicted_classes = output\n\n return torch.from_numpy(\n np.array(balanced_accuracy_score(predicted_classes.cpu(), label.cpu()))\n )\n\n\ndef per_label_accuracy(output, label, params):\n \"\"\"\n This function computes the per class accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The 
parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The per class accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n predicted_classes = np.array([0] * len(params[\"model\"][\"class_list\"]))\n label_cpu = np.array([0] * len(params[\"model\"][\"class_list\"]))\n predicted_classes[torch.argmax(output, 1).cpu().item()] = 1\n label_cpu[label.cpu().item()] = 1\n return torch.from_numpy((predicted_classes == label_cpu).astype(float))\n else:\n return balanced_acc_score(output, label, params)\n", "path": "GANDLF/metrics/regression.py"}], "after_files": [{"content": "\"\"\"\nAll the metrics are to be called from here\n\"\"\"\nimport torch\nfrom sklearn.metrics import balanced_accuracy_score\nimport numpy as np\n\n\ndef classification_accuracy(output, label, params):\n \"\"\"\n This function computes the classification accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The classification accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n predicted_classes = torch.argmax(output, 1)\n else:\n predicted_classes = output\n\n acc = torch.sum(predicted_classes == label.squeeze()) / len(label)\n return acc\n\n\ndef balanced_acc_score(output, label, params):\n \"\"\"\n This function computes the balanced accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The balanced accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n predicted_classes = torch.argmax(output, 1)\n else:\n predicted_classes = output\n\n return torch.from_numpy(\n np.array(balanced_accuracy_score(predicted_classes.cpu(), label.cpu()))\n )\n\n\ndef per_label_accuracy(output, label, params):\n \"\"\"\n This function computes the per class accuracy.\n\n Args:\n output (torch.Tensor): The output of the model.\n label (torch.Tensor): The ground truth labels.\n params (dict): The parameter dictionary containing training and data information.\n\n Returns:\n torch.Tensor: The per class accuracy.\n \"\"\"\n if params[\"problem_type\"] == \"classification\":\n # ensure this works for multiple batches\n output_accuracy = torch.zeros(len(params[\"model\"][\"class_list\"]))\n for output_batch, label_batch in zip(output, label):\n predicted_classes = torch.Tensor([0] * len(params[\"model\"][\"class_list\"]))\n label_cpu = torch.Tensor([0] * len(params[\"model\"][\"class_list\"]))\n predicted_classes[torch.argmax(output_batch, 0).cpu().item()] = 1\n label_cpu[label_batch.cpu().item()] = 1\n output_accuracy += (predicted_classes == label_cpu).type(torch.float)\n return output_accuracy / len(output)\n else:\n return balanced_acc_score(output, label, params)\n", "path": "GANDLF/metrics/regression.py"}]}
| 1,023 | 318 |
gh_patches_debug_12330
|
rasdani/github-patches
|
git_diff
|
falconry__falcon-1883
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
deprecated() utility raises AttributeError under Meinheld
The inner function of our [`deprecated()`](https://falcon.readthedocs.io/en/latest/api/util.html#falcon.deprecated) utility generator grabs the current stack frame object via [`inspect.getcurrentframe()`](https://docs.python.org/3/library/inspect.html#inspect.currentframe), and then uses its attributes to provide a more informative deprecation warning.
However, as warned in the latter's docs, this function is not guaranteed to return a valid stack frame object on all Python implementations; it may also return `None`. It seems that running Gunicorn+Meinheld workers can trigger this situation even under CPython.
Discovered using the following command line under CPython 3.7 and 3.8:
```
gunicorn --workers=8 --worker-class="egg:meinheld#gunicorn_worker" test:app
```
For instance, assigning a value to the deprecated [`Response.body`](https://falcon.readthedocs.io/en/latest/api/request_and_response_wsgi.html#falcon.Response.body) yields
```
2021-03-11 23:31:42 [FALCON] [ERROR] GET /things => Traceback (most recent call last):
File "falcon/app.py", line 361, in falcon.app.App.__call__
File "/tmp/benchmark/test3.py", line 13, in on_get
resp.body = ('\nTwo things awe me most, the starry sky '
File "falcon/util/deprecation.py", line 67, in falcon.util.deprecation.deprecated.decorator.wrapper
AttributeError: 'NoneType' object has no attribute 'f_code'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `falcon/util/deprecation.py`
Content:
```
1 # Copyright 2013 by Rackspace Hosting, Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Miscellaneous deprecation utilities.
16
17 This module provides decorators to mark functions and classes as deprecated.
18 """
19
20 import functools
21 import inspect
22 import warnings
23
24
25 __all__ = (
26 'DeprecatedWarning',
27 'deprecated',
28 'deprecated_args',
29 )
30
31
32 # NOTE(kgriffs): We don't want our deprecations to be ignored by default,
33 # so create our own type.
34 #
35 # TODO(kgriffs): Revisit this decision if users complain.
36 class DeprecatedWarning(UserWarning):
37 pass
38
39
40 def deprecated(instructions, is_property=False):
41 """Flag a method as deprecated.
42
43 This function returns a decorator which can be used to mark deprecated
44 functions. Applying this decorator will result in a warning being
45 emitted when the function is used.
46
47 Args:
48 instructions (str): Specific guidance for the developer, e.g.:
49 'Please migrate to add_proxy(...)'
50 is_property (bool): If the deprecated object is a property. It
51 will omit the ``(...)`` from the generated documentation
52 """
53
54 def decorator(func):
55
56 object_name = 'property' if is_property else 'function'
57 post_name = '' if is_property else '(...)'
58 message = 'Call to deprecated {} {}{}. {}'.format(
59 object_name, func.__name__, post_name, instructions)
60
61 @functools.wraps(func)
62 def wrapper(*args, **kwargs):
63 frame = inspect.currentframe().f_back
64
65 warnings.warn_explicit(message,
66 category=DeprecatedWarning,
67 filename=inspect.getfile(frame.f_code),
68 lineno=frame.f_lineno)
69
70 return func(*args, **kwargs)
71
72 return wrapper
73
74 return decorator
75
76
77 def deprecated_args(*, allowed_positional, is_method=True):
78 """Flag a method call with positional args as deprecated.
79
80 Keyword Args:
81 allowed_positional (int): Number of allowed positional arguments
82 is_method (bool, optional): The decorated function is a method. Will
83 add one to the number of allowed positional args to account for
84 ``self``. Defaults to True.
85 """
86
87 template = (
88 'Calls with{} positional args are deprecated.'
89 ' Please specify them as keyword arguments instead.'
90 )
91 text = ' more than {}'.format(allowed_positional) if allowed_positional else ''
92 warn_text = template.format(text)
93 if is_method:
94 allowed_positional += 1
95
96 def deprecated_args(fn):
97 @functools.wraps(fn)
98 def wraps(*args, **kwargs):
99 if len(args) > allowed_positional:
100 warnings.warn(warn_text, DeprecatedWarning, stacklevel=2)
101 return fn(*args, **kwargs)
102
103 return wraps
104
105 return deprecated_args
106
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/falcon/util/deprecation.py b/falcon/util/deprecation.py
--- a/falcon/util/deprecation.py
+++ b/falcon/util/deprecation.py
@@ -18,7 +18,6 @@
"""
import functools
-import inspect
import warnings
@@ -60,12 +59,7 @@
@functools.wraps(func)
def wrapper(*args, **kwargs):
- frame = inspect.currentframe().f_back
-
- warnings.warn_explicit(message,
- category=DeprecatedWarning,
- filename=inspect.getfile(frame.f_code),
- lineno=frame.f_lineno)
+ warnings.warn(message, category=DeprecatedWarning, stacklevel=2)
return func(*args, **kwargs)
|
{"golden_diff": "diff --git a/falcon/util/deprecation.py b/falcon/util/deprecation.py\n--- a/falcon/util/deprecation.py\n+++ b/falcon/util/deprecation.py\n@@ -18,7 +18,6 @@\n \"\"\"\n \n import functools\n-import inspect\n import warnings\n \n \n@@ -60,12 +59,7 @@\n \n @functools.wraps(func)\n def wrapper(*args, **kwargs):\n- frame = inspect.currentframe().f_back\n-\n- warnings.warn_explicit(message,\n- category=DeprecatedWarning,\n- filename=inspect.getfile(frame.f_code),\n- lineno=frame.f_lineno)\n+ warnings.warn(message, category=DeprecatedWarning, stacklevel=2)\n \n return func(*args, **kwargs)\n", "issue": "deprecated() utility raises AttributeError under Meinheld\nThe inner function of our [`deprecated()`](https://falcon.readthedocs.io/en/latest/api/util.html#falcon.deprecated) utility generator grabs the current stack frame object via [`inspect.getcurrentframe()`](https://docs.python.org/3/library/inspect.html#inspect.currentframe), and then uses its attributes to provide a more informative deprecation warning.\r\n\r\nHowever, as warned in the latter's docs, this function is not guaranteed to return a valid stack frame object on all Python implementations; it may also return `None`. It seems that running Gunicorn+Meinheld workers can trigger this situation even under CPython.\r\n\r\nDiscovered using the following command line under CPython 3.7 and 3.8:\r\n```\r\ngunicorn --workers=8 --worker-class=\"egg:meinheld#gunicorn_worker\" test:app\r\n```\r\n\r\nFor instance, assigning a value to the deprecated [`Response.body`](https://falcon.readthedocs.io/en/latest/api/request_and_response_wsgi.html#falcon.Response.body) yields\r\n```\r\n2021-03-11 23:31:42 [FALCON] [ERROR] GET /things => Traceback (most recent call last):\r\n File \"falcon/app.py\", line 361, in falcon.app.App.__call__\r\n File \"/tmp/benchmark/test3.py\", line 13, in on_get\r\n resp.body = ('\\nTwo things awe me most, the starry sky '\r\n File \"falcon/util/deprecation.py\", line 67, in falcon.util.deprecation.deprecated.decorator.wrapper\r\nAttributeError: 'NoneType' object has no attribute 'f_code'\r\n```\n", "before_files": [{"content": "# Copyright 2013 by Rackspace Hosting, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Miscellaneous deprecation utilities.\n\nThis module provides decorators to mark functions and classes as deprecated.\n\"\"\"\n\nimport functools\nimport inspect\nimport warnings\n\n\n__all__ = (\n 'DeprecatedWarning',\n 'deprecated',\n 'deprecated_args',\n)\n\n\n# NOTE(kgriffs): We don't want our deprecations to be ignored by default,\n# so create our own type.\n#\n# TODO(kgriffs): Revisit this decision if users complain.\nclass DeprecatedWarning(UserWarning):\n pass\n\n\ndef deprecated(instructions, is_property=False):\n \"\"\"Flag a method as deprecated.\n\n This function returns a decorator which can be used to mark deprecated\n functions. 
Applying this decorator will result in a warning being\n emitted when the function is used.\n\n Args:\n instructions (str): Specific guidance for the developer, e.g.:\n 'Please migrate to add_proxy(...)'\n is_property (bool): If the deprecated object is a property. It\n will omit the ``(...)`` from the generated documentation\n \"\"\"\n\n def decorator(func):\n\n object_name = 'property' if is_property else 'function'\n post_name = '' if is_property else '(...)'\n message = 'Call to deprecated {} {}{}. {}'.format(\n object_name, func.__name__, post_name, instructions)\n\n @functools.wraps(func)\n def wrapper(*args, **kwargs):\n frame = inspect.currentframe().f_back\n\n warnings.warn_explicit(message,\n category=DeprecatedWarning,\n filename=inspect.getfile(frame.f_code),\n lineno=frame.f_lineno)\n\n return func(*args, **kwargs)\n\n return wrapper\n\n return decorator\n\n\ndef deprecated_args(*, allowed_positional, is_method=True):\n \"\"\"Flag a method call with positional args as deprecated.\n\n Keyword Args:\n allowed_positional (int): Number of allowed positional arguments\n is_method (bool, optional): The decorated function is a method. Will\n add one to the number of allowed positional args to account for\n ``self``. Defaults to True.\n \"\"\"\n\n template = (\n 'Calls with{} positional args are deprecated.'\n ' Please specify them as keyword arguments instead.'\n )\n text = ' more than {}'.format(allowed_positional) if allowed_positional else ''\n warn_text = template.format(text)\n if is_method:\n allowed_positional += 1\n\n def deprecated_args(fn):\n @functools.wraps(fn)\n def wraps(*args, **kwargs):\n if len(args) > allowed_positional:\n warnings.warn(warn_text, DeprecatedWarning, stacklevel=2)\n return fn(*args, **kwargs)\n\n return wraps\n\n return deprecated_args\n", "path": "falcon/util/deprecation.py"}], "after_files": [{"content": "# Copyright 2013 by Rackspace Hosting, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Miscellaneous deprecation utilities.\n\nThis module provides decorators to mark functions and classes as deprecated.\n\"\"\"\n\nimport functools\nimport warnings\n\n\n__all__ = (\n 'DeprecatedWarning',\n 'deprecated',\n 'deprecated_args',\n)\n\n\n# NOTE(kgriffs): We don't want our deprecations to be ignored by default,\n# so create our own type.\n#\n# TODO(kgriffs): Revisit this decision if users complain.\nclass DeprecatedWarning(UserWarning):\n pass\n\n\ndef deprecated(instructions, is_property=False):\n \"\"\"Flag a method as deprecated.\n\n This function returns a decorator which can be used to mark deprecated\n functions. Applying this decorator will result in a warning being\n emitted when the function is used.\n\n Args:\n instructions (str): Specific guidance for the developer, e.g.:\n 'Please migrate to add_proxy(...)'\n is_property (bool): If the deprecated object is a property. 
It\n will omit the ``(...)`` from the generated documentation\n \"\"\"\n\n def decorator(func):\n\n object_name = 'property' if is_property else 'function'\n post_name = '' if is_property else '(...)'\n message = 'Call to deprecated {} {}{}. {}'.format(\n object_name, func.__name__, post_name, instructions)\n\n @functools.wraps(func)\n def wrapper(*args, **kwargs):\n warnings.warn(message, category=DeprecatedWarning, stacklevel=2)\n\n return func(*args, **kwargs)\n\n return wrapper\n\n return decorator\n\n\ndef deprecated_args(*, allowed_positional, is_method=True):\n \"\"\"Flag a method call with positional args as deprecated.\n\n Keyword Args:\n allowed_positional (int): Number of allowed positional arguments\n is_method (bool, optional): The decorated function is a method. Will\n add one to the number of allowed positional args to account for\n ``self``. Defaults to True.\n \"\"\"\n\n template = (\n 'Calls with{} positional args are deprecated.'\n ' Please specify them as keyword arguments instead.'\n )\n text = ' more than {}'.format(allowed_positional) if allowed_positional else ''\n warn_text = template.format(text)\n if is_method:\n allowed_positional += 1\n\n def deprecated_args(fn):\n @functools.wraps(fn)\n def wraps(*args, **kwargs):\n if len(args) > allowed_positional:\n warnings.warn(warn_text, DeprecatedWarning, stacklevel=2)\n return fn(*args, **kwargs)\n\n return wraps\n\n return deprecated_args\n", "path": "falcon/util/deprecation.py"}]}
| 1,554 | 164 |
gh_patches_debug_1321
|
rasdani/github-patches
|
git_diff
|
pyodide__pyodide-717
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calling yaml.load() without Loader=... is deprecated
For each built packages there is now the following deprecation warning ,
```
pyodide_build/common.py:27: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
return yaml.load(fd)
```
it would be nice to fix this.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyodide_build/common.py`
Content:
```
1 from pathlib import Path
2 from typing import Optional, Set
3
4
5 ROOTDIR = Path(__file__).parents[1].resolve() / "tools"
6 HOSTPYTHON = ROOTDIR / ".." / "cpython" / "build" / "3.8.2" / "host"
7 TARGETPYTHON = ROOTDIR / ".." / "cpython" / "installs" / "python-3.8.2"
8 DEFAULTCFLAGS = ""
9 DEFAULTLDFLAGS = " ".join(
10 [
11 "-O3",
12 "-s",
13 "BINARYEN_METHOD='native-wasm'",
14 "-Werror",
15 "-s",
16 "EMULATED_FUNCTION_POINTERS=1",
17 "-s",
18 "EMULATE_FUNCTION_POINTER_CASTS=1",
19 "-s",
20 "SIDE_MODULE=1",
21 "-s",
22 "WASM=1",
23 "--memory-init-file",
24 "0",
25 ]
26 )
27
28
29 def parse_package(package):
30 # Import yaml here because pywasmcross needs to run in the built native
31 # Python, which won't have PyYAML
32 import yaml
33
34 # TODO: Validate against a schema
35 with open(package) as fd:
36 return yaml.load(fd)
37
38
39 def _parse_package_subset(query: Optional[str]) -> Optional[Set[str]]:
40 """Parse the list of packages specified with PYODIDE_PACKAGES env var.
41
42 Also add the list of mandatory packages: ['micropip', 'distlib']
43
44 Returns:
45 a set of package names to build or None.
46 """
47 if query is None:
48 return None
49 packages = query.split(",")
50 packages = [el.strip() for el in packages]
51 packages = ["micropip", "distlib"] + packages
52 return set(packages)
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pyodide_build/common.py b/pyodide_build/common.py
--- a/pyodide_build/common.py
+++ b/pyodide_build/common.py
@@ -33,7 +33,7 @@
# TODO: Validate against a schema
with open(package) as fd:
- return yaml.load(fd)
+ return yaml.safe_load(fd)
def _parse_package_subset(query: Optional[str]) -> Optional[Set[str]]:
|
{"golden_diff": "diff --git a/pyodide_build/common.py b/pyodide_build/common.py\n--- a/pyodide_build/common.py\n+++ b/pyodide_build/common.py\n@@ -33,7 +33,7 @@\n \n # TODO: Validate against a schema\n with open(package) as fd:\n- return yaml.load(fd)\n+ return yaml.safe_load(fd)\n \n \n def _parse_package_subset(query: Optional[str]) -> Optional[Set[str]]:\n", "issue": "Calling yaml.load() without Loader=... is deprecated\nFor each built packages there is now the following deprecation warning ,\r\n```\r\npyodide_build/common.py:27: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.\r\n return yaml.load(fd)\r\n```\r\nit would be nice to fix this.\n", "before_files": [{"content": "from pathlib import Path\nfrom typing import Optional, Set\n\n\nROOTDIR = Path(__file__).parents[1].resolve() / \"tools\"\nHOSTPYTHON = ROOTDIR / \"..\" / \"cpython\" / \"build\" / \"3.8.2\" / \"host\"\nTARGETPYTHON = ROOTDIR / \"..\" / \"cpython\" / \"installs\" / \"python-3.8.2\"\nDEFAULTCFLAGS = \"\"\nDEFAULTLDFLAGS = \" \".join(\n [\n \"-O3\",\n \"-s\",\n \"BINARYEN_METHOD='native-wasm'\",\n \"-Werror\",\n \"-s\",\n \"EMULATED_FUNCTION_POINTERS=1\",\n \"-s\",\n \"EMULATE_FUNCTION_POINTER_CASTS=1\",\n \"-s\",\n \"SIDE_MODULE=1\",\n \"-s\",\n \"WASM=1\",\n \"--memory-init-file\",\n \"0\",\n ]\n)\n\n\ndef parse_package(package):\n # Import yaml here because pywasmcross needs to run in the built native\n # Python, which won't have PyYAML\n import yaml\n\n # TODO: Validate against a schema\n with open(package) as fd:\n return yaml.load(fd)\n\n\ndef _parse_package_subset(query: Optional[str]) -> Optional[Set[str]]:\n \"\"\"Parse the list of packages specified with PYODIDE_PACKAGES env var.\n\n Also add the list of mandatory packages: ['micropip', 'distlib']\n\n Returns:\n a set of package names to build or None.\n \"\"\"\n if query is None:\n return None\n packages = query.split(\",\")\n packages = [el.strip() for el in packages]\n packages = [\"micropip\", \"distlib\"] + packages\n return set(packages)\n", "path": "pyodide_build/common.py"}], "after_files": [{"content": "from pathlib import Path\nfrom typing import Optional, Set\n\n\nROOTDIR = Path(__file__).parents[1].resolve() / \"tools\"\nHOSTPYTHON = ROOTDIR / \"..\" / \"cpython\" / \"build\" / \"3.8.2\" / \"host\"\nTARGETPYTHON = ROOTDIR / \"..\" / \"cpython\" / \"installs\" / \"python-3.8.2\"\nDEFAULTCFLAGS = \"\"\nDEFAULTLDFLAGS = \" \".join(\n [\n \"-O3\",\n \"-s\",\n \"BINARYEN_METHOD='native-wasm'\",\n \"-Werror\",\n \"-s\",\n \"EMULATED_FUNCTION_POINTERS=1\",\n \"-s\",\n \"EMULATE_FUNCTION_POINTER_CASTS=1\",\n \"-s\",\n \"SIDE_MODULE=1\",\n \"-s\",\n \"WASM=1\",\n \"--memory-init-file\",\n \"0\",\n ]\n)\n\n\ndef parse_package(package):\n # Import yaml here because pywasmcross needs to run in the built native\n # Python, which won't have PyYAML\n import yaml\n\n # TODO: Validate against a schema\n with open(package) as fd:\n return yaml.safe_load(fd)\n\n\ndef _parse_package_subset(query: Optional[str]) -> Optional[Set[str]]:\n \"\"\"Parse the list of packages specified with PYODIDE_PACKAGES env var.\n\n Also add the list of mandatory packages: ['micropip', 'distlib']\n\n Returns:\n a set of package names to build or None.\n \"\"\"\n if query is None:\n return None\n packages = query.split(\",\")\n packages = [el.strip() for el in packages]\n packages = [\"micropip\", \"distlib\"] + packages\n return set(packages)\n", "path": 
"pyodide_build/common.py"}]}
| 827 | 98 |
gh_patches_debug_20517
|
rasdani/github-patches
|
git_diff
|
quantumlib__Cirq-1863
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Split cirq packages into with/without contrib
Otherwise there's no way to easily pip install the contrib-requirements
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright 2018 The Cirq Developers
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import io
16 from setuptools import find_packages, setup
17
18 # This reads the __version__ variable from cirq/_version.py
19 __version__ = ''
20 exec(open('cirq/_version.py').read())
21
22 description = ('A framework for creating, editing, and invoking '
23 'Noisy Intermediate Scale Quantum (NISQ) circuits.')
24
25 # README file as long_description.
26 long_description = io.open('README.rst', encoding='utf-8').read()
27
28 # Read in requirements
29 requirements = open('requirements.txt').readlines()
30 requirements = [r.strip() for r in requirements]
31
32 cirq_packages = ['cirq'] + [
33 'cirq.' + package for package in find_packages(where='cirq')
34 ]
35
36 setup(name='cirq',
37 version=__version__,
38 url='http://github.com/quantumlib/cirq',
39 author='The Cirq Developers',
40 author_email='[email protected]',
41 python_requires=('>=3.6.0'),
42 install_requires=requirements,
43 license='Apache 2',
44 description=description,
45 long_description=long_description,
46 packages=cirq_packages,
47 package_data={
48 'cirq.api.google.v1': ['*.proto'],
49 'cirq.api.google.v2': ['*.proto'],
50 })
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -28,6 +28,10 @@
# Read in requirements
requirements = open('requirements.txt').readlines()
requirements = [r.strip() for r in requirements]
+contrib_requirements = open('cirq/contrib/contrib-requirements.txt').readlines()
+contrib_requirements = [r.strip() for r in contrib_requirements]
+dev_requirements = open('dev_tools/conf/pip-list-dev-tools.txt').readlines()
+dev_requirements = [r.strip() for r in dev_requirements]
cirq_packages = ['cirq'] + [
'cirq.' + package for package in find_packages(where='cirq')
@@ -40,6 +44,10 @@
author_email='[email protected]',
python_requires=('>=3.6.0'),
install_requires=requirements,
+ extras_require={
+ 'contrib': contrib_requirements,
+ 'dev': dev_requirements,
+ },
license='Apache 2',
description=description,
long_description=long_description,
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -28,6 +28,10 @@\n # Read in requirements\n requirements = open('requirements.txt').readlines()\n requirements = [r.strip() for r in requirements]\n+contrib_requirements = open('cirq/contrib/contrib-requirements.txt').readlines()\n+contrib_requirements = [r.strip() for r in contrib_requirements]\n+dev_requirements = open('dev_tools/conf/pip-list-dev-tools.txt').readlines()\n+dev_requirements = [r.strip() for r in dev_requirements]\n \n cirq_packages = ['cirq'] + [\n 'cirq.' + package for package in find_packages(where='cirq')\n@@ -40,6 +44,10 @@\n author_email='[email protected]',\n python_requires=('>=3.6.0'),\n install_requires=requirements,\n+ extras_require={\n+ 'contrib': contrib_requirements,\n+ 'dev': dev_requirements,\n+ },\n license='Apache 2',\n description=description,\n long_description=long_description,\n", "issue": "Split cirq packages into with/without contrib\nOtherwise there's no way to easily pip install the contrib-requirements\n", "before_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nfrom setuptools import find_packages, setup\n\n# This reads the __version__ variable from cirq/_version.py\n__version__ = ''\nexec(open('cirq/_version.py').read())\n\ndescription = ('A framework for creating, editing, and invoking '\n 'Noisy Intermediate Scale Quantum (NISQ) circuits.')\n\n# README file as long_description.\nlong_description = io.open('README.rst', encoding='utf-8').read()\n\n# Read in requirements\nrequirements = open('requirements.txt').readlines()\nrequirements = [r.strip() for r in requirements]\n\ncirq_packages = ['cirq'] + [\n 'cirq.' 
+ package for package in find_packages(where='cirq')\n]\n\nsetup(name='cirq',\n version=__version__,\n url='http://github.com/quantumlib/cirq',\n author='The Cirq Developers',\n author_email='[email protected]',\n python_requires=('>=3.6.0'),\n install_requires=requirements,\n license='Apache 2',\n description=description,\n long_description=long_description,\n packages=cirq_packages,\n package_data={\n 'cirq.api.google.v1': ['*.proto'],\n 'cirq.api.google.v2': ['*.proto'],\n })\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nfrom setuptools import find_packages, setup\n\n# This reads the __version__ variable from cirq/_version.py\n__version__ = ''\nexec(open('cirq/_version.py').read())\n\ndescription = ('A framework for creating, editing, and invoking '\n 'Noisy Intermediate Scale Quantum (NISQ) circuits.')\n\n# README file as long_description.\nlong_description = io.open('README.rst', encoding='utf-8').read()\n\n# Read in requirements\nrequirements = open('requirements.txt').readlines()\nrequirements = [r.strip() for r in requirements]\ncontrib_requirements = open('cirq/contrib/contrib-requirements.txt').readlines()\ncontrib_requirements = [r.strip() for r in contrib_requirements]\ndev_requirements = open('dev_tools/conf/pip-list-dev-tools.txt').readlines()\ndev_requirements = [r.strip() for r in dev_requirements]\n\ncirq_packages = ['cirq'] + [\n 'cirq.' + package for package in find_packages(where='cirq')\n]\n\nsetup(name='cirq',\n version=__version__,\n url='http://github.com/quantumlib/cirq',\n author='The Cirq Developers',\n author_email='[email protected]',\n python_requires=('>=3.6.0'),\n install_requires=requirements,\n extras_require={\n 'contrib': contrib_requirements,\n 'dev': dev_requirements,\n },\n license='Apache 2',\n description=description,\n long_description=long_description,\n packages=cirq_packages,\n package_data={\n 'cirq.api.google.v1': ['*.proto'],\n 'cirq.api.google.v2': ['*.proto'],\n })\n", "path": "setup.py"}]}
| 779 | 238 |
gh_patches_debug_2369
|
rasdani/github-patches
|
git_diff
|
Pyomo__pyomo-2265
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Consistent semantic versioning
## Summary
The most recent version of Pyomo released was 6.2, as opposed to 6.2.0. It seems inconsistent with the way many other packages are versioned (e.g. NumFocus packages), although I am unaware if there is a standard specified anywhere. Is there a benefit to the former as opposed to the latter?
## Context
Managing our dependencies, we automate pulling in new versions of packages, running them through our CI prior to upgrading. We run this in two ways - one allowing all upgrades and one allowing only compatible upgrades (PEP 440). This always requires manual review because not all packages use semantic versioning (or the same semantic versioning). One manual override we had to apply this time was pinning `Pyomo ~= 6.2.0` instead of what our script automatically pulled in `Pyomo ~= 6.2`.
Consistent semantic versioning
## Summary
The most recent version of Pyomo released was 6.2, as opposed to 6.2.0. It seems inconsistent with the way many other packages are versioned (e.g. NumFocus packages), although I am unaware if there is a standard specified anywhere. Is there a benefit to the former as opposed to the latter?
## Context
Managing our dependencies, we automate pulling in new versions of packages, running them through our CI prior to upgrading. We run this in two ways - one allowing all upgrades and one allowing only compatible upgrades (PEP 440). This always requires manual review because not all packages use semantic versioning (or the same semantic versioning). One manual override we had to apply this time was pinning `Pyomo ~= 6.2.0` instead of what our script automatically pulled in `Pyomo ~= 6.2`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyomo/version/info.py`
Content:
```
1 # ___________________________________________________________________________
2 #
3 # Pyomo: Python Optimization Modeling Objects
4 # Copyright 2017 National Technology and Engineering Solutions of Sandia, LLC
5 # Under the terms of Contract DE-NA0003525 with National Technology and
6 # Engineering Solutions of Sandia, LLC, the U.S. Government retains certain
7 # rights in this software.
8 # This software is distributed under the 3-clause BSD License.
9 # ___________________________________________________________________________
10
11 _init_url="$URL$"
12
13 # NOTE: releaselevel should be left at 'invalid' for trunk development
14 # and set to 'final' for releases. During development, the
15 # major.minor.micro should point ot the NEXT release (generally, the
16 # next micro release after the current release).
17 #
18 # Note: When cutting a release, also update the major/minor/micro in
19 #
20 # pyomo/RELEASE.txt
21 #
22 # The VOTD zipbuilder will automatically change releaselevel to "VOTD
23 # {hash}" and set the serial number to YYMMDDhhmm. The serial number
24 # should generally be left at 0, unless a downstream package is tracking
25 # main and needs a hard reference to "suitably new" development.
26 major=6
27 minor=2
28 micro=1
29 releaselevel='invalid'
30 #releaselevel='final'
31 serial=0
32
33 if releaselevel == 'final':
34 pass
35 elif '/tags/' in _init_url: #pragma:nocover
36 releaselevel = 'final'
37 elif releaselevel == 'invalid':
38 from os.path import abspath, dirname, exists, join
39 if __file__.endswith('setup.py'):
40 # This file is being sources (exec'ed) from setup.py.
41 # dirname(__file__) setup.py's scope is the root sourec directory
42 _rootdir = os.path.dirname(__file__)
43 else:
44 # Eventually this should import PYOMO_ROOT_DIR from
45 # pyomo.common instead of reimplementing that logic here.
46 #
47 # __file__ fails if script is called in different ways on Windows
48 # __file__ fails if someone does os.chdir() before
49 # sys.argv[0] also fails because it doesn't not always contains the path
50 from inspect import getfile, currentframe
51 _rootdir = join(dirname(abspath(getfile(currentframe()))), '..', '..')
52
53 if exists(join(_rootdir, '.git')):
54 try:
55 with open(join(_rootdir, '.git', 'HEAD')) as _FILE:
56 _ref = _FILE.readline().strip() #pragma:nocover
57 releaselevel = 'devel {%s}' % (
58 _ref.split('/')[-1].split('\\')[-1], ) #pragma:nocover
59 except:
60 releaselevel = 'devel' #pragma:nocover
61 elif exists(join(_rootdir, '.svn')):
62 releaselevel = 'devel {svn}' #pragma:nocover
63 else:
64 releaselevel = 'VOTD' #pragma:nocover
65
66
67 version_info = (major, minor, micro, releaselevel, serial)
68
69 version = '.'.join(str(x) for x in version_info[:(3 if micro else 2)])
70 __version__ = version
71 if releaselevel != 'final':
72 version += ' ('+releaselevel+')'
73 if releaselevel.startswith('devel'):
74 __version__ += ".dev%d" % (serial,)
75 elif releaselevel.startswith('VOTD'):
76 __version__ += "a%d" % (serial,)
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pyomo/version/info.py b/pyomo/version/info.py
--- a/pyomo/version/info.py
+++ b/pyomo/version/info.py
@@ -66,7 +66,7 @@
version_info = (major, minor, micro, releaselevel, serial)
-version = '.'.join(str(x) for x in version_info[:(3 if micro else 2)])
+version = '.'.join(str(x) for x in version_info[:3])
__version__ = version
if releaselevel != 'final':
version += ' ('+releaselevel+')'
|
{"golden_diff": "diff --git a/pyomo/version/info.py b/pyomo/version/info.py\n--- a/pyomo/version/info.py\n+++ b/pyomo/version/info.py\n@@ -66,7 +66,7 @@\n \n version_info = (major, minor, micro, releaselevel, serial)\n \n-version = '.'.join(str(x) for x in version_info[:(3 if micro else 2)])\n+version = '.'.join(str(x) for x in version_info[:3])\n __version__ = version\n if releaselevel != 'final':\n version += ' ('+releaselevel+')'\n", "issue": "Consistent semantic versioning\n## Summary\r\n\r\nThe most recent version of Pyomo released was 6.2, as opposed to 6.2.0. It seems inconsistent with the way many other packages are versioned (e.g. NumFocus packages), although I am unaware if there is a standard specified anywhere. Is there a benefit to the former as opposed to the latter? \r\n\r\n## Context\r\n\r\nManaging our dependencies, we automate pulling in new versions of packages, running them through our CI prior to upgrading. We run this in two ways - one allowing all upgrades and one allowing only compatible upgrades (PEP 440). This always requires manual review because not all packages use semantic versioning (or the same semantic versioning). One manual override we had to apply this time was pinning `Pyomo ~= 6.2.0` instead of what our script automatically pulled in `Pyomo ~= 6.2`.\nConsistent semantic versioning\n## Summary\r\n\r\nThe most recent version of Pyomo released was 6.2, as opposed to 6.2.0. It seems inconsistent with the way many other packages are versioned (e.g. NumFocus packages), although I am unaware if there is a standard specified anywhere. Is there a benefit to the former as opposed to the latter? \r\n\r\n## Context\r\n\r\nManaging our dependencies, we automate pulling in new versions of packages, running them through our CI prior to upgrading. We run this in two ways - one allowing all upgrades and one allowing only compatible upgrades (PEP 440). This always requires manual review because not all packages use semantic versioning (or the same semantic versioning). One manual override we had to apply this time was pinning `Pyomo ~= 6.2.0` instead of what our script automatically pulled in `Pyomo ~= 6.2`.\n", "before_files": [{"content": "# ___________________________________________________________________________\n#\n# Pyomo: Python Optimization Modeling Objects\n# Copyright 2017 National Technology and Engineering Solutions of Sandia, LLC\n# Under the terms of Contract DE-NA0003525 with National Technology and \n# Engineering Solutions of Sandia, LLC, the U.S. Government retains certain \n# rights in this software.\n# This software is distributed under the 3-clause BSD License.\n# ___________________________________________________________________________\n\n_init_url=\"$URL$\"\n\n# NOTE: releaselevel should be left at 'invalid' for trunk development\n# and set to 'final' for releases. During development, the\n# major.minor.micro should point ot the NEXT release (generally, the\n# next micro release after the current release).\n#\n# Note: When cutting a release, also update the major/minor/micro in\n#\n# pyomo/RELEASE.txt\n#\n# The VOTD zipbuilder will automatically change releaselevel to \"VOTD\n# {hash}\" and set the serial number to YYMMDDhhmm. 
The serial number\n# should generally be left at 0, unless a downstream package is tracking\n# main and needs a hard reference to \"suitably new\" development.\nmajor=6\nminor=2\nmicro=1\nreleaselevel='invalid'\n#releaselevel='final'\nserial=0\n\nif releaselevel == 'final':\n pass\nelif '/tags/' in _init_url: #pragma:nocover\n releaselevel = 'final'\nelif releaselevel == 'invalid':\n from os.path import abspath, dirname, exists, join\n if __file__.endswith('setup.py'):\n # This file is being sources (exec'ed) from setup.py.\n # dirname(__file__) setup.py's scope is the root sourec directory\n _rootdir = os.path.dirname(__file__)\n else:\n # Eventually this should import PYOMO_ROOT_DIR from\n # pyomo.common instead of reimplementing that logic here.\n #\n # __file__ fails if script is called in different ways on Windows\n # __file__ fails if someone does os.chdir() before\n # sys.argv[0] also fails because it doesn't not always contains the path\n from inspect import getfile, currentframe\n _rootdir = join(dirname(abspath(getfile(currentframe()))), '..', '..')\n\n if exists(join(_rootdir, '.git')):\n try:\n with open(join(_rootdir, '.git', 'HEAD')) as _FILE:\n _ref = _FILE.readline().strip() #pragma:nocover\n releaselevel = 'devel {%s}' % (\n _ref.split('/')[-1].split('\\\\')[-1], ) #pragma:nocover\n except:\n releaselevel = 'devel' #pragma:nocover\n elif exists(join(_rootdir, '.svn')):\n releaselevel = 'devel {svn}' #pragma:nocover\n else:\n releaselevel = 'VOTD' #pragma:nocover\n\n\nversion_info = (major, minor, micro, releaselevel, serial)\n\nversion = '.'.join(str(x) for x in version_info[:(3 if micro else 2)])\n__version__ = version\nif releaselevel != 'final':\n version += ' ('+releaselevel+')'\nif releaselevel.startswith('devel'):\n __version__ += \".dev%d\" % (serial,)\nelif releaselevel.startswith('VOTD'):\n __version__ += \"a%d\" % (serial,)\n", "path": "pyomo/version/info.py"}], "after_files": [{"content": "# ___________________________________________________________________________\n#\n# Pyomo: Python Optimization Modeling Objects\n# Copyright 2017 National Technology and Engineering Solutions of Sandia, LLC\n# Under the terms of Contract DE-NA0003525 with National Technology and \n# Engineering Solutions of Sandia, LLC, the U.S. Government retains certain \n# rights in this software.\n# This software is distributed under the 3-clause BSD License.\n# ___________________________________________________________________________\n\n_init_url=\"$URL$\"\n\n# NOTE: releaselevel should be left at 'invalid' for trunk development\n# and set to 'final' for releases. During development, the\n# major.minor.micro should point ot the NEXT release (generally, the\n# next micro release after the current release).\n#\n# Note: When cutting a release, also update the major/minor/micro in\n#\n# pyomo/RELEASE.txt\n#\n# The VOTD zipbuilder will automatically change releaselevel to \"VOTD\n# {hash}\" and set the serial number to YYMMDDhhmm. 
The serial number\n# should generally be left at 0, unless a downstream package is tracking\n# main and needs a hard reference to \"suitably new\" development.\nmajor=6\nminor=2\nmicro=1\nreleaselevel='invalid'\n#releaselevel='final'\nserial=0\n\nif releaselevel == 'final':\n pass\nelif '/tags/' in _init_url: #pragma:nocover\n releaselevel = 'final'\nelif releaselevel == 'invalid':\n from os.path import abspath, dirname, exists, join\n if __file__.endswith('setup.py'):\n # This file is being sources (exec'ed) from setup.py.\n # dirname(__file__) setup.py's scope is the root sourec directory\n _rootdir = os.path.dirname(__file__)\n else:\n # Eventually this should import PYOMO_ROOT_DIR from\n # pyomo.common instead of reimplementing that logic here.\n #\n # __file__ fails if script is called in different ways on Windows\n # __file__ fails if someone does os.chdir() before\n # sys.argv[0] also fails because it doesn't not always contains the path\n from inspect import getfile, currentframe\n _rootdir = join(dirname(abspath(getfile(currentframe()))), '..', '..')\n\n if exists(join(_rootdir, '.git')):\n try:\n with open(join(_rootdir, '.git', 'HEAD')) as _FILE:\n _ref = _FILE.readline().strip() #pragma:nocover\n releaselevel = 'devel {%s}' % (\n _ref.split('/')[-1].split('\\\\')[-1], ) #pragma:nocover\n except:\n releaselevel = 'devel' #pragma:nocover\n elif exists(join(_rootdir, '.svn')):\n releaselevel = 'devel {svn}' #pragma:nocover\n else:\n releaselevel = 'VOTD' #pragma:nocover\n\n\nversion_info = (major, minor, micro, releaselevel, serial)\n\nversion = '.'.join(str(x) for x in version_info[:3])\n__version__ = version\nif releaselevel != 'final':\n version += ' ('+releaselevel+')'\nif releaselevel.startswith('devel'):\n __version__ += \".dev%d\" % (serial,)\nelif releaselevel.startswith('VOTD'):\n __version__ += \"a%d\" % (serial,)\n", "path": "pyomo/version/info.py"}]}
| 1,557 | 123 |
gh_patches_debug_36212
|
rasdani/github-patches
|
git_diff
|
strawberry-graphql__strawberry-403
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error adding more than one implementation of an interface
**Observed Behaviour**: When i try to add two implementations of an interface, i get a duplicated type name exception
**Expected Behaviour**: Instead of trying to recreate the interface type again, reuse it.
**Steps to reproduce**:
1. Create an interface
2. Create two types which implement the interface
3. Launch `strawberry server app`
4. See it fails with ` Schema must contain uniquely named types but contains multiple types named '<InterfaceName>'`
**Snippet to reproduce the issue**
````python
from typing import List, Optional, Union
import strawberry
from strawberry import field
@strawberry.interface
class Person:
name: str
email: str
@strawberry.type
class Speaker(Person):
job: str
@strawberry.type
class Attendee(Person):
interests: List[str]
def get_people_by_name(name: str):
return []
@strawberry.type
class Query:
searchPeopleByName: List[Union[Speaker, Attendee]] = field(resolver=get_people_by_name)
schema = strawberry.Schema(query=Query)
````
**Full traceback:**
```
File "/mnt/c/Users/<User>/code/nerdearla/test_app.py", line 30, in <module>
schema = strawberry.Schema(query=Query)
File "/home/crow/.virtualenvs/venv/lib/python3.8/site-packages/strawberry/schema/schema.py", line 42, in __init__
self._schema = GraphQLSchema(
File "/home/crow/.virtualenvs/venv/lib/python3.8/site-packages/graphql/type/schema.py", line 240, in __init__
raise TypeError(
TypeError: Schema must contain uniquely named types but contains multiple types named 'Person'.
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `strawberry/schema/types/object_type.py`
Content:
```
1 from typing import Type, cast
2
3 from graphql import GraphQLInputObjectType, GraphQLObjectType
4 from graphql.type.definition import GraphQLInterfaceType
5 from strawberry.type import TypeDefinition
6
7 from .fields import get_field
8 from .types import ConcreteType, GraphQLType, TypeMap
9
10
11 def _get_object_type_for_type_definition(
12 type_definition: TypeDefinition, type_map: TypeMap
13 ) -> GraphQLType:
14
15 TypeClass: Type = GraphQLObjectType
16
17 kwargs = {}
18
19 if type_definition.is_input:
20 TypeClass = GraphQLInputObjectType
21 elif type_definition.is_interface:
22 TypeClass = GraphQLInterfaceType
23
24 if type_definition.interfaces:
25 kwargs["interfaces"] = [
26 _get_object_type_for_type_definition(interface, type_map)
27 for interface in type_definition.interfaces
28 ]
29
30 assert not type_definition.is_generic
31
32 return TypeClass(
33 name=type_definition.name,
34 fields=lambda: {
35 field.name: get_field(field, type_definition.is_input, type_map)
36 for field in type_definition.fields
37 },
38 description=type_definition.description,
39 **kwargs,
40 )
41
42
43 def get_object_type(origin: Type, type_map: TypeMap) -> GraphQLObjectType:
44 """Returns a root type (Query, Mutation, Subscription) from a decorated type"""
45
46 if not hasattr(origin, "_type_definition"):
47 raise ValueError(f"Wrong type passed to get object type {origin}")
48
49 type_definition: TypeDefinition = origin._type_definition
50
51 name = type_definition.name
52
53 if name not in type_map:
54 object_type = _get_object_type_for_type_definition(type_definition, type_map)
55
56 type_map[name] = ConcreteType(
57 definition=type_definition, implementation=object_type
58 )
59
60 return cast(GraphQLObjectType, type_map[name].implementation)
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/strawberry/schema/types/object_type.py b/strawberry/schema/types/object_type.py
--- a/strawberry/schema/types/object_type.py
+++ b/strawberry/schema/types/object_type.py
@@ -12,32 +12,43 @@
type_definition: TypeDefinition, type_map: TypeMap
) -> GraphQLType:
- TypeClass: Type = GraphQLObjectType
-
- kwargs = {}
-
- if type_definition.is_input:
- TypeClass = GraphQLInputObjectType
- elif type_definition.is_interface:
- TypeClass = GraphQLInterfaceType
-
- if type_definition.interfaces:
- kwargs["interfaces"] = [
- _get_object_type_for_type_definition(interface, type_map)
- for interface in type_definition.interfaces
- ]
-
- assert not type_definition.is_generic
-
- return TypeClass(
- name=type_definition.name,
- fields=lambda: {
- field.name: get_field(field, type_definition.is_input, type_map)
- for field in type_definition.fields
- },
- description=type_definition.description,
- **kwargs,
- )
+ if type_definition.name not in type_map:
+ TypeClass: Type = GraphQLObjectType
+
+ kwargs = {}
+
+ if type_definition.is_input:
+ TypeClass = GraphQLInputObjectType
+ elif type_definition.is_interface:
+ TypeClass = GraphQLInterfaceType
+
+ if type_definition.interfaces:
+ kwargs["interfaces"] = [
+ _get_object_type_for_type_definition(interface, type_map)
+ for interface in type_definition.interfaces
+ ]
+ # this tells GraphQL core what the returned object's actual type is
+ kwargs["is_type_of"] = lambda obj, _: isinstance( # type: ignore
+ obj, type_definition.origin
+ )
+
+ assert not type_definition.is_generic
+
+ object_type = TypeClass(
+ name=type_definition.name,
+ fields=lambda: {
+ field.name: get_field(field, type_definition.is_input, type_map)
+ for field in type_definition.fields
+ },
+ description=type_definition.description,
+ **kwargs,
+ )
+
+ type_map[type_definition.name] = ConcreteType(
+ definition=type_definition, implementation=object_type
+ )
+
+ return type_map[type_definition.name].implementation
def get_object_type(origin: Type, type_map: TypeMap) -> GraphQLObjectType:
@@ -48,13 +59,7 @@
type_definition: TypeDefinition = origin._type_definition
- name = type_definition.name
-
- if name not in type_map:
- object_type = _get_object_type_for_type_definition(type_definition, type_map)
-
- type_map[name] = ConcreteType(
- definition=type_definition, implementation=object_type
- )
-
- return cast(GraphQLObjectType, type_map[name].implementation)
+ return cast(
+ GraphQLObjectType,
+ _get_object_type_for_type_definition(type_definition, type_map),
+ )
|
{"golden_diff": "diff --git a/strawberry/schema/types/object_type.py b/strawberry/schema/types/object_type.py\n--- a/strawberry/schema/types/object_type.py\n+++ b/strawberry/schema/types/object_type.py\n@@ -12,32 +12,43 @@\n type_definition: TypeDefinition, type_map: TypeMap\n ) -> GraphQLType:\n \n- TypeClass: Type = GraphQLObjectType\n-\n- kwargs = {}\n-\n- if type_definition.is_input:\n- TypeClass = GraphQLInputObjectType\n- elif type_definition.is_interface:\n- TypeClass = GraphQLInterfaceType\n-\n- if type_definition.interfaces:\n- kwargs[\"interfaces\"] = [\n- _get_object_type_for_type_definition(interface, type_map)\n- for interface in type_definition.interfaces\n- ]\n-\n- assert not type_definition.is_generic\n-\n- return TypeClass(\n- name=type_definition.name,\n- fields=lambda: {\n- field.name: get_field(field, type_definition.is_input, type_map)\n- for field in type_definition.fields\n- },\n- description=type_definition.description,\n- **kwargs,\n- )\n+ if type_definition.name not in type_map:\n+ TypeClass: Type = GraphQLObjectType\n+\n+ kwargs = {}\n+\n+ if type_definition.is_input:\n+ TypeClass = GraphQLInputObjectType\n+ elif type_definition.is_interface:\n+ TypeClass = GraphQLInterfaceType\n+\n+ if type_definition.interfaces:\n+ kwargs[\"interfaces\"] = [\n+ _get_object_type_for_type_definition(interface, type_map)\n+ for interface in type_definition.interfaces\n+ ]\n+ # this tells GraphQL core what the returned object's actual type is\n+ kwargs[\"is_type_of\"] = lambda obj, _: isinstance( # type: ignore\n+ obj, type_definition.origin\n+ )\n+\n+ assert not type_definition.is_generic\n+\n+ object_type = TypeClass(\n+ name=type_definition.name,\n+ fields=lambda: {\n+ field.name: get_field(field, type_definition.is_input, type_map)\n+ for field in type_definition.fields\n+ },\n+ description=type_definition.description,\n+ **kwargs,\n+ )\n+\n+ type_map[type_definition.name] = ConcreteType(\n+ definition=type_definition, implementation=object_type\n+ )\n+\n+ return type_map[type_definition.name].implementation\n \n \n def get_object_type(origin: Type, type_map: TypeMap) -> GraphQLObjectType:\n@@ -48,13 +59,7 @@\n \n type_definition: TypeDefinition = origin._type_definition\n \n- name = type_definition.name\n-\n- if name not in type_map:\n- object_type = _get_object_type_for_type_definition(type_definition, type_map)\n-\n- type_map[name] = ConcreteType(\n- definition=type_definition, implementation=object_type\n- )\n-\n- return cast(GraphQLObjectType, type_map[name].implementation)\n+ return cast(\n+ GraphQLObjectType,\n+ _get_object_type_for_type_definition(type_definition, type_map),\n+ )\n", "issue": "Error adding more than one implementation of an interface\n**Observed Behaviour**: When i try to add two implementations of an interface, i get a duplicated type name exception\r\n\r\n**Expected Behaviour**: Instead of trying to recreate the interface type again, reuse it.\r\n\r\n**Steps to reproduce**:\r\n1. Create an interface\r\n2. Create two types which implement the interface\r\n3. Launch `strawberry server app`\r\n4. 
See it fails with ` Schema must contain uniquely named types but contains multiple types named '<InterfaceName>'`\r\n\r\n**Snippet to reproduce the issue**\r\n````python\r\nfrom typing import List, Optional, Union\r\nimport strawberry\r\nfrom strawberry import field\r\n\r\n\r\[email protected]\r\nclass Person:\r\n name: str\r\n email: str\r\n\r\n\r\[email protected]\r\nclass Speaker(Person):\r\n job: str \r\n\r\n\r\[email protected]\r\nclass Attendee(Person):\r\n interests: List[str]\r\n\r\n\r\ndef get_people_by_name(name: str): \r\n return []\r\n\r\n\r\[email protected]\r\nclass Query:\r\n searchPeopleByName: List[Union[Speaker, Attendee]] = field(resolver=get_people_by_name)\r\n\r\nschema = strawberry.Schema(query=Query)\r\n````\r\n**Full traceback:**\r\n```\r\n File \"/mnt/c/Users/<User>/code/nerdearla/test_app.py\", line 30, in <module>\r\n schema = strawberry.Schema(query=Query)\r\n File \"/home/crow/.virtualenvs/venv/lib/python3.8/site-packages/strawberry/schema/schema.py\", line 42, in __init__\r\n self._schema = GraphQLSchema(\r\n File \"/home/crow/.virtualenvs/venv/lib/python3.8/site-packages/graphql/type/schema.py\", line 240, in __init__\r\n raise TypeError(\r\nTypeError: Schema must contain uniquely named types but contains multiple types named 'Person'.\r\n```\n", "before_files": [{"content": "from typing import Type, cast\n\nfrom graphql import GraphQLInputObjectType, GraphQLObjectType\nfrom graphql.type.definition import GraphQLInterfaceType\nfrom strawberry.type import TypeDefinition\n\nfrom .fields import get_field\nfrom .types import ConcreteType, GraphQLType, TypeMap\n\n\ndef _get_object_type_for_type_definition(\n type_definition: TypeDefinition, type_map: TypeMap\n) -> GraphQLType:\n\n TypeClass: Type = GraphQLObjectType\n\n kwargs = {}\n\n if type_definition.is_input:\n TypeClass = GraphQLInputObjectType\n elif type_definition.is_interface:\n TypeClass = GraphQLInterfaceType\n\n if type_definition.interfaces:\n kwargs[\"interfaces\"] = [\n _get_object_type_for_type_definition(interface, type_map)\n for interface in type_definition.interfaces\n ]\n\n assert not type_definition.is_generic\n\n return TypeClass(\n name=type_definition.name,\n fields=lambda: {\n field.name: get_field(field, type_definition.is_input, type_map)\n for field in type_definition.fields\n },\n description=type_definition.description,\n **kwargs,\n )\n\n\ndef get_object_type(origin: Type, type_map: TypeMap) -> GraphQLObjectType:\n \"\"\"Returns a root type (Query, Mutation, Subscription) from a decorated type\"\"\"\n\n if not hasattr(origin, \"_type_definition\"):\n raise ValueError(f\"Wrong type passed to get object type {origin}\")\n\n type_definition: TypeDefinition = origin._type_definition\n\n name = type_definition.name\n\n if name not in type_map:\n object_type = _get_object_type_for_type_definition(type_definition, type_map)\n\n type_map[name] = ConcreteType(\n definition=type_definition, implementation=object_type\n )\n\n return cast(GraphQLObjectType, type_map[name].implementation)\n", "path": "strawberry/schema/types/object_type.py"}], "after_files": [{"content": "from typing import Type, cast\n\nfrom graphql import GraphQLInputObjectType, GraphQLObjectType\nfrom graphql.type.definition import GraphQLInterfaceType\nfrom strawberry.type import TypeDefinition\n\nfrom .fields import get_field\nfrom .types import ConcreteType, GraphQLType, TypeMap\n\n\ndef _get_object_type_for_type_definition(\n type_definition: TypeDefinition, type_map: TypeMap\n) -> GraphQLType:\n\n if 
type_definition.name not in type_map:\n TypeClass: Type = GraphQLObjectType\n\n kwargs = {}\n\n if type_definition.is_input:\n TypeClass = GraphQLInputObjectType\n elif type_definition.is_interface:\n TypeClass = GraphQLInterfaceType\n\n if type_definition.interfaces:\n kwargs[\"interfaces\"] = [\n _get_object_type_for_type_definition(interface, type_map)\n for interface in type_definition.interfaces\n ]\n # this tells GraphQL core what the returned object's actual type is\n kwargs[\"is_type_of\"] = lambda obj, _: isinstance( # type: ignore\n obj, type_definition.origin\n )\n\n assert not type_definition.is_generic\n\n object_type = TypeClass(\n name=type_definition.name,\n fields=lambda: {\n field.name: get_field(field, type_definition.is_input, type_map)\n for field in type_definition.fields\n },\n description=type_definition.description,\n **kwargs,\n )\n\n type_map[type_definition.name] = ConcreteType(\n definition=type_definition, implementation=object_type\n )\n\n return type_map[type_definition.name].implementation\n\n\ndef get_object_type(origin: Type, type_map: TypeMap) -> GraphQLObjectType:\n \"\"\"Returns a root type (Query, Mutation, Subscription) from a decorated type\"\"\"\n\n if not hasattr(origin, \"_type_definition\"):\n raise ValueError(f\"Wrong type passed to get object type {origin}\")\n\n type_definition: TypeDefinition = origin._type_definition\n\n return cast(\n GraphQLObjectType,\n _get_object_type_for_type_definition(type_definition, type_map),\n )\n", "path": "strawberry/schema/types/object_type.py"}]}
| 1,148 | 670 |
gh_patches_debug_26521
|
rasdani/github-patches
|
git_diff
|
internetarchive__openlibrary-7718
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Publisher search endpoint solr performance
<!-- What problem are we solving? What does the experience look like today? What are the symptoms? -->
Our /search/publishers endpoint is doing a strange roll-up and submitting many solr select queries causing performance issues. Solution presumably is to not make more than 1 solr query on /search/publishers.
### Proposal
Change the backend call for /search/publishers to make a single query to solr `publisher:(...)` query.
### Evidence / Screenshot (if possible)

<img width="775" alt="Screenshot 2023-03-23 at 12 18 55 PM" src="https://user-images.githubusercontent.com/978325/227324919-d19b91c5-d19b-4746-9908-43e0f7cf1cbd.png">
### Relevant url?
<!-- `https://openlibrary.org/...` -->
http://testing.openlibrary.org/search/publishers?q=Black%20Dolls%20And%20White%20Dolls%20From%201940%20Through%201970%3A%20Their%20Impact%20Then%20On%20Black%20And%20White%20Children%27s%20Development%20
### Related files
<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->
https://github.com/internetarchive/openlibrary/blob/b897c8c51a79308e38f9825fac82864a5cc7d3ae/openlibrary/plugins/worksearch/publishers.py#L82
### Stakeholders
<!-- @ tag stakeholders of this bug -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `openlibrary/plugins/worksearch/publishers.py`
Content:
```
1 """Publisher pages
2 """
3 from infogami.utils import delegate, stats
4 from infogami.utils.view import render_template, safeint
5 import web
6 import logging
7
8 from . import subjects
9 from . import search
10
11 logger = logging.getLogger("openlibrary.worksearch")
12
13
14 class publishers(subjects.subjects):
15 path = '(/publishers/[^/]+)'
16
17 def GET(self, key):
18 key = key.replace("_", " ")
19 page = subjects.get_subject(key, details=True)
20
21 if not page or page.work_count == 0:
22 web.ctx.status = "404 Not Found"
23 return render_template('publishers/notfound.tmpl', key)
24
25 return render_template("publishers/view", page)
26
27 def is_enabled(self):
28 return "publishers" in web.ctx.features
29
30
31 class publishers_json(subjects.subjects_json):
32 path = '(/publishers/[^/]+)'
33 encoding = "json"
34
35 def is_enabled(self):
36 return "publishers" in web.ctx.features
37
38 def normalize_key(self, key):
39 return key
40
41 def process_key(self, key):
42 return key.replace("_", " ")
43
44
45 class index(delegate.page):
46 path = "/publishers"
47
48 def GET(self):
49 return render_template("publishers/index")
50
51 def is_enabled(self):
52 return "publishers" in web.ctx.features
53
54
55 class publisher_search(delegate.page):
56 path = '/search/publishers'
57
58 def GET(self):
59 i = web.input(q="")
60 solr = search.get_solr()
61 q = {"publisher": i.q}
62
63 result = solr.select(
64 q,
65 facets=["publisher_facet"],
66 fields=["publisher", "publisher_facet"],
67 rows=0,
68 )
69 result = self.process_result(result)
70 return render_template('search/publishers', i.q, result)
71
72 def process_result(self, result):
73 solr = search.get_solr()
74
75 def process(p):
76 return web.storage(
77 name=p.value,
78 key="/publishers/" + p.value.replace(" ", "_"),
79 count=solr.select({"publisher_facet": p.value}, rows=0)['num_found'],
80 )
81
82 publisher_facets = result['facets']['publisher_facet'][:25]
83 return [process(p) for p in publisher_facets]
84
85
86 class PublisherEngine(subjects.SubjectEngine):
87 def normalize_key(self, key):
88 return key
89
90 def get_ebook_count(self, name, value, publish_year):
91 # Query solr for this publish_year and publish_year combination and read the has_fulltext=true facet
92 solr = search.get_solr()
93 q = {"publisher_facet": value}
94
95 if isinstance(publish_year, list):
96 q['publish_year'] = tuple(publish_year) # range
97 elif publish_year:
98 q['publish_year'] = publish_year
99
100 result = solr.select(q, facets=["has_fulltext"], rows=0)
101 counts = {v.value: v.count for v in result["facets"]["has_fulltext"]}
102 return counts.get('true')
103
104
105 def setup():
106 subjects.SUBJECTS.append(
107 subjects.SubjectMeta(
108 name="publisher",
109 key="publishers",
110 prefix="/publishers/",
111 facet="publisher_facet",
112 facet_key="publisher_facet",
113 Engine=PublisherEngine,
114 )
115 )
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/openlibrary/plugins/worksearch/publishers.py b/openlibrary/plugins/worksearch/publishers.py
--- a/openlibrary/plugins/worksearch/publishers.py
+++ b/openlibrary/plugins/worksearch/publishers.py
@@ -57,30 +57,28 @@
def GET(self):
i = web.input(q="")
- solr = search.get_solr()
- q = {"publisher": i.q}
-
- result = solr.select(
- q,
+ result = search.get_solr().select(
+ {"publisher": i.q, "type": "work"},
facets=["publisher_facet"],
- fields=["publisher", "publisher_facet"],
+ facet_mincount=1,
+ facet_limit=25,
+ facet_contains=i.q,
+ facet_contains_ignoreCase='true',
rows=0,
)
result = self.process_result(result)
return render_template('search/publishers', i.q, result)
def process_result(self, result):
- solr = search.get_solr()
-
- def process(p):
- return web.storage(
+ publisher_facets = result['facets']['publisher_facet']
+ return [
+ web.storage(
name=p.value,
key="/publishers/" + p.value.replace(" ", "_"),
- count=solr.select({"publisher_facet": p.value}, rows=0)['num_found'],
+ count=p.count,
)
-
- publisher_facets = result['facets']['publisher_facet'][:25]
- return [process(p) for p in publisher_facets]
+ for p in publisher_facets
+ ]
class PublisherEngine(subjects.SubjectEngine):
|
{"golden_diff": "diff --git a/openlibrary/plugins/worksearch/publishers.py b/openlibrary/plugins/worksearch/publishers.py\n--- a/openlibrary/plugins/worksearch/publishers.py\n+++ b/openlibrary/plugins/worksearch/publishers.py\n@@ -57,30 +57,28 @@\n \n def GET(self):\n i = web.input(q=\"\")\n- solr = search.get_solr()\n- q = {\"publisher\": i.q}\n-\n- result = solr.select(\n- q,\n+ result = search.get_solr().select(\n+ {\"publisher\": i.q, \"type\": \"work\"},\n facets=[\"publisher_facet\"],\n- fields=[\"publisher\", \"publisher_facet\"],\n+ facet_mincount=1,\n+ facet_limit=25,\n+ facet_contains=i.q,\n+ facet_contains_ignoreCase='true',\n rows=0,\n )\n result = self.process_result(result)\n return render_template('search/publishers', i.q, result)\n \n def process_result(self, result):\n- solr = search.get_solr()\n-\n- def process(p):\n- return web.storage(\n+ publisher_facets = result['facets']['publisher_facet']\n+ return [\n+ web.storage(\n name=p.value,\n key=\"/publishers/\" + p.value.replace(\" \", \"_\"),\n- count=solr.select({\"publisher_facet\": p.value}, rows=0)['num_found'],\n+ count=p.count,\n )\n-\n- publisher_facets = result['facets']['publisher_facet'][:25]\n- return [process(p) for p in publisher_facets]\n+ for p in publisher_facets\n+ ]\n \n \n class PublisherEngine(subjects.SubjectEngine):\n", "issue": "Publisher search endpoint solr performance\n<!-- What problem are we solving? What does the experience look like today? What are the symptoms? -->\r\n\r\nOur /search/publishers endpoint is doing a strange roll-up and submitting many solr select queries causing performance issues. Solution presumably is to not make more than 1 solr query on /search/publishers.\r\n\r\n### Proposal\r\n\r\nChange the backend call for /search/publishers to make a single query to solr `publisher:(...)` query.\r\n\r\n### Evidence / Screenshot (if possible)\r\n\r\n<img width=\"775\" alt=\"Screenshot 2023-03-23 at 12 18 55 PM\" src=\"https://user-images.githubusercontent.com/978325/227324919-d19b91c5-d19b-4746-9908-43e0f7cf1cbd.png\">\r\n\r\n### Relevant url?\r\n<!-- `https://openlibrary.org/...` -->\r\n\r\nhttp://testing.openlibrary.org/search/publishers?q=Black%20Dolls%20And%20White%20Dolls%20From%201940%20Through%201970%3A%20Their%20Impact%20Then%20On%20Black%20And%20White%20Children%27s%20Development%20\r\n\r\n### Related files\r\n<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->\r\n\r\nhttps://github.com/internetarchive/openlibrary/blob/b897c8c51a79308e38f9825fac82864a5cc7d3ae/openlibrary/plugins/worksearch/publishers.py#L82\r\n\r\n### Stakeholders\r\n<!-- @ tag stakeholders of this bug -->\r\n\n", "before_files": [{"content": "\"\"\"Publisher pages\n\"\"\"\nfrom infogami.utils import delegate, stats\nfrom infogami.utils.view import render_template, safeint\nimport web\nimport logging\n\nfrom . import subjects\nfrom . 
import search\n\nlogger = logging.getLogger(\"openlibrary.worksearch\")\n\n\nclass publishers(subjects.subjects):\n path = '(/publishers/[^/]+)'\n\n def GET(self, key):\n key = key.replace(\"_\", \" \")\n page = subjects.get_subject(key, details=True)\n\n if not page or page.work_count == 0:\n web.ctx.status = \"404 Not Found\"\n return render_template('publishers/notfound.tmpl', key)\n\n return render_template(\"publishers/view\", page)\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n\nclass publishers_json(subjects.subjects_json):\n path = '(/publishers/[^/]+)'\n encoding = \"json\"\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n def normalize_key(self, key):\n return key\n\n def process_key(self, key):\n return key.replace(\"_\", \" \")\n\n\nclass index(delegate.page):\n path = \"/publishers\"\n\n def GET(self):\n return render_template(\"publishers/index\")\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n\nclass publisher_search(delegate.page):\n path = '/search/publishers'\n\n def GET(self):\n i = web.input(q=\"\")\n solr = search.get_solr()\n q = {\"publisher\": i.q}\n\n result = solr.select(\n q,\n facets=[\"publisher_facet\"],\n fields=[\"publisher\", \"publisher_facet\"],\n rows=0,\n )\n result = self.process_result(result)\n return render_template('search/publishers', i.q, result)\n\n def process_result(self, result):\n solr = search.get_solr()\n\n def process(p):\n return web.storage(\n name=p.value,\n key=\"/publishers/\" + p.value.replace(\" \", \"_\"),\n count=solr.select({\"publisher_facet\": p.value}, rows=0)['num_found'],\n )\n\n publisher_facets = result['facets']['publisher_facet'][:25]\n return [process(p) for p in publisher_facets]\n\n\nclass PublisherEngine(subjects.SubjectEngine):\n def normalize_key(self, key):\n return key\n\n def get_ebook_count(self, name, value, publish_year):\n # Query solr for this publish_year and publish_year combination and read the has_fulltext=true facet\n solr = search.get_solr()\n q = {\"publisher_facet\": value}\n\n if isinstance(publish_year, list):\n q['publish_year'] = tuple(publish_year) # range\n elif publish_year:\n q['publish_year'] = publish_year\n\n result = solr.select(q, facets=[\"has_fulltext\"], rows=0)\n counts = {v.value: v.count for v in result[\"facets\"][\"has_fulltext\"]}\n return counts.get('true')\n\n\ndef setup():\n subjects.SUBJECTS.append(\n subjects.SubjectMeta(\n name=\"publisher\",\n key=\"publishers\",\n prefix=\"/publishers/\",\n facet=\"publisher_facet\",\n facet_key=\"publisher_facet\",\n Engine=PublisherEngine,\n )\n )\n", "path": "openlibrary/plugins/worksearch/publishers.py"}], "after_files": [{"content": "\"\"\"Publisher pages\n\"\"\"\nfrom infogami.utils import delegate, stats\nfrom infogami.utils.view import render_template, safeint\nimport web\nimport logging\n\nfrom . import subjects\nfrom . 
import search\n\nlogger = logging.getLogger(\"openlibrary.worksearch\")\n\n\nclass publishers(subjects.subjects):\n path = '(/publishers/[^/]+)'\n\n def GET(self, key):\n key = key.replace(\"_\", \" \")\n page = subjects.get_subject(key, details=True)\n\n if not page or page.work_count == 0:\n web.ctx.status = \"404 Not Found\"\n return render_template('publishers/notfound.tmpl', key)\n\n return render_template(\"publishers/view\", page)\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n\nclass publishers_json(subjects.subjects_json):\n path = '(/publishers/[^/]+)'\n encoding = \"json\"\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n def normalize_key(self, key):\n return key\n\n def process_key(self, key):\n return key.replace(\"_\", \" \")\n\n\nclass index(delegate.page):\n path = \"/publishers\"\n\n def GET(self):\n return render_template(\"publishers/index\")\n\n def is_enabled(self):\n return \"publishers\" in web.ctx.features\n\n\nclass publisher_search(delegate.page):\n path = '/search/publishers'\n\n def GET(self):\n i = web.input(q=\"\")\n result = search.get_solr().select(\n {\"publisher\": i.q, \"type\": \"work\"},\n facets=[\"publisher_facet\"],\n facet_mincount=1,\n facet_limit=25,\n facet_contains=i.q,\n facet_contains_ignoreCase='true',\n rows=0,\n )\n result = self.process_result(result)\n return render_template('search/publishers', i.q, result)\n\n def process_result(self, result):\n publisher_facets = result['facets']['publisher_facet']\n return [\n web.storage(\n name=p.value,\n key=\"/publishers/\" + p.value.replace(\" \", \"_\"),\n count=p.count,\n )\n for p in publisher_facets\n ]\n\n\nclass PublisherEngine(subjects.SubjectEngine):\n def normalize_key(self, key):\n return key\n\n def get_ebook_count(self, name, value, publish_year):\n # Query solr for this publish_year and publish_year combination and read the has_fulltext=true facet\n solr = search.get_solr()\n q = {\"publisher_facet\": value}\n\n if isinstance(publish_year, list):\n q['publish_year'] = tuple(publish_year) # range\n elif publish_year:\n q['publish_year'] = publish_year\n\n result = solr.select(q, facets=[\"has_fulltext\"], rows=0)\n counts = {v.value: v.count for v in result[\"facets\"][\"has_fulltext\"]}\n return counts.get('true')\n\n\ndef setup():\n subjects.SUBJECTS.append(\n subjects.SubjectMeta(\n name=\"publisher\",\n key=\"publishers\",\n prefix=\"/publishers/\",\n facet=\"publisher_facet\",\n facet_key=\"publisher_facet\",\n Engine=PublisherEngine,\n )\n )\n", "path": "openlibrary/plugins/worksearch/publishers.py"}]}
| 1,729 | 373 |
gh_patches_debug_18174
|
rasdani/github-patches
|
git_diff
|
ephios-dev__ephios-82
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Redirect anonymous users to login view instead of raising 403
this also raises 403 if users are not logged in. this is not what we want
_Originally posted by @jeriox in https://github.com/jeriox/jep/pull/48#discussion_r479789720_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `jep/permissions.py`
Content:
```
1 import guardian.mixins
2 from django.contrib.auth.models import Permission, Group
3 from guardian.ctypes import get_content_type
4 from guardian.utils import get_group_obj_perms_model
5
6
7 def get_groups_with_perms(obj, only_with_perms_in):
8
9 ctype = get_content_type(obj)
10 group_model = get_group_obj_perms_model(obj)
11
12 group_rel_name = group_model.group.field.related_query_name()
13
14 if group_model.objects.is_generic():
15 group_filters = {
16 "%s__content_type" % group_rel_name: ctype,
17 "%s__object_pk" % group_rel_name: obj.pk,
18 }
19 else:
20 group_filters = {"%s__content_object" % group_rel_name: obj}
21
22 permission_ids = Permission.objects.filter(
23 content_type=ctype, codename__in=only_with_perms_in
24 ).values_list("id", flat=True)
25 group_filters.update(
26 {"%s__permission_id__in" % group_rel_name: permission_ids,}
27 )
28 return Group.objects.filter(**group_filters).distinct()
29
30
31 class CustomPermissionRequiredMixin(guardian.mixins.PermissionRequiredMixin):
32 raise_exception = True
33 accept_global_perms = True
34
35 # FIXME redirect non logged in users and raise Permission for others
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/jep/permissions.py b/jep/permissions.py
--- a/jep/permissions.py
+++ b/jep/permissions.py
@@ -1,8 +1,12 @@
import guardian.mixins
+from django.contrib.auth import REDIRECT_FIELD_NAME
from django.contrib.auth.models import Permission, Group
+from django.contrib.auth.views import redirect_to_login
from guardian.ctypes import get_content_type
from guardian.utils import get_group_obj_perms_model
+from jep import settings
+
def get_groups_with_perms(obj, only_with_perms_in):
@@ -32,4 +36,10 @@
raise_exception = True
accept_global_perms = True
- # FIXME redirect non logged in users and raise Permission for others
+ def on_permission_check_fail(self, request, response, obj=None):
+ if request.user.is_authenticated:
+ return response
+ else:
+ return redirect_to_login(
+ self.request.get_full_path(), settings.LOGIN_URL, REDIRECT_FIELD_NAME
+ )
|
{"golden_diff": "diff --git a/jep/permissions.py b/jep/permissions.py\n--- a/jep/permissions.py\n+++ b/jep/permissions.py\n@@ -1,8 +1,12 @@\n import guardian.mixins\n+from django.contrib.auth import REDIRECT_FIELD_NAME\n from django.contrib.auth.models import Permission, Group\n+from django.contrib.auth.views import redirect_to_login\n from guardian.ctypes import get_content_type\n from guardian.utils import get_group_obj_perms_model\n \n+from jep import settings\n+\n \n def get_groups_with_perms(obj, only_with_perms_in):\n \n@@ -32,4 +36,10 @@\n raise_exception = True\n accept_global_perms = True\n \n- # FIXME redirect non logged in users and raise Permission for others\n+ def on_permission_check_fail(self, request, response, obj=None):\n+ if request.user.is_authenticated:\n+ return response\n+ else:\n+ return redirect_to_login(\n+ self.request.get_full_path(), settings.LOGIN_URL, REDIRECT_FIELD_NAME\n+ )\n", "issue": "Redirect anonymous users to login view instead of raising 403\nthis also raises 403 if users are not logged in. this is not what we want\r\n\r\n_Originally posted by @jeriox in https://github.com/jeriox/jep/pull/48#discussion_r479789720_\n", "before_files": [{"content": "import guardian.mixins\nfrom django.contrib.auth.models import Permission, Group\nfrom guardian.ctypes import get_content_type\nfrom guardian.utils import get_group_obj_perms_model\n\n\ndef get_groups_with_perms(obj, only_with_perms_in):\n\n ctype = get_content_type(obj)\n group_model = get_group_obj_perms_model(obj)\n\n group_rel_name = group_model.group.field.related_query_name()\n\n if group_model.objects.is_generic():\n group_filters = {\n \"%s__content_type\" % group_rel_name: ctype,\n \"%s__object_pk\" % group_rel_name: obj.pk,\n }\n else:\n group_filters = {\"%s__content_object\" % group_rel_name: obj}\n\n permission_ids = Permission.objects.filter(\n content_type=ctype, codename__in=only_with_perms_in\n ).values_list(\"id\", flat=True)\n group_filters.update(\n {\"%s__permission_id__in\" % group_rel_name: permission_ids,}\n )\n return Group.objects.filter(**group_filters).distinct()\n\n\nclass CustomPermissionRequiredMixin(guardian.mixins.PermissionRequiredMixin):\n raise_exception = True\n accept_global_perms = True\n\n # FIXME redirect non logged in users and raise Permission for others\n", "path": "jep/permissions.py"}], "after_files": [{"content": "import guardian.mixins\nfrom django.contrib.auth import REDIRECT_FIELD_NAME\nfrom django.contrib.auth.models import Permission, Group\nfrom django.contrib.auth.views import redirect_to_login\nfrom guardian.ctypes import get_content_type\nfrom guardian.utils import get_group_obj_perms_model\n\nfrom jep import settings\n\n\ndef get_groups_with_perms(obj, only_with_perms_in):\n\n ctype = get_content_type(obj)\n group_model = get_group_obj_perms_model(obj)\n\n group_rel_name = group_model.group.field.related_query_name()\n\n if group_model.objects.is_generic():\n group_filters = {\n \"%s__content_type\" % group_rel_name: ctype,\n \"%s__object_pk\" % group_rel_name: obj.pk,\n }\n else:\n group_filters = {\"%s__content_object\" % group_rel_name: obj}\n\n permission_ids = Permission.objects.filter(\n content_type=ctype, codename__in=only_with_perms_in\n ).values_list(\"id\", flat=True)\n group_filters.update(\n {\"%s__permission_id__in\" % group_rel_name: permission_ids,}\n )\n return Group.objects.filter(**group_filters).distinct()\n\n\nclass CustomPermissionRequiredMixin(guardian.mixins.PermissionRequiredMixin):\n raise_exception = True\n 
accept_global_perms = True\n\n def on_permission_check_fail(self, request, response, obj=None):\n if request.user.is_authenticated:\n return response\n else:\n return redirect_to_login(\n self.request.get_full_path(), settings.LOGIN_URL, REDIRECT_FIELD_NAME\n )\n", "path": "jep/permissions.py"}]}
| 654 | 222 |
gh_patches_debug_15802
|
rasdani/github-patches
|
git_diff
|
lutris__lutris-1179
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Have logger scroll automatically only when at the bottom
Currently the logger scrolls whenever it outputs which makes scrolling up useless unless the game is stopped. This behavior is annoying.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lutris/gui/logwindow.py`
Content:
```
1 from gi.repository import Gtk
2 from lutris.gui.widgets.dialogs import Dialog
3
4
5 class LogTextView(Gtk.TextView):
6 def __init__(self, buffer):
7 super(LogTextView, self).__init__()
8
9 self.set_buffer(buffer)
10 self.set_editable(False)
11 self.set_monospace(True)
12 self.set_left_margin(10)
13 self.set_wrap_mode(Gtk.WrapMode.CHAR)
14 self.get_style_context().add_class('lutris-logview')
15 self.connect("size-allocate", self.autoscroll)
16
17 def autoscroll(self, *args):
18 adj = self.get_vadjustment()
19 adj.set_value(adj.get_upper() - adj.get_page_size())
20
21
22 class LogWindow(Dialog):
23 def __init__(self, title, buffer, parent):
24 super(LogWindow, self).__init__(title, parent, 0,
25 ('_OK', Gtk.ResponseType.OK))
26 self.set_size_request(640, 480)
27 self.grid = Gtk.Grid()
28 self.buffer = buffer
29 self.logtextview = LogTextView(self.buffer)
30
31 scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,
32 child=self.logtextview)
33 self.vbox.add(scrolledwindow)
34 self.show_all()
35
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lutris/gui/logwindow.py b/lutris/gui/logwindow.py
--- a/lutris/gui/logwindow.py
+++ b/lutris/gui/logwindow.py
@@ -10,13 +10,16 @@
self.set_editable(False)
self.set_monospace(True)
self.set_left_margin(10)
+ self.scroll_max = 0
self.set_wrap_mode(Gtk.WrapMode.CHAR)
self.get_style_context().add_class('lutris-logview')
self.connect("size-allocate", self.autoscroll)
def autoscroll(self, *args):
adj = self.get_vadjustment()
- adj.set_value(adj.get_upper() - adj.get_page_size())
+ if adj.get_value() == self.scroll_max or self.scroll_max == 0:
+ adj.set_value(adj.get_upper() - adj.get_page_size())
+ self.scroll_max = adj.get_upper() - adj.get_page_size()
class LogWindow(Dialog):
|
{"golden_diff": "diff --git a/lutris/gui/logwindow.py b/lutris/gui/logwindow.py\n--- a/lutris/gui/logwindow.py\n+++ b/lutris/gui/logwindow.py\n@@ -10,13 +10,16 @@\n self.set_editable(False)\n self.set_monospace(True)\n self.set_left_margin(10)\n+ self.scroll_max = 0\n self.set_wrap_mode(Gtk.WrapMode.CHAR)\n self.get_style_context().add_class('lutris-logview')\n self.connect(\"size-allocate\", self.autoscroll)\n \n def autoscroll(self, *args):\n adj = self.get_vadjustment()\n- adj.set_value(adj.get_upper() - adj.get_page_size())\n+ if adj.get_value() == self.scroll_max or self.scroll_max == 0:\n+ adj.set_value(adj.get_upper() - adj.get_page_size())\n+ self.scroll_max = adj.get_upper() - adj.get_page_size()\n \n \n class LogWindow(Dialog):\n", "issue": "Have logger scroll automatically only when at the bottom\nCurrently the logger scrolls whenever it outputs which makes scrolling up useless unless the game is stopped. This behavior is annoying.\n", "before_files": [{"content": "from gi.repository import Gtk\nfrom lutris.gui.widgets.dialogs import Dialog\n\n\nclass LogTextView(Gtk.TextView):\n def __init__(self, buffer):\n super(LogTextView, self).__init__()\n\n self.set_buffer(buffer)\n self.set_editable(False)\n self.set_monospace(True)\n self.set_left_margin(10)\n self.set_wrap_mode(Gtk.WrapMode.CHAR)\n self.get_style_context().add_class('lutris-logview')\n self.connect(\"size-allocate\", self.autoscroll)\n\n def autoscroll(self, *args):\n adj = self.get_vadjustment()\n adj.set_value(adj.get_upper() - adj.get_page_size())\n\n\nclass LogWindow(Dialog):\n def __init__(self, title, buffer, parent):\n super(LogWindow, self).__init__(title, parent, 0,\n ('_OK', Gtk.ResponseType.OK))\n self.set_size_request(640, 480)\n self.grid = Gtk.Grid()\n self.buffer = buffer\n self.logtextview = LogTextView(self.buffer)\n\n scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,\n child=self.logtextview)\n self.vbox.add(scrolledwindow)\n self.show_all()\n", "path": "lutris/gui/logwindow.py"}], "after_files": [{"content": "from gi.repository import Gtk\nfrom lutris.gui.widgets.dialogs import Dialog\n\n\nclass LogTextView(Gtk.TextView):\n def __init__(self, buffer):\n super(LogTextView, self).__init__()\n\n self.set_buffer(buffer)\n self.set_editable(False)\n self.set_monospace(True)\n self.set_left_margin(10)\n self.scroll_max = 0\n self.set_wrap_mode(Gtk.WrapMode.CHAR)\n self.get_style_context().add_class('lutris-logview')\n self.connect(\"size-allocate\", self.autoscroll)\n\n def autoscroll(self, *args):\n adj = self.get_vadjustment()\n if adj.get_value() == self.scroll_max or self.scroll_max == 0:\n adj.set_value(adj.get_upper() - adj.get_page_size())\n self.scroll_max = adj.get_upper() - adj.get_page_size()\n\n\nclass LogWindow(Dialog):\n def __init__(self, title, buffer, parent):\n super(LogWindow, self).__init__(title, parent, 0,\n ('_OK', Gtk.ResponseType.OK))\n self.set_size_request(640, 480)\n self.grid = Gtk.Grid()\n self.buffer = buffer\n self.logtextview = LogTextView(self.buffer)\n\n scrolledwindow = Gtk.ScrolledWindow(hexpand=True, vexpand=True,\n child=self.logtextview)\n self.vbox.add(scrolledwindow)\n self.show_all()\n", "path": "lutris/gui/logwindow.py"}]}
| 620 | 214 |
gh_patches_debug_16370
|
rasdani/github-patches
|
git_diff
|
open-mmlab__mmaction2-676
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Localizer train cfg & test cfg ?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `configs/_base_/models/bsn_tem.py`
Content:
```
1 # model settings
2 model = dict(
3 type='TEM',
4 temporal_dim=100,
5 boundary_ratio=0.1,
6 tem_feat_dim=400,
7 tem_hidden_dim=512,
8 tem_match_threshold=0.5)
9 # model training and testing settings
10 train_cfg = None
11 test_cfg = dict(average_clips='score')
12
```
Path: `configs/_base_/models/bsn_pem.py`
Content:
```
1 # model settings
2 model = dict(
3 type='PEM',
4 pem_feat_dim=32,
5 pem_hidden_dim=256,
6 pem_u_ratio_m=1,
7 pem_u_ratio_l=2,
8 pem_high_temporal_iou_threshold=0.6,
9 pem_low_temporal_iou_threshold=0.2,
10 soft_nms_alpha=0.75,
11 soft_nms_low_threshold=0.65,
12 soft_nms_high_threshold=0.9,
13 post_process_top_k=100)
14 # model training and testing settings
15 train_cfg = None
16 test_cfg = dict(average_clips='score')
17
```
Path: `configs/_base_/models/bmn_400x100.py`
Content:
```
1 # model settings
2 model = dict(
3 type='BMN',
4 temporal_dim=100,
5 boundary_ratio=0.5,
6 num_samples=32,
7 num_samples_per_bin=3,
8 feat_dim=400,
9 soft_nms_alpha=0.4,
10 soft_nms_low_threshold=0.5,
11 soft_nms_high_threshold=0.9,
12 post_process_top_k=100)
13 # model training and testing settings
14 train_cfg = None
15 test_cfg = dict(average_clips='score')
16
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/configs/_base_/models/bmn_400x100.py b/configs/_base_/models/bmn_400x100.py
--- a/configs/_base_/models/bmn_400x100.py
+++ b/configs/_base_/models/bmn_400x100.py
@@ -10,6 +10,3 @@
soft_nms_low_threshold=0.5,
soft_nms_high_threshold=0.9,
post_process_top_k=100)
-# model training and testing settings
-train_cfg = None
-test_cfg = dict(average_clips='score')
diff --git a/configs/_base_/models/bsn_pem.py b/configs/_base_/models/bsn_pem.py
--- a/configs/_base_/models/bsn_pem.py
+++ b/configs/_base_/models/bsn_pem.py
@@ -11,6 +11,3 @@
soft_nms_low_threshold=0.65,
soft_nms_high_threshold=0.9,
post_process_top_k=100)
-# model training and testing settings
-train_cfg = None
-test_cfg = dict(average_clips='score')
diff --git a/configs/_base_/models/bsn_tem.py b/configs/_base_/models/bsn_tem.py
--- a/configs/_base_/models/bsn_tem.py
+++ b/configs/_base_/models/bsn_tem.py
@@ -6,6 +6,3 @@
tem_feat_dim=400,
tem_hidden_dim=512,
tem_match_threshold=0.5)
-# model training and testing settings
-train_cfg = None
-test_cfg = dict(average_clips='score')
|
{"golden_diff": "diff --git a/configs/_base_/models/bmn_400x100.py b/configs/_base_/models/bmn_400x100.py\n--- a/configs/_base_/models/bmn_400x100.py\n+++ b/configs/_base_/models/bmn_400x100.py\n@@ -10,6 +10,3 @@\n soft_nms_low_threshold=0.5,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n-# model training and testing settings\n-train_cfg = None\n-test_cfg = dict(average_clips='score')\ndiff --git a/configs/_base_/models/bsn_pem.py b/configs/_base_/models/bsn_pem.py\n--- a/configs/_base_/models/bsn_pem.py\n+++ b/configs/_base_/models/bsn_pem.py\n@@ -11,6 +11,3 @@\n soft_nms_low_threshold=0.65,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n-# model training and testing settings\n-train_cfg = None\n-test_cfg = dict(average_clips='score')\ndiff --git a/configs/_base_/models/bsn_tem.py b/configs/_base_/models/bsn_tem.py\n--- a/configs/_base_/models/bsn_tem.py\n+++ b/configs/_base_/models/bsn_tem.py\n@@ -6,6 +6,3 @@\n tem_feat_dim=400,\n tem_hidden_dim=512,\n tem_match_threshold=0.5)\n-# model training and testing settings\n-train_cfg = None\n-test_cfg = dict(average_clips='score')\n", "issue": "Localizer train cfg & test cfg ?\n\n", "before_files": [{"content": "# model settings\nmodel = dict(\n type='TEM',\n temporal_dim=100,\n boundary_ratio=0.1,\n tem_feat_dim=400,\n tem_hidden_dim=512,\n tem_match_threshold=0.5)\n# model training and testing settings\ntrain_cfg = None\ntest_cfg = dict(average_clips='score')\n", "path": "configs/_base_/models/bsn_tem.py"}, {"content": "# model settings\nmodel = dict(\n type='PEM',\n pem_feat_dim=32,\n pem_hidden_dim=256,\n pem_u_ratio_m=1,\n pem_u_ratio_l=2,\n pem_high_temporal_iou_threshold=0.6,\n pem_low_temporal_iou_threshold=0.2,\n soft_nms_alpha=0.75,\n soft_nms_low_threshold=0.65,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n# model training and testing settings\ntrain_cfg = None\ntest_cfg = dict(average_clips='score')\n", "path": "configs/_base_/models/bsn_pem.py"}, {"content": "# model settings\nmodel = dict(\n type='BMN',\n temporal_dim=100,\n boundary_ratio=0.5,\n num_samples=32,\n num_samples_per_bin=3,\n feat_dim=400,\n soft_nms_alpha=0.4,\n soft_nms_low_threshold=0.5,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n# model training and testing settings\ntrain_cfg = None\ntest_cfg = dict(average_clips='score')\n", "path": "configs/_base_/models/bmn_400x100.py"}], "after_files": [{"content": "# model settings\nmodel = dict(\n type='TEM',\n temporal_dim=100,\n boundary_ratio=0.1,\n tem_feat_dim=400,\n tem_hidden_dim=512,\n tem_match_threshold=0.5)\n", "path": "configs/_base_/models/bsn_tem.py"}, {"content": "# model settings\nmodel = dict(\n type='PEM',\n pem_feat_dim=32,\n pem_hidden_dim=256,\n pem_u_ratio_m=1,\n pem_u_ratio_l=2,\n pem_high_temporal_iou_threshold=0.6,\n pem_low_temporal_iou_threshold=0.2,\n soft_nms_alpha=0.75,\n soft_nms_low_threshold=0.65,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n", "path": "configs/_base_/models/bsn_pem.py"}, {"content": "# model settings\nmodel = dict(\n type='BMN',\n temporal_dim=100,\n boundary_ratio=0.5,\n num_samples=32,\n num_samples_per_bin=3,\n feat_dim=400,\n soft_nms_alpha=0.4,\n soft_nms_low_threshold=0.5,\n soft_nms_high_threshold=0.9,\n post_process_top_k=100)\n", "path": "configs/_base_/models/bmn_400x100.py"}]}
| 722 | 397 |
gh_patches_debug_16280
|
rasdani/github-patches
|
git_diff
|
mirumee__ariadne-35
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
If value from resolver is callable, call it with **kwargs.
[Apollo doc](https://www.apollographql.com/docs/graphql-tools/resolvers) for default resolver says that if `field_name` resolves to function, it will be called with query arguments:
> Calls a function on obj with the relevant field name and passes the query arguments into that function
This can be useful for situations when parent resolver returned an object with getter functions.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ariadne/resolvers.py`
Content:
```
1 from graphql import GraphQLObjectType, GraphQLScalarType, GraphQLSchema
2 from graphql.execution.base import ResolveInfo
3
4
5 def resolve_parent_field(parent, name: str):
6 if isinstance(parent, dict):
7 return parent.get(name)
8 return getattr(parent, name, None)
9
10
11 def default_resolver(parent, info: ResolveInfo):
12 return resolve_parent_field(parent, info.field_name)
13
14
15 def resolve_to(name: str):
16 def resolver(parent, *_):
17 return resolve_parent_field(parent, name)
18
19 return resolver
20
21
22 def add_resolve_functions_to_schema(schema: GraphQLSchema, resolvers: dict):
23 for type_name, type_object in schema.get_type_map().items():
24 if isinstance(type_object, GraphQLObjectType):
25 add_resolve_functions_to_object(type_name, type_object, resolvers)
26 if isinstance(type_object, GraphQLScalarType):
27 add_resolve_functions_to_scalar(type_name, type_object, resolvers)
28
29
30 def add_resolve_functions_to_object(name: str, obj: GraphQLObjectType, resolvers: dict):
31 type_resolvers = resolvers.get(name, {})
32 for field_name, field_object in obj.fields.items():
33 field_resolver = type_resolvers.get(field_name)
34 if field_resolver:
35 field_object.resolver = field_resolver
36 elif field_object.resolver is None:
37 field_object.resolver = default_resolver
38
39
40 def add_resolve_functions_to_scalar(name: str, obj: GraphQLObjectType, resolvers: dict):
41 scalar_resolvers = resolvers.get(name, {})
42
43 serialize = scalar_resolvers.get("serialize", obj.serialize)
44 obj.serialize = serialize
45
46 parse_literal = scalar_resolvers.get("parse_literal", obj.parse_literal)
47 obj.parse_literal = parse_literal
48
49 parse_value = scalar_resolvers.get("parse_value", obj.parse_value)
50 obj.parse_value = parse_value
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ariadne/resolvers.py b/ariadne/resolvers.py
--- a/ariadne/resolvers.py
+++ b/ariadne/resolvers.py
@@ -2,19 +2,23 @@
from graphql.execution.base import ResolveInfo
-def resolve_parent_field(parent, name: str):
+def resolve_parent_field(parent, name: str, **kwargs: dict):
if isinstance(parent, dict):
- return parent.get(name)
- return getattr(parent, name, None)
+ value = parent.get(name)
+ else:
+ value = getattr(parent, name, None)
+ if callable(value):
+ return value(**kwargs)
+ return value
-def default_resolver(parent, info: ResolveInfo):
- return resolve_parent_field(parent, info.field_name)
+def default_resolver(parent, info: ResolveInfo, **kwargs):
+ return resolve_parent_field(parent, info.field_name, **kwargs)
def resolve_to(name: str):
- def resolver(parent, *_):
- return resolve_parent_field(parent, name)
+ def resolver(parent, *_, **kwargs):
+ return resolve_parent_field(parent, name, **kwargs)
return resolver
|
{"golden_diff": "diff --git a/ariadne/resolvers.py b/ariadne/resolvers.py\n--- a/ariadne/resolvers.py\n+++ b/ariadne/resolvers.py\n@@ -2,19 +2,23 @@\n from graphql.execution.base import ResolveInfo\n \n \n-def resolve_parent_field(parent, name: str):\n+def resolve_parent_field(parent, name: str, **kwargs: dict):\n if isinstance(parent, dict):\n- return parent.get(name)\n- return getattr(parent, name, None)\n+ value = parent.get(name)\n+ else:\n+ value = getattr(parent, name, None)\n+ if callable(value):\n+ return value(**kwargs)\n+ return value\n \n \n-def default_resolver(parent, info: ResolveInfo):\n- return resolve_parent_field(parent, info.field_name)\n+def default_resolver(parent, info: ResolveInfo, **kwargs):\n+ return resolve_parent_field(parent, info.field_name, **kwargs)\n \n \n def resolve_to(name: str):\n- def resolver(parent, *_):\n- return resolve_parent_field(parent, name)\n+ def resolver(parent, *_, **kwargs):\n+ return resolve_parent_field(parent, name, **kwargs)\n \n return resolver\n", "issue": "If value from resolver is callable, call it with **kwargs.\n[Apollo doc](https://www.apollographql.com/docs/graphql-tools/resolvers) for default resolver says that if `field_name` resolves to function, it will be called with query arguments:\r\n\r\n> Calls a function on obj with the relevant field name and passes the query arguments into that function\r\n\r\nThis can be useful for situations when parent resolver returned an object with getter functions.\n", "before_files": [{"content": "from graphql import GraphQLObjectType, GraphQLScalarType, GraphQLSchema\nfrom graphql.execution.base import ResolveInfo\n\n\ndef resolve_parent_field(parent, name: str):\n if isinstance(parent, dict):\n return parent.get(name)\n return getattr(parent, name, None)\n\n\ndef default_resolver(parent, info: ResolveInfo):\n return resolve_parent_field(parent, info.field_name)\n\n\ndef resolve_to(name: str):\n def resolver(parent, *_):\n return resolve_parent_field(parent, name)\n\n return resolver\n\n\ndef add_resolve_functions_to_schema(schema: GraphQLSchema, resolvers: dict):\n for type_name, type_object in schema.get_type_map().items():\n if isinstance(type_object, GraphQLObjectType):\n add_resolve_functions_to_object(type_name, type_object, resolvers)\n if isinstance(type_object, GraphQLScalarType):\n add_resolve_functions_to_scalar(type_name, type_object, resolvers)\n\n\ndef add_resolve_functions_to_object(name: str, obj: GraphQLObjectType, resolvers: dict):\n type_resolvers = resolvers.get(name, {})\n for field_name, field_object in obj.fields.items():\n field_resolver = type_resolvers.get(field_name)\n if field_resolver:\n field_object.resolver = field_resolver\n elif field_object.resolver is None:\n field_object.resolver = default_resolver\n\n\ndef add_resolve_functions_to_scalar(name: str, obj: GraphQLObjectType, resolvers: dict):\n scalar_resolvers = resolvers.get(name, {})\n\n serialize = scalar_resolvers.get(\"serialize\", obj.serialize)\n obj.serialize = serialize\n\n parse_literal = scalar_resolvers.get(\"parse_literal\", obj.parse_literal)\n obj.parse_literal = parse_literal\n\n parse_value = scalar_resolvers.get(\"parse_value\", obj.parse_value)\n obj.parse_value = parse_value\n", "path": "ariadne/resolvers.py"}], "after_files": [{"content": "from graphql import GraphQLObjectType, GraphQLScalarType, GraphQLSchema\nfrom graphql.execution.base import ResolveInfo\n\n\ndef resolve_parent_field(parent, name: str, **kwargs: dict):\n if isinstance(parent, dict):\n value = parent.get(name)\n 
else:\n value = getattr(parent, name, None)\n if callable(value):\n return value(**kwargs)\n return value\n\n\ndef default_resolver(parent, info: ResolveInfo, **kwargs):\n return resolve_parent_field(parent, info.field_name, **kwargs)\n\n\ndef resolve_to(name: str):\n def resolver(parent, *_, **kwargs):\n return resolve_parent_field(parent, name, **kwargs)\n\n return resolver\n\n\ndef add_resolve_functions_to_schema(schema: GraphQLSchema, resolvers: dict):\n for type_name, type_object in schema.get_type_map().items():\n if isinstance(type_object, GraphQLObjectType):\n add_resolve_functions_to_object(type_name, type_object, resolvers)\n if isinstance(type_object, GraphQLScalarType):\n add_resolve_functions_to_scalar(type_name, type_object, resolvers)\n\n\ndef add_resolve_functions_to_object(name: str, obj: GraphQLObjectType, resolvers: dict):\n type_resolvers = resolvers.get(name, {})\n for field_name, field_object in obj.fields.items():\n field_resolver = type_resolvers.get(field_name)\n if field_resolver:\n field_object.resolver = field_resolver\n elif field_object.resolver is None:\n field_object.resolver = default_resolver\n\n\ndef add_resolve_functions_to_scalar(name: str, obj: GraphQLObjectType, resolvers: dict):\n scalar_resolvers = resolvers.get(name, {})\n\n serialize = scalar_resolvers.get(\"serialize\", obj.serialize)\n obj.serialize = serialize\n\n parse_literal = scalar_resolvers.get(\"parse_literal\", obj.parse_literal)\n obj.parse_literal = parse_literal\n\n parse_value = scalar_resolvers.get(\"parse_value\", obj.parse_value)\n obj.parse_value = parse_value\n", "path": "ariadne/resolvers.py"}]}
| 824 | 259 |
gh_patches_debug_13586
|
rasdani/github-patches
|
git_diff
|
pwndbg__pwndbg-146
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
"show print elements 0" causes exceptions on stop
```
pwndbg> show print elements
Limit on string chars or array elements to print is unlimited.
Traceback (most recent call last):
File "/home/david/.pwndbg/pwndbg/events.py", line 111, in caller
func()
File "/home/david/.pwndbg/pwndbg/strings.py", line 34, in update_length
length = int(message)
File "/home/david/.pwndbg/pwndbg/inthook.py", line 44, in __new__
return _int(_int(value, *a, **kw))
ValueError: invalid literal for int() with base 10: 'unlimited'
Python Exception <class 'ValueError'> invalid literal for int() with base 10: 'unlimited':
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/strings.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 """
4 Functionality for resolving ASCII printable strings within
5 the debuggee's address space.
6 """
7 from __future__ import absolute_import
8 from __future__ import division
9 from __future__ import print_function
10 from __future__ import unicode_literals
11
12 import string
13
14 import gdb
15
16 import pwndbg.events
17 import pwndbg.memory
18 import pwndbg.typeinfo
19
20 length = 15
21
22 @pwndbg.events.stop
23 def update_length():
24 r"""
25 Unfortunately there's not a better way to get at this info.
26
27 >>> gdb.execute('show print elements', from_tty=False, to_string=True)
28 'Limit on string chars or array elements to print is 21.\n'
29 """
30 global length
31 message = gdb.execute('show print elements', from_tty=False, to_string=True)
32 message = message.split()[-1]
33 message = message.strip('.')
34 length = int(message)
35
36 def get(address, maxlen = None):
37 if maxlen is None:
38 maxlen = length
39
40 try:
41 sz = pwndbg.memory.string(address)
42 sz = sz.decode('latin-1', 'replace')
43
44 if not sz or not all(s in string.printable for s in sz):
45 return None
46 except Exception as e:
47 return None
48
49 if len(sz) < maxlen:
50 return sz
51
52 return sz[:maxlen] + '...'
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pwndbg/strings.py b/pwndbg/strings.py
--- a/pwndbg/strings.py
+++ b/pwndbg/strings.py
@@ -31,7 +31,10 @@
message = gdb.execute('show print elements', from_tty=False, to_string=True)
message = message.split()[-1]
message = message.strip('.')
- length = int(message)
+ if message == 'unlimited':
+ length = 0
+ else:
+ length = int(message)
def get(address, maxlen = None):
if maxlen is None:
@@ -46,7 +49,7 @@
except Exception as e:
return None
- if len(sz) < maxlen:
+ if len(sz) < maxlen or not maxlen:
return sz
return sz[:maxlen] + '...'
|
{"golden_diff": "diff --git a/pwndbg/strings.py b/pwndbg/strings.py\n--- a/pwndbg/strings.py\n+++ b/pwndbg/strings.py\n@@ -31,7 +31,10 @@\n message = gdb.execute('show print elements', from_tty=False, to_string=True)\n message = message.split()[-1]\n message = message.strip('.')\n- length = int(message)\n+ if message == 'unlimited':\n+ length = 0\n+ else:\n+ length = int(message)\n \n def get(address, maxlen = None):\n if maxlen is None:\n@@ -46,7 +49,7 @@\n except Exception as e:\n return None\n \n- if len(sz) < maxlen:\n+ if len(sz) < maxlen or not maxlen:\n return sz\n \n return sz[:maxlen] + '...'\n", "issue": "\"show print elements 0\" causes exceptions on stop\n```\r\npwndbg> show print elements\r\nLimit on string chars or array elements to print is unlimited.\r\nTraceback (most recent call last):\r\n File \"/home/david/.pwndbg/pwndbg/events.py\", line 111, in caller\r\n func()\r\n File \"/home/david/.pwndbg/pwndbg/strings.py\", line 34, in update_length\r\n length = int(message)\r\n File \"/home/david/.pwndbg/pwndbg/inthook.py\", line 44, in __new__\r\n return _int(_int(value, *a, **kw))\r\nValueError: invalid literal for int() with base 10: 'unlimited'\r\nPython Exception <class 'ValueError'> invalid literal for int() with base 10: 'unlimited': \r\n```\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nFunctionality for resolving ASCII printable strings within\nthe debuggee's address space.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport string\n\nimport gdb\n\nimport pwndbg.events\nimport pwndbg.memory\nimport pwndbg.typeinfo\n\nlength = 15\n\[email protected]\ndef update_length():\n r\"\"\"\n Unfortunately there's not a better way to get at this info.\n\n >>> gdb.execute('show print elements', from_tty=False, to_string=True)\n 'Limit on string chars or array elements to print is 21.\\n'\n \"\"\"\n global length\n message = gdb.execute('show print elements', from_tty=False, to_string=True)\n message = message.split()[-1]\n message = message.strip('.')\n length = int(message)\n\ndef get(address, maxlen = None):\n if maxlen is None:\n maxlen = length\n\n try:\n sz = pwndbg.memory.string(address)\n sz = sz.decode('latin-1', 'replace')\n\n if not sz or not all(s in string.printable for s in sz):\n return None\n except Exception as e:\n return None\n\n if len(sz) < maxlen:\n return sz\n\n return sz[:maxlen] + '...'\n", "path": "pwndbg/strings.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nFunctionality for resolving ASCII printable strings within\nthe debuggee's address space.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport string\n\nimport gdb\n\nimport pwndbg.events\nimport pwndbg.memory\nimport pwndbg.typeinfo\n\nlength = 15\n\[email protected]\ndef update_length():\n r\"\"\"\n Unfortunately there's not a better way to get at this info.\n\n >>> gdb.execute('show print elements', from_tty=False, to_string=True)\n 'Limit on string chars or array elements to print is 21.\\n'\n \"\"\"\n global length\n message = gdb.execute('show print elements', from_tty=False, to_string=True)\n message = message.split()[-1]\n message = message.strip('.')\n if message == 'unlimited':\n length = 0\n else:\n length = int(message)\n\ndef get(address, maxlen = None):\n if maxlen is 
None:\n maxlen = length\n\n try:\n sz = pwndbg.memory.string(address)\n sz = sz.decode('latin-1', 'replace')\n\n if not sz or not all(s in string.printable for s in sz):\n return None\n except Exception as e:\n return None\n\n if len(sz) < maxlen or not maxlen:\n return sz\n\n return sz[:maxlen] + '...'\n", "path": "pwndbg/strings.py"}]}
| 860 | 190 |
gh_patches_debug_1981
|
rasdani/github-patches
|
git_diff
|
vyperlang__vyper-2905
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Missing @view decorator for interface ERC20Detailed.py
### Version Information
* vyper Version (output of `vyper --version`): 0.3.3
* OS: linux
* Python Version (output of `python --version`): Python 3.9.5
### What's your issue about?
**Issue**
Error using `ERC20Detailed.py` as an interface to a vyper class. Trying to compile the following snippet produces the following error.
```
# @version 0.3.3
from vyper.interfaces import ERC20Detailed
@view
@external
def getSymbol() -> String[32]:
return ERC20Detailed(0x5f3b5DfEb7B28CDbD7FAba78963EE202a494e2A2).symbol()
```
**Error**
```
vyper.exceptions.StateAccessViolation: May not call state modifying function 'symbol' within a constant
function.vyper.exceptions.StateAccessViolation: May not call state modifying function 'symbol' within a constant function.
```
**Reason**
This issue occurs because `ERC20Detailed.py` does not contain `@view` decorator for its interfaces
### How can it be fixed?
Adding `@view` decorator to interface under `vyper.builtin_interfaces.ERC20Detailed.py`
```
@external
@view
def name() -> String[1]:
pass
@external
@view
def symbol() -> String[1]:
pass
@external
@view
def decimals() -> uint8:
pass
```
**Why?**
Running `vyper -f interface examples/tokens/ERC20.vy` generates the following
```
...
@view
@external
def name() -> String[32]:
pass
@view
@external
def symbol() -> String[32]:
pass
@view
@external
def decimals() -> uint8:
pass
...
```
Adding `@view` decorator to `vyper.builtin_interfaces.ERC20Detailed.py` would make interface consistent.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `vyper/builtin_interfaces/ERC20Detailed.py`
Content:
```
1 """
2 NOTE: interface uses `String[1]` where 1 is the lower bound of the string returned by the function.
3 For end-users this means they can't use `implements: ERC20Detailed` unless their implementation
4 uses a value n >= 1. Regardless this is fine as one can't do String[0] where n == 0.
5 """
6
7 interface_code = """
8 @external
9 def name() -> String[1]:
10 pass
11
12 @external
13 def symbol() -> String[1]:
14 pass
15
16 @external
17 def decimals() -> uint8:
18 pass
19 """
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/vyper/builtin_interfaces/ERC20Detailed.py b/vyper/builtin_interfaces/ERC20Detailed.py
--- a/vyper/builtin_interfaces/ERC20Detailed.py
+++ b/vyper/builtin_interfaces/ERC20Detailed.py
@@ -5,14 +5,17 @@
"""
interface_code = """
+@view
@external
def name() -> String[1]:
pass
+@view
@external
def symbol() -> String[1]:
pass
+@view
@external
def decimals() -> uint8:
pass
|
{"golden_diff": "diff --git a/vyper/builtin_interfaces/ERC20Detailed.py b/vyper/builtin_interfaces/ERC20Detailed.py\n--- a/vyper/builtin_interfaces/ERC20Detailed.py\n+++ b/vyper/builtin_interfaces/ERC20Detailed.py\n@@ -5,14 +5,17 @@\n \"\"\"\n \n interface_code = \"\"\"\n+@view\n @external\n def name() -> String[1]:\n pass\n \n+@view\n @external\n def symbol() -> String[1]:\n pass\n \n+@view\n @external\n def decimals() -> uint8:\n pass\n", "issue": "Missing @view decorator for interface ERC20Detailed.py\n### Version Information\r\n* vyper Version (output of `vyper --version`): 0.3.3\r\n* OS: linux\r\n* Python Version (output of `python --version`): Python 3.9.5\r\n### What's your issue about?\r\n**Issue**\r\nError using `ERC20Detailed.py` as an interface to a vyper class. Trying to compile the following snippet produces the following error.\r\n```\r\n# @version 0.3.3\r\n\r\nfrom vyper.interfaces import ERC20Detailed\r\n\r\n@view\r\n@external\r\ndef getSymbol() -> String[32]:\r\n return ERC20Detailed(0x5f3b5DfEb7B28CDbD7FAba78963EE202a494e2A2).symbol()\r\n```\r\n**Error**\r\n```\r\nvyper.exceptions.StateAccessViolation: May not call state modifying function 'symbol' within a constant\r\nfunction.vyper.exceptions.StateAccessViolation: May not call state modifying function 'symbol' within a constant function.\r\n```\r\n**Reason**\r\nThis issue occurs because `ERC20Detailed.py` does not contain `@view` decorator for its interfaces\r\n### How can it be fixed?\r\nAdding `@view` decorator to interface under `vyper.builtin_interfaces.ERC20Detailed.py`\r\n```\r\n@external\r\n@view\r\ndef name() -> String[1]:\r\n pass\r\n \r\n@external\r\n@view\r\ndef symbol() -> String[1]:\r\n pass\r\n \r\n@external\r\n@view\r\ndef decimals() -> uint8:\r\n pass\r\n```\r\n**Why?**\r\nRunning `vyper -f interface examples/tokens/ERC20.vy` generates the following\r\n```\r\n...\r\n@view\r\n@external\r\ndef name() -> String[32]:\r\n pass\r\n \r\n@view\r\n@external\r\ndef symbol() -> String[32]:\r\n pass\r\n \r\n@view\r\n@external\r\ndef decimals() -> uint8:\r\n pass\r\n...\r\n```\r\n\r\nAdding `@view` decorator to `vyper.builtin_interfaces.ERC20Detailed.py` would make interface consistent.\n", "before_files": [{"content": "\"\"\"\nNOTE: interface uses `String[1]` where 1 is the lower bound of the string returned by the function.\n For end-users this means they can't use `implements: ERC20Detailed` unless their implementation\n uses a value n >= 1. Regardless this is fine as one can't do String[0] where n == 0.\n\"\"\"\n\ninterface_code = \"\"\"\n@external\ndef name() -> String[1]:\n pass\n\n@external\ndef symbol() -> String[1]:\n pass\n\n@external\ndef decimals() -> uint8:\n pass\n\"\"\"\n", "path": "vyper/builtin_interfaces/ERC20Detailed.py"}], "after_files": [{"content": "\"\"\"\nNOTE: interface uses `String[1]` where 1 is the lower bound of the string returned by the function.\n For end-users this means they can't use `implements: ERC20Detailed` unless their implementation\n uses a value n >= 1. Regardless this is fine as one can't do String[0] where n == 0.\n\"\"\"\n\ninterface_code = \"\"\"\n@view\n@external\ndef name() -> String[1]:\n pass\n\n@view\n@external\ndef symbol() -> String[1]:\n pass\n\n@view\n@external\ndef decimals() -> uint8:\n pass\n\"\"\"\n", "path": "vyper/builtin_interfaces/ERC20Detailed.py"}]}
| 876 | 127 |
gh_patches_debug_2390
|
rasdani/github-patches
|
git_diff
|
Qiskit__qiskit-2448
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
No module named 'vcr': requirement is missing (vcrpy)
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.10.1
- **Python version**: 3.7.3
- **Operating system**: windows 10
### What is the current behavior?
Fresh qiskit installation inside a new environment on windows 10.
In one of the terra tutorial (using_the_transpiler) `from qiskit.test.mock import FakeTokyo` is failing 'ModuleNotFoundError: No module named vcr'
### Suggested solutions
'pip install vcrpy'
'vcrpy' needs to be added in requirements.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qiskit/util.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This code is part of Qiskit.
3 #
4 # (C) Copyright IBM 2017.
5 #
6 # This code is licensed under the Apache License, Version 2.0. You may
7 # obtain a copy of this license in the LICENSE.txt file in the root directory
8 # of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
9 #
10 # Any modifications or derivative works of this code must retain this
11 # copyright notice, and modified files need to carry a notice indicating
12 # that they have been altered from the originals.
13
14 """Common utilities for Qiskit."""
15
16 import platform
17 import re
18 import socket
19 import sys
20 import warnings
21
22 import psutil
23 from marshmallow.warnings import ChangedInMarshmallow3Warning
24
25
26 def _check_python_version():
27 """Check for Python version 3.5+."""
28 if sys.version_info < (3, 5):
29 raise Exception('Qiskit requires Python version 3.5 or greater.')
30
31
32 def _filter_deprecation_warnings():
33 """Apply filters to deprecation warnings.
34
35 Force the `DeprecationWarning` warnings to be displayed for the qiskit
36 module, overriding the system configuration as they are ignored by default
37 [1] for end-users. Additionally, silence the `ChangedInMarshmallow3Warning`
38 messages.
39
40 TODO: on Python 3.7, this might not be needed due to PEP-0565 [2].
41
42 [1] https://docs.python.org/3/library/warnings.html#default-warning-filters
43 [2] https://www.python.org/dev/peps/pep-0565/
44 """
45 deprecation_filter = ('always', None, DeprecationWarning,
46 re.compile(r'^qiskit\.*', re.UNICODE), 0)
47
48 # Instead of using warnings.simple_filter() directly, the internal
49 # _add_filter() function is used for being able to match against the
50 # module.
51 try:
52 warnings._add_filter(*deprecation_filter, append=False)
53 except AttributeError:
54 # ._add_filter is internal and not available in some Python versions.
55 pass
56
57 # Add a filter for ignoring ChangedInMarshmallow3Warning, as we depend on
58 # marhsmallow 2 explicitly. 2.17.0 introduced new deprecation warnings that
59 # are useful for eventually migrating, but too verbose for our purposes.
60 warnings.simplefilter('ignore', category=ChangedInMarshmallow3Warning)
61
62
63 _check_python_version()
64 _filter_deprecation_warnings()
65
66
67 def local_hardware_info():
68 """Basic hardware information about the local machine.
69
70 Gives actual number of CPU's in the machine, even when hyperthreading is
71 turned on. CPU count defaults to 1 when true count can't be determined.
72
73 Returns:
74 dict: The hardware information.
75 """
76 results = {
77 'os': platform.system(),
78 'memory': psutil.virtual_memory().total / (1024 ** 3),
79 'cpus': psutil.cpu_count(logical=False) or 1
80 }
81 return results
82
83
84 def _has_connection(hostname, port):
85 """Checks if internet connection exists to host via specified port.
86
87 If any exception is raised while trying to open a socket this will return
88 false.
89
90 Args:
91 hostname (str): Hostname to connect to.
92 port (int): Port to connect to
93
94 Returns:
95 bool: Has connection or not
96
97 """
98 try:
99 host = socket.gethostbyname(hostname)
100 socket.create_connection((host, port), 2)
101 return True
102 except Exception: # pylint: disable=broad-except
103 return False
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/qiskit/util.py b/qiskit/util.py
--- a/qiskit/util.py
+++ b/qiskit/util.py
@@ -97,7 +97,7 @@
"""
try:
host = socket.gethostbyname(hostname)
- socket.create_connection((host, port), 2)
+ socket.create_connection((host, port), 2).close()
return True
except Exception: # pylint: disable=broad-except
return False
|
{"golden_diff": "diff --git a/qiskit/util.py b/qiskit/util.py\n--- a/qiskit/util.py\n+++ b/qiskit/util.py\n@@ -97,7 +97,7 @@\n \"\"\"\n try:\n host = socket.gethostbyname(hostname)\n- socket.create_connection((host, port), 2)\n+ socket.create_connection((host, port), 2).close()\n return True\n except Exception: # pylint: disable=broad-except\n return False\n", "issue": "No module named 'vcr': requirement is missing (vcrpy) \n<!-- \u26a0\ufe0f If you do not respect this template, your issue will be closed -->\r\n<!-- \u26a0\ufe0f Make sure to browse the opened and closed issues -->\r\n\r\n### Information\r\n\r\n- **Qiskit Terra version**: 0.10.1\r\n- **Python version**: 3.7.3\r\n- **Operating system**: windows 10\r\n\r\n### What is the current behavior?\r\nFresh qiskit installation inside a new environment on windows 10. \r\nIn one of the terra tutorial (using_the_transpiler) `from qiskit.test.mock import FakeTokyo` is failing 'ModuleNotFoundError: No module named vcr'\r\n\r\n### Suggested solutions\r\n'pip install vcrpy' \r\n'vcrpy' needs to be added in requirements.\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Common utilities for Qiskit.\"\"\"\n\nimport platform\nimport re\nimport socket\nimport sys\nimport warnings\n\nimport psutil\nfrom marshmallow.warnings import ChangedInMarshmallow3Warning\n\n\ndef _check_python_version():\n \"\"\"Check for Python version 3.5+.\"\"\"\n if sys.version_info < (3, 5):\n raise Exception('Qiskit requires Python version 3.5 or greater.')\n\n\ndef _filter_deprecation_warnings():\n \"\"\"Apply filters to deprecation warnings.\n\n Force the `DeprecationWarning` warnings to be displayed for the qiskit\n module, overriding the system configuration as they are ignored by default\n [1] for end-users. Additionally, silence the `ChangedInMarshmallow3Warning`\n messages.\n\n TODO: on Python 3.7, this might not be needed due to PEP-0565 [2].\n\n [1] https://docs.python.org/3/library/warnings.html#default-warning-filters\n [2] https://www.python.org/dev/peps/pep-0565/\n \"\"\"\n deprecation_filter = ('always', None, DeprecationWarning,\n re.compile(r'^qiskit\\.*', re.UNICODE), 0)\n\n # Instead of using warnings.simple_filter() directly, the internal\n # _add_filter() function is used for being able to match against the\n # module.\n try:\n warnings._add_filter(*deprecation_filter, append=False)\n except AttributeError:\n # ._add_filter is internal and not available in some Python versions.\n pass\n\n # Add a filter for ignoring ChangedInMarshmallow3Warning, as we depend on\n # marhsmallow 2 explicitly. 2.17.0 introduced new deprecation warnings that\n # are useful for eventually migrating, but too verbose for our purposes.\n warnings.simplefilter('ignore', category=ChangedInMarshmallow3Warning)\n\n\n_check_python_version()\n_filter_deprecation_warnings()\n\n\ndef local_hardware_info():\n \"\"\"Basic hardware information about the local machine.\n\n Gives actual number of CPU's in the machine, even when hyperthreading is\n turned on. 
CPU count defaults to 1 when true count can't be determined.\n\n Returns:\n dict: The hardware information.\n \"\"\"\n results = {\n 'os': platform.system(),\n 'memory': psutil.virtual_memory().total / (1024 ** 3),\n 'cpus': psutil.cpu_count(logical=False) or 1\n }\n return results\n\n\ndef _has_connection(hostname, port):\n \"\"\"Checks if internet connection exists to host via specified port.\n\n If any exception is raised while trying to open a socket this will return\n false.\n\n Args:\n hostname (str): Hostname to connect to.\n port (int): Port to connect to\n\n Returns:\n bool: Has connection or not\n\n \"\"\"\n try:\n host = socket.gethostbyname(hostname)\n socket.create_connection((host, port), 2)\n return True\n except Exception: # pylint: disable=broad-except\n return False\n", "path": "qiskit/util.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Common utilities for Qiskit.\"\"\"\n\nimport platform\nimport re\nimport socket\nimport sys\nimport warnings\n\nimport psutil\nfrom marshmallow.warnings import ChangedInMarshmallow3Warning\n\n\ndef _check_python_version():\n \"\"\"Check for Python version 3.5+.\"\"\"\n if sys.version_info < (3, 5):\n raise Exception('Qiskit requires Python version 3.5 or greater.')\n\n\ndef _filter_deprecation_warnings():\n \"\"\"Apply filters to deprecation warnings.\n\n Force the `DeprecationWarning` warnings to be displayed for the qiskit\n module, overriding the system configuration as they are ignored by default\n [1] for end-users. Additionally, silence the `ChangedInMarshmallow3Warning`\n messages.\n\n TODO: on Python 3.7, this might not be needed due to PEP-0565 [2].\n\n [1] https://docs.python.org/3/library/warnings.html#default-warning-filters\n [2] https://www.python.org/dev/peps/pep-0565/\n \"\"\"\n deprecation_filter = ('always', None, DeprecationWarning,\n re.compile(r'^qiskit\\.*', re.UNICODE), 0)\n\n # Instead of using warnings.simple_filter() directly, the internal\n # _add_filter() function is used for being able to match against the\n # module.\n try:\n warnings._add_filter(*deprecation_filter, append=False)\n except AttributeError:\n # ._add_filter is internal and not available in some Python versions.\n pass\n\n # Add a filter for ignoring ChangedInMarshmallow3Warning, as we depend on\n # marhsmallow 2 explicitly. 2.17.0 introduced new deprecation warnings that\n # are useful for eventually migrating, but too verbose for our purposes.\n warnings.simplefilter('ignore', category=ChangedInMarshmallow3Warning)\n\n\n_check_python_version()\n_filter_deprecation_warnings()\n\n\ndef local_hardware_info():\n \"\"\"Basic hardware information about the local machine.\n\n Gives actual number of CPU's in the machine, even when hyperthreading is\n turned on. 
CPU count defaults to 1 when true count can't be determined.\n\n Returns:\n dict: The hardware information.\n \"\"\"\n results = {\n 'os': platform.system(),\n 'memory': psutil.virtual_memory().total / (1024 ** 3),\n 'cpus': psutil.cpu_count(logical=False) or 1\n }\n return results\n\n\ndef _has_connection(hostname, port):\n \"\"\"Checks if internet connection exists to host via specified port.\n\n If any exception is raised while trying to open a socket this will return\n false.\n\n Args:\n hostname (str): Hostname to connect to.\n port (int): Port to connect to\n\n Returns:\n bool: Has connection or not\n\n \"\"\"\n try:\n host = socket.gethostbyname(hostname)\n socket.create_connection((host, port), 2).close()\n return True\n except Exception: # pylint: disable=broad-except\n return False\n", "path": "qiskit/util.py"}]}
| 1,452 | 108 |
gh_patches_debug_4028
|
rasdani/github-patches
|
git_diff
|
diofant__diofant-852
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Project logo
There are few places to put images:
- [x] Top left corner of https://diofant.readthedocs.io/ (see e.g. https://sphinx-rtd-theme.readthedocs.io/)
- [x] favicon.ico.
- [x] logo for pdf logs
- [x] Organization profile on the Github (at least 200x200px)
- [x] ~~repository’s social media preview (640×320px - 1280×640px for best display)~~
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/conf.py`
Content:
```
1 #
2 # Diofant documentation build configuration file.
3 #
4 # This file is execfile()d with the current directory set to its
5 # containing dir.
6 #
7 # The contents of this file are pickled, so don't put values in the
8 # namespace that aren't pickleable (module imports are okay, they're
9 # removed automatically).
10 #
11
12 import warnings
13
14 import diofant
15
16
17 # Turns numpydoc's section warnings to exceptions, see numpy/numpydoc#58.
18 warnings.simplefilter('error', UserWarning)
19
20 # Add any Sphinx extension module names here, as strings.
21 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode', 'sphinx.ext.mathjax',
22 'sphinx.ext.graphviz', 'sphinx.ext.intersphinx',
23 'sphinx.ext.extlinks', 'sphinx.ext.napoleon',
24 'sphinxcontrib.bibtex']
25
26 napoleon_google_docstring = False
27 napoleon_use_param = False
28 napoleon_use_rtype = False
29
30 # Sphinx will warn about all references where the target cannot be found.
31 nitpicky = True
32
33 # Glob-style patterns that should be excluded when looking for sources.
34 exclude_patterns = ['README.rst']
35
36 # The document name of the "master" document, that is, the document
37 # that contains the root toctree directive.
38 master_doc = 'index'
39
40 # Project information.
41 project = 'Diofant'
42 copyright = '2006-2018 SymPy Development Team, 2013-2019 Sergey B Kirpichev'
43 version = diofant.__version__
44 release = version
45
46 # The name of default reST role, that is, for text marked up `like this`.
47 default_role = 'math'
48
49 # The theme to use for HTML and HTML Help pages.
50 html_theme = 'sphinx_rtd_theme'
51
52 # The LaTeX engine to build the docs.
53 latex_engine = 'xelatex'
54
55 # If True, the PDF build from the LaTeX files created by Sphinx will use xindy
56 # rather than makeindex.
57 latex_use_xindy = False
58
59 # This value determines how to group the document tree into LaTeX source
60 # files. It must be a list of tuples (startdocname, targetname, title,
61 # author, documentclass, toctree_only),
62 latex_documents = [('index', 'diofant.tex', 'Diofant Documentation',
63 'Diofant Development Team', 'manual', True)]
64
65 # A dictionary that contains LaTeX snippets that override predefined.
66 latex_elements = {
67 'preamble': r'''
68 \setmainfont{DejaVu Serif}
69 \setsansfont{DejaVu Sans}
70 \setmonofont{DejaVu Sans Mono}
71 % redefine \LaTeX to be usable in math mode
72 \expandafter\def\expandafter\LaTeX\expandafter{\expandafter\text\expandafter{\LaTeX}}
73 '''
74 }
75
76 # Add page references after internal references.
77 latex_show_pagerefs = True
78
79 # The output format for Graphviz when building HTML files.
80 graphviz_output_format = 'svg'
81
82 # Contains mapping the locations and names of other projects that
83 # should be linked to in this documentation.
84 intersphinx_mapping = {
85 'python3': ('https://docs.python.org/3/', None),
86 'numpy': ('https://docs.scipy.org/doc/numpy', None),
87 'scipy': ('https://docs.scipy.org/doc/scipy/reference', None),
88 }
89
90 # Dictionary of external sites, mapping unique short alias names to a
91 # base URL and a prefix.
92 extlinks = {
93 'issue': ('https://github.com/diofant/diofant/issues/%s', '#'),
94 'pull': ('https://github.com/diofant/diofant/pull/%s', '#'),
95 'commit': ('https://github.com/diofant/diofant/commit/%s', ''),
96 'sympyissue': ('https://github.com/sympy/sympy/issues/%s', 'sympy/sympy#'),
97 'sympypull': ('https://github.com/sympy/sympy/pull/%s', 'sympy/sympy#'),
98 }
99
100 # The number of times the linkcheck builder will attempt to check a URL
101 # before declaring it broken.
102 linkcheck_retries = 3
103
104 # A list of regular expressions that match URIs that should not be checked.
105 linkcheck_ignore = [r'https://primes.utm.edu/notes/gaps.html',
106 r'https://primes.utm.edu/glossary/xpage/BertrandsPostulate.html',
107 r'https://primes.utm.edu/prove/prove2_3.html',
108 r'https://primes.utm.edu/glossary/xpage/Pseudoprime.html']
109
110 # This value controls if docstring for classes or methods, if not explicitly
111 # set, is inherited form parents.
112 autodoc_inherit_docstrings = False
113
114 # A list of paths that contain custom static files. Relative paths are taken as
115 # relative to the configuration directory. They are copied to the output’s
116 # _static directory.
117 html_static_path = ['_static']
118
119 # Should we show "Created using Sphinx" in the HTML footer?
120 html_show_sphinx = False
121
122 # Theme-specific options.
123 html_theme_options = {
124 'logo_only': True,
125 'display_version': False,
126 }
127
128 mathjax_config = {
129 'CommonHTML': {'linebreaks': {'automatic': True}},
130 'HTML-CSS': {'linebreaks': {'automatic': True}},
131 'SVG': {'linebreaks': {'automatic': True}},
132 }
133
134
135 # https://docs.readthedocs.io/en/latest/guides/adding-custom-css.html
136 def setup(app):
137 app.add_stylesheet('custom.css')
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/docs/conf.py b/docs/conf.py
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -119,6 +119,11 @@
# Should we show "Created using Sphinx" in the HTML footer?
html_show_sphinx = False
+# Paths to the logo and favicon.ico, relative to the conf.py's directory.
+html_logo = '_static/logo.svg'
+html_favicon = '_static/favicon.ico'
+latex_logo = '_static/logo.png'
+
# Theme-specific options.
html_theme_options = {
'logo_only': True,
|
{"golden_diff": "diff --git a/docs/conf.py b/docs/conf.py\n--- a/docs/conf.py\n+++ b/docs/conf.py\n@@ -119,6 +119,11 @@\n # Should we show \"Created using Sphinx\" in the HTML footer?\n html_show_sphinx = False\n \n+# Paths to the logo and favicon.ico, relative to the conf.py's directory.\n+html_logo = '_static/logo.svg'\n+html_favicon = '_static/favicon.ico'\n+latex_logo = '_static/logo.png'\n+\n # Theme-specific options.\n html_theme_options = {\n 'logo_only': True,\n", "issue": "Project logo\nThere are few places to put images:\r\n- [x] Top left corner of https://diofant.readthedocs.io/ (see e.g. https://sphinx-rtd-theme.readthedocs.io/)\r\n- [x] favicon.ico.\r\n- [x] logo for pdf logs\r\n- [x] Organization profile on the Github (at least 200x200px)\r\n- [x] ~~repository\u2019s social media preview (640\u00d7320px - 1280\u00d7640px for best display)~~\r\n\n", "before_files": [{"content": "#\n# Diofant documentation build configuration file.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# The contents of this file are pickled, so don't put values in the\n# namespace that aren't pickleable (module imports are okay, they're\n# removed automatically).\n#\n\nimport warnings\n\nimport diofant\n\n\n# Turns numpydoc's section warnings to exceptions, see numpy/numpydoc#58.\nwarnings.simplefilter('error', UserWarning)\n\n# Add any Sphinx extension module names here, as strings.\nextensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode', 'sphinx.ext.mathjax',\n 'sphinx.ext.graphviz', 'sphinx.ext.intersphinx',\n 'sphinx.ext.extlinks', 'sphinx.ext.napoleon',\n 'sphinxcontrib.bibtex']\n\nnapoleon_google_docstring = False\nnapoleon_use_param = False\nnapoleon_use_rtype = False\n\n# Sphinx will warn about all references where the target cannot be found.\nnitpicky = True\n\n# Glob-style patterns that should be excluded when looking for sources.\nexclude_patterns = ['README.rst']\n\n# The document name of the \"master\" document, that is, the document\n# that contains the root toctree directive.\nmaster_doc = 'index'\n\n# Project information.\nproject = 'Diofant'\ncopyright = '2006-2018 SymPy Development Team, 2013-2019 Sergey B Kirpichev'\nversion = diofant.__version__\nrelease = version\n\n# The name of default reST role, that is, for text marked up `like this`.\ndefault_role = 'math'\n\n# The theme to use for HTML and HTML Help pages.\nhtml_theme = 'sphinx_rtd_theme'\n\n# The LaTeX engine to build the docs.\nlatex_engine = 'xelatex'\n\n# If True, the PDF build from the LaTeX files created by Sphinx will use xindy\n# rather than makeindex.\nlatex_use_xindy = False\n\n# This value determines how to group the document tree into LaTeX source\n# files. 
It must be a list of tuples (startdocname, targetname, title,\n# author, documentclass, toctree_only),\nlatex_documents = [('index', 'diofant.tex', 'Diofant Documentation',\n 'Diofant Development Team', 'manual', True)]\n\n# A dictionary that contains LaTeX snippets that override predefined.\nlatex_elements = {\n 'preamble': r'''\n\\setmainfont{DejaVu Serif}\n\\setsansfont{DejaVu Sans}\n\\setmonofont{DejaVu Sans Mono}\n% redefine \\LaTeX to be usable in math mode\n\\expandafter\\def\\expandafter\\LaTeX\\expandafter{\\expandafter\\text\\expandafter{\\LaTeX}}\n'''\n}\n\n# Add page references after internal references.\nlatex_show_pagerefs = True\n\n# The output format for Graphviz when building HTML files.\ngraphviz_output_format = 'svg'\n\n# Contains mapping the locations and names of other projects that\n# should be linked to in this documentation.\nintersphinx_mapping = {\n 'python3': ('https://docs.python.org/3/', None),\n 'numpy': ('https://docs.scipy.org/doc/numpy', None),\n 'scipy': ('https://docs.scipy.org/doc/scipy/reference', None),\n}\n\n# Dictionary of external sites, mapping unique short alias names to a\n# base URL and a prefix.\nextlinks = {\n 'issue': ('https://github.com/diofant/diofant/issues/%s', '#'),\n 'pull': ('https://github.com/diofant/diofant/pull/%s', '#'),\n 'commit': ('https://github.com/diofant/diofant/commit/%s', ''),\n 'sympyissue': ('https://github.com/sympy/sympy/issues/%s', 'sympy/sympy#'),\n 'sympypull': ('https://github.com/sympy/sympy/pull/%s', 'sympy/sympy#'),\n}\n\n# The number of times the linkcheck builder will attempt to check a URL\n# before declaring it broken.\nlinkcheck_retries = 3\n\n# A list of regular expressions that match URIs that should not be checked.\nlinkcheck_ignore = [r'https://primes.utm.edu/notes/gaps.html',\n r'https://primes.utm.edu/glossary/xpage/BertrandsPostulate.html',\n r'https://primes.utm.edu/prove/prove2_3.html',\n r'https://primes.utm.edu/glossary/xpage/Pseudoprime.html']\n\n# This value controls if docstring for classes or methods, if not explicitly\n# set, is inherited form parents.\nautodoc_inherit_docstrings = False\n\n# A list of paths that contain custom static files. Relative paths are taken as\n# relative to the configuration directory. 
They are copied to the output\u2019s\n# _static directory.\nhtml_static_path = ['_static']\n\n# Should we show \"Created using Sphinx\" in the HTML footer?\nhtml_show_sphinx = False\n\n# Theme-specific options.\nhtml_theme_options = {\n 'logo_only': True,\n 'display_version': False,\n}\n\nmathjax_config = {\n 'CommonHTML': {'linebreaks': {'automatic': True}},\n 'HTML-CSS': {'linebreaks': {'automatic': True}},\n 'SVG': {'linebreaks': {'automatic': True}},\n}\n\n\n# https://docs.readthedocs.io/en/latest/guides/adding-custom-css.html\ndef setup(app):\n app.add_stylesheet('custom.css')\n", "path": "docs/conf.py"}], "after_files": [{"content": "#\n# Diofant documentation build configuration file.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# The contents of this file are pickled, so don't put values in the\n# namespace that aren't pickleable (module imports are okay, they're\n# removed automatically).\n#\n\nimport warnings\n\nimport diofant\n\n\n# Turns numpydoc's section warnings to exceptions, see numpy/numpydoc#58.\nwarnings.simplefilter('error', UserWarning)\n\n# Add any Sphinx extension module names here, as strings.\nextensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode', 'sphinx.ext.mathjax',\n 'sphinx.ext.graphviz', 'sphinx.ext.intersphinx',\n 'sphinx.ext.extlinks', 'sphinx.ext.napoleon',\n 'sphinxcontrib.bibtex']\n\nnapoleon_google_docstring = False\nnapoleon_use_param = False\nnapoleon_use_rtype = False\n\n# Sphinx will warn about all references where the target cannot be found.\nnitpicky = True\n\n# Glob-style patterns that should be excluded when looking for sources.\nexclude_patterns = ['README.rst']\n\n# The document name of the \"master\" document, that is, the document\n# that contains the root toctree directive.\nmaster_doc = 'index'\n\n# Project information.\nproject = 'Diofant'\ncopyright = '2006-2018 SymPy Development Team, 2013-2019 Sergey B Kirpichev'\nversion = diofant.__version__\nrelease = version\n\n# The name of default reST role, that is, for text marked up `like this`.\ndefault_role = 'math'\n\n# The theme to use for HTML and HTML Help pages.\nhtml_theme = 'sphinx_rtd_theme'\n\n# The LaTeX engine to build the docs.\nlatex_engine = 'xelatex'\n\n# If True, the PDF build from the LaTeX files created by Sphinx will use xindy\n# rather than makeindex.\nlatex_use_xindy = False\n\n# This value determines how to group the document tree into LaTeX source\n# files. 
It must be a list of tuples (startdocname, targetname, title,\n# author, documentclass, toctree_only),\nlatex_documents = [('index', 'diofant.tex', 'Diofant Documentation',\n 'Diofant Development Team', 'manual', True)]\n\n# A dictionary that contains LaTeX snippets that override predefined.\nlatex_elements = {\n 'preamble': r'''\n\\setmainfont{DejaVu Serif}\n\\setsansfont{DejaVu Sans}\n\\setmonofont{DejaVu Sans Mono}\n% redefine \\LaTeX to be usable in math mode\n\\expandafter\\def\\expandafter\\LaTeX\\expandafter{\\expandafter\\text\\expandafter{\\LaTeX}}\n'''\n}\n\n# Add page references after internal references.\nlatex_show_pagerefs = True\n\n# The output format for Graphviz when building HTML files.\ngraphviz_output_format = 'svg'\n\n# Contains mapping the locations and names of other projects that\n# should be linked to in this documentation.\nintersphinx_mapping = {\n 'python3': ('https://docs.python.org/3/', None),\n 'numpy': ('https://docs.scipy.org/doc/numpy', None),\n 'scipy': ('https://docs.scipy.org/doc/scipy/reference', None),\n}\n\n# Dictionary of external sites, mapping unique short alias names to a\n# base URL and a prefix.\nextlinks = {\n 'issue': ('https://github.com/diofant/diofant/issues/%s', '#'),\n 'pull': ('https://github.com/diofant/diofant/pull/%s', '#'),\n 'commit': ('https://github.com/diofant/diofant/commit/%s', ''),\n 'sympyissue': ('https://github.com/sympy/sympy/issues/%s', 'sympy/sympy#'),\n 'sympypull': ('https://github.com/sympy/sympy/pull/%s', 'sympy/sympy#'),\n}\n\n# The number of times the linkcheck builder will attempt to check a URL\n# before declaring it broken.\nlinkcheck_retries = 3\n\n# A list of regular expressions that match URIs that should not be checked.\nlinkcheck_ignore = [r'https://primes.utm.edu/notes/gaps.html',\n r'https://primes.utm.edu/glossary/xpage/BertrandsPostulate.html',\n r'https://primes.utm.edu/prove/prove2_3.html',\n r'https://primes.utm.edu/glossary/xpage/Pseudoprime.html']\n\n# This value controls if docstring for classes or methods, if not explicitly\n# set, is inherited form parents.\nautodoc_inherit_docstrings = False\n\n# A list of paths that contain custom static files. Relative paths are taken as\n# relative to the configuration directory. They are copied to the output\u2019s\n# _static directory.\nhtml_static_path = ['_static']\n\n# Should we show \"Created using Sphinx\" in the HTML footer?\nhtml_show_sphinx = False\n\n# Paths to the logo and favicon.ico, relative to the conf.py's directory.\nhtml_logo = '_static/logo.svg'\nhtml_favicon = '_static/favicon.ico'\nlatex_logo = '_static/logo.png'\n\n# Theme-specific options.\nhtml_theme_options = {\n 'logo_only': True,\n 'display_version': False,\n}\n\nmathjax_config = {\n 'CommonHTML': {'linebreaks': {'automatic': True}},\n 'HTML-CSS': {'linebreaks': {'automatic': True}},\n 'SVG': {'linebreaks': {'automatic': True}},\n}\n\n\n# https://docs.readthedocs.io/en/latest/guides/adding-custom-css.html\ndef setup(app):\n app.add_stylesheet('custom.css')\n", "path": "docs/conf.py"}]}
| 1,927 | 123 |
gh_patches_debug_16223
|
rasdani/github-patches
|
git_diff
|
microsoft__botbuilder-python-1930
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bump msrest to the 0.6.19 or higher
Is your feature request related to a problem? Please describe.
Old version of msrest is used in botframework components -> https://github.com/microsoft/botbuilder-python/search?q=msrest%3D%3D0.6.10 . This blocks us to use latest versions of the service bus client or event using the new language studio python libraries.
With msrest=0.6.10, we're blocked to using 0.50 service bus package and other packages like event grid.
Describe the solution you'd like
EDITED: Upgrade msrest to the at least 0.6.19 or higher.
Describe alternatives you've considered
No alternatives.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botframework-connector/setup.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 import os
5 from setuptools import setup
6
7 NAME = "botframework-connector"
8 VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
9 REQUIRES = [
10 "msrest==0.6.10",
11 "requests>=2.23.0,<2.26",
12 "PyJWT>=1.5.3,<2.0.0",
13 "botbuilder-schema==4.15.0",
14 "msal==1.6.0",
15 ]
16
17 root = os.path.abspath(os.path.dirname(__file__))
18
19 with open(os.path.join(root, "README.rst"), encoding="utf-8") as f:
20 long_description = f.read()
21
22 setup(
23 name=NAME,
24 version=VERSION,
25 description="Microsoft Bot Framework Bot Builder SDK for Python.",
26 author="Microsoft",
27 url="https://www.github.com/Microsoft/botbuilder-python",
28 keywords=["BotFrameworkConnector", "bots", "ai", "botframework", "botbuilder"],
29 install_requires=REQUIRES,
30 packages=[
31 "botframework.connector",
32 "botframework.connector.auth",
33 "botframework.connector.async_mixin",
34 "botframework.connector.operations",
35 "botframework.connector.models",
36 "botframework.connector.aio",
37 "botframework.connector.aio.operations_async",
38 "botframework.connector.skills",
39 "botframework.connector.teams",
40 "botframework.connector.teams.operations",
41 "botframework.connector.token_api",
42 "botframework.connector.token_api.aio",
43 "botframework.connector.token_api.aio.operations_async",
44 "botframework.connector.token_api.models",
45 "botframework.connector.token_api.operations",
46 ],
47 include_package_data=True,
48 long_description=long_description,
49 long_description_content_type="text/x-rst",
50 license="MIT",
51 classifiers=[
52 "Programming Language :: Python :: 3.7",
53 "Intended Audience :: Developers",
54 "License :: OSI Approved :: MIT License",
55 "Operating System :: OS Independent",
56 "Development Status :: 5 - Production/Stable",
57 "Topic :: Scientific/Engineering :: Artificial Intelligence",
58 ],
59 )
60
```
Path: `libraries/botbuilder-schema/setup.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 import os
5 from setuptools import setup
6
7 NAME = "botbuilder-schema"
8 VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
9 REQUIRES = ["msrest==0.6.10"]
10
11 root = os.path.abspath(os.path.dirname(__file__))
12
13 with open(os.path.join(root, "README.rst"), encoding="utf-8") as f:
14 long_description = f.read()
15
16 setup(
17 name=NAME,
18 version=VERSION,
19 description="BotBuilder Schema",
20 author="Microsoft",
21 url="https://github.com/Microsoft/botbuilder-python",
22 keywords=["BotBuilderSchema", "bots", "ai", "botframework", "botbuilder"],
23 long_description=long_description,
24 long_description_content_type="text/x-rst",
25 license="MIT",
26 install_requires=REQUIRES,
27 packages=["botbuilder.schema", "botbuilder.schema.teams",],
28 include_package_data=True,
29 classifiers=[
30 "Programming Language :: Python :: 3.7",
31 "Intended Audience :: Developers",
32 "License :: OSI Approved :: MIT License",
33 "Operating System :: OS Independent",
34 "Development Status :: 5 - Production/Stable",
35 "Topic :: Scientific/Engineering :: Artificial Intelligence",
36 ],
37 )
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/libraries/botbuilder-schema/setup.py b/libraries/botbuilder-schema/setup.py
--- a/libraries/botbuilder-schema/setup.py
+++ b/libraries/botbuilder-schema/setup.py
@@ -6,7 +6,7 @@
NAME = "botbuilder-schema"
VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
-REQUIRES = ["msrest==0.6.10"]
+REQUIRES = ["msrest==0.6.19"]
root = os.path.abspath(os.path.dirname(__file__))
diff --git a/libraries/botframework-connector/setup.py b/libraries/botframework-connector/setup.py
--- a/libraries/botframework-connector/setup.py
+++ b/libraries/botframework-connector/setup.py
@@ -7,7 +7,7 @@
NAME = "botframework-connector"
VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
REQUIRES = [
- "msrest==0.6.10",
+ "msrest==0.6.19",
"requests>=2.23.0,<2.26",
"PyJWT>=1.5.3,<2.0.0",
"botbuilder-schema==4.15.0",
|
{"golden_diff": "diff --git a/libraries/botbuilder-schema/setup.py b/libraries/botbuilder-schema/setup.py\n--- a/libraries/botbuilder-schema/setup.py\n+++ b/libraries/botbuilder-schema/setup.py\n@@ -6,7 +6,7 @@\n \r\n NAME = \"botbuilder-schema\"\r\n VERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\n-REQUIRES = [\"msrest==0.6.10\"]\r\n+REQUIRES = [\"msrest==0.6.19\"]\r\n \r\n root = os.path.abspath(os.path.dirname(__file__))\r\n \r\ndiff --git a/libraries/botframework-connector/setup.py b/libraries/botframework-connector/setup.py\n--- a/libraries/botframework-connector/setup.py\n+++ b/libraries/botframework-connector/setup.py\n@@ -7,7 +7,7 @@\n NAME = \"botframework-connector\"\n VERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\n REQUIRES = [\n- \"msrest==0.6.10\",\n+ \"msrest==0.6.19\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n", "issue": "Bump msrest to the 0.6.19 or higher\nIs your feature request related to a problem? Please describe.\r\nOld version of msrest is used in botframework components -> https://github.com/microsoft/botbuilder-python/search?q=msrest%3D%3D0.6.10 . This blocks us to use latest versions of the service bus client or event using the new language studio python libraries.\r\n\r\nWith msrest=0.6.10, we're blocked to using 0.50 service bus package and other packages like event grid.\r\n\r\nDescribe the solution you'd like\r\nEDITED: Upgrade msrest to the at least 0.6.19 or higher.\r\n\r\nDescribe alternatives you've considered\r\nNo alternatives.\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==0.6.10\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n \"msal==1.6.0\",\n]\n\nroot = os.path.abspath(os.path.dirname(__file__))\n\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nsetup(\n name=NAME,\n version=VERSION,\n description=\"Microsoft Bot Framework Bot Builder SDK for Python.\",\n author=\"Microsoft\",\n url=\"https://www.github.com/Microsoft/botbuilder-python\",\n keywords=[\"BotFrameworkConnector\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\n install_requires=REQUIRES,\n packages=[\n \"botframework.connector\",\n \"botframework.connector.auth\",\n \"botframework.connector.async_mixin\",\n \"botframework.connector.operations\",\n \"botframework.connector.models\",\n \"botframework.connector.aio\",\n \"botframework.connector.aio.operations_async\",\n \"botframework.connector.skills\",\n \"botframework.connector.teams\",\n \"botframework.connector.teams.operations\",\n \"botframework.connector.token_api\",\n \"botframework.connector.token_api.aio\",\n \"botframework.connector.token_api.aio.operations_async\",\n \"botframework.connector.token_api.models\",\n \"botframework.connector.token_api.operations\",\n ],\n include_package_data=True,\n long_description=long_description,\n long_description_content_type=\"text/x-rst\",\n license=\"MIT\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Operating System :: OS Independent\",\n \"Development 
Status :: 5 - Production/Stable\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n ],\n)\n", "path": "libraries/botframework-connector/setup.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\r\n# Licensed under the MIT License.\r\n\r\nimport os\r\nfrom setuptools import setup\r\n\r\nNAME = \"botbuilder-schema\"\r\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\nREQUIRES = [\"msrest==0.6.10\"]\r\n\r\nroot = os.path.abspath(os.path.dirname(__file__))\r\n\r\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\r\n long_description = f.read()\r\n\r\nsetup(\r\n name=NAME,\r\n version=VERSION,\r\n description=\"BotBuilder Schema\",\r\n author=\"Microsoft\",\r\n url=\"https://github.com/Microsoft/botbuilder-python\",\r\n keywords=[\"BotBuilderSchema\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\r\n long_description=long_description,\r\n long_description_content_type=\"text/x-rst\",\r\n license=\"MIT\",\r\n install_requires=REQUIRES,\r\n packages=[\"botbuilder.schema\", \"botbuilder.schema.teams\",],\r\n include_package_data=True,\r\n classifiers=[\r\n \"Programming Language :: Python :: 3.7\",\r\n \"Intended Audience :: Developers\",\r\n \"License :: OSI Approved :: MIT License\",\r\n \"Operating System :: OS Independent\",\r\n \"Development Status :: 5 - Production/Stable\",\r\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\r\n ],\r\n)\r\n", "path": "libraries/botbuilder-schema/setup.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==0.6.19\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n \"msal==1.6.0\",\n]\n\nroot = os.path.abspath(os.path.dirname(__file__))\n\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nsetup(\n name=NAME,\n version=VERSION,\n description=\"Microsoft Bot Framework Bot Builder SDK for Python.\",\n author=\"Microsoft\",\n url=\"https://www.github.com/Microsoft/botbuilder-python\",\n keywords=[\"BotFrameworkConnector\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\n install_requires=REQUIRES,\n packages=[\n \"botframework.connector\",\n \"botframework.connector.auth\",\n \"botframework.connector.async_mixin\",\n \"botframework.connector.operations\",\n \"botframework.connector.models\",\n \"botframework.connector.aio\",\n \"botframework.connector.aio.operations_async\",\n \"botframework.connector.skills\",\n \"botframework.connector.teams\",\n \"botframework.connector.teams.operations\",\n \"botframework.connector.token_api\",\n \"botframework.connector.token_api.aio\",\n \"botframework.connector.token_api.aio.operations_async\",\n \"botframework.connector.token_api.models\",\n \"botframework.connector.token_api.operations\",\n ],\n include_package_data=True,\n long_description=long_description,\n long_description_content_type=\"text/x-rst\",\n license=\"MIT\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Operating System :: OS Independent\",\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Scientific/Engineering :: Artificial 
Intelligence\",\n ],\n)\n", "path": "libraries/botframework-connector/setup.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\r\n# Licensed under the MIT License.\r\n\r\nimport os\r\nfrom setuptools import setup\r\n\r\nNAME = \"botbuilder-schema\"\r\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\nREQUIRES = [\"msrest==0.6.19\"]\r\n\r\nroot = os.path.abspath(os.path.dirname(__file__))\r\n\r\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\r\n long_description = f.read()\r\n\r\nsetup(\r\n name=NAME,\r\n version=VERSION,\r\n description=\"BotBuilder Schema\",\r\n author=\"Microsoft\",\r\n url=\"https://github.com/Microsoft/botbuilder-python\",\r\n keywords=[\"BotBuilderSchema\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\r\n long_description=long_description,\r\n long_description_content_type=\"text/x-rst\",\r\n license=\"MIT\",\r\n install_requires=REQUIRES,\r\n packages=[\"botbuilder.schema\", \"botbuilder.schema.teams\",],\r\n include_package_data=True,\r\n classifiers=[\r\n \"Programming Language :: Python :: 3.7\",\r\n \"Intended Audience :: Developers\",\r\n \"License :: OSI Approved :: MIT License\",\r\n \"Operating System :: OS Independent\",\r\n \"Development Status :: 5 - Production/Stable\",\r\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\r\n ],\r\n)\r\n", "path": "libraries/botbuilder-schema/setup.py"}]}
| 1,390 | 298 |
gh_patches_debug_11520
|
rasdani/github-patches
|
git_diff
|
gratipay__gratipay.com-2999
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Font problem in production
> Font from origin 'https://assets.gratipay.com' has been blocked from loading by Cross-Origin Resource Sharing policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://gratipay.com' is therefore not allowed access.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gratipay/utils/cache_static.py`
Content:
```
1 """
2 Handles caching of static resources.
3 """
4 from base64 import b64encode
5 from hashlib import md5
6
7 from aspen import Response
8
9
10 ETAGS = {}
11
12
13 def asset_etag(path):
14 if path.endswith('.spt'):
15 return ''
16 if path in ETAGS:
17 h = ETAGS[path]
18 else:
19 with open(path) as f:
20 h = ETAGS[path] = b64encode(md5(f.read()).digest(), '-_').replace('=', '~')
21 return h
22
23
24 # algorithm functions
25
26 def get_etag_for_file(dispatch_result):
27 return {'etag': asset_etag(dispatch_result.match)}
28
29
30 def try_to_serve_304(website, dispatch_result, request, etag):
31 """Try to serve a 304 for static resources.
32 """
33 if not etag:
34 # This is a request for a dynamic resource.
35 return
36
37 qs_etag = request.line.uri.querystring.get('etag')
38 if qs_etag and qs_etag != etag:
39 # Don't serve one version of a file as if it were another.
40 raise Response(410)
41
42 headers_etag = request.headers.get('If-None-Match')
43 if not headers_etag:
44 # This client doesn't want a 304.
45 return
46
47 if headers_etag != etag:
48 # Cache miss, the client sent an old or invalid etag.
49 return
50
51 # Huzzah!
52 # =======
53 # We can serve a 304! :D
54
55 raise Response(304)
56
57
58 def add_caching_to_response(website, response, request=None, etag=None):
59 """Set caching headers for static resources.
60 """
61 if etag is None:
62 return
63 assert request is not None # sanity check
64
65 if response.code not in (200, 304):
66 return
67
68 # https://developers.google.com/speed/docs/best-practices/caching
69 response.headers['Vary'] = 'accept-encoding'
70 response.headers['Etag'] = etag
71
72 if request.line.uri.querystring.get('etag'):
73 # We can cache "indefinitely" when the querystring contains the etag.
74 response.headers['Cache-Control'] = 'public, max-age=31536000'
75 else:
76 # Otherwise we cache for 5 seconds
77 response.headers['Cache-Control'] = 'public, max-age=5'
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/gratipay/utils/cache_static.py b/gratipay/utils/cache_static.py
--- a/gratipay/utils/cache_static.py
+++ b/gratipay/utils/cache_static.py
@@ -68,6 +68,9 @@
# https://developers.google.com/speed/docs/best-practices/caching
response.headers['Vary'] = 'accept-encoding'
response.headers['Etag'] = etag
+ # Set CORS header for https://assets.gratipay.com (see issue #2970)
+ if 'Access-Control-Allow-Origin' not in response.headers:
+ response.headers['Access-Control-Allow-Origin'] = 'https://gratipay.com'
if request.line.uri.querystring.get('etag'):
# We can cache "indefinitely" when the querystring contains the etag.
|
{"golden_diff": "diff --git a/gratipay/utils/cache_static.py b/gratipay/utils/cache_static.py\n--- a/gratipay/utils/cache_static.py\n+++ b/gratipay/utils/cache_static.py\n@@ -68,6 +68,9 @@\n # https://developers.google.com/speed/docs/best-practices/caching\n response.headers['Vary'] = 'accept-encoding'\n response.headers['Etag'] = etag\n+ # Set CORS header for https://assets.gratipay.com (see issue #2970)\n+ if 'Access-Control-Allow-Origin' not in response.headers:\n+ response.headers['Access-Control-Allow-Origin'] = 'https://gratipay.com'\n \n if request.line.uri.querystring.get('etag'):\n # We can cache \"indefinitely\" when the querystring contains the etag.\n", "issue": "Font problem in production\n> Font from origin 'https://assets.gratipay.com' has been blocked from loading by Cross-Origin Resource Sharing policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://gratipay.com' is therefore not allowed access. \n\n", "before_files": [{"content": "\"\"\"\nHandles caching of static resources.\n\"\"\"\nfrom base64 import b64encode\nfrom hashlib import md5\n\nfrom aspen import Response\n\n\nETAGS = {}\n\n\ndef asset_etag(path):\n if path.endswith('.spt'):\n return ''\n if path in ETAGS:\n h = ETAGS[path]\n else:\n with open(path) as f:\n h = ETAGS[path] = b64encode(md5(f.read()).digest(), '-_').replace('=', '~')\n return h\n\n\n# algorithm functions\n\ndef get_etag_for_file(dispatch_result):\n return {'etag': asset_etag(dispatch_result.match)}\n\n\ndef try_to_serve_304(website, dispatch_result, request, etag):\n \"\"\"Try to serve a 304 for static resources.\n \"\"\"\n if not etag:\n # This is a request for a dynamic resource.\n return\n\n qs_etag = request.line.uri.querystring.get('etag')\n if qs_etag and qs_etag != etag:\n # Don't serve one version of a file as if it were another.\n raise Response(410)\n\n headers_etag = request.headers.get('If-None-Match')\n if not headers_etag:\n # This client doesn't want a 304.\n return\n\n if headers_etag != etag:\n # Cache miss, the client sent an old or invalid etag.\n return\n\n # Huzzah!\n # =======\n # We can serve a 304! 
:D\n\n raise Response(304)\n\n\ndef add_caching_to_response(website, response, request=None, etag=None):\n \"\"\"Set caching headers for static resources.\n \"\"\"\n if etag is None:\n return\n assert request is not None # sanity check\n\n if response.code not in (200, 304):\n return\n\n # https://developers.google.com/speed/docs/best-practices/caching\n response.headers['Vary'] = 'accept-encoding'\n response.headers['Etag'] = etag\n\n if request.line.uri.querystring.get('etag'):\n # We can cache \"indefinitely\" when the querystring contains the etag.\n response.headers['Cache-Control'] = 'public, max-age=31536000'\n else:\n # Otherwise we cache for 5 seconds\n response.headers['Cache-Control'] = 'public, max-age=5'\n", "path": "gratipay/utils/cache_static.py"}], "after_files": [{"content": "\"\"\"\nHandles caching of static resources.\n\"\"\"\nfrom base64 import b64encode\nfrom hashlib import md5\n\nfrom aspen import Response\n\n\nETAGS = {}\n\n\ndef asset_etag(path):\n if path.endswith('.spt'):\n return ''\n if path in ETAGS:\n h = ETAGS[path]\n else:\n with open(path) as f:\n h = ETAGS[path] = b64encode(md5(f.read()).digest(), '-_').replace('=', '~')\n return h\n\n\n# algorithm functions\n\ndef get_etag_for_file(dispatch_result):\n return {'etag': asset_etag(dispatch_result.match)}\n\n\ndef try_to_serve_304(website, dispatch_result, request, etag):\n \"\"\"Try to serve a 304 for static resources.\n \"\"\"\n if not etag:\n # This is a request for a dynamic resource.\n return\n\n qs_etag = request.line.uri.querystring.get('etag')\n if qs_etag and qs_etag != etag:\n # Don't serve one version of a file as if it were another.\n raise Response(410)\n\n headers_etag = request.headers.get('If-None-Match')\n if not headers_etag:\n # This client doesn't want a 304.\n return\n\n if headers_etag != etag:\n # Cache miss, the client sent an old or invalid etag.\n return\n\n # Huzzah!\n # =======\n # We can serve a 304! :D\n\n raise Response(304)\n\n\ndef add_caching_to_response(website, response, request=None, etag=None):\n \"\"\"Set caching headers for static resources.\n \"\"\"\n if etag is None:\n return\n assert request is not None # sanity check\n\n if response.code not in (200, 304):\n return\n\n # https://developers.google.com/speed/docs/best-practices/caching\n response.headers['Vary'] = 'accept-encoding'\n response.headers['Etag'] = etag\n # Set CORS header for https://assets.gratipay.com (see issue #2970)\n if 'Access-Control-Allow-Origin' not in response.headers:\n response.headers['Access-Control-Allow-Origin'] = 'https://gratipay.com'\n\n if request.line.uri.querystring.get('etag'):\n # We can cache \"indefinitely\" when the querystring contains the etag.\n response.headers['Cache-Control'] = 'public, max-age=31536000'\n else:\n # Otherwise we cache for 5 seconds\n response.headers['Cache-Control'] = 'public, max-age=5'\n", "path": "gratipay/utils/cache_static.py"}]}
| 1,021 | 180 |
gh_patches_debug_2632
|
rasdani/github-patches
|
git_diff
|
hpcaitech__ColossalAI-5433
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `extensions/inference/inference_ops_cuda.py`
Content:
```
1 from ..cuda_extension import _CudaExtension
2 from ..utils import get_cuda_cc_flag
3
4
5 class InferenceOpsCudaExtension(_CudaExtension):
6 def __init__(self):
7 super().__init__(name="inference_ops_cuda")
8
9 def sources_files(self):
10 ret = [
11 self.csrc_abs_path(fname)
12 for fname in [
13 "cuda/colossal_inference_C_frontend.cpp",
14 "cuda/decode_kv_cache_memcpy_kernel.cu",
15 ]
16 ]
17 return ret
18
19 def include_dirs(self):
20 ret = [self.get_cuda_home_include()]
21 return ret
22
23 def cxx_flags(self):
24 version_dependent_macros = ["-DVERSION_GE_1_1", "-DVERSION_GE_1_3", "-DVERSION_GE_1_5"]
25 return ["-O3"] + version_dependent_macros
26
27 def nvcc_flags(self):
28 extra_cuda_flags = ["-lineinfo"]
29 extra_cuda_flags.extend(get_cuda_cc_flag())
30 return ["-O3", "--use_fast_math"] + extra_cuda_flags
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/extensions/inference/inference_ops_cuda.py b/extensions/inference/inference_ops_cuda.py
--- a/extensions/inference/inference_ops_cuda.py
+++ b/extensions/inference/inference_ops_cuda.py
@@ -12,6 +12,7 @@
for fname in [
"cuda/colossal_inference_C_frontend.cpp",
"cuda/decode_kv_cache_memcpy_kernel.cu",
+ "cuda/activation_kernel.cu",
]
]
return ret
|
{"golden_diff": "diff --git a/extensions/inference/inference_ops_cuda.py b/extensions/inference/inference_ops_cuda.py\n--- a/extensions/inference/inference_ops_cuda.py\n+++ b/extensions/inference/inference_ops_cuda.py\n@@ -12,6 +12,7 @@\n for fname in [\n \"cuda/colossal_inference_C_frontend.cpp\",\n \"cuda/decode_kv_cache_memcpy_kernel.cu\",\n+ \"cuda/activation_kernel.cu\",\n ]\n ]\n return ret\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "from ..cuda_extension import _CudaExtension\nfrom ..utils import get_cuda_cc_flag\n\n\nclass InferenceOpsCudaExtension(_CudaExtension):\n def __init__(self):\n super().__init__(name=\"inference_ops_cuda\")\n\n def sources_files(self):\n ret = [\n self.csrc_abs_path(fname)\n for fname in [\n \"cuda/colossal_inference_C_frontend.cpp\",\n \"cuda/decode_kv_cache_memcpy_kernel.cu\",\n ]\n ]\n return ret\n\n def include_dirs(self):\n ret = [self.get_cuda_home_include()]\n return ret\n\n def cxx_flags(self):\n version_dependent_macros = [\"-DVERSION_GE_1_1\", \"-DVERSION_GE_1_3\", \"-DVERSION_GE_1_5\"]\n return [\"-O3\"] + version_dependent_macros\n\n def nvcc_flags(self):\n extra_cuda_flags = [\"-lineinfo\"]\n extra_cuda_flags.extend(get_cuda_cc_flag())\n return [\"-O3\", \"--use_fast_math\"] + extra_cuda_flags\n", "path": "extensions/inference/inference_ops_cuda.py"}], "after_files": [{"content": "from ..cuda_extension import _CudaExtension\nfrom ..utils import get_cuda_cc_flag\n\n\nclass InferenceOpsCudaExtension(_CudaExtension):\n def __init__(self):\n super().__init__(name=\"inference_ops_cuda\")\n\n def sources_files(self):\n ret = [\n self.csrc_abs_path(fname)\n for fname in [\n \"cuda/colossal_inference_C_frontend.cpp\",\n \"cuda/decode_kv_cache_memcpy_kernel.cu\",\n \"cuda/activation_kernel.cu\",\n ]\n ]\n return ret\n\n def include_dirs(self):\n ret = [self.get_cuda_home_include()]\n return ret\n\n def cxx_flags(self):\n version_dependent_macros = [\"-DVERSION_GE_1_1\", \"-DVERSION_GE_1_3\", \"-DVERSION_GE_1_5\"]\n return [\"-O3\"] + version_dependent_macros\n\n def nvcc_flags(self):\n extra_cuda_flags = [\"-lineinfo\"]\n extra_cuda_flags.extend(get_cuda_cc_flag())\n return [\"-O3\", \"--use_fast_math\"] + extra_cuda_flags\n", "path": "extensions/inference/inference_ops_cuda.py"}]}
| 568 | 103 |
gh_patches_debug_153
|
rasdani/github-patches
|
git_diff
|
bookwyrm-social__bookwyrm-1018
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Ratings don't federate
**Describe the bug**
I do follow someone on bookwyrm.social from bookwyrm.social and wyrms.de. I have seen on b.s that they rated some books without reviewing them, but those ratings do not appear on w.d. All other posts federate properly (I think).
**Expeceted behaviour**
The rating should show up on connected instances and ideally also be used on those to calculate the average rating of the book.
Here is one example that's not visible from w.d: https://bookwyrm.social/user/tastytea/reviewrating/21469
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bookwyrm/activitypub/note.py`
Content:
```
1 """ note serializer and children thereof """
2 from dataclasses import dataclass, field
3 from typing import Dict, List
4 from django.apps import apps
5
6 from .base_activity import ActivityObject, Link
7 from .image import Document
8
9
10 @dataclass(init=False)
11 class Tombstone(ActivityObject):
12 """the placeholder for a deleted status"""
13
14 type: str = "Tombstone"
15
16 def to_model(self, *args, **kwargs): # pylint: disable=unused-argument
17 """this should never really get serialized, just searched for"""
18 model = apps.get_model("bookwyrm.Status")
19 return model.find_existing_by_remote_id(self.id)
20
21
22 @dataclass(init=False)
23 class Note(ActivityObject):
24 """Note activity"""
25
26 published: str
27 attributedTo: str
28 content: str = ""
29 to: List[str] = field(default_factory=lambda: [])
30 cc: List[str] = field(default_factory=lambda: [])
31 replies: Dict = field(default_factory=lambda: {})
32 inReplyTo: str = ""
33 summary: str = ""
34 tag: List[Link] = field(default_factory=lambda: [])
35 attachment: List[Document] = field(default_factory=lambda: [])
36 sensitive: bool = False
37 type: str = "Note"
38
39
40 @dataclass(init=False)
41 class Article(Note):
42 """what's an article except a note with more fields"""
43
44 name: str
45 type: str = "Article"
46
47
48 @dataclass(init=False)
49 class GeneratedNote(Note):
50 """just a re-typed note"""
51
52 type: str = "GeneratedNote"
53
54
55 @dataclass(init=False)
56 class Comment(Note):
57 """like a note but with a book"""
58
59 inReplyToBook: str
60 type: str = "Comment"
61
62
63 @dataclass(init=False)
64 class Quotation(Comment):
65 """a quote and commentary on a book"""
66
67 quote: str
68 type: str = "Quotation"
69
70
71 @dataclass(init=False)
72 class Review(Comment):
73 """a full book review"""
74
75 name: str = None
76 rating: int = None
77 type: str = "Review"
78
79
80 @dataclass(init=False)
81 class Rating(Comment):
82 """just a star rating"""
83
84 rating: int
85 content: str = None
86 type: str = "Rating"
87
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bookwyrm/activitypub/note.py b/bookwyrm/activitypub/note.py
--- a/bookwyrm/activitypub/note.py
+++ b/bookwyrm/activitypub/note.py
@@ -83,4 +83,5 @@
rating: int
content: str = None
+ name: str = None # not used, but the model inherits from Review
type: str = "Rating"
|
{"golden_diff": "diff --git a/bookwyrm/activitypub/note.py b/bookwyrm/activitypub/note.py\n--- a/bookwyrm/activitypub/note.py\n+++ b/bookwyrm/activitypub/note.py\n@@ -83,4 +83,5 @@\n \n rating: int\n content: str = None\n+ name: str = None # not used, but the model inherits from Review\n type: str = \"Rating\"\n", "issue": "Ratings don't federate\n**Describe the bug**\r\nI do follow someone on bookwyrm.social from bookwyrm.social and wyrms.de. I have seen on b.s that they rated some books without reviewing them, but those ratings do not appear on w.d. All other posts federate properly (I think).\r\n\r\n**Expeceted behaviour**\r\nThe rating should show up on connected instances and ideally also be used on those to calculate the average rating of the book.\r\n\r\nHere is one example that's not visible from w.d: https://bookwyrm.social/user/tastytea/reviewrating/21469\n", "before_files": [{"content": "\"\"\" note serializer and children thereof \"\"\"\nfrom dataclasses import dataclass, field\nfrom typing import Dict, List\nfrom django.apps import apps\n\nfrom .base_activity import ActivityObject, Link\nfrom .image import Document\n\n\n@dataclass(init=False)\nclass Tombstone(ActivityObject):\n \"\"\"the placeholder for a deleted status\"\"\"\n\n type: str = \"Tombstone\"\n\n def to_model(self, *args, **kwargs): # pylint: disable=unused-argument\n \"\"\"this should never really get serialized, just searched for\"\"\"\n model = apps.get_model(\"bookwyrm.Status\")\n return model.find_existing_by_remote_id(self.id)\n\n\n@dataclass(init=False)\nclass Note(ActivityObject):\n \"\"\"Note activity\"\"\"\n\n published: str\n attributedTo: str\n content: str = \"\"\n to: List[str] = field(default_factory=lambda: [])\n cc: List[str] = field(default_factory=lambda: [])\n replies: Dict = field(default_factory=lambda: {})\n inReplyTo: str = \"\"\n summary: str = \"\"\n tag: List[Link] = field(default_factory=lambda: [])\n attachment: List[Document] = field(default_factory=lambda: [])\n sensitive: bool = False\n type: str = \"Note\"\n\n\n@dataclass(init=False)\nclass Article(Note):\n \"\"\"what's an article except a note with more fields\"\"\"\n\n name: str\n type: str = \"Article\"\n\n\n@dataclass(init=False)\nclass GeneratedNote(Note):\n \"\"\"just a re-typed note\"\"\"\n\n type: str = \"GeneratedNote\"\n\n\n@dataclass(init=False)\nclass Comment(Note):\n \"\"\"like a note but with a book\"\"\"\n\n inReplyToBook: str\n type: str = \"Comment\"\n\n\n@dataclass(init=False)\nclass Quotation(Comment):\n \"\"\"a quote and commentary on a book\"\"\"\n\n quote: str\n type: str = \"Quotation\"\n\n\n@dataclass(init=False)\nclass Review(Comment):\n \"\"\"a full book review\"\"\"\n\n name: str = None\n rating: int = None\n type: str = \"Review\"\n\n\n@dataclass(init=False)\nclass Rating(Comment):\n \"\"\"just a star rating\"\"\"\n\n rating: int\n content: str = None\n type: str = \"Rating\"\n", "path": "bookwyrm/activitypub/note.py"}], "after_files": [{"content": "\"\"\" note serializer and children thereof \"\"\"\nfrom dataclasses import dataclass, field\nfrom typing import Dict, List\nfrom django.apps import apps\n\nfrom .base_activity import ActivityObject, Link\nfrom .image import Document\n\n\n@dataclass(init=False)\nclass Tombstone(ActivityObject):\n \"\"\"the placeholder for a deleted status\"\"\"\n\n type: str = \"Tombstone\"\n\n def to_model(self, *args, **kwargs): # pylint: disable=unused-argument\n \"\"\"this should never really get serialized, just searched for\"\"\"\n model = 
apps.get_model(\"bookwyrm.Status\")\n return model.find_existing_by_remote_id(self.id)\n\n\n@dataclass(init=False)\nclass Note(ActivityObject):\n \"\"\"Note activity\"\"\"\n\n published: str\n attributedTo: str\n content: str = \"\"\n to: List[str] = field(default_factory=lambda: [])\n cc: List[str] = field(default_factory=lambda: [])\n replies: Dict = field(default_factory=lambda: {})\n inReplyTo: str = \"\"\n summary: str = \"\"\n tag: List[Link] = field(default_factory=lambda: [])\n attachment: List[Document] = field(default_factory=lambda: [])\n sensitive: bool = False\n type: str = \"Note\"\n\n\n@dataclass(init=False)\nclass Article(Note):\n \"\"\"what's an article except a note with more fields\"\"\"\n\n name: str\n type: str = \"Article\"\n\n\n@dataclass(init=False)\nclass GeneratedNote(Note):\n \"\"\"just a re-typed note\"\"\"\n\n type: str = \"GeneratedNote\"\n\n\n@dataclass(init=False)\nclass Comment(Note):\n \"\"\"like a note but with a book\"\"\"\n\n inReplyToBook: str\n type: str = \"Comment\"\n\n\n@dataclass(init=False)\nclass Quotation(Comment):\n \"\"\"a quote and commentary on a book\"\"\"\n\n quote: str\n type: str = \"Quotation\"\n\n\n@dataclass(init=False)\nclass Review(Comment):\n \"\"\"a full book review\"\"\"\n\n name: str = None\n rating: int = None\n type: str = \"Review\"\n\n\n@dataclass(init=False)\nclass Rating(Comment):\n \"\"\"just a star rating\"\"\"\n\n rating: int\n content: str = None\n name: str = None # not used, but the model inherits from Review\n type: str = \"Rating\"\n", "path": "bookwyrm/activitypub/note.py"}]}
| 1,059 | 96 |
gh_patches_debug_591
|
rasdani/github-patches
|
git_diff
|
pex-tool__pex-1140
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.23
On the docket:
+ [x] Upgrade Pex to Pip 20.3.1. #1133
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.22"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.22"
+__version__ = "2.1.23"
|
{"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.22\"\n+__version__ = \"2.1.23\"\n", "issue": "Release 2.1.23\nOn the docket:\r\n+ [x] Upgrade Pex to Pip 20.3.1. #1133\r\n\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.22\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.23\"\n", "path": "pex/version.py"}]}
| 345 | 96 |
gh_patches_debug_13712
|
rasdani/github-patches
|
git_diff
|
chainer__chainer-1312
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`split_axis` doesn't support empty sections
This code causes a TypeError.
`functions.split_axis(x, [], 0)`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainer/functions/array/split_axis.py`
Content:
```
1 import collections
2
3 import six
4
5 import chainer
6 from chainer import cuda
7 from chainer import function
8 from chainer.utils import type_check
9
10
11 class SplitAxis(function.Function):
12
13 """Function that splits multiple arrays along the specified axis."""
14
15 def __init__(self, indices_or_sections, axis):
16 if not isinstance(indices_or_sections, (int, collections.Iterable)):
17 raise TypeError('indices_or_sections must be integer or 1-D array')
18 self.indices_or_sections = indices_or_sections
19 self.axis = axis
20
21 def check_type_forward(self, in_types):
22 type_check.expect(in_types.size() == 1)
23 type_check.expect(in_types[0].ndim > self.axis)
24
25 if isinstance(self.indices_or_sections, collections.Iterable):
26 max_index = type_check.Variable(
27 self.indices_or_sections[-1], 'max_index')
28 type_check.expect(in_types[0].shape[self.axis] > max_index)
29 else:
30 sections = type_check.Variable(
31 self.indices_or_sections, 'sections')
32 type_check.expect(in_types[0].shape[self.axis] % sections == 0)
33
34 def forward(self, x):
35 if isinstance(self.indices_or_sections, collections.Iterable):
36 cdimx = x[0].shape[self.axis]
37 ind = list(self.indices_or_sections)
38 ind.append(cdimx)
39 prev_i = 0
40 for i in ind:
41 cdimy = max(0, min(i, cdimx) - prev_i)
42 if cdimy == 0:
43 raise ValueError('Not support if shape contains 0')
44 prev_i = i
45 xp = cuda.get_array_module(*x)
46 return tuple(xp.split(x[0], self.indices_or_sections, self.axis))
47
48 def backward(self, x, gys):
49 xp = cuda.get_array_module(*x)
50 if any(gy is None for gy in gys):
51 gx = xp.zeros_like(x[0])
52 gxs = xp.split(gx, self.indices_or_sections, self.axis)
53 for gxi, gy in six.moves.zip(gxs, gys):
54 if gy is None:
55 continue
56 gxi[:] = gy
57 return gx,
58 else:
59 return xp.concatenate(gys, axis=self.axis),
60
61
62 def split_axis(x, indices_or_sections, axis, force_tuple=False):
63 """Splits given variables along an axis.
64
65 Args:
66 x (tuple of Variables): Variables to be split.
67 indices_or_sections (int or 1-D array): If this argument is an integer,
68 N, the array will be divided into N equal arrays along axis.
69 If it is a 1-D array of sorted integers, it
70 indicates the positions where the array is split.
71 axis (int): Axis that the input array is split along.
72 force_tuple (bool): If ``True``, this method returns a tuple even when
73 the number of outputs is one.
74
75 Returns:
76 tuple or Variable: Tuple of :class:`~chainer.Variable` objects
77 if the number of outputs is more than 1 or
78 :class:`~chainer.Variable` otherwise.
79 When ``force_tuple`` is ``True``, returned value is always a tuple
80 regardless of the number of outputs.
81
82 .. note::
83 This function raises :class:`ValueError` if at least
84 one of the outputs is split to zero-size
85 (i.e. ``axis``-th value of its shape is zero).
86
87 """
88 res = SplitAxis(indices_or_sections, axis)(x)
89 if force_tuple and isinstance(res, chainer.Variable):
90 res = (res,)
91 return res
92
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/chainer/functions/array/split_axis.py b/chainer/functions/array/split_axis.py
--- a/chainer/functions/array/split_axis.py
+++ b/chainer/functions/array/split_axis.py
@@ -23,9 +23,10 @@
type_check.expect(in_types[0].ndim > self.axis)
if isinstance(self.indices_or_sections, collections.Iterable):
- max_index = type_check.Variable(
- self.indices_or_sections[-1], 'max_index')
- type_check.expect(in_types[0].shape[self.axis] > max_index)
+ if len(self.indices_or_sections) > 0:
+ max_index = type_check.Variable(
+ self.indices_or_sections[-1], 'max_index')
+ type_check.expect(in_types[0].shape[self.axis] > max_index)
else:
sections = type_check.Variable(
self.indices_or_sections, 'sections')
|
{"golden_diff": "diff --git a/chainer/functions/array/split_axis.py b/chainer/functions/array/split_axis.py\n--- a/chainer/functions/array/split_axis.py\n+++ b/chainer/functions/array/split_axis.py\n@@ -23,9 +23,10 @@\n type_check.expect(in_types[0].ndim > self.axis)\n \n if isinstance(self.indices_or_sections, collections.Iterable):\n- max_index = type_check.Variable(\n- self.indices_or_sections[-1], 'max_index')\n- type_check.expect(in_types[0].shape[self.axis] > max_index)\n+ if len(self.indices_or_sections) > 0:\n+ max_index = type_check.Variable(\n+ self.indices_or_sections[-1], 'max_index')\n+ type_check.expect(in_types[0].shape[self.axis] > max_index)\n else:\n sections = type_check.Variable(\n self.indices_or_sections, 'sections')\n", "issue": "`split_axis` doesn't support empty sections\nThis code causes a TypeError.\n`functions.split_axis(x, [], 0)`\n\n", "before_files": [{"content": "import collections\n\nimport six\n\nimport chainer\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer.utils import type_check\n\n\nclass SplitAxis(function.Function):\n\n \"\"\"Function that splits multiple arrays along the specified axis.\"\"\"\n\n def __init__(self, indices_or_sections, axis):\n if not isinstance(indices_or_sections, (int, collections.Iterable)):\n raise TypeError('indices_or_sections must be integer or 1-D array')\n self.indices_or_sections = indices_or_sections\n self.axis = axis\n\n def check_type_forward(self, in_types):\n type_check.expect(in_types.size() == 1)\n type_check.expect(in_types[0].ndim > self.axis)\n\n if isinstance(self.indices_or_sections, collections.Iterable):\n max_index = type_check.Variable(\n self.indices_or_sections[-1], 'max_index')\n type_check.expect(in_types[0].shape[self.axis] > max_index)\n else:\n sections = type_check.Variable(\n self.indices_or_sections, 'sections')\n type_check.expect(in_types[0].shape[self.axis] % sections == 0)\n\n def forward(self, x):\n if isinstance(self.indices_or_sections, collections.Iterable):\n cdimx = x[0].shape[self.axis]\n ind = list(self.indices_or_sections)\n ind.append(cdimx)\n prev_i = 0\n for i in ind:\n cdimy = max(0, min(i, cdimx) - prev_i)\n if cdimy == 0:\n raise ValueError('Not support if shape contains 0')\n prev_i = i\n xp = cuda.get_array_module(*x)\n return tuple(xp.split(x[0], self.indices_or_sections, self.axis))\n\n def backward(self, x, gys):\n xp = cuda.get_array_module(*x)\n if any(gy is None for gy in gys):\n gx = xp.zeros_like(x[0])\n gxs = xp.split(gx, self.indices_or_sections, self.axis)\n for gxi, gy in six.moves.zip(gxs, gys):\n if gy is None:\n continue\n gxi[:] = gy\n return gx,\n else:\n return xp.concatenate(gys, axis=self.axis),\n\n\ndef split_axis(x, indices_or_sections, axis, force_tuple=False):\n \"\"\"Splits given variables along an axis.\n\n Args:\n x (tuple of Variables): Variables to be split.\n indices_or_sections (int or 1-D array): If this argument is an integer,\n N, the array will be divided into N equal arrays along axis.\n If it is a 1-D array of sorted integers, it\n indicates the positions where the array is split.\n axis (int): Axis that the input array is split along.\n force_tuple (bool): If ``True``, this method returns a tuple even when\n the number of outputs is one.\n\n Returns:\n tuple or Variable: Tuple of :class:`~chainer.Variable` objects\n if the number of outputs is more than 1 or\n :class:`~chainer.Variable` otherwise.\n When ``force_tuple`` is ``True``, returned value is always a tuple\n regardless of the number of outputs.\n\n .. 
note::\n This function raises :class:`ValueError` if at least\n one of the outputs is split to zero-size\n (i.e. ``axis``-th value of its shape is zero).\n\n \"\"\"\n res = SplitAxis(indices_or_sections, axis)(x)\n if force_tuple and isinstance(res, chainer.Variable):\n res = (res,)\n return res\n", "path": "chainer/functions/array/split_axis.py"}], "after_files": [{"content": "import collections\n\nimport six\n\nimport chainer\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer.utils import type_check\n\n\nclass SplitAxis(function.Function):\n\n \"\"\"Function that splits multiple arrays along the specified axis.\"\"\"\n\n def __init__(self, indices_or_sections, axis):\n if not isinstance(indices_or_sections, (int, collections.Iterable)):\n raise TypeError('indices_or_sections must be integer or 1-D array')\n self.indices_or_sections = indices_or_sections\n self.axis = axis\n\n def check_type_forward(self, in_types):\n type_check.expect(in_types.size() == 1)\n type_check.expect(in_types[0].ndim > self.axis)\n\n if isinstance(self.indices_or_sections, collections.Iterable):\n if len(self.indices_or_sections) > 0:\n max_index = type_check.Variable(\n self.indices_or_sections[-1], 'max_index')\n type_check.expect(in_types[0].shape[self.axis] > max_index)\n else:\n sections = type_check.Variable(\n self.indices_or_sections, 'sections')\n type_check.expect(in_types[0].shape[self.axis] % sections == 0)\n\n def forward(self, x):\n if isinstance(self.indices_or_sections, collections.Iterable):\n cdimx = x[0].shape[self.axis]\n ind = list(self.indices_or_sections)\n ind.append(cdimx)\n prev_i = 0\n for i in ind:\n cdimy = max(0, min(i, cdimx) - prev_i)\n if cdimy == 0:\n raise ValueError('Not support if shape contains 0')\n prev_i = i\n xp = cuda.get_array_module(*x)\n return tuple(xp.split(x[0], self.indices_or_sections, self.axis))\n\n def backward(self, x, gys):\n xp = cuda.get_array_module(*x)\n if any(gy is None for gy in gys):\n gx = xp.zeros_like(x[0])\n gxs = xp.split(gx, self.indices_or_sections, self.axis)\n for gxi, gy in six.moves.zip(gxs, gys):\n if gy is None:\n continue\n gxi[:] = gy\n return gx,\n else:\n return xp.concatenate(gys, axis=self.axis),\n\n\ndef split_axis(x, indices_or_sections, axis, force_tuple=False):\n \"\"\"Splits given variables along an axis.\n\n Args:\n x (tuple of Variables): Variables to be split.\n indices_or_sections (int or 1-D array): If this argument is an integer,\n N, the array will be divided into N equal arrays along axis.\n If it is a 1-D array of sorted integers, it\n indicates the positions where the array is split.\n axis (int): Axis that the input array is split along.\n force_tuple (bool): If ``True``, this method returns a tuple even when\n the number of outputs is one.\n\n Returns:\n tuple or Variable: Tuple of :class:`~chainer.Variable` objects\n if the number of outputs is more than 1 or\n :class:`~chainer.Variable` otherwise.\n When ``force_tuple`` is ``True``, returned value is always a tuple\n regardless of the number of outputs.\n\n .. note::\n This function raises :class:`ValueError` if at least\n one of the outputs is split to zero-size\n (i.e. ``axis``-th value of its shape is zero).\n\n \"\"\"\n res = SplitAxis(indices_or_sections, axis)(x)\n if force_tuple and isinstance(res, chainer.Variable):\n res = (res,)\n return res\n", "path": "chainer/functions/array/split_axis.py"}]}
| 1,247 | 197 |
gh_patches_debug_35028 | rasdani/github-patches | git_diff | strawberry-graphql__strawberry-1071 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Compatibility with pydantic 1.4
I'm trying to use strawberry in a project that has pydantic pinned at 1.4. I chatted with @patrick91 on discord about this, and he thought it would be reasonable to achieve compatibility with this version.
Pydantic appears to only be used in the [strawberry.experimental](https://github.com/strawberry-graphql/strawberry/blob/main/strawberry/experimental/__init__.py) module, which only gets loaded if pydantic is present. One way to solve this for me in particular would be to lazily load strawberry.experimental/pydantic, such that when an older version of pydantic is present, one can still import other packages in strawberry.
Thank you!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `strawberry/experimental/pydantic/fields.py`
Content:
```
1 from decimal import Decimal
2 from typing import Optional
3 from uuid import UUID
4
5 import pydantic
6
7 from .exceptions import UnsupportedTypeError
8
9
10 FIELDS_MAP = {
11 pydantic.NoneStr: Optional[str],
12 pydantic.NoneBytes: Optional[bytes],
13 pydantic.StrBytes: None,
14 pydantic.NoneStrBytes: None,
15 pydantic.StrictStr: str,
16 pydantic.ConstrainedBytes: bytes,
17 pydantic.conbytes: bytes,
18 pydantic.ConstrainedList: None,
19 pydantic.conlist: None,
20 pydantic.ConstrainedSet: None,
21 pydantic.conset: None,
22 pydantic.ConstrainedStr: str,
23 pydantic.constr: str,
24 pydantic.EmailStr: str,
25 pydantic.PyObject: None,
26 pydantic.ConstrainedInt: int,
27 pydantic.conint: int,
28 pydantic.PositiveInt: int,
29 pydantic.NegativeInt: int,
30 pydantic.ConstrainedFloat: float,
31 pydantic.confloat: float,
32 pydantic.PositiveFloat: float,
33 pydantic.NegativeFloat: float,
34 pydantic.ConstrainedDecimal: Decimal,
35 pydantic.condecimal: Decimal,
36 pydantic.UUID1: UUID,
37 pydantic.UUID3: UUID,
38 pydantic.UUID4: UUID,
39 pydantic.UUID5: UUID,
40 pydantic.FilePath: None,
41 pydantic.DirectoryPath: None,
42 pydantic.Json: None,
43 pydantic.JsonWrapper: None,
44 pydantic.SecretStr: str,
45 pydantic.SecretBytes: bytes,
46 pydantic.StrictBool: bool,
47 pydantic.StrictInt: int,
48 pydantic.StrictFloat: float,
49 pydantic.PaymentCardNumber: None,
50 pydantic.ByteSize: None,
51 pydantic.AnyUrl: str,
52 pydantic.AnyHttpUrl: str,
53 pydantic.HttpUrl: str,
54 pydantic.PostgresDsn: str,
55 pydantic.RedisDsn: str,
56 }
57
58
59 def get_basic_type(type_):
60 if isinstance(type_, type):
61 if issubclass(type_, pydantic.ConstrainedInt):
62 return int
63 if issubclass(type_, pydantic.ConstrainedStr):
64 return str
65
66 if type_ in FIELDS_MAP:
67 type_ = FIELDS_MAP.get(type_)
68
69 if type_ is None:
70 raise UnsupportedTypeError()
71
72 return type_
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/strawberry/experimental/pydantic/fields.py b/strawberry/experimental/pydantic/fields.py
--- a/strawberry/experimental/pydantic/fields.py
+++ b/strawberry/experimental/pydantic/fields.py
@@ -7,52 +7,59 @@
from .exceptions import UnsupportedTypeError
+ATTR_TO_TYPE_MAP = {
+ "NoneStr": Optional[str],
+ "NoneBytes": Optional[bytes],
+ "StrBytes": None,
+ "NoneStrBytes": None,
+ "StrictStr": str,
+ "ConstrainedBytes": bytes,
+ "conbytes": bytes,
+ "ConstrainedList": None,
+ "conlist": None,
+ "ConstrainedSet": None,
+ "conset": None,
+ "ConstrainedStr": str,
+ "constr": str,
+ "EmailStr": str,
+ "PyObject": None,
+ "ConstrainedInt": int,
+ "conint": int,
+ "PositiveInt": int,
+ "NegativeInt": int,
+ "ConstrainedFloat": float,
+ "confloat": float,
+ "PositiveFloat": float,
+ "NegativeFloat": float,
+ "ConstrainedDecimal": Decimal,
+ "condecimal": Decimal,
+ "UUID1": UUID,
+ "UUID3": UUID,
+ "UUID4": UUID,
+ "UUID5": UUID,
+ "FilePath": None,
+ "DirectoryPath": None,
+ "Json": None,
+ "JsonWrapper": None,
+ "SecretStr": str,
+ "SecretBytes": bytes,
+ "StrictBool": bool,
+ "StrictInt": int,
+ "StrictFloat": float,
+ "PaymentCardNumber": None,
+ "ByteSize": None,
+ "AnyUrl": str,
+ "AnyHttpUrl": str,
+ "HttpUrl": str,
+ "PostgresDsn": str,
+ "RedisDsn": str,
+}
+
+
FIELDS_MAP = {
- pydantic.NoneStr: Optional[str],
- pydantic.NoneBytes: Optional[bytes],
- pydantic.StrBytes: None,
- pydantic.NoneStrBytes: None,
- pydantic.StrictStr: str,
- pydantic.ConstrainedBytes: bytes,
- pydantic.conbytes: bytes,
- pydantic.ConstrainedList: None,
- pydantic.conlist: None,
- pydantic.ConstrainedSet: None,
- pydantic.conset: None,
- pydantic.ConstrainedStr: str,
- pydantic.constr: str,
- pydantic.EmailStr: str,
- pydantic.PyObject: None,
- pydantic.ConstrainedInt: int,
- pydantic.conint: int,
- pydantic.PositiveInt: int,
- pydantic.NegativeInt: int,
- pydantic.ConstrainedFloat: float,
- pydantic.confloat: float,
- pydantic.PositiveFloat: float,
- pydantic.NegativeFloat: float,
- pydantic.ConstrainedDecimal: Decimal,
- pydantic.condecimal: Decimal,
- pydantic.UUID1: UUID,
- pydantic.UUID3: UUID,
- pydantic.UUID4: UUID,
- pydantic.UUID5: UUID,
- pydantic.FilePath: None,
- pydantic.DirectoryPath: None,
- pydantic.Json: None,
- pydantic.JsonWrapper: None,
- pydantic.SecretStr: str,
- pydantic.SecretBytes: bytes,
- pydantic.StrictBool: bool,
- pydantic.StrictInt: int,
- pydantic.StrictFloat: float,
- pydantic.PaymentCardNumber: None,
- pydantic.ByteSize: None,
- pydantic.AnyUrl: str,
- pydantic.AnyHttpUrl: str,
- pydantic.HttpUrl: str,
- pydantic.PostgresDsn: str,
- pydantic.RedisDsn: str,
+ getattr(pydantic, field_name): type
+ for field_name, type in ATTR_TO_TYPE_MAP.items()
+ if hasattr(pydantic, field_name)
}
|
{"golden_diff": "diff --git a/strawberry/experimental/pydantic/fields.py b/strawberry/experimental/pydantic/fields.py\n--- a/strawberry/experimental/pydantic/fields.py\n+++ b/strawberry/experimental/pydantic/fields.py\n@@ -7,52 +7,59 @@\n from .exceptions import UnsupportedTypeError\n \n \n+ATTR_TO_TYPE_MAP = {\n+ \"NoneStr\": Optional[str],\n+ \"NoneBytes\": Optional[bytes],\n+ \"StrBytes\": None,\n+ \"NoneStrBytes\": None,\n+ \"StrictStr\": str,\n+ \"ConstrainedBytes\": bytes,\n+ \"conbytes\": bytes,\n+ \"ConstrainedList\": None,\n+ \"conlist\": None,\n+ \"ConstrainedSet\": None,\n+ \"conset\": None,\n+ \"ConstrainedStr\": str,\n+ \"constr\": str,\n+ \"EmailStr\": str,\n+ \"PyObject\": None,\n+ \"ConstrainedInt\": int,\n+ \"conint\": int,\n+ \"PositiveInt\": int,\n+ \"NegativeInt\": int,\n+ \"ConstrainedFloat\": float,\n+ \"confloat\": float,\n+ \"PositiveFloat\": float,\n+ \"NegativeFloat\": float,\n+ \"ConstrainedDecimal\": Decimal,\n+ \"condecimal\": Decimal,\n+ \"UUID1\": UUID,\n+ \"UUID3\": UUID,\n+ \"UUID4\": UUID,\n+ \"UUID5\": UUID,\n+ \"FilePath\": None,\n+ \"DirectoryPath\": None,\n+ \"Json\": None,\n+ \"JsonWrapper\": None,\n+ \"SecretStr\": str,\n+ \"SecretBytes\": bytes,\n+ \"StrictBool\": bool,\n+ \"StrictInt\": int,\n+ \"StrictFloat\": float,\n+ \"PaymentCardNumber\": None,\n+ \"ByteSize\": None,\n+ \"AnyUrl\": str,\n+ \"AnyHttpUrl\": str,\n+ \"HttpUrl\": str,\n+ \"PostgresDsn\": str,\n+ \"RedisDsn\": str,\n+}\n+\n+\n FIELDS_MAP = {\n- pydantic.NoneStr: Optional[str],\n- pydantic.NoneBytes: Optional[bytes],\n- pydantic.StrBytes: None,\n- pydantic.NoneStrBytes: None,\n- pydantic.StrictStr: str,\n- pydantic.ConstrainedBytes: bytes,\n- pydantic.conbytes: bytes,\n- pydantic.ConstrainedList: None,\n- pydantic.conlist: None,\n- pydantic.ConstrainedSet: None,\n- pydantic.conset: None,\n- pydantic.ConstrainedStr: str,\n- pydantic.constr: str,\n- pydantic.EmailStr: str,\n- pydantic.PyObject: None,\n- pydantic.ConstrainedInt: int,\n- pydantic.conint: int,\n- pydantic.PositiveInt: int,\n- pydantic.NegativeInt: int,\n- pydantic.ConstrainedFloat: float,\n- pydantic.confloat: float,\n- pydantic.PositiveFloat: float,\n- pydantic.NegativeFloat: float,\n- pydantic.ConstrainedDecimal: Decimal,\n- pydantic.condecimal: Decimal,\n- pydantic.UUID1: UUID,\n- pydantic.UUID3: UUID,\n- pydantic.UUID4: UUID,\n- pydantic.UUID5: UUID,\n- pydantic.FilePath: None,\n- pydantic.DirectoryPath: None,\n- pydantic.Json: None,\n- pydantic.JsonWrapper: None,\n- pydantic.SecretStr: str,\n- pydantic.SecretBytes: bytes,\n- pydantic.StrictBool: bool,\n- pydantic.StrictInt: int,\n- pydantic.StrictFloat: float,\n- pydantic.PaymentCardNumber: None,\n- pydantic.ByteSize: None,\n- pydantic.AnyUrl: str,\n- pydantic.AnyHttpUrl: str,\n- pydantic.HttpUrl: str,\n- pydantic.PostgresDsn: str,\n- pydantic.RedisDsn: str,\n+ getattr(pydantic, field_name): type\n+ for field_name, type in ATTR_TO_TYPE_MAP.items()\n+ if hasattr(pydantic, field_name)\n }\n", "issue": "Compatibility with pydantic 1.4\nI'm trying to use strawberry in a project that has pydantic pinned at 1.4. I chatted with @patrick91 on discord about this, and he thought it would be reasonable to achieve compatibility with this version.\r\n\r\nPydantic appears to only be used in the [strawberry.experimental](https://github.com/strawberry-graphql/strawberry/blob/main/strawberry/experimental/__init__.py) module, which only gets loaded if pydantic is present. 
One way to solve this for me in particular would be to lazily load strawberry.experimental/pydantic, such that when an older version of pydantic is present, one can still import other packages in strawberry.\r\n\r\nThank you!\n", "before_files": [{"content": "from decimal import Decimal\nfrom typing import Optional\nfrom uuid import UUID\n\nimport pydantic\n\nfrom .exceptions import UnsupportedTypeError\n\n\nFIELDS_MAP = {\n pydantic.NoneStr: Optional[str],\n pydantic.NoneBytes: Optional[bytes],\n pydantic.StrBytes: None,\n pydantic.NoneStrBytes: None,\n pydantic.StrictStr: str,\n pydantic.ConstrainedBytes: bytes,\n pydantic.conbytes: bytes,\n pydantic.ConstrainedList: None,\n pydantic.conlist: None,\n pydantic.ConstrainedSet: None,\n pydantic.conset: None,\n pydantic.ConstrainedStr: str,\n pydantic.constr: str,\n pydantic.EmailStr: str,\n pydantic.PyObject: None,\n pydantic.ConstrainedInt: int,\n pydantic.conint: int,\n pydantic.PositiveInt: int,\n pydantic.NegativeInt: int,\n pydantic.ConstrainedFloat: float,\n pydantic.confloat: float,\n pydantic.PositiveFloat: float,\n pydantic.NegativeFloat: float,\n pydantic.ConstrainedDecimal: Decimal,\n pydantic.condecimal: Decimal,\n pydantic.UUID1: UUID,\n pydantic.UUID3: UUID,\n pydantic.UUID4: UUID,\n pydantic.UUID5: UUID,\n pydantic.FilePath: None,\n pydantic.DirectoryPath: None,\n pydantic.Json: None,\n pydantic.JsonWrapper: None,\n pydantic.SecretStr: str,\n pydantic.SecretBytes: bytes,\n pydantic.StrictBool: bool,\n pydantic.StrictInt: int,\n pydantic.StrictFloat: float,\n pydantic.PaymentCardNumber: None,\n pydantic.ByteSize: None,\n pydantic.AnyUrl: str,\n pydantic.AnyHttpUrl: str,\n pydantic.HttpUrl: str,\n pydantic.PostgresDsn: str,\n pydantic.RedisDsn: str,\n}\n\n\ndef get_basic_type(type_):\n if isinstance(type_, type):\n if issubclass(type_, pydantic.ConstrainedInt):\n return int\n if issubclass(type_, pydantic.ConstrainedStr):\n return str\n\n if type_ in FIELDS_MAP:\n type_ = FIELDS_MAP.get(type_)\n\n if type_ is None:\n raise UnsupportedTypeError()\n\n return type_\n", "path": "strawberry/experimental/pydantic/fields.py"}], "after_files": [{"content": "from decimal import Decimal\nfrom typing import Optional\nfrom uuid import UUID\n\nimport pydantic\n\nfrom .exceptions import UnsupportedTypeError\n\n\nATTR_TO_TYPE_MAP = {\n \"NoneStr\": Optional[str],\n \"NoneBytes\": Optional[bytes],\n \"StrBytes\": None,\n \"NoneStrBytes\": None,\n \"StrictStr\": str,\n \"ConstrainedBytes\": bytes,\n \"conbytes\": bytes,\n \"ConstrainedList\": None,\n \"conlist\": None,\n \"ConstrainedSet\": None,\n \"conset\": None,\n \"ConstrainedStr\": str,\n \"constr\": str,\n \"EmailStr\": str,\n \"PyObject\": None,\n \"ConstrainedInt\": int,\n \"conint\": int,\n \"PositiveInt\": int,\n \"NegativeInt\": int,\n \"ConstrainedFloat\": float,\n \"confloat\": float,\n \"PositiveFloat\": float,\n \"NegativeFloat\": float,\n \"ConstrainedDecimal\": Decimal,\n \"condecimal\": Decimal,\n \"UUID1\": UUID,\n \"UUID3\": UUID,\n \"UUID4\": UUID,\n \"UUID5\": UUID,\n \"FilePath\": None,\n \"DirectoryPath\": None,\n \"Json\": None,\n \"JsonWrapper\": None,\n \"SecretStr\": str,\n \"SecretBytes\": bytes,\n \"StrictBool\": bool,\n \"StrictInt\": int,\n \"StrictFloat\": float,\n \"PaymentCardNumber\": None,\n \"ByteSize\": None,\n \"AnyUrl\": str,\n \"AnyHttpUrl\": str,\n \"HttpUrl\": str,\n \"PostgresDsn\": str,\n \"RedisDsn\": str,\n}\n\n\nFIELDS_MAP = {\n getattr(pydantic, field_name): type\n for field_name, type in ATTR_TO_TYPE_MAP.items()\n if hasattr(pydantic, 
field_name)\n}\n\n\ndef get_basic_type(type_):\n if isinstance(type_, type):\n if issubclass(type_, pydantic.ConstrainedInt):\n return int\n if issubclass(type_, pydantic.ConstrainedStr):\n return str\n\n if type_ in FIELDS_MAP:\n type_ = FIELDS_MAP.get(type_)\n\n if type_ is None:\n raise UnsupportedTypeError()\n\n return type_\n", "path": "strawberry/experimental/pydantic/fields.py"}]}
| 1,110 | 975 |
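Editorial aside (not part of the dataset rows above or below): the strawberry record's golden diff replaces a hard-coded `pydantic.<attr>` mapping with an `hasattr`-guarded dict comprehension so that attribute names missing from older pydantic releases are skipped instead of breaking the import. A minimal, standalone sketch of that guard pattern follows; the helper name `build_available_map` is an illustrative assumption, not something in the record.

```python
import types
from typing import Any, Dict


def build_available_map(module: types.ModuleType,
                        attr_to_type: Dict[str, Any]) -> Dict[Any, Any]:
    # Map only attributes that exist on this installed version of the module,
    # so names absent from older releases are silently skipped.
    return {
        getattr(module, name): mapped
        for name, mapped in attr_to_type.items()
        if hasattr(module, name)
    }

# Usage mirroring the record (assuming pydantic is importable):
#   FIELDS_MAP = build_available_map(pydantic, ATTR_TO_TYPE_MAP)
```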
gh_patches_debug_9298 | rasdani/github-patches | git_diff | joke2k__faker-1607 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
es_ES postalcode is not generating valid codes
* Faker version: 11.3
* OS: Any
When using postcode for es_ES and using it with a field that requires a valid Postal Code, it fails sometimes.
I will assume that there is no logic with postal code generation for Spain.
### Steps to reproduce
Generate postal codes
### Expected behavior
Get a valid Spain postal code
### Actual behavior
Unexpected. Many are wrong
----
I'll dig now into the code. Let's see if I can get some more information and fix it :thinking: Do not expect much from me
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/address/es_ES/__init__.py`
Content:
```
1 from ..es import Provider as AddressProvider
2
3
4 class Provider(AddressProvider):
5 building_number_formats = ("%", "%#", "%#", "%#", "%##")
6 street_prefixes = (
7 "Plaza",
8 "Calle",
9 "Avenida",
10 "Via",
11 "Vial",
12 "Rambla",
13 "Glorieta",
14 "Urbanización",
15 "Callejón",
16 "Cañada",
17 "Alameda",
18 "Acceso",
19 "C.",
20 "Ronda",
21 "Pasaje",
22 "Cuesta",
23 "Pasadizo",
24 "Paseo",
25 "Camino",
26 )
27 postcode_formats = ("#####",)
28 states = (
29 "Álava",
30 "Albacete",
31 "Alicante",
32 "Almería",
33 "Asturias",
34 "Ávila",
35 "Badajoz",
36 "Baleares",
37 "Barcelona",
38 "Burgos",
39 "Cáceres",
40 "Cádiz",
41 "Cantabria",
42 "Castellón",
43 "Ceuta",
44 "Ciudad",
45 "Córdoba",
46 "Cuenca",
47 "Girona",
48 "Granada",
49 "Guadalajara",
50 "Guipúzcoa",
51 "Huelva",
52 "Huesca",
53 "Jaén",
54 "La Coruña",
55 "La Rioja",
56 "Las Palmas",
57 "León",
58 "Lleida",
59 "Lugo",
60 "Madrid",
61 "Málaga",
62 "Melilla",
63 "Murcia",
64 "Navarra",
65 "Ourense",
66 "Palencia",
67 "Pontevedra",
68 "Salamanca",
69 "Santa Cruz de Tenerife",
70 "Segovia",
71 "Sevilla",
72 "Soria",
73 "Tarragona",
74 "Teruel",
75 "Toledo",
76 "Valencia",
77 "Valladolid",
78 "Vizcaya",
79 "Zamora",
80 "Zaragoza",
81 )
82
83 # Source:
84 # https://administracionelectronica.gob.es/ctt/resources/Soluciones
85 # /238/Descargas/Catalogo-de-Comunidades-Autonomas.xlsx
86 regions = (
87 "Andalucía",
88 "Aragón",
89 "Principado de Asturias",
90 "Illes Balears",
91 "Canarias",
92 "Cantabria",
93 "Castilla y León",
94 "Castilla-La Mancha",
95 "Cataluña",
96 "Comunitat Valenciana",
97 "Extremadura",
98 "Galicia",
99 "Comunidad de Madrid",
100 "Región de Murcia",
101 "Comunidad Foral de Navarra",
102 "País Vasco",
103 "La Rioja",
104 "Ciudad Autónoma de Ceuta",
105 "Ciudad Autónoma de Melilla",
106 )
107
108 city_formats = ("{{state_name}}",)
109
110 street_name_formats = (
111 "{{street_prefix}} {{first_name}} {{last_name}}",
112 "{{street_prefix}} de {{first_name}} {{last_name}}",
113 )
114 street_address_formats = (
115 "{{street_name}} {{building_number}}",
116 "{{street_name}} {{building_number}} {{secondary_address}} ",
117 )
118 address_formats = ("{{street_address}}\n{{city}}, {{postcode}}",)
119 secondary_address_formats = ("Apt. ##", "Piso #", "Puerta #")
120
121 def state_name(self) -> str:
122 return self.random_element(self.states)
123
124 def street_prefix(self) -> str:
125 return self.random_element(self.street_prefixes)
126
127 def secondary_address(self) -> str:
128 return self.numerify(self.random_element(self.secondary_address_formats))
129
130 def administrative_unit(self) -> str:
131 return self.random_element(self.states)
132
133 state = administrative_unit
134
135 def region(self) -> str:
136 return self.random_element(self.regions)
137
138 autonomous_community = region
139
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/faker/providers/address/es_ES/__init__.py b/faker/providers/address/es_ES/__init__.py
--- a/faker/providers/address/es_ES/__init__.py
+++ b/faker/providers/address/es_ES/__init__.py
@@ -24,7 +24,6 @@
"Paseo",
"Camino",
)
- postcode_formats = ("#####",)
states = (
"Álava",
"Albacete",
@@ -135,4 +134,7 @@
def region(self) -> str:
return self.random_element(self.regions)
+ def postcode(self) -> str:
+ return str(self.generator.random.randint(1000, 52100)).zfill(5)
+
autonomous_community = region
|
{"golden_diff": "diff --git a/faker/providers/address/es_ES/__init__.py b/faker/providers/address/es_ES/__init__.py\n--- a/faker/providers/address/es_ES/__init__.py\n+++ b/faker/providers/address/es_ES/__init__.py\n@@ -24,7 +24,6 @@\n \"Paseo\",\n \"Camino\",\n )\n- postcode_formats = (\"#####\",)\n states = (\n \"\u00c1lava\",\n \"Albacete\",\n@@ -135,4 +134,7 @@\n def region(self) -> str:\n return self.random_element(self.regions)\n \n+ def postcode(self) -> str:\n+ return str(self.generator.random.randint(1000, 52100)).zfill(5)\n+\n autonomous_community = region\n", "issue": "es_ES postalcode is not generating valid codes\n* Faker version: 11.3\r\n* OS: Any\r\n\r\nWhen using postcode for es_ES and using it with a field that requires a valid Postal Code, it fails sometimes.\r\nI will assume that there is no logic with postal code generation for Spain.\r\n\r\n### Steps to reproduce\r\n\r\nGenerate postal codes\r\n\r\n### Expected behavior\r\n\r\nGet a valid Spain postal code\r\n\r\n### Actual behavior\r\n\r\nUnexpected. Many are wrong\r\n\r\n----\r\n\r\nI'll dig now into the code. Let's see if I can get some more information and fix it :thinking: Do not expect much from me\n", "before_files": [{"content": "from ..es import Provider as AddressProvider\n\n\nclass Provider(AddressProvider):\n building_number_formats = (\"%\", \"%#\", \"%#\", \"%#\", \"%##\")\n street_prefixes = (\n \"Plaza\",\n \"Calle\",\n \"Avenida\",\n \"Via\",\n \"Vial\",\n \"Rambla\",\n \"Glorieta\",\n \"Urbanizaci\u00f3n\",\n \"Callej\u00f3n\",\n \"Ca\u00f1ada\",\n \"Alameda\",\n \"Acceso\",\n \"C.\",\n \"Ronda\",\n \"Pasaje\",\n \"Cuesta\",\n \"Pasadizo\",\n \"Paseo\",\n \"Camino\",\n )\n postcode_formats = (\"#####\",)\n states = (\n \"\u00c1lava\",\n \"Albacete\",\n \"Alicante\",\n \"Almer\u00eda\",\n \"Asturias\",\n \"\u00c1vila\",\n \"Badajoz\",\n \"Baleares\",\n \"Barcelona\",\n \"Burgos\",\n \"C\u00e1ceres\",\n \"C\u00e1diz\",\n \"Cantabria\",\n \"Castell\u00f3n\",\n \"Ceuta\",\n \"Ciudad\",\n \"C\u00f3rdoba\",\n \"Cuenca\",\n \"Girona\",\n \"Granada\",\n \"Guadalajara\",\n \"Guip\u00fazcoa\",\n \"Huelva\",\n \"Huesca\",\n \"Ja\u00e9n\",\n \"La Coru\u00f1a\",\n \"La Rioja\",\n \"Las Palmas\",\n \"Le\u00f3n\",\n \"Lleida\",\n \"Lugo\",\n \"Madrid\",\n \"M\u00e1laga\",\n \"Melilla\",\n \"Murcia\",\n \"Navarra\",\n \"Ourense\",\n \"Palencia\",\n \"Pontevedra\",\n \"Salamanca\",\n \"Santa Cruz de Tenerife\",\n \"Segovia\",\n \"Sevilla\",\n \"Soria\",\n \"Tarragona\",\n \"Teruel\",\n \"Toledo\",\n \"Valencia\",\n \"Valladolid\",\n \"Vizcaya\",\n \"Zamora\",\n \"Zaragoza\",\n )\n\n # Source:\n # https://administracionelectronica.gob.es/ctt/resources/Soluciones\n # /238/Descargas/Catalogo-de-Comunidades-Autonomas.xlsx\n regions = (\n \"Andaluc\u00eda\",\n \"Arag\u00f3n\",\n \"Principado de Asturias\",\n \"Illes Balears\",\n \"Canarias\",\n \"Cantabria\",\n \"Castilla y Le\u00f3n\",\n \"Castilla-La Mancha\",\n \"Catalu\u00f1a\",\n \"Comunitat Valenciana\",\n \"Extremadura\",\n \"Galicia\",\n \"Comunidad de Madrid\",\n \"Regi\u00f3n de Murcia\",\n \"Comunidad Foral de Navarra\",\n \"Pa\u00eds Vasco\",\n \"La Rioja\",\n \"Ciudad Aut\u00f3noma de Ceuta\",\n \"Ciudad Aut\u00f3noma de Melilla\",\n )\n\n city_formats = (\"{{state_name}}\",)\n\n street_name_formats = (\n \"{{street_prefix}} {{first_name}} {{last_name}}\",\n \"{{street_prefix}} de {{first_name}} {{last_name}}\",\n )\n street_address_formats = (\n \"{{street_name}} {{building_number}}\",\n \"{{street_name}} {{building_number}} {{secondary_address}} \",\n )\n 
address_formats = (\"{{street_address}}\\n{{city}}, {{postcode}}\",)\n secondary_address_formats = (\"Apt. ##\", \"Piso #\", \"Puerta #\")\n\n def state_name(self) -> str:\n return self.random_element(self.states)\n\n def street_prefix(self) -> str:\n return self.random_element(self.street_prefixes)\n\n def secondary_address(self) -> str:\n return self.numerify(self.random_element(self.secondary_address_formats))\n\n def administrative_unit(self) -> str:\n return self.random_element(self.states)\n\n state = administrative_unit\n\n def region(self) -> str:\n return self.random_element(self.regions)\n\n autonomous_community = region\n", "path": "faker/providers/address/es_ES/__init__.py"}], "after_files": [{"content": "from ..es import Provider as AddressProvider\n\n\nclass Provider(AddressProvider):\n building_number_formats = (\"%\", \"%#\", \"%#\", \"%#\", \"%##\")\n street_prefixes = (\n \"Plaza\",\n \"Calle\",\n \"Avenida\",\n \"Via\",\n \"Vial\",\n \"Rambla\",\n \"Glorieta\",\n \"Urbanizaci\u00f3n\",\n \"Callej\u00f3n\",\n \"Ca\u00f1ada\",\n \"Alameda\",\n \"Acceso\",\n \"C.\",\n \"Ronda\",\n \"Pasaje\",\n \"Cuesta\",\n \"Pasadizo\",\n \"Paseo\",\n \"Camino\",\n )\n states = (\n \"\u00c1lava\",\n \"Albacete\",\n \"Alicante\",\n \"Almer\u00eda\",\n \"Asturias\",\n \"\u00c1vila\",\n \"Badajoz\",\n \"Baleares\",\n \"Barcelona\",\n \"Burgos\",\n \"C\u00e1ceres\",\n \"C\u00e1diz\",\n \"Cantabria\",\n \"Castell\u00f3n\",\n \"Ceuta\",\n \"Ciudad\",\n \"C\u00f3rdoba\",\n \"Cuenca\",\n \"Girona\",\n \"Granada\",\n \"Guadalajara\",\n \"Guip\u00fazcoa\",\n \"Huelva\",\n \"Huesca\",\n \"Ja\u00e9n\",\n \"La Coru\u00f1a\",\n \"La Rioja\",\n \"Las Palmas\",\n \"Le\u00f3n\",\n \"Lleida\",\n \"Lugo\",\n \"Madrid\",\n \"M\u00e1laga\",\n \"Melilla\",\n \"Murcia\",\n \"Navarra\",\n \"Ourense\",\n \"Palencia\",\n \"Pontevedra\",\n \"Salamanca\",\n \"Santa Cruz de Tenerife\",\n \"Segovia\",\n \"Sevilla\",\n \"Soria\",\n \"Tarragona\",\n \"Teruel\",\n \"Toledo\",\n \"Valencia\",\n \"Valladolid\",\n \"Vizcaya\",\n \"Zamora\",\n \"Zaragoza\",\n )\n\n # Source:\n # https://administracionelectronica.gob.es/ctt/resources/Soluciones\n # /238/Descargas/Catalogo-de-Comunidades-Autonomas.xlsx\n regions = (\n \"Andaluc\u00eda\",\n \"Arag\u00f3n\",\n \"Principado de Asturias\",\n \"Illes Balears\",\n \"Canarias\",\n \"Cantabria\",\n \"Castilla y Le\u00f3n\",\n \"Castilla-La Mancha\",\n \"Catalu\u00f1a\",\n \"Comunitat Valenciana\",\n \"Extremadura\",\n \"Galicia\",\n \"Comunidad de Madrid\",\n \"Regi\u00f3n de Murcia\",\n \"Comunidad Foral de Navarra\",\n \"Pa\u00eds Vasco\",\n \"La Rioja\",\n \"Ciudad Aut\u00f3noma de Ceuta\",\n \"Ciudad Aut\u00f3noma de Melilla\",\n )\n\n city_formats = (\"{{state_name}}\",)\n\n street_name_formats = (\n \"{{street_prefix}} {{first_name}} {{last_name}}\",\n \"{{street_prefix}} de {{first_name}} {{last_name}}\",\n )\n street_address_formats = (\n \"{{street_name}} {{building_number}}\",\n \"{{street_name}} {{building_number}} {{secondary_address}} \",\n )\n address_formats = (\"{{street_address}}\\n{{city}}, {{postcode}}\",)\n secondary_address_formats = (\"Apt. 
##\", \"Piso #\", \"Puerta #\")\n\n def state_name(self) -> str:\n return self.random_element(self.states)\n\n def street_prefix(self) -> str:\n return self.random_element(self.street_prefixes)\n\n def secondary_address(self) -> str:\n return self.numerify(self.random_element(self.secondary_address_formats))\n\n def administrative_unit(self) -> str:\n return self.random_element(self.states)\n\n state = administrative_unit\n\n def region(self) -> str:\n return self.random_element(self.regions)\n\n def postcode(self) -> str:\n return str(self.generator.random.randint(1000, 52100)).zfill(5)\n\n autonomous_community = region\n", "path": "faker/providers/address/es_ES/__init__.py"}]}
| 1,598 | 176 |
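Editorial aside (not part of the dataset rows): the faker record above drops the naive `#####` postcode format in favour of a bounded random integer, because valid Spanish postcodes only span roughly 01000–52xxx. Below is a self-contained sketch of that approach with the same bounds as the golden diff; narrowing the range further to the province prefixes actually in use would be an extra assumption beyond the record.

```python
import random
from typing import Optional


def spanish_postcode(rng: Optional[random.Random] = None) -> str:
    # Same bounds as the golden diff: 1000..52100, zero-padded to five digits
    # so leading-zero provinces such as 01 (Álava) stay valid.
    rng = rng or random.Random()
    return str(rng.randint(1000, 52100)).zfill(5)


print(spanish_postcode())  # e.g. "08012"
```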
gh_patches_debug_11009 | rasdani/github-patches | git_diff | pyca__cryptography-7895 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bump BoringSSL and/or OpenSSL in CI
## BoringSSL
[Commit: e2e613c269a6bb3d7c0271150fff48d11fdbbace](https://boringssl.googlesource.com/boringssl/+/e2e613c269a6bb3d7c0271150fff48d11fdbbace)
[Diff](https://boringssl.googlesource.com/boringssl/+/d77fdbff010ee70776036c41155d1b3711ede548..e2e613c269a6bb3d7c0271150fff48d11fdbbace) between the last commit hash merged to this repository and the new commit.
## OpenSSL
[Commit: dc45d4c6faeb53bb68401141d899b9f857bbc51d](https://github.com/openssl/openssl/commit/dc45d4c6faeb53bb68401141d899b9f857bbc51d)
[Diff](https://github.com/openssl/openssl/compare/efec0f4611ee854f2b0b3da0c135e839bf8e7d04...dc45d4c6faeb53bb68401141d899b9f857bbc51d) between the last commit hash merged to this repository and the new commit.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/_cffi_src/openssl/rsa.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5
6 INCLUDES = """
7 #include <openssl/rsa.h>
8 """
9
10 TYPES = """
11 typedef ... RSA;
12 typedef ... BN_GENCB;
13 static const int RSA_PKCS1_PADDING;
14 static const int RSA_NO_PADDING;
15 static const int RSA_PKCS1_OAEP_PADDING;
16 static const int RSA_PKCS1_PSS_PADDING;
17 static const int RSA_F4;
18 static const int RSA_PSS_SALTLEN_AUTO;
19 """
20
21 FUNCTIONS = """
22 RSA *RSA_new(void);
23 void RSA_free(RSA *);
24 int RSA_generate_key_ex(RSA *, int, BIGNUM *, BN_GENCB *);
25 int RSA_check_key(const RSA *);
26 RSA *RSAPublicKey_dup(RSA *);
27 int RSA_blinding_on(RSA *, BN_CTX *);
28 int RSA_print(BIO *, const RSA *, int);
29
30 int RSA_set0_key(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);
31 int RSA_set0_factors(RSA *, BIGNUM *, BIGNUM *);
32 int RSA_set0_crt_params(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);
33 void RSA_get0_key(const RSA *, const BIGNUM **, const BIGNUM **,
34 const BIGNUM **);
35 void RSA_get0_factors(const RSA *, const BIGNUM **, const BIGNUM **);
36 void RSA_get0_crt_params(const RSA *, const BIGNUM **, const BIGNUM **,
37 const BIGNUM **);
38 int EVP_PKEY_CTX_set_rsa_padding(EVP_PKEY_CTX *, int);
39 int EVP_PKEY_CTX_set_rsa_pss_saltlen(EVP_PKEY_CTX *, int);
40 int EVP_PKEY_CTX_set_rsa_mgf1_md(EVP_PKEY_CTX *, EVP_MD *);
41 int EVP_PKEY_CTX_set0_rsa_oaep_label(EVP_PKEY_CTX *, unsigned char *, int);
42
43 int EVP_PKEY_CTX_set_rsa_oaep_md(EVP_PKEY_CTX *, EVP_MD *);
44 """
45
46 CUSTOMIZATIONS = """
47 // BoringSSL doesn't define this constant, but the value is used for
48 // automatic salt length computation as in OpenSSL and LibreSSL
49 #if !defined(RSA_PSS_SALTLEN_AUTO)
50 #define RSA_PSS_SALTLEN_AUTO -2
51 #endif
52 """
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/_cffi_src/openssl/rsa.py b/src/_cffi_src/openssl/rsa.py
--- a/src/_cffi_src/openssl/rsa.py
+++ b/src/_cffi_src/openssl/rsa.py
@@ -16,6 +16,8 @@
static const int RSA_PKCS1_PSS_PADDING;
static const int RSA_F4;
static const int RSA_PSS_SALTLEN_AUTO;
+
+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION;
"""
FUNCTIONS = """
@@ -49,4 +51,10 @@
#if !defined(RSA_PSS_SALTLEN_AUTO)
#define RSA_PSS_SALTLEN_AUTO -2
#endif
+
+#if defined(EVP_PKEY_CTRL_RSA_IMPLICIT_REJECTION)
+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 1;
+#else
+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 0;
+#endif
"""
|
{"golden_diff": "diff --git a/src/_cffi_src/openssl/rsa.py b/src/_cffi_src/openssl/rsa.py\n--- a/src/_cffi_src/openssl/rsa.py\n+++ b/src/_cffi_src/openssl/rsa.py\n@@ -16,6 +16,8 @@\n static const int RSA_PKCS1_PSS_PADDING;\n static const int RSA_F4;\n static const int RSA_PSS_SALTLEN_AUTO;\n+\n+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION;\n \"\"\"\n \n FUNCTIONS = \"\"\"\n@@ -49,4 +51,10 @@\n #if !defined(RSA_PSS_SALTLEN_AUTO)\n #define RSA_PSS_SALTLEN_AUTO -2\n #endif\n+\n+#if defined(EVP_PKEY_CTRL_RSA_IMPLICIT_REJECTION)\n+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 1;\n+#else\n+static const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 0;\n+#endif\n \"\"\"\n", "issue": "Bump BoringSSL and/or OpenSSL in CI\n## BoringSSL\n[Commit: e2e613c269a6bb3d7c0271150fff48d11fdbbace](https://boringssl.googlesource.com/boringssl/+/e2e613c269a6bb3d7c0271150fff48d11fdbbace)\n\n[Diff](https://boringssl.googlesource.com/boringssl/+/d77fdbff010ee70776036c41155d1b3711ede548..e2e613c269a6bb3d7c0271150fff48d11fdbbace) between the last commit hash merged to this repository and the new commit.\n## OpenSSL\n[Commit: dc45d4c6faeb53bb68401141d899b9f857bbc51d](https://github.com/openssl/openssl/commit/dc45d4c6faeb53bb68401141d899b9f857bbc51d)\n\n[Diff](https://github.com/openssl/openssl/compare/efec0f4611ee854f2b0b3da0c135e839bf8e7d04...dc45d4c6faeb53bb68401141d899b9f857bbc51d) between the last commit hash merged to this repository and the new commit.\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\n\nINCLUDES = \"\"\"\n#include <openssl/rsa.h>\n\"\"\"\n\nTYPES = \"\"\"\ntypedef ... RSA;\ntypedef ... BN_GENCB;\nstatic const int RSA_PKCS1_PADDING;\nstatic const int RSA_NO_PADDING;\nstatic const int RSA_PKCS1_OAEP_PADDING;\nstatic const int RSA_PKCS1_PSS_PADDING;\nstatic const int RSA_F4;\nstatic const int RSA_PSS_SALTLEN_AUTO;\n\"\"\"\n\nFUNCTIONS = \"\"\"\nRSA *RSA_new(void);\nvoid RSA_free(RSA *);\nint RSA_generate_key_ex(RSA *, int, BIGNUM *, BN_GENCB *);\nint RSA_check_key(const RSA *);\nRSA *RSAPublicKey_dup(RSA *);\nint RSA_blinding_on(RSA *, BN_CTX *);\nint RSA_print(BIO *, const RSA *, int);\n\nint RSA_set0_key(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);\nint RSA_set0_factors(RSA *, BIGNUM *, BIGNUM *);\nint RSA_set0_crt_params(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);\nvoid RSA_get0_key(const RSA *, const BIGNUM **, const BIGNUM **,\n const BIGNUM **);\nvoid RSA_get0_factors(const RSA *, const BIGNUM **, const BIGNUM **);\nvoid RSA_get0_crt_params(const RSA *, const BIGNUM **, const BIGNUM **,\n const BIGNUM **);\nint EVP_PKEY_CTX_set_rsa_padding(EVP_PKEY_CTX *, int);\nint EVP_PKEY_CTX_set_rsa_pss_saltlen(EVP_PKEY_CTX *, int);\nint EVP_PKEY_CTX_set_rsa_mgf1_md(EVP_PKEY_CTX *, EVP_MD *);\nint EVP_PKEY_CTX_set0_rsa_oaep_label(EVP_PKEY_CTX *, unsigned char *, int);\n\nint EVP_PKEY_CTX_set_rsa_oaep_md(EVP_PKEY_CTX *, EVP_MD *);\n\"\"\"\n\nCUSTOMIZATIONS = \"\"\"\n// BoringSSL doesn't define this constant, but the value is used for\n// automatic salt length computation as in OpenSSL and LibreSSL\n#if !defined(RSA_PSS_SALTLEN_AUTO)\n#define RSA_PSS_SALTLEN_AUTO -2\n#endif\n\"\"\"\n", "path": "src/_cffi_src/openssl/rsa.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\n\nINCLUDES = \"\"\"\n#include <openssl/rsa.h>\n\"\"\"\n\nTYPES = \"\"\"\ntypedef ... RSA;\ntypedef ... BN_GENCB;\nstatic const int RSA_PKCS1_PADDING;\nstatic const int RSA_NO_PADDING;\nstatic const int RSA_PKCS1_OAEP_PADDING;\nstatic const int RSA_PKCS1_PSS_PADDING;\nstatic const int RSA_F4;\nstatic const int RSA_PSS_SALTLEN_AUTO;\n\nstatic const int Cryptography_HAS_IMPLICIT_RSA_REJECTION;\n\"\"\"\n\nFUNCTIONS = \"\"\"\nRSA *RSA_new(void);\nvoid RSA_free(RSA *);\nint RSA_generate_key_ex(RSA *, int, BIGNUM *, BN_GENCB *);\nint RSA_check_key(const RSA *);\nRSA *RSAPublicKey_dup(RSA *);\nint RSA_blinding_on(RSA *, BN_CTX *);\nint RSA_print(BIO *, const RSA *, int);\n\nint RSA_set0_key(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);\nint RSA_set0_factors(RSA *, BIGNUM *, BIGNUM *);\nint RSA_set0_crt_params(RSA *, BIGNUM *, BIGNUM *, BIGNUM *);\nvoid RSA_get0_key(const RSA *, const BIGNUM **, const BIGNUM **,\n const BIGNUM **);\nvoid RSA_get0_factors(const RSA *, const BIGNUM **, const BIGNUM **);\nvoid RSA_get0_crt_params(const RSA *, const BIGNUM **, const BIGNUM **,\n const BIGNUM **);\nint EVP_PKEY_CTX_set_rsa_padding(EVP_PKEY_CTX *, int);\nint EVP_PKEY_CTX_set_rsa_pss_saltlen(EVP_PKEY_CTX *, int);\nint EVP_PKEY_CTX_set_rsa_mgf1_md(EVP_PKEY_CTX *, EVP_MD *);\nint EVP_PKEY_CTX_set0_rsa_oaep_label(EVP_PKEY_CTX *, unsigned char *, int);\n\nint EVP_PKEY_CTX_set_rsa_oaep_md(EVP_PKEY_CTX *, EVP_MD *);\n\"\"\"\n\nCUSTOMIZATIONS = \"\"\"\n// BoringSSL doesn't define this constant, but the value is used for\n// automatic salt length computation as in OpenSSL and LibreSSL\n#if !defined(RSA_PSS_SALTLEN_AUTO)\n#define RSA_PSS_SALTLEN_AUTO -2\n#endif\n\n#if defined(EVP_PKEY_CTRL_RSA_IMPLICIT_REJECTION)\nstatic const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 1;\n#else\nstatic const int Cryptography_HAS_IMPLICIT_RSA_REJECTION = 0;\n#endif\n\"\"\"\n", "path": "src/_cffi_src/openssl/rsa.py"}]}
| 1,251 | 207 |
gh_patches_debug_13019 | rasdani/github-patches | git_diff | getsentry__sentry-52100 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SDK Crash Detection: Store Project ID and Event ID
Store project ID and event ID in the SDK crash detection context to find the original SDK crash event, which is only possible with admin Sentry rights.
https://github.com/getsentry/sentry/blob/2c31ee009b44964f78b9e7e8282e602b7ef849b0/src/sentry/utils/sdk_crashes/sdk_crash_detection.py#L40C2-L42
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/utils/sdk_crashes/sdk_crash_detection.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import Any, Mapping, Optional
4
5 from sentry.eventstore.models import Event
6 from sentry.issues.grouptype import GroupCategory
7 from sentry.utils.safe import get_path, set_path
8 from sentry.utils.sdk_crashes.cocoa_sdk_crash_detector import CocoaSDKCrashDetector
9 from sentry.utils.sdk_crashes.event_stripper import strip_event_data
10 from sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector
11
12
13 class SDKCrashReporter:
14 def report(self, event_data: Mapping[str, Any], event_project_id: int) -> Event:
15 from sentry.event_manager import EventManager
16
17 manager = EventManager(dict(event_data))
18 manager.normalize()
19 return manager.save(project_id=event_project_id)
20
21
22 class SDKCrashDetection:
23 def __init__(
24 self,
25 sdk_crash_reporter: SDKCrashReporter,
26 sdk_crash_detector: SDKCrashDetector,
27 ):
28 self.sdk_crash_reporter = sdk_crash_reporter
29 self.cocoa_sdk_crash_detector = sdk_crash_detector
30
31 def detect_sdk_crash(self, event: Event, event_project_id: int) -> Optional[Event]:
32 should_detect_sdk_crash = (
33 event.group
34 and event.group.issue_category == GroupCategory.ERROR
35 and event.group.platform == "cocoa"
36 )
37 if not should_detect_sdk_crash:
38 return None
39
40 context = get_path(event.data, "contexts", "sdk_crash_detection")
41 if context is not None and context.get("detected", False):
42 return None
43
44 # Getting the frames and checking if the event is unhandled might different per platform.
45 # We will change this once we implement this for more platforms.
46 is_unhandled = (
47 get_path(event.data, "exception", "values", -1, "mechanism", "handled") is False
48 )
49 if is_unhandled is False:
50 return None
51
52 frames = get_path(event.data, "exception", "values", -1, "stacktrace", "frames")
53 if not frames:
54 return None
55
56 if self.cocoa_sdk_crash_detector.is_sdk_crash(frames):
57 sdk_crash_event_data = strip_event_data(event.data, self.cocoa_sdk_crash_detector)
58
59 set_path(
60 sdk_crash_event_data, "contexts", "sdk_crash_detection", value={"detected": True}
61 )
62
63 return self.sdk_crash_reporter.report(sdk_crash_event_data, event_project_id)
64
65 return None
66
67
68 _crash_reporter = SDKCrashReporter()
69 _cocoa_sdk_crash_detector = CocoaSDKCrashDetector()
70
71 sdk_crash_detection = SDKCrashDetection(_crash_reporter, _cocoa_sdk_crash_detector)
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/sentry/utils/sdk_crashes/sdk_crash_detection.py b/src/sentry/utils/sdk_crashes/sdk_crash_detection.py
--- a/src/sentry/utils/sdk_crashes/sdk_crash_detection.py
+++ b/src/sentry/utils/sdk_crashes/sdk_crash_detection.py
@@ -57,7 +57,14 @@
sdk_crash_event_data = strip_event_data(event.data, self.cocoa_sdk_crash_detector)
set_path(
- sdk_crash_event_data, "contexts", "sdk_crash_detection", value={"detected": True}
+ sdk_crash_event_data,
+ "contexts",
+ "sdk_crash_detection",
+ value={
+ "detected": True,
+ "original_project_id": event.project.id,
+ "original_event_id": event.event_id,
+ },
)
return self.sdk_crash_reporter.report(sdk_crash_event_data, event_project_id)
|
{"golden_diff": "diff --git a/src/sentry/utils/sdk_crashes/sdk_crash_detection.py b/src/sentry/utils/sdk_crashes/sdk_crash_detection.py\n--- a/src/sentry/utils/sdk_crashes/sdk_crash_detection.py\n+++ b/src/sentry/utils/sdk_crashes/sdk_crash_detection.py\n@@ -57,7 +57,14 @@\n sdk_crash_event_data = strip_event_data(event.data, self.cocoa_sdk_crash_detector)\n \n set_path(\n- sdk_crash_event_data, \"contexts\", \"sdk_crash_detection\", value={\"detected\": True}\n+ sdk_crash_event_data,\n+ \"contexts\",\n+ \"sdk_crash_detection\",\n+ value={\n+ \"detected\": True,\n+ \"original_project_id\": event.project.id,\n+ \"original_event_id\": event.event_id,\n+ },\n )\n \n return self.sdk_crash_reporter.report(sdk_crash_event_data, event_project_id)\n", "issue": "SDK Crash Detection: Store Project ID and Event ID\nStore project ID and event ID in the SDK crash detection context to find the original SDK crash event, which is only possible with admin Sentry rights.\r\n\r\nhttps://github.com/getsentry/sentry/blob/2c31ee009b44964f78b9e7e8282e602b7ef849b0/src/sentry/utils/sdk_crashes/sdk_crash_detection.py#L40C2-L42\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any, Mapping, Optional\n\nfrom sentry.eventstore.models import Event\nfrom sentry.issues.grouptype import GroupCategory\nfrom sentry.utils.safe import get_path, set_path\nfrom sentry.utils.sdk_crashes.cocoa_sdk_crash_detector import CocoaSDKCrashDetector\nfrom sentry.utils.sdk_crashes.event_stripper import strip_event_data\nfrom sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector\n\n\nclass SDKCrashReporter:\n def report(self, event_data: Mapping[str, Any], event_project_id: int) -> Event:\n from sentry.event_manager import EventManager\n\n manager = EventManager(dict(event_data))\n manager.normalize()\n return manager.save(project_id=event_project_id)\n\n\nclass SDKCrashDetection:\n def __init__(\n self,\n sdk_crash_reporter: SDKCrashReporter,\n sdk_crash_detector: SDKCrashDetector,\n ):\n self.sdk_crash_reporter = sdk_crash_reporter\n self.cocoa_sdk_crash_detector = sdk_crash_detector\n\n def detect_sdk_crash(self, event: Event, event_project_id: int) -> Optional[Event]:\n should_detect_sdk_crash = (\n event.group\n and event.group.issue_category == GroupCategory.ERROR\n and event.group.platform == \"cocoa\"\n )\n if not should_detect_sdk_crash:\n return None\n\n context = get_path(event.data, \"contexts\", \"sdk_crash_detection\")\n if context is not None and context.get(\"detected\", False):\n return None\n\n # Getting the frames and checking if the event is unhandled might different per platform.\n # We will change this once we implement this for more platforms.\n is_unhandled = (\n get_path(event.data, \"exception\", \"values\", -1, \"mechanism\", \"handled\") is False\n )\n if is_unhandled is False:\n return None\n\n frames = get_path(event.data, \"exception\", \"values\", -1, \"stacktrace\", \"frames\")\n if not frames:\n return None\n\n if self.cocoa_sdk_crash_detector.is_sdk_crash(frames):\n sdk_crash_event_data = strip_event_data(event.data, self.cocoa_sdk_crash_detector)\n\n set_path(\n sdk_crash_event_data, \"contexts\", \"sdk_crash_detection\", value={\"detected\": True}\n )\n\n return self.sdk_crash_reporter.report(sdk_crash_event_data, event_project_id)\n\n return None\n\n\n_crash_reporter = SDKCrashReporter()\n_cocoa_sdk_crash_detector = CocoaSDKCrashDetector()\n\nsdk_crash_detection = SDKCrashDetection(_crash_reporter, _cocoa_sdk_crash_detector)\n", "path": 
"src/sentry/utils/sdk_crashes/sdk_crash_detection.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any, Mapping, Optional\n\nfrom sentry.eventstore.models import Event\nfrom sentry.issues.grouptype import GroupCategory\nfrom sentry.utils.safe import get_path, set_path\nfrom sentry.utils.sdk_crashes.cocoa_sdk_crash_detector import CocoaSDKCrashDetector\nfrom sentry.utils.sdk_crashes.event_stripper import strip_event_data\nfrom sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector\n\n\nclass SDKCrashReporter:\n def report(self, event_data: Mapping[str, Any], event_project_id: int) -> Event:\n from sentry.event_manager import EventManager\n\n manager = EventManager(dict(event_data))\n manager.normalize()\n return manager.save(project_id=event_project_id)\n\n\nclass SDKCrashDetection:\n def __init__(\n self,\n sdk_crash_reporter: SDKCrashReporter,\n sdk_crash_detector: SDKCrashDetector,\n ):\n self.sdk_crash_reporter = sdk_crash_reporter\n self.cocoa_sdk_crash_detector = sdk_crash_detector\n\n def detect_sdk_crash(self, event: Event, event_project_id: int) -> Optional[Event]:\n should_detect_sdk_crash = (\n event.group\n and event.group.issue_category == GroupCategory.ERROR\n and event.group.platform == \"cocoa\"\n )\n if not should_detect_sdk_crash:\n return None\n\n context = get_path(event.data, \"contexts\", \"sdk_crash_detection\")\n if context is not None and context.get(\"detected\", False):\n return None\n\n # Getting the frames and checking if the event is unhandled might different per platform.\n # We will change this once we implement this for more platforms.\n is_unhandled = (\n get_path(event.data, \"exception\", \"values\", -1, \"mechanism\", \"handled\") is False\n )\n if is_unhandled is False:\n return None\n\n frames = get_path(event.data, \"exception\", \"values\", -1, \"stacktrace\", \"frames\")\n if not frames:\n return None\n\n if self.cocoa_sdk_crash_detector.is_sdk_crash(frames):\n sdk_crash_event_data = strip_event_data(event.data, self.cocoa_sdk_crash_detector)\n\n set_path(\n sdk_crash_event_data,\n \"contexts\",\n \"sdk_crash_detection\",\n value={\n \"detected\": True,\n \"original_project_id\": event.project.id,\n \"original_event_id\": event.event_id,\n },\n )\n\n return self.sdk_crash_reporter.report(sdk_crash_event_data, event_project_id)\n\n return None\n\n\n_crash_reporter = SDKCrashReporter()\n_cocoa_sdk_crash_detector = CocoaSDKCrashDetector()\n\nsdk_crash_detection = SDKCrashDetection(_crash_reporter, _cocoa_sdk_crash_detector)\n", "path": "src/sentry/utils/sdk_crashes/sdk_crash_detection.py"}]}
| 1,114 | 204 |
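Editorial aside (not part of the dataset rows): the sentry record's golden diff enriches the `sdk_crash_detection` context with the originating project and event IDs so the stripped crash copy can be traced back to the original event. A framework-free sketch of just that payload is shown below; the helper name is hypothetical, and the real patch writes the dict through `set_path` on the event data rather than returning it.

```python
from typing import Any, Dict


def sdk_crash_context(original_project_id: int, original_event_id: str) -> Dict[str, Any]:
    # Mirrors the keys added in the golden diff above.
    return {
        "detected": True,
        "original_project_id": original_project_id,
        "original_event_id": original_event_id,
    }


# Example values only, not taken from the record:
context = sdk_crash_context(42, "0123456789abcdef0123456789abcdef")
```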
gh_patches_debug_35563 | rasdani/github-patches | git_diff | litestar-org__litestar-784 |
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: StaticFiles sends files as `content-disposition: 'attachment'` in html-mode
**Describe the bug**
When using `StaticFiles` in html-mode, files are being sent with `content-disposition: 'attachment'`
**To Reproduce**
Create an `html/index.html` file. Run:
```python
from starlite import Starlite, StaticFilesConfig, TestClient
app = Starlite(
static_files_config=[StaticFilesConfig(path="/", directories=["html"], html_mode=True)], route_handlers=[]
)
with TestClient(app=app) as client:
res = client.get("/index.html")
assert not res.headers["content-disposition"].startswith("attachment")
```
Bug: StaticFiles sends files as `content-disposition: 'attachment'` in html-mode
**Describe the bug**
When using `StaticFiles` in html-mode, files are being sent with `content-disposition: 'attachment'`
**To Reproduce**
Create an `html/index.html` file. Run:
```python
from starlite import Starlite, StaticFilesConfig, TestClient
app = Starlite(
static_files_config=[StaticFilesConfig(path="/", directories=["html"], html_mode=True)], route_handlers=[]
)
with TestClient(app=app) as client:
res = client.get("/index.html")
assert not res.headers["content-disposition"].startswith("attachment")
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlite/static_files/base.py`
Content:
```
1 from os.path import commonpath, join
2 from typing import TYPE_CHECKING, List, Tuple, Union
3
4 from starlite.enums import ScopeType
5 from starlite.exceptions import MethodNotAllowedException, NotFoundException
6 from starlite.response import FileResponse
7 from starlite.status_codes import HTTP_404_NOT_FOUND
8 from starlite.utils.file import FileSystemAdapter
9
10 if TYPE_CHECKING:
11
12 from starlite.types import Receive, Scope, Send
13 from starlite.types.composite_types import PathType
14 from starlite.types.file_types import FileInfo, FileSystemProtocol
15
16
17 class StaticFiles:
18 __slots__ = ("is_html_mode", "directories", "adapter")
19
20 def __init__(self, is_html_mode: bool, directories: List["PathType"], file_system: "FileSystemProtocol") -> None:
21 """This class is an ASGI App that handles file sending.
22
23 Args:
24 is_html_mode: Flag dictating whether serving html. If true, the default file will be 'index.html'.
25 directories: A list of directories to serve files from.
26 file_system: The file_system spec to use for serving files.
27 """
28 self.adapter = FileSystemAdapter(file_system)
29 self.directories = directories
30 self.is_html_mode = is_html_mode
31
32 async def get_fs_info(
33 self, directories: List["PathType"], file_path: str
34 ) -> Union[Tuple[str, "FileInfo"], Tuple[None, None]]:
35 """Resolves the file path and returns the resolved path and a.
36
37 [stat_result][os.stat_result].
38
39 Args:
40 directories: A list of directory paths.
41 file_path: A file path to resolve
42
43 Returns:
44 A tuple with an optional resolved [Path][anyio.Path] instance and an optional [stat_result][os.stat_result].
45 """
46 for directory in directories:
47 try:
48 joined_path = join(directory, file_path) # noqa: PL118
49 file_info = await self.adapter.info(joined_path)
50 if file_info and commonpath([str(directory), file_info["name"], joined_path]) == str(directory):
51 return joined_path, file_info
52 except FileNotFoundError:
53 continue
54 return None, None
55
56 async def __call__(self, scope: "Scope", receive: "Receive", send: "Send") -> None:
57 if scope["type"] != ScopeType.HTTP or scope["method"] not in {"GET", "HEAD"}:
58 raise MethodNotAllowedException()
59
60 split_path = scope["path"].split("/")
61 filename = split_path[-1]
62 joined_path = join(*split_path) # noqa: PL118
63 resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=joined_path)
64
65 if fs_info and fs_info["type"] == "directory" and self.is_html_mode:
66 filename = "index.html"
67 resolved_path, fs_info = await self.get_fs_info(
68 directories=self.directories, file_path=join(resolved_path or joined_path, filename)
69 )
70
71 if fs_info and fs_info["type"] == "file":
72 await FileResponse(
73 path=resolved_path or joined_path,
74 file_info=fs_info,
75 file_system=self.adapter.file_system,
76 filename=filename,
77 is_head_response=scope["method"] == "HEAD",
78 )(scope, receive, send)
79 return
80
81 if self.is_html_mode:
82 filename = "404.html"
83 resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=filename)
84 if fs_info and fs_info["type"] == "file":
85 await FileResponse(
86 path=resolved_path or joined_path,
87 file_info=fs_info,
88 file_system=self.adapter.file_system,
89 filename=filename,
90 is_head_response=scope["method"] == "HEAD",
91 status_code=HTTP_404_NOT_FOUND,
92 )(scope, receive, send)
93 return
94
95 raise NotFoundException(f"no file or directory match the path {resolved_path or joined_path} was found")
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/starlite/static_files/base.py b/starlite/static_files/base.py
--- a/starlite/static_files/base.py
+++ b/starlite/static_files/base.py
@@ -8,6 +8,7 @@
from starlite.utils.file import FileSystemAdapter
if TYPE_CHECKING:
+ from typing_extensions import Literal
from starlite.types import Receive, Scope, Send
from starlite.types.composite_types import PathType
@@ -61,12 +62,15 @@
filename = split_path[-1]
joined_path = join(*split_path) # noqa: PL118
resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=joined_path)
+ content_disposition_type: "Literal['inline', 'attachment']" = "attachment"
- if fs_info and fs_info["type"] == "directory" and self.is_html_mode:
- filename = "index.html"
- resolved_path, fs_info = await self.get_fs_info(
- directories=self.directories, file_path=join(resolved_path or joined_path, filename)
- )
+ if self.is_html_mode:
+ content_disposition_type = "inline"
+ if fs_info and fs_info["type"] == "directory":
+ filename = "index.html"
+ resolved_path, fs_info = await self.get_fs_info(
+ directories=self.directories, file_path=join(resolved_path or joined_path, filename)
+ )
if fs_info and fs_info["type"] == "file":
await FileResponse(
@@ -75,6 +79,7 @@
file_system=self.adapter.file_system,
filename=filename,
is_head_response=scope["method"] == "HEAD",
+ content_disposition_type=content_disposition_type,
)(scope, receive, send)
return
@@ -89,6 +94,7 @@
filename=filename,
is_head_response=scope["method"] == "HEAD",
status_code=HTTP_404_NOT_FOUND,
+ content_disposition_type=content_disposition_type,
)(scope, receive, send)
return
|
{"golden_diff": "diff --git a/starlite/static_files/base.py b/starlite/static_files/base.py\n--- a/starlite/static_files/base.py\n+++ b/starlite/static_files/base.py\n@@ -8,6 +8,7 @@\n from starlite.utils.file import FileSystemAdapter\n \n if TYPE_CHECKING:\n+ from typing_extensions import Literal\n \n from starlite.types import Receive, Scope, Send\n from starlite.types.composite_types import PathType\n@@ -61,12 +62,15 @@\n filename = split_path[-1]\n joined_path = join(*split_path) # noqa: PL118\n resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=joined_path)\n+ content_disposition_type: \"Literal['inline', 'attachment']\" = \"attachment\"\n \n- if fs_info and fs_info[\"type\"] == \"directory\" and self.is_html_mode:\n- filename = \"index.html\"\n- resolved_path, fs_info = await self.get_fs_info(\n- directories=self.directories, file_path=join(resolved_path or joined_path, filename)\n- )\n+ if self.is_html_mode:\n+ content_disposition_type = \"inline\"\n+ if fs_info and fs_info[\"type\"] == \"directory\":\n+ filename = \"index.html\"\n+ resolved_path, fs_info = await self.get_fs_info(\n+ directories=self.directories, file_path=join(resolved_path or joined_path, filename)\n+ )\n \n if fs_info and fs_info[\"type\"] == \"file\":\n await FileResponse(\n@@ -75,6 +79,7 @@\n file_system=self.adapter.file_system,\n filename=filename,\n is_head_response=scope[\"method\"] == \"HEAD\",\n+ content_disposition_type=content_disposition_type,\n )(scope, receive, send)\n return\n \n@@ -89,6 +94,7 @@\n filename=filename,\n is_head_response=scope[\"method\"] == \"HEAD\",\n status_code=HTTP_404_NOT_FOUND,\n+ content_disposition_type=content_disposition_type,\n )(scope, receive, send)\n return\n", "issue": "Bug: StaticFiles sends files as `content-disposition: 'attachment'` in html-mode\n**Describe the bug**\r\nWhen using `StaticFiles` in html-mode, files are being sent with `content-disposition: 'attachment'`\r\n\r\n**To Reproduce**\r\nCreate an `html/index.html` file. Run:\r\n\r\n```python\r\nfrom starlite import Starlite, StaticFilesConfig, TestClient\r\n\r\napp = Starlite(\r\n static_files_config=[StaticFilesConfig(path=\"/\", directories=[\"html\"], html_mode=True)], route_handlers=[]\r\n)\r\n\r\nwith TestClient(app=app) as client:\r\n res = client.get(\"/index.html\")\r\n assert not res.headers[\"content-disposition\"].startswith(\"attachment\")\r\n```\r\n\nBug: StaticFiles sends files as `content-disposition: 'attachment'` in html-mode\n**Describe the bug**\r\nWhen using `StaticFiles` in html-mode, files are being sent with `content-disposition: 'attachment'`\r\n\r\n**To Reproduce**\r\nCreate an `html/index.html` file. 
Run:\r\n\r\n```python\r\nfrom starlite import Starlite, StaticFilesConfig, TestClient\r\n\r\napp = Starlite(\r\n static_files_config=[StaticFilesConfig(path=\"/\", directories=[\"html\"], html_mode=True)], route_handlers=[]\r\n)\r\n\r\nwith TestClient(app=app) as client:\r\n res = client.get(\"/index.html\")\r\n assert not res.headers[\"content-disposition\"].startswith(\"attachment\")\r\n```\r\n\n", "before_files": [{"content": "from os.path import commonpath, join\nfrom typing import TYPE_CHECKING, List, Tuple, Union\n\nfrom starlite.enums import ScopeType\nfrom starlite.exceptions import MethodNotAllowedException, NotFoundException\nfrom starlite.response import FileResponse\nfrom starlite.status_codes import HTTP_404_NOT_FOUND\nfrom starlite.utils.file import FileSystemAdapter\n\nif TYPE_CHECKING:\n\n from starlite.types import Receive, Scope, Send\n from starlite.types.composite_types import PathType\n from starlite.types.file_types import FileInfo, FileSystemProtocol\n\n\nclass StaticFiles:\n __slots__ = (\"is_html_mode\", \"directories\", \"adapter\")\n\n def __init__(self, is_html_mode: bool, directories: List[\"PathType\"], file_system: \"FileSystemProtocol\") -> None:\n \"\"\"This class is an ASGI App that handles file sending.\n\n Args:\n is_html_mode: Flag dictating whether serving html. If true, the default file will be 'index.html'.\n directories: A list of directories to serve files from.\n file_system: The file_system spec to use for serving files.\n \"\"\"\n self.adapter = FileSystemAdapter(file_system)\n self.directories = directories\n self.is_html_mode = is_html_mode\n\n async def get_fs_info(\n self, directories: List[\"PathType\"], file_path: str\n ) -> Union[Tuple[str, \"FileInfo\"], Tuple[None, None]]:\n \"\"\"Resolves the file path and returns the resolved path and a.\n\n [stat_result][os.stat_result].\n\n Args:\n directories: A list of directory paths.\n file_path: A file path to resolve\n\n Returns:\n A tuple with an optional resolved [Path][anyio.Path] instance and an optional [stat_result][os.stat_result].\n \"\"\"\n for directory in directories:\n try:\n joined_path = join(directory, file_path) # noqa: PL118\n file_info = await self.adapter.info(joined_path)\n if file_info and commonpath([str(directory), file_info[\"name\"], joined_path]) == str(directory):\n return joined_path, file_info\n except FileNotFoundError:\n continue\n return None, None\n\n async def __call__(self, scope: \"Scope\", receive: \"Receive\", send: \"Send\") -> None:\n if scope[\"type\"] != ScopeType.HTTP or scope[\"method\"] not in {\"GET\", \"HEAD\"}:\n raise MethodNotAllowedException()\n\n split_path = scope[\"path\"].split(\"/\")\n filename = split_path[-1]\n joined_path = join(*split_path) # noqa: PL118\n resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=joined_path)\n\n if fs_info and fs_info[\"type\"] == \"directory\" and self.is_html_mode:\n filename = \"index.html\"\n resolved_path, fs_info = await self.get_fs_info(\n directories=self.directories, file_path=join(resolved_path or joined_path, filename)\n )\n\n if fs_info and fs_info[\"type\"] == \"file\":\n await FileResponse(\n path=resolved_path or joined_path,\n file_info=fs_info,\n file_system=self.adapter.file_system,\n filename=filename,\n is_head_response=scope[\"method\"] == \"HEAD\",\n )(scope, receive, send)\n return\n\n if self.is_html_mode:\n filename = \"404.html\"\n resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=filename)\n if fs_info and 
fs_info[\"type\"] == \"file\":\n await FileResponse(\n path=resolved_path or joined_path,\n file_info=fs_info,\n file_system=self.adapter.file_system,\n filename=filename,\n is_head_response=scope[\"method\"] == \"HEAD\",\n status_code=HTTP_404_NOT_FOUND,\n )(scope, receive, send)\n return\n\n raise NotFoundException(f\"no file or directory match the path {resolved_path or joined_path} was found\")\n", "path": "starlite/static_files/base.py"}], "after_files": [{"content": "from os.path import commonpath, join\nfrom typing import TYPE_CHECKING, List, Tuple, Union\n\nfrom starlite.enums import ScopeType\nfrom starlite.exceptions import MethodNotAllowedException, NotFoundException\nfrom starlite.response import FileResponse\nfrom starlite.status_codes import HTTP_404_NOT_FOUND\nfrom starlite.utils.file import FileSystemAdapter\n\nif TYPE_CHECKING:\n from typing_extensions import Literal\n\n from starlite.types import Receive, Scope, Send\n from starlite.types.composite_types import PathType\n from starlite.types.file_types import FileInfo, FileSystemProtocol\n\n\nclass StaticFiles:\n __slots__ = (\"is_html_mode\", \"directories\", \"adapter\")\n\n def __init__(self, is_html_mode: bool, directories: List[\"PathType\"], file_system: \"FileSystemProtocol\") -> None:\n \"\"\"This class is an ASGI App that handles file sending.\n\n Args:\n is_html_mode: Flag dictating whether serving html. If true, the default file will be 'index.html'.\n directories: A list of directories to serve files from.\n file_system: The file_system spec to use for serving files.\n \"\"\"\n self.adapter = FileSystemAdapter(file_system)\n self.directories = directories\n self.is_html_mode = is_html_mode\n\n async def get_fs_info(\n self, directories: List[\"PathType\"], file_path: str\n ) -> Union[Tuple[str, \"FileInfo\"], Tuple[None, None]]:\n \"\"\"Resolves the file path and returns the resolved path and a.\n\n [stat_result][os.stat_result].\n\n Args:\n directories: A list of directory paths.\n file_path: A file path to resolve\n\n Returns:\n A tuple with an optional resolved [Path][anyio.Path] instance and an optional [stat_result][os.stat_result].\n \"\"\"\n for directory in directories:\n try:\n joined_path = join(directory, file_path) # noqa: PL118\n file_info = await self.adapter.info(joined_path)\n if file_info and commonpath([str(directory), file_info[\"name\"], joined_path]) == str(directory):\n return joined_path, file_info\n except FileNotFoundError:\n continue\n return None, None\n\n async def __call__(self, scope: \"Scope\", receive: \"Receive\", send: \"Send\") -> None:\n if scope[\"type\"] != ScopeType.HTTP or scope[\"method\"] not in {\"GET\", \"HEAD\"}:\n raise MethodNotAllowedException()\n\n split_path = scope[\"path\"].split(\"/\")\n filename = split_path[-1]\n joined_path = join(*split_path) # noqa: PL118\n resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=joined_path)\n content_disposition_type: \"Literal['inline', 'attachment']\" = \"attachment\"\n\n if self.is_html_mode:\n content_disposition_type = \"inline\"\n if fs_info and fs_info[\"type\"] == \"directory\":\n filename = \"index.html\"\n resolved_path, fs_info = await self.get_fs_info(\n directories=self.directories, file_path=join(resolved_path or joined_path, filename)\n )\n\n if fs_info and fs_info[\"type\"] == \"file\":\n await FileResponse(\n path=resolved_path or joined_path,\n file_info=fs_info,\n file_system=self.adapter.file_system,\n filename=filename,\n is_head_response=scope[\"method\"] == 
\"HEAD\",\n content_disposition_type=content_disposition_type,\n )(scope, receive, send)\n return\n\n if self.is_html_mode:\n filename = \"404.html\"\n resolved_path, fs_info = await self.get_fs_info(directories=self.directories, file_path=filename)\n if fs_info and fs_info[\"type\"] == \"file\":\n await FileResponse(\n path=resolved_path or joined_path,\n file_info=fs_info,\n file_system=self.adapter.file_system,\n filename=filename,\n is_head_response=scope[\"method\"] == \"HEAD\",\n status_code=HTTP_404_NOT_FOUND,\n content_disposition_type=content_disposition_type,\n )(scope, receive, send)\n return\n\n raise NotFoundException(f\"no file or directory match the path {resolved_path or joined_path} was found\")\n", "path": "starlite/static_files/base.py"}]}
| 1,608 | 468 |
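For reference, a framework-independent sketch of the decision the patch in the entry above encodes: in HTML mode the file should be served inline so the browser renders it, otherwise the previous attachment behaviour is kept. The helper name is illustrative only and is not part of Starlite's API.

```python
from typing import Literal


def choose_content_disposition(is_html_mode: bool) -> Literal["inline", "attachment"]:
    # HTML mode serves pages the browser should render, so use "inline";
    # outside HTML mode keep the original download-style "attachment".
    return "inline" if is_html_mode else "attachment"


assert choose_content_disposition(True) == "inline"
assert choose_content_disposition(False) == "attachment"
```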
gh_patches_debug_27756
|
rasdani/github-patches
|
git_diff
|
scrapy__scrapy-5002
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
refactoring curl_to_request_kwargs to reduce cyclomatic complexity
<!--
Thanks for taking an interest in Scrapy!
If you have a question that starts with "How to...", please see the Scrapy Community page: https://scrapy.org/community/.
The GitHub issue tracker's purpose is to deal with bug reports and feature requests for the project itself.
Keep in mind that by filing an issue, you are expected to comply with Scrapy's Code of Conduct, including treating everyone with respect: https://github.com/scrapy/scrapy/blob/master/CODE_OF_CONDUCT.md
The following is a suggested template to structure your pull request, you can find more guidelines at https://doc.scrapy.org/en/latest/contributing.html#writing-patches and https://doc.scrapy.org/en/latest/contributing.html#submitting-patches
-->
## Summary
After some exploring with cyclomatic complexity tools (lizard), the function was found to have the second highest complexity.
## Motivation
Low complexity allows for higher readability, testability and maintainability.
## Solution
Refactor
## Additional context
N/A
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/utils/curl.py`
Content:
```
1 import argparse
2 import warnings
3 from shlex import split
4 from http.cookies import SimpleCookie
5 from urllib.parse import urlparse
6
7 from w3lib.http import basic_auth_header
8
9
10 class CurlParser(argparse.ArgumentParser):
11 def error(self, message):
12 error_msg = f'There was an error parsing the curl command: {message}'
13 raise ValueError(error_msg)
14
15
16 curl_parser = CurlParser()
17 curl_parser.add_argument('url')
18 curl_parser.add_argument('-H', '--header', dest='headers', action='append')
19 curl_parser.add_argument('-X', '--request', dest='method')
20 curl_parser.add_argument('-d', '--data', '--data-raw', dest='data')
21 curl_parser.add_argument('-u', '--user', dest='auth')
22
23
24 safe_to_ignore_arguments = [
25 ['--compressed'],
26 # `--compressed` argument is not safe to ignore, but it's included here
27 # because the `HttpCompressionMiddleware` is enabled by default
28 ['-s', '--silent'],
29 ['-v', '--verbose'],
30 ['-#', '--progress-bar']
31 ]
32
33 for argument in safe_to_ignore_arguments:
34 curl_parser.add_argument(*argument, action='store_true')
35
36
37 def curl_to_request_kwargs(curl_command, ignore_unknown_options=True):
38 """Convert a cURL command syntax to Request kwargs.
39
40 :param str curl_command: string containing the curl command
41 :param bool ignore_unknown_options: If true, only a warning is emitted when
42 cURL options are unknown. Otherwise
43 raises an error. (default: True)
44 :return: dictionary of Request kwargs
45 """
46
47 curl_args = split(curl_command)
48
49 if curl_args[0] != 'curl':
50 raise ValueError('A curl command must start with "curl"')
51
52 parsed_args, argv = curl_parser.parse_known_args(curl_args[1:])
53
54 if argv:
55 msg = f'Unrecognized options: {", ".join(argv)}'
56 if ignore_unknown_options:
57 warnings.warn(msg)
58 else:
59 raise ValueError(msg)
60
61 url = parsed_args.url
62
63 # curl automatically prepends 'http' if the scheme is missing, but Request
64 # needs the scheme to work
65 parsed_url = urlparse(url)
66 if not parsed_url.scheme:
67 url = 'http://' + url
68
69 method = parsed_args.method or 'GET'
70
71 result = {'method': method.upper(), 'url': url}
72
73 headers = []
74 cookies = {}
75 for header in parsed_args.headers or ():
76 name, val = header.split(':', 1)
77 name = name.strip()
78 val = val.strip()
79 if name.title() == 'Cookie':
80 for name, morsel in SimpleCookie(val).items():
81 cookies[name] = morsel.value
82 else:
83 headers.append((name, val))
84
85 if parsed_args.auth:
86 user, password = parsed_args.auth.split(':', 1)
87 headers.append(('Authorization', basic_auth_header(user, password)))
88
89 if headers:
90 result['headers'] = headers
91 if cookies:
92 result['cookies'] = cookies
93 if parsed_args.data:
94 result['body'] = parsed_args.data
95 if not parsed_args.method:
96 # if the "data" is specified but the "method" is not specified,
97 # the default method is 'POST'
98 result['method'] = 'POST'
99
100 return result
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/scrapy/utils/curl.py b/scrapy/utils/curl.py
--- a/scrapy/utils/curl.py
+++ b/scrapy/utils/curl.py
@@ -34,6 +34,26 @@
curl_parser.add_argument(*argument, action='store_true')
+def _parse_headers_and_cookies(parsed_args):
+ headers = []
+ cookies = {}
+ for header in parsed_args.headers or ():
+ name, val = header.split(':', 1)
+ name = name.strip()
+ val = val.strip()
+ if name.title() == 'Cookie':
+ for name, morsel in SimpleCookie(val).items():
+ cookies[name] = morsel.value
+ else:
+ headers.append((name, val))
+
+ if parsed_args.auth:
+ user, password = parsed_args.auth.split(':', 1)
+ headers.append(('Authorization', basic_auth_header(user, password)))
+
+ return headers, cookies
+
+
def curl_to_request_kwargs(curl_command, ignore_unknown_options=True):
"""Convert a cURL command syntax to Request kwargs.
@@ -70,21 +90,7 @@
result = {'method': method.upper(), 'url': url}
- headers = []
- cookies = {}
- for header in parsed_args.headers or ():
- name, val = header.split(':', 1)
- name = name.strip()
- val = val.strip()
- if name.title() == 'Cookie':
- for name, morsel in SimpleCookie(val).items():
- cookies[name] = morsel.value
- else:
- headers.append((name, val))
-
- if parsed_args.auth:
- user, password = parsed_args.auth.split(':', 1)
- headers.append(('Authorization', basic_auth_header(user, password)))
+ headers, cookies = _parse_headers_and_cookies(parsed_args)
if headers:
result['headers'] = headers
|
{"golden_diff": "diff --git a/scrapy/utils/curl.py b/scrapy/utils/curl.py\n--- a/scrapy/utils/curl.py\n+++ b/scrapy/utils/curl.py\n@@ -34,6 +34,26 @@\n curl_parser.add_argument(*argument, action='store_true')\n \n \n+def _parse_headers_and_cookies(parsed_args):\n+ headers = []\n+ cookies = {}\n+ for header in parsed_args.headers or ():\n+ name, val = header.split(':', 1)\n+ name = name.strip()\n+ val = val.strip()\n+ if name.title() == 'Cookie':\n+ for name, morsel in SimpleCookie(val).items():\n+ cookies[name] = morsel.value\n+ else:\n+ headers.append((name, val))\n+\n+ if parsed_args.auth:\n+ user, password = parsed_args.auth.split(':', 1)\n+ headers.append(('Authorization', basic_auth_header(user, password)))\n+\n+ return headers, cookies\n+\n+\n def curl_to_request_kwargs(curl_command, ignore_unknown_options=True):\n \"\"\"Convert a cURL command syntax to Request kwargs.\n \n@@ -70,21 +90,7 @@\n \n result = {'method': method.upper(), 'url': url}\n \n- headers = []\n- cookies = {}\n- for header in parsed_args.headers or ():\n- name, val = header.split(':', 1)\n- name = name.strip()\n- val = val.strip()\n- if name.title() == 'Cookie':\n- for name, morsel in SimpleCookie(val).items():\n- cookies[name] = morsel.value\n- else:\n- headers.append((name, val))\n-\n- if parsed_args.auth:\n- user, password = parsed_args.auth.split(':', 1)\n- headers.append(('Authorization', basic_auth_header(user, password)))\n+ headers, cookies = _parse_headers_and_cookies(parsed_args)\n \n if headers:\n result['headers'] = headers\n", "issue": "refactoring curl_to_request_kwargs to reduce cyclomatic complexity\n<!--\r\n\r\nThanks for taking an interest in Scrapy!\r\n\r\nIf you have a question that starts with \"How to...\", please see the Scrapy Community page: https://scrapy.org/community/.\r\nThe GitHub issue tracker's purpose is to deal with bug reports and feature requests for the project itself.\r\n\r\nKeep in mind that by filing an issue, you are expected to comply with Scrapy's Code of Conduct, including treating everyone with respect: https://github.com/scrapy/scrapy/blob/master/CODE_OF_CONDUCT.md\r\n\r\nThe following is a suggested template to structure your pull request, you can find more guidelines at https://doc.scrapy.org/en/latest/contributing.html#writing-patches and https://doc.scrapy.org/en/latest/contributing.html#submitting-patches\r\n\r\n-->\r\n\r\n## Summary\r\n\r\nAfter some exploring with cyclomatic complexity tools (lizard), the function was found to have the second highest complexity. \r\n\r\n## Motivation\r\n\r\nLow complexity allows for higher readability, testability and maintainability. 
\r\n\r\n## Solution\r\n\r\nRefactor\r\n\r\n## Additional context\r\n\r\nN/A\r\n\n", "before_files": [{"content": "import argparse\nimport warnings\nfrom shlex import split\nfrom http.cookies import SimpleCookie\nfrom urllib.parse import urlparse\n\nfrom w3lib.http import basic_auth_header\n\n\nclass CurlParser(argparse.ArgumentParser):\n def error(self, message):\n error_msg = f'There was an error parsing the curl command: {message}'\n raise ValueError(error_msg)\n\n\ncurl_parser = CurlParser()\ncurl_parser.add_argument('url')\ncurl_parser.add_argument('-H', '--header', dest='headers', action='append')\ncurl_parser.add_argument('-X', '--request', dest='method')\ncurl_parser.add_argument('-d', '--data', '--data-raw', dest='data')\ncurl_parser.add_argument('-u', '--user', dest='auth')\n\n\nsafe_to_ignore_arguments = [\n ['--compressed'],\n # `--compressed` argument is not safe to ignore, but it's included here\n # because the `HttpCompressionMiddleware` is enabled by default\n ['-s', '--silent'],\n ['-v', '--verbose'],\n ['-#', '--progress-bar']\n]\n\nfor argument in safe_to_ignore_arguments:\n curl_parser.add_argument(*argument, action='store_true')\n\n\ndef curl_to_request_kwargs(curl_command, ignore_unknown_options=True):\n \"\"\"Convert a cURL command syntax to Request kwargs.\n\n :param str curl_command: string containing the curl command\n :param bool ignore_unknown_options: If true, only a warning is emitted when\n cURL options are unknown. Otherwise\n raises an error. (default: True)\n :return: dictionary of Request kwargs\n \"\"\"\n\n curl_args = split(curl_command)\n\n if curl_args[0] != 'curl':\n raise ValueError('A curl command must start with \"curl\"')\n\n parsed_args, argv = curl_parser.parse_known_args(curl_args[1:])\n\n if argv:\n msg = f'Unrecognized options: {\", \".join(argv)}'\n if ignore_unknown_options:\n warnings.warn(msg)\n else:\n raise ValueError(msg)\n\n url = parsed_args.url\n\n # curl automatically prepends 'http' if the scheme is missing, but Request\n # needs the scheme to work\n parsed_url = urlparse(url)\n if not parsed_url.scheme:\n url = 'http://' + url\n\n method = parsed_args.method or 'GET'\n\n result = {'method': method.upper(), 'url': url}\n\n headers = []\n cookies = {}\n for header in parsed_args.headers or ():\n name, val = header.split(':', 1)\n name = name.strip()\n val = val.strip()\n if name.title() == 'Cookie':\n for name, morsel in SimpleCookie(val).items():\n cookies[name] = morsel.value\n else:\n headers.append((name, val))\n\n if parsed_args.auth:\n user, password = parsed_args.auth.split(':', 1)\n headers.append(('Authorization', basic_auth_header(user, password)))\n\n if headers:\n result['headers'] = headers\n if cookies:\n result['cookies'] = cookies\n if parsed_args.data:\n result['body'] = parsed_args.data\n if not parsed_args.method:\n # if the \"data\" is specified but the \"method\" is not specified,\n # the default method is 'POST'\n result['method'] = 'POST'\n\n return result\n", "path": "scrapy/utils/curl.py"}], "after_files": [{"content": "import argparse\nimport warnings\nfrom shlex import split\nfrom http.cookies import SimpleCookie\nfrom urllib.parse import urlparse\n\nfrom w3lib.http import basic_auth_header\n\n\nclass CurlParser(argparse.ArgumentParser):\n def error(self, message):\n error_msg = f'There was an error parsing the curl command: {message}'\n raise ValueError(error_msg)\n\n\ncurl_parser = CurlParser()\ncurl_parser.add_argument('url')\ncurl_parser.add_argument('-H', '--header', dest='headers', 
action='append')\ncurl_parser.add_argument('-X', '--request', dest='method')\ncurl_parser.add_argument('-d', '--data', '--data-raw', dest='data')\ncurl_parser.add_argument('-u', '--user', dest='auth')\n\n\nsafe_to_ignore_arguments = [\n ['--compressed'],\n # `--compressed` argument is not safe to ignore, but it's included here\n # because the `HttpCompressionMiddleware` is enabled by default\n ['-s', '--silent'],\n ['-v', '--verbose'],\n ['-#', '--progress-bar']\n]\n\nfor argument in safe_to_ignore_arguments:\n curl_parser.add_argument(*argument, action='store_true')\n\n\ndef _parse_headers_and_cookies(parsed_args):\n headers = []\n cookies = {}\n for header in parsed_args.headers or ():\n name, val = header.split(':', 1)\n name = name.strip()\n val = val.strip()\n if name.title() == 'Cookie':\n for name, morsel in SimpleCookie(val).items():\n cookies[name] = morsel.value\n else:\n headers.append((name, val))\n\n if parsed_args.auth:\n user, password = parsed_args.auth.split(':', 1)\n headers.append(('Authorization', basic_auth_header(user, password)))\n\n return headers, cookies\n\n\ndef curl_to_request_kwargs(curl_command, ignore_unknown_options=True):\n \"\"\"Convert a cURL command syntax to Request kwargs.\n\n :param str curl_command: string containing the curl command\n :param bool ignore_unknown_options: If true, only a warning is emitted when\n cURL options are unknown. Otherwise\n raises an error. (default: True)\n :return: dictionary of Request kwargs\n \"\"\"\n\n curl_args = split(curl_command)\n\n if curl_args[0] != 'curl':\n raise ValueError('A curl command must start with \"curl\"')\n\n parsed_args, argv = curl_parser.parse_known_args(curl_args[1:])\n\n if argv:\n msg = f'Unrecognized options: {\", \".join(argv)}'\n if ignore_unknown_options:\n warnings.warn(msg)\n else:\n raise ValueError(msg)\n\n url = parsed_args.url\n\n # curl automatically prepends 'http' if the scheme is missing, but Request\n # needs the scheme to work\n parsed_url = urlparse(url)\n if not parsed_url.scheme:\n url = 'http://' + url\n\n method = parsed_args.method or 'GET'\n\n result = {'method': method.upper(), 'url': url}\n\n headers, cookies = _parse_headers_and_cookies(parsed_args)\n\n if headers:\n result['headers'] = headers\n if cookies:\n result['cookies'] = cookies\n if parsed_args.data:\n result['body'] = parsed_args.data\n if not parsed_args.method:\n # if the \"data\" is specified but the \"method\" is not specified,\n # the default method is 'POST'\n result['method'] = 'POST'\n\n return result\n", "path": "scrapy/utils/curl.py"}]}
| 1,406 | 429 |
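A quick standard-library check of the cookie parsing that the extracted `_parse_headers_and_cookies` helper in the entry above relies on; the header value below is made up for illustration, as curl would pass it via `-H "Cookie: ..."`.

```python
from http.cookies import SimpleCookie

# One Cookie header carrying two cookies.
cookie_header = "session=abc123; theme=dark"
cookies = {name: morsel.value for name, morsel in SimpleCookie(cookie_header).items()}
assert cookies == {"session": "abc123", "theme": "dark"}
```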
gh_patches_debug_802
|
rasdani/github-patches
|
git_diff
|
pyca__cryptography-1599
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update year in copyright notice for vectors
Refs #1597
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `vectors/cryptography_vectors/__about__.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 from __future__ import absolute_import, division, print_function
6
7 __all__ = [
8 "__title__", "__summary__", "__uri__", "__version__", "__author__",
9 "__email__", "__license__", "__copyright__",
10 ]
11
12 __title__ = "cryptography_vectors"
13 __summary__ = "Test vectors for the cryptography package."
14
15 __uri__ = "https://github.com/pyca/cryptography"
16
17 __version__ = "0.8.dev1"
18
19 __author__ = "The cryptography developers"
20 __email__ = "[email protected]"
21
22 __license__ = "BSD or Apache License, Version 2.0"
23 __copyright__ = "Copyright 2013-2014 %s" % __author__
24
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/vectors/cryptography_vectors/__about__.py b/vectors/cryptography_vectors/__about__.py
--- a/vectors/cryptography_vectors/__about__.py
+++ b/vectors/cryptography_vectors/__about__.py
@@ -20,4 +20,4 @@
__email__ = "[email protected]"
__license__ = "BSD or Apache License, Version 2.0"
-__copyright__ = "Copyright 2013-2014 %s" % __author__
+__copyright__ = "Copyright 2013-2015 %s" % __author__
|
{"golden_diff": "diff --git a/vectors/cryptography_vectors/__about__.py b/vectors/cryptography_vectors/__about__.py\n--- a/vectors/cryptography_vectors/__about__.py\n+++ b/vectors/cryptography_vectors/__about__.py\n@@ -20,4 +20,4 @@\n __email__ = \"[email protected]\"\n \n __license__ = \"BSD or Apache License, Version 2.0\"\n-__copyright__ = \"Copyright 2013-2014 %s\" % __author__\n+__copyright__ = \"Copyright 2013-2015 %s\" % __author__\n", "issue": "Update year in copyright notice for vectors\nRefs #1597 \n\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\n__all__ = [\n \"__title__\", \"__summary__\", \"__uri__\", \"__version__\", \"__author__\",\n \"__email__\", \"__license__\", \"__copyright__\",\n]\n\n__title__ = \"cryptography_vectors\"\n__summary__ = \"Test vectors for the cryptography package.\"\n\n__uri__ = \"https://github.com/pyca/cryptography\"\n\n__version__ = \"0.8.dev1\"\n\n__author__ = \"The cryptography developers\"\n__email__ = \"[email protected]\"\n\n__license__ = \"BSD or Apache License, Version 2.0\"\n__copyright__ = \"Copyright 2013-2014 %s\" % __author__\n", "path": "vectors/cryptography_vectors/__about__.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\n__all__ = [\n \"__title__\", \"__summary__\", \"__uri__\", \"__version__\", \"__author__\",\n \"__email__\", \"__license__\", \"__copyright__\",\n]\n\n__title__ = \"cryptography_vectors\"\n__summary__ = \"Test vectors for the cryptography package.\"\n\n__uri__ = \"https://github.com/pyca/cryptography\"\n\n__version__ = \"0.8.dev1\"\n\n__author__ = \"The cryptography developers\"\n__email__ = \"[email protected]\"\n\n__license__ = \"BSD or Apache License, Version 2.0\"\n__copyright__ = \"Copyright 2013-2015 %s\" % __author__\n", "path": "vectors/cryptography_vectors/__about__.py"}]}
| 526 | 137 |
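One way projects sidestep this recurring year-bump chore is to derive the closing year at import time instead of hard-coding it; this is only an illustration of that alternative, not what the patch above does.

```python
import datetime

__author__ = "The cryptography developers"
# Derive the closing year dynamically so the notice never goes stale.
__copyright__ = "Copyright 2013-%d %s" % (datetime.date.today().year, __author__)
```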
gh_patches_debug_1453
|
rasdani/github-patches
|
git_diff
|
rlworkgroup__garage-971
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pytest flag --strict-markers requires version 4.5.0
pytest flag `--strict-markers` in https://github.com/rlworkgroup/garage/blob/master/setup.cfg#L79 requires version >= 4.5.0.
See https://docs.pytest.org/en/latest/changelog.html#pytest-4-5-0-2019-05-11
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """setuptools based setup module."""
2 from setuptools import find_packages
3 from setuptools import setup
4
5 TF_VERSION = '<1.16,>=1.15.0'
6 GYM_VERSION = '==0.12.4'
7
8 # Required dependencies
9 REQUIRED = [
10 # Please keep alphabetized
11 'akro==0.0.6',
12 'cached_property',
13 'click',
14 'cloudpickle',
15 'cma==2.7.0',
16 'dowel==0.0.2',
17 'gym[atari,box2d,classic_control]' + GYM_VERSION,
18 'joblib<0.13,>=0.12',
19 'matplotlib',
20 'numpy>=1.14.5',
21 'psutil',
22 # Pyglet 1.4.0 introduces some api change which breaks some
23 # gym environments
24 # See: https://github.com/openai/gym/issues/1588
25 'pyglet<1.4.0,>=1.3.0',
26 'pyprind',
27 'python-dateutil',
28 'torch==1.3.0',
29 'ray',
30 'scikit-image',
31 'scipy',
32 'tensorflow' + TF_VERSION,
33 'tensorflow-probability',
34 'torchvision==0.4.1'
35 ]
36
37 # Dependencies for optional features
38 EXTRAS = {}
39
40 EXTRAS['mujoco'] = [
41 'mujoco-py<2.1,>=2.0',
42 'gym[all]' + GYM_VERSION,
43 ]
44
45 EXTRAS['dm_control'] = [
46 # dm_control throws an error during install about not being able to
47 # find a build dependency (absl-py). Later pip executes the `install`
48 # command again and the install succeeds because absl-py has been
49 # installed. This is stupid, but harmless.
50 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # pylint: disable=line-too-long; # noqa: E501
51 ]
52
53 EXTRAS['all'] = list(set(sum(EXTRAS.values(), [])))
54
55 # dependencies for using gpu, not included in 'all'
56 EXTRAS['gpu'] = ['tensorflow-gpu' + TF_VERSION]
57
58 # Development dependencies (*not* included in 'all')
59 EXTRAS['dev'] = [
60 # Please keep alphabetized
61 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # pylint: disable=line-too-long; # noqa: E501
62 'flake8',
63 'flake8-docstrings>=1.5.0',
64 'flake8-import-order',
65 'gtimer',
66 'pandas',
67 'pep8-naming==0.7.0',
68 'pre-commit',
69 'pycodestyle>=2.5.0',
70 'pydocstyle>=4.0.0',
71 'pylint>=2.4.3',
72 'pytest>=3.6', # Required for pytest-cov on Python 3.6
73 'pytest-cov',
74 'pytest-xdist',
75 'recommonmark',
76 'rlkit @ git+https://github.com/vitchyr/rlkit/@1d469a509b797ca04a39b8734c1816ca7d108fc8', # pylint: disable=line-too-long; # noqa: E501
77 'seaborn',
78 'sphinx',
79 'sphinx_rtd_theme',
80 'yapf==0.28.0',
81 ]
82
83 with open('README.md') as f:
84 README = f.read()
85
86 # Get the package version dynamically
87 with open('VERSION') as v:
88 VERSION = v.read().strip()
89
90 setup(
91 name='garage',
92 version=VERSION,
93 author='Reinforcement Learning Working Group',
94 description='A toolkit for reproducible reinforcement learning research',
95 url='https://github.com/rlworkgroup/garage',
96 packages=find_packages(where='src'),
97 package_dir={'': 'src'},
98 scripts=['scripts/garage'],
99 python_requires='>=3.5',
100 install_requires=REQUIRED,
101 extras_require=EXTRAS,
102 license='MIT',
103 long_description=README,
104 long_description_content_type='text/markdown',
105 classifiers=[
106 'Development Status :: 4 - Beta',
107 'Intended Audience :: Developers',
108 'Intended Audience :: Education',
109 'Intended Audience :: Science/Research',
110 'License :: OSI Approved :: MIT License',
111 'Programming Language :: Python :: 3.5',
112 'Programming Language :: Python :: 3.6',
113 'Programming Language :: Python :: 3.7',
114 'Programming Language :: Python :: 3 :: Only',
115 'Topic :: Scientific/Engineering :: Artificial Intelligence',
116 'Topic :: Scientific/Engineering :: Mathematics',
117 'Topic :: Software Development :: Libraries',
118 ],
119 )
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -69,7 +69,7 @@
'pycodestyle>=2.5.0',
'pydocstyle>=4.0.0',
'pylint>=2.4.3',
- 'pytest>=3.6', # Required for pytest-cov on Python 3.6
+ 'pytest>=4.5.0', # Required for strict-markers
'pytest-cov',
'pytest-xdist',
'recommonmark',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -69,7 +69,7 @@\n 'pycodestyle>=2.5.0',\n 'pydocstyle>=4.0.0',\n 'pylint>=2.4.3',\n- 'pytest>=3.6', # Required for pytest-cov on Python 3.6\n+ 'pytest>=4.5.0', # Required for strict-markers\n 'pytest-cov',\n 'pytest-xdist',\n 'recommonmark',\n", "issue": "pytest flag --strict-markers requires version 4.5.0\npytest flag `--strict-markers` in https://github.com/rlworkgroup/garage/blob/master/setup.cfg#L79 requires version >= 4.5.0. \r\n\r\nSee https://docs.pytest.org/en/latest/changelog.html#pytest-4-5-0-2019-05-11\n", "before_files": [{"content": "\"\"\"setuptools based setup module.\"\"\"\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nTF_VERSION = '<1.16,>=1.15.0'\nGYM_VERSION = '==0.12.4'\n\n# Required dependencies\nREQUIRED = [\n # Please keep alphabetized\n 'akro==0.0.6',\n 'cached_property',\n 'click',\n 'cloudpickle',\n 'cma==2.7.0',\n 'dowel==0.0.2',\n 'gym[atari,box2d,classic_control]' + GYM_VERSION,\n 'joblib<0.13,>=0.12',\n 'matplotlib',\n 'numpy>=1.14.5',\n 'psutil',\n # Pyglet 1.4.0 introduces some api change which breaks some\n # gym environments\n # See: https://github.com/openai/gym/issues/1588\n 'pyglet<1.4.0,>=1.3.0',\n 'pyprind',\n 'python-dateutil',\n 'torch==1.3.0',\n 'ray',\n 'scikit-image',\n 'scipy',\n 'tensorflow' + TF_VERSION,\n 'tensorflow-probability',\n 'torchvision==0.4.1'\n]\n\n# Dependencies for optional features\nEXTRAS = {}\n\nEXTRAS['mujoco'] = [\n 'mujoco-py<2.1,>=2.0',\n 'gym[all]' + GYM_VERSION,\n]\n\nEXTRAS['dm_control'] = [\n # dm_control throws an error during install about not being able to\n # find a build dependency (absl-py). Later pip executes the `install`\n # command again and the install succeeds because absl-py has been\n # installed. 
This is stupid, but harmless.\n 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # pylint: disable=line-too-long; # noqa: E501\n]\n\nEXTRAS['all'] = list(set(sum(EXTRAS.values(), [])))\n\n# dependencies for using gpu, not included in 'all'\nEXTRAS['gpu'] = ['tensorflow-gpu' + TF_VERSION]\n\n# Development dependencies (*not* included in 'all')\nEXTRAS['dev'] = [\n # Please keep alphabetized\n 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # pylint: disable=line-too-long; # noqa: E501\n 'flake8',\n 'flake8-docstrings>=1.5.0',\n 'flake8-import-order',\n 'gtimer',\n 'pandas',\n 'pep8-naming==0.7.0',\n 'pre-commit',\n 'pycodestyle>=2.5.0',\n 'pydocstyle>=4.0.0',\n 'pylint>=2.4.3',\n 'pytest>=3.6', # Required for pytest-cov on Python 3.6\n 'pytest-cov',\n 'pytest-xdist',\n 'recommonmark',\n 'rlkit @ git+https://github.com/vitchyr/rlkit/@1d469a509b797ca04a39b8734c1816ca7d108fc8', # pylint: disable=line-too-long; # noqa: E501\n 'seaborn',\n 'sphinx',\n 'sphinx_rtd_theme',\n 'yapf==0.28.0',\n]\n\nwith open('README.md') as f:\n README = f.read()\n\n# Get the package version dynamically\nwith open('VERSION') as v:\n VERSION = v.read().strip()\n\nsetup(\n name='garage',\n version=VERSION,\n author='Reinforcement Learning Working Group',\n description='A toolkit for reproducible reinforcement learning research',\n url='https://github.com/rlworkgroup/garage',\n packages=find_packages(where='src'),\n package_dir={'': 'src'},\n scripts=['scripts/garage'],\n python_requires='>=3.5',\n install_requires=REQUIRED,\n extras_require=EXTRAS,\n license='MIT',\n long_description=README,\n long_description_content_type='text/markdown',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Scientific/Engineering :: Mathematics',\n 'Topic :: Software Development :: Libraries',\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"setuptools based setup module.\"\"\"\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nTF_VERSION = '<1.16,>=1.15.0'\nGYM_VERSION = '==0.12.4'\n\n# Required dependencies\nREQUIRED = [\n # Please keep alphabetized\n 'akro==0.0.6',\n 'cached_property',\n 'click',\n 'cloudpickle',\n 'cma==2.7.0',\n 'dowel==0.0.2',\n 'gym[atari,box2d,classic_control]' + GYM_VERSION,\n 'joblib<0.13,>=0.12',\n 'matplotlib',\n 'numpy>=1.14.5',\n 'psutil',\n # Pyglet 1.4.0 introduces some api change which breaks some\n # gym environments\n # See: https://github.com/openai/gym/issues/1588\n 'pyglet<1.4.0,>=1.3.0',\n 'pyprind',\n 'python-dateutil',\n 'torch==1.3.0',\n 'ray',\n 'scikit-image',\n 'scipy',\n 'tensorflow' + TF_VERSION,\n 'tensorflow-probability',\n 'torchvision==0.4.1'\n]\n\n# Dependencies for optional features\nEXTRAS = {}\n\nEXTRAS['mujoco'] = [\n 'mujoco-py<2.1,>=2.0',\n 'gym[all]' + GYM_VERSION,\n]\n\nEXTRAS['dm_control'] = [\n # dm_control throws an error during install about not being able to\n # find a build dependency (absl-py). 
Later pip executes the `install`\n # command again and the install succeeds because absl-py has been\n # installed. This is stupid, but harmless.\n 'dm_control @ https://api.github.com/repos/deepmind/dm_control/tarball/7a36377879c57777e5d5b4da5aae2cd2a29b607a', # pylint: disable=line-too-long; # noqa: E501\n]\n\nEXTRAS['all'] = list(set(sum(EXTRAS.values(), [])))\n\n# dependencies for using gpu, not included in 'all'\nEXTRAS['gpu'] = ['tensorflow-gpu' + TF_VERSION]\n\n# Development dependencies (*not* included in 'all')\nEXTRAS['dev'] = [\n # Please keep alphabetized\n 'baselines @ https://api.github.com/repos/openai/baselines/tarball/f2729693253c0ef4d4086231d36e0a4307ec1cb3', # pylint: disable=line-too-long; # noqa: E501\n 'flake8',\n 'flake8-docstrings>=1.5.0',\n 'flake8-import-order',\n 'gtimer',\n 'pandas',\n 'pep8-naming==0.7.0',\n 'pre-commit',\n 'pycodestyle>=2.5.0',\n 'pydocstyle>=4.0.0',\n 'pylint>=2.4.3',\n 'pytest>=4.5.0', # Required for strict-markers\n 'pytest-cov',\n 'pytest-xdist',\n 'recommonmark',\n 'rlkit @ git+https://github.com/vitchyr/rlkit/@1d469a509b797ca04a39b8734c1816ca7d108fc8', # pylint: disable=line-too-long; # noqa: E501\n 'seaborn',\n 'sphinx',\n 'sphinx_rtd_theme',\n 'yapf==0.28.0',\n]\n\nwith open('README.md') as f:\n README = f.read()\n\n# Get the package version dynamically\nwith open('VERSION') as v:\n VERSION = v.read().strip()\n\nsetup(\n name='garage',\n version=VERSION,\n author='Reinforcement Learning Working Group',\n description='A toolkit for reproducible reinforcement learning research',\n url='https://github.com/rlworkgroup/garage',\n packages=find_packages(where='src'),\n package_dir={'': 'src'},\n scripts=['scripts/garage'],\n python_requires='>=3.5',\n install_requires=REQUIRED,\n extras_require=EXTRAS,\n license='MIT',\n long_description=README,\n long_description_content_type='text/markdown',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: MIT License',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Scientific/Engineering :: Mathematics',\n 'Topic :: Software Development :: Libraries',\n ],\n)\n", "path": "setup.py"}]}
| 1,758 | 127 |
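A small runtime guard mirroring the requirement bump in the entry above: `--strict-markers` only exists from pytest 4.5.0, so the installed version can be checked before relying on the flag. The assertion message is illustrative.

```python
import pytest

# --strict-markers was added in pytest 4.5.0; fail fast on anything older.
major, minor = (int(part) for part in pytest.__version__.split(".")[:2])
assert (major, minor) >= (4, 5), "pytest >= 4.5.0 is required for --strict-markers"
```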
gh_patches_debug_20214
|
rasdani/github-patches
|
git_diff
|
getsentry__sentry-python-921
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Not working with older boto version
Hello, we use it in Django==2.1.7 app and this row breaks the app.
https://github.com/getsentry/sentry-python/blob/cc08a6bed116e09db41c712c20ab63eb0a839e41/sentry_sdk/integrations/boto3.py#L36
For versions
boto3==1.7.45
botocore==1.10.84
this throws
`
AttributeError: 'str' object has no attribute 'hyphenize'`
I'm not sure the base of the integrations but I thought they must be enabled in settings, but this part of Boto3Integration is triggered even if we have not enabled it in django settings.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sentry_sdk/integrations/boto3.py`
Content:
```
1 from __future__ import absolute_import
2
3 from sentry_sdk import Hub
4 from sentry_sdk.integrations import Integration, DidNotEnable
5 from sentry_sdk.tracing import Span
6
7 from sentry_sdk._functools import partial
8 from sentry_sdk._types import MYPY
9
10 if MYPY:
11 from typing import Any
12 from typing import Dict
13 from typing import Optional
14 from typing import Type
15
16 try:
17 from botocore.client import BaseClient # type: ignore
18 from botocore.response import StreamingBody # type: ignore
19 from botocore.awsrequest import AWSRequest # type: ignore
20 except ImportError:
21 raise DidNotEnable("botocore is not installed")
22
23
24 class Boto3Integration(Integration):
25 identifier = "boto3"
26
27 @staticmethod
28 def setup_once():
29 # type: () -> None
30 orig_init = BaseClient.__init__
31
32 def sentry_patched_init(self, *args, **kwargs):
33 # type: (Type[BaseClient], *Any, **Any) -> None
34 orig_init(self, *args, **kwargs)
35 meta = self.meta
36 service_id = meta.service_model.service_id.hyphenize()
37 meta.events.register(
38 "request-created",
39 partial(_sentry_request_created, service_id=service_id),
40 )
41 meta.events.register("after-call", _sentry_after_call)
42 meta.events.register("after-call-error", _sentry_after_call_error)
43
44 BaseClient.__init__ = sentry_patched_init
45
46
47 def _sentry_request_created(service_id, request, operation_name, **kwargs):
48 # type: (str, AWSRequest, str, **Any) -> None
49 hub = Hub.current
50 if hub.get_integration(Boto3Integration) is None:
51 return
52
53 description = "aws.%s.%s" % (service_id, operation_name)
54 span = hub.start_span(
55 hub=hub,
56 op="aws.request",
57 description=description,
58 )
59 span.set_tag("aws.service_id", service_id)
60 span.set_tag("aws.operation_name", operation_name)
61 span.set_data("aws.request.url", request.url)
62
63 # We do it in order for subsequent http calls/retries be
64 # attached to this span.
65 span.__enter__()
66
67 # request.context is an open-ended data-structure
68 # where we can add anything useful in request life cycle.
69 request.context["_sentrysdk_span"] = span
70
71
72 def _sentry_after_call(context, parsed, **kwargs):
73 # type: (Dict[str, Any], Dict[str, Any], **Any) -> None
74 span = context.pop("_sentrysdk_span", None) # type: Optional[Span]
75
76 # Span could be absent if the integration is disabled.
77 if span is None:
78 return
79 span.__exit__(None, None, None)
80
81 body = parsed.get("Body")
82 if not isinstance(body, StreamingBody):
83 return
84
85 streaming_span = span.start_child(
86 op="aws.request.stream",
87 description=span.description,
88 )
89
90 orig_read = body.read
91 orig_close = body.close
92
93 def sentry_streaming_body_read(*args, **kwargs):
94 # type: (*Any, **Any) -> bytes
95 try:
96 ret = orig_read(*args, **kwargs)
97 if not ret:
98 streaming_span.finish()
99 return ret
100 except Exception:
101 streaming_span.finish()
102 raise
103
104 body.read = sentry_streaming_body_read
105
106 def sentry_streaming_body_close(*args, **kwargs):
107 # type: (*Any, **Any) -> None
108 streaming_span.finish()
109 orig_close(*args, **kwargs)
110
111 body.close = sentry_streaming_body_close
112
113
114 def _sentry_after_call_error(context, exception, **kwargs):
115 # type: (Dict[str, Any], Type[BaseException], **Any) -> None
116 span = context.pop("_sentrysdk_span", None) # type: Optional[Span]
117
118 # Span could be absent if the integration is disabled.
119 if span is None:
120 return
121 span.__exit__(type(exception), exception, None)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sentry_sdk/integrations/boto3.py b/sentry_sdk/integrations/boto3.py
--- a/sentry_sdk/integrations/boto3.py
+++ b/sentry_sdk/integrations/boto3.py
@@ -14,6 +14,7 @@
from typing import Type
try:
+ from botocore import __version__ as BOTOCORE_VERSION # type: ignore
from botocore.client import BaseClient # type: ignore
from botocore.response import StreamingBody # type: ignore
from botocore.awsrequest import AWSRequest # type: ignore
@@ -27,6 +28,14 @@
@staticmethod
def setup_once():
# type: () -> None
+ try:
+ version = tuple(map(int, BOTOCORE_VERSION.split(".")[:3]))
+ except (ValueError, TypeError):
+ raise DidNotEnable(
+ "Unparsable botocore version: {}".format(BOTOCORE_VERSION)
+ )
+ if version < (1, 12):
+ raise DidNotEnable("Botocore 1.12 or newer is required.")
orig_init = BaseClient.__init__
def sentry_patched_init(self, *args, **kwargs):
|
{"golden_diff": "diff --git a/sentry_sdk/integrations/boto3.py b/sentry_sdk/integrations/boto3.py\n--- a/sentry_sdk/integrations/boto3.py\n+++ b/sentry_sdk/integrations/boto3.py\n@@ -14,6 +14,7 @@\n from typing import Type\n \n try:\n+ from botocore import __version__ as BOTOCORE_VERSION # type: ignore\n from botocore.client import BaseClient # type: ignore\n from botocore.response import StreamingBody # type: ignore\n from botocore.awsrequest import AWSRequest # type: ignore\n@@ -27,6 +28,14 @@\n @staticmethod\n def setup_once():\n # type: () -> None\n+ try:\n+ version = tuple(map(int, BOTOCORE_VERSION.split(\".\")[:3]))\n+ except (ValueError, TypeError):\n+ raise DidNotEnable(\n+ \"Unparsable botocore version: {}\".format(BOTOCORE_VERSION)\n+ )\n+ if version < (1, 12):\n+ raise DidNotEnable(\"Botocore 1.12 or newer is required.\")\n orig_init = BaseClient.__init__\n \n def sentry_patched_init(self, *args, **kwargs):\n", "issue": "Not working with older boto version\nHello, we use it in Django==2.1.7 app and this row breaks the app.\r\n\r\nhttps://github.com/getsentry/sentry-python/blob/cc08a6bed116e09db41c712c20ab63eb0a839e41/sentry_sdk/integrations/boto3.py#L36\r\n\r\nFor versions\r\nboto3==1.7.45\r\nbotocore==1.10.84\r\n\r\nthis throws\r\n`\r\nAttributeError: 'str' object has no attribute 'hyphenize'`\r\n\r\nI'm not sure the base of the integrations but I thought they must be enabled in settings, but this part of Boto3Integration is triggered even if we have not enabled it in django settings.\r\n\n", "before_files": [{"content": "from __future__ import absolute_import\n\nfrom sentry_sdk import Hub\nfrom sentry_sdk.integrations import Integration, DidNotEnable\nfrom sentry_sdk.tracing import Span\n\nfrom sentry_sdk._functools import partial\nfrom sentry_sdk._types import MYPY\n\nif MYPY:\n from typing import Any\n from typing import Dict\n from typing import Optional\n from typing import Type\n\ntry:\n from botocore.client import BaseClient # type: ignore\n from botocore.response import StreamingBody # type: ignore\n from botocore.awsrequest import AWSRequest # type: ignore\nexcept ImportError:\n raise DidNotEnable(\"botocore is not installed\")\n\n\nclass Boto3Integration(Integration):\n identifier = \"boto3\"\n\n @staticmethod\n def setup_once():\n # type: () -> None\n orig_init = BaseClient.__init__\n\n def sentry_patched_init(self, *args, **kwargs):\n # type: (Type[BaseClient], *Any, **Any) -> None\n orig_init(self, *args, **kwargs)\n meta = self.meta\n service_id = meta.service_model.service_id.hyphenize()\n meta.events.register(\n \"request-created\",\n partial(_sentry_request_created, service_id=service_id),\n )\n meta.events.register(\"after-call\", _sentry_after_call)\n meta.events.register(\"after-call-error\", _sentry_after_call_error)\n\n BaseClient.__init__ = sentry_patched_init\n\n\ndef _sentry_request_created(service_id, request, operation_name, **kwargs):\n # type: (str, AWSRequest, str, **Any) -> None\n hub = Hub.current\n if hub.get_integration(Boto3Integration) is None:\n return\n\n description = \"aws.%s.%s\" % (service_id, operation_name)\n span = hub.start_span(\n hub=hub,\n op=\"aws.request\",\n description=description,\n )\n span.set_tag(\"aws.service_id\", service_id)\n span.set_tag(\"aws.operation_name\", operation_name)\n span.set_data(\"aws.request.url\", request.url)\n\n # We do it in order for subsequent http calls/retries be\n # attached to this span.\n span.__enter__()\n\n # request.context is an open-ended data-structure\n # where we can add anything 
useful in request life cycle.\n request.context[\"_sentrysdk_span\"] = span\n\n\ndef _sentry_after_call(context, parsed, **kwargs):\n # type: (Dict[str, Any], Dict[str, Any], **Any) -> None\n span = context.pop(\"_sentrysdk_span\", None) # type: Optional[Span]\n\n # Span could be absent if the integration is disabled.\n if span is None:\n return\n span.__exit__(None, None, None)\n\n body = parsed.get(\"Body\")\n if not isinstance(body, StreamingBody):\n return\n\n streaming_span = span.start_child(\n op=\"aws.request.stream\",\n description=span.description,\n )\n\n orig_read = body.read\n orig_close = body.close\n\n def sentry_streaming_body_read(*args, **kwargs):\n # type: (*Any, **Any) -> bytes\n try:\n ret = orig_read(*args, **kwargs)\n if not ret:\n streaming_span.finish()\n return ret\n except Exception:\n streaming_span.finish()\n raise\n\n body.read = sentry_streaming_body_read\n\n def sentry_streaming_body_close(*args, **kwargs):\n # type: (*Any, **Any) -> None\n streaming_span.finish()\n orig_close(*args, **kwargs)\n\n body.close = sentry_streaming_body_close\n\n\ndef _sentry_after_call_error(context, exception, **kwargs):\n # type: (Dict[str, Any], Type[BaseException], **Any) -> None\n span = context.pop(\"_sentrysdk_span\", None) # type: Optional[Span]\n\n # Span could be absent if the integration is disabled.\n if span is None:\n return\n span.__exit__(type(exception), exception, None)\n", "path": "sentry_sdk/integrations/boto3.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nfrom sentry_sdk import Hub\nfrom sentry_sdk.integrations import Integration, DidNotEnable\nfrom sentry_sdk.tracing import Span\n\nfrom sentry_sdk._functools import partial\nfrom sentry_sdk._types import MYPY\n\nif MYPY:\n from typing import Any\n from typing import Dict\n from typing import Optional\n from typing import Type\n\ntry:\n from botocore import __version__ as BOTOCORE_VERSION # type: ignore\n from botocore.client import BaseClient # type: ignore\n from botocore.response import StreamingBody # type: ignore\n from botocore.awsrequest import AWSRequest # type: ignore\nexcept ImportError:\n raise DidNotEnable(\"botocore is not installed\")\n\n\nclass Boto3Integration(Integration):\n identifier = \"boto3\"\n\n @staticmethod\n def setup_once():\n # type: () -> None\n try:\n version = tuple(map(int, BOTOCORE_VERSION.split(\".\")[:3]))\n except (ValueError, TypeError):\n raise DidNotEnable(\n \"Unparsable botocore version: {}\".format(BOTOCORE_VERSION)\n )\n if version < (1, 12):\n raise DidNotEnable(\"Botocore 1.12 or newer is required.\")\n orig_init = BaseClient.__init__\n\n def sentry_patched_init(self, *args, **kwargs):\n # type: (Type[BaseClient], *Any, **Any) -> None\n orig_init(self, *args, **kwargs)\n meta = self.meta\n service_id = meta.service_model.service_id.hyphenize()\n meta.events.register(\n \"request-created\",\n partial(_sentry_request_created, service_id=service_id),\n )\n meta.events.register(\"after-call\", _sentry_after_call)\n meta.events.register(\"after-call-error\", _sentry_after_call_error)\n\n BaseClient.__init__ = sentry_patched_init\n\n\ndef _sentry_request_created(service_id, request, operation_name, **kwargs):\n # type: (str, AWSRequest, str, **Any) -> None\n hub = Hub.current\n if hub.get_integration(Boto3Integration) is None:\n return\n\n description = \"aws.%s.%s\" % (service_id, operation_name)\n span = hub.start_span(\n hub=hub,\n op=\"aws.request\",\n description=description,\n )\n span.set_tag(\"aws.service_id\", service_id)\n 
span.set_tag(\"aws.operation_name\", operation_name)\n span.set_data(\"aws.request.url\", request.url)\n\n # We do it in order for subsequent http calls/retries be\n # attached to this span.\n span.__enter__()\n\n # request.context is an open-ended data-structure\n # where we can add anything useful in request life cycle.\n request.context[\"_sentrysdk_span\"] = span\n\n\ndef _sentry_after_call(context, parsed, **kwargs):\n # type: (Dict[str, Any], Dict[str, Any], **Any) -> None\n span = context.pop(\"_sentrysdk_span\", None) # type: Optional[Span]\n\n # Span could be absent if the integration is disabled.\n if span is None:\n return\n span.__exit__(None, None, None)\n\n body = parsed.get(\"Body\")\n if not isinstance(body, StreamingBody):\n return\n\n streaming_span = span.start_child(\n op=\"aws.request.stream\",\n description=span.description,\n )\n\n orig_read = body.read\n orig_close = body.close\n\n def sentry_streaming_body_read(*args, **kwargs):\n # type: (*Any, **Any) -> bytes\n try:\n ret = orig_read(*args, **kwargs)\n if not ret:\n streaming_span.finish()\n return ret\n except Exception:\n streaming_span.finish()\n raise\n\n body.read = sentry_streaming_body_read\n\n def sentry_streaming_body_close(*args, **kwargs):\n # type: (*Any, **Any) -> None\n streaming_span.finish()\n orig_close(*args, **kwargs)\n\n body.close = sentry_streaming_body_close\n\n\ndef _sentry_after_call_error(context, exception, **kwargs):\n # type: (Dict[str, Any], Type[BaseException], **Any) -> None\n span = context.pop(\"_sentrysdk_span\", None) # type: Optional[Span]\n\n # Span could be absent if the integration is disabled.\n if span is None:\n return\n span.__exit__(type(exception), exception, None)\n", "path": "sentry_sdk/integrations/boto3.py"}]}
| 1,618 | 285 |
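The gist of the version gate added by the patch in the entry above, reduced to plain tuple comparison; the sample version strings come from the bug report (botocore 1.10.84) and the fix's minimum (1.12).

```python
def parse_botocore_version(version_string):
    # "1.10.84" -> (1, 10, 84); only the first three numeric components matter here.
    return tuple(int(part) for part in version_string.split(".")[:3])


assert parse_botocore_version("1.10.84") < (1, 12)   # reporter's botocore: too old
assert parse_botocore_version("1.12.0") >= (1, 12)   # minimum supported by the integration
```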
gh_patches_debug_30061
|
rasdani/github-patches
|
git_diff
|
Miserlou__Zappa-1993
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Set_Cookie option sets duplicate cookies on AWS Lambda
## Context
I have an API running Python3.7 and Zappa (in a virtualenv).
I am setting 6 cookies by using the option "set_cookie" in flask. It looks something like this:
```
resp = make_response(jsonify({'success':'true', 'message': 'Successfully authenticated!'}), 200)
resp.set_cookie("1", value="1", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
resp.set_cookie("2", value="2", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
resp.set_cookie("3", value="3", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
resp.set_cookie("4", value="4", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
resp.set_cookie("5", value="5", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
resp.set_cookie("6", value="6", secure=True, samesite='Lax', domain=".example.com",max_age=3600)
return resp
```
On localhost testing Flask, this works as expected.
If I deploy the same code to AWS using Zappa, the response header will show 36 "set-cookie" headers. So the formula here is n^2. So if I add 4 cookies using the above method, it will show 16 in the request header.
The browser takes care of duplicate cookies, but the response from the API is still huge because of this issue.
Same thing happens if I use:
`resp.headers.add("set-cookie""1"="1; Domain=.example.com; Max-Age=3600; Secure; Path=/; SameSite=Lax")`
## Expected Behavior
I believe Zappa or something at AWS is at fault here. Expected behaviour is to send 6 "set-cookie" headers and not 36.
## Actual Behavior
Sets n^2 cookies as response.
## Steps to Reproduce
Deploy a Flask route using Zappa which sets the cookies. Use the code above.
## Your Environment
* Zappa version used: 0.48.2
* Operating System and Python version: Ubuntu 18.04, Python3.7
* The output of `pip freeze`: https://pastebin.com/d4QTaTuG
* Your `zappa_settings.py`: https://pastebin.com/d1GK8sbe
--- END ISSUE ---
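A minimal standalone sketch (plain Python with a hypothetical header list — not Zappa's or Flask's actual code) of the header normalisation that the patch later in this entry adopts: every `Set-Cookie` header is re-emitted once under the single lowercase name `set-cookie`, rather than under case-permutated variants.
```python
# Hypothetical WSGI-style header list; only the Set-Cookie entries are rewritten.
headers = [
    ("Content-Type", "application/json"),
    ("Set-Cookie", "1=1; Path=/"),
    ("sEt-Cookie", "2=2; Path=/"),
    ("SET-COOKIE", "3=3; Path=/"),
]

non_cookie = [h for h in headers
              if not (isinstance(h[0], str) and h[0].lower() == "set-cookie")]
cookies = [(name.lower(), value) for name, value in headers
           if isinstance(name, str) and name.lower() == "set-cookie"]

print(non_cookie + cookies)
# [('Content-Type', 'application/json'), ('set-cookie', '1=1; Path=/'),
#  ('set-cookie', '2=2; Path=/'), ('set-cookie', '3=3; Path=/')]
```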
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zappa/middleware.py`
Content:
```
1 from werkzeug.wsgi import ClosingIterator
2
3
4 def all_casings(input_string):
5 """
6 Permute all casings of a given string.
7
8 A pretty algorithm, via @Amber
9 http://stackoverflow.com/questions/6792803/finding-all-possible-case-permutations-in-python
10 """
11 if not input_string:
12 yield ""
13 else:
14 first = input_string[:1]
15 if first.lower() == first.upper():
16 for sub_casing in all_casings(input_string[1:]):
17 yield first + sub_casing
18 else:
19 for sub_casing in all_casings(input_string[1:]):
20 yield first.lower() + sub_casing
21 yield first.upper() + sub_casing
22
23
24 class ZappaWSGIMiddleware(object):
25 """
26 Middleware functions necessary for a Zappa deployment.
27
28 Most hacks have now been remove except for Set-Cookie permutation.
29 """
30 def __init__(self, application):
31 self.application = application
32
33 def __call__(self, environ, start_response):
34 """
35 We must case-mangle the Set-Cookie header name or AWS will use only a
36 single one of these headers.
37 """
38
39 def encode_response(status, headers, exc_info=None):
40 """
41 Create an APIGW-acceptable version of our cookies.
42
43 We have to use a bizarre hack that turns multiple Set-Cookie headers into
44 their case-permutated format, ex:
45
46 Set-cookie:
47 sEt-cookie:
48 seT-cookie:
49
50 To get around an API Gateway limitation.
51
52 This is weird, but better than our previous hack of creating a Base58-encoded
53 supercookie.
54 """
55
56 # All the non-cookie headers should be sent unharmed.
57
58 # The main app can send 'set-cookie' headers in any casing
59 # Related: https://github.com/Miserlou/Zappa/issues/990
60 new_headers = [header for header in headers
61 if ((type(header[0]) != str) or (header[0].lower() != 'set-cookie'))]
62 cookie_headers = [header for header in headers
63 if ((type(header[0]) == str) and (header[0].lower() == "set-cookie"))]
64 for header, new_name in zip(cookie_headers,
65 all_casings("Set-Cookie")):
66 new_headers.append((new_name, header[1]))
67 return start_response(status, new_headers, exc_info)
68
69 # Call the application with our modifier
70 response = self.application(environ, encode_response)
71
72 # Return the response as a WSGI-safe iterator
73 return ClosingIterator(response)
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/zappa/middleware.py b/zappa/middleware.py
--- a/zappa/middleware.py
+++ b/zappa/middleware.py
@@ -38,32 +38,17 @@
def encode_response(status, headers, exc_info=None):
"""
- Create an APIGW-acceptable version of our cookies.
-
- We have to use a bizarre hack that turns multiple Set-Cookie headers into
- their case-permutated format, ex:
-
- Set-cookie:
- sEt-cookie:
- seT-cookie:
-
- To get around an API Gateway limitation.
-
- This is weird, but better than our previous hack of creating a Base58-encoded
- supercookie.
+ This makes the 'set-cookie' headers name lowercase,
+ all the non-cookie headers should be sent unharmed.
+ Related: https://github.com/Miserlou/Zappa/issues/1965
"""
- # All the non-cookie headers should be sent unharmed.
-
- # The main app can send 'set-cookie' headers in any casing
- # Related: https://github.com/Miserlou/Zappa/issues/990
new_headers = [header for header in headers
if ((type(header[0]) != str) or (header[0].lower() != 'set-cookie'))]
- cookie_headers = [header for header in headers
+ cookie_headers = [(header[0].lower(), header[1]) for header in headers
if ((type(header[0]) == str) and (header[0].lower() == "set-cookie"))]
- for header, new_name in zip(cookie_headers,
- all_casings("Set-Cookie")):
- new_headers.append((new_name, header[1]))
+ new_headers = new_headers + cookie_headers
+
return start_response(status, new_headers, exc_info)
# Call the application with our modifier
|
{"golden_diff": "diff --git a/zappa/middleware.py b/zappa/middleware.py\n--- a/zappa/middleware.py\n+++ b/zappa/middleware.py\n@@ -38,32 +38,17 @@\n \n def encode_response(status, headers, exc_info=None):\n \"\"\"\n- Create an APIGW-acceptable version of our cookies.\n-\n- We have to use a bizarre hack that turns multiple Set-Cookie headers into\n- their case-permutated format, ex:\n-\n- Set-cookie:\n- sEt-cookie:\n- seT-cookie:\n-\n- To get around an API Gateway limitation.\n-\n- This is weird, but better than our previous hack of creating a Base58-encoded\n- supercookie.\n+ This makes the 'set-cookie' headers name lowercase,\n+ all the non-cookie headers should be sent unharmed.\n+ Related: https://github.com/Miserlou/Zappa/issues/1965\n \"\"\"\n \n- # All the non-cookie headers should be sent unharmed.\n- \n- # The main app can send 'set-cookie' headers in any casing\n- # Related: https://github.com/Miserlou/Zappa/issues/990\n new_headers = [header for header in headers\n if ((type(header[0]) != str) or (header[0].lower() != 'set-cookie'))]\n- cookie_headers = [header for header in headers \n+ cookie_headers = [(header[0].lower(), header[1]) for header in headers\n if ((type(header[0]) == str) and (header[0].lower() == \"set-cookie\"))]\n- for header, new_name in zip(cookie_headers,\n- all_casings(\"Set-Cookie\")):\n- new_headers.append((new_name, header[1]))\n+ new_headers = new_headers + cookie_headers\n+\n return start_response(status, new_headers, exc_info)\n \n # Call the application with our modifier\n", "issue": "Set_Cookie option sets duplicate cookies on AWS Lambda\n## Context\r\nI have an API running Python3.7 and Zappa (in a virtualenv).\r\nI am setting 6 cookies by using the option \"set_cookie\" in flask. It looks something like this:\r\n```\r\nresp = make_response(jsonify({'success':'true', 'message': 'Successfully authenticated!'}), 200)\r\nresp.set_cookie(\"1\", value=\"1\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nresp.set_cookie(\"2\", value=\"2\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nresp.set_cookie(\"3\", value=\"3\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nresp.set_cookie(\"4\", value=\"4\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nresp.set_cookie(\"5\", value=\"5\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nresp.set_cookie(\"6\", value=\"6\", secure=True, samesite='Lax', domain=\".example.com\",max_age=3600)\r\nreturn resp\r\n```\r\n\r\nOn localhost testing Flask, this works as expected.\r\n\r\nIf I deploy the same code to AWS using Zappa, the response header will show 36 \"set-cookie\" headers. So the formula here is n^2. So if I add 4 cookies using the above method, it will show 16 in the request header.\r\n\r\nThe browser takes care of duplicate cookies, but the response from the API is still huge because of this issue.\r\n\r\nSame thing happens if I use: \r\n`resp.headers.add(\"set-cookie\"\"1\"=\"1; Domain=.example.com; Max-Age=3600; Secure; Path=/; SameSite=Lax\")`\r\n\r\n## Expected Behavior\r\nI believe Zappa or something at AWS is at fault here. Expected behaviour is to send 6 \"set-cookie\" headers and not 36.\r\n\r\n## Actual Behavior\r\nSets n^2 cookies as response.\r\n\r\n## Steps to Reproduce\r\nDeploy a Flask route using Zappa which sets the cookies. 
Use the code above.\r\n\r\n## Your Environment\r\n* Zappa version used: 0.48.2\r\n* Operating System and Python version: Ubuntu 18.04, Python3.7\r\n* The output of `pip freeze`: https://pastebin.com/d4QTaTuG\r\n* Your `zappa_settings.py`: https://pastebin.com/d1GK8sbe\n", "before_files": [{"content": "from werkzeug.wsgi import ClosingIterator\n\n\ndef all_casings(input_string):\n \"\"\"\n Permute all casings of a given string.\n\n A pretty algorithm, via @Amber\n http://stackoverflow.com/questions/6792803/finding-all-possible-case-permutations-in-python\n \"\"\"\n if not input_string:\n yield \"\"\n else:\n first = input_string[:1]\n if first.lower() == first.upper():\n for sub_casing in all_casings(input_string[1:]):\n yield first + sub_casing\n else:\n for sub_casing in all_casings(input_string[1:]):\n yield first.lower() + sub_casing\n yield first.upper() + sub_casing\n\n\nclass ZappaWSGIMiddleware(object):\n \"\"\"\n Middleware functions necessary for a Zappa deployment.\n\n Most hacks have now been remove except for Set-Cookie permutation.\n \"\"\"\n def __init__(self, application):\n self.application = application\n\n def __call__(self, environ, start_response):\n \"\"\"\n We must case-mangle the Set-Cookie header name or AWS will use only a\n single one of these headers.\n \"\"\"\n\n def encode_response(status, headers, exc_info=None):\n \"\"\"\n Create an APIGW-acceptable version of our cookies.\n\n We have to use a bizarre hack that turns multiple Set-Cookie headers into\n their case-permutated format, ex:\n\n Set-cookie:\n sEt-cookie:\n seT-cookie:\n\n To get around an API Gateway limitation.\n\n This is weird, but better than our previous hack of creating a Base58-encoded\n supercookie.\n \"\"\"\n\n # All the non-cookie headers should be sent unharmed.\n \n # The main app can send 'set-cookie' headers in any casing\n # Related: https://github.com/Miserlou/Zappa/issues/990\n new_headers = [header for header in headers\n if ((type(header[0]) != str) or (header[0].lower() != 'set-cookie'))]\n cookie_headers = [header for header in headers \n if ((type(header[0]) == str) and (header[0].lower() == \"set-cookie\"))]\n for header, new_name in zip(cookie_headers,\n all_casings(\"Set-Cookie\")):\n new_headers.append((new_name, header[1]))\n return start_response(status, new_headers, exc_info)\n\n # Call the application with our modifier\n response = self.application(environ, encode_response)\n\n # Return the response as a WSGI-safe iterator\n return ClosingIterator(response)\n", "path": "zappa/middleware.py"}], "after_files": [{"content": "from werkzeug.wsgi import ClosingIterator\n\n\ndef all_casings(input_string):\n \"\"\"\n Permute all casings of a given string.\n\n A pretty algorithm, via @Amber\n http://stackoverflow.com/questions/6792803/finding-all-possible-case-permutations-in-python\n \"\"\"\n if not input_string:\n yield \"\"\n else:\n first = input_string[:1]\n if first.lower() == first.upper():\n for sub_casing in all_casings(input_string[1:]):\n yield first + sub_casing\n else:\n for sub_casing in all_casings(input_string[1:]):\n yield first.lower() + sub_casing\n yield first.upper() + sub_casing\n\n\nclass ZappaWSGIMiddleware(object):\n \"\"\"\n Middleware functions necessary for a Zappa deployment.\n\n Most hacks have now been remove except for Set-Cookie permutation.\n \"\"\"\n def __init__(self, application):\n self.application = application\n\n def __call__(self, environ, start_response):\n \"\"\"\n We must case-mangle the Set-Cookie header name or AWS will use only a\n 
single one of these headers.\n \"\"\"\n\n def encode_response(status, headers, exc_info=None):\n \"\"\"\n This makes the 'set-cookie' headers name lowercase,\n all the non-cookie headers should be sent unharmed.\n Related: https://github.com/Miserlou/Zappa/issues/1965\n \"\"\"\n\n new_headers = [header for header in headers\n if ((type(header[0]) != str) or (header[0].lower() != 'set-cookie'))]\n cookie_headers = [(header[0].lower(), header[1]) for header in headers\n if ((type(header[0]) == str) and (header[0].lower() == \"set-cookie\"))]\n new_headers = new_headers + cookie_headers\n\n return start_response(status, new_headers, exc_info)\n\n # Call the application with our modifier\n response = self.application(environ, encode_response)\n\n # Return the response as a WSGI-safe iterator\n return ClosingIterator(response)\n", "path": "zappa/middleware.py"}]}
| 1,546 | 430 |
gh_patches_debug_32826
|
rasdani/github-patches
|
git_diff
|
dotkom__onlineweb4-1498
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Don't send additional 500 email if no useful information in it.
After the implementation of #1485 we get an additional email for _all_ 500 errors, even if there is no supplied information. Let's not send an email if there's no useful information in it.
--- END ISSUE ---
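A minimal sketch of the requested guard (plain Python with stand-ins — `reason` for `request.POST.get('reason')` and `send_report` for Django's `send_mail` — not the project's actual view): when no description is supplied, redirect without sending any email.
```python
def handle_error_report(reason, send_report):
    if not reason:               # None or empty string: nothing useful to report
        return "redirect:home"   # skip the email entirely
    send_report("500error user-report", reason)
    return "redirect:home"

# No email for an empty description, email sent otherwise.
assert handle_error_report("", lambda subject, body: None) == "redirect:home"
assert handle_error_report("user-supplied description", print) == "redirect:home"
```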
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `onlineweb4/views.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import logging
3 from smtplib import SMTPException
4
5 from django.conf import settings
6 from django.contrib import messages
7 from django.core.mail import send_mail
8 from django.shortcuts import redirect, render
9 from onlineweb4.forms import ErrorForm
10
11
12 def server_error(request):
13 log = logging.getLogger(__name__)
14
15 if request.method == 'POST':
16 form = ErrorForm(request.POST)
17 message = request.POST.get('reason', 'Ingen forklaring oppgitt.')
18 try:
19 log.error('%s triggered a 500 server error and provided the following description: %s' % (
20 request.user,
21 message
22 ))
23 send_mail('500error user-report', message,
24 settings.DEFAULT_FROM_EMAIL, [settings.EMAIL_DOTKOM])
25 log.debug('Finished sending error email to %s' % settings.EMAIL_DOTKOM)
26
27 messages.success(request, 'Feilmeldingen din ble sendt til %s' % settings.EMAIL_DOTKOM)
28
29 return redirect('home')
30 except SMTPException:
31 messages.error(request, 'Det oppstod en uventet feil under sending av feilmeldingen')
32 return redirect('home')
33
34 return render(request, '500.html', {'error_form': ErrorForm})
35
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/onlineweb4/views.py b/onlineweb4/views.py
--- a/onlineweb4/views.py
+++ b/onlineweb4/views.py
@@ -1,21 +1,22 @@
# -*- coding: utf-8 -*-
import logging
-from smtplib import SMTPException
from django.conf import settings
from django.contrib import messages
from django.core.mail import send_mail
from django.shortcuts import redirect, render
from onlineweb4.forms import ErrorForm
+from smtplib import SMTPException
def server_error(request):
log = logging.getLogger(__name__)
if request.method == 'POST':
- form = ErrorForm(request.POST)
- message = request.POST.get('reason', 'Ingen forklaring oppgitt.')
- try:
+ message = request.POST.get('reason')
+ if not message:
+ return redirect('home')
+ try:
log.error('%s triggered a 500 server error and provided the following description: %s' % (
request.user,
message
@@ -23,12 +24,9 @@
send_mail('500error user-report', message,
settings.DEFAULT_FROM_EMAIL, [settings.EMAIL_DOTKOM])
log.debug('Finished sending error email to %s' % settings.EMAIL_DOTKOM)
-
messages.success(request, 'Feilmeldingen din ble sendt til %s' % settings.EMAIL_DOTKOM)
-
return redirect('home')
except SMTPException:
messages.error(request, 'Det oppstod en uventet feil under sending av feilmeldingen')
return redirect('home')
-
return render(request, '500.html', {'error_form': ErrorForm})
|
{"golden_diff": "diff --git a/onlineweb4/views.py b/onlineweb4/views.py\n--- a/onlineweb4/views.py\n+++ b/onlineweb4/views.py\n@@ -1,21 +1,22 @@\n # -*- coding: utf-8 -*-\n import logging\n-from smtplib import SMTPException\n \n from django.conf import settings\n from django.contrib import messages\n from django.core.mail import send_mail\n from django.shortcuts import redirect, render\n from onlineweb4.forms import ErrorForm\n+from smtplib import SMTPException\n \n \n def server_error(request):\n log = logging.getLogger(__name__)\n \n if request.method == 'POST':\n- form = ErrorForm(request.POST)\n- message = request.POST.get('reason', 'Ingen forklaring oppgitt.')\n- try: \n+ message = request.POST.get('reason')\n+ if not message:\n+ return redirect('home')\n+ try:\n log.error('%s triggered a 500 server error and provided the following description: %s' % (\n request.user,\n message\n@@ -23,12 +24,9 @@\n send_mail('500error user-report', message,\n settings.DEFAULT_FROM_EMAIL, [settings.EMAIL_DOTKOM])\n log.debug('Finished sending error email to %s' % settings.EMAIL_DOTKOM)\n-\n messages.success(request, 'Feilmeldingen din ble sendt til %s' % settings.EMAIL_DOTKOM)\n-\n return redirect('home')\n except SMTPException:\n messages.error(request, 'Det oppstod en uventet feil under sending av feilmeldingen')\n return redirect('home')\n-\n return render(request, '500.html', {'error_form': ErrorForm})\n", "issue": "Don't send additional 500 email if no useful information in it.\nAfter the implementation if #1485 we get an additional email for _all_ 500 errors, even if there is no supplied information. Let's not send an email if there's no useful information in it.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport logging\nfrom smtplib import SMTPException\n\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.core.mail import send_mail\nfrom django.shortcuts import redirect, render\nfrom onlineweb4.forms import ErrorForm\n\n\ndef server_error(request):\n log = logging.getLogger(__name__)\n\n if request.method == 'POST':\n form = ErrorForm(request.POST)\n message = request.POST.get('reason', 'Ingen forklaring oppgitt.')\n try: \n log.error('%s triggered a 500 server error and provided the following description: %s' % (\n request.user,\n message\n ))\n send_mail('500error user-report', message,\n settings.DEFAULT_FROM_EMAIL, [settings.EMAIL_DOTKOM])\n log.debug('Finished sending error email to %s' % settings.EMAIL_DOTKOM)\n\n messages.success(request, 'Feilmeldingen din ble sendt til %s' % settings.EMAIL_DOTKOM)\n\n return redirect('home')\n except SMTPException:\n messages.error(request, 'Det oppstod en uventet feil under sending av feilmeldingen')\n return redirect('home')\n\n return render(request, '500.html', {'error_form': ErrorForm})\n", "path": "onlineweb4/views.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport logging\n\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.core.mail import send_mail\nfrom django.shortcuts import redirect, render\nfrom onlineweb4.forms import ErrorForm\nfrom smtplib import SMTPException\n\n\ndef server_error(request):\n log = logging.getLogger(__name__)\n\n if request.method == 'POST':\n message = request.POST.get('reason')\n if not message:\n return redirect('home')\n try:\n log.error('%s triggered a 500 server error and provided the following description: %s' % (\n request.user,\n message\n ))\n send_mail('500error user-report', message,\n 
settings.DEFAULT_FROM_EMAIL, [settings.EMAIL_DOTKOM])\n log.debug('Finished sending error email to %s' % settings.EMAIL_DOTKOM)\n messages.success(request, 'Feilmeldingen din ble sendt til %s' % settings.EMAIL_DOTKOM)\n return redirect('home')\n except SMTPException:\n messages.error(request, 'Det oppstod en uventet feil under sending av feilmeldingen')\n return redirect('home')\n return render(request, '500.html', {'error_form': ErrorForm})\n", "path": "onlineweb4/views.py"}]}
| 663 | 377 |
gh_patches_debug_1048
|
rasdani/github-patches
|
git_diff
|
mindee__doctr-243
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Problem: unit test text_export_size not passing on tf 2.3.1
Unit test text_export_size not OK locally on tf 2.3.1:
```
def test_export_sizes(test_convert_to_tflite, test_convert_to_fp16, test_quantize_model):
assert sys.getsizeof(test_convert_to_tflite) > sys.getsizeof(test_convert_to_fp16)
> assert sys.getsizeof(test_convert_to_fp16) > sys.getsizeof(test_quantize_model)
E AssertionError: assert 3041 > 3041
```
--- END ISSUE ---
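To make the failure mode concrete, here is a minimal standalone sketch (standard library only, with dummy byte strings standing in for the exported models) of the strict-inequality chain used by the test: as soon as two exports come out the same size — as in `assert 3041 > 3041` above — the check fails.
```python
import sys

def strictly_decreasing_sizes(*blobs):
    sizes = [sys.getsizeof(b) for b in blobs]
    return all(a > b for a, b in zip(sizes, sizes[1:]))

tflite_blob = b"x" * 4000      # dummy stand-ins, not real model exports
fp16_blob = b"x" * 3000
quantized_blob = b"x" * 3000   # same size as the fp16 blob on purpose

print(strictly_decreasing_sizes(tflite_blob, fp16_blob, quantized_blob))
# False — equal sizes break the strict '>' comparison, mirroring the assertion error.
```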
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright (C) 2021, Mindee.
2
3 # This program is licensed under the Apache License version 2.
4 # See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.
5
6 """
7 Package installation setup
8 """
9
10 import os
11 from pathlib import Path
12 import subprocess
13
14 from setuptools import find_packages, setup
15
16
17 version = "0.1.2a0"
18 sha = 'Unknown'
19 package_name = 'doctr'
20
21 cwd = Path(__file__).parent.absolute()
22
23 if os.getenv('BUILD_VERSION'):
24 version = os.getenv('BUILD_VERSION')
25 elif sha != 'Unknown':
26 try:
27 sha = subprocess.check_output(['git', 'rev-parse', 'HEAD'], cwd=cwd).decode('ascii').strip()
28 except Exception:
29 pass
30 version += '+' + sha[:7]
31 print(f"Building wheel {package_name}-{version}")
32
33 with open(cwd.joinpath(package_name, 'version.py'), 'w') as f:
34 f.write(f"__version__ = '{version}'\n")
35
36 with open('README.md', 'r') as f:
37 readme = f.read()
38
39 requirements = [
40 "numpy>=1.16.0",
41 "scipy>=1.4.0",
42 "opencv-python>=4.2",
43 "tensorflow>=2.3.0",
44 "PyMuPDF>=1.16.0,<1.18.11",
45 "pyclipper>=1.2.0",
46 "shapely>=1.6.0",
47 "matplotlib>=3.1.0",
48 "mplcursors>=0.3",
49 "rapidfuzz>=1.0.0",
50 "weasyprint>=52.2",
51 ]
52
53 setup(
54 # Metadata
55 name=os.getenv('PKG_INDEX') if os.getenv('PKG_INDEX') else package_name,
56 version=version,
57 author='François-Guillaume Fernandez, Charles Gaillard',
58 author_email='[email protected]',
59 description='Extract valuable text information from your documents',
60 long_description=readme,
61 long_description_content_type="text/markdown",
62 url='https://github.com/mindee/doctr',
63 download_url='https://github.com/mindee/doctr/tags',
64 license='Apache',
65 classifiers=[
66 'Development Status :: 3 - Alpha',
67 'Intended Audience :: Developers',
68 'Intended Audience :: Science/Research',
69 'License :: OSI Approved :: Apache Software License',
70 'Natural Language :: English',
71 'Operating System :: OS Independent',
72 'Programming Language :: Python :: 3',
73 'Programming Language :: Python :: 3.6',
74 'Programming Language :: Python :: 3.7',
75 'Topic :: Scientific/Engineering',
76 'Topic :: Scientific/Engineering :: Artificial Intelligence',
77 'Topic :: Software Development',
78 'Topic :: Software Development :: Libraries',
79 'Topic :: Software Development :: Libraries :: Python Modules',
80 ],
81 keywords=['ocr', 'deep learning', 'tensorflow', 'text detection', 'text recognition'],
82
83 # Package info
84 packages=find_packages(exclude=('test',)),
85 zip_safe=True,
86 python_requires='>=3.6.0',
87 include_package_data=True,
88 install_requires=requirements,
89 package_data={'': ['LICENSE']}
90 )
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -40,7 +40,7 @@
"numpy>=1.16.0",
"scipy>=1.4.0",
"opencv-python>=4.2",
- "tensorflow>=2.3.0",
+ "tensorflow>=2.4.0",
"PyMuPDF>=1.16.0,<1.18.11",
"pyclipper>=1.2.0",
"shapely>=1.6.0",
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -40,7 +40,7 @@\n \"numpy>=1.16.0\",\n \"scipy>=1.4.0\",\n \"opencv-python>=4.2\",\n- \"tensorflow>=2.3.0\",\n+ \"tensorflow>=2.4.0\",\n \"PyMuPDF>=1.16.0,<1.18.11\",\n \"pyclipper>=1.2.0\",\n \"shapely>=1.6.0\",\n", "issue": "Pb: unitest text_export_size not passing on tf 2.3.1\nUnitest text_export_size not OK locally on tf 2.3.1 : \r\n\r\n```\r\ndef test_export_sizes(test_convert_to_tflite, test_convert_to_fp16, test_quantize_model):\r\n assert sys.getsizeof(test_convert_to_tflite) > sys.getsizeof(test_convert_to_fp16)\r\n> assert sys.getsizeof(test_convert_to_fp16) > sys.getsizeof(test_quantize_model)\r\nE AssertionError: assert 3041 > 3041\r\n\r\n```\n", "before_files": [{"content": "# Copyright (C) 2021, Mindee.\n\n# This program is licensed under the Apache License version 2.\n# See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.\n\n\"\"\"\nPackage installation setup\n\"\"\"\n\nimport os\nfrom pathlib import Path\nimport subprocess\n\nfrom setuptools import find_packages, setup\n\n\nversion = \"0.1.2a0\"\nsha = 'Unknown'\npackage_name = 'doctr'\n\ncwd = Path(__file__).parent.absolute()\n\nif os.getenv('BUILD_VERSION'):\n version = os.getenv('BUILD_VERSION')\nelif sha != 'Unknown':\n try:\n sha = subprocess.check_output(['git', 'rev-parse', 'HEAD'], cwd=cwd).decode('ascii').strip()\n except Exception:\n pass\n version += '+' + sha[:7]\nprint(f\"Building wheel {package_name}-{version}\")\n\nwith open(cwd.joinpath(package_name, 'version.py'), 'w') as f:\n f.write(f\"__version__ = '{version}'\\n\")\n\nwith open('README.md', 'r') as f:\n readme = f.read()\n\nrequirements = [\n \"numpy>=1.16.0\",\n \"scipy>=1.4.0\",\n \"opencv-python>=4.2\",\n \"tensorflow>=2.3.0\",\n \"PyMuPDF>=1.16.0,<1.18.11\",\n \"pyclipper>=1.2.0\",\n \"shapely>=1.6.0\",\n \"matplotlib>=3.1.0\",\n \"mplcursors>=0.3\",\n \"rapidfuzz>=1.0.0\",\n \"weasyprint>=52.2\",\n]\n\nsetup(\n # Metadata\n name=os.getenv('PKG_INDEX') if os.getenv('PKG_INDEX') else package_name,\n version=version,\n author='Fran\u00e7ois-Guillaume Fernandez, Charles Gaillard',\n author_email='[email protected]',\n description='Extract valuable text information from your documents',\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n url='https://github.com/mindee/doctr',\n download_url='https://github.com/mindee/doctr/tags',\n license='Apache',\n classifiers=[\n 'Development Status :: 3 - Alpha',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Operating System :: OS Independent',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Software Development',\n 'Topic :: Software Development :: Libraries',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n keywords=['ocr', 'deep learning', 'tensorflow', 'text detection', 'text recognition'],\n\n # Package info\n packages=find_packages(exclude=('test',)),\n zip_safe=True,\n python_requires='>=3.6.0',\n include_package_data=True,\n install_requires=requirements,\n package_data={'': ['LICENSE']}\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright (C) 2021, Mindee.\n\n# This program is licensed under the 
Apache License version 2.\n# See LICENSE or go to <https://www.apache.org/licenses/LICENSE-2.0.txt> for full license details.\n\n\"\"\"\nPackage installation setup\n\"\"\"\n\nimport os\nfrom pathlib import Path\nimport subprocess\n\nfrom setuptools import find_packages, setup\n\n\nversion = \"0.1.2a0\"\nsha = 'Unknown'\npackage_name = 'doctr'\n\ncwd = Path(__file__).parent.absolute()\n\nif os.getenv('BUILD_VERSION'):\n version = os.getenv('BUILD_VERSION')\nelif sha != 'Unknown':\n try:\n sha = subprocess.check_output(['git', 'rev-parse', 'HEAD'], cwd=cwd).decode('ascii').strip()\n except Exception:\n pass\n version += '+' + sha[:7]\nprint(f\"Building wheel {package_name}-{version}\")\n\nwith open(cwd.joinpath(package_name, 'version.py'), 'w') as f:\n f.write(f\"__version__ = '{version}'\\n\")\n\nwith open('README.md', 'r') as f:\n readme = f.read()\n\nrequirements = [\n \"numpy>=1.16.0\",\n \"scipy>=1.4.0\",\n \"opencv-python>=4.2\",\n \"tensorflow>=2.4.0\",\n \"PyMuPDF>=1.16.0,<1.18.11\",\n \"pyclipper>=1.2.0\",\n \"shapely>=1.6.0\",\n \"matplotlib>=3.1.0\",\n \"mplcursors>=0.3\",\n \"rapidfuzz>=1.0.0\",\n \"weasyprint>=52.2\",\n]\n\nsetup(\n # Metadata\n name=os.getenv('PKG_INDEX') if os.getenv('PKG_INDEX') else package_name,\n version=version,\n author='Fran\u00e7ois-Guillaume Fernandez, Charles Gaillard',\n author_email='[email protected]',\n description='Extract valuable text information from your documents',\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n url='https://github.com/mindee/doctr',\n download_url='https://github.com/mindee/doctr/tags',\n license='Apache',\n classifiers=[\n 'Development Status :: 3 - Alpha',\n 'Intended Audience :: Developers',\n 'Intended Audience :: Science/Research',\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Operating System :: OS Independent',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Software Development',\n 'Topic :: Software Development :: Libraries',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n keywords=['ocr', 'deep learning', 'tensorflow', 'text detection', 'text recognition'],\n\n # Package info\n packages=find_packages(exclude=('test',)),\n zip_safe=True,\n python_requires='>=3.6.0',\n include_package_data=True,\n install_requires=requirements,\n package_data={'': ['LICENSE']}\n)\n", "path": "setup.py"}]}
| 1,263 | 130 |
gh_patches_debug_30338
|
rasdani/github-patches
|
git_diff
|
ansible__molecule-3105
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add support for checking exit codes on shell dependencies
# Issue Type
- Feature request
# Molecule and Ansible details
```
ansible 2.10.5
config file = /Users/jhg03a/<redacted>/ansible.cfg
configured module search path = ['/Users/jhg03a/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.9.2 (default, Feb 24 2021, 13:26:09) [Clang 12.0.0 (clang-1200.0.32.29)]
molecule 3.2.3 using python 3.9
ansible:2.10.5
delegated:3.2.3 from molecule
docker:0.2.4 from molecule_docker
```
Molecule installation method (one of):
- pip
Ansible installation method (one of):
- pip
# Desired Behavior
Currently it appears that the dependency shell module doesn't take into account the exit code from the command. If something goes wrong in the dependency phase, it's highly likely the rest of the run is going to fail or have inconsistent results.
### Example:
```yaml
dependency:
name: shell
command: 'false'
```
--- END ISSUE ---
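The patch later in this entry leans on standard `subprocess` semantics: running a command with `check=True` raises `CalledProcessError` on a non-zero exit status, and the exception carries the return code. A minimal standalone sketch (standard library only, assuming a POSIX `false` binary is on PATH):
```python
import subprocess

try:
    subprocess.run(["false"], check=True)
except subprocess.CalledProcessError as exc:
    # exc.returncode is 1 here; a dependency runner can surface or exit with it.
    print(f"dependency command failed with exit code {exc.returncode}")
```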
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/molecule/dependency/base.py`
Content:
```
1 # Copyright (c) 2015-2018 Cisco Systems, Inc.
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a copy
4 # of this software and associated documentation files (the "Software"), to
5 # deal in the Software without restriction, including without limitation the
6 # rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
7 # sell copies of the Software, and to permit persons to whom the Software is
8 # furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
16 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
18 # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
19 # DEALINGS IN THE SOFTWARE.
20 """Base Dependency Module."""
21
22 import abc
23 import logging
24 import os
25 import time
26
27 from molecule import constants, util
28
29 LOG = logging.getLogger(__name__)
30
31
32 class Base(object):
33 """Dependency Base Class."""
34
35 __metaclass__ = abc.ABCMeta
36
37 RETRY = 3
38 SLEEP = 3
39 BACKOFF = 3
40
41 def __init__(self, config):
42 """
43 Initialize code for all :ref:`Dependency` classes.
44
45 :param config: An instance of a Molecule config.
46 :returns: None
47 """
48 self._config = config
49
50 def execute_with_retries(self):
51 """Run dependency downloads with retry and timed back-off."""
52 exception = None
53
54 try:
55 # print(555, self._sh_command)
56 util.run_command(self._sh_command, debug=self._config.debug)
57 msg = "Dependency completed successfully."
58 LOG.info(msg)
59 return
60 except Exception:
61 pass
62
63 for counter in range(1, (self.RETRY + 1)):
64 msg = "Retrying dependency ... {}/{} time(s)".format(counter, self.RETRY)
65 LOG.warning(msg)
66
67 msg = "Sleeping {} seconds before retrying ...".format(self.SLEEP)
68 LOG.warning(msg)
69 time.sleep(self.SLEEP)
70 self.SLEEP += self.BACKOFF
71
72 try:
73 util.run_command(self._sh_command, debug=self._config.debug)
74 msg = "Dependency completed successfully."
75 LOG.info(msg)
76 return
77 except Exception as _exception:
78 exception = _exception
79
80 LOG.error(str(exception), self._sh_command)
81 util.sysexit(getattr(exception, "exit_code", constants.RC_UNKNOWN_ERROR))
82
83 @abc.abstractmethod
84 def execute(self): # pragma: no cover
85 """
86 Execute ``cmd`` and returns None.
87
88 :return: None
89 """
90
91 @abc.abstractproperty
92 def default_options(self): # pragma: no cover
93 """
94 Get default CLI arguments provided to ``cmd`` as a dict.
95
96 :return: dict
97 """
98
99 @property
100 def default_env(self): # pragma: no cover
101 """
102 Get default env variables provided to ``cmd`` as a dict.
103
104 :return: dict
105 """
106 env = util.merge_dicts(os.environ, self._config.env)
107 # inject ephemeral_directory on top of path
108 env[self._config.ansible_collections_path] = os.path.join(
109 self._config.scenario.ephemeral_directory, "collections"
110 )
111 return env
112
113 @property
114 def name(self):
115 """
116 Name of the dependency and returns a string.
117
118 :returns: str
119 """
120 return self._config.config["dependency"]["name"]
121
122 @property
123 def enabled(self):
124 return self._config.config["dependency"]["enabled"]
125
126 @property
127 def options(self):
128 return util.merge_dicts(
129 self.default_options, self._config.config["dependency"]["options"]
130 )
131
132 @property
133 def env(self):
134 return util.merge_dicts(
135 self.default_env, self._config.config["dependency"]["env"]
136 )
137
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/molecule/dependency/base.py b/src/molecule/dependency/base.py
--- a/src/molecule/dependency/base.py
+++ b/src/molecule/dependency/base.py
@@ -23,8 +23,9 @@
import logging
import os
import time
+from subprocess import CalledProcessError
-from molecule import constants, util
+from molecule import util
LOG = logging.getLogger(__name__)
@@ -53,11 +54,11 @@
try:
# print(555, self._sh_command)
- util.run_command(self._sh_command, debug=self._config.debug)
+ util.run_command(self._sh_command, debug=self._config.debug, check=True)
msg = "Dependency completed successfully."
LOG.info(msg)
return
- except Exception:
+ except CalledProcessError:
pass
for counter in range(1, (self.RETRY + 1)):
@@ -70,15 +71,15 @@
self.SLEEP += self.BACKOFF
try:
- util.run_command(self._sh_command, debug=self._config.debug)
+ util.run_command(self._sh_command, debug=self._config.debug, check=True)
msg = "Dependency completed successfully."
LOG.info(msg)
return
- except Exception as _exception:
+ except CalledProcessError as _exception:
exception = _exception
- LOG.error(str(exception), self._sh_command)
- util.sysexit(getattr(exception, "exit_code", constants.RC_UNKNOWN_ERROR))
+ LOG.error(str(exception))
+ util.sysexit(exception.returncode)
@abc.abstractmethod
def execute(self): # pragma: no cover
|
{"golden_diff": "diff --git a/src/molecule/dependency/base.py b/src/molecule/dependency/base.py\n--- a/src/molecule/dependency/base.py\n+++ b/src/molecule/dependency/base.py\n@@ -23,8 +23,9 @@\n import logging\n import os\n import time\n+from subprocess import CalledProcessError\n \n-from molecule import constants, util\n+from molecule import util\n \n LOG = logging.getLogger(__name__)\n \n@@ -53,11 +54,11 @@\n \n try:\n # print(555, self._sh_command)\n- util.run_command(self._sh_command, debug=self._config.debug)\n+ util.run_command(self._sh_command, debug=self._config.debug, check=True)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n- except Exception:\n+ except CalledProcessError:\n pass\n \n for counter in range(1, (self.RETRY + 1)):\n@@ -70,15 +71,15 @@\n self.SLEEP += self.BACKOFF\n \n try:\n- util.run_command(self._sh_command, debug=self._config.debug)\n+ util.run_command(self._sh_command, debug=self._config.debug, check=True)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n- except Exception as _exception:\n+ except CalledProcessError as _exception:\n exception = _exception\n \n- LOG.error(str(exception), self._sh_command)\n- util.sysexit(getattr(exception, \"exit_code\", constants.RC_UNKNOWN_ERROR))\n+ LOG.error(str(exception))\n+ util.sysexit(exception.returncode)\n \n @abc.abstractmethod\n def execute(self): # pragma: no cover\n", "issue": "Add support for checking exit codes on shell dependencies\n# Issue Type\r\n\r\n- Feature request\r\n\r\n# Molecule and Ansible details\r\n\r\n```\r\nansible 2.10.5\r\n config file = /Users/jhg03a/<redacted>/ansible.cfg\r\n configured module search path = ['/Users/jhg03a/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']\r\n ansible python module location = /usr/local/lib/python3.9/site-packages/ansible\r\n executable location = /usr/local/bin/ansible\r\n python version = 3.9.2 (default, Feb 24 2021, 13:26:09) [Clang 12.0.0 (clang-1200.0.32.29)]\r\nmolecule 3.2.3 using python 3.9\r\n ansible:2.10.5\r\n delegated:3.2.3 from molecule\r\n docker:0.2.4 from molecule_docker\r\n```\r\n\r\nMolecule installation method (one of):\r\n\r\n- pip\r\n\r\nAnsible installation method (one of):\r\n\r\n- pip\r\n\r\n# Desired Behavior\r\n\r\nCurrently it appears that the dependency shell module doesn't take into account the exit code from the command. If something goes wrong in the dependency phase, it's highly likely the rest of the run is going to fail or have inconsistent results.\r\n\r\n### Example:\r\n```yaml\r\ndependency:\r\n name: shell\r\n command: 'false'\r\n```\r\n\n", "before_files": [{"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or\n# sell copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\"\"\"Base Dependency Module.\"\"\"\n\nimport abc\nimport logging\nimport os\nimport time\n\nfrom molecule import constants, util\n\nLOG = logging.getLogger(__name__)\n\n\nclass Base(object):\n \"\"\"Dependency Base Class.\"\"\"\n\n __metaclass__ = abc.ABCMeta\n\n RETRY = 3\n SLEEP = 3\n BACKOFF = 3\n\n def __init__(self, config):\n \"\"\"\n Initialize code for all :ref:`Dependency` classes.\n\n :param config: An instance of a Molecule config.\n :returns: None\n \"\"\"\n self._config = config\n\n def execute_with_retries(self):\n \"\"\"Run dependency downloads with retry and timed back-off.\"\"\"\n exception = None\n\n try:\n # print(555, self._sh_command)\n util.run_command(self._sh_command, debug=self._config.debug)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n except Exception:\n pass\n\n for counter in range(1, (self.RETRY + 1)):\n msg = \"Retrying dependency ... {}/{} time(s)\".format(counter, self.RETRY)\n LOG.warning(msg)\n\n msg = \"Sleeping {} seconds before retrying ...\".format(self.SLEEP)\n LOG.warning(msg)\n time.sleep(self.SLEEP)\n self.SLEEP += self.BACKOFF\n\n try:\n util.run_command(self._sh_command, debug=self._config.debug)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n except Exception as _exception:\n exception = _exception\n\n LOG.error(str(exception), self._sh_command)\n util.sysexit(getattr(exception, \"exit_code\", constants.RC_UNKNOWN_ERROR))\n\n @abc.abstractmethod\n def execute(self): # pragma: no cover\n \"\"\"\n Execute ``cmd`` and returns None.\n\n :return: None\n \"\"\"\n\n @abc.abstractproperty\n def default_options(self): # pragma: no cover\n \"\"\"\n Get default CLI arguments provided to ``cmd`` as a dict.\n\n :return: dict\n \"\"\"\n\n @property\n def default_env(self): # pragma: no cover\n \"\"\"\n Get default env variables provided to ``cmd`` as a dict.\n\n :return: dict\n \"\"\"\n env = util.merge_dicts(os.environ, self._config.env)\n # inject ephemeral_directory on top of path\n env[self._config.ansible_collections_path] = os.path.join(\n self._config.scenario.ephemeral_directory, \"collections\"\n )\n return env\n\n @property\n def name(self):\n \"\"\"\n Name of the dependency and returns a string.\n\n :returns: str\n \"\"\"\n return self._config.config[\"dependency\"][\"name\"]\n\n @property\n def enabled(self):\n return self._config.config[\"dependency\"][\"enabled\"]\n\n @property\n def options(self):\n return util.merge_dicts(\n self.default_options, self._config.config[\"dependency\"][\"options\"]\n )\n\n @property\n def env(self):\n return util.merge_dicts(\n self.default_env, self._config.config[\"dependency\"][\"env\"]\n )\n", "path": "src/molecule/dependency/base.py"}], "after_files": [{"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or\n# sell copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright 
notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\"\"\"Base Dependency Module.\"\"\"\n\nimport abc\nimport logging\nimport os\nimport time\nfrom subprocess import CalledProcessError\n\nfrom molecule import util\n\nLOG = logging.getLogger(__name__)\n\n\nclass Base(object):\n \"\"\"Dependency Base Class.\"\"\"\n\n __metaclass__ = abc.ABCMeta\n\n RETRY = 3\n SLEEP = 3\n BACKOFF = 3\n\n def __init__(self, config):\n \"\"\"\n Initialize code for all :ref:`Dependency` classes.\n\n :param config: An instance of a Molecule config.\n :returns: None\n \"\"\"\n self._config = config\n\n def execute_with_retries(self):\n \"\"\"Run dependency downloads with retry and timed back-off.\"\"\"\n exception = None\n\n try:\n # print(555, self._sh_command)\n util.run_command(self._sh_command, debug=self._config.debug, check=True)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n except CalledProcessError:\n pass\n\n for counter in range(1, (self.RETRY + 1)):\n msg = \"Retrying dependency ... {}/{} time(s)\".format(counter, self.RETRY)\n LOG.warning(msg)\n\n msg = \"Sleeping {} seconds before retrying ...\".format(self.SLEEP)\n LOG.warning(msg)\n time.sleep(self.SLEEP)\n self.SLEEP += self.BACKOFF\n\n try:\n util.run_command(self._sh_command, debug=self._config.debug, check=True)\n msg = \"Dependency completed successfully.\"\n LOG.info(msg)\n return\n except CalledProcessError as _exception:\n exception = _exception\n\n LOG.error(str(exception))\n util.sysexit(exception.returncode)\n\n @abc.abstractmethod\n def execute(self): # pragma: no cover\n \"\"\"\n Execute ``cmd`` and returns None.\n\n :return: None\n \"\"\"\n\n @abc.abstractproperty\n def default_options(self): # pragma: no cover\n \"\"\"\n Get default CLI arguments provided to ``cmd`` as a dict.\n\n :return: dict\n \"\"\"\n\n @property\n def default_env(self): # pragma: no cover\n \"\"\"\n Get default env variables provided to ``cmd`` as a dict.\n\n :return: dict\n \"\"\"\n env = util.merge_dicts(os.environ, self._config.env)\n # inject ephemeral_directory on top of path\n env[self._config.ansible_collections_path] = os.path.join(\n self._config.scenario.ephemeral_directory, \"collections\"\n )\n return env\n\n @property\n def name(self):\n \"\"\"\n Name of the dependency and returns a string.\n\n :returns: str\n \"\"\"\n return self._config.config[\"dependency\"][\"name\"]\n\n @property\n def enabled(self):\n return self._config.config[\"dependency\"][\"enabled\"]\n\n @property\n def options(self):\n return util.merge_dicts(\n self.default_options, self._config.config[\"dependency\"][\"options\"]\n )\n\n @property\n def env(self):\n return util.merge_dicts(\n self.default_env, self._config.config[\"dependency\"][\"env\"]\n )\n", "path": "src/molecule/dependency/base.py"}]}
| 1,829 | 372 |
gh_patches_debug_16475
|
rasdani/github-patches
|
git_diff
|
pyinstaller__pyinstaller-8544
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Could not import `pywintypes` or `win32api` from `win32ctypes.pywin32`
## Description of the issue
Error when running the executable.
Issue is present in 6.6.0 and "latest development version". Issue is not present in 6.5.0.
Output differs between versions at this point:
6.6.0:
```
import 'win32ctypes.core' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001C6D9E41BB0>
# win32ctypes.core._common not found in PYZ
# win32ctypes.core.ctypes not found in PYZ
# destroy win32ctypes.pywin32.win32api
# destroy win32ctypes.pywin32
# destroy PyInstaller
Could not import `pywintypes` or `win32api` from `win32ctypes.pywin32`.
Please make sure that `pywin32-ctypes` is installed and importable, for example:
pip install pywin32-ctypes
```
6.5.0:
```
# cffi not found in PYZ
# code object from '[...]\\cffi\\__init__.pyc'
# cffi.api not found in PYZ
# code object from '[...]\\cffi\\api.pyc'
# cffi.lock not found in PYZ
# code object from '[...]\\cffi\\lock.pyc'
import 'cffi.lock' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB845C0>
# cffi.error not found in PYZ
# code object from '[...]\\cffi\\error.pyc'
import 'cffi.error' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB846B0>
# cffi.model not found in PYZ
# code object from '[...]\\cffi\\model.pyc'
import 'cffi.model' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB848F0>
import 'cffi.api' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB52330>
import 'cffi' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB51EB0>
import 'win32ctypes.core' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB51B80>
# win32ctypes.core._common not found in PYZ
# win32ctypes.core.cffi not found in PYZ
# code object from '[...]\\win32ctypes\\core\\cffi\\__init__.pyc'
import 'win32ctypes.core.cffi' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB86BA0>
# win32ctypes.core.cffi._common not found in PYZ
# code object from '[...]\\win32ctypes\\core\\cffi\\_common.pyc'
# win32ctypes.core.cffi._util not found in PYZ
# code object from '[...]\\win32ctypes\\core\\cffi\\_util.pyc'
# win32ctypes.core.compat not found in PYZ
# code object from '[...]\\win32ctypes\\core\\compat.pyc'
import 'win32ctypes.core.compat' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB87440>
# _cffi_backend not found in PYZ
# extension module '_cffi_backend' loaded from '[...]\\_cffi_backend.cp312-win_amd64.pyd'
# extension module '_cffi_backend' executed from '[...]\\_cffi_backend.cp312-win_amd64.pyd'
import '_cffi_backend' # <_frozen_importlib_external.ExtensionFileLoader object at 0x000001F4AEB876B0>
# cffi.cparser not found in PYZ
# code object from '[...]\\cffi\\cparser.pyc'
# cffi.commontypes not found in PYZ
# code object from '[...]\\cffi\\commontypes.pyc'
import 'cffi.commontypes' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEBA91F0>
```
### Context information (for bug reports)
* 502 INFO: PyInstaller: 6.6.0, contrib hooks: 2024.6
* 503 INFO: Python: 3.12.0
* 541 INFO: Platform: Windows-10-10.0.19045-SP0
--- END ISSUE ---
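A minimal sketch of the collection strategy the patch later in this entry moves to (assuming a build environment where `PyInstaller.utils.hooks` is importable): always collect the `ctypes` backend of `win32ctypes.core`, and add the `cffi` backend only when `cffi` is importable, so the frozen app can still fall back to `ctypes` when `cffi` is absent or excluded at run time.
```python
from PyInstaller.utils.hooks import can_import_module, collect_submodules

# Always bundle the ctypes backend; add the cffi backend only if cffi is
# importable in the build environment.
hiddenimports = collect_submodules('win32ctypes.core.ctypes')
if can_import_module('cffi'):
    hiddenimports += collect_submodules('win32ctypes.core.cffi')
```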
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `PyInstaller/hooks/hook-win32ctypes.core.py`
Content:
```
1 #-----------------------------------------------------------------------------
2 # Copyright (c) 2020-2023, PyInstaller Development Team.
3 #
4 # Distributed under the terms of the GNU General Public License (version 2
5 # or later) with exception for distributing the bootloader.
6 #
7 # The full license is in the file COPYING.txt, distributed with this software.
8 #
9 # SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
10 #-----------------------------------------------------------------------------
11
12 # TODO: remove this hook during PyInstaller 4.5 release cycle!
13
14 from PyInstaller.utils.hooks import can_import_module, collect_submodules
15
16 # We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work. The
17 # use of the backend is determined by availability of cffi.
18 if can_import_module('cffi'):
19 hiddenimports = collect_submodules('win32ctypes.core.cffi')
20 else:
21 hiddenimports = collect_submodules('win32ctypes.core.ctypes')
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/PyInstaller/hooks/hook-win32ctypes.core.py b/PyInstaller/hooks/hook-win32ctypes.core.py
--- a/PyInstaller/hooks/hook-win32ctypes.core.py
+++ b/PyInstaller/hooks/hook-win32ctypes.core.py
@@ -13,9 +13,10 @@
from PyInstaller.utils.hooks import can_import_module, collect_submodules
-# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work. The
-# use of the backend is determined by availability of cffi.
+# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work.
+# Always collect the `ctypes` backend, and add the `cffi` one if `cffi` is available. Having the `ctypes` backend always
+# available helps in situations when `cffi` is available in the build environment, but is disabled at run-time or not
+# collected (e.g., due to `--exclude cffi`).
+hiddenimports = collect_submodules('win32ctypes.core.ctypes')
if can_import_module('cffi'):
- hiddenimports = collect_submodules('win32ctypes.core.cffi')
-else:
- hiddenimports = collect_submodules('win32ctypes.core.ctypes')
+ hiddenimports += collect_submodules('win32ctypes.core.cffi')
|
{"golden_diff": "diff --git a/PyInstaller/hooks/hook-win32ctypes.core.py b/PyInstaller/hooks/hook-win32ctypes.core.py\n--- a/PyInstaller/hooks/hook-win32ctypes.core.py\n+++ b/PyInstaller/hooks/hook-win32ctypes.core.py\n@@ -13,9 +13,10 @@\n \n from PyInstaller.utils.hooks import can_import_module, collect_submodules\n \n-# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work. The\n-# use of the backend is determined by availability of cffi.\n+# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work.\n+# Always collect the `ctypes` backend, and add the `cffi` one if `cffi` is available. Having the `ctypes` backend always\n+# available helps in situations when `cffi` is available in the build environment, but is disabled at run-time or not\n+# collected (e.g., due to `--exclude cffi`).\n+hiddenimports = collect_submodules('win32ctypes.core.ctypes')\n if can_import_module('cffi'):\n- hiddenimports = collect_submodules('win32ctypes.core.cffi')\n-else:\n- hiddenimports = collect_submodules('win32ctypes.core.ctypes')\n+ hiddenimports += collect_submodules('win32ctypes.core.cffi')\n", "issue": "Could not import `pywintypes` or `win32api` from `win32ctypes.pywin32`\n## Description of the issue\r\n\r\nError when running the executable.\r\n\r\nIssue is present in 6.6.0 and \"latest development version\". Issue is not present in 6.5.0.\r\n\r\nOutput differs between versions at this point:\r\n\r\n6.6.0:\r\n```\r\nimport 'win32ctypes.core' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001C6D9E41BB0>\r\n# win32ctypes.core._common not found in PYZ\r\n# win32ctypes.core.ctypes not found in PYZ\r\n# destroy win32ctypes.pywin32.win32api\r\n# destroy win32ctypes.pywin32\r\n# destroy PyInstaller\r\nCould not import `pywintypes` or `win32api` from `win32ctypes.pywin32`.\r\nPlease make sure that `pywin32-ctypes` is installed and importable, for example:\r\n\r\npip install pywin32-ctypes\r\n\r\n```\r\n\r\n6.5.0:\r\n```\r\n# cffi not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\__init__.pyc'\r\n# cffi.api not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\api.pyc'\r\n# cffi.lock not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\lock.pyc'\r\nimport 'cffi.lock' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB845C0>\r\n# cffi.error not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\error.pyc'\r\nimport 'cffi.error' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB846B0>\r\n# cffi.model not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\model.pyc'\r\nimport 'cffi.model' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB848F0>\r\nimport 'cffi.api' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB52330>\r\nimport 'cffi' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB51EB0>\r\nimport 'win32ctypes.core' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB51B80>\r\n# win32ctypes.core._common not found in PYZ\r\n# win32ctypes.core.cffi not found in PYZ\r\n# code object from '[...]\\\\win32ctypes\\\\core\\\\cffi\\\\__init__.pyc'\r\nimport 'win32ctypes.core.cffi' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB86BA0>\r\n# win32ctypes.core.cffi._common not found in PYZ\r\n# code object from '[...]\\\\win32ctypes\\\\core\\\\cffi\\\\_common.pyc'\r\n# win32ctypes.core.cffi._util not found 
in PYZ\r\n# code object from '[...]\\\\win32ctypes\\\\core\\\\cffi\\\\_util.pyc'\r\n# win32ctypes.core.compat not found in PYZ\r\n# code object from '[...]\\\\win32ctypes\\\\core\\\\compat.pyc'\r\nimport 'win32ctypes.core.compat' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEB87440>\r\n# _cffi_backend not found in PYZ\r\n# extension module '_cffi_backend' loaded from '[...]\\\\_cffi_backend.cp312-win_amd64.pyd'\r\n# extension module '_cffi_backend' executed from '[...]\\\\_cffi_backend.cp312-win_amd64.pyd'\r\nimport '_cffi_backend' # <_frozen_importlib_external.ExtensionFileLoader object at 0x000001F4AEB876B0>\r\n# cffi.cparser not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\cparser.pyc'\r\n# cffi.commontypes not found in PYZ\r\n# code object from '[...]\\\\cffi\\\\commontypes.pyc'\r\nimport 'cffi.commontypes' # <_frozen_importlib_external.SourcelessFileLoader object at 0x000001F4AEBA91F0>\r\n```\r\n\r\n\r\n### Context information (for bug reports)\r\n\r\n* 502 INFO: PyInstaller: 6.6.0, contrib hooks: 2024.6\r\n* 503 INFO: Python: 3.12.0\r\n* 541 INFO: Platform: Windows-10-10.0.19045-SP0\r\n\r\n\r\n\n", "before_files": [{"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2020-2023, PyInstaller Development Team.\n#\n# Distributed under the terms of the GNU General Public License (version 2\n# or later) with exception for distributing the bootloader.\n#\n# The full license is in the file COPYING.txt, distributed with this software.\n#\n# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)\n#-----------------------------------------------------------------------------\n\n# TODO: remove this hook during PyInstaller 4.5 release cycle!\n\nfrom PyInstaller.utils.hooks import can_import_module, collect_submodules\n\n# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work. The\n# use of the backend is determined by availability of cffi.\nif can_import_module('cffi'):\n hiddenimports = collect_submodules('win32ctypes.core.cffi')\nelse:\n hiddenimports = collect_submodules('win32ctypes.core.ctypes')\n", "path": "PyInstaller/hooks/hook-win32ctypes.core.py"}], "after_files": [{"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2020-2023, PyInstaller Development Team.\n#\n# Distributed under the terms of the GNU General Public License (version 2\n# or later) with exception for distributing the bootloader.\n#\n# The full license is in the file COPYING.txt, distributed with this software.\n#\n# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)\n#-----------------------------------------------------------------------------\n\n# TODO: remove this hook during PyInstaller 4.5 release cycle!\n\nfrom PyInstaller.utils.hooks import can_import_module, collect_submodules\n\n# We need to collect submodules from win32ctypes.core.cffi or win32ctypes.core.ctypes for win32ctypes.core to work.\n# Always collect the `ctypes` backend, and add the `cffi` one if `cffi` is available. 
Having the `ctypes` backend always\n# available helps in situations when `cffi` is available in the build environment, but is disabled at run-time or not\n# collected (e.g., due to `--exclude cffi`).\nhiddenimports = collect_submodules('win32ctypes.core.ctypes')\nif can_import_module('cffi'):\n hiddenimports += collect_submodules('win32ctypes.core.cffi')\n", "path": "PyInstaller/hooks/hook-win32ctypes.core.py"}]}
| 1,677 | 338 |
gh_patches_debug_16407
|
rasdani/github-patches
|
git_diff
|
buildbot__buildbot-5729
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
change_hook/poller not working for ReconfigurablePollingChangeSource
In [poller.py](https://github.com/buildbot/buildbot/blob/a0e1d8840e8856ead136a1ad6e2021931355af15/master/buildbot/www/hooks/poller.py#L40), the change sources are filtered like this:
```python
for source in change_svc:
if not isinstance(source, PollingChangeSource):
continue
```
This means that any pollers derived from the super-class `ReconfigurablePollingChangeSource` will not be found. Since [new code is supposed to use `ReconfigurablePollingChangeSource`](https://docs.buildbot.net/current/developer/cls-changesources.html?highlight=reconfigurablepollingchangesource#pollingchangesource), the code should probably read:
```python
for source in change_svc:
if not isinstance(source, ReconfigurablePollingChangeSource):
continue
```
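For illustration, a minimal standalone sketch of why checking against the legacy subclass misses pollers written directly against the newer base class (the classes below are hypothetical stand-ins, not Buildbot's real implementations):
```python
# Self-contained illustration of the isinstance pitfall with stand-in classes.
class ReconfigurablePollingChangeSource:          # newer, recommended base class
    pass


class PollingChangeSource(ReconfigurablePollingChangeSource):   # legacy subclass
    pass


class ModernPoller(ReconfigurablePollingChangeSource):          # poller built on the new base
    name = "modern"


sources = [ModernPoller()]
print([s.name for s in sources if isinstance(s, PollingChangeSource)])
# [] -- the modern poller is skipped
print([s.name for s in sources if isinstance(s, ReconfigurablePollingChangeSource)])
# ['modern'] -- checking against the shared base class finds it
```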
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `master/buildbot/www/hooks/poller.py`
Content:
```
1 # This file is part of Buildbot. Buildbot is free software: you can
2 # redistribute it and/or modify it under the terms of the GNU General Public
3 # License as published by the Free Software Foundation, version 2.
4 #
5 # This program is distributed in the hope that it will be useful, but WITHOUT
6 # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
7 # FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
8 # details.
9 #
10 # You should have received a copy of the GNU General Public License along with
11 # this program; if not, write to the Free Software Foundation, Inc., 51
12 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
13 #
14 # Copyright Buildbot Team Members
15
16 # This change hook allows GitHub or a hand crafted curl invocation to "knock on
17 # the door" and trigger a change source to poll.
18
19
20 from buildbot.changes.base import PollingChangeSource
21 from buildbot.util import bytes2unicode
22 from buildbot.util import unicode2bytes
23 from buildbot.www.hooks.base import BaseHookHandler
24
25
26 class PollingHandler(BaseHookHandler):
27
28 def getChanges(self, req):
29 change_svc = req.site.master.change_svc
30 poll_all = b"poller" not in req.args
31
32 allow_all = True
33 allowed = []
34 if isinstance(self.options, dict) and b"allowed" in self.options:
35 allow_all = False
36 allowed = self.options[b"allowed"]
37
38 pollers = []
39
40 for source in change_svc:
41 if not isinstance(source, PollingChangeSource):
42 continue
43 if not hasattr(source, "name"):
44 continue
45 if (not poll_all and
46 unicode2bytes(source.name) not in req.args[b'poller']):
47 continue
48 if not allow_all and unicode2bytes(source.name) not in allowed:
49 continue
50 pollers.append(source)
51
52 if not poll_all:
53 missing = (set(req.args[b'poller']) -
54 set(unicode2bytes(s.name) for s in pollers))
55 if missing:
56 raise ValueError("Could not find pollers: {}".format(
57 bytes2unicode(b",".join(missing))))
58
59 for p in pollers:
60 p.force()
61
62 return [], None
63
64
65 poller = PollingHandler
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/master/buildbot/www/hooks/poller.py b/master/buildbot/www/hooks/poller.py
--- a/master/buildbot/www/hooks/poller.py
+++ b/master/buildbot/www/hooks/poller.py
@@ -17,7 +17,7 @@
# the door" and trigger a change source to poll.
-from buildbot.changes.base import PollingChangeSource
+from buildbot.changes.base import ReconfigurablePollingChangeSource
from buildbot.util import bytes2unicode
from buildbot.util import unicode2bytes
from buildbot.www.hooks.base import BaseHookHandler
@@ -38,7 +38,7 @@
pollers = []
for source in change_svc:
- if not isinstance(source, PollingChangeSource):
+ if not isinstance(source, ReconfigurablePollingChangeSource):
continue
if not hasattr(source, "name"):
continue
|
{"golden_diff": "diff --git a/master/buildbot/www/hooks/poller.py b/master/buildbot/www/hooks/poller.py\n--- a/master/buildbot/www/hooks/poller.py\n+++ b/master/buildbot/www/hooks/poller.py\n@@ -17,7 +17,7 @@\n # the door\" and trigger a change source to poll.\n \n \n-from buildbot.changes.base import PollingChangeSource\n+from buildbot.changes.base import ReconfigurablePollingChangeSource\n from buildbot.util import bytes2unicode\n from buildbot.util import unicode2bytes\n from buildbot.www.hooks.base import BaseHookHandler\n@@ -38,7 +38,7 @@\n pollers = []\n \n for source in change_svc:\n- if not isinstance(source, PollingChangeSource):\n+ if not isinstance(source, ReconfigurablePollingChangeSource):\n continue\n if not hasattr(source, \"name\"):\n continue\n", "issue": "change_hook/poller not working for ReconfigurablePollingChangeSource\nIn [poller.py](https://github.com/buildbot/buildbot/blob/a0e1d8840e8856ead136a1ad6e2021931355af15/master/buildbot/www/hooks/poller.py#L40), the change sources are filtered like this:\r\n\r\n```python\r\n for source in change_svc:\r\n if not isinstance(source, PollingChangeSource):\r\n continue\r\n```\r\n\r\nThis means that any pollers derived from the super-class `ReconfigurablePollingChangeSource` will not be found. Since [new code is supposed to use `ReconfigurablePollingChangeSource`](https://docs.buildbot.net/current/developer/cls-changesources.html?highlight=reconfigurablepollingchangesource#pollingchangesource), the code should probably read:\r\n\r\n```python\r\n for source in change_svc:\r\n if not isinstance(source, ReconfigurablePollingChangeSource):\r\n continue\r\n```\r\n\n", "before_files": [{"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS\n# FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more\n# details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n#\n# Copyright Buildbot Team Members\n\n# This change hook allows GitHub or a hand crafted curl invocation to \"knock on\n# the door\" and trigger a change source to poll.\n\n\nfrom buildbot.changes.base import PollingChangeSource\nfrom buildbot.util import bytes2unicode\nfrom buildbot.util import unicode2bytes\nfrom buildbot.www.hooks.base import BaseHookHandler\n\n\nclass PollingHandler(BaseHookHandler):\n\n def getChanges(self, req):\n change_svc = req.site.master.change_svc\n poll_all = b\"poller\" not in req.args\n\n allow_all = True\n allowed = []\n if isinstance(self.options, dict) and b\"allowed\" in self.options:\n allow_all = False\n allowed = self.options[b\"allowed\"]\n\n pollers = []\n\n for source in change_svc:\n if not isinstance(source, PollingChangeSource):\n continue\n if not hasattr(source, \"name\"):\n continue\n if (not poll_all and\n unicode2bytes(source.name) not in req.args[b'poller']):\n continue\n if not allow_all and unicode2bytes(source.name) not in allowed:\n continue\n pollers.append(source)\n\n if not poll_all:\n missing = (set(req.args[b'poller']) -\n set(unicode2bytes(s.name) for s in pollers))\n if missing:\n raise ValueError(\"Could not find pollers: {}\".format(\n bytes2unicode(b\",\".join(missing))))\n\n for p in pollers:\n p.force()\n\n return [], None\n\n\npoller = PollingHandler\n", "path": "master/buildbot/www/hooks/poller.py"}], "after_files": [{"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS\n# FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more\n# details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n#\n# Copyright Buildbot Team Members\n\n# This change hook allows GitHub or a hand crafted curl invocation to \"knock on\n# the door\" and trigger a change source to poll.\n\n\nfrom buildbot.changes.base import ReconfigurablePollingChangeSource\nfrom buildbot.util import bytes2unicode\nfrom buildbot.util import unicode2bytes\nfrom buildbot.www.hooks.base import BaseHookHandler\n\n\nclass PollingHandler(BaseHookHandler):\n\n def getChanges(self, req):\n change_svc = req.site.master.change_svc\n poll_all = b\"poller\" not in req.args\n\n allow_all = True\n allowed = []\n if isinstance(self.options, dict) and b\"allowed\" in self.options:\n allow_all = False\n allowed = self.options[b\"allowed\"]\n\n pollers = []\n\n for source in change_svc:\n if not isinstance(source, ReconfigurablePollingChangeSource):\n continue\n if not hasattr(source, \"name\"):\n continue\n if (not poll_all and\n unicode2bytes(source.name) not in req.args[b'poller']):\n continue\n if not allow_all and unicode2bytes(source.name) not in allowed:\n continue\n pollers.append(source)\n\n if not poll_all:\n missing = (set(req.args[b'poller']) -\n set(unicode2bytes(s.name) for s in pollers))\n if missing:\n raise ValueError(\"Could not find pollers: {}\".format(\n bytes2unicode(b\",\".join(missing))))\n\n for p in pollers:\n p.force()\n\n return [], None\n\n\npoller = PollingHandler\n", "path": "master/buildbot/www/hooks/poller.py"}]}
| 1,117 | 194 |
gh_patches_debug_12830
|
rasdani/github-patches
|
git_diff
|
mars-project__mars-82
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
By default use core number as n_parallel for threaded scheduling
Use the CPU core count as `n_parallel` for threaded scheduling; currently only 1 thread is used by default.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mars/session.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 # Copyright 1999-2018 Alibaba Group Holding Ltd.
4 #
5 # Licensed under the Apache License, Version 2.0 (the "License");
6 # you may not use this file except in compliance with the License.
7 # You may obtain a copy of the License at
8 #
9 # http://www.apache.org/licenses/LICENSE-2.0
10 #
11 # Unless required by applicable law or agreed to in writing, software
12 # distributed under the License is distributed on an "AS IS" BASIS,
13 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 # See the License for the specific language governing permissions and
15 # limitations under the License.
16
17 import numpy as np
18
19
20 class LocalSession(object):
21 def __init__(self):
22 from .tensor.execution.core import Executor
23
24 self._executor = Executor()
25 self._endpoint = None
26
27 @property
28 def endpoint(self):
29 return self._endpoint
30
31 @endpoint.setter
32 def endpoint(self, endpoint):
33 if endpoint is not None:
34 raise ValueError('Local session cannot set endpoint')
35 self._endpoint = endpoint
36
37 def run(self, *tensors, **kw):
38 if self._executor is None:
39 raise RuntimeError('Session has closed')
40 return self._executor.execute_tensors(tensors, **kw)
41
42 def decref(self, *keys):
43 self._executor.decref(*keys)
44
45 def __enter__(self):
46 return self
47
48 def __exit__(self, *_):
49 self._executor = None
50
51
52 class Session(object):
53 _default_session = None
54
55 def __init__(self, endpoint=None):
56 if endpoint is not None:
57 if 'http' in endpoint:
58 # connect to web
59 from .web.session import Session as WebSession
60
61 self._sess = WebSession(endpoint)
62 else:
63 # connect to local cluster
64 from .deploy.local.session import LocalClusterSession
65
66 self._sess = LocalClusterSession(endpoint)
67 else:
68 self._sess = LocalSession()
69
70 self._executed_keys = set()
71
72 def run(self, *tensors, **kw):
73 from . import tensor as mt
74
75 ret_list = False
76 if len(tensors) == 1 and isinstance(tensors[0], (tuple, list)):
77 ret_list = True
78 tensors = tensors[0]
79 elif len(tensors) > 1:
80 ret_list = True
81
82 tensors = tuple(mt.tensor(t) for t in tensors)
83 result = self._sess.run(*tensors, **kw)
84 self._executed_keys.update(t.key for t in tensors)
85 for t in tensors:
86 t._execute_session = self
87
88 ret = []
89 for r, t in zip(result, tensors):
90 if r is None:
91 ret.append(r)
92 continue
93 if t.isscalar() and hasattr(r, 'item'):
94 ret.append(np.asscalar(r))
95 else:
96 ret.append(r)
97 if ret_list:
98 return ret
99 return ret[0]
100
101 @property
102 def endpoint(self):
103 return self._sess.endpoint
104
105 @endpoint.setter
106 def endpoint(self, endpoint):
107 self._sess.endpoint = endpoint
108
109 def decref(self, *keys):
110 if hasattr(self._sess, 'decref'):
111 self._sess.decref(*keys)
112
113 def __getattr__(self, attr):
114 try:
115 obj = self._sess.__getattribute__(attr)
116 return obj
117 except AttributeError:
118 raise
119
120 def __enter__(self):
121 self._sess.__enter__()
122 return self
123
124 def __exit__(self, exc_type, exc_val, exc_tb):
125 self._sess.__exit__(exc_type, exc_val, exc_tb)
126
127 close = __exit__
128
129 def as_default(self):
130 Session._default_session = self
131 return self
132
133 @classmethod
134 def default_or_local(cls):
135 if cls._default_session is not None:
136 return cls._default_session
137
138 cls._default_session = Session()
139 return cls._default_session
140
141
142 def new_session(scheduler=None):
143 return Session(scheduler)
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mars/session.py b/mars/session.py
--- a/mars/session.py
+++ b/mars/session.py
@@ -16,6 +16,11 @@
import numpy as np
+try:
+ from .resource import cpu_count
+except ImportError:
+ from multiprocessing import cpu_count
+
class LocalSession(object):
def __init__(self):
@@ -37,6 +42,8 @@
def run(self, *tensors, **kw):
if self._executor is None:
raise RuntimeError('Session has closed')
+ if 'n_parallel' not in kw:
+ kw['n_parallel'] = cpu_count()
return self._executor.execute_tensors(tensors, **kw)
def decref(self, *keys):
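As a usage note, the defaulting pattern the patch introduces can be sketched in isolation like this (the executor argument is a hypothetical stand-in; only the `cpu_count` import fallback mirrors the actual change):
```python
try:
    from mars.resource import cpu_count        # Mars' own helper, when importable
except ImportError:
    from multiprocessing import cpu_count      # stdlib fallback


def run(executor, *tensors, **kw):
    if "n_parallel" not in kw:                 # caller did not pick a parallelism level
        kw["n_parallel"] = cpu_count()         # default to the number of CPU cores
    return executor.execute_tensors(tensors, **kw)
```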
|
{"golden_diff": "diff --git a/mars/session.py b/mars/session.py\n--- a/mars/session.py\n+++ b/mars/session.py\n@@ -16,6 +16,11 @@\n \n import numpy as np\n \n+try:\n+ from .resource import cpu_count\n+except ImportError:\n+ from multiprocessing import cpu_count\n+\n \n class LocalSession(object):\n def __init__(self):\n@@ -37,6 +42,8 @@\n def run(self, *tensors, **kw):\n if self._executor is None:\n raise RuntimeError('Session has closed')\n+ if 'n_parallel' not in kw:\n+ kw['n_parallel'] = cpu_count()\n return self._executor.execute_tensors(tensors, **kw)\n \n def decref(self, *keys):\n", "issue": "By default use core number as n_parallel for threaded scheduling\nUse core number as `n_parallel` for threaded scheduling, currently 1 thread by default.\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n# Copyright 1999-2018 Alibaba Group Holding Ltd.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\n\nclass LocalSession(object):\n def __init__(self):\n from .tensor.execution.core import Executor\n\n self._executor = Executor()\n self._endpoint = None\n\n @property\n def endpoint(self):\n return self._endpoint\n\n @endpoint.setter\n def endpoint(self, endpoint):\n if endpoint is not None:\n raise ValueError('Local session cannot set endpoint')\n self._endpoint = endpoint\n\n def run(self, *tensors, **kw):\n if self._executor is None:\n raise RuntimeError('Session has closed')\n return self._executor.execute_tensors(tensors, **kw)\n\n def decref(self, *keys):\n self._executor.decref(*keys)\n\n def __enter__(self):\n return self\n\n def __exit__(self, *_):\n self._executor = None\n\n\nclass Session(object):\n _default_session = None\n\n def __init__(self, endpoint=None):\n if endpoint is not None:\n if 'http' in endpoint:\n # connect to web\n from .web.session import Session as WebSession\n\n self._sess = WebSession(endpoint)\n else:\n # connect to local cluster\n from .deploy.local.session import LocalClusterSession\n\n self._sess = LocalClusterSession(endpoint)\n else:\n self._sess = LocalSession()\n\n self._executed_keys = set()\n\n def run(self, *tensors, **kw):\n from . 
import tensor as mt\n\n ret_list = False\n if len(tensors) == 1 and isinstance(tensors[0], (tuple, list)):\n ret_list = True\n tensors = tensors[0]\n elif len(tensors) > 1:\n ret_list = True\n\n tensors = tuple(mt.tensor(t) for t in tensors)\n result = self._sess.run(*tensors, **kw)\n self._executed_keys.update(t.key for t in tensors)\n for t in tensors:\n t._execute_session = self\n\n ret = []\n for r, t in zip(result, tensors):\n if r is None:\n ret.append(r)\n continue\n if t.isscalar() and hasattr(r, 'item'):\n ret.append(np.asscalar(r))\n else:\n ret.append(r)\n if ret_list:\n return ret\n return ret[0]\n\n @property\n def endpoint(self):\n return self._sess.endpoint\n\n @endpoint.setter\n def endpoint(self, endpoint):\n self._sess.endpoint = endpoint\n\n def decref(self, *keys):\n if hasattr(self._sess, 'decref'):\n self._sess.decref(*keys)\n\n def __getattr__(self, attr):\n try:\n obj = self._sess.__getattribute__(attr)\n return obj\n except AttributeError:\n raise\n\n def __enter__(self):\n self._sess.__enter__()\n return self\n\n def __exit__(self, exc_type, exc_val, exc_tb):\n self._sess.__exit__(exc_type, exc_val, exc_tb)\n\n close = __exit__\n\n def as_default(self):\n Session._default_session = self\n return self\n\n @classmethod\n def default_or_local(cls):\n if cls._default_session is not None:\n return cls._default_session\n\n cls._default_session = Session()\n return cls._default_session\n\n\ndef new_session(scheduler=None):\n return Session(scheduler)\n", "path": "mars/session.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n# Copyright 1999-2018 Alibaba Group Holding Ltd.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\ntry:\n from .resource import cpu_count\nexcept ImportError:\n from multiprocessing import cpu_count\n\n\nclass LocalSession(object):\n def __init__(self):\n from .tensor.execution.core import Executor\n\n self._executor = Executor()\n self._endpoint = None\n\n @property\n def endpoint(self):\n return self._endpoint\n\n @endpoint.setter\n def endpoint(self, endpoint):\n if endpoint is not None:\n raise ValueError('Local session cannot set endpoint')\n self._endpoint = endpoint\n\n def run(self, *tensors, **kw):\n if self._executor is None:\n raise RuntimeError('Session has closed')\n if 'n_parallel' not in kw:\n kw['n_parallel'] = cpu_count()\n return self._executor.execute_tensors(tensors, **kw)\n\n def decref(self, *keys):\n self._executor.decref(*keys)\n\n def __enter__(self):\n return self\n\n def __exit__(self, *_):\n self._executor = None\n\n\nclass Session(object):\n _default_session = None\n\n def __init__(self, endpoint=None):\n if endpoint is not None:\n if 'http' in endpoint:\n # connect to web\n from .web.session import Session as WebSession\n\n self._sess = WebSession(endpoint)\n else:\n # connect to local cluster\n from .deploy.local.session import LocalClusterSession\n\n self._sess = LocalClusterSession(endpoint)\n else:\n self._sess = LocalSession()\n\n self._executed_keys = 
set()\n\n def run(self, *tensors, **kw):\n from . import tensor as mt\n\n ret_list = False\n if len(tensors) == 1 and isinstance(tensors[0], (tuple, list)):\n ret_list = True\n tensors = tensors[0]\n elif len(tensors) > 1:\n ret_list = True\n\n tensors = tuple(mt.tensor(t) for t in tensors)\n result = self._sess.run(*tensors, **kw)\n self._executed_keys.update(t.key for t in tensors)\n for t in tensors:\n t._execute_session = self\n\n ret = []\n for r, t in zip(result, tensors):\n if r is None:\n ret.append(r)\n continue\n if t.isscalar() and hasattr(r, 'item'):\n ret.append(np.asscalar(r))\n else:\n ret.append(r)\n if ret_list:\n return ret\n return ret[0]\n\n @property\n def endpoint(self):\n return self._sess.endpoint\n\n @endpoint.setter\n def endpoint(self, endpoint):\n self._sess.endpoint = endpoint\n\n def decref(self, *keys):\n if hasattr(self._sess, 'decref'):\n self._sess.decref(*keys)\n\n def __getattr__(self, attr):\n try:\n obj = self._sess.__getattribute__(attr)\n return obj\n except AttributeError:\n raise\n\n def __enter__(self):\n self._sess.__enter__()\n return self\n\n def __exit__(self, exc_type, exc_val, exc_tb):\n self._sess.__exit__(exc_type, exc_val, exc_tb)\n\n close = __exit__\n\n def as_default(self):\n Session._default_session = self\n return self\n\n @classmethod\n def default_or_local(cls):\n if cls._default_session is not None:\n return cls._default_session\n\n cls._default_session = Session()\n return cls._default_session\n\n\ndef new_session(scheduler=None):\n return Session(scheduler)\n", "path": "mars/session.py"}]}
| 1,525 | 170 |
gh_patches_debug_15984
|
rasdani/github-patches
|
git_diff
|
OpenMined__PySyft-5397
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Adding loguru compatibility with pytest caplog
## Description
The `caplog` fixture in pytest captures logging output so that tests can check whether the appropriate warnings have been raised.
By default pytest hooks into the standard `logging` module, but since we are using `loguru`, the appropriate patching needs to be added.
## Additional Context
https://loguru.readthedocs.io/en/stable/resources/migration.html#making-things-work-with-pytest-and-caplog
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/syft/logger.py`
Content:
```
1 # stdlib
2 import os
3 from typing import Any
4 from typing import Callable
5 from typing import NoReturn
6 from typing import TextIO
7 from typing import Union
8
9 # third party
10 from loguru import logger
11
12 LOG_FORMAT = "[{time}][{level}][{module}]][{process.id}] {message}"
13
14 logger.remove()
15 DEFAULT_SINK = "syft_{time}.log"
16
17
18 def remove() -> None:
19 logger.remove()
20
21
22 def add(
23 sink: Union[None, str, os.PathLike, TextIO] = None,
24 level: str = "ERROR",
25 ) -> None:
26 sink = DEFAULT_SINK if sink is None else sink
27 try:
28 logger.add(
29 sink=sink,
30 format=LOG_FORMAT,
31 enqueue=True,
32 colorize=False,
33 diagnose=True,
34 backtrace=True,
35 rotation="10 MB",
36 retention="1 day",
37 level=level,
38 )
39 except BaseException:
40 logger.add(
41 sink=sink,
42 format=LOG_FORMAT,
43 enqueue=True,
44 colorize=False,
45 diagnose=True,
46 backtrace=True,
47 level=level,
48 )
49
50
51 def traceback_and_raise(e: Any, verbose: bool = False) -> NoReturn:
52 try:
53 if verbose:
54 logger.opt(lazy=True).exception(e)
55 else:
56 logger.opt(lazy=True).critical(e)
57 except BaseException as ex:
58 logger.debug("failed to print exception", ex)
59 if not issubclass(type(e), Exception):
60 e = Exception(e)
61 raise e
62
63
64 def create_log_and_print_function(level: str) -> Callable:
65 def log_and_print(*args: Any, **kwargs: Any) -> None:
66 try:
67 method = getattr(logger.opt(lazy=True), level, None)
68 if "print" in kwargs and kwargs["print"] is True:
69 del kwargs["print"]
70 print(*args, **kwargs)
71 if "end" in kwargs:
72 # clean up extra end for printing
73 del kwargs["end"]
74
75 if method is not None:
76 method(*args, **kwargs)
77 else:
78 raise Exception(f"no method {level} on logger")
79 except BaseException as e:
80 msg = f"failed to log exception. {e}"
81 try:
82 logger.debug(msg)
83 except Exception as e:
84 print(f"{msg}. {e}")
85
86 return log_and_print
87
88
89 def traceback(*args: Any, **kwargs: Any) -> None:
90 return create_log_and_print_function(level="exception")(*args, **kwargs)
91
92
93 def critical(*args: Any, **kwargs: Any) -> None:
94 return create_log_and_print_function(level="critical")(*args, **kwargs)
95
96
97 def error(*args: Any, **kwargs: Any) -> None:
98 return create_log_and_print_function(level="error")(*args, **kwargs)
99
100
101 def warning(*args: Any, **kwargs: Any) -> None:
102 return create_log_and_print_function(level="warning")(*args, **kwargs)
103
104
105 def info(*args: Any, **kwargs: Any) -> None:
106 return create_log_and_print_function(level="info")(*args, **kwargs)
107
108
109 def debug(*args: Any, **kwargs: Any) -> None:
110 return create_log_and_print_function(level="debug")(*args, **kwargs)
111
112
113 def trace(*args: Any, **kwargs: Any) -> None:
114 return create_log_and_print_function(level="trace")(*args, **kwargs)
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/syft/logger.py b/src/syft/logger.py
--- a/src/syft/logger.py
+++ b/src/syft/logger.py
@@ -1,4 +1,5 @@
# stdlib
+import logging
import os
from typing import Any
from typing import Callable
@@ -20,7 +21,7 @@
def add(
- sink: Union[None, str, os.PathLike, TextIO] = None,
+ sink: Union[None, str, os.PathLike, TextIO, logging.Handler] = None,
level: str = "ERROR",
) -> None:
sink = DEFAULT_SINK if sink is None else sink
@@ -40,7 +41,6 @@
logger.add(
sink=sink,
format=LOG_FORMAT,
- enqueue=True,
colorize=False,
diagnose=True,
backtrace=True,
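With a `logging.Handler` accepted as a sink, the pytest bridge described in the loguru migration guide (linked in the issue) becomes straightforward; a sketch of such a `conftest.py` fixture, assuming pytest and loguru are installed:
```python
# conftest.py -- forward loguru records into pytest's caplog handler for each test.
import pytest
from loguru import logger


@pytest.fixture
def caplog(caplog):
    handler_id = logger.add(caplog.handler, format="{message}")  # caplog.handler is a logging.Handler
    yield caplog
    logger.remove(handler_id)
```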
|
{"golden_diff": "diff --git a/src/syft/logger.py b/src/syft/logger.py\n--- a/src/syft/logger.py\n+++ b/src/syft/logger.py\n@@ -1,4 +1,5 @@\n # stdlib\n+import logging\n import os\n from typing import Any\n from typing import Callable\n@@ -20,7 +21,7 @@\n \n \n def add(\n- sink: Union[None, str, os.PathLike, TextIO] = None,\n+ sink: Union[None, str, os.PathLike, TextIO, logging.Handler] = None,\n level: str = \"ERROR\",\n ) -> None:\n sink = DEFAULT_SINK if sink is None else sink\n@@ -40,7 +41,6 @@\n logger.add(\n sink=sink,\n format=LOG_FORMAT,\n- enqueue=True,\n colorize=False,\n diagnose=True,\n backtrace=True,\n", "issue": "Adding loguru compatiblity with pytest caplog\n## Description\r\n`caplog` fixture in pytest captures the logging output for testing if appropriate warnings have been raised.\r\n\r\nBy default pytest uses the standard `logging` module, but since we are using `loguru` appropriate patching needs to be added.\r\n\r\n## Additional Context\r\nhttps://loguru.readthedocs.io/en/stable/resources/migration.html#making-things-work-with-pytest-and-caplog\r\n\n", "before_files": [{"content": "# stdlib\nimport os\nfrom typing import Any\nfrom typing import Callable\nfrom typing import NoReturn\nfrom typing import TextIO\nfrom typing import Union\n\n# third party\nfrom loguru import logger\n\nLOG_FORMAT = \"[{time}][{level}][{module}]][{process.id}] {message}\"\n\nlogger.remove()\nDEFAULT_SINK = \"syft_{time}.log\"\n\n\ndef remove() -> None:\n logger.remove()\n\n\ndef add(\n sink: Union[None, str, os.PathLike, TextIO] = None,\n level: str = \"ERROR\",\n) -> None:\n sink = DEFAULT_SINK if sink is None else sink\n try:\n logger.add(\n sink=sink,\n format=LOG_FORMAT,\n enqueue=True,\n colorize=False,\n diagnose=True,\n backtrace=True,\n rotation=\"10 MB\",\n retention=\"1 day\",\n level=level,\n )\n except BaseException:\n logger.add(\n sink=sink,\n format=LOG_FORMAT,\n enqueue=True,\n colorize=False,\n diagnose=True,\n backtrace=True,\n level=level,\n )\n\n\ndef traceback_and_raise(e: Any, verbose: bool = False) -> NoReturn:\n try:\n if verbose:\n logger.opt(lazy=True).exception(e)\n else:\n logger.opt(lazy=True).critical(e)\n except BaseException as ex:\n logger.debug(\"failed to print exception\", ex)\n if not issubclass(type(e), Exception):\n e = Exception(e)\n raise e\n\n\ndef create_log_and_print_function(level: str) -> Callable:\n def log_and_print(*args: Any, **kwargs: Any) -> None:\n try:\n method = getattr(logger.opt(lazy=True), level, None)\n if \"print\" in kwargs and kwargs[\"print\"] is True:\n del kwargs[\"print\"]\n print(*args, **kwargs)\n if \"end\" in kwargs:\n # clean up extra end for printing\n del kwargs[\"end\"]\n\n if method is not None:\n method(*args, **kwargs)\n else:\n raise Exception(f\"no method {level} on logger\")\n except BaseException as e:\n msg = f\"failed to log exception. {e}\"\n try:\n logger.debug(msg)\n except Exception as e:\n print(f\"{msg}. 
{e}\")\n\n return log_and_print\n\n\ndef traceback(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"exception\")(*args, **kwargs)\n\n\ndef critical(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"critical\")(*args, **kwargs)\n\n\ndef error(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"error\")(*args, **kwargs)\n\n\ndef warning(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"warning\")(*args, **kwargs)\n\n\ndef info(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"info\")(*args, **kwargs)\n\n\ndef debug(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"debug\")(*args, **kwargs)\n\n\ndef trace(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"trace\")(*args, **kwargs)\n", "path": "src/syft/logger.py"}], "after_files": [{"content": "# stdlib\nimport logging\nimport os\nfrom typing import Any\nfrom typing import Callable\nfrom typing import NoReturn\nfrom typing import TextIO\nfrom typing import Union\n\n# third party\nfrom loguru import logger\n\nLOG_FORMAT = \"[{time}][{level}][{module}]][{process.id}] {message}\"\n\nlogger.remove()\nDEFAULT_SINK = \"syft_{time}.log\"\n\n\ndef remove() -> None:\n logger.remove()\n\n\ndef add(\n sink: Union[None, str, os.PathLike, TextIO, logging.Handler] = None,\n level: str = \"ERROR\",\n) -> None:\n sink = DEFAULT_SINK if sink is None else sink\n try:\n logger.add(\n sink=sink,\n format=LOG_FORMAT,\n enqueue=True,\n colorize=False,\n diagnose=True,\n backtrace=True,\n rotation=\"10 MB\",\n retention=\"1 day\",\n level=level,\n )\n except BaseException:\n logger.add(\n sink=sink,\n format=LOG_FORMAT,\n colorize=False,\n diagnose=True,\n backtrace=True,\n level=level,\n )\n\n\ndef traceback_and_raise(e: Any, verbose: bool = False) -> NoReturn:\n try:\n if verbose:\n logger.opt(lazy=True).exception(e)\n else:\n logger.opt(lazy=True).critical(e)\n except BaseException as ex:\n logger.debug(\"failed to print exception\", ex)\n if not issubclass(type(e), Exception):\n e = Exception(e)\n raise e\n\n\ndef create_log_and_print_function(level: str) -> Callable:\n def log_and_print(*args: Any, **kwargs: Any) -> None:\n try:\n method = getattr(logger.opt(lazy=True), level, None)\n if \"print\" in kwargs and kwargs[\"print\"] is True:\n del kwargs[\"print\"]\n print(*args, **kwargs)\n if \"end\" in kwargs:\n # clean up extra end for printing\n del kwargs[\"end\"]\n\n if method is not None:\n method(*args, **kwargs)\n else:\n raise Exception(f\"no method {level} on logger\")\n except BaseException as e:\n msg = f\"failed to log exception. {e}\"\n try:\n logger.debug(msg)\n except Exception as e:\n print(f\"{msg}. 
{e}\")\n\n return log_and_print\n\n\ndef traceback(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"exception\")(*args, **kwargs)\n\n\ndef critical(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"critical\")(*args, **kwargs)\n\n\ndef error(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"error\")(*args, **kwargs)\n\n\ndef warning(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"warning\")(*args, **kwargs)\n\n\ndef info(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"info\")(*args, **kwargs)\n\n\ndef debug(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"debug\")(*args, **kwargs)\n\n\ndef trace(*args: Any, **kwargs: Any) -> None:\n return create_log_and_print_function(level=\"trace\")(*args, **kwargs)\n", "path": "src/syft/logger.py"}]}
| 1,350 | 197 |
gh_patches_debug_36091
|
rasdani/github-patches
|
git_diff
|
freedomofpress__securedrop-5894
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
v2 removal on restore does not cover HTTPS services
The logic added in https://github.com/freedomofpress/securedrop/pull/5677 to disable v2 onion services when a backup is restored to a Focal server does not remove config lines for HTTPS services (port 443), potentially resulting in a broken configuration.
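One way to sidestep matching every individual `HiddenServicePort` pattern (and so cover the missed 443 lines as well) is to drop whole v2 service blocks instead of single lines; a standalone, illustrative sketch rather than SecureDrop's actual code:
```python
def filter_v2_blocks(lines):
    """Keep global torrc lines and v3 blocks; drop any HiddenServiceDir block declaring version 2."""
    result, block = [], []
    for line in lines + ["HiddenServiceDir sentinel\n"]:      # sentinel flushes the final block
        if line.startswith("HiddenServiceDir"):
            if not any(entry.strip() == "HiddenServiceVersion 2" for entry in block):
                result.extend(block)                          # previous block was not v2, keep it
            block = [line]
        else:
            block.append(line)
    return result
```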
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `install_files/ansible-base/roles/restore/files/disable_v2.py`
Content:
```
1 #!/usr/bin/env python3
2 # To execute on prod:
3 # python3 disable_v2.py /etc/tor/torrc /etc/tor/torrc
4 # To execute for testing locally:
5 # python3 disable_v2.py /etc/tor/torrc /tmp/dumytorrc
6 import sys
7
8
9 def filter_v2(filename):
10 # Read the file
11 with open(filename) as f:
12 data = f.readlines()
13 # We will store the filtered lines to result
14 result = []
15
16 i = 0
17 while i < len(data):
18 line = data[i]
19 if line == "HiddenServiceDir /var/lib/tor/services/source\n":
20 i += 1
21 while data[i].strip() == "":
22 i += 1
23 line = data[i]
24 if line == "HiddenServiceVersion 2\n":
25 i += 1
26 line = data[i]
27 while data[i].strip() == "":
28 i += 1
29 line = data[i]
30 if line == "HiddenServicePort 80 127.0.0.1:80\n":
31 i += 1
32 continue
33 # Now check for journalist
34 if line == "HiddenServiceDir /var/lib/tor/services/journalist\n":
35 i += 1
36 while data[i].strip() == "":
37 i += 1
38 line = data[i]
39 if line == "HiddenServiceVersion 2\n":
40 i += 1
41 line = data[i]
42 while data[i].strip() == "":
43 i += 1
44 line = data[i]
45 if line == "HiddenServicePort 80 127.0.0.1:8080\n":
46 i += 1
47 line = data[i]
48 while data[i].strip() == "":
49 i += 1
50 line = data[i]
51 if line == "HiddenServiceAuthorizeClient stealth journalist\n":
52 i += 1
53 continue
54 # Now the v2 ssh access
55 if line == "HiddenServiceDir /var/lib/tor/services/ssh\n":
56 i += 1
57 while data[i].strip() == "":
58 i += 1
59 line = data[i]
60 if line == "HiddenServiceVersion 2\n":
61 i += 1
62 line = data[i]
63 while data[i].strip() == "":
64 i += 1
65 line = data[i]
66 if line == "HiddenServicePort 22 127.0.0.1:22\n":
67 i += 1
68 line = data[i]
69 while data[i].strip() == "":
70 i += 1
71 line = data[i]
72 if line == "HiddenServiceAuthorizeClient stealth admin\n":
73 i += 1
74 continue
75
76 result.append(line)
77 i += 1
78
79 # Now return the result
80 return result
81
82
83 if __name__ == "__main__":
84 filename = sys.argv[1]
85 outputfilename = sys.argv[2]
86 result = filter_v2(filename)
87 with open(outputfilename, "w") as fobj:
88 for line in result:
89 fobj.write(line)
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/install_files/ansible-base/roles/restore/files/disable_v2.py b/install_files/ansible-base/roles/restore/files/disable_v2.py
deleted file mode 100644
--- a/install_files/ansible-base/roles/restore/files/disable_v2.py
+++ /dev/null
@@ -1,89 +0,0 @@
-#!/usr/bin/env python3
-# To execute on prod:
-# python3 disable_v2.py /etc/tor/torrc /etc/tor/torrc
-# To execute for testing locally:
-# python3 disable_v2.py /etc/tor/torrc /tmp/dumytorrc
-import sys
-
-
-def filter_v2(filename):
- # Read the file
- with open(filename) as f:
- data = f.readlines()
- # We will store the filtered lines to result
- result = []
-
- i = 0
- while i < len(data):
- line = data[i]
- if line == "HiddenServiceDir /var/lib/tor/services/source\n":
- i += 1
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServiceVersion 2\n":
- i += 1
- line = data[i]
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServicePort 80 127.0.0.1:80\n":
- i += 1
- continue
- # Now check for journalist
- if line == "HiddenServiceDir /var/lib/tor/services/journalist\n":
- i += 1
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServiceVersion 2\n":
- i += 1
- line = data[i]
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServicePort 80 127.0.0.1:8080\n":
- i += 1
- line = data[i]
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServiceAuthorizeClient stealth journalist\n":
- i += 1
- continue
- # Now the v2 ssh access
- if line == "HiddenServiceDir /var/lib/tor/services/ssh\n":
- i += 1
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServiceVersion 2\n":
- i += 1
- line = data[i]
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServicePort 22 127.0.0.1:22\n":
- i += 1
- line = data[i]
- while data[i].strip() == "":
- i += 1
- line = data[i]
- if line == "HiddenServiceAuthorizeClient stealth admin\n":
- i += 1
- continue
-
- result.append(line)
- i += 1
-
- # Now return the result
- return result
-
-
-if __name__ == "__main__":
- filename = sys.argv[1]
- outputfilename = sys.argv[2]
- result = filter_v2(filename)
- with open(outputfilename, "w") as fobj:
- for line in result:
- fobj.write(line)
|
{"golden_diff": "diff --git a/install_files/ansible-base/roles/restore/files/disable_v2.py b/install_files/ansible-base/roles/restore/files/disable_v2.py\ndeleted file mode 100644\n--- a/install_files/ansible-base/roles/restore/files/disable_v2.py\n+++ /dev/null\n@@ -1,89 +0,0 @@\n-#!/usr/bin/env python3\n-# To execute on prod:\n-# python3 disable_v2.py /etc/tor/torrc /etc/tor/torrc\n-# To execute for testing locally:\n-# python3 disable_v2.py /etc/tor/torrc /tmp/dumytorrc\n-import sys\n-\n-\n-def filter_v2(filename):\n- # Read the file\n- with open(filename) as f:\n- data = f.readlines()\n- # We will store the filtered lines to result\n- result = []\n-\n- i = 0\n- while i < len(data):\n- line = data[i]\n- if line == \"HiddenServiceDir /var/lib/tor/services/source\\n\":\n- i += 1\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServiceVersion 2\\n\":\n- i += 1\n- line = data[i]\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServicePort 80 127.0.0.1:80\\n\":\n- i += 1\n- continue\n- # Now check for journalist\n- if line == \"HiddenServiceDir /var/lib/tor/services/journalist\\n\":\n- i += 1\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServiceVersion 2\\n\":\n- i += 1\n- line = data[i]\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServicePort 80 127.0.0.1:8080\\n\":\n- i += 1\n- line = data[i]\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServiceAuthorizeClient stealth journalist\\n\":\n- i += 1\n- continue\n- # Now the v2 ssh access\n- if line == \"HiddenServiceDir /var/lib/tor/services/ssh\\n\":\n- i += 1\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServiceVersion 2\\n\":\n- i += 1\n- line = data[i]\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServicePort 22 127.0.0.1:22\\n\":\n- i += 1\n- line = data[i]\n- while data[i].strip() == \"\":\n- i += 1\n- line = data[i]\n- if line == \"HiddenServiceAuthorizeClient stealth admin\\n\":\n- i += 1\n- continue\n-\n- result.append(line)\n- i += 1\n-\n- # Now return the result\n- return result\n-\n-\n-if __name__ == \"__main__\":\n- filename = sys.argv[1]\n- outputfilename = sys.argv[2]\n- result = filter_v2(filename)\n- with open(outputfilename, \"w\") as fobj:\n- for line in result:\n- fobj.write(line)\n", "issue": "v2 removal on restore does not cover HTTPS services\nThe logic added in https://github.com/freedomofpress/securedrop/pull/5677 to disable v2 onion services when a backup is restored to a Focal server does not remove config lines for HTTPS services (port 443), potentially resulting in a broken configuration.\n", "before_files": [{"content": "#!/usr/bin/env python3\n# To execute on prod:\n# python3 disable_v2.py /etc/tor/torrc /etc/tor/torrc\n# To execute for testing locally:\n# python3 disable_v2.py /etc/tor/torrc /tmp/dumytorrc\nimport sys\n\n\ndef filter_v2(filename):\n # Read the file\n with open(filename) as f:\n data = f.readlines()\n # We will store the filtered lines to result\n result = []\n\n i = 0\n while i < len(data):\n line = data[i]\n if line == \"HiddenServiceDir /var/lib/tor/services/source\\n\":\n i += 1\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServiceVersion 2\\n\":\n i += 1\n line = data[i]\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServicePort 80 127.0.0.1:80\\n\":\n i += 1\n 
continue\n # Now check for journalist\n if line == \"HiddenServiceDir /var/lib/tor/services/journalist\\n\":\n i += 1\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServiceVersion 2\\n\":\n i += 1\n line = data[i]\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServicePort 80 127.0.0.1:8080\\n\":\n i += 1\n line = data[i]\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServiceAuthorizeClient stealth journalist\\n\":\n i += 1\n continue\n # Now the v2 ssh access\n if line == \"HiddenServiceDir /var/lib/tor/services/ssh\\n\":\n i += 1\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServiceVersion 2\\n\":\n i += 1\n line = data[i]\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServicePort 22 127.0.0.1:22\\n\":\n i += 1\n line = data[i]\n while data[i].strip() == \"\":\n i += 1\n line = data[i]\n if line == \"HiddenServiceAuthorizeClient stealth admin\\n\":\n i += 1\n continue\n\n result.append(line)\n i += 1\n\n # Now return the result\n return result\n\n\nif __name__ == \"__main__\":\n filename = sys.argv[1]\n outputfilename = sys.argv[2]\n result = filter_v2(filename)\n with open(outputfilename, \"w\") as fobj:\n for line in result:\n fobj.write(line)\n", "path": "install_files/ansible-base/roles/restore/files/disable_v2.py"}], "after_files": [{"content": null, "path": "install_files/ansible-base/roles/restore/files/disable_v2.py"}]}
| 1,211 | 856 |
gh_patches_debug_10651
|
rasdani/github-patches
|
git_diff
|
hpcaitech__ColossalAI-5440
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `colossalai/legacy/tensor/tensor_spec.py`
Content:
```
1 from dataclasses import dataclass
2 from typing import Optional
3
4 from colossalai.legacy.tensor.distspec import DistPlacementPattern, _DistSpec
5 from colossalai.legacy.tensor.process_group import ProcessGroup
6
7 from .compute_spec import ComputeSpec
8
9
10 @dataclass
11 class ColoTensorSpec:
12 """ColoTensorSpec
13
14 A data class for specifications of the `ColoTensor`.
15 It contains attributes of `ProcessGroup`, `_DistSpec`, `ComputeSpec`.
16 The latter two attributes are optional. If not set, they are default value is `Replicate()` and `None`.
17 """
18
19 pg: ProcessGroup
20 dist_attr: Optional[_DistSpec] = _DistSpec(DistPlacementPattern.REPLICATE)
21 compute_attr: Optional[ComputeSpec] = None
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/colossalai/legacy/tensor/tensor_spec.py b/colossalai/legacy/tensor/tensor_spec.py
--- a/colossalai/legacy/tensor/tensor_spec.py
+++ b/colossalai/legacy/tensor/tensor_spec.py
@@ -1,4 +1,4 @@
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from typing import Optional
from colossalai.legacy.tensor.distspec import DistPlacementPattern, _DistSpec
@@ -17,5 +17,5 @@
"""
pg: ProcessGroup
- dist_attr: Optional[_DistSpec] = _DistSpec(DistPlacementPattern.REPLICATE)
+ dist_attr: Optional[_DistSpec] = field(default_factory=lambda: _DistSpec(DistPlacementPattern.REPLICATE))
compute_attr: Optional[ComputeSpec] = None
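For context, the `field(default_factory=...)` wrapper matters because a class-level instance default is created once and then shared by every dataclass instance; a generic sketch with stand-in classes (not ColossalAI's real ones):
```python
from dataclasses import dataclass, field


class DistSpec:                                   # stand-in for a mutable spec object
    def __init__(self):
        self.placements = []


@dataclass
class Spec:
    dist_attr: DistSpec = field(default_factory=DistSpec)   # a fresh DistSpec per Spec instance


a, b = Spec(), Spec()
a.dist_attr.placements.append("replicate")
print(b.dist_attr.placements)                     # [] -- no state leaks between instances
```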
|
{"golden_diff": "diff --git a/colossalai/legacy/tensor/tensor_spec.py b/colossalai/legacy/tensor/tensor_spec.py\n--- a/colossalai/legacy/tensor/tensor_spec.py\n+++ b/colossalai/legacy/tensor/tensor_spec.py\n@@ -1,4 +1,4 @@\n-from dataclasses import dataclass\n+from dataclasses import dataclass, field\n from typing import Optional\n \n from colossalai.legacy.tensor.distspec import DistPlacementPattern, _DistSpec\n@@ -17,5 +17,5 @@\n \"\"\"\n \n pg: ProcessGroup\n- dist_attr: Optional[_DistSpec] = _DistSpec(DistPlacementPattern.REPLICATE)\n+ dist_attr: Optional[_DistSpec] = field(default_factory=lambda: _DistSpec(DistPlacementPattern.REPLICATE))\n compute_attr: Optional[ComputeSpec] = None\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "from dataclasses import dataclass\nfrom typing import Optional\n\nfrom colossalai.legacy.tensor.distspec import DistPlacementPattern, _DistSpec\nfrom colossalai.legacy.tensor.process_group import ProcessGroup\n\nfrom .compute_spec import ComputeSpec\n\n\n@dataclass\nclass ColoTensorSpec:\n \"\"\"ColoTensorSpec\n\n A data class for specifications of the `ColoTensor`.\n It contains attributes of `ProcessGroup`, `_DistSpec`, `ComputeSpec`.\n The latter two attributes are optional. If not set, they are default value is `Replicate()` and `None`.\n \"\"\"\n\n pg: ProcessGroup\n dist_attr: Optional[_DistSpec] = _DistSpec(DistPlacementPattern.REPLICATE)\n compute_attr: Optional[ComputeSpec] = None\n", "path": "colossalai/legacy/tensor/tensor_spec.py"}], "after_files": [{"content": "from dataclasses import dataclass, field\nfrom typing import Optional\n\nfrom colossalai.legacy.tensor.distspec import DistPlacementPattern, _DistSpec\nfrom colossalai.legacy.tensor.process_group import ProcessGroup\n\nfrom .compute_spec import ComputeSpec\n\n\n@dataclass\nclass ColoTensorSpec:\n \"\"\"ColoTensorSpec\n\n A data class for specifications of the `ColoTensor`.\n It contains attributes of `ProcessGroup`, `_DistSpec`, `ComputeSpec`.\n The latter two attributes are optional. If not set, they are default value is `Replicate()` and `None`.\n \"\"\"\n\n pg: ProcessGroup\n dist_attr: Optional[_DistSpec] = field(default_factory=lambda: _DistSpec(DistPlacementPattern.REPLICATE))\n compute_attr: Optional[ComputeSpec] = None\n", "path": "colossalai/legacy/tensor/tensor_spec.py"}]}
| 491 | 190 |
gh_patches_debug_12715
|
rasdani/github-patches
|
git_diff
|
open-telemetry__opentelemetry-python-3240
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Export ExponentialBucketHistogramAggregation in opentelemetry.sdk.metrics.view
**Is your feature request related to a problem?**
We want to use the Exponential Histograms features publicly released in version [1.17.0](https://github.com/open-telemetry/opentelemetry-python/blob/main/CHANGELOG.md#version-1170038b0-2023-03-22).
**Describe the solution you'd like**
I'd like to use the public API.
**Describe alternatives you've considered**
One can import it from `opentelemetry.sdk.metrics._internal.aggregation`
**Additional context**
Currently the code in https://github.com/open-telemetry/opentelemetry-python/blob/b6a1b22fa65f41bdefb01d64b76e5e793d039f6d/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py#L25-L33 is not exporting the newly added `ExponentialBucketHistogramAggregation`
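For concreteness, a sketch of the intended public usage once the symbol is re-exported (the instrument name is just an example, and provider/reader setup is elided):
```python
from opentelemetry.sdk.metrics.view import (
    ExponentialBucketHistogramAggregation,
    View,
)

# Route a histogram instrument onto the exponential-bucket aggregation.
view = View(
    instrument_name="http.server.duration",
    aggregation=ExponentialBucketHistogramAggregation(),
)
# The view would then be passed to the SDK's MeterProvider via its `views` argument.
```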
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py`
Content:
```
1 # Copyright The OpenTelemetry Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from opentelemetry.sdk.metrics._internal.aggregation import (
16 Aggregation,
17 DefaultAggregation,
18 DropAggregation,
19 ExplicitBucketHistogramAggregation,
20 LastValueAggregation,
21 SumAggregation,
22 )
23 from opentelemetry.sdk.metrics._internal.view import View
24
25 __all__ = [
26 "Aggregation",
27 "DefaultAggregation",
28 "DropAggregation",
29 "ExplicitBucketHistogramAggregation",
30 "LastValueAggregation",
31 "SumAggregation",
32 "View",
33 ]
34
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py b/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py
--- a/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py
+++ b/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py
@@ -17,6 +17,7 @@
DefaultAggregation,
DropAggregation,
ExplicitBucketHistogramAggregation,
+ ExponentialBucketHistogramAggregation,
LastValueAggregation,
SumAggregation,
)
@@ -27,6 +28,7 @@
"DefaultAggregation",
"DropAggregation",
"ExplicitBucketHistogramAggregation",
+ "ExponentialBucketHistogramAggregation",
"LastValueAggregation",
"SumAggregation",
"View",
|
{"golden_diff": "diff --git a/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py b/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py\n--- a/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py\n+++ b/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py\n@@ -17,6 +17,7 @@\n DefaultAggregation,\n DropAggregation,\n ExplicitBucketHistogramAggregation,\n+ ExponentialBucketHistogramAggregation,\n LastValueAggregation,\n SumAggregation,\n )\n@@ -27,6 +28,7 @@\n \"DefaultAggregation\",\n \"DropAggregation\",\n \"ExplicitBucketHistogramAggregation\",\n+ \"ExponentialBucketHistogramAggregation\",\n \"LastValueAggregation\",\n \"SumAggregation\",\n \"View\",\n", "issue": "Export ExponentialBucketHistogramAggregation in opentelemetry.sdk.metrics.view\n**Is your feature request related to a problem?**\r\nWe want to use the Exponential Histograms features publicly released in version [1.17.0](https://github.com/open-telemetry/opentelemetry-python/blob/main/CHANGELOG.md#version-1170038b0-2023-03-22).\r\n\r\n**Describe the solution you'd like**\r\nI'd like to use the public API.\r\n\r\n**Describe alternatives you've considered**\r\nOne can import it from `opentelemetry.sdk.metrics._internal.aggregation`\r\n\r\n**Additional context**\r\nCurrently the code in https://github.com/open-telemetry/opentelemetry-python/blob/b6a1b22fa65f41bdefb01d64b76e5e793d039f6d/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py#L25-L33 is not exporting the newly added `ExponentialBucketHistogramAggregation`\r\n\nExport ExponentialBucketHistogramAggregation in opentelemetry.sdk.metrics.view\n**Is your feature request related to a problem?**\r\nWe want to use the Exponential Histograms features publicly released in version [1.17.0](https://github.com/open-telemetry/opentelemetry-python/blob/main/CHANGELOG.md#version-1170038b0-2023-03-22).\r\n\r\n**Describe the solution you'd like**\r\nI'd like to use the public API.\r\n\r\n**Describe alternatives you've considered**\r\nOne can import it from `opentelemetry.sdk.metrics._internal.aggregation`\r\n\r\n**Additional context**\r\nCurrently the code in https://github.com/open-telemetry/opentelemetry-python/blob/b6a1b22fa65f41bdefb01d64b76e5e793d039f6d/opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py#L25-L33 is not exporting the newly added `ExponentialBucketHistogramAggregation`\r\n\n", "before_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom opentelemetry.sdk.metrics._internal.aggregation import (\n Aggregation,\n DefaultAggregation,\n DropAggregation,\n ExplicitBucketHistogramAggregation,\n LastValueAggregation,\n SumAggregation,\n)\nfrom opentelemetry.sdk.metrics._internal.view import View\n\n__all__ = [\n \"Aggregation\",\n \"DefaultAggregation\",\n \"DropAggregation\",\n \"ExplicitBucketHistogramAggregation\",\n \"LastValueAggregation\",\n \"SumAggregation\",\n \"View\",\n]\n", "path": 
"opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py"}], "after_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom opentelemetry.sdk.metrics._internal.aggregation import (\n Aggregation,\n DefaultAggregation,\n DropAggregation,\n ExplicitBucketHistogramAggregation,\n ExponentialBucketHistogramAggregation,\n LastValueAggregation,\n SumAggregation,\n)\nfrom opentelemetry.sdk.metrics._internal.view import View\n\n__all__ = [\n \"Aggregation\",\n \"DefaultAggregation\",\n \"DropAggregation\",\n \"ExplicitBucketHistogramAggregation\",\n \"ExponentialBucketHistogramAggregation\",\n \"LastValueAggregation\",\n \"SumAggregation\",\n \"View\",\n]\n", "path": "opentelemetry-sdk/src/opentelemetry/sdk/metrics/view/__init__.py"}]}
| 1,017 | 185 |
gh_patches_debug_9367
|
rasdani/github-patches
|
git_diff
|
ytdl-org__youtube-dl-14997
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[https://www.franceinter.fr] WARNING: unable to extract upload date
---
### Make sure you are using the *latest* version: run `youtube-dl --version` and ensure your version is *2017.12.14*. If it's not, read [this FAQ entry](https://github.com/rg3/youtube-dl/blob/master/README.md#how-do-i-update-youtube-dl) and update. Issues with outdated version will be rejected.
- [x] I've **verified** and **I assure** that I'm running youtube-dl **2017.12.14**
### Before submitting an *issue* make sure you have:
- [x] At least skimmed through the [README](https://github.com/rg3/youtube-dl/blob/master/README.md), **most notably** the [FAQ](https://github.com/rg3/youtube-dl#faq) and [BUGS](https://github.com/rg3/youtube-dl#bugs) sections
- [x] [Searched](https://github.com/rg3/youtube-dl/search?type=Issues) the bugtracker for similar issues including closed ones
### What is the purpose of your *issue*?
- [ ] Bug report (encountered problems with youtube-dl)
- [x] Site support request (request for adding support for a new site)
- [ ] Feature request (request for a new functionality)
- [ ] Question
- [ ] Other
---
```
youtube-dl-mp3 "https://www.franceinter.fr/emissions/les-concerts-d-inter/les-concerts-d-inter-14-decembre-2017"
[FranceInter] les-concerts-d-inter/les-concerts-d-inter-14-decembre-2017: Downloading webpage
WARNING: unable to extract upload date; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; see https://yt-dl.org/update on how to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
```
```
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: [u'-v']
[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.12.14
[debug] Python version 2.7.12 - Linux-4.4.0-103-generic-x86_64-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 2.8.11-0ubuntu0.16.04.1, avprobe 2.8.11-0ubuntu0.16.04.1, ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1
[debug] Proxy map: {}
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `youtube_dl/extractor/franceinter.py`
Content:
```
1 # coding: utf-8
2 from __future__ import unicode_literals
3
4 from .common import InfoExtractor
5 from ..utils import month_by_name
6
7
8 class FranceInterIE(InfoExtractor):
9 _VALID_URL = r'https?://(?:www\.)?franceinter\.fr/emissions/(?P<id>[^?#]+)'
10
11 _TEST = {
12 'url': 'https://www.franceinter.fr/emissions/affaires-sensibles/affaires-sensibles-07-septembre-2016',
13 'md5': '9e54d7bdb6fdc02a841007f8a975c094',
14 'info_dict': {
15 'id': 'affaires-sensibles/affaires-sensibles-07-septembre-2016',
16 'ext': 'mp3',
17 'title': 'Affaire Cahuzac : le contentieux du compte en Suisse',
18 'description': 'md5:401969c5d318c061f86bda1fa359292b',
19 'upload_date': '20160907',
20 },
21 }
22
23 def _real_extract(self, url):
24 video_id = self._match_id(url)
25
26 webpage = self._download_webpage(url, video_id)
27
28 video_url = self._search_regex(
29 r'(?s)<div[^>]+class=["\']page-diffusion["\'][^>]*>.*?<button[^>]+data-url=(["\'])(?P<url>(?:(?!\1).)+)\1',
30 webpage, 'video url', group='url')
31
32 title = self._og_search_title(webpage)
33 description = self._og_search_description(webpage)
34
35 upload_date_str = self._search_regex(
36 r'class=["\']cover-emission-period["\'][^>]*>[^<]+\s+(\d{1,2}\s+[^\s]+\s+\d{4})<',
37 webpage, 'upload date', fatal=False)
38 if upload_date_str:
39 upload_date_list = upload_date_str.split()
40 upload_date_list.reverse()
41 upload_date_list[1] = '%02d' % (month_by_name(upload_date_list[1], lang='fr') or 0)
42 upload_date_list[2] = '%02d' % int(upload_date_list[2])
43 upload_date = ''.join(upload_date_list)
44 else:
45 upload_date = None
46
47 return {
48 'id': video_id,
49 'title': title,
50 'description': description,
51 'upload_date': upload_date,
52 'formats': [{
53 'url': video_url,
54 'vcodec': 'none',
55 }],
56 }
57
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/youtube_dl/extractor/franceinter.py b/youtube_dl/extractor/franceinter.py
--- a/youtube_dl/extractor/franceinter.py
+++ b/youtube_dl/extractor/franceinter.py
@@ -33,7 +33,7 @@
description = self._og_search_description(webpage)
upload_date_str = self._search_regex(
- r'class=["\']cover-emission-period["\'][^>]*>[^<]+\s+(\d{1,2}\s+[^\s]+\s+\d{4})<',
+ r'class=["\']\s*cover-emission-period\s*["\'][^>]*>[^<]+\s+(\d{1,2}\s+[^\s]+\s+\d{4})<',
webpage, 'upload date', fatal=False)
if upload_date_str:
upload_date_list = upload_date_str.split()
|
{"golden_diff": "diff --git a/youtube_dl/extractor/franceinter.py b/youtube_dl/extractor/franceinter.py\n--- a/youtube_dl/extractor/franceinter.py\n+++ b/youtube_dl/extractor/franceinter.py\n@@ -33,7 +33,7 @@\n description = self._og_search_description(webpage)\n \n upload_date_str = self._search_regex(\n- r'class=[\"\\']cover-emission-period[\"\\'][^>]*>[^<]+\\s+(\\d{1,2}\\s+[^\\s]+\\s+\\d{4})<',\n+ r'class=[\"\\']\\s*cover-emission-period\\s*[\"\\'][^>]*>[^<]+\\s+(\\d{1,2}\\s+[^\\s]+\\s+\\d{4})<',\n webpage, 'upload date', fatal=False)\n if upload_date_str:\n upload_date_list = upload_date_str.split()\n", "issue": "[https://www.franceinter.fr] WARNING: unable to extract upload date\n---\r\n\r\n### Make sure you are using the *latest* version: run `youtube-dl --version` and ensure your version is *2017.12.14*. If it's not, read [this FAQ entry](https://github.com/rg3/youtube-dl/blob/master/README.md#how-do-i-update-youtube-dl) and update. Issues with outdated version will be rejected.\r\n- [x] I've **verified** and **I assure** that I'm running youtube-dl **2017.12.14**\r\n\r\n### Before submitting an *issue* make sure you have:\r\n- [x] At least skimmed through the [README](https://github.com/rg3/youtube-dl/blob/master/README.md), **most notably** the [FAQ](https://github.com/rg3/youtube-dl#faq) and [BUGS](https://github.com/rg3/youtube-dl#bugs) sections\r\n- [x] [Searched](https://github.com/rg3/youtube-dl/search?type=Issues) the bugtracker for similar issues including closed ones\r\n\r\n### What is the purpose of your *issue*?\r\n- [ ] Bug report (encountered problems with youtube-dl)\r\n- [x] Site support request (request for adding support for a new site)\r\n- [ ] Feature request (request for a new functionality)\r\n- [ ] Question\r\n- [ ] Other\r\n\r\n---\r\n\r\n``` \r\nyoutube-dl-mp3 \"https://www.franceinter.fr/emissions/les-concerts-d-inter/les-concerts-d-inter-14-decembre-2017\"\r\n[FranceInter] les-concerts-d-inter/les-concerts-d-inter-14-decembre-2017: Downloading webpage\r\nWARNING: unable to extract upload date; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; see https://yt-dl.org/update on how to update. 
Be sure to call youtube-dl with the --verbose flag and include its complete output.\r\n```\r\n\r\n```\r\n[debug] System config: []\r\n[debug] User config: []\r\n[debug] Custom config: []\r\n[debug] Command-line args: [u'-v']\r\n[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8\r\n[debug] youtube-dl version 2017.12.14\r\n[debug] Python version 2.7.12 - Linux-4.4.0-103-generic-x86_64-with-Ubuntu-16.04-xenial\r\n[debug] exe versions: avconv 2.8.11-0ubuntu0.16.04.1, avprobe 2.8.11-0ubuntu0.16.04.1, ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1\r\n[debug] Proxy map: {}\r\n```\n", "before_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals\n\nfrom .common import InfoExtractor\nfrom ..utils import month_by_name\n\n\nclass FranceInterIE(InfoExtractor):\n _VALID_URL = r'https?://(?:www\\.)?franceinter\\.fr/emissions/(?P<id>[^?#]+)'\n\n _TEST = {\n 'url': 'https://www.franceinter.fr/emissions/affaires-sensibles/affaires-sensibles-07-septembre-2016',\n 'md5': '9e54d7bdb6fdc02a841007f8a975c094',\n 'info_dict': {\n 'id': 'affaires-sensibles/affaires-sensibles-07-septembre-2016',\n 'ext': 'mp3',\n 'title': 'Affaire Cahuzac : le contentieux du compte en Suisse',\n 'description': 'md5:401969c5d318c061f86bda1fa359292b',\n 'upload_date': '20160907',\n },\n }\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n\n webpage = self._download_webpage(url, video_id)\n\n video_url = self._search_regex(\n r'(?s)<div[^>]+class=[\"\\']page-diffusion[\"\\'][^>]*>.*?<button[^>]+data-url=([\"\\'])(?P<url>(?:(?!\\1).)+)\\1',\n webpage, 'video url', group='url')\n\n title = self._og_search_title(webpage)\n description = self._og_search_description(webpage)\n\n upload_date_str = self._search_regex(\n r'class=[\"\\']cover-emission-period[\"\\'][^>]*>[^<]+\\s+(\\d{1,2}\\s+[^\\s]+\\s+\\d{4})<',\n webpage, 'upload date', fatal=False)\n if upload_date_str:\n upload_date_list = upload_date_str.split()\n upload_date_list.reverse()\n upload_date_list[1] = '%02d' % (month_by_name(upload_date_list[1], lang='fr') or 0)\n upload_date_list[2] = '%02d' % int(upload_date_list[2])\n upload_date = ''.join(upload_date_list)\n else:\n upload_date = None\n\n return {\n 'id': video_id,\n 'title': title,\n 'description': description,\n 'upload_date': upload_date,\n 'formats': [{\n 'url': video_url,\n 'vcodec': 'none',\n }],\n }\n", "path": "youtube_dl/extractor/franceinter.py"}], "after_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals\n\nfrom .common import InfoExtractor\nfrom ..utils import month_by_name\n\n\nclass FranceInterIE(InfoExtractor):\n _VALID_URL = r'https?://(?:www\\.)?franceinter\\.fr/emissions/(?P<id>[^?#]+)'\n\n _TEST = {\n 'url': 'https://www.franceinter.fr/emissions/affaires-sensibles/affaires-sensibles-07-septembre-2016',\n 'md5': '9e54d7bdb6fdc02a841007f8a975c094',\n 'info_dict': {\n 'id': 'affaires-sensibles/affaires-sensibles-07-septembre-2016',\n 'ext': 'mp3',\n 'title': 'Affaire Cahuzac : le contentieux du compte en Suisse',\n 'description': 'md5:401969c5d318c061f86bda1fa359292b',\n 'upload_date': '20160907',\n },\n }\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n\n webpage = self._download_webpage(url, video_id)\n\n video_url = self._search_regex(\n r'(?s)<div[^>]+class=[\"\\']page-diffusion[\"\\'][^>]*>.*?<button[^>]+data-url=([\"\\'])(?P<url>(?:(?!\\1).)+)\\1',\n webpage, 'video url', group='url')\n\n title = self._og_search_title(webpage)\n description = 
self._og_search_description(webpage)\n\n upload_date_str = self._search_regex(\n r'class=[\"\\']\\s*cover-emission-period\\s*[\"\\'][^>]*>[^<]+\\s+(\\d{1,2}\\s+[^\\s]+\\s+\\d{4})<',\n webpage, 'upload date', fatal=False)\n if upload_date_str:\n upload_date_list = upload_date_str.split()\n upload_date_list.reverse()\n upload_date_list[1] = '%02d' % (month_by_name(upload_date_list[1], lang='fr') or 0)\n upload_date_list[2] = '%02d' % int(upload_date_list[2])\n upload_date = ''.join(upload_date_list)\n else:\n upload_date = None\n\n return {\n 'id': video_id,\n 'title': title,\n 'description': description,\n 'upload_date': upload_date,\n 'formats': [{\n 'url': video_url,\n 'vcodec': 'none',\n }],\n }\n", "path": "youtube_dl/extractor/franceinter.py"}]}
| 1,676 | 201 |
gh_patches_debug_7992
|
rasdani/github-patches
|
git_diff
|
lightly-ai__lightly-305
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix import of ApiWorkflowClient
# Fix import of ApiWorkflowClient
Currently, the following import statement (from the docs) does not work:
```python
from lightly.api import ApiWorkflowClient
```
TODO:
- [x] We need to fix this by exposing the client in the `__init__.py` file.
- [x] Make sure the other imports in the docs work as well
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lightly/active_learning/utils/__init__.py`
Content:
```
```
Path: `lightly/api/__init__.py`
Content:
```
1 """ The lightly.api module provides access to the Lightly web-app. """
2
3 # Copyright (c) 2020. Lightly AG and its affiliates.
4 # All Rights Reserved
5
6 from lightly.api import routes
7
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lightly/active_learning/utils/__init__.py b/lightly/active_learning/utils/__init__.py
--- a/lightly/active_learning/utils/__init__.py
+++ b/lightly/active_learning/utils/__init__.py
@@ -0,0 +1,7 @@
+""" Collection of Utils for Active Learning """
+
+# Copyright (c) 2020. Lightly AG and its affiliates.
+# All Rights Reserved
+
+from lightly.active_learning.utils.bounding_box import BoundingBox
+from lightly.active_learning.utils.object_detection_output import ObjectDetectionOutput
\ No newline at end of file
diff --git a/lightly/api/__init__.py b/lightly/api/__init__.py
--- a/lightly/api/__init__.py
+++ b/lightly/api/__init__.py
@@ -3,4 +3,5 @@
# Copyright (c) 2020. Lightly AG and its affiliates.
# All Rights Reserved
+from lightly.api.api_workflow_client import ApiWorkflowClient
from lightly.api import routes
|
{"golden_diff": "diff --git a/lightly/active_learning/utils/__init__.py b/lightly/active_learning/utils/__init__.py\n--- a/lightly/active_learning/utils/__init__.py\n+++ b/lightly/active_learning/utils/__init__.py\n@@ -0,0 +1,7 @@\n+\"\"\" Collection of Utils for Active Learning \"\"\"\n+\n+# Copyright (c) 2020. Lightly AG and its affiliates.\n+# All Rights Reserved\n+\n+from lightly.active_learning.utils.bounding_box import BoundingBox\n+from lightly.active_learning.utils.object_detection_output import ObjectDetectionOutput\n\\ No newline at end of file\ndiff --git a/lightly/api/__init__.py b/lightly/api/__init__.py\n--- a/lightly/api/__init__.py\n+++ b/lightly/api/__init__.py\n@@ -3,4 +3,5 @@\n # Copyright (c) 2020. Lightly AG and its affiliates.\n # All Rights Reserved\n \n+from lightly.api.api_workflow_client import ApiWorkflowClient\n from lightly.api import routes\n", "issue": "Fix import of ApiWorkflowClient\n# Fix import of ApiWorkflowClient\r\n\r\nCurrently, the following import statement (from the docs) does not work:\r\n```python\r\nfrom lightly.api import ApiWorkflowClient\r\n```\r\n\r\nTODO:\r\n- [x] We need to fix this by exposing the client in the `__init__.py` file. \r\n- [x] Make sure the other imports in the docs work as well\n", "before_files": [{"content": "", "path": "lightly/active_learning/utils/__init__.py"}, {"content": "\"\"\" The lightly.api module provides access to the Lightly web-app. \"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nfrom lightly.api import routes\n", "path": "lightly/api/__init__.py"}], "after_files": [{"content": "\"\"\" Collection of Utils for Active Learning \"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nfrom lightly.active_learning.utils.bounding_box import BoundingBox\nfrom lightly.active_learning.utils.object_detection_output import ObjectDetectionOutput", "path": "lightly/active_learning/utils/__init__.py"}, {"content": "\"\"\" The lightly.api module provides access to the Lightly web-app. \"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nfrom lightly.api.api_workflow_client import ApiWorkflowClient\nfrom lightly.api import routes\n", "path": "lightly/api/__init__.py"}]}
| 418 | 222 |
gh_patches_debug_5392
|
rasdani/github-patches
|
git_diff
|
streamlink__streamlink-1351
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Kanal7 Defective again
Only 2 months later they have changed the design.
Not opening with latest 0.9.0 Release:
[cli][info] Found matching plugin kanal7 for URL http://www.kanal7.com/canli-izle
error: No playable streams found on this URL: http://www.kanal7.com/canli-izle
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/kanal7.py`
Content:
```
1 from __future__ import print_function
2 import re
3
4 from streamlink.plugin import Plugin
5 from streamlink.plugin.api import http
6 from streamlink.plugin.api import useragents
7 from streamlink.plugin.api import validate
8 from streamlink.stream import HLSStream
9
10
11 class Kanal7(Plugin):
12 url_re = re.compile(r"https?://(?:www.)?kanal7.com/canli-izle")
13 iframe_re = re.compile(r'iframe .*?src="(http://[^"]*?)"')
14 stream_re = re.compile(r'src="(http[^"]*?)"')
15
16 @classmethod
17 def can_handle_url(cls, url):
18 return cls.url_re.match(url) is not None
19
20 def find_iframe(self, url):
21 res = http.get(url)
22 # find iframe url
23 iframe = self.iframe_re.search(res.text)
24 iframe_url = iframe and iframe.group(1)
25 if iframe_url:
26 self.logger.debug("Found iframe: {}", iframe_url)
27 return iframe_url
28
29 def _get_streams(self):
30 iframe1 = self.find_iframe(self.url)
31 if iframe1:
32 iframe2 = self.find_iframe(iframe1)
33 if iframe2:
34 ires = http.get(iframe2)
35 stream_m = self.stream_re.search(ires.text)
36 stream_url = stream_m and stream_m.group(1)
37 if stream_url:
38 yield "live", HLSStream(self.session, stream_url, headers={"Referer": iframe2})
39 else:
40 self.logger.error("Could not find second iframe, has the page layout changed?")
41 else:
42 self.logger.error("Could not find iframe, has the page layout changed?")
43
44
45 __plugin__ = Kanal7
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/streamlink/plugins/kanal7.py b/src/streamlink/plugins/kanal7.py
--- a/src/streamlink/plugins/kanal7.py
+++ b/src/streamlink/plugins/kanal7.py
@@ -11,7 +11,7 @@
class Kanal7(Plugin):
url_re = re.compile(r"https?://(?:www.)?kanal7.com/canli-izle")
iframe_re = re.compile(r'iframe .*?src="(http://[^"]*?)"')
- stream_re = re.compile(r'src="(http[^"]*?)"')
+ stream_re = re.compile(r'''tp_file\s+=\s+['"](http[^"]*?)['"]''')
@classmethod
def can_handle_url(cls, url):
|
{"golden_diff": "diff --git a/src/streamlink/plugins/kanal7.py b/src/streamlink/plugins/kanal7.py\n--- a/src/streamlink/plugins/kanal7.py\n+++ b/src/streamlink/plugins/kanal7.py\n@@ -11,7 +11,7 @@\n class Kanal7(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?kanal7.com/canli-izle\")\n iframe_re = re.compile(r'iframe .*?src=\"(http://[^\"]*?)\"')\n- stream_re = re.compile(r'src=\"(http[^\"]*?)\"')\n+ stream_re = re.compile(r'''tp_file\\s+=\\s+['\"](http[^\"]*?)['\"]''')\n \n @classmethod\n def can_handle_url(cls, url):\n", "issue": "Kanal7 Defective again\nOnly 2 months later they have changed the design.\r\n\r\nNot opening with latest 0.9.0 Release:\r\n\r\n[cli][info] Found matching plugin kanal7 for URL http://www.kanal7.com/canli-izle\r\nerror: No playable streams found on this URL: http://www.kanal7.com/canli-izle\n", "before_files": [{"content": "from __future__ import print_function\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import http\nfrom streamlink.plugin.api import useragents\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\n\n\nclass Kanal7(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?kanal7.com/canli-izle\")\n iframe_re = re.compile(r'iframe .*?src=\"(http://[^\"]*?)\"')\n stream_re = re.compile(r'src=\"(http[^\"]*?)\"')\n\n @classmethod\n def can_handle_url(cls, url):\n return cls.url_re.match(url) is not None\n\n def find_iframe(self, url):\n res = http.get(url)\n # find iframe url\n iframe = self.iframe_re.search(res.text)\n iframe_url = iframe and iframe.group(1)\n if iframe_url:\n self.logger.debug(\"Found iframe: {}\", iframe_url)\n return iframe_url\n\n def _get_streams(self):\n iframe1 = self.find_iframe(self.url)\n if iframe1:\n iframe2 = self.find_iframe(iframe1)\n if iframe2:\n ires = http.get(iframe2)\n stream_m = self.stream_re.search(ires.text)\n stream_url = stream_m and stream_m.group(1)\n if stream_url:\n yield \"live\", HLSStream(self.session, stream_url, headers={\"Referer\": iframe2})\n else:\n self.logger.error(\"Could not find second iframe, has the page layout changed?\")\n else:\n self.logger.error(\"Could not find iframe, has the page layout changed?\")\n\n\n__plugin__ = Kanal7\n", "path": "src/streamlink/plugins/kanal7.py"}], "after_files": [{"content": "from __future__ import print_function\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import http\nfrom streamlink.plugin.api import useragents\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\n\n\nclass Kanal7(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?kanal7.com/canli-izle\")\n iframe_re = re.compile(r'iframe .*?src=\"(http://[^\"]*?)\"')\n stream_re = re.compile(r'''tp_file\\s+=\\s+['\"](http[^\"]*?)['\"]''')\n\n @classmethod\n def can_handle_url(cls, url):\n return cls.url_re.match(url) is not None\n\n def find_iframe(self, url):\n res = http.get(url)\n # find iframe url\n iframe = self.iframe_re.search(res.text)\n iframe_url = iframe and iframe.group(1)\n if iframe_url:\n self.logger.debug(\"Found iframe: {}\", iframe_url)\n return iframe_url\n\n def _get_streams(self):\n iframe1 = self.find_iframe(self.url)\n if iframe1:\n iframe2 = self.find_iframe(iframe1)\n if iframe2:\n ires = http.get(iframe2)\n stream_m = self.stream_re.search(ires.text)\n stream_url = stream_m and stream_m.group(1)\n if stream_url:\n yield \"live\", HLSStream(self.session, stream_url, headers={\"Referer\": iframe2})\n else:\n self.logger.error(\"Could not find 
second iframe, has the page layout changed?\")\n else:\n self.logger.error(\"Could not find iframe, has the page layout changed?\")\n\n\n__plugin__ = Kanal7\n", "path": "src/streamlink/plugins/kanal7.py"}]}
| 795 | 172 |
gh_patches_debug_41012
|
rasdani/github-patches
|
git_diff
|
cornellius-gp__gpytorch-1468
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Docs] Missing Grid.py documentation
# 📚 Documentation/Examples
** Is there documentation missing? **
The utils section of [GPyTorch documentation](https://gpytorch.readthedocs.io) does not include any information on grid.py, which is referenced [elsewhere in the docs](https://docs.gpytorch.ai/en/stable/kernels.html?highlight=choose_grid_size#gpytorch.kernels.GridKernel.update_grid).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gpytorch/utils/grid.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import math
4 from typing import List, Tuple
5
6 import torch
7
8
9 def scale_to_bounds(x, lower_bound, upper_bound):
10 """
11 Scale the input data so that it lies in between the lower and upper bounds.
12
13 Args:
14 :attr:`x` (Tensor `n` or `b x n`):
15 the input
16 :attr:`lower_bound` (float)
17 :attr:`upper_bound` (float)
18
19 Returns:
20 :obj:`torch.Tensor`
21 """
22 # Scale features so they fit inside grid bounds
23 min_val = x.min()
24 max_val = x.max()
25 diff = max_val - min_val
26 x = (x - min_val) * (0.95 * (upper_bound - lower_bound) / diff) + 0.95 * lower_bound
27 return x
28
29
30 def choose_grid_size(train_inputs, ratio=1.0, kronecker_structure=True):
31 """
32 Given some training inputs, determine a good grid size for KISS-GP.
33
34 Args:
35 :attr:`train_inputs` (Tensor `n` or `n x d` or `b x n x d`):
36 training data
37 :attr:`ratio` (float, optional):
38 Ratio - number of grid points to the amount of data (default: 1.)
39 :attr:`kronecker_structure` (bool, default=True):
40 Whether or not the model will use Kronecker structure in the grid
41 (set to True unless there is an additive or product decomposition in the prior)
42
43 Returns:
44 :obj:`int`
45 """
46 # Scale features so they fit inside grid bounds
47 num_data = train_inputs.numel() if train_inputs.dim() == 1 else train_inputs.size(-2)
48 num_dim = 1 if train_inputs.dim() == 1 else train_inputs.size(-1)
49 if kronecker_structure:
50 return int(ratio * math.pow(num_data, 1.0 / num_dim))
51 else:
52 return ratio * num_data
53
54
55 def convert_legacy_grid(grid: torch.Tensor) -> List[torch.Tensor]:
56 return [grid[:, i] for i in range(grid.size(-1))]
57
58
59 def create_data_from_grid(grid: List[torch.Tensor]) -> torch.Tensor:
60 """
61 Args:
62 :attr:`grid` (List[Tensor])
63 Each Tensor is a 1D set of increments for the grid in that dimension
64 Returns:
65 `grid_data` (Tensor)
66 Returns the set of points on the grid going by column-major order
67 (due to legacy reasons).
68 """
69 if torch.is_tensor(grid):
70 grid = convert_legacy_grid(grid)
71 ndims = len(grid)
72 assert all(axis.dim() == 1 for axis in grid)
73 projections = torch.meshgrid(*grid)
74 grid_tensor = torch.stack(projections, axis=-1)
75 # Note that if we did
76 # grid_data = grid_tensor.reshape(-1, ndims)
77 # instead, we would be iterating through the points of our grid from the
78 # last data dimension to the first data dimension. However, due to legacy
79 # reasons, we need to iterate from the first data dimension to the last data
80 # dimension when creating grid_data
81 grid_data = grid_tensor.permute(*(reversed(range(ndims + 1)))).reshape(ndims, -1).transpose(0, 1)
82 return grid_data
83
84
85 def create_grid(
86 grid_sizes: List[int], grid_bounds: List[Tuple[float, float]], extend: bool = True, device="cpu", dtype=torch.float,
87 ) -> List[torch.Tensor]:
88 """
89 Creates a grid represented by a list of 1D Tensors representing the
90 projections of the grid into each dimension
91
92 If `extend`, we extend the grid by two points past the specified boundary
93 which can be important for getting good grid interpolations
94 """
95 grid = []
96 for i in range(len(grid_bounds)):
97 grid_diff = float(grid_bounds[i][1] - grid_bounds[i][0]) / (grid_sizes[i] - 2)
98 if extend:
99 proj = torch.linspace(
100 grid_bounds[i][0] - grid_diff, grid_bounds[i][1] + grid_diff, grid_sizes[i], device=device, dtype=dtype,
101 )
102 else:
103 proj = torch.linspace(grid_bounds[i][0], grid_bounds[i][1], grid_sizes[i], device=device, dtype=dtype,)
104 grid.append(proj)
105 return grid
106
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/gpytorch/utils/grid.py b/gpytorch/utils/grid.py
--- a/gpytorch/utils/grid.py
+++ b/gpytorch/utils/grid.py
@@ -10,14 +10,12 @@
"""
Scale the input data so that it lies in between the lower and upper bounds.
- Args:
- :attr:`x` (Tensor `n` or `b x n`):
- the input
- :attr:`lower_bound` (float)
- :attr:`upper_bound` (float)
-
- Returns:
- :obj:`torch.Tensor`
+ :param x: the input data
+ :type x: torch.Tensor (... x n x d)
+ :param float lower_bound: lower bound of scaled data
+ :param float upper_bound: upper bound of scaled data
+ :return: scaled data
+ :rtype: torch.Tensor (... x n x d)
"""
# Scale features so they fit inside grid bounds
min_val = x.min()
@@ -31,17 +29,15 @@
"""
Given some training inputs, determine a good grid size for KISS-GP.
- Args:
- :attr:`train_inputs` (Tensor `n` or `n x d` or `b x n x d`):
- training data
- :attr:`ratio` (float, optional):
- Ratio - number of grid points to the amount of data (default: 1.)
- :attr:`kronecker_structure` (bool, default=True):
- Whether or not the model will use Kronecker structure in the grid
- (set to True unless there is an additive or product decomposition in the prior)
-
- Returns:
- :obj:`int`
+ :param x: the input data
+ :type x: torch.Tensor (... x n x d)
+ :param ratio: Amount of grid points per data point (default: 1.)
+ :type ratio: float, optional
+ :param kronecker_structure: Whether or not the model will use Kronecker structure in the grid
+ (set to True unless there is an additive or product decomposition in the prior)
+ :type kronecker_structure: bool, optional
+ :return: Grid size
+ :rtype: int
"""
# Scale features so they fit inside grid bounds
num_data = train_inputs.numel() if train_inputs.dim() == 1 else train_inputs.size(-2)
@@ -58,13 +54,10 @@
def create_data_from_grid(grid: List[torch.Tensor]) -> torch.Tensor:
"""
- Args:
- :attr:`grid` (List[Tensor])
- Each Tensor is a 1D set of increments for the grid in that dimension
- Returns:
- `grid_data` (Tensor)
- Returns the set of points on the grid going by column-major order
- (due to legacy reasons).
+ :param grid: Each Tensor is a 1D set of increments for the grid in that dimension
+ :type grid: List[torch.Tensor]
+ :return: The set of points on the grid going by column-major order
+ :rtype: torch.Tensor
"""
if torch.is_tensor(grid):
grid = convert_legacy_grid(grid)
@@ -90,7 +83,18 @@
projections of the grid into each dimension
If `extend`, we extend the grid by two points past the specified boundary
- which can be important for getting good grid interpolations
+ which can be important for getting good grid interpolations.
+
+ :param grid_sizes: Sizes of each grid dimension
+ :type grid_sizes: List[int]
+ :param grid_bounds: Lower and upper bounds of each grid dimension
+ :type grid_sizes: List[Tuple[float, float]]
+ :param device: target device for output (default: cpu)
+ :type device: torch.device, optional
+ :param dtype: target dtype for output (default: torch.float)
+ :type dtype: torch.dtype, optional
+ :return: Grid points for each dimension. Grid points are stored in a :obj:`torch.Tensor` with shape `grid_sizes[i]`.
+ :rtype: List[torch.Tensor]
"""
grid = []
for i in range(len(grid_bounds)):
|
{"golden_diff": "diff --git a/gpytorch/utils/grid.py b/gpytorch/utils/grid.py\n--- a/gpytorch/utils/grid.py\n+++ b/gpytorch/utils/grid.py\n@@ -10,14 +10,12 @@\n \"\"\"\n Scale the input data so that it lies in between the lower and upper bounds.\n \n- Args:\n- :attr:`x` (Tensor `n` or `b x n`):\n- the input\n- :attr:`lower_bound` (float)\n- :attr:`upper_bound` (float)\n-\n- Returns:\n- :obj:`torch.Tensor`\n+ :param x: the input data\n+ :type x: torch.Tensor (... x n x d)\n+ :param float lower_bound: lower bound of scaled data\n+ :param float upper_bound: upper bound of scaled data\n+ :return: scaled data\n+ :rtype: torch.Tensor (... x n x d)\n \"\"\"\n # Scale features so they fit inside grid bounds\n min_val = x.min()\n@@ -31,17 +29,15 @@\n \"\"\"\n Given some training inputs, determine a good grid size for KISS-GP.\n \n- Args:\n- :attr:`train_inputs` (Tensor `n` or `n x d` or `b x n x d`):\n- training data\n- :attr:`ratio` (float, optional):\n- Ratio - number of grid points to the amount of data (default: 1.)\n- :attr:`kronecker_structure` (bool, default=True):\n- Whether or not the model will use Kronecker structure in the grid\n- (set to True unless there is an additive or product decomposition in the prior)\n-\n- Returns:\n- :obj:`int`\n+ :param x: the input data\n+ :type x: torch.Tensor (... x n x d)\n+ :param ratio: Amount of grid points per data point (default: 1.)\n+ :type ratio: float, optional\n+ :param kronecker_structure: Whether or not the model will use Kronecker structure in the grid\n+ (set to True unless there is an additive or product decomposition in the prior)\n+ :type kronecker_structure: bool, optional\n+ :return: Grid size\n+ :rtype: int\n \"\"\"\n # Scale features so they fit inside grid bounds\n num_data = train_inputs.numel() if train_inputs.dim() == 1 else train_inputs.size(-2)\n@@ -58,13 +54,10 @@\n \n def create_data_from_grid(grid: List[torch.Tensor]) -> torch.Tensor:\n \"\"\"\n- Args:\n- :attr:`grid` (List[Tensor])\n- Each Tensor is a 1D set of increments for the grid in that dimension\n- Returns:\n- `grid_data` (Tensor)\n- Returns the set of points on the grid going by column-major order\n- (due to legacy reasons).\n+ :param grid: Each Tensor is a 1D set of increments for the grid in that dimension\n+ :type grid: List[torch.Tensor]\n+ :return: The set of points on the grid going by column-major order\n+ :rtype: torch.Tensor\n \"\"\"\n if torch.is_tensor(grid):\n grid = convert_legacy_grid(grid)\n@@ -90,7 +83,18 @@\n projections of the grid into each dimension\n \n If `extend`, we extend the grid by two points past the specified boundary\n- which can be important for getting good grid interpolations\n+ which can be important for getting good grid interpolations.\n+\n+ :param grid_sizes: Sizes of each grid dimension\n+ :type grid_sizes: List[int]\n+ :param grid_bounds: Lower and upper bounds of each grid dimension\n+ :type grid_sizes: List[Tuple[float, float]]\n+ :param device: target device for output (default: cpu)\n+ :type device: torch.device, optional\n+ :param dtype: target dtype for output (default: torch.float)\n+ :type dtype: torch.dtype, optional\n+ :return: Grid points for each dimension. Grid points are stored in a :obj:`torch.Tensor` with shape `grid_sizes[i]`.\n+ :rtype: List[torch.Tensor]\n \"\"\"\n grid = []\n for i in range(len(grid_bounds)):\n", "issue": "[Docs] Missing Grid.py documentation\n# \ud83d\udcda Documentation/Examples\r\n\r\n** Is there documentation missing? 
**\r\nThe utils section of [GPyTorch documentation](https://gpytorch.readthedocs.io) does not include any information on grid.py, which is referenced [elsewhere in the docs](https://docs.gpytorch.ai/en/stable/kernels.html?highlight=choose_grid_size#gpytorch.kernels.GridKernel.update_grid).\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport math\nfrom typing import List, Tuple\n\nimport torch\n\n\ndef scale_to_bounds(x, lower_bound, upper_bound):\n \"\"\"\n Scale the input data so that it lies in between the lower and upper bounds.\n\n Args:\n :attr:`x` (Tensor `n` or `b x n`):\n the input\n :attr:`lower_bound` (float)\n :attr:`upper_bound` (float)\n\n Returns:\n :obj:`torch.Tensor`\n \"\"\"\n # Scale features so they fit inside grid bounds\n min_val = x.min()\n max_val = x.max()\n diff = max_val - min_val\n x = (x - min_val) * (0.95 * (upper_bound - lower_bound) / diff) + 0.95 * lower_bound\n return x\n\n\ndef choose_grid_size(train_inputs, ratio=1.0, kronecker_structure=True):\n \"\"\"\n Given some training inputs, determine a good grid size for KISS-GP.\n\n Args:\n :attr:`train_inputs` (Tensor `n` or `n x d` or `b x n x d`):\n training data\n :attr:`ratio` (float, optional):\n Ratio - number of grid points to the amount of data (default: 1.)\n :attr:`kronecker_structure` (bool, default=True):\n Whether or not the model will use Kronecker structure in the grid\n (set to True unless there is an additive or product decomposition in the prior)\n\n Returns:\n :obj:`int`\n \"\"\"\n # Scale features so they fit inside grid bounds\n num_data = train_inputs.numel() if train_inputs.dim() == 1 else train_inputs.size(-2)\n num_dim = 1 if train_inputs.dim() == 1 else train_inputs.size(-1)\n if kronecker_structure:\n return int(ratio * math.pow(num_data, 1.0 / num_dim))\n else:\n return ratio * num_data\n\n\ndef convert_legacy_grid(grid: torch.Tensor) -> List[torch.Tensor]:\n return [grid[:, i] for i in range(grid.size(-1))]\n\n\ndef create_data_from_grid(grid: List[torch.Tensor]) -> torch.Tensor:\n \"\"\"\n Args:\n :attr:`grid` (List[Tensor])\n Each Tensor is a 1D set of increments for the grid in that dimension\n Returns:\n `grid_data` (Tensor)\n Returns the set of points on the grid going by column-major order\n (due to legacy reasons).\n \"\"\"\n if torch.is_tensor(grid):\n grid = convert_legacy_grid(grid)\n ndims = len(grid)\n assert all(axis.dim() == 1 for axis in grid)\n projections = torch.meshgrid(*grid)\n grid_tensor = torch.stack(projections, axis=-1)\n # Note that if we did\n # grid_data = grid_tensor.reshape(-1, ndims)\n # instead, we would be iterating through the points of our grid from the\n # last data dimension to the first data dimension. 
However, due to legacy\n # reasons, we need to iterate from the first data dimension to the last data\n # dimension when creating grid_data\n grid_data = grid_tensor.permute(*(reversed(range(ndims + 1)))).reshape(ndims, -1).transpose(0, 1)\n return grid_data\n\n\ndef create_grid(\n grid_sizes: List[int], grid_bounds: List[Tuple[float, float]], extend: bool = True, device=\"cpu\", dtype=torch.float,\n) -> List[torch.Tensor]:\n \"\"\"\n Creates a grid represented by a list of 1D Tensors representing the\n projections of the grid into each dimension\n\n If `extend`, we extend the grid by two points past the specified boundary\n which can be important for getting good grid interpolations\n \"\"\"\n grid = []\n for i in range(len(grid_bounds)):\n grid_diff = float(grid_bounds[i][1] - grid_bounds[i][0]) / (grid_sizes[i] - 2)\n if extend:\n proj = torch.linspace(\n grid_bounds[i][0] - grid_diff, grid_bounds[i][1] + grid_diff, grid_sizes[i], device=device, dtype=dtype,\n )\n else:\n proj = torch.linspace(grid_bounds[i][0], grid_bounds[i][1], grid_sizes[i], device=device, dtype=dtype,)\n grid.append(proj)\n return grid\n", "path": "gpytorch/utils/grid.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport math\nfrom typing import List, Tuple\n\nimport torch\n\n\ndef scale_to_bounds(x, lower_bound, upper_bound):\n \"\"\"\n Scale the input data so that it lies in between the lower and upper bounds.\n\n :param x: the input data\n :type x: torch.Tensor (... x n x d)\n :param float lower_bound: lower bound of scaled data\n :param float upper_bound: upper bound of scaled data\n :return: scaled data\n :rtype: torch.Tensor (... x n x d)\n \"\"\"\n # Scale features so they fit inside grid bounds\n min_val = x.min()\n max_val = x.max()\n diff = max_val - min_val\n x = (x - min_val) * (0.95 * (upper_bound - lower_bound) / diff) + 0.95 * lower_bound\n return x\n\n\ndef choose_grid_size(train_inputs, ratio=1.0, kronecker_structure=True):\n \"\"\"\n Given some training inputs, determine a good grid size for KISS-GP.\n\n :param x: the input data\n :type x: torch.Tensor (... 
x n x d)\n :param ratio: Amount of grid points per data point (default: 1.)\n :type ratio: float, optional\n :param kronecker_structure: Whether or not the model will use Kronecker structure in the grid\n (set to True unless there is an additive or product decomposition in the prior)\n :type kronecker_structure: bool, optional\n :return: Grid size\n :rtype: int\n \"\"\"\n # Scale features so they fit inside grid bounds\n num_data = train_inputs.numel() if train_inputs.dim() == 1 else train_inputs.size(-2)\n num_dim = 1 if train_inputs.dim() == 1 else train_inputs.size(-1)\n if kronecker_structure:\n return int(ratio * math.pow(num_data, 1.0 / num_dim))\n else:\n return ratio * num_data\n\n\ndef convert_legacy_grid(grid: torch.Tensor) -> List[torch.Tensor]:\n return [grid[:, i] for i in range(grid.size(-1))]\n\n\ndef create_data_from_grid(grid: List[torch.Tensor]) -> torch.Tensor:\n \"\"\"\n :param grid: Each Tensor is a 1D set of increments for the grid in that dimension\n :type grid: List[torch.Tensor]\n :return: The set of points on the grid going by column-major order\n :rtype: torch.Tensor\n \"\"\"\n if torch.is_tensor(grid):\n grid = convert_legacy_grid(grid)\n ndims = len(grid)\n assert all(axis.dim() == 1 for axis in grid)\n projections = torch.meshgrid(*grid)\n grid_tensor = torch.stack(projections, axis=-1)\n # Note that if we did\n # grid_data = grid_tensor.reshape(-1, ndims)\n # instead, we would be iterating through the points of our grid from the\n # last data dimension to the first data dimension. However, due to legacy\n # reasons, we need to iterate from the first data dimension to the last data\n # dimension when creating grid_data\n grid_data = grid_tensor.permute(*(reversed(range(ndims + 1)))).reshape(ndims, -1).transpose(0, 1)\n return grid_data\n\n\ndef create_grid(\n grid_sizes: List[int], grid_bounds: List[Tuple[float, float]], extend: bool = True, device=\"cpu\", dtype=torch.float,\n) -> List[torch.Tensor]:\n \"\"\"\n Creates a grid represented by a list of 1D Tensors representing the\n projections of the grid into each dimension\n\n If `extend`, we extend the grid by two points past the specified boundary\n which can be important for getting good grid interpolations.\n\n :param grid_sizes: Sizes of each grid dimension\n :type grid_sizes: List[int]\n :param grid_bounds: Lower and upper bounds of each grid dimension\n :type grid_sizes: List[Tuple[float, float]]\n :param device: target device for output (default: cpu)\n :type device: torch.device, optional\n :param dtype: target dtype for output (default: torch.float)\n :type dtype: torch.dtype, optional\n :return: Grid points for each dimension. Grid points are stored in a :obj:`torch.Tensor` with shape `grid_sizes[i]`.\n :rtype: List[torch.Tensor]\n \"\"\"\n grid = []\n for i in range(len(grid_bounds)):\n grid_diff = float(grid_bounds[i][1] - grid_bounds[i][0]) / (grid_sizes[i] - 2)\n if extend:\n proj = torch.linspace(\n grid_bounds[i][0] - grid_diff, grid_bounds[i][1] + grid_diff, grid_sizes[i], device=device, dtype=dtype,\n )\n else:\n proj = torch.linspace(grid_bounds[i][0], grid_bounds[i][1], grid_sizes[i], device=device, dtype=dtype,)\n grid.append(proj)\n return grid\n", "path": "gpytorch/utils/grid.py"}]}
| 1,538 | 961 |
gh_patches_debug_27885
|
rasdani/github-patches
|
git_diff
|
pwr-Solaar__Solaar-743
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
authorship of Solaar in setup.py
Daniel Pavel is listed as the sole author of Solaar in setup.py
As far as I can tell, this puts him and his email in several repositories, such as PyPI https://pypi.org/project/solaar/
Who should be put there?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python3
2
3 from glob import glob as _glob
4 try:
5 from setuptools import setup
6 except ImportError:
7 from distutils.core import setup
8
9 autostart_path = '/etc/xdg/autostart'
10
11 import sys
12 backup_path_0 = sys.path[0]
13 sys.path[0] = backup_path_0 + '/lib'
14 #from solaar import NAME, __version__
15 __version__ = '1.0.2-rc1'
16 NAME = 'Solaar'
17
18 sys.path[0] = backup_path_0
19
20 if 'install' in sys.argv:
21 # naively guess where the autostart .desktop file should be installed
22 if '--prefix' in sys.argv or any(x.startswith('--prefix=') for x in sys.argv) or '--home' in sys.argv:
23 autostart_path = 'etc/xdg/autostart'
24 elif '--user' in sys.argv:
25 from os import environ
26 from os import path
27 xdg_config_home = environ.get('XDG_CONFIG_HOME', path.expanduser(path.join('~', '.config')))
28 autostart_path = path.join(xdg_config_home, 'autostart')
29 del environ, path, xdg_config_home
30
31 del sys, backup_path_0
32
33
34 def _data_files():
35 from os.path import dirname as _dirname
36
37 yield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')
38 yield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')
39 yield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']
40
41 for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):
42 yield _dirname(mo), [mo]
43
44 yield 'share/applications', ['share/applications/solaar.desktop']
45 yield autostart_path, ['share/autostart/solaar.desktop']
46
47 del _dirname
48
49
50 setup(name=NAME.lower(),
51 version=__version__,
52 description='Linux devices manager for the Logitech Unifying Receiver.',
53 long_description='''
54 Solaar is a Linux device manager for Logitech's Unifying Receiver peripherals.
55 It is able to pair/unpair devices to the receiver, and for some devices read
56 battery status.
57 '''.strip(),
58 author='Daniel Pavel',
59 author_email='[email protected]',
60 license='GPLv2',
61 url='http://pwr-solaar.github.io/Solaar/',
62 classifiers=[
63 'Development Status :: 4 - Beta',
64 'Environment :: X11 Applications :: GTK',
65 'Environment :: Console',
66 'Intended Audience :: End Users/Desktop',
67 'License :: DFSG approved',
68 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',
69 'Natural Language :: English',
70 'Programming Language :: Python :: 3 :: Only',
71 'Operating System :: POSIX :: Linux',
72 'Topic :: Utilities',
73 ],
74
75 platforms=['linux'],
76
77 # sudo apt install python-gi python3-gi \
78 # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1
79 # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],
80
81 python_requires='>=3.2',
82 install_requires=['pyudev (>= 0.13)', ],
83 package_dir={'': 'lib'},
84 packages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],
85 data_files=list(_data_files()),
86 scripts=_glob('bin/*'),
87 )
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -8,28 +8,10 @@
autostart_path = '/etc/xdg/autostart'
-import sys
-backup_path_0 = sys.path[0]
-sys.path[0] = backup_path_0 + '/lib'
#from solaar import NAME, __version__
__version__ = '1.0.2-rc1'
NAME = 'Solaar'
-sys.path[0] = backup_path_0
-
-if 'install' in sys.argv:
- # naively guess where the autostart .desktop file should be installed
- if '--prefix' in sys.argv or any(x.startswith('--prefix=') for x in sys.argv) or '--home' in sys.argv:
- autostart_path = 'etc/xdg/autostart'
- elif '--user' in sys.argv:
- from os import environ
- from os import path
- xdg_config_home = environ.get('XDG_CONFIG_HOME', path.expanduser(path.join('~', '.config')))
- autostart_path = path.join(xdg_config_home, 'autostart')
- del environ, path, xdg_config_home
-
-del sys, backup_path_0
-
def _data_files():
from os.path import dirname as _dirname
@@ -43,6 +25,7 @@
yield 'share/applications', ['share/applications/solaar.desktop']
yield autostart_path, ['share/autostart/solaar.desktop']
+ yield '/etc/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']
del _dirname
@@ -56,7 +39,6 @@
battery status.
'''.strip(),
author='Daniel Pavel',
- author_email='[email protected]',
license='GPLv2',
url='http://pwr-solaar.github.io/Solaar/',
classifiers=[
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -8,28 +8,10 @@\n \n autostart_path = '/etc/xdg/autostart'\n \n-import sys\n-backup_path_0 = sys.path[0]\n-sys.path[0] = backup_path_0 + '/lib'\n #from solaar import NAME, __version__\n __version__ = '1.0.2-rc1'\n NAME = 'Solaar'\n \n-sys.path[0] = backup_path_0\n-\n-if 'install' in sys.argv:\n-\t# naively guess where the autostart .desktop file should be installed\n-\tif '--prefix' in sys.argv or any(x.startswith('--prefix=') for x in sys.argv) or '--home' in sys.argv:\n-\t\tautostart_path = 'etc/xdg/autostart'\n-\telif '--user' in sys.argv:\n-\t\tfrom os import environ\n-\t\tfrom os import path\n-\t\txdg_config_home = environ.get('XDG_CONFIG_HOME', path.expanduser(path.join('~', '.config')))\n-\t\tautostart_path = path.join(xdg_config_home, 'autostart')\n-\t\tdel environ, path, xdg_config_home\n-\n-del sys, backup_path_0\n-\n \n def _data_files():\n \tfrom os.path import dirname as _dirname\n@@ -43,6 +25,7 @@\n \n \tyield 'share/applications', ['share/applications/solaar.desktop']\n \tyield autostart_path, ['share/autostart/solaar.desktop']\n+\tyield '/etc/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n \n \tdel _dirname\n \n@@ -56,7 +39,6 @@\n battery status.\n '''.strip(),\n \t\tauthor='Daniel Pavel',\n-\t\tauthor_email='[email protected]',\n \t\tlicense='GPLv2',\n \t\turl='http://pwr-solaar.github.io/Solaar/',\n \t\tclassifiers=[\n", "issue": "authorship of Solaar in setup.py\nDaniel Pavel is listed as the sole author of Solaar in setup.py \r\n\r\nAs far as I can tell, this puts him and his email in several repositories, such as PyPI https://pypi.org/project/solaar/\r\n\r\nWho should be put there?\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nautostart_path = '/etc/xdg/autostart'\n\nimport sys\nbackup_path_0 = sys.path[0]\nsys.path[0] = backup_path_0 + '/lib'\n#from solaar import NAME, __version__\n__version__ = '1.0.2-rc1'\nNAME = 'Solaar'\n\nsys.path[0] = backup_path_0\n\nif 'install' in sys.argv:\n\t# naively guess where the autostart .desktop file should be installed\n\tif '--prefix' in sys.argv or any(x.startswith('--prefix=') for x in sys.argv) or '--home' in sys.argv:\n\t\tautostart_path = 'etc/xdg/autostart'\n\telif '--user' in sys.argv:\n\t\tfrom os import environ\n\t\tfrom os import path\n\t\txdg_config_home = environ.get('XDG_CONFIG_HOME', path.expanduser(path.join('~', '.config')))\n\t\tautostart_path = path.join(xdg_config_home, 'autostart')\n\t\tdel environ, path, xdg_config_home\n\ndel sys, backup_path_0\n\n\ndef _data_files():\n\tfrom os.path import dirname as _dirname\n\n\tyield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')\n\tyield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')\n\tyield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']\n\n\tfor mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n\t\tyield _dirname(mo), [mo]\n\n\tyield 'share/applications', ['share/applications/solaar.desktop']\n\tyield autostart_path, ['share/autostart/solaar.desktop']\n\n\tdel _dirname\n\n\nsetup(name=NAME.lower(),\n\t\tversion=__version__,\n\t\tdescription='Linux devices manager for the Logitech Unifying Receiver.',\n\t\tlong_description='''\nSolaar is a Linux device manager for Logitech's Unifying Receiver peripherals.\nIt is able to pair/unpair devices to the receiver, and for some 
devices read\nbattery status.\n'''.strip(),\n\t\tauthor='Daniel Pavel',\n\t\tauthor_email='[email protected]',\n\t\tlicense='GPLv2',\n\t\turl='http://pwr-solaar.github.io/Solaar/',\n\t\tclassifiers=[\n\t\t\t'Development Status :: 4 - Beta',\n\t\t\t'Environment :: X11 Applications :: GTK',\n\t\t\t'Environment :: Console',\n\t\t\t'Intended Audience :: End Users/Desktop',\n\t\t\t'License :: DFSG approved',\n\t\t\t'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n\t\t\t'Natural Language :: English',\n\t\t\t'Programming Language :: Python :: 3 :: Only',\n\t\t\t'Operating System :: POSIX :: Linux',\n\t\t\t'Topic :: Utilities',\n\t\t\t],\n\n\t\tplatforms=['linux'],\n\n\t\t# sudo apt install python-gi python3-gi \\\n\t\t# gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1\n\t\t# os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n\n\t\tpython_requires='>=3.2',\n\t\tinstall_requires=['pyudev (>= 0.13)', ],\n\t\tpackage_dir={'': 'lib'},\n\t\tpackages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n\t\tdata_files=list(_data_files()),\n\t\tscripts=_glob('bin/*'),\n\t)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nautostart_path = '/etc/xdg/autostart'\n\n#from solaar import NAME, __version__\n__version__ = '1.0.2-rc1'\nNAME = 'Solaar'\n\n\ndef _data_files():\n\tfrom os.path import dirname as _dirname\n\n\tyield 'share/solaar/icons', _glob('share/solaar/icons/solaar*.svg')\n\tyield 'share/solaar/icons', _glob('share/solaar/icons/light_*.png')\n\tyield 'share/icons/hicolor/scalable/apps', ['share/solaar/icons/solaar.svg']\n\n\tfor mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n\t\tyield _dirname(mo), [mo]\n\n\tyield 'share/applications', ['share/applications/solaar.desktop']\n\tyield autostart_path, ['share/autostart/solaar.desktop']\n\tyield '/etc/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n\n\tdel _dirname\n\n\nsetup(name=NAME.lower(),\n\t\tversion=__version__,\n\t\tdescription='Linux devices manager for the Logitech Unifying Receiver.',\n\t\tlong_description='''\nSolaar is a Linux device manager for Logitech's Unifying Receiver peripherals.\nIt is able to pair/unpair devices to the receiver, and for some devices read\nbattery status.\n'''.strip(),\n\t\tauthor='Daniel Pavel',\n\t\tlicense='GPLv2',\n\t\turl='http://pwr-solaar.github.io/Solaar/',\n\t\tclassifiers=[\n\t\t\t'Development Status :: 4 - Beta',\n\t\t\t'Environment :: X11 Applications :: GTK',\n\t\t\t'Environment :: Console',\n\t\t\t'Intended Audience :: End Users/Desktop',\n\t\t\t'License :: DFSG approved',\n\t\t\t'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n\t\t\t'Natural Language :: English',\n\t\t\t'Programming Language :: Python :: 3 :: Only',\n\t\t\t'Operating System :: POSIX :: Linux',\n\t\t\t'Topic :: Utilities',\n\t\t\t],\n\n\t\tplatforms=['linux'],\n\n\t\t# sudo apt install python-gi python3-gi \\\n\t\t# gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1\n\t\t# os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n\n\t\tpython_requires='>=3.2',\n\t\tinstall_requires=['pyudev (>= 0.13)', ],\n\t\tpackage_dir={'': 'lib'},\n\t\tpackages=['hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n\t\tdata_files=list(_data_files()),\n\t\tscripts=_glob('bin/*'),\n\t)\n", "path": "setup.py"}]}
| 1,332 | 447 |
gh_patches_debug_60773
|
rasdani/github-patches
|
git_diff
|
data-for-change__anyway-1848
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix walla scraping - see test_scrape_sanity_online_walla
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `anyway/parsers/rss_sites.py`
Content:
```
1 import requests
2 from bs4 import BeautifulSoup
3 import feedparser
4 from anyway.parsers import timezones
5
6
7 def parse_html_walla(item_rss, html_soup):
8 # For some reason there's html here
9 description = BeautifulSoup(item_rss["summary"], features="lxml").text
10
11 author = html_soup.find("div", class_="author").find("a").get_text()
12 return author, description
13
14
15 def parse_html_ynet(item_rss, html_soup):
16 # This is rather fragile
17 # description_text: "[description] ([author]) [unrelated stuff]"
18 description_text = html_soup.find(id="ArticleBodyComponent").get_text()
19 author = description_text.split("(")[-1].split(")")[0].strip()
20 description = description_text.rsplit("(")[0].strip()
21 return author, description
22
23
24 sites_config = {
25 "ynet": {
26 "rss": "https://www.ynet.co.il:443/Integration/StoryRss1854.xml",
27 "parser": parse_html_ynet,
28 },
29 "walla": {"rss": "https://rss.walla.co.il:443/feed/22", "parser": parse_html_walla},
30 }
31
32
33 def _fetch(url: str) -> str:
34 return requests.get(url).text
35
36
37 def scrape_raw(site_name: str, *, rss_source=None, fetch_html=_fetch):
38 config = sites_config[site_name]
39 if rss_source is None:
40 rss_source = config["rss"]
41 rss_dict = feedparser.parse(rss_source)
42 if rss_dict.get("bozo_exception"):
43 raise rss_dict["bozo_exception"]
44
45 for item_rss in rss_dict["items"]:
46 html_text = fetch_html(item_rss["link"])
47 author, description = config["parser"](item_rss, BeautifulSoup(html_text, "lxml"))
48 yield {
49 "link": item_rss["link"],
50 "date": timezones.from_rss(item_rss["published_parsed"]),
51 "source": site_name,
52 "author": author,
53 "title": item_rss["title"],
54 "description": description,
55 "accident": False,
56 }
57
58
59 def scrape(*args, **kwargs):
60 # lazily load dependencies, so this module will behave like an independent library
61 from anyway.models import NewsFlash
62
63 for dict_item in scrape_raw(*args, **kwargs):
64 yield NewsFlash(**dict_item)
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/anyway/parsers/rss_sites.py b/anyway/parsers/rss_sites.py
--- a/anyway/parsers/rss_sites.py
+++ b/anyway/parsers/rss_sites.py
@@ -8,7 +8,7 @@
# For some reason there's html here
description = BeautifulSoup(item_rss["summary"], features="lxml").text
- author = html_soup.find("div", class_="author").find("a").get_text()
+ author = html_soup.find("div", class_="author").get_text().strip()
return author, description
|
{"golden_diff": "diff --git a/anyway/parsers/rss_sites.py b/anyway/parsers/rss_sites.py\n--- a/anyway/parsers/rss_sites.py\n+++ b/anyway/parsers/rss_sites.py\n@@ -8,7 +8,7 @@\n # For some reason there's html here\n description = BeautifulSoup(item_rss[\"summary\"], features=\"lxml\").text\n \n- author = html_soup.find(\"div\", class_=\"author\").find(\"a\").get_text()\n+ author = html_soup.find(\"div\", class_=\"author\").get_text().strip()\n return author, description\n", "issue": "Fix walla scraping - see test_scrape_sanity_online_walla\n\n", "before_files": [{"content": "import requests\nfrom bs4 import BeautifulSoup\nimport feedparser\nfrom anyway.parsers import timezones\n\n\ndef parse_html_walla(item_rss, html_soup):\n # For some reason there's html here\n description = BeautifulSoup(item_rss[\"summary\"], features=\"lxml\").text\n\n author = html_soup.find(\"div\", class_=\"author\").find(\"a\").get_text()\n return author, description\n\n\ndef parse_html_ynet(item_rss, html_soup):\n # This is rather fragile\n # description_text: \"[description] ([author]) [unrelated stuff]\"\n description_text = html_soup.find(id=\"ArticleBodyComponent\").get_text()\n author = description_text.split(\"(\")[-1].split(\")\")[0].strip()\n description = description_text.rsplit(\"(\")[0].strip()\n return author, description\n\n\nsites_config = {\n \"ynet\": {\n \"rss\": \"https://www.ynet.co.il:443/Integration/StoryRss1854.xml\",\n \"parser\": parse_html_ynet,\n },\n \"walla\": {\"rss\": \"https://rss.walla.co.il:443/feed/22\", \"parser\": parse_html_walla},\n}\n\n\ndef _fetch(url: str) -> str:\n return requests.get(url).text\n\n\ndef scrape_raw(site_name: str, *, rss_source=None, fetch_html=_fetch):\n config = sites_config[site_name]\n if rss_source is None:\n rss_source = config[\"rss\"]\n rss_dict = feedparser.parse(rss_source)\n if rss_dict.get(\"bozo_exception\"):\n raise rss_dict[\"bozo_exception\"]\n\n for item_rss in rss_dict[\"items\"]:\n html_text = fetch_html(item_rss[\"link\"])\n author, description = config[\"parser\"](item_rss, BeautifulSoup(html_text, \"lxml\"))\n yield {\n \"link\": item_rss[\"link\"],\n \"date\": timezones.from_rss(item_rss[\"published_parsed\"]),\n \"source\": site_name,\n \"author\": author,\n \"title\": item_rss[\"title\"],\n \"description\": description,\n \"accident\": False,\n }\n\n\ndef scrape(*args, **kwargs):\n # lazily load dependencies, so this module will behave like an independent library\n from anyway.models import NewsFlash\n\n for dict_item in scrape_raw(*args, **kwargs):\n yield NewsFlash(**dict_item)\n", "path": "anyway/parsers/rss_sites.py"}], "after_files": [{"content": "import requests\nfrom bs4 import BeautifulSoup\nimport feedparser\nfrom anyway.parsers import timezones\n\n\ndef parse_html_walla(item_rss, html_soup):\n # For some reason there's html here\n description = BeautifulSoup(item_rss[\"summary\"], features=\"lxml\").text\n\n author = html_soup.find(\"div\", class_=\"author\").get_text().strip()\n return author, description\n\n\ndef parse_html_ynet(item_rss, html_soup):\n # This is rather fragile\n # description_text: \"[description] ([author]) [unrelated stuff]\"\n description_text = html_soup.find(id=\"ArticleBodyComponent\").get_text()\n author = description_text.split(\"(\")[-1].split(\")\")[0].strip()\n description = description_text.rsplit(\"(\")[0].strip()\n return author, description\n\n\nsites_config = {\n \"ynet\": {\n \"rss\": \"https://www.ynet.co.il:443/Integration/StoryRss1854.xml\",\n \"parser\": parse_html_ynet,\n },\n 
\"walla\": {\"rss\": \"https://rss.walla.co.il:443/feed/22\", \"parser\": parse_html_walla},\n}\n\n\ndef _fetch(url: str) -> str:\n return requests.get(url).text\n\n\ndef scrape_raw(site_name: str, *, rss_source=None, fetch_html=_fetch):\n config = sites_config[site_name]\n if rss_source is None:\n rss_source = config[\"rss\"]\n rss_dict = feedparser.parse(rss_source)\n if rss_dict.get(\"bozo_exception\"):\n raise rss_dict[\"bozo_exception\"]\n\n for item_rss in rss_dict[\"items\"]:\n html_text = fetch_html(item_rss[\"link\"])\n author, description = config[\"parser\"](item_rss, BeautifulSoup(html_text, \"lxml\"))\n yield {\n \"link\": item_rss[\"link\"],\n \"date\": timezones.from_rss(item_rss[\"published_parsed\"]),\n \"source\": site_name,\n \"author\": author,\n \"title\": item_rss[\"title\"],\n \"description\": description,\n \"accident\": False,\n }\n\n\ndef scrape(*args, **kwargs):\n # lazily load dependencies, so this module will behave like an independent library\n from anyway.models import NewsFlash\n\n for dict_item in scrape_raw(*args, **kwargs):\n yield NewsFlash(**dict_item)\n", "path": "anyway/parsers/rss_sites.py"}]}
| 923 | 129 |
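Note on the fix above: the patch swaps `find("div", class_="author").find("a").get_text()` for `find("div", class_="author").get_text().strip()`. The short sketch below is illustration only; the HTML snippets are invented, not real walla.co.il markup, and the assumption is that the site stopped wrapping the author name in an `<a>` tag, which makes the old chain fail on a `None` lookup.

```python
# Illustration only: invented markup, not taken from Walla pages.
from bs4 import BeautifulSoup

old_markup = '<div class="author"><a href="/writers/1">Some Writer</a></div>'
new_markup = '<div class="author"> Some Writer </div>'  # author no longer inside an <a>

def author_old(html: str) -> str:
    soup = BeautifulSoup(html, "lxml")
    # breaks when there is no <a>: find("a") returns None
    return soup.find("div", class_="author").find("a").get_text()

def author_new(html: str) -> str:
    soup = BeautifulSoup(html, "lxml")
    # works for both layouts; strip() removes surrounding whitespace
    return soup.find("div", class_="author").get_text().strip()

print(author_old(old_markup))  # Some Writer
print(author_new(new_markup))  # Some Writer
# author_old(new_markup) would raise AttributeError: 'NoneType' object has no attribute 'get_text'
```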
gh_patches_debug_17789
|
rasdani/github-patches
|
git_diff
|
encode__starlette-1018
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Memory usage streaming large responses
We've been running into memory issues when providing very large async generators to a streaming response. We have these generators producing large (larger than the memory limit set) responses in a way that allows us to only keep small chunks in memory at a time. However, it looks like the BaseHTTPMiddleware implementation uses an asyncio queue to store the individual chunks:
https://github.com/encode/starlette/blob/master/starlette/middleware/base.py#L30
This prevents any network backpressure handling -- if the client that is receiving the streaming response is on a slow connection, the queue will happily grow without bound and consume all memory, triggering kernel out-of-memory, when the ideal handling here would be for send to block (yield) when this happens. I believe this would naturally happen if there were no queue here at all, so I am wondering why it needs to be here?
Would a PR to remove the queueing be accepted?
If not, what is the appropriate way to override this to not use a queue? We can write our own, but the use of BaseHTTPMiddleware is hardcoded: https://github.com/encode/starlette/blob/519f5750b5e797bb3d4805fd29657674304ce397/starlette/applications.py#L197, leaving only some fairly hacky approaches to preventing this queueing.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/middleware/base.py`
Content:
```
1 import asyncio
2 import typing
3
4 from starlette.requests import Request
5 from starlette.responses import Response, StreamingResponse
6 from starlette.types import ASGIApp, Receive, Scope, Send
7
8 RequestResponseEndpoint = typing.Callable[[Request], typing.Awaitable[Response]]
9 DispatchFunction = typing.Callable[
10 [Request, RequestResponseEndpoint], typing.Awaitable[Response]
11 ]
12
13
14 class BaseHTTPMiddleware:
15 def __init__(self, app: ASGIApp, dispatch: DispatchFunction = None) -> None:
16 self.app = app
17 self.dispatch_func = self.dispatch if dispatch is None else dispatch
18
19 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
20 if scope["type"] != "http":
21 await self.app(scope, receive, send)
22 return
23
24 request = Request(scope, receive=receive)
25 response = await self.dispatch_func(request, self.call_next)
26 await response(scope, receive, send)
27
28 async def call_next(self, request: Request) -> Response:
29 loop = asyncio.get_event_loop()
30 queue = asyncio.Queue() # type: asyncio.Queue
31
32 scope = request.scope
33 receive = request.receive
34 send = queue.put
35
36 async def coro() -> None:
37 try:
38 await self.app(scope, receive, send)
39 finally:
40 await queue.put(None)
41
42 task = loop.create_task(coro())
43 message = await queue.get()
44 if message is None:
45 task.result()
46 raise RuntimeError("No response returned.")
47 assert message["type"] == "http.response.start"
48
49 async def body_stream() -> typing.AsyncGenerator[bytes, None]:
50 while True:
51 message = await queue.get()
52 if message is None:
53 break
54 assert message["type"] == "http.response.body"
55 yield message.get("body", b"")
56 task.result()
57
58 response = StreamingResponse(
59 status_code=message["status"], content=body_stream()
60 )
61 response.raw_headers = message["headers"]
62 return response
63
64 async def dispatch(
65 self, request: Request, call_next: RequestResponseEndpoint
66 ) -> Response:
67 raise NotImplementedError() # pragma: no cover
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/starlette/middleware/base.py b/starlette/middleware/base.py
--- a/starlette/middleware/base.py
+++ b/starlette/middleware/base.py
@@ -3,7 +3,7 @@
from starlette.requests import Request
from starlette.responses import Response, StreamingResponse
-from starlette.types import ASGIApp, Receive, Scope, Send
+from starlette.types import ASGIApp, Message, Receive, Scope, Send
RequestResponseEndpoint = typing.Callable[[Request], typing.Awaitable[Response]]
DispatchFunction = typing.Callable[
@@ -27,7 +27,7 @@
async def call_next(self, request: Request) -> Response:
loop = asyncio.get_event_loop()
- queue = asyncio.Queue() # type: asyncio.Queue
+ queue: "asyncio.Queue[typing.Optional[Message]]" = asyncio.Queue(maxsize=1)
scope = request.scope
receive = request.receive
|
{"golden_diff": "diff --git a/starlette/middleware/base.py b/starlette/middleware/base.py\n--- a/starlette/middleware/base.py\n+++ b/starlette/middleware/base.py\n@@ -3,7 +3,7 @@\n \n from starlette.requests import Request\n from starlette.responses import Response, StreamingResponse\n-from starlette.types import ASGIApp, Receive, Scope, Send\n+from starlette.types import ASGIApp, Message, Receive, Scope, Send\n \n RequestResponseEndpoint = typing.Callable[[Request], typing.Awaitable[Response]]\n DispatchFunction = typing.Callable[\n@@ -27,7 +27,7 @@\n \n async def call_next(self, request: Request) -> Response:\n loop = asyncio.get_event_loop()\n- queue = asyncio.Queue() # type: asyncio.Queue\n+ queue: \"asyncio.Queue[typing.Optional[Message]]\" = asyncio.Queue(maxsize=1)\n \n scope = request.scope\n receive = request.receive\n", "issue": "Memory usage streaming large responses\nWe've been running into memory issues when providing very large async generators to a streaming response. We have these generators producing large (larger than memory set) responses in a way that allows us to only keep small chunks in memory at a time. However, it looks like the BaseHTTPMiddleware implementation uses an asyncio queue to store the individual chunks:\r\n\r\nhttps://github.com/encode/starlette/blob/master/starlette/middleware/base.py#L30\r\n\r\nThis prevents any network backpressure handling -- if the client that is receiving the streaming response is on a slow connection, the queue will happily grow without bound and consume all memory, triggering kernel out-of-memory, when the ideal handling here would be for send to block (yield) when this happens. I believe this would naturally happen if there were no queue here at all, so I am wondering why it needs to be here?\r\n\r\nWould a PR to remove the queueing be accepted?\r\n\r\nIf not, what is the appropriate way to override this to not use a queue? 
We can write our own, but the use of BaseHTTPMiddleware is hardcoded: https://github.com/encode/starlette/blob/519f5750b5e797bb3d4805fd29657674304ce397/starlette/applications.py#L197, leaving only some fairly hacky approaches to preventing this queueing.\n", "before_files": [{"content": "import asyncio\nimport typing\n\nfrom starlette.requests import Request\nfrom starlette.responses import Response, StreamingResponse\nfrom starlette.types import ASGIApp, Receive, Scope, Send\n\nRequestResponseEndpoint = typing.Callable[[Request], typing.Awaitable[Response]]\nDispatchFunction = typing.Callable[\n [Request, RequestResponseEndpoint], typing.Awaitable[Response]\n]\n\n\nclass BaseHTTPMiddleware:\n def __init__(self, app: ASGIApp, dispatch: DispatchFunction = None) -> None:\n self.app = app\n self.dispatch_func = self.dispatch if dispatch is None else dispatch\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if scope[\"type\"] != \"http\":\n await self.app(scope, receive, send)\n return\n\n request = Request(scope, receive=receive)\n response = await self.dispatch_func(request, self.call_next)\n await response(scope, receive, send)\n\n async def call_next(self, request: Request) -> Response:\n loop = asyncio.get_event_loop()\n queue = asyncio.Queue() # type: asyncio.Queue\n\n scope = request.scope\n receive = request.receive\n send = queue.put\n\n async def coro() -> None:\n try:\n await self.app(scope, receive, send)\n finally:\n await queue.put(None)\n\n task = loop.create_task(coro())\n message = await queue.get()\n if message is None:\n task.result()\n raise RuntimeError(\"No response returned.\")\n assert message[\"type\"] == \"http.response.start\"\n\n async def body_stream() -> typing.AsyncGenerator[bytes, None]:\n while True:\n message = await queue.get()\n if message is None:\n break\n assert message[\"type\"] == \"http.response.body\"\n yield message.get(\"body\", b\"\")\n task.result()\n\n response = StreamingResponse(\n status_code=message[\"status\"], content=body_stream()\n )\n response.raw_headers = message[\"headers\"]\n return response\n\n async def dispatch(\n self, request: Request, call_next: RequestResponseEndpoint\n ) -> Response:\n raise NotImplementedError() # pragma: no cover\n", "path": "starlette/middleware/base.py"}], "after_files": [{"content": "import asyncio\nimport typing\n\nfrom starlette.requests import Request\nfrom starlette.responses import Response, StreamingResponse\nfrom starlette.types import ASGIApp, Message, Receive, Scope, Send\n\nRequestResponseEndpoint = typing.Callable[[Request], typing.Awaitable[Response]]\nDispatchFunction = typing.Callable[\n [Request, RequestResponseEndpoint], typing.Awaitable[Response]\n]\n\n\nclass BaseHTTPMiddleware:\n def __init__(self, app: ASGIApp, dispatch: DispatchFunction = None) -> None:\n self.app = app\n self.dispatch_func = self.dispatch if dispatch is None else dispatch\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if scope[\"type\"] != \"http\":\n await self.app(scope, receive, send)\n return\n\n request = Request(scope, receive=receive)\n response = await self.dispatch_func(request, self.call_next)\n await response(scope, receive, send)\n\n async def call_next(self, request: Request) -> Response:\n loop = asyncio.get_event_loop()\n queue: \"asyncio.Queue[typing.Optional[Message]]\" = asyncio.Queue(maxsize=1)\n\n scope = request.scope\n receive = request.receive\n send = queue.put\n\n async def coro() -> None:\n try:\n await self.app(scope, 
receive, send)\n finally:\n await queue.put(None)\n\n task = loop.create_task(coro())\n message = await queue.get()\n if message is None:\n task.result()\n raise RuntimeError(\"No response returned.\")\n assert message[\"type\"] == \"http.response.start\"\n\n async def body_stream() -> typing.AsyncGenerator[bytes, None]:\n while True:\n message = await queue.get()\n if message is None:\n break\n assert message[\"type\"] == \"http.response.body\"\n yield message.get(\"body\", b\"\")\n task.result()\n\n response = StreamingResponse(\n status_code=message[\"status\"], content=body_stream()\n )\n response.raw_headers = message[\"headers\"]\n return response\n\n async def dispatch(\n self, request: Request, call_next: RequestResponseEndpoint\n ) -> Response:\n raise NotImplementedError() # pragma: no cover\n", "path": "starlette/middleware/base.py"}]}
| 1,165 | 206 |
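The one-line change in this row (an unbounded `asyncio.Queue()` becoming `asyncio.Queue(maxsize=1)`) is what restores backpressure. The standalone sketch below is not Starlette code; it only demonstrates the queue behaviour the patch relies on: with `maxsize=1`, `put()` suspends until the consumer has taken the previous item, so a slow reader throttles the producer instead of letting chunks pile up in memory.

```python
import asyncio

async def producer(queue: asyncio.Queue) -> None:
    for chunk in range(5):
        await queue.put(chunk)      # suspends here while the queue is full
        print(f"produced {chunk}")
    await queue.put(None)           # sentinel: stream finished

async def consumer(queue: asyncio.Queue) -> None:
    while True:
        chunk = await queue.get()
        if chunk is None:
            break
        await asyncio.sleep(0.1)    # simulate a slow client connection
        print(f"consumed {chunk}")

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=1)
    await asyncio.gather(producer(queue), consumer(queue))

asyncio.run(main())
```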
gh_patches_debug_15866
|
rasdani/github-patches
|
git_diff
|
tornadoweb__tornado-2653
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tornado.platform.twisted disappeared but did not explain its replacement
Many years' worth of documentation explains that folks can do
```
from tornado.platform.twisted import install
reactor = install()
```
I can see that in https://github.com/tornadoweb/tornado/commit/004de9c301cc4c2dae4d8f2507af1851d0c9763a#diff-77b5a8a33248ef0bcafbc1bb71e9f013 Twisted integration was removed, since we can all depend on the stdlib loop APIs. This is great, but it also breaks a bunch of Jupyter notebooks, tutorials, etc.
Could you be convinced to replace all those sprawling APIs with something like this:
```python3
def install():
from twisted.internet.asyncioreactor import install
install()
from twisted.internet import reactor
reactor.startRunning()
return reactor
```
possibly with a `warnings.warn` explaining that users could just call these APIs directly, if that's the desired end-state?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tornado/platform/twisted.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License"); you may
2 # not use this file except in compliance with the License. You may obtain
3 # a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
9 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
10 # License for the specific language governing permissions and limitations
11 # under the License.
12 """Bridges between the Twisted package and Tornado.
13 """
14
15 import socket
16 import sys
17
18 import twisted.internet.abstract # type: ignore
19 import twisted.internet.asyncioreactor # type: ignore
20 from twisted.internet.defer import Deferred # type: ignore
21 from twisted.python import failure # type: ignore
22 import twisted.names.cache # type: ignore
23 import twisted.names.client # type: ignore
24 import twisted.names.hosts # type: ignore
25 import twisted.names.resolve # type: ignore
26
27
28 from tornado.concurrent import Future, future_set_exc_info
29 from tornado.escape import utf8
30 from tornado import gen
31 from tornado.netutil import Resolver
32
33 import typing
34
35 if typing.TYPE_CHECKING:
36 from typing import Generator, Any, List, Tuple # noqa: F401
37
38
39 class TwistedResolver(Resolver):
40 """Twisted-based asynchronous resolver.
41
42 This is a non-blocking and non-threaded resolver. It is
43 recommended only when threads cannot be used, since it has
44 limitations compared to the standard ``getaddrinfo``-based
45 `~tornado.netutil.Resolver` and
46 `~tornado.netutil.DefaultExecutorResolver`. Specifically, it returns at
47 most one result, and arguments other than ``host`` and ``family``
48 are ignored. It may fail to resolve when ``family`` is not
49 ``socket.AF_UNSPEC``.
50
51 Requires Twisted 12.1 or newer.
52
53 .. versionchanged:: 5.0
54 The ``io_loop`` argument (deprecated since version 4.1) has been removed.
55 """
56
57 def initialize(self) -> None:
58 # partial copy of twisted.names.client.createResolver, which doesn't
59 # allow for a reactor to be passed in.
60 self.reactor = twisted.internet.asyncioreactor.AsyncioSelectorReactor()
61
62 host_resolver = twisted.names.hosts.Resolver("/etc/hosts")
63 cache_resolver = twisted.names.cache.CacheResolver(reactor=self.reactor)
64 real_resolver = twisted.names.client.Resolver(
65 "/etc/resolv.conf", reactor=self.reactor
66 )
67 self.resolver = twisted.names.resolve.ResolverChain(
68 [host_resolver, cache_resolver, real_resolver]
69 )
70
71 @gen.coroutine
72 def resolve(
73 self, host: str, port: int, family: int = 0
74 ) -> "Generator[Any, Any, List[Tuple[int, Any]]]":
75 # getHostByName doesn't accept IP addresses, so if the input
76 # looks like an IP address just return it immediately.
77 if twisted.internet.abstract.isIPAddress(host):
78 resolved = host
79 resolved_family = socket.AF_INET
80 elif twisted.internet.abstract.isIPv6Address(host):
81 resolved = host
82 resolved_family = socket.AF_INET6
83 else:
84 deferred = self.resolver.getHostByName(utf8(host))
85 fut = Future() # type: Future[Any]
86 deferred.addBoth(fut.set_result)
87 resolved = yield fut
88 if isinstance(resolved, failure.Failure):
89 try:
90 resolved.raiseException()
91 except twisted.names.error.DomainError as e:
92 raise IOError(e)
93 elif twisted.internet.abstract.isIPAddress(resolved):
94 resolved_family = socket.AF_INET
95 elif twisted.internet.abstract.isIPv6Address(resolved):
96 resolved_family = socket.AF_INET6
97 else:
98 resolved_family = socket.AF_UNSPEC
99 if family != socket.AF_UNSPEC and family != resolved_family:
100 raise Exception(
101 "Requested socket family %d but got %d" % (family, resolved_family)
102 )
103 result = [(typing.cast(int, resolved_family), (resolved, port))]
104 return result
105
106
107 if hasattr(gen.convert_yielded, "register"):
108
109 @gen.convert_yielded.register(Deferred) # type: ignore
110 def _(d: Deferred) -> Future:
111 f = Future() # type: Future[Any]
112
113 def errback(failure: failure.Failure) -> None:
114 try:
115 failure.raiseException()
116 # Should never happen, but just in case
117 raise Exception("errback called without error")
118 except:
119 future_set_exc_info(f, sys.exc_info())
120
121 d.addCallbacks(f.set_result, errback)
122 return f
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/tornado/platform/twisted.py b/tornado/platform/twisted.py
--- a/tornado/platform/twisted.py
+++ b/tornado/platform/twisted.py
@@ -104,6 +104,30 @@
return result
+def install() -> None:
+ """Install ``AsyncioSelectorReactor`` as the default Twisted reactor.
+
+ .. deprecated:: 5.1
+
+ This function is provided for backwards compatibility; code
+ that does not require compatibility with older versions of
+ Tornado should use
+ ``twisted.internet.asyncioreactor.install()`` directly.
+
+ .. versionchanged:: 6.0.3
+
+ In Tornado 5.x and before, this function installed a reactor
+ based on the Tornado ``IOLoop``. When that reactor
+ implementation was removed in Tornado 6.0.0, this function was
+ removed as well. It was restored in Tornado 6.0.3 using the
+ ``asyncio`` reactor instead.
+
+ """
+ from twisted.internet.asyncioreactor import install
+
+ install()
+
+
if hasattr(gen.convert_yielded, "register"):
@gen.convert_yielded.register(Deferred) # type: ignore
|
{"golden_diff": "diff --git a/tornado/platform/twisted.py b/tornado/platform/twisted.py\n--- a/tornado/platform/twisted.py\n+++ b/tornado/platform/twisted.py\n@@ -104,6 +104,30 @@\n return result\n \n \n+def install() -> None:\n+ \"\"\"Install ``AsyncioSelectorReactor`` as the default Twisted reactor.\n+\n+ .. deprecated:: 5.1\n+\n+ This function is provided for backwards compatibility; code\n+ that does not require compatibility with older versions of\n+ Tornado should use\n+ ``twisted.internet.asyncioreactor.install()`` directly.\n+\n+ .. versionchanged:: 6.0.3\n+\n+ In Tornado 5.x and before, this function installed a reactor\n+ based on the Tornado ``IOLoop``. When that reactor\n+ implementation was removed in Tornado 6.0.0, this function was\n+ removed as well. It was restored in Tornado 6.0.3 using the\n+ ``asyncio`` reactor instead.\n+\n+ \"\"\"\n+ from twisted.internet.asyncioreactor import install\n+\n+ install()\n+\n+\n if hasattr(gen.convert_yielded, \"register\"):\n \n @gen.convert_yielded.register(Deferred) # type: ignore\n", "issue": "tornado.platform.twisted disappeared but did not explain its replacement\nMany years' worth of documentation explains that folks can do\r\n\r\n```\r\nfrom tornado.platform.twisted import install\r\nreactor = install()\r\n```\r\n\r\nI can see that in https://github.com/tornadoweb/tornado/commit/004de9c301cc4c2dae4d8f2507af1851d0c9763a#diff-77b5a8a33248ef0bcafbc1bb71e9f013 Twisted integration was removed, since we can all depend on the stdlib loop APIs. This is great, but it also breaks a bunch of Jupyter notebooks, tutorials, etc.\r\n\r\nCould you be convinced to replace all those sprawling APIs with something like this:\r\n\r\n```python3\r\ndef install():\r\n from twisted.internet.asyncioreactor import install\r\n install()\r\n from twisted.internet import reactor\r\n reactor.startRunning()\r\n return reactor\r\n```\r\n\r\npossibly with a `warnings.warn` explaining that users could just call these APIs directly, if that's the desired end-state?\n", "before_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the\n# License for the specific language governing permissions and limitations\n# under the License.\n\"\"\"Bridges between the Twisted package and Tornado.\n\"\"\"\n\nimport socket\nimport sys\n\nimport twisted.internet.abstract # type: ignore\nimport twisted.internet.asyncioreactor # type: ignore\nfrom twisted.internet.defer import Deferred # type: ignore\nfrom twisted.python import failure # type: ignore\nimport twisted.names.cache # type: ignore\nimport twisted.names.client # type: ignore\nimport twisted.names.hosts # type: ignore\nimport twisted.names.resolve # type: ignore\n\n\nfrom tornado.concurrent import Future, future_set_exc_info\nfrom tornado.escape import utf8\nfrom tornado import gen\nfrom tornado.netutil import Resolver\n\nimport typing\n\nif typing.TYPE_CHECKING:\n from typing import Generator, Any, List, Tuple # noqa: F401\n\n\nclass TwistedResolver(Resolver):\n \"\"\"Twisted-based asynchronous resolver.\n\n This is a non-blocking and non-threaded resolver. 
It is\n recommended only when threads cannot be used, since it has\n limitations compared to the standard ``getaddrinfo``-based\n `~tornado.netutil.Resolver` and\n `~tornado.netutil.DefaultExecutorResolver`. Specifically, it returns at\n most one result, and arguments other than ``host`` and ``family``\n are ignored. It may fail to resolve when ``family`` is not\n ``socket.AF_UNSPEC``.\n\n Requires Twisted 12.1 or newer.\n\n .. versionchanged:: 5.0\n The ``io_loop`` argument (deprecated since version 4.1) has been removed.\n \"\"\"\n\n def initialize(self) -> None:\n # partial copy of twisted.names.client.createResolver, which doesn't\n # allow for a reactor to be passed in.\n self.reactor = twisted.internet.asyncioreactor.AsyncioSelectorReactor()\n\n host_resolver = twisted.names.hosts.Resolver(\"/etc/hosts\")\n cache_resolver = twisted.names.cache.CacheResolver(reactor=self.reactor)\n real_resolver = twisted.names.client.Resolver(\n \"/etc/resolv.conf\", reactor=self.reactor\n )\n self.resolver = twisted.names.resolve.ResolverChain(\n [host_resolver, cache_resolver, real_resolver]\n )\n\n @gen.coroutine\n def resolve(\n self, host: str, port: int, family: int = 0\n ) -> \"Generator[Any, Any, List[Tuple[int, Any]]]\":\n # getHostByName doesn't accept IP addresses, so if the input\n # looks like an IP address just return it immediately.\n if twisted.internet.abstract.isIPAddress(host):\n resolved = host\n resolved_family = socket.AF_INET\n elif twisted.internet.abstract.isIPv6Address(host):\n resolved = host\n resolved_family = socket.AF_INET6\n else:\n deferred = self.resolver.getHostByName(utf8(host))\n fut = Future() # type: Future[Any]\n deferred.addBoth(fut.set_result)\n resolved = yield fut\n if isinstance(resolved, failure.Failure):\n try:\n resolved.raiseException()\n except twisted.names.error.DomainError as e:\n raise IOError(e)\n elif twisted.internet.abstract.isIPAddress(resolved):\n resolved_family = socket.AF_INET\n elif twisted.internet.abstract.isIPv6Address(resolved):\n resolved_family = socket.AF_INET6\n else:\n resolved_family = socket.AF_UNSPEC\n if family != socket.AF_UNSPEC and family != resolved_family:\n raise Exception(\n \"Requested socket family %d but got %d\" % (family, resolved_family)\n )\n result = [(typing.cast(int, resolved_family), (resolved, port))]\n return result\n\n\nif hasattr(gen.convert_yielded, \"register\"):\n\n @gen.convert_yielded.register(Deferred) # type: ignore\n def _(d: Deferred) -> Future:\n f = Future() # type: Future[Any]\n\n def errback(failure: failure.Failure) -> None:\n try:\n failure.raiseException()\n # Should never happen, but just in case\n raise Exception(\"errback called without error\")\n except:\n future_set_exc_info(f, sys.exc_info())\n\n d.addCallbacks(f.set_result, errback)\n return f\n", "path": "tornado/platform/twisted.py"}], "after_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the\n# License for the specific language governing permissions and limitations\n# under the License.\n\"\"\"Bridges between the Twisted package and Tornado.\n\"\"\"\n\nimport socket\nimport sys\n\nimport twisted.internet.abstract # type: ignore\nimport twisted.internet.asyncioreactor # type: ignore\nfrom twisted.internet.defer import Deferred # type: ignore\nfrom twisted.python import failure # type: ignore\nimport twisted.names.cache # type: ignore\nimport twisted.names.client # type: ignore\nimport twisted.names.hosts # type: ignore\nimport twisted.names.resolve # type: ignore\n\n\nfrom tornado.concurrent import Future, future_set_exc_info\nfrom tornado.escape import utf8\nfrom tornado import gen\nfrom tornado.netutil import Resolver\n\nimport typing\n\nif typing.TYPE_CHECKING:\n from typing import Generator, Any, List, Tuple # noqa: F401\n\n\nclass TwistedResolver(Resolver):\n \"\"\"Twisted-based asynchronous resolver.\n\n This is a non-blocking and non-threaded resolver. It is\n recommended only when threads cannot be used, since it has\n limitations compared to the standard ``getaddrinfo``-based\n `~tornado.netutil.Resolver` and\n `~tornado.netutil.DefaultExecutorResolver`. Specifically, it returns at\n most one result, and arguments other than ``host`` and ``family``\n are ignored. It may fail to resolve when ``family`` is not\n ``socket.AF_UNSPEC``.\n\n Requires Twisted 12.1 or newer.\n\n .. versionchanged:: 5.0\n The ``io_loop`` argument (deprecated since version 4.1) has been removed.\n \"\"\"\n\n def initialize(self) -> None:\n # partial copy of twisted.names.client.createResolver, which doesn't\n # allow for a reactor to be passed in.\n self.reactor = twisted.internet.asyncioreactor.AsyncioSelectorReactor()\n\n host_resolver = twisted.names.hosts.Resolver(\"/etc/hosts\")\n cache_resolver = twisted.names.cache.CacheResolver(reactor=self.reactor)\n real_resolver = twisted.names.client.Resolver(\n \"/etc/resolv.conf\", reactor=self.reactor\n )\n self.resolver = twisted.names.resolve.ResolverChain(\n [host_resolver, cache_resolver, real_resolver]\n )\n\n @gen.coroutine\n def resolve(\n self, host: str, port: int, family: int = 0\n ) -> \"Generator[Any, Any, List[Tuple[int, Any]]]\":\n # getHostByName doesn't accept IP addresses, so if the input\n # looks like an IP address just return it immediately.\n if twisted.internet.abstract.isIPAddress(host):\n resolved = host\n resolved_family = socket.AF_INET\n elif twisted.internet.abstract.isIPv6Address(host):\n resolved = host\n resolved_family = socket.AF_INET6\n else:\n deferred = self.resolver.getHostByName(utf8(host))\n fut = Future() # type: Future[Any]\n deferred.addBoth(fut.set_result)\n resolved = yield fut\n if isinstance(resolved, failure.Failure):\n try:\n resolved.raiseException()\n except twisted.names.error.DomainError as e:\n raise IOError(e)\n elif twisted.internet.abstract.isIPAddress(resolved):\n resolved_family = socket.AF_INET\n elif twisted.internet.abstract.isIPv6Address(resolved):\n resolved_family = socket.AF_INET6\n else:\n resolved_family = socket.AF_UNSPEC\n if family != socket.AF_UNSPEC and family != resolved_family:\n raise Exception(\n \"Requested socket family %d but got %d\" % (family, resolved_family)\n )\n result = [(typing.cast(int, resolved_family), (resolved, port))]\n return result\n\n\ndef install() -> None:\n \"\"\"Install ``AsyncioSelectorReactor`` as the default Twisted reactor.\n\n .. 
deprecated:: 5.1\n\n This function is provided for backwards compatibility; code\n that does not require compatibility with older versions of\n Tornado should use\n ``twisted.internet.asyncioreactor.install()`` directly.\n\n .. versionchanged:: 6.0.3\n\n In Tornado 5.x and before, this function installed a reactor\n based on the Tornado ``IOLoop``. When that reactor\n implementation was removed in Tornado 6.0.0, this function was\n removed as well. It was restored in Tornado 6.0.3 using the\n ``asyncio`` reactor instead.\n\n \"\"\"\n from twisted.internet.asyncioreactor import install\n\n install()\n\n\nif hasattr(gen.convert_yielded, \"register\"):\n\n @gen.convert_yielded.register(Deferred) # type: ignore\n def _(d: Deferred) -> Future:\n f = Future() # type: Future[Any]\n\n def errback(failure: failure.Failure) -> None:\n try:\n failure.raiseException()\n # Should never happen, but just in case\n raise Exception(\"errback called without error\")\n except:\n future_set_exc_info(f, sys.exc_info())\n\n d.addCallbacks(f.set_result, errback)\n return f\n", "path": "tornado/platform/twisted.py"}]}
| 1,797 | 289 |
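For reference, a hedged sketch of how the restored helper is meant to be used. Note one difference from the snippet quoted in the issue: per the patch above, `install()` now returns `None`, so the reactor is obtained by importing it after the call rather than from the return value. This assumes Twisted is installed and that no other reactor has been installed in the process yet.

```python
from tornado.platform.twisted import install

install()  # installs twisted.internet.asyncioreactor.AsyncioSelectorReactor globally

# Import the reactor only after install(), otherwise Twisted picks its default reactor.
from twisted.internet import reactor

print(type(reactor))  # expected: AsyncioSelectorReactor
```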
gh_patches_debug_335
|
rasdani/github-patches
|
git_diff
|
pymodbus-dev__pymodbus-1395
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pip show pymodbus, misses information.
```
pymodbus) pymodbus % pip show pymodbus
Name: pymodbus
Version: 3.1.x
Summary: A fully featured modbus protocol stack in python
Home-page: https://github.com/pymodbus-dev/pymodbus/
Author: attr: pymodbus.__author__
Author-email:
License: BSD-3-Clause
Location: /Users/jan/repos/pymodbus
Editable project location: /Users/jan/repos/pymodbus
Requires: setuptools
Required-by:
```
Normally it gets the information from setup.cfg, but for some reason it does not work with "pip show".
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pymodbus/__init__.py`
Content:
```
1 """Pymodbus: Modbus Protocol Implementation.
2
3 Released under the the BSD license
4 """
5
6 from logging import WARNING
7
8 import pymodbus.version as __version
9 from pymodbus.logging import Log
10
11
12 __version__ = __version.version.short()
13 __author__ = "Galen Collins"
14 __maintainer__ = "dhoomakethu, janiversen"
15
16
17 def pymodbus_apply_logging_config(level=WARNING):
18 """Apply basic logging configuration used by default by Pymodbus maintainers.
19
20 Please call this function to format logging appropriately when opening issues.
21 """
22 Log.apply_logging_config(level)
23
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pymodbus/__init__.py b/pymodbus/__init__.py
--- a/pymodbus/__init__.py
+++ b/pymodbus/__init__.py
@@ -10,7 +10,7 @@
__version__ = __version.version.short()
-__author__ = "Galen Collins"
+__author__ = "Galen Collins, Jan Iversen"
__maintainer__ = "dhoomakethu, janiversen"
|
{"golden_diff": "diff --git a/pymodbus/__init__.py b/pymodbus/__init__.py\n--- a/pymodbus/__init__.py\n+++ b/pymodbus/__init__.py\n@@ -10,7 +10,7 @@\n \n \n __version__ = __version.version.short()\n-__author__ = \"Galen Collins\"\n+__author__ = \"Galen Collins, Jan Iversen\"\n __maintainer__ = \"dhoomakethu, janiversen\"\n", "issue": "pip show pymodbus, misses information.\n```\r\npymodbus) pymodbus % pip show pymodbus\r\n\r\nName: pymodbus\r\nVersion: 3.1.x\r\nSummary: A fully featured modbus protocol stack in python\r\nHome-page: https://github.com/pymodbus-dev/pymodbus/\r\nAuthor: attr: pymodbus.__author__\r\nAuthor-email: \r\nLicense: BSD-3-Clause\r\nLocation: /Users/jan/repos/pymodbus\r\nEditable project location: /Users/jan/repos/pymodbus\r\nRequires: setuptools\r\nRequired-by: \r\n```\r\nNormally it gets the information from setup.cfg, but for some reason it does not work with \"pip show\".\n", "before_files": [{"content": "\"\"\"Pymodbus: Modbus Protocol Implementation.\n\nReleased under the the BSD license\n\"\"\"\n\nfrom logging import WARNING\n\nimport pymodbus.version as __version\nfrom pymodbus.logging import Log\n\n\n__version__ = __version.version.short()\n__author__ = \"Galen Collins\"\n__maintainer__ = \"dhoomakethu, janiversen\"\n\n\ndef pymodbus_apply_logging_config(level=WARNING):\n \"\"\"Apply basic logging configuration used by default by Pymodbus maintainers.\n\n Please call this function to format logging appropriately when opening issues.\n \"\"\"\n Log.apply_logging_config(level)\n", "path": "pymodbus/__init__.py"}], "after_files": [{"content": "\"\"\"Pymodbus: Modbus Protocol Implementation.\n\nReleased under the the BSD license\n\"\"\"\n\nfrom logging import WARNING\n\nimport pymodbus.version as __version\nfrom pymodbus.logging import Log\n\n\n__version__ = __version.version.short()\n__author__ = \"Galen Collins, Jan Iversen\"\n__maintainer__ = \"dhoomakethu, janiversen\"\n\n\ndef pymodbus_apply_logging_config(level=WARNING):\n \"\"\"Apply basic logging configuration used by default by Pymodbus maintainers.\n\n Please call this function to format logging appropriately when opening issues.\n \"\"\"\n Log.apply_logging_config(level)\n", "path": "pymodbus/__init__.py"}]}
| 581 | 107 |
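The visible fix only touches `__author__`, but the symptom in the issue (the literal text `attr: pymodbus.__author__` showing up in `pip show`) comes from package metadata that was never resolved at build time. A small, hypothetical check, assuming pymodbus is installed in the current environment and Python 3.8 or newer, that the resolved metadata looks right:

```python
from importlib.metadata import metadata

meta = metadata("pymodbus")
print(meta["Author"])   # should be a real name, not the literal "attr: pymodbus.__author__"
print(meta["Version"])  # e.g. "3.1.x", matching what pip show reports

import pymodbus
print(pymodbus.__author__)  # "Galen Collins, Jan Iversen" after the patch
```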
gh_patches_debug_26733
|
rasdani/github-patches
|
git_diff
|
pytorch__vision-2696
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Reading PNG/JPG images into a torch::tensor and saving a torch::tensor to PNG/JPG in C++ without OpenCV
## 🚀 Feature
After integrating Siv3D with Libtorch (https://github.com/QuantScientist/Siv3DTorch) I am now trying to read and write images from and to Siv3D **in C++, not Python**. The way it works is:
## Motivation
In C++ I need to do the following:
1. An image is read from disk (usually using OpenCV which is easy but I am trying to avoid)
2. The image is converted to torch::tensor
3. A DL model is applied on the tensor
4. A tensor is returned from the model
5. The tensor is converted to an image for display/saving purposes.
This is one example where they used stb_image to do this, avoiding the use of OpenCV.
https://github.com/prabhuomkar/pytorch-cpp/blob/master/utils/image_io/src/image_io.cpp
## Pitch
## Alternatives
For reference this is the OpenCV to Libtorch conversion utils which I use, I would like something very similiar:
```
at::Tensor matToTensor(cv::Mat frame, int h, int w, int c) {
cv::cvtColor(frame, frame, CV_BGR2RGB);
frame.convertTo(frame, CV_32FC3, 1.0f / 255.0f);
auto input_tensor = torch::from_blob(frame.data, {1, h, w, c});
input_tensor = input_tensor.permute({0, 3, 1, 2});
torch::DeviceType device_type = torch::kCPU;
// if (torch::cuda::is_available()) {
device_type = torch::kCUDA;
// }
input_tensor = input_tensor.to(device_type);
return input_tensor;
}
cv::Mat tensorToOpenCv(at::Tensor out_tensor, int h, int w, int c) {
out_tensor = out_tensor.squeeze().detach().permute({1, 2, 0});
out_tensor = out_tensor.mul(255).clamp(0, 255).to(torch::kU8);
out_tensor = out_tensor.to(torch::kCPU);
cv::Mat resultImg(h, w, CV_8UC3);
// cv::Mat resultImg(h, w, CV_8UC1);
std::memcpy((void *) resultImg.data, out_tensor.data_ptr(), sizeof(torch::kU8) * out_tensor.numel());
return resultImg;
}
```
## Additional context
I found this:"https://github.com/pytorch/vision/blob/5e4a9f6d1a2bf85137f4826dbf76e4f25986f878/torchvision/csrc/cpu/image/readpng_cpu.cpp
however, could not get any useful method out of it.
Thanks,
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/io/image.py`
Content:
```
1 import torch
2
3 import os
4 import os.path as osp
5 import importlib.machinery
6
7 _HAS_IMAGE_OPT = False
8
9 try:
10 lib_dir = osp.join(osp.dirname(__file__), "..")
11
12 loader_details = (
13 importlib.machinery.ExtensionFileLoader,
14 importlib.machinery.EXTENSION_SUFFIXES
15 )
16
17 extfinder = importlib.machinery.FileFinder(lib_dir, loader_details) # type: ignore[arg-type]
18 ext_specs = extfinder.find_spec("image")
19 if ext_specs is not None:
20 torch.ops.load_library(ext_specs.origin)
21 _HAS_IMAGE_OPT = True
22 except (ImportError, OSError):
23 pass
24
25
26 def decode_png(input: torch.Tensor) -> torch.Tensor:
27 """
28 Decodes a PNG image into a 3 dimensional RGB Tensor.
29 The values of the output tensor are uint8 between 0 and 255.
30
31 Arguments:
32 input (Tensor[1]): a one dimensional int8 tensor containing
33 the raw bytes of the PNG image.
34
35 Returns:
36 output (Tensor[3, image_height, image_width])
37 """
38 if not isinstance(input, torch.Tensor) or input.numel() == 0 or input.ndim != 1: # type: ignore[attr-defined]
39 raise ValueError("Expected a non empty 1-dimensional tensor.")
40
41 if not input.dtype == torch.uint8:
42 raise ValueError("Expected a torch.uint8 tensor.")
43 output = torch.ops.image.decode_png(input)
44 return output
45
46
47 def read_png(path: str) -> torch.Tensor:
48 """
49 Reads a PNG image into a 3 dimensional RGB Tensor.
50 The values of the output tensor are uint8 between 0 and 255.
51
52 Arguments:
53 path (str): path of the PNG image.
54
55 Returns:
56 output (Tensor[3, image_height, image_width])
57 """
58 if not os.path.isfile(path):
59 raise ValueError("Expected a valid file path.")
60
61 size = os.path.getsize(path)
62 if size == 0:
63 raise ValueError("Expected a non empty file.")
64 data = torch.from_file(path, dtype=torch.uint8, size=size)
65 return decode_png(data)
66
67
68 def decode_jpeg(input: torch.Tensor) -> torch.Tensor:
69 """
70 Decodes a JPEG image into a 3 dimensional RGB Tensor.
71 The values of the output tensor are uint8 between 0 and 255.
72 Arguments:
73 input (Tensor[1]): a one dimensional int8 tensor containing
74 the raw bytes of the JPEG image.
75 Returns:
76 output (Tensor[3, image_height, image_width])
77 """
78 if not isinstance(input, torch.Tensor) or len(input) == 0 or input.ndim != 1: # type: ignore[attr-defined]
79 raise ValueError("Expected a non empty 1-dimensional tensor.")
80
81 if not input.dtype == torch.uint8:
82 raise ValueError("Expected a torch.uint8 tensor.")
83
84 output = torch.ops.image.decode_jpeg(input)
85 return output
86
87
88 def read_jpeg(path: str) -> torch.Tensor:
89 """
90 Reads a JPEG image into a 3 dimensional RGB Tensor.
91 The values of the output tensor are uint8 between 0 and 255.
92 Arguments:
93 path (str): path of the JPEG image.
94 Returns:
95 output (Tensor[3, image_height, image_width])
96 """
97 if not os.path.isfile(path):
98 raise ValueError("Expected a valid file path.")
99
100 size = os.path.getsize(path)
101 if size == 0:
102 raise ValueError("Expected a non empty file.")
103 data = torch.from_file(path, dtype=torch.uint8, size=size)
104 return decode_jpeg(data)
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/torchvision/io/image.py b/torchvision/io/image.py
--- a/torchvision/io/image.py
+++ b/torchvision/io/image.py
@@ -102,3 +102,42 @@
raise ValueError("Expected a non empty file.")
data = torch.from_file(path, dtype=torch.uint8, size=size)
return decode_jpeg(data)
+
+
+def encode_jpeg(input: torch.Tensor, quality: int = 75) -> torch.Tensor:
+ """
+ Takes an input tensor in CHW layout (or HW in the case of grayscale images)
+ and returns a buffer with the contents of its corresponding JPEG file.
+ Arguments:
+ input (Tensor[channels, image_height, image_width]): int8 image tensor
+ of `c` channels, where `c` must be 1 or 3.
+ quality (int): Quality of the resulting JPEG file, it must be a number
+ between 1 and 100. Default: 75
+ Returns
+ output (Tensor[1]): A one dimensional int8 tensor that contains the raw
+ bytes of the JPEG file.
+ """
+ if quality < 1 or quality > 100:
+ raise ValueError('Image quality should be a positive number '
+ 'between 1 and 100')
+
+ output = torch.ops.image.encode_jpeg(input, quality)
+ return output
+
+
+def write_jpeg(input: torch.Tensor, filename: str, quality: int = 75):
+ """
+ Takes an input tensor in CHW layout (or HW in the case of grayscale images)
+ and saves it in a JPEG file.
+ Arguments:
+ input (Tensor[channels, image_height, image_width]): int8 image tensor
+ of `c` channels, where `c` must be 1 or 3.
+ filename (str): Path to save the image.
+ quality (int): Quality of the resulting JPEG file, it must be a number
+ between 1 and 100. Default: 75
+ """
+ if quality < 1 or quality > 100:
+ raise ValueError('Image quality should be a positive number '
+ 'between 1 and 100')
+
+ torch.ops.image.write_jpeg(input, filename, quality)
|
{"golden_diff": "diff --git a/torchvision/io/image.py b/torchvision/io/image.py\n--- a/torchvision/io/image.py\n+++ b/torchvision/io/image.py\n@@ -102,3 +102,42 @@\n raise ValueError(\"Expected a non empty file.\")\n data = torch.from_file(path, dtype=torch.uint8, size=size)\n return decode_jpeg(data)\n+\n+\n+def encode_jpeg(input: torch.Tensor, quality: int = 75) -> torch.Tensor:\n+ \"\"\"\n+ Takes an input tensor in CHW layout (or HW in the case of grayscale images)\n+ and returns a buffer with the contents of its corresponding JPEG file.\n+ Arguments:\n+ input (Tensor[channels, image_height, image_width]): int8 image tensor\n+ of `c` channels, where `c` must be 1 or 3.\n+ quality (int): Quality of the resulting JPEG file, it must be a number\n+ between 1 and 100. Default: 75\n+ Returns\n+ output (Tensor[1]): A one dimensional int8 tensor that contains the raw\n+ bytes of the JPEG file.\n+ \"\"\"\n+ if quality < 1 or quality > 100:\n+ raise ValueError('Image quality should be a positive number '\n+ 'between 1 and 100')\n+\n+ output = torch.ops.image.encode_jpeg(input, quality)\n+ return output\n+\n+\n+def write_jpeg(input: torch.Tensor, filename: str, quality: int = 75):\n+ \"\"\"\n+ Takes an input tensor in CHW layout (or HW in the case of grayscale images)\n+ and saves it in a JPEG file.\n+ Arguments:\n+ input (Tensor[channels, image_height, image_width]): int8 image tensor\n+ of `c` channels, where `c` must be 1 or 3.\n+ filename (str): Path to save the image.\n+ quality (int): Quality of the resulting JPEG file, it must be a number\n+ between 1 and 100. Default: 75\n+ \"\"\"\n+ if quality < 1 or quality > 100:\n+ raise ValueError('Image quality should be a positive number '\n+ 'between 1 and 100')\n+\n+ torch.ops.image.write_jpeg(input, filename, quality)\n", "issue": "Reading PNG/JPG images into a torch::tensor and saving a torch::tensor to PNG/JPG in C++ without OpenCV \n## \ud83d\ude80 Feature\r\n\r\nAfter integrating Siv3D with Libtorch (https://github.com/QuantScientist/Siv3DTorch) I am now trying to read and write images from and to Siv3D **in C++, not Python**. The way it works is:\r\n\r\n## Motivation\r\nIn C++ I need to do the following:\r\n1. An image is read from disk (usually using OpenCV which is easy but I am trying to avoid)\r\n2. The image is converted to torch::tensor\r\n3. A DL model is applied on the tensor\r\n4. A tensor is returned from the model\r\n5. 
The tensor is converted to an image for display/saving purposes.\r\n\r\nThis is one example where they used stb_image to this, avoiding the use of OpenCV.\r\nhttps://github.com/prabhuomkar/pytorch-cpp/blob/master/utils/image_io/src/image_io.cpp\r\n\r\n## Pitch\r\n\r\n## Alternatives\r\n\r\nFor reference this is the OpenCV to Libtorch conversion utils which I use, I would like something very similiar:\r\n```\r\nat::Tensor matToTensor(cv::Mat frame, int h, int w, int c) {\r\n cv::cvtColor(frame, frame, CV_BGR2RGB);\r\n frame.convertTo(frame, CV_32FC3, 1.0f / 255.0f);\r\n auto input_tensor = torch::from_blob(frame.data, {1, h, w, c});\r\n input_tensor = input_tensor.permute({0, 3, 1, 2});\r\n\r\n torch::DeviceType device_type = torch::kCPU;\r\n// if (torch::cuda::is_available()) {\r\n device_type = torch::kCUDA;\r\n// }\r\n input_tensor = input_tensor.to(device_type);\r\n return input_tensor;\r\n}\r\n\r\ncv::Mat tensorToOpenCv(at::Tensor out_tensor, int h, int w, int c) {\r\n out_tensor = out_tensor.squeeze().detach().permute({1, 2, 0});\r\n out_tensor = out_tensor.mul(255).clamp(0, 255).to(torch::kU8);\r\n out_tensor = out_tensor.to(torch::kCPU);\r\n cv::Mat resultImg(h, w, CV_8UC3);\r\n // cv::Mat resultImg(h, w, CV_8UC1);\r\n std::memcpy((void *) resultImg.data, out_tensor.data_ptr(), sizeof(torch::kU8) * out_tensor.numel());\r\n return resultImg;\r\n}\r\n\r\n```\r\n## Additional context\r\n\r\nI found this:\"https://github.com/pytorch/vision/blob/5e4a9f6d1a2bf85137f4826dbf76e4f25986f878/torchvision/csrc/cpu/image/readpng_cpu.cpp\r\nhowever, could not get any useful method out of it. \r\n\r\nThanks, \r\n\r\n\r\n\r\n\n", "before_files": [{"content": "import torch\n\nimport os\nimport os.path as osp\nimport importlib.machinery\n\n_HAS_IMAGE_OPT = False\n\ntry:\n lib_dir = osp.join(osp.dirname(__file__), \"..\")\n\n loader_details = (\n importlib.machinery.ExtensionFileLoader,\n importlib.machinery.EXTENSION_SUFFIXES\n )\n\n extfinder = importlib.machinery.FileFinder(lib_dir, loader_details) # type: ignore[arg-type]\n ext_specs = extfinder.find_spec(\"image\")\n if ext_specs is not None:\n torch.ops.load_library(ext_specs.origin)\n _HAS_IMAGE_OPT = True\nexcept (ImportError, OSError):\n pass\n\n\ndef decode_png(input: torch.Tensor) -> torch.Tensor:\n \"\"\"\n Decodes a PNG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n\n Arguments:\n input (Tensor[1]): a one dimensional int8 tensor containing\n the raw bytes of the PNG image.\n\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not isinstance(input, torch.Tensor) or input.numel() == 0 or input.ndim != 1: # type: ignore[attr-defined]\n raise ValueError(\"Expected a non empty 1-dimensional tensor.\")\n\n if not input.dtype == torch.uint8:\n raise ValueError(\"Expected a torch.uint8 tensor.\")\n output = torch.ops.image.decode_png(input)\n return output\n\n\ndef read_png(path: str) -> torch.Tensor:\n \"\"\"\n Reads a PNG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n\n Arguments:\n path (str): path of the PNG image.\n\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not os.path.isfile(path):\n raise ValueError(\"Expected a valid file path.\")\n\n size = os.path.getsize(path)\n if size == 0:\n raise ValueError(\"Expected a non empty file.\")\n data = torch.from_file(path, dtype=torch.uint8, size=size)\n return decode_png(data)\n\n\ndef decode_jpeg(input: torch.Tensor) -> torch.Tensor:\n 
\"\"\"\n Decodes a JPEG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n Arguments:\n input (Tensor[1]): a one dimensional int8 tensor containing\n the raw bytes of the JPEG image.\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not isinstance(input, torch.Tensor) or len(input) == 0 or input.ndim != 1: # type: ignore[attr-defined]\n raise ValueError(\"Expected a non empty 1-dimensional tensor.\")\n\n if not input.dtype == torch.uint8:\n raise ValueError(\"Expected a torch.uint8 tensor.\")\n\n output = torch.ops.image.decode_jpeg(input)\n return output\n\n\ndef read_jpeg(path: str) -> torch.Tensor:\n \"\"\"\n Reads a JPEG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n Arguments:\n path (str): path of the JPEG image.\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not os.path.isfile(path):\n raise ValueError(\"Expected a valid file path.\")\n\n size = os.path.getsize(path)\n if size == 0:\n raise ValueError(\"Expected a non empty file.\")\n data = torch.from_file(path, dtype=torch.uint8, size=size)\n return decode_jpeg(data)\n", "path": "torchvision/io/image.py"}], "after_files": [{"content": "import torch\n\nimport os\nimport os.path as osp\nimport importlib.machinery\n\n_HAS_IMAGE_OPT = False\n\ntry:\n lib_dir = osp.join(osp.dirname(__file__), \"..\")\n\n loader_details = (\n importlib.machinery.ExtensionFileLoader,\n importlib.machinery.EXTENSION_SUFFIXES\n )\n\n extfinder = importlib.machinery.FileFinder(lib_dir, loader_details) # type: ignore[arg-type]\n ext_specs = extfinder.find_spec(\"image\")\n if ext_specs is not None:\n torch.ops.load_library(ext_specs.origin)\n _HAS_IMAGE_OPT = True\nexcept (ImportError, OSError):\n pass\n\n\ndef decode_png(input: torch.Tensor) -> torch.Tensor:\n \"\"\"\n Decodes a PNG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n\n Arguments:\n input (Tensor[1]): a one dimensional int8 tensor containing\n the raw bytes of the PNG image.\n\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not isinstance(input, torch.Tensor) or input.numel() == 0 or input.ndim != 1: # type: ignore[attr-defined]\n raise ValueError(\"Expected a non empty 1-dimensional tensor.\")\n\n if not input.dtype == torch.uint8:\n raise ValueError(\"Expected a torch.uint8 tensor.\")\n output = torch.ops.image.decode_png(input)\n return output\n\n\ndef read_png(path: str) -> torch.Tensor:\n \"\"\"\n Reads a PNG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n\n Arguments:\n path (str): path of the PNG image.\n\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not os.path.isfile(path):\n raise ValueError(\"Expected a valid file path.\")\n\n size = os.path.getsize(path)\n if size == 0:\n raise ValueError(\"Expected a non empty file.\")\n data = torch.from_file(path, dtype=torch.uint8, size=size)\n return decode_png(data)\n\n\ndef decode_jpeg(input: torch.Tensor) -> torch.Tensor:\n \"\"\"\n Decodes a JPEG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n Arguments:\n input (Tensor[1]): a one dimensional int8 tensor containing\n the raw bytes of the JPEG image.\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not isinstance(input, torch.Tensor) or len(input) == 0 or input.ndim != 1: # type: 
ignore[attr-defined]\n raise ValueError(\"Expected a non empty 1-dimensional tensor.\")\n\n if not input.dtype == torch.uint8:\n raise ValueError(\"Expected a torch.uint8 tensor.\")\n\n output = torch.ops.image.decode_jpeg(input)\n return output\n\n\ndef read_jpeg(path: str) -> torch.Tensor:\n \"\"\"\n Reads a JPEG image into a 3 dimensional RGB Tensor.\n The values of the output tensor are uint8 between 0 and 255.\n Arguments:\n path (str): path of the JPEG image.\n Returns:\n output (Tensor[3, image_height, image_width])\n \"\"\"\n if not os.path.isfile(path):\n raise ValueError(\"Expected a valid file path.\")\n\n size = os.path.getsize(path)\n if size == 0:\n raise ValueError(\"Expected a non empty file.\")\n data = torch.from_file(path, dtype=torch.uint8, size=size)\n return decode_jpeg(data)\n\n\ndef encode_jpeg(input: torch.Tensor, quality: int = 75) -> torch.Tensor:\n \"\"\"\n Takes an input tensor in CHW layout (or HW in the case of grayscale images)\n and returns a buffer with the contents of its corresponding JPEG file.\n Arguments:\n input (Tensor[channels, image_height, image_width]): int8 image tensor\n of `c` channels, where `c` must be 1 or 3.\n quality (int): Quality of the resulting JPEG file, it must be a number\n between 1 and 100. Default: 75\n Returns\n output (Tensor[1]): A one dimensional int8 tensor that contains the raw\n bytes of the JPEG file.\n \"\"\"\n if quality < 1 or quality > 100:\n raise ValueError('Image quality should be a positive number '\n 'between 1 and 100')\n\n output = torch.ops.image.encode_jpeg(input, quality)\n return output\n\n\ndef write_jpeg(input: torch.Tensor, filename: str, quality: int = 75):\n \"\"\"\n Takes an input tensor in CHW layout (or HW in the case of grayscale images)\n and saves it in a JPEG file.\n Arguments:\n input (Tensor[channels, image_height, image_width]): int8 image tensor\n of `c` channels, where `c` must be 1 or 3.\n filename (str): Path to save the image.\n quality (int): Quality of the resulting JPEG file, it must be a number\n between 1 and 100. Default: 75\n \"\"\"\n if quality < 1 or quality > 100:\n raise ValueError('Image quality should be a positive number '\n 'between 1 and 100')\n\n torch.ops.image.write_jpeg(input, filename, quality)\n", "path": "torchvision/io/image.py"}]}
| 1,904 | 525 |
gh_patches_debug_33263
|
rasdani/github-patches
|
git_diff
|
kserve__kserve-2684
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
GPL License Violation in the kserve python package
The version of the `kserve` package that is currently on PyPI (version `0.10`) violates the GPL license because it depends on [`table-logger`](https://github.com/AleksTk/table-logger), distributed under GPLv2 (you'll see that the library is now MIT, the author updated the license just a few days ago, but hasn't released a new version with the new license yet). No GPLv2 packages should be imported given that `kserve` has an Apache 2 license.
This was recently fixed by this PR https://github.com/kserve/kserve/pull/2673, which accidentally resolved the issue by replacing `table-logger` with `tabulate` (MIT License)
cc @yuzisun @cliveseldon @jinchihe @ellistarn
Is it possible to quickly release a patch release `0.10.1` to include the above patch and make sure `kserve` is compliant with the Apache license? As it stands, any distribution and vendor using `kserve` is liable for a license violation.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/kserve/kserve/api/watch.py`
Content:
```
1 # Copyright 2021 The KServe Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import time
16 from kubernetes import client
17 from kubernetes import watch as k8s_watch
18 from table_logger import TableLogger
19
20 from ..constants import constants
21 from ..utils import utils
22
23
24 def isvc_watch(name=None, namespace=None, timeout_seconds=600, generation=0):
25 """Watch the created or patched InferenceService in the specified namespace"""
26
27 if namespace is None:
28 namespace = utils.get_default_target_namespace()
29
30 tbl = TableLogger(
31 columns='NAME,READY,PREV,LATEST,URL',
32 colwidth={'NAME': 20, 'READY': 10, 'PREV': 25, 'LATEST': 25, 'URL': 65},
33 border=False)
34
35 stream = k8s_watch.Watch().stream(
36 client.CustomObjectsApi().list_namespaced_custom_object,
37 constants.KSERVE_GROUP,
38 constants.KSERVE_V1BETA1_VERSION,
39 namespace,
40 constants.KSERVE_PLURAL,
41 timeout_seconds=timeout_seconds)
42
43 for event in stream:
44 isvc = event['object']
45 isvc_name = isvc['metadata']['name']
46 if name and name != isvc_name:
47 continue
48 else:
49 status = 'Unknown'
50 if isvc.get('status', ''):
51 url = isvc['status'].get('url', '')
52 traffic = isvc['status'].get('components', {}).get(
53 'predictor', {}).get('traffic', [])
54 traffic_percent = 100
55 if constants.OBSERVED_GENERATION in isvc['status']:
56 observed_generation = isvc['status'][constants.OBSERVED_GENERATION]
57 for t in traffic:
58 if t["latestRevision"]:
59 traffic_percent = t["percent"]
60
61 if generation != 0 and observed_generation != generation:
62 continue
63 for condition in isvc['status'].get('conditions', {}):
64 if condition.get('type', '') == 'Ready':
65 status = condition.get('status', 'Unknown')
66 tbl(isvc_name, status, 100-traffic_percent, traffic_percent, url)
67 if status == 'True':
68 break
69
70 else:
71 tbl(isvc_name, status, '', '', '')
72 # Sleep 2 to avoid status section is not generated within a very short time.
73 time.sleep(2)
74 continue
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/python/kserve/kserve/api/watch.py b/python/kserve/kserve/api/watch.py
--- a/python/kserve/kserve/api/watch.py
+++ b/python/kserve/kserve/api/watch.py
@@ -13,9 +13,10 @@
# limitations under the License.
import time
+
from kubernetes import client
from kubernetes import watch as k8s_watch
-from table_logger import TableLogger
+from tabulate import tabulate
from ..constants import constants
from ..utils import utils
@@ -27,10 +28,8 @@
if namespace is None:
namespace = utils.get_default_target_namespace()
- tbl = TableLogger(
- columns='NAME,READY,PREV,LATEST,URL',
- colwidth={'NAME': 20, 'READY': 10, 'PREV': 25, 'LATEST': 25, 'URL': 65},
- border=False)
+ headers = ['NAME', 'READY', 'PREV', 'LATEST', 'URL']
+ table_fmt = 'plain'
stream = k8s_watch.Watch().stream(
client.CustomObjectsApi().list_namespaced_custom_object,
@@ -63,12 +62,13 @@
for condition in isvc['status'].get('conditions', {}):
if condition.get('type', '') == 'Ready':
status = condition.get('status', 'Unknown')
- tbl(isvc_name, status, 100-traffic_percent, traffic_percent, url)
+ print(tabulate([[isvc_name, status, 100 - traffic_percent, traffic_percent, url]],
+ headers=headers, tablefmt=table_fmt))
if status == 'True':
break
else:
- tbl(isvc_name, status, '', '', '')
+ print(tabulate([[isvc_name, status, '', '', '']], headers=headers, tablefmt=table_fmt))
# Sleep 2 to avoid status section is not generated within a very short time.
time.sleep(2)
continue
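As a side note, a minimal sketch of the `tabulate` call pattern that replaces `TableLogger` in this diff; the row values are made up for illustration:

```python
# Assumes `tabulate` is installed; the values below are placeholders, not real cluster output.
from tabulate import tabulate

headers = ['NAME', 'READY', 'PREV', 'LATEST', 'URL']
row = [['sklearn-iris', 'True', 0, 100, 'http://sklearn-iris.default.example.com']]
print(tabulate(row, headers=headers, tablefmt='plain'))
```

Using `tablefmt='plain'` keeps the output close to the borderless columns the old `TableLogger(border=False)` call produced.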
|
{"golden_diff": "diff --git a/python/kserve/kserve/api/watch.py b/python/kserve/kserve/api/watch.py\n--- a/python/kserve/kserve/api/watch.py\n+++ b/python/kserve/kserve/api/watch.py\n@@ -13,9 +13,10 @@\n # limitations under the License.\n \n import time\n+\n from kubernetes import client\n from kubernetes import watch as k8s_watch\n-from table_logger import TableLogger\n+from tabulate import tabulate\n \n from ..constants import constants\n from ..utils import utils\n@@ -27,10 +28,8 @@\n if namespace is None:\n namespace = utils.get_default_target_namespace()\n \n- tbl = TableLogger(\n- columns='NAME,READY,PREV,LATEST,URL',\n- colwidth={'NAME': 20, 'READY': 10, 'PREV': 25, 'LATEST': 25, 'URL': 65},\n- border=False)\n+ headers = ['NAME', 'READY', 'PREV', 'LATEST', 'URL']\n+ table_fmt = 'plain'\n \n stream = k8s_watch.Watch().stream(\n client.CustomObjectsApi().list_namespaced_custom_object,\n@@ -63,12 +62,13 @@\n for condition in isvc['status'].get('conditions', {}):\n if condition.get('type', '') == 'Ready':\n status = condition.get('status', 'Unknown')\n- tbl(isvc_name, status, 100-traffic_percent, traffic_percent, url)\n+ print(tabulate([[isvc_name, status, 100 - traffic_percent, traffic_percent, url]],\n+ headers=headers, tablefmt=table_fmt))\n if status == 'True':\n break\n \n else:\n- tbl(isvc_name, status, '', '', '')\n+ print(tabulate([[isvc_name, status, '', '', '']], headers=headers, tablefmt=table_fmt))\n # Sleep 2 to avoid status section is not generated within a very short time.\n time.sleep(2)\n continue\n", "issue": "GPL License Violation in the kserve python package\nThe version of the `kserve` package that is currently on PyPI (version `0.10`) violates the GPL license because it depends on [`table-logger`](https://github.com/AleksTk/table-logger), distributed under GPLv2 (you'll see that the library is now MIT, the author updated the license just a few days ago, but hasn't released a new version with the new license yet). No GPLv2 packages should be imported given that `kserve` has an Apache 2 license.\r\n\r\n\r\nThis was recently fixed by this PR https://github.com/kserve/kserve/pull/2673, which accidentally resolved the issue by replacing `table-logger` with `tabulate` (MIT License)\r\n\r\ncc @yuzisun @cliveseldon @jinchihe @ellistarn \r\n\r\nIs it possible to quickly release a patch release `0.10.1` to include the above patch and make sure `kserve` is compliant with the Apache license? 
As it stands, any distribution and vendor using `kserve` is liable for a license violation.\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\nfrom kubernetes import client\nfrom kubernetes import watch as k8s_watch\nfrom table_logger import TableLogger\n\nfrom ..constants import constants\nfrom ..utils import utils\n\n\ndef isvc_watch(name=None, namespace=None, timeout_seconds=600, generation=0):\n \"\"\"Watch the created or patched InferenceService in the specified namespace\"\"\"\n\n if namespace is None:\n namespace = utils.get_default_target_namespace()\n\n tbl = TableLogger(\n columns='NAME,READY,PREV,LATEST,URL',\n colwidth={'NAME': 20, 'READY': 10, 'PREV': 25, 'LATEST': 25, 'URL': 65},\n border=False)\n\n stream = k8s_watch.Watch().stream(\n client.CustomObjectsApi().list_namespaced_custom_object,\n constants.KSERVE_GROUP,\n constants.KSERVE_V1BETA1_VERSION,\n namespace,\n constants.KSERVE_PLURAL,\n timeout_seconds=timeout_seconds)\n\n for event in stream:\n isvc = event['object']\n isvc_name = isvc['metadata']['name']\n if name and name != isvc_name:\n continue\n else:\n status = 'Unknown'\n if isvc.get('status', ''):\n url = isvc['status'].get('url', '')\n traffic = isvc['status'].get('components', {}).get(\n 'predictor', {}).get('traffic', [])\n traffic_percent = 100\n if constants.OBSERVED_GENERATION in isvc['status']:\n observed_generation = isvc['status'][constants.OBSERVED_GENERATION]\n for t in traffic:\n if t[\"latestRevision\"]:\n traffic_percent = t[\"percent\"]\n\n if generation != 0 and observed_generation != generation:\n continue\n for condition in isvc['status'].get('conditions', {}):\n if condition.get('type', '') == 'Ready':\n status = condition.get('status', 'Unknown')\n tbl(isvc_name, status, 100-traffic_percent, traffic_percent, url)\n if status == 'True':\n break\n\n else:\n tbl(isvc_name, status, '', '', '')\n # Sleep 2 to avoid status section is not generated within a very short time.\n time.sleep(2)\n continue\n", "path": "python/kserve/kserve/api/watch.py"}], "after_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\n\nfrom kubernetes import client\nfrom kubernetes import watch as k8s_watch\nfrom tabulate import tabulate\n\nfrom ..constants import constants\nfrom ..utils import utils\n\n\ndef isvc_watch(name=None, namespace=None, timeout_seconds=600, generation=0):\n \"\"\"Watch the 
created or patched InferenceService in the specified namespace\"\"\"\n\n if namespace is None:\n namespace = utils.get_default_target_namespace()\n\n headers = ['NAME', 'READY', 'PREV', 'LATEST', 'URL']\n table_fmt = 'plain'\n\n stream = k8s_watch.Watch().stream(\n client.CustomObjectsApi().list_namespaced_custom_object,\n constants.KSERVE_GROUP,\n constants.KSERVE_V1BETA1_VERSION,\n namespace,\n constants.KSERVE_PLURAL,\n timeout_seconds=timeout_seconds)\n\n for event in stream:\n isvc = event['object']\n isvc_name = isvc['metadata']['name']\n if name and name != isvc_name:\n continue\n else:\n status = 'Unknown'\n if isvc.get('status', ''):\n url = isvc['status'].get('url', '')\n traffic = isvc['status'].get('components', {}).get(\n 'predictor', {}).get('traffic', [])\n traffic_percent = 100\n if constants.OBSERVED_GENERATION in isvc['status']:\n observed_generation = isvc['status'][constants.OBSERVED_GENERATION]\n for t in traffic:\n if t[\"latestRevision\"]:\n traffic_percent = t[\"percent\"]\n\n if generation != 0 and observed_generation != generation:\n continue\n for condition in isvc['status'].get('conditions', {}):\n if condition.get('type', '') == 'Ready':\n status = condition.get('status', 'Unknown')\n print(tabulate([[isvc_name, status, 100 - traffic_percent, traffic_percent, url]],\n headers=headers, tablefmt=table_fmt))\n if status == 'True':\n break\n\n else:\n print(tabulate([[isvc_name, status, '', '', '']], headers=headers, tablefmt=table_fmt))\n # Sleep 2 to avoid status section is not generated within a very short time.\n time.sleep(2)\n continue\n", "path": "python/kserve/kserve/api/watch.py"}]}
| 1,283 | 450 |
gh_patches_debug_27026
|
rasdani/github-patches
|
git_diff
|
DataDog__integrations-core-619
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
postfix integration should not require sudo to root
Reading the source code of integrations-core/postfix/check.py, I note that it does a sudo to root to run the find command.
This is noted in the docs / comments:
> WARNING: the user that dd-agent runs as must have sudo access for the 'find' command
> sudo access is not required when running dd-agent as root (not recommended)
>
> example /etc/sudoers entry:
> dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f
root should not be required here - postfix user should be sufficient. That would be combined with a '-u postfix' on line 64's sudo command to allow this to work.
This is a concern because find has a -exec parameter and your command list has a wildcard in it - this could be used to run arbitrary commands as root if the dd-agent user is compromised.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `postfix/check.py`
Content:
```
1 # (C) Datadog, Inc. 2013-2016
2 # (C) Josiah C Webb <[email protected]> 2013
3 # All rights reserved
4 # Licensed under Simplified BSD License (see LICENSE)
5
6 # stdlib
7 import os
8
9 # project
10 from checks import AgentCheck
11 from utils.subprocess_output import get_subprocess_output
12
13 class PostfixCheck(AgentCheck):
14 """This check provides metrics on the number of messages in a given postfix queue
15
16 WARNING: the user that dd-agent runs as must have sudo access for the 'find' command
17 sudo access is not required when running dd-agent as root (not recommended)
18
19 example /etc/sudoers entry:
20 dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f
21
22 YAML config options:
23 "directory" - the value of 'postconf -h queue_directory'
24 "queues" - the postfix mail queues you would like to get message count totals for
25 """
26 def check(self, instance):
27 config = self._get_config(instance)
28
29 directory = config['directory']
30 queues = config['queues']
31 tags = config['tags']
32
33 self._get_queue_count(directory, queues, tags)
34
35 def _get_config(self, instance):
36 directory = instance.get('directory', None)
37 queues = instance.get('queues', None)
38 tags = instance.get('tags', [])
39 if not queues or not directory:
40 raise Exception('missing required yaml config entry')
41
42 instance_config = {
43 'directory': directory,
44 'queues': queues,
45 'tags': tags,
46 }
47
48 return instance_config
49
50 def _get_queue_count(self, directory, queues, tags):
51 for queue in queues:
52 queue_path = os.path.join(directory, queue)
53 if not os.path.exists(queue_path):
54 raise Exception('%s does not exist' % queue_path)
55
56 count = 0
57 if os.geteuid() == 0:
58 # dd-agent is running as root (not recommended)
59 count = sum(len(files) for root, dirs, files in os.walk(queue_path))
60 else:
61 # can dd-agent user run sudo?
62 test_sudo = os.system('setsid sudo -l < /dev/null')
63 if test_sudo == 0:
64 output, _, _ = get_subprocess_output(['sudo', 'find', queue_path, '-type', 'f'], self.log, False)
65 count = len(output.splitlines())
66 else:
67 raise Exception('The dd-agent user does not have sudo access')
68
69 # emit an individually tagged metric
70 self.gauge('postfix.queue.size', count, tags=tags + ['queue:%s' % queue, 'instance:%s' % os.path.basename(directory)])
71
72 # these can be retrieved in a single graph statement
73 # for example:
74 # sum:postfix.queue.size{instance:postfix-2,queue:incoming,host:hostname.domain.tld}
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/postfix/check.py b/postfix/check.py
--- a/postfix/check.py
+++ b/postfix/check.py
@@ -17,7 +17,9 @@
sudo access is not required when running dd-agent as root (not recommended)
example /etc/sudoers entry:
- dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f
+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/incoming -type f
+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/active -type f
+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/deferred -type f
YAML config options:
"directory" - the value of 'postconf -h queue_directory'
@@ -61,7 +63,9 @@
# can dd-agent user run sudo?
test_sudo = os.system('setsid sudo -l < /dev/null')
if test_sudo == 0:
- output, _, _ = get_subprocess_output(['sudo', 'find', queue_path, '-type', 'f'], self.log, False)
+ # default to `root` for backward compatibility
+ postfix_user = self.init_config.get('postfix_user', 'root')
+ output, _, _ = get_subprocess_output(['sudo', '-u', postfix_user, 'find', queue_path, '-type', 'f'], self.log, False)
count = len(output.splitlines())
else:
raise Exception('The dd-agent user does not have sudo access')
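A standalone sketch of the non-root `find` invocation this patch enables; the user and queue path are example values, and actually running it requires a matching sudoers rule:

```python
# Illustrative only: mirrors the patched call without the agent's subprocess helpers.
import subprocess

postfix_user = 'postfix'                    # hypothetical init_config 'postfix_user' value
queue_path = '/var/spool/postfix/incoming'  # one of the configured queue directories
output = subprocess.check_output(
    ['sudo', '-u', postfix_user, 'find', queue_path, '-type', 'f'], text=True)
print(len(output.splitlines()), 'messages in', queue_path)
```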
|
{"golden_diff": "diff --git a/postfix/check.py b/postfix/check.py\n--- a/postfix/check.py\n+++ b/postfix/check.py\n@@ -17,7 +17,9 @@\n sudo access is not required when running dd-agent as root (not recommended)\n \n example /etc/sudoers entry:\n- dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f\n+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/incoming -type f\n+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/active -type f\n+ dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/deferred -type f\n \n YAML config options:\n \"directory\" - the value of 'postconf -h queue_directory'\n@@ -61,7 +63,9 @@\n # can dd-agent user run sudo?\n test_sudo = os.system('setsid sudo -l < /dev/null')\n if test_sudo == 0:\n- output, _, _ = get_subprocess_output(['sudo', 'find', queue_path, '-type', 'f'], self.log, False)\n+ # default to `root` for backward compatibility\n+ postfix_user = self.init_config.get('postfix_user', 'root')\n+ output, _, _ = get_subprocess_output(['sudo', '-u', postfix_user, 'find', queue_path, '-type', 'f'], self.log, False)\n count = len(output.splitlines())\n else:\n raise Exception('The dd-agent user does not have sudo access')\n", "issue": "postfix integration should not require sudo to root\nReading the source code to integrations-core/postfix/check.py I note that it does a sudo to root to run the find command.\r\n\r\nThis is noted in the docs / comments :\r\n\r\n> WARNING: the user that dd-agent runs as must have sudo access for the 'find' command\r\n> --\r\n> \u00a0 | sudo access is not required when running dd-agent as root (not recommended)\r\n> \u00a0 | \u00a0\r\n> \u00a0 | example /etc/sudoers entry:\r\n> \u00a0 | dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f\r\n\r\nroot should not be required here - postfix user should be sufficient. That would be combined with a '-u postfix' on line 64's sudo command to allow this to work.\r\n\r\nThis is a concern because find has a -exec parameter and your command list has a wildcard in it - this could be used to run arbitrary commands as root if the dd-agent user is compromised.\r\n\n", "before_files": [{"content": "# (C) Datadog, Inc. 
2013-2016\n# (C) Josiah C Webb <[email protected]> 2013\n# All rights reserved\n# Licensed under Simplified BSD License (see LICENSE)\n\n# stdlib\nimport os\n\n# project\nfrom checks import AgentCheck\nfrom utils.subprocess_output import get_subprocess_output\n\nclass PostfixCheck(AgentCheck):\n \"\"\"This check provides metrics on the number of messages in a given postfix queue\n\n WARNING: the user that dd-agent runs as must have sudo access for the 'find' command\n sudo access is not required when running dd-agent as root (not recommended)\n\n example /etc/sudoers entry:\n dd-agent ALL=(ALL) NOPASSWD:/usr/bin/find /var/spool/postfix* -type f\n\n YAML config options:\n \"directory\" - the value of 'postconf -h queue_directory'\n \"queues\" - the postfix mail queues you would like to get message count totals for\n \"\"\"\n def check(self, instance):\n config = self._get_config(instance)\n\n directory = config['directory']\n queues = config['queues']\n tags = config['tags']\n\n self._get_queue_count(directory, queues, tags)\n\n def _get_config(self, instance):\n directory = instance.get('directory', None)\n queues = instance.get('queues', None)\n tags = instance.get('tags', [])\n if not queues or not directory:\n raise Exception('missing required yaml config entry')\n\n instance_config = {\n 'directory': directory,\n 'queues': queues,\n 'tags': tags,\n }\n\n return instance_config\n\n def _get_queue_count(self, directory, queues, tags):\n for queue in queues:\n queue_path = os.path.join(directory, queue)\n if not os.path.exists(queue_path):\n raise Exception('%s does not exist' % queue_path)\n\n count = 0\n if os.geteuid() == 0:\n # dd-agent is running as root (not recommended)\n count = sum(len(files) for root, dirs, files in os.walk(queue_path))\n else:\n # can dd-agent user run sudo?\n test_sudo = os.system('setsid sudo -l < /dev/null')\n if test_sudo == 0:\n output, _, _ = get_subprocess_output(['sudo', 'find', queue_path, '-type', 'f'], self.log, False)\n count = len(output.splitlines())\n else:\n raise Exception('The dd-agent user does not have sudo access')\n\n # emit an individually tagged metric\n self.gauge('postfix.queue.size', count, tags=tags + ['queue:%s' % queue, 'instance:%s' % os.path.basename(directory)])\n\n # these can be retrieved in a single graph statement\n # for example:\n # sum:postfix.queue.size{instance:postfix-2,queue:incoming,host:hostname.domain.tld}\n", "path": "postfix/check.py"}], "after_files": [{"content": "# (C) Datadog, Inc. 
2013-2016\n# (C) Josiah C Webb <[email protected]> 2013\n# All rights reserved\n# Licensed under Simplified BSD License (see LICENSE)\n\n# stdlib\nimport os\n\n# project\nfrom checks import AgentCheck\nfrom utils.subprocess_output import get_subprocess_output\n\nclass PostfixCheck(AgentCheck):\n \"\"\"This check provides metrics on the number of messages in a given postfix queue\n\n WARNING: the user that dd-agent runs as must have sudo access for the 'find' command\n sudo access is not required when running dd-agent as root (not recommended)\n\n example /etc/sudoers entry:\n dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/incoming -type f\n dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/active -type f\n dd-agent ALL=(postfix) NOPASSWD:/usr/bin/find /var/spool/postfix/deferred -type f\n\n YAML config options:\n \"directory\" - the value of 'postconf -h queue_directory'\n \"queues\" - the postfix mail queues you would like to get message count totals for\n \"\"\"\n def check(self, instance):\n config = self._get_config(instance)\n\n directory = config['directory']\n queues = config['queues']\n tags = config['tags']\n\n self._get_queue_count(directory, queues, tags)\n\n def _get_config(self, instance):\n directory = instance.get('directory', None)\n queues = instance.get('queues', None)\n tags = instance.get('tags', [])\n if not queues or not directory:\n raise Exception('missing required yaml config entry')\n\n instance_config = {\n 'directory': directory,\n 'queues': queues,\n 'tags': tags,\n }\n\n return instance_config\n\n def _get_queue_count(self, directory, queues, tags):\n for queue in queues:\n queue_path = os.path.join(directory, queue)\n if not os.path.exists(queue_path):\n raise Exception('%s does not exist' % queue_path)\n\n count = 0\n if os.geteuid() == 0:\n # dd-agent is running as root (not recommended)\n count = sum(len(files) for root, dirs, files in os.walk(queue_path))\n else:\n # can dd-agent user run sudo?\n test_sudo = os.system('setsid sudo -l < /dev/null')\n if test_sudo == 0:\n # default to `root` for backward compatibility\n postfix_user = self.init_config.get('postfix_user', 'root')\n output, _, _ = get_subprocess_output(['sudo', '-u', postfix_user, 'find', queue_path, '-type', 'f'], self.log, False)\n count = len(output.splitlines())\n else:\n raise Exception('The dd-agent user does not have sudo access')\n\n # emit an individually tagged metric\n self.gauge('postfix.queue.size', count, tags=tags + ['queue:%s' % queue, 'instance:%s' % os.path.basename(directory)])\n\n # these can be retrieved in a single graph statement\n # for example:\n # sum:postfix.queue.size{instance:postfix-2,queue:incoming,host:hostname.domain.tld}\n", "path": "postfix/check.py"}]}
| 1,265 | 368 |
gh_patches_debug_15930
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-4591
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[question][suggestion] Listing profile should be recursive
Hello,
I use multiple profiles, and those are organized in subdirectories:
+ `~/.conan/profiles/application/x64_gcc6_app1`
+ `~/.conan/profiles/application/x64_msvc_app1`
+ `~/.conan/profiles/compilers/x64_gcc6`
+ `~/.conan/profiles/compilers/x64_msvc`
The "applications" profile include other profiles etc. This works pretty well, so I assume
using subdirectory in profiles is supported and is not a problem.
However, the `conan profile list` command does not list profiles contained in subdirectories.
I believe it should recursively search for profile files rather than only list the files available directly in the `~/.conan/profiles` directory.
I'm wondering if there is a particular reason why the search is limited to the `~/.conan/profiles` directory and if you'd be open to changing this behavior.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conans/client/cmd/profile.py`
Content:
```
1 import os
2
3 from conans.client.conf.detect import detect_defaults_settings
4 from conans.client.profile_loader import get_profile_path, read_profile
5 from conans.errors import ConanException
6 from conans.model.options import OptionsValues
7 from conans.model.profile import Profile
8 from conans.unicode import get_cwd
9 from conans.util.files import save
10
11
12 def _get_profile_keys(key):
13 # settings.compiler.version => settings, compiler.version
14 tmp = key.split(".")
15 first_key = tmp[0]
16 rest_key = ".".join(tmp[1:]) if len(tmp) > 1 else None
17 if first_key not in ("build_requires", "settings", "options", "env"):
18 raise ConanException("Invalid specified key: %s" % key)
19
20 return first_key, rest_key
21
22
23 def cmd_profile_list(cache_profiles_path, output):
24 folder = cache_profiles_path
25 if os.path.exists(folder):
26 return [name for name in os.listdir(folder)
27 if not os.path.isdir(os.path.join(folder, name))]
28 else:
29 output.info("No profiles defined")
30 return []
31
32
33 def cmd_profile_create(profile_name, cache_profiles_path, output, detect=False):
34 profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd(),
35 exists=False)
36 if os.path.exists(profile_path):
37 raise ConanException("Profile already exists")
38
39 profile = Profile()
40 if detect:
41 settings = detect_defaults_settings(output)
42 for name, value in settings:
43 profile.settings[name] = value
44
45 contents = profile.dumps()
46 save(profile_path, contents)
47
48 if detect:
49 output.info("Profile created with detected settings: %s" % profile_path)
50 else:
51 output.info("Empty profile created: %s" % profile_path)
52 return profile_path
53
54
55 def cmd_profile_update(profile_name, key, value, cache_profiles_path):
56 first_key, rest_key = _get_profile_keys(key)
57
58 profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)
59 if first_key == "settings":
60 profile.settings[rest_key] = value
61 elif first_key == "options":
62 tmp = OptionsValues([(rest_key, value)])
63 profile.options.update(tmp)
64 elif first_key == "env":
65 profile.env_values.update_replace(rest_key, value)
66 elif first_key == "build_requires":
67 raise ConanException("Edit the profile manually to change the build_requires")
68
69 contents = profile.dumps()
70 profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())
71 save(profile_path, contents)
72
73
74 def cmd_profile_get(profile_name, key, cache_profiles_path):
75 first_key, rest_key = _get_profile_keys(key)
76 profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)
77 try:
78 if first_key == "settings":
79 return profile.settings[rest_key]
80 elif first_key == "options":
81 return dict(profile.options.as_list())[rest_key]
82 elif first_key == "env":
83 package = None
84 var = rest_key
85 if ":" in rest_key:
86 package, var = rest_key.split(":")
87 return profile.env_values.data[package][var]
88 elif first_key == "build_requires":
89 raise ConanException("List the profile manually to see the build_requires")
90 except KeyError:
91 raise ConanException("Key not found: '%s'" % key)
92
93
94 def cmd_profile_delete_key(profile_name, key, cache_profiles_path):
95 first_key, rest_key = _get_profile_keys(key)
96 profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)
97
98 try:
99 package, name = rest_key.split(":")
100 except ValueError:
101 package = None
102 name = rest_key
103
104 try:
105 if first_key == "settings":
106 del profile.settings[rest_key]
107 elif first_key == "options":
108 profile.options.remove(name, package)
109 elif first_key == "env":
110 profile.env_values.remove(name, package)
111 elif first_key == "build_requires":
112 raise ConanException("Edit the profile manually to delete a build_require")
113 except KeyError:
114 raise ConanException("Profile key '%s' doesn't exist" % key)
115
116 contents = profile.dumps()
117 profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())
118 save(profile_path, contents)
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/conans/client/cmd/profile.py b/conans/client/cmd/profile.py
--- a/conans/client/cmd/profile.py
+++ b/conans/client/cmd/profile.py
@@ -21,13 +21,18 @@
def cmd_profile_list(cache_profiles_path, output):
- folder = cache_profiles_path
- if os.path.exists(folder):
- return [name for name in os.listdir(folder)
- if not os.path.isdir(os.path.join(folder, name))]
- else:
+ profiles = []
+ if os.path.exists(cache_profiles_path):
+ for current_directory, _, files in os.walk(cache_profiles_path, followlinks=True):
+ for filename in files:
+ rel_path = os.path.relpath(os.path.join(current_directory, filename),
+ cache_profiles_path)
+ profiles.append(rel_path)
+
+ if not profiles:
output.info("No profiles defined")
- return []
+ profiles.sort()
+ return profiles
def cmd_profile_create(profile_name, cache_profiles_path, output, detect=False):
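A minimal sketch of the recursive listing behaviour this diff introduces, runnable outside Conan; the profiles directory is an example path:

```python
# Illustrative only: reproduces the os.walk/relpath logic from the patch.
import os

cache_profiles_path = os.path.expanduser('~/.conan/profiles')
profiles = []
for current_directory, _, files in os.walk(cache_profiles_path, followlinks=True):
    for filename in files:
        profiles.append(os.path.relpath(os.path.join(current_directory, filename),
                                        cache_profiles_path))
profiles.sort()
print(profiles)  # e.g. ['application/x64_gcc6_app1', 'compilers/x64_gcc6', ...]
```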
|
{"golden_diff": "diff --git a/conans/client/cmd/profile.py b/conans/client/cmd/profile.py\n--- a/conans/client/cmd/profile.py\n+++ b/conans/client/cmd/profile.py\n@@ -21,13 +21,18 @@\n \n \n def cmd_profile_list(cache_profiles_path, output):\n- folder = cache_profiles_path\n- if os.path.exists(folder):\n- return [name for name in os.listdir(folder)\n- if not os.path.isdir(os.path.join(folder, name))]\n- else:\n+ profiles = []\n+ if os.path.exists(cache_profiles_path):\n+ for current_directory, _, files in os.walk(cache_profiles_path, followlinks=True):\n+ for filename in files:\n+ rel_path = os.path.relpath(os.path.join(current_directory, filename),\n+ cache_profiles_path)\n+ profiles.append(rel_path)\n+\n+ if not profiles:\n output.info(\"No profiles defined\")\n- return []\n+ profiles.sort()\n+ return profiles\n \n \n def cmd_profile_create(profile_name, cache_profiles_path, output, detect=False):\n", "issue": "[question][suggestion] Listing profile should be recursive\nHello,\r\n\r\nI use multiple profiles, and those are organized in subdirectories:\r\n + `~/.conan/profiles/application/x64_gcc6_app1`\r\n + `~/.conan/profiles/application/x64_msvc_app1`\r\n + `~/.conan/profiles/compilers/x64_gcc6`\r\n + `~/.conan/profiles/compilers/x64_msvc`\r\n\r\nThe \"applications\" profile include other profiles etc. This works pretty well, so I assume \r\nusing subdirectory in profiles is supported and is not a problem.\r\n\r\nHowever, the `conan profile list` command does not list profiles contained in subdirectories.\r\nI believe it should recursively search for profile files rather than only list the files available directly in the `~/.conan/profiles` directory.\r\n\r\nI'm wondering if there is a particular reason why the search is limited to the `~/.conan/profiles` directory and if you'd be open to changing this behavior.\r\n\n", "before_files": [{"content": "import os\n\nfrom conans.client.conf.detect import detect_defaults_settings\nfrom conans.client.profile_loader import get_profile_path, read_profile\nfrom conans.errors import ConanException\nfrom conans.model.options import OptionsValues\nfrom conans.model.profile import Profile\nfrom conans.unicode import get_cwd\nfrom conans.util.files import save\n\n\ndef _get_profile_keys(key):\n # settings.compiler.version => settings, compiler.version\n tmp = key.split(\".\")\n first_key = tmp[0]\n rest_key = \".\".join(tmp[1:]) if len(tmp) > 1 else None\n if first_key not in (\"build_requires\", \"settings\", \"options\", \"env\"):\n raise ConanException(\"Invalid specified key: %s\" % key)\n\n return first_key, rest_key\n\n\ndef cmd_profile_list(cache_profiles_path, output):\n folder = cache_profiles_path\n if os.path.exists(folder):\n return [name for name in os.listdir(folder)\n if not os.path.isdir(os.path.join(folder, name))]\n else:\n output.info(\"No profiles defined\")\n return []\n\n\ndef cmd_profile_create(profile_name, cache_profiles_path, output, detect=False):\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd(),\n exists=False)\n if os.path.exists(profile_path):\n raise ConanException(\"Profile already exists\")\n\n profile = Profile()\n if detect:\n settings = detect_defaults_settings(output)\n for name, value in settings:\n profile.settings[name] = value\n\n contents = profile.dumps()\n save(profile_path, contents)\n\n if detect:\n output.info(\"Profile created with detected settings: %s\" % profile_path)\n else:\n output.info(\"Empty profile created: %s\" % profile_path)\n return profile_path\n\n\ndef 
cmd_profile_update(profile_name, key, value, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n if first_key == \"settings\":\n profile.settings[rest_key] = value\n elif first_key == \"options\":\n tmp = OptionsValues([(rest_key, value)])\n profile.options.update(tmp)\n elif first_key == \"env\":\n profile.env_values.update_replace(rest_key, value)\n elif first_key == \"build_requires\":\n raise ConanException(\"Edit the profile manually to change the build_requires\")\n\n contents = profile.dumps()\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())\n save(profile_path, contents)\n\n\ndef cmd_profile_get(profile_name, key, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n try:\n if first_key == \"settings\":\n return profile.settings[rest_key]\n elif first_key == \"options\":\n return dict(profile.options.as_list())[rest_key]\n elif first_key == \"env\":\n package = None\n var = rest_key\n if \":\" in rest_key:\n package, var = rest_key.split(\":\")\n return profile.env_values.data[package][var]\n elif first_key == \"build_requires\":\n raise ConanException(\"List the profile manually to see the build_requires\")\n except KeyError:\n raise ConanException(\"Key not found: '%s'\" % key)\n\n\ndef cmd_profile_delete_key(profile_name, key, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n\n try:\n package, name = rest_key.split(\":\")\n except ValueError:\n package = None\n name = rest_key\n\n try:\n if first_key == \"settings\":\n del profile.settings[rest_key]\n elif first_key == \"options\":\n profile.options.remove(name, package)\n elif first_key == \"env\":\n profile.env_values.remove(name, package)\n elif first_key == \"build_requires\":\n raise ConanException(\"Edit the profile manually to delete a build_require\")\n except KeyError:\n raise ConanException(\"Profile key '%s' doesn't exist\" % key)\n\n contents = profile.dumps()\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())\n save(profile_path, contents)\n", "path": "conans/client/cmd/profile.py"}], "after_files": [{"content": "import os\n\nfrom conans.client.conf.detect import detect_defaults_settings\nfrom conans.client.profile_loader import get_profile_path, read_profile\nfrom conans.errors import ConanException\nfrom conans.model.options import OptionsValues\nfrom conans.model.profile import Profile\nfrom conans.unicode import get_cwd\nfrom conans.util.files import save\n\n\ndef _get_profile_keys(key):\n # settings.compiler.version => settings, compiler.version\n tmp = key.split(\".\")\n first_key = tmp[0]\n rest_key = \".\".join(tmp[1:]) if len(tmp) > 1 else None\n if first_key not in (\"build_requires\", \"settings\", \"options\", \"env\"):\n raise ConanException(\"Invalid specified key: %s\" % key)\n\n return first_key, rest_key\n\n\ndef cmd_profile_list(cache_profiles_path, output):\n profiles = []\n if os.path.exists(cache_profiles_path):\n for current_directory, _, files in os.walk(cache_profiles_path, followlinks=True):\n for filename in files:\n rel_path = os.path.relpath(os.path.join(current_directory, filename),\n cache_profiles_path)\n profiles.append(rel_path)\n\n if not profiles:\n output.info(\"No profiles defined\")\n profiles.sort()\n return profiles\n\n\ndef 
cmd_profile_create(profile_name, cache_profiles_path, output, detect=False):\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd(),\n exists=False)\n if os.path.exists(profile_path):\n raise ConanException(\"Profile already exists\")\n\n profile = Profile()\n if detect:\n settings = detect_defaults_settings(output)\n for name, value in settings:\n profile.settings[name] = value\n\n contents = profile.dumps()\n save(profile_path, contents)\n\n if detect:\n output.info(\"Profile created with detected settings: %s\" % profile_path)\n else:\n output.info(\"Empty profile created: %s\" % profile_path)\n return profile_path\n\n\ndef cmd_profile_update(profile_name, key, value, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n if first_key == \"settings\":\n profile.settings[rest_key] = value\n elif first_key == \"options\":\n tmp = OptionsValues([(rest_key, value)])\n profile.options.update(tmp)\n elif first_key == \"env\":\n profile.env_values.update_replace(rest_key, value)\n elif first_key == \"build_requires\":\n raise ConanException(\"Edit the profile manually to change the build_requires\")\n\n contents = profile.dumps()\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())\n save(profile_path, contents)\n\n\ndef cmd_profile_get(profile_name, key, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n try:\n if first_key == \"settings\":\n return profile.settings[rest_key]\n elif first_key == \"options\":\n return dict(profile.options.as_list())[rest_key]\n elif first_key == \"env\":\n package = None\n var = rest_key\n if \":\" in rest_key:\n package, var = rest_key.split(\":\")\n return profile.env_values.data[package][var]\n elif first_key == \"build_requires\":\n raise ConanException(\"List the profile manually to see the build_requires\")\n except KeyError:\n raise ConanException(\"Key not found: '%s'\" % key)\n\n\ndef cmd_profile_delete_key(profile_name, key, cache_profiles_path):\n first_key, rest_key = _get_profile_keys(key)\n profile, _ = read_profile(profile_name, get_cwd(), cache_profiles_path)\n\n try:\n package, name = rest_key.split(\":\")\n except ValueError:\n package = None\n name = rest_key\n\n try:\n if first_key == \"settings\":\n del profile.settings[rest_key]\n elif first_key == \"options\":\n profile.options.remove(name, package)\n elif first_key == \"env\":\n profile.env_values.remove(name, package)\n elif first_key == \"build_requires\":\n raise ConanException(\"Edit the profile manually to delete a build_require\")\n except KeyError:\n raise ConanException(\"Profile key '%s' doesn't exist\" % key)\n\n contents = profile.dumps()\n profile_path = get_profile_path(profile_name, cache_profiles_path, get_cwd())\n save(profile_path, contents)\n", "path": "conans/client/cmd/profile.py"}]}
| 1,661 | 224 |
gh_patches_debug_60854
|
rasdani/github-patches
|
git_diff
|
airctic__icevision-441
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add icedata to icevision.all
## 🚀 Feature
Currently to train a dataset available with icedata the following two lines are necessary:
```python
import icedata
from icevision.all import *
```
Because icedata already depends on icevision, icevision cannot depend on icedata. **But** I guess we can add icedata as a soft dependency to `.all`; we just have to be sure not to use `icedata` internally in icevision.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `icevision/all.py`
Content:
```
1 from icevision.imports import *
2 from icevision import *
3
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/icevision/all.py b/icevision/all.py
--- a/icevision/all.py
+++ b/icevision/all.py
@@ -1,2 +1,9 @@
from icevision.imports import *
from icevision import *
+
+# soft import icedata
+try:
+ import icedata
+except ModuleNotFoundError as e:
+ if str(e) != f"No module named 'icedata'":
+ raise e
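The string comparison is what keeps the import "soft" without masking real errors: only the absence of `icedata` itself is swallowed, while a module missing *inside* icedata still raises. A generic sketch of the same pattern, with the package name as an example:

```python
# Illustrative only: soft-import pattern; swap 'icedata' for any optional package.
try:
    import icedata  # optional extra; silently skipped when not installed
except ModuleNotFoundError as e:
    if str(e) != "No module named 'icedata'":
        raise  # a different missing module signals a genuine environment problem
```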
|
{"golden_diff": "diff --git a/icevision/all.py b/icevision/all.py\n--- a/icevision/all.py\n+++ b/icevision/all.py\n@@ -1,2 +1,9 @@\n from icevision.imports import *\n from icevision import *\n+\n+# soft import icedata\n+try:\n+ import icedata\n+except ModuleNotFoundError as e:\n+ if str(e) != f\"No module named 'icedata'\":\n+ raise e\n", "issue": "Add icedata to icevision.all\n## \ud83d\ude80 Feature\r\nCurrently to train a dataset available with icedata the following two lines are necessary:\r\n```python\r\nimport icedata\r\nfrom icevision.all import *\r\n```\r\n\r\nBecause icedata already depends on icevision, icevision cannot depend on icedata. **But** I guess we can add icedata as a soft dependency to `.all`, we just have to be sure not to use `icedata` internally in icevision.\n", "before_files": [{"content": "from icevision.imports import *\nfrom icevision import *\n", "path": "icevision/all.py"}], "after_files": [{"content": "from icevision.imports import *\nfrom icevision import *\n\n# soft import icedata\ntry:\n import icedata\nexcept ModuleNotFoundError as e:\n if str(e) != f\"No module named 'icedata'\":\n raise e\n", "path": "icevision/all.py"}]}
| 377 | 100 |
gh_patches_debug_5156
|
rasdani/github-patches
|
git_diff
|
DataBiosphere__toil-2834
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Example script duplicated
`examples/hello.py` and `src/toil/test/docs/scripts/tutorial_arguments.py` are duplicate scripts with the same contents.
┆Issue is synchronized with this [Jira Task](https://ucsc-cgl.atlassian.net/browse/TOIL-443)
┆Issue Number: TOIL-443
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/hello.py`
Content:
```
1 from toil.common import Toil
2 from toil.job import Job
3
4 class HelloWorld(Job):
5 def __init__(self, message):
6 Job.__init__(self, memory="1G", cores=2, disk="2G")
7 self.message = message
8
9 def run(self, fileStore):
10 return "Hello, world!, here's a message: %s" % self.message
11
12 if __name__=="__main__":
13 parser = Job.Runner.getDefaultArgumentParser()
14 options = parser.parse_args()
15
16 hello_job = HelloWorld("Woot")
17
18 with Toil(options) as toil:
19 print(toil.start(hello_job))
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/hello.py b/examples/hello.py
deleted file mode 100644
--- a/examples/hello.py
+++ /dev/null
@@ -1,19 +0,0 @@
-from toil.common import Toil
-from toil.job import Job
-
-class HelloWorld(Job):
- def __init__(self, message):
- Job.__init__(self, memory="1G", cores=2, disk="2G")
- self.message = message
-
- def run(self, fileStore):
- return "Hello, world!, here's a message: %s" % self.message
-
-if __name__=="__main__":
- parser = Job.Runner.getDefaultArgumentParser()
- options = parser.parse_args()
-
- hello_job = HelloWorld("Woot")
-
- with Toil(options) as toil:
- print(toil.start(hello_job))
|
{"golden_diff": "diff --git a/examples/hello.py b/examples/hello.py\ndeleted file mode 100644\n--- a/examples/hello.py\n+++ /dev/null\n@@ -1,19 +0,0 @@\n-from toil.common import Toil\n-from toil.job import Job\n-\n-class HelloWorld(Job):\n- def __init__(self, message):\n- Job.__init__(self, memory=\"1G\", cores=2, disk=\"2G\")\n- self.message = message\n-\n- def run(self, fileStore):\n- return \"Hello, world!, here's a message: %s\" % self.message\n-\n-if __name__==\"__main__\":\n- parser = Job.Runner.getDefaultArgumentParser()\n- options = parser.parse_args()\n-\n- hello_job = HelloWorld(\"Woot\")\n-\n- with Toil(options) as toil:\n- print(toil.start(hello_job))\n", "issue": "Example script duplicated\n`examples/hello.py` and `src/toil/test/docs/scripts/tutorial_arguments.py` are duplicate scripts with the same contents.\n\n\u2506Issue is synchronized with this [Jira Task](https://ucsc-cgl.atlassian.net/browse/TOIL-443)\n\u2506Issue Number: TOIL-443\n\n", "before_files": [{"content": "from toil.common import Toil\nfrom toil.job import Job\n\nclass HelloWorld(Job):\n def __init__(self, message):\n Job.__init__(self, memory=\"1G\", cores=2, disk=\"2G\")\n self.message = message\n\n def run(self, fileStore):\n return \"Hello, world!, here's a message: %s\" % self.message\n\nif __name__==\"__main__\":\n parser = Job.Runner.getDefaultArgumentParser()\n options = parser.parse_args()\n\n hello_job = HelloWorld(\"Woot\")\n\n with Toil(options) as toil:\n print(toil.start(hello_job))\n", "path": "examples/hello.py"}], "after_files": [{"content": null, "path": "examples/hello.py"}]}
| 502 | 199 |
gh_patches_debug_18244
|
rasdani/github-patches
|
git_diff
|
interlegis__sapl-1606
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Exclusão Tramitação - status de tramitação
Ao incluir uma tramitação à matéria, em que esta tramitação tenha seu indicador de tramitação definido nas tabelas auxiliares como fim, a matéria em questão passa de "em tramitação - sim para não" corretamente. Se essa tramitação por ventura for excluída, não seria o caso de alterar novamente a matéria de "em tramitação - não para sim" ?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sapl/materia/receivers.py`
Content:
```
1 from django.dispatch import receiver
2
3 from sapl.materia.signals import tramitacao_signal
4 from sapl.utils import get_base_url
5
6 from .email_utils import do_envia_email_tramitacao
7
8
9 @receiver(tramitacao_signal)
10 def handle_tramitacao_signal(sender, **kwargs):
11 tramitacao = kwargs.get("post")
12 request = kwargs.get("request")
13 materia = tramitacao.materia
14
15 do_envia_email_tramitacao(
16 get_base_url(request),
17 materia,
18 tramitacao.status,
19 tramitacao.unidade_tramitacao_destino)
20
```
Path: `sapl/materia/apps.py`
Content:
```
1 from django import apps
2 from django.utils.translation import ugettext_lazy as _
3
4
5 class AppConfig(apps.AppConfig):
6 name = 'sapl.materia'
7 label = 'materia'
8 verbose_name = _('Matéria')
9
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sapl/materia/apps.py b/sapl/materia/apps.py
--- a/sapl/materia/apps.py
+++ b/sapl/materia/apps.py
@@ -6,3 +6,6 @@
name = 'sapl.materia'
label = 'materia'
verbose_name = _('Matéria')
+
+ def ready(self):
+ from . import receivers
\ No newline at end of file
diff --git a/sapl/materia/receivers.py b/sapl/materia/receivers.py
--- a/sapl/materia/receivers.py
+++ b/sapl/materia/receivers.py
@@ -1,5 +1,7 @@
+from django.db.models.signals import post_delete, post_save
from django.dispatch import receiver
+from sapl.materia.models import Tramitacao
from sapl.materia.signals import tramitacao_signal
from sapl.utils import get_base_url
@@ -17,3 +19,11 @@
materia,
tramitacao.status,
tramitacao.unidade_tramitacao_destino)
+
+
+@receiver(post_delete, sender=Tramitacao)
+def status_tramitacao_materia(sender, instance, **kwargs):
+ if instance.turno == 'F':
+ materia = instance.materia
+ materia.em_tramitacao = True
+ materia.save()
|
{"golden_diff": "diff --git a/sapl/materia/apps.py b/sapl/materia/apps.py\n--- a/sapl/materia/apps.py\n+++ b/sapl/materia/apps.py\n@@ -6,3 +6,6 @@\n name = 'sapl.materia'\n label = 'materia'\n verbose_name = _('Mat\u00e9ria')\n+\n+ def ready(self):\n+ from . import receivers\n\\ No newline at end of file\ndiff --git a/sapl/materia/receivers.py b/sapl/materia/receivers.py\n--- a/sapl/materia/receivers.py\n+++ b/sapl/materia/receivers.py\n@@ -1,5 +1,7 @@\n+from django.db.models.signals import post_delete, post_save\n from django.dispatch import receiver\n \n+from sapl.materia.models import Tramitacao\n from sapl.materia.signals import tramitacao_signal\n from sapl.utils import get_base_url\n \n@@ -17,3 +19,11 @@\n materia,\n tramitacao.status,\n tramitacao.unidade_tramitacao_destino)\n+\n+\n+@receiver(post_delete, sender=Tramitacao)\n+def status_tramitacao_materia(sender, instance, **kwargs):\n+ if instance.turno == 'F':\n+ materia = instance.materia\n+ materia.em_tramitacao = True\n+ materia.save()\n", "issue": "Exclus\u00e3o Tramita\u00e7\u00e3o - status de tramita\u00e7\u00e3o\nAo incluir uma tramita\u00e7\u00e3o \u00e0 mat\u00e9ria, em que esta tramita\u00e7\u00e3o tenha seu indicador de tramita\u00e7\u00e3o definido nas tabelas auxiliares como fim, a mat\u00e9ria em quest\u00e3o passa de \"em tramita\u00e7\u00e3o - sim para n\u00e3o\" corretamente. Se essa tramita\u00e7\u00e3o por ventura for exclu\u00edda, n\u00e3o seria o caso de alterar novamente a mat\u00e9ria de \"em tramita\u00e7\u00e3o - n\u00e3o para sim\" ?\n", "before_files": [{"content": "from django.dispatch import receiver\n\nfrom sapl.materia.signals import tramitacao_signal\nfrom sapl.utils import get_base_url\n\nfrom .email_utils import do_envia_email_tramitacao\n\n\n@receiver(tramitacao_signal)\ndef handle_tramitacao_signal(sender, **kwargs):\n tramitacao = kwargs.get(\"post\")\n request = kwargs.get(\"request\")\n materia = tramitacao.materia\n\n do_envia_email_tramitacao(\n get_base_url(request),\n materia,\n tramitacao.status,\n tramitacao.unidade_tramitacao_destino)\n", "path": "sapl/materia/receivers.py"}, {"content": "from django import apps\nfrom django.utils.translation import ugettext_lazy as _\n\n\nclass AppConfig(apps.AppConfig):\n name = 'sapl.materia'\n label = 'materia'\n verbose_name = _('Mat\u00e9ria')\n", "path": "sapl/materia/apps.py"}], "after_files": [{"content": "from django.db.models.signals import post_delete, post_save\nfrom django.dispatch import receiver\n\nfrom sapl.materia.models import Tramitacao\nfrom sapl.materia.signals import tramitacao_signal\nfrom sapl.utils import get_base_url\n\nfrom .email_utils import do_envia_email_tramitacao\n\n\n@receiver(tramitacao_signal)\ndef handle_tramitacao_signal(sender, **kwargs):\n tramitacao = kwargs.get(\"post\")\n request = kwargs.get(\"request\")\n materia = tramitacao.materia\n\n do_envia_email_tramitacao(\n get_base_url(request),\n materia,\n tramitacao.status,\n tramitacao.unidade_tramitacao_destino)\n\n\n@receiver(post_delete, sender=Tramitacao)\ndef status_tramitacao_materia(sender, instance, **kwargs):\n if instance.turno == 'F':\n materia = instance.materia\n materia.em_tramitacao = True\n materia.save()\n", "path": "sapl/materia/receivers.py"}, {"content": "from django import apps\nfrom django.utils.translation import ugettext_lazy as _\n\n\nclass AppConfig(apps.AppConfig):\n name = 'sapl.materia'\n label = 'materia'\n verbose_name = _('Mat\u00e9ria')\n\n def ready(self):\n from . import receivers", "path": "sapl/materia/apps.py"}]}
| 604 | 300 |
gh_patches_debug_4716
|
rasdani/github-patches
|
git_diff
|
crytic__slither-2331
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug]: High level call does not always have function
### Describe the issue:
I have two somehow similar examples, in the first one I have the function on the high level call in function safeAdd. In the next example I edited the array size to be constant instead of a literal, and I got None instead of a function object
### Code example to reproduce the issue:
```solidity
library SafeMath {
uint256 private constant twelve = 12;
struct A {uint256 a;}
function add(A[twelve] storage z) internal { }
}
contract MathContract {
uint256 private constant twelve = 12;
using SafeMath for SafeMath.A[12];
SafeMath.A[12] public z;
function safeAdd() public {
z.add();
}
}
```
```solidity
library SafeMath {
uint256 private constant twelve = 12;
struct A {uint256 a;}
function add(A[twelve] storage z) internal { }
}
contract MathContract {
uint256 private constant twelve = 12;
using SafeMath for SafeMath.A[twelve];
SafeMath.A[twelve] public z;
function safeAdd() public {
z.add();
}
}
```
### Version:
0.10.0
### Relevant log output:
```shell
>>> from slither import Slither
>>> math = Slither('a.sol').contracts[1]
>>> math.name
'MathContract'
>>> f = math.functions[0]
>>> f.name
'safeAdd'
>>> f.nodes
[<slither.core.cfg.node.Node object at 0x7f5460aa1e50>, <slither.core.cfg.node.No
de object at 0x7f5460aa2090>]
>>> f.nodes[1]
<slither.core.cfg.node.Node object at 0x7f5460aa2090>
>>> f.nodes[1].irs
[<slither.slithir.operations.library_call.LibraryCall object at 0x7f5460a748d0>]
>>> f.nodes[1].irs[0].function
<slither.core.declarations.function_contract.FunctionContract object at 0x7f5460a
9e090>
>>> f.nodes[1].irs[0].function.name
'add'
----------------------------------------------------------------------------------
>>> from slither import Slither
>>> math = Slither('a.sol').contracts[1]
>>> math.name
'MathContract'
>>> f = math.functions[0]
>>> f.name
'safeAdd'
>>> f.nodes
[<slither.core.cfg.node.Node object at 0x7f9d6379db10>, <slither.core.cfg.node.No
de object at 0x7f9d63a47850>]
>>> f.nodes[1]
<slither.core.cfg.node.Node object at 0x7f9d63a47850>
>>> f.nodes[1].irs
[<slither.slithir.operations.high_level_call.HighLevelCall object at 0x7f9d63a376
90>]
>>> f.nodes[1].irs[0].function
>>> print(f.nodes[1].irs[0].function)
None
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `slither/core/solidity_types/array_type.py`
Content:
```
1 from typing import Union, Optional, Tuple, Any, TYPE_CHECKING
2
3 from slither.core.expressions.expression import Expression
4 from slither.core.expressions.literal import Literal
5 from slither.core.solidity_types.elementary_type import ElementaryType
6 from slither.core.solidity_types.type import Type
7 from slither.visitors.expression.constants_folding import ConstantFolding
8
9 if TYPE_CHECKING:
10 from slither.core.expressions.binary_operation import BinaryOperation
11 from slither.core.expressions.identifier import Identifier
12
13
14 class ArrayType(Type):
15 def __init__(
16 self,
17 t: Type,
18 length: Optional[Union["Identifier", Literal, "BinaryOperation", int]],
19 ) -> None:
20 assert isinstance(t, Type)
21 if length:
22 if isinstance(length, int):
23 length = Literal(length, ElementaryType("uint256"))
24
25 super().__init__()
26 self._type: Type = t
27 assert length is None or isinstance(length, Expression)
28 self._length: Optional[Expression] = length
29
30 if length:
31 if not isinstance(length, Literal):
32 cf = ConstantFolding(length, "uint256")
33 length = cf.result()
34 self._length_value: Optional[Literal] = length
35 else:
36 self._length_value = None
37
38 @property
39 def type(self) -> Type:
40 return self._type
41
42 @property
43 def is_dynamic(self) -> bool:
44 return self.length is None
45
46 @property
47 def length(self) -> Optional[Expression]:
48 return self._length
49
50 @property
51 def length_value(self) -> Optional[Literal]:
52 return self._length_value
53
54 @property
55 def is_fixed_array(self) -> bool:
56 return bool(self.length)
57
58 @property
59 def is_dynamic_array(self) -> bool:
60 return not self.is_fixed_array
61
62 @property
63 def storage_size(self) -> Tuple[int, bool]:
64 if self._length_value:
65 elem_size, _ = self._type.storage_size
66 return elem_size * int(str(self._length_value)), True
67 return 32, True
68
69 def __str__(self) -> str:
70 if self._length:
71 return str(self._type) + f"[{str(self._length_value)}]"
72 return str(self._type) + "[]"
73
74 def __eq__(self, other: Any) -> bool:
75 if not isinstance(other, ArrayType):
76 return False
77 return self._type == other.type and self.length == other.length
78
79 def __hash__(self) -> int:
80 return hash(str(self))
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/slither/core/solidity_types/array_type.py b/slither/core/solidity_types/array_type.py
--- a/slither/core/solidity_types/array_type.py
+++ b/slither/core/solidity_types/array_type.py
@@ -74,7 +74,7 @@
def __eq__(self, other: Any) -> bool:
if not isinstance(other, ArrayType):
return False
- return self._type == other.type and self.length == other.length
+ return self._type == other.type and self._length_value == other.length_value
def __hash__(self) -> int:
return hash(str(self))
|
{"golden_diff": "diff --git a/slither/core/solidity_types/array_type.py b/slither/core/solidity_types/array_type.py\n--- a/slither/core/solidity_types/array_type.py\n+++ b/slither/core/solidity_types/array_type.py\n@@ -74,7 +74,7 @@\n def __eq__(self, other: Any) -> bool:\n if not isinstance(other, ArrayType):\n return False\n- return self._type == other.type and self.length == other.length\n+ return self._type == other.type and self._length_value == other.length_value\n \n def __hash__(self) -> int:\n return hash(str(self))\n", "issue": "[Bug]: High level call does not always have function\n### Describe the issue:\r\n\r\nI have two somehow similar examples, in the first one I have the function on the high level call in function safeAdd. In the next example I edited the array size to be constant instead of a literal, and I got None instead of a function object\r\n\r\n### Code example to reproduce the issue:\r\n\r\n```solidity\r\nlibrary SafeMath {\r\n uint256 private constant twelve = 12; \r\n struct A {uint256 a;}\r\n function add(A[twelve] storage z) internal { }\r\n}\r\n\r\ncontract MathContract {\r\n uint256 private constant twelve = 12; \r\n using SafeMath for SafeMath.A[12];\r\n SafeMath.A[12] public z;\r\n function safeAdd() public {\r\n z.add();\r\n }\r\n}\r\n```\r\n```solidity\r\nlibrary SafeMath {\r\n uint256 private constant twelve = 12; \r\n struct A {uint256 a;}\r\n function add(A[twelve] storage z) internal { }\r\n}\r\n\r\ncontract MathContract {\r\n uint256 private constant twelve = 12; \r\n using SafeMath for SafeMath.A[twelve];\r\n SafeMath.A[twelve] public z;\r\n function safeAdd() public {\r\n z.add();\r\n }\r\n}\r\n```\r\n\r\n### Version:\r\n\r\n0.10.0\r\n\r\n### Relevant log output:\r\n\r\n```shell\r\n>>> from slither import Slither\r\n>>> math = Slither('a.sol').contracts[1]\r\n>>> math.name\r\n'MathContract'\r\n>>> f = math.functions[0]\r\n>>> f.name\r\n'safeAdd'\r\n>>> f.nodes\r\n[<slither.core.cfg.node.Node object at 0x7f5460aa1e50>, <slither.core.cfg.node.No\r\nde object at 0x7f5460aa2090>]\r\n>>> f.nodes[1]\r\n<slither.core.cfg.node.Node object at 0x7f5460aa2090>\r\n>>> f.nodes[1].irs\r\n[<slither.slithir.operations.library_call.LibraryCall object at 0x7f5460a748d0>]\r\n>>> f.nodes[1].irs[0].function\r\n<slither.core.declarations.function_contract.FunctionContract object at 0x7f5460a\r\n9e090>\r\n>>> f.nodes[1].irs[0].function.name\r\n'add'\r\n----------------------------------------------------------------------------------\r\n>>> from slither import Slither\r\n>>> math = Slither('a.sol').contracts[1]\r\n>>> math.name\r\n'MathContract'\r\n>>> f = math.functions[0]\r\n>>> f.name\r\n'safeAdd'\r\n>>> f.nodes\r\n[<slither.core.cfg.node.Node object at 0x7f9d6379db10>, <slither.core.cfg.node.No\r\nde object at 0x7f9d63a47850>]\r\n>>> f.nodes[1]\r\n<slither.core.cfg.node.Node object at 0x7f9d63a47850>\r\n>>> f.nodes[1].irs\r\n[<slither.slithir.operations.high_level_call.HighLevelCall object at 0x7f9d63a376\r\n90>]\r\n>>> f.nodes[1].irs[0].function\r\n>>> print(f.nodes[1].irs[0].function)\r\nNone\r\n```\r\n\n", "before_files": [{"content": "from typing import Union, Optional, Tuple, Any, TYPE_CHECKING\n\nfrom slither.core.expressions.expression import Expression\nfrom slither.core.expressions.literal import Literal\nfrom slither.core.solidity_types.elementary_type import ElementaryType\nfrom slither.core.solidity_types.type import Type\nfrom slither.visitors.expression.constants_folding import ConstantFolding\n\nif TYPE_CHECKING:\n from 
slither.core.expressions.binary_operation import BinaryOperation\n from slither.core.expressions.identifier import Identifier\n\n\nclass ArrayType(Type):\n def __init__(\n self,\n t: Type,\n length: Optional[Union[\"Identifier\", Literal, \"BinaryOperation\", int]],\n ) -> None:\n assert isinstance(t, Type)\n if length:\n if isinstance(length, int):\n length = Literal(length, ElementaryType(\"uint256\"))\n\n super().__init__()\n self._type: Type = t\n assert length is None or isinstance(length, Expression)\n self._length: Optional[Expression] = length\n\n if length:\n if not isinstance(length, Literal):\n cf = ConstantFolding(length, \"uint256\")\n length = cf.result()\n self._length_value: Optional[Literal] = length\n else:\n self._length_value = None\n\n @property\n def type(self) -> Type:\n return self._type\n\n @property\n def is_dynamic(self) -> bool:\n return self.length is None\n\n @property\n def length(self) -> Optional[Expression]:\n return self._length\n\n @property\n def length_value(self) -> Optional[Literal]:\n return self._length_value\n\n @property\n def is_fixed_array(self) -> bool:\n return bool(self.length)\n\n @property\n def is_dynamic_array(self) -> bool:\n return not self.is_fixed_array\n\n @property\n def storage_size(self) -> Tuple[int, bool]:\n if self._length_value:\n elem_size, _ = self._type.storage_size\n return elem_size * int(str(self._length_value)), True\n return 32, True\n\n def __str__(self) -> str:\n if self._length:\n return str(self._type) + f\"[{str(self._length_value)}]\"\n return str(self._type) + \"[]\"\n\n def __eq__(self, other: Any) -> bool:\n if not isinstance(other, ArrayType):\n return False\n return self._type == other.type and self.length == other.length\n\n def __hash__(self) -> int:\n return hash(str(self))\n", "path": "slither/core/solidity_types/array_type.py"}], "after_files": [{"content": "from typing import Union, Optional, Tuple, Any, TYPE_CHECKING\n\nfrom slither.core.expressions.expression import Expression\nfrom slither.core.expressions.literal import Literal\nfrom slither.core.solidity_types.elementary_type import ElementaryType\nfrom slither.core.solidity_types.type import Type\nfrom slither.visitors.expression.constants_folding import ConstantFolding\n\nif TYPE_CHECKING:\n from slither.core.expressions.binary_operation import BinaryOperation\n from slither.core.expressions.identifier import Identifier\n\n\nclass ArrayType(Type):\n def __init__(\n self,\n t: Type,\n length: Optional[Union[\"Identifier\", Literal, \"BinaryOperation\", int]],\n ) -> None:\n assert isinstance(t, Type)\n if length:\n if isinstance(length, int):\n length = Literal(length, ElementaryType(\"uint256\"))\n\n super().__init__()\n self._type: Type = t\n assert length is None or isinstance(length, Expression)\n self._length: Optional[Expression] = length\n\n if length:\n if not isinstance(length, Literal):\n cf = ConstantFolding(length, \"uint256\")\n length = cf.result()\n self._length_value: Optional[Literal] = length\n else:\n self._length_value = None\n\n @property\n def type(self) -> Type:\n return self._type\n\n @property\n def is_dynamic(self) -> bool:\n return self.length is None\n\n @property\n def length(self) -> Optional[Expression]:\n return self._length\n\n @property\n def length_value(self) -> Optional[Literal]:\n return self._length_value\n\n @property\n def is_fixed_array(self) -> bool:\n return bool(self.length)\n\n @property\n def is_dynamic_array(self) -> bool:\n return not self.is_fixed_array\n\n @property\n def storage_size(self) -> 
Tuple[int, bool]:\n if self._length_value:\n elem_size, _ = self._type.storage_size\n return elem_size * int(str(self._length_value)), True\n return 32, True\n\n def __str__(self) -> str:\n if self._length:\n return str(self._type) + f\"[{str(self._length_value)}]\"\n return str(self._type) + \"[]\"\n\n def __eq__(self, other: Any) -> bool:\n if not isinstance(other, ArrayType):\n return False\n return self._type == other.type and self._length_value == other.length_value\n\n def __hash__(self) -> int:\n return hash(str(self))\n", "path": "slither/core/solidity_types/array_type.py"}]}
| 1,722 | 142 |
gh_patches_debug_21235
|
rasdani/github-patches
|
git_diff
|
facebookresearch__hydra-1630
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Document hydra.callbacks api.
- [ ] Document API
- [ ] Add news fragment if missing
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hydra/experimental/callback.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 import logging
3 from typing import Any
4
5 from omegaconf import DictConfig
6
7 from hydra.core.utils import JobReturn
8
9 logger = logging.getLogger(__name__)
10
11
12 class Callback:
13 def on_run_start(self, config: DictConfig, **kwargs: Any) -> None:
14 """
15 Called in RUN mode before job starts.
16 """
17 ...
18
19 def on_run_end(self, config: DictConfig, **kwargs: Any) -> None:
20 """
21 Called in RUN mode after job ends.
22 """
23 ...
24
25 def on_multirun_start(self, config: DictConfig, **kwargs: Any) -> None:
26 """
27 Called in MULTIRUN mode before any job starts.
28 """
29 ...
30
31 def on_multirun_end(self, config: DictConfig, **kwargs: Any) -> None:
32 """
33 Called in MULTIRUN mode after all job end.
34 """
35 ...
36
37 def on_job_start(self, config: DictConfig, **kwargs: Any) -> None:
38 """
39 Called in both RUN and MULTIRUN modes inside a Hydra job; before running
40 application code.
41 """
42 ...
43
44 def on_job_end(
45 self, config: DictConfig, job_return: JobReturn, **kwargs: Any
46 ) -> None:
47 """
48 Called in both RUN and MULTIRUN modes inside a Hydra job; after running
49 application code.
50 """
51 ...
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/hydra/experimental/callback.py b/hydra/experimental/callback.py
--- a/hydra/experimental/callback.py
+++ b/hydra/experimental/callback.py
@@ -30,14 +30,14 @@
def on_multirun_end(self, config: DictConfig, **kwargs: Any) -> None:
"""
- Called in MULTIRUN mode after all job end.
+ Called in MULTIRUN mode after all jobs end.
"""
...
def on_job_start(self, config: DictConfig, **kwargs: Any) -> None:
"""
- Called in both RUN and MULTIRUN modes inside a Hydra job; before running
- application code.
+ Called in both RUN and MULTIRUN modes, once for each Hydra job (before running
+ application code).
"""
...
@@ -45,7 +45,7 @@
self, config: DictConfig, job_return: JobReturn, **kwargs: Any
) -> None:
"""
- Called in both RUN and MULTIRUN modes inside a Hydra job; after running
- application code.
+ Called in both RUN and MULTIRUN modes, once for each Hydra job (after running
+ application code).
"""
...
|
{"golden_diff": "diff --git a/hydra/experimental/callback.py b/hydra/experimental/callback.py\n--- a/hydra/experimental/callback.py\n+++ b/hydra/experimental/callback.py\n@@ -30,14 +30,14 @@\n \n def on_multirun_end(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n- Called in MULTIRUN mode after all job end.\n+ Called in MULTIRUN mode after all jobs end.\n \"\"\"\n ...\n \n def on_job_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n- Called in both RUN and MULTIRUN modes inside a Hydra job; before running\n- application code.\n+ Called in both RUN and MULTIRUN modes, once for each Hydra job (before running\n+ application code).\n \"\"\"\n ...\n \n@@ -45,7 +45,7 @@\n self, config: DictConfig, job_return: JobReturn, **kwargs: Any\n ) -> None:\n \"\"\"\n- Called in both RUN and MULTIRUN modes inside a Hydra job; after running\n- application code.\n+ Called in both RUN and MULTIRUN modes, once for each Hydra job (after running\n+ application code).\n \"\"\"\n ...\n", "issue": "Document hydra.callbacks api.\n- [ ] Document API\r\n- [ ] Add news fragment if missing\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport logging\nfrom typing import Any\n\nfrom omegaconf import DictConfig\n\nfrom hydra.core.utils import JobReturn\n\nlogger = logging.getLogger(__name__)\n\n\nclass Callback:\n def on_run_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in RUN mode before job starts.\n \"\"\"\n ...\n\n def on_run_end(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in RUN mode after job ends.\n \"\"\"\n ...\n\n def on_multirun_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in MULTIRUN mode before any job starts.\n \"\"\"\n ...\n\n def on_multirun_end(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in MULTIRUN mode after all job end.\n \"\"\"\n ...\n\n def on_job_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in both RUN and MULTIRUN modes inside a Hydra job; before running\n application code.\n \"\"\"\n ...\n\n def on_job_end(\n self, config: DictConfig, job_return: JobReturn, **kwargs: Any\n ) -> None:\n \"\"\"\n Called in both RUN and MULTIRUN modes inside a Hydra job; after running\n application code.\n \"\"\"\n ...\n", "path": "hydra/experimental/callback.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved\nimport logging\nfrom typing import Any\n\nfrom omegaconf import DictConfig\n\nfrom hydra.core.utils import JobReturn\n\nlogger = logging.getLogger(__name__)\n\n\nclass Callback:\n def on_run_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in RUN mode before job starts.\n \"\"\"\n ...\n\n def on_run_end(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in RUN mode after job ends.\n \"\"\"\n ...\n\n def on_multirun_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in MULTIRUN mode before any job starts.\n \"\"\"\n ...\n\n def on_multirun_end(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in MULTIRUN mode after all jobs end.\n \"\"\"\n ...\n\n def on_job_start(self, config: DictConfig, **kwargs: Any) -> None:\n \"\"\"\n Called in both RUN and MULTIRUN modes, once for each Hydra job (before running\n application code).\n \"\"\"\n ...\n\n def on_job_end(\n self, config: DictConfig, job_return: JobReturn, **kwargs: Any\n ) -> None:\n \"\"\"\n Called in both RUN and MULTIRUN modes, once for each Hydra job (after running\n application code).\n \"\"\"\n ...\n", "path": "hydra/experimental/callback.py"}]}
| 695 | 279 |
gh_patches_debug_8080
|
rasdani/github-patches
|
git_diff
|
keras-team__keras-nlp-195
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Why does the docstring say vocab size should be no larger than 999?
https://github.com/keras-team/keras-nlp/blob/e3adddaa98bbe1aee071117c01678fe3017dae80/keras_nlp/layers/token_and_position_embedding.py#L30
Seems like a very small vocab to me
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `keras_nlp/layers/token_and_position_embedding.py`
Content:
```
1 # Copyright 2022 The KerasNLP Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Creates an Embedding Layer and adds Positional Embeddings"""
16
17 from tensorflow import keras
18
19 import keras_nlp.layers
20
21
22 class TokenAndPositionEmbedding(keras.layers.Layer):
23 """A layer which sums a token and position embedding.
24
25 This layer assumes that the last dimension in the input corresponds
26 to the sequence dimension.
27
28 Args:
29 vocabulary_size: The size of the vocabulary (should be no larger
30 than 999)
31 sequence_length: The maximum length of input sequence
32 embedding_dim: The output dimension of the embedding layer
33 embeddings_initializer: The initializer to use for the Embedding
34 Layers
35 mask_zero: Boolean, whether or not the input value 0 is a special
36 "padding" value that should be masked out.
37 This is useful when using recurrent layers which may take variable
38 length input. If this is True, then all subsequent layers in the
39 model need to support masking or an exception will be raised.
40 If mask_zero` is set to True, as a consequence, index 0 cannot be
41 used in the vocabulary
42 (input_dim should equal size of vocabulary + 1).
43
44 Examples:
45 ```python
46 seq_length = 50
47 vocab_size = 5000
48 embed_dim = 128
49 inputs = keras.Input(shape=(seq_length,))
50 embedding_layer = keras_nlp.layers.TokenAndPositionEmbedding(
51 vocabulary_size=vocab_size,
52 sequence_length=seq_length,
53 embedding_dim=embed_dim,
54 )
55 outputs = embedding_layer(inputs)
56 ```
57 """
58
59 def __init__(
60 self,
61 vocabulary_size,
62 sequence_length,
63 embedding_dim,
64 embeddings_initializer="glorot_uniform",
65 mask_zero=False,
66 **kwargs
67 ):
68 super().__init__(**kwargs)
69 if vocabulary_size is None:
70 raise ValueError(
71 "`vocabulary_size` must be an Integer, received `None`."
72 )
73 if sequence_length is None:
74 raise ValueError(
75 "`sequence_length` must be an Integer, received `None`."
76 )
77 if embedding_dim is None:
78 raise ValueError(
79 "`embedding_dim` must be an Integer, received `None`."
80 )
81 self.vocabulary_size = int(vocabulary_size)
82 self.sequence_length = int(sequence_length)
83 self.embedding_dim = int(embedding_dim)
84 self.token_embedding = keras.layers.Embedding(
85 vocabulary_size,
86 embedding_dim,
87 embeddings_initializer=embeddings_initializer,
88 mask_zero=mask_zero,
89 )
90 self.position_embedding = keras_nlp.layers.PositionEmbedding(
91 sequence_length=sequence_length,
92 initializer=embeddings_initializer,
93 )
94 self.supports_masking = self.token_embedding.supports_masking
95
96 def get_config(self):
97 config = super().get_config()
98 config.update(
99 {
100 "vocabulary_size": self.vocabulary_size,
101 "sequence_length": self.sequence_length,
102 "embedding_dim": self.embedding_dim,
103 "embeddings_initializer": keras.initializers.serialize(
104 self.token_embedding.embeddings_initializer
105 ),
106 "mask_zero": self.token_embedding.mask_zero,
107 },
108 )
109 return config
110
111 def call(self, inputs):
112 embedded_tokens = self.token_embedding(inputs)
113 embedded_positions = self.position_embedding(embedded_tokens)
114 outputs = embedded_tokens + embedded_positions
115 return outputs
116
117 def compute_mask(self, inputs, mask=None):
118 return self.token_embedding.compute_mask(inputs, mask=mask)
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/keras_nlp/layers/token_and_position_embedding.py b/keras_nlp/layers/token_and_position_embedding.py
--- a/keras_nlp/layers/token_and_position_embedding.py
+++ b/keras_nlp/layers/token_and_position_embedding.py
@@ -26,8 +26,7 @@
to the sequence dimension.
Args:
- vocabulary_size: The size of the vocabulary (should be no larger
- than 999)
+ vocabulary_size: The size of the vocabulary.
sequence_length: The maximum length of input sequence
embedding_dim: The output dimension of the embedding layer
embeddings_initializer: The initializer to use for the Embedding
|
{"golden_diff": "diff --git a/keras_nlp/layers/token_and_position_embedding.py b/keras_nlp/layers/token_and_position_embedding.py\n--- a/keras_nlp/layers/token_and_position_embedding.py\n+++ b/keras_nlp/layers/token_and_position_embedding.py\n@@ -26,8 +26,7 @@\n to the sequence dimension.\n \n Args:\n- vocabulary_size: The size of the vocabulary (should be no larger\n- than 999)\n+ vocabulary_size: The size of the vocabulary.\n sequence_length: The maximum length of input sequence\n embedding_dim: The output dimension of the embedding layer\n embeddings_initializer: The initializer to use for the Embedding\n", "issue": "Why does the docstring say vocab size should be no larger than 999?\nhttps://github.com/keras-team/keras-nlp/blob/e3adddaa98bbe1aee071117c01678fe3017dae80/keras_nlp/layers/token_and_position_embedding.py#L30\r\n\r\nSeems like a very small vocab to me\n", "before_files": [{"content": "# Copyright 2022 The KerasNLP Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Creates an Embedding Layer and adds Positional Embeddings\"\"\"\n\nfrom tensorflow import keras\n\nimport keras_nlp.layers\n\n\nclass TokenAndPositionEmbedding(keras.layers.Layer):\n \"\"\"A layer which sums a token and position embedding.\n\n This layer assumes that the last dimension in the input corresponds\n to the sequence dimension.\n\n Args:\n vocabulary_size: The size of the vocabulary (should be no larger\n than 999)\n sequence_length: The maximum length of input sequence\n embedding_dim: The output dimension of the embedding layer\n embeddings_initializer: The initializer to use for the Embedding\n Layers\n mask_zero: Boolean, whether or not the input value 0 is a special\n \"padding\" value that should be masked out.\n This is useful when using recurrent layers which may take variable\n length input. 
If this is True, then all subsequent layers in the\n model need to support masking or an exception will be raised.\n If mask_zero` is set to True, as a consequence, index 0 cannot be\n used in the vocabulary\n (input_dim should equal size of vocabulary + 1).\n\n Examples:\n ```python\n seq_length = 50\n vocab_size = 5000\n embed_dim = 128\n inputs = keras.Input(shape=(seq_length,))\n embedding_layer = keras_nlp.layers.TokenAndPositionEmbedding(\n vocabulary_size=vocab_size,\n sequence_length=seq_length,\n embedding_dim=embed_dim,\n )\n outputs = embedding_layer(inputs)\n ```\n \"\"\"\n\n def __init__(\n self,\n vocabulary_size,\n sequence_length,\n embedding_dim,\n embeddings_initializer=\"glorot_uniform\",\n mask_zero=False,\n **kwargs\n ):\n super().__init__(**kwargs)\n if vocabulary_size is None:\n raise ValueError(\n \"`vocabulary_size` must be an Integer, received `None`.\"\n )\n if sequence_length is None:\n raise ValueError(\n \"`sequence_length` must be an Integer, received `None`.\"\n )\n if embedding_dim is None:\n raise ValueError(\n \"`embedding_dim` must be an Integer, received `None`.\"\n )\n self.vocabulary_size = int(vocabulary_size)\n self.sequence_length = int(sequence_length)\n self.embedding_dim = int(embedding_dim)\n self.token_embedding = keras.layers.Embedding(\n vocabulary_size,\n embedding_dim,\n embeddings_initializer=embeddings_initializer,\n mask_zero=mask_zero,\n )\n self.position_embedding = keras_nlp.layers.PositionEmbedding(\n sequence_length=sequence_length,\n initializer=embeddings_initializer,\n )\n self.supports_masking = self.token_embedding.supports_masking\n\n def get_config(self):\n config = super().get_config()\n config.update(\n {\n \"vocabulary_size\": self.vocabulary_size,\n \"sequence_length\": self.sequence_length,\n \"embedding_dim\": self.embedding_dim,\n \"embeddings_initializer\": keras.initializers.serialize(\n self.token_embedding.embeddings_initializer\n ),\n \"mask_zero\": self.token_embedding.mask_zero,\n },\n )\n return config\n\n def call(self, inputs):\n embedded_tokens = self.token_embedding(inputs)\n embedded_positions = self.position_embedding(embedded_tokens)\n outputs = embedded_tokens + embedded_positions\n return outputs\n\n def compute_mask(self, inputs, mask=None):\n return self.token_embedding.compute_mask(inputs, mask=mask)\n", "path": "keras_nlp/layers/token_and_position_embedding.py"}], "after_files": [{"content": "# Copyright 2022 The KerasNLP Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Creates an Embedding Layer and adds Positional Embeddings\"\"\"\n\nfrom tensorflow import keras\n\nimport keras_nlp.layers\n\n\nclass TokenAndPositionEmbedding(keras.layers.Layer):\n \"\"\"A layer which sums a token and position embedding.\n\n This layer assumes that the last dimension in the input corresponds\n to the sequence dimension.\n\n Args:\n vocabulary_size: The size of the vocabulary.\n sequence_length: The maximum length of input sequence\n embedding_dim: The output dimension of the embedding layer\n 
embeddings_initializer: The initializer to use for the Embedding\n Layers\n mask_zero: Boolean, whether or not the input value 0 is a special\n \"padding\" value that should be masked out.\n This is useful when using recurrent layers which may take variable\n length input. If this is True, then all subsequent layers in the\n model need to support masking or an exception will be raised.\n If mask_zero` is set to True, as a consequence, index 0 cannot be\n used in the vocabulary\n (input_dim should equal size of vocabulary + 1).\n\n Examples:\n ```python\n seq_length = 50\n vocab_size = 5000\n embed_dim = 128\n inputs = keras.Input(shape=(seq_length,))\n embedding_layer = keras_nlp.layers.TokenAndPositionEmbedding(\n vocabulary_size=vocab_size,\n sequence_length=seq_length,\n embedding_dim=embed_dim,\n )\n outputs = embedding_layer(inputs)\n ```\n \"\"\"\n\n def __init__(\n self,\n vocabulary_size,\n sequence_length,\n embedding_dim,\n embeddings_initializer=\"glorot_uniform\",\n mask_zero=False,\n **kwargs\n ):\n super().__init__(**kwargs)\n if vocabulary_size is None:\n raise ValueError(\n \"`vocabulary_size` must be an Integer, received `None`.\"\n )\n if sequence_length is None:\n raise ValueError(\n \"`sequence_length` must be an Integer, received `None`.\"\n )\n if embedding_dim is None:\n raise ValueError(\n \"`embedding_dim` must be an Integer, received `None`.\"\n )\n self.vocabulary_size = int(vocabulary_size)\n self.sequence_length = int(sequence_length)\n self.embedding_dim = int(embedding_dim)\n self.token_embedding = keras.layers.Embedding(\n vocabulary_size,\n embedding_dim,\n embeddings_initializer=embeddings_initializer,\n mask_zero=mask_zero,\n )\n self.position_embedding = keras_nlp.layers.PositionEmbedding(\n sequence_length=sequence_length,\n initializer=embeddings_initializer,\n )\n self.supports_masking = self.token_embedding.supports_masking\n\n def get_config(self):\n config = super().get_config()\n config.update(\n {\n \"vocabulary_size\": self.vocabulary_size,\n \"sequence_length\": self.sequence_length,\n \"embedding_dim\": self.embedding_dim,\n \"embeddings_initializer\": keras.initializers.serialize(\n self.token_embedding.embeddings_initializer\n ),\n \"mask_zero\": self.token_embedding.mask_zero,\n },\n )\n return config\n\n def call(self, inputs):\n embedded_tokens = self.token_embedding(inputs)\n embedded_positions = self.position_embedding(embedded_tokens)\n outputs = embedded_tokens + embedded_positions\n return outputs\n\n def compute_mask(self, inputs, mask=None):\n return self.token_embedding.compute_mask(inputs, mask=mask)\n", "path": "keras_nlp/layers/token_and_position_embedding.py"}]}
| 1,477 | 153 |
gh_patches_debug_4313
|
rasdani/github-patches
|
git_diff
|
fossasia__open-event-server-352
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
HTML Template rendered when page doesn't exist in API
If a paginated API endpoint is called with a non-existant page number, a template is rendered which should never happen in case of REST APIs.
```
http http://localhost:5000/api/v1/event/page/2
HTTP/1.0 404 NOT FOUND
Content-Length: 1062
Content-Type: text/html; charset=utf-8
Date: Sat, 21 May 2016 07:51:38 GMT
Server: Werkzeug/0.11.7 Python/2.7.10
<!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
<title>You got 404'd</title>
<link href="/admin/static/bootstrap/bootstrap3/css/bootstrap.min.css" rel="stylesheet">
<link href="/static/admin/css/roboto.css" rel="stylesheet">
<link href="/static/admin/css/material-custom.css" rel="stylesheet">
</head>
<body>
<div class="container">
<div class="row">
<div class="col-md-push-3 col-md-6" style="margin-top: 20px;">
<div class="jumbotron">
<h2 style="font-weight: 100; ">Page Not Found</h2>
<p class="lead">Oops, the page you're looking for does not exist.</p>
<p style="font-size: 14px;">
You may want to head back to the homepage and restart your journey.
</p>
<a href="/" class="btn btn-large btn-info" style="background-color: #3f51b5;">
<i class="glyphicon glyphicon-home"></i> Take Me Home
</a>
</div>
</div>
</div>
</div>
</body>
</html>
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `open_event/helpers/object_formatter.py`
Content:
```
1 """Copyright 2015 Rafal Kowalski"""
2 from flask import jsonify
3
4 from .query_filter import QueryFilter
5
6
7 PER_PAGE = 20
8
9
10 class ObjectFormatter(object):
11 """Object formatter class"""
12 @staticmethod
13 def get_json(name, query, request, page=None):
14 """Returns formatted json"""
15 objects = QueryFilter(request.args, query).get_filtered_data()
16 count = objects.count()
17 if not page:
18 return jsonify(
19 {name: [
20 table_object.serialize
21 for table_object in
22 objects]})
23 else:
24 pagination = objects.paginate(page, PER_PAGE)
25 return jsonify({
26 name: [
27 table_object.serialize
28 for table_object in
29 pagination.items
30 ],
31 'total_pages': pagination.pages,
32 'page': pagination.page
33 })
34
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/open_event/helpers/object_formatter.py b/open_event/helpers/object_formatter.py
--- a/open_event/helpers/object_formatter.py
+++ b/open_event/helpers/object_formatter.py
@@ -21,6 +21,8 @@
for table_object in
objects]})
else:
+ if count <= ((page-1) * PER_PAGE): # no results possible
+ return jsonify({})
pagination = objects.paginate(page, PER_PAGE)
return jsonify({
name: [
|
{"golden_diff": "diff --git a/open_event/helpers/object_formatter.py b/open_event/helpers/object_formatter.py\n--- a/open_event/helpers/object_formatter.py\n+++ b/open_event/helpers/object_formatter.py\n@@ -21,6 +21,8 @@\n for table_object in\n objects]})\n else:\n+ if count <= ((page-1) * PER_PAGE): # no results possible\n+ return jsonify({})\n pagination = objects.paginate(page, PER_PAGE)\n return jsonify({\n name: [\n", "issue": "HTML Template rendered when page doesn't exist in API\nIf a paginated API endpoint is called with a non-existant page number, a template is rendered which should never happen in case of REST APIs.\n\n```\nhttp http://localhost:5000/api/v1/event/page/2\nHTTP/1.0 404 NOT FOUND\nContent-Length: 1062\nContent-Type: text/html; charset=utf-8\nDate: Sat, 21 May 2016 07:51:38 GMT\nServer: Werkzeug/0.11.7 Python/2.7.10\n\n<!DOCTYPE html>\n<html>\n<head lang=\"en\">\n <meta charset=\"UTF-8\">\n <title>You got 404'd</title>\n <link href=\"/admin/static/bootstrap/bootstrap3/css/bootstrap.min.css\" rel=\"stylesheet\">\n <link href=\"/static/admin/css/roboto.css\" rel=\"stylesheet\">\n <link href=\"/static/admin/css/material-custom.css\" rel=\"stylesheet\">\n</head>\n<body>\n<div class=\"container\">\n <div class=\"row\">\n <div class=\"col-md-push-3 col-md-6\" style=\"margin-top: 20px;\">\n <div class=\"jumbotron\">\n <h2 style=\"font-weight: 100; \">Page Not Found</h2>\n <p class=\"lead\">Oops, the page you're looking for does not exist.</p>\n <p style=\"font-size: 14px;\">\n You may want to head back to the homepage and restart your journey.\n </p>\n <a href=\"/\" class=\"btn btn-large btn-info\" style=\"background-color: #3f51b5;\">\n <i class=\"glyphicon glyphicon-home\"></i> Take Me Home\n </a>\n </div>\n </div>\n </div>\n</div>\n</body>\n</html>\n```\n\n", "before_files": [{"content": "\"\"\"Copyright 2015 Rafal Kowalski\"\"\"\nfrom flask import jsonify\n\nfrom .query_filter import QueryFilter\n\n\nPER_PAGE = 20\n\n\nclass ObjectFormatter(object):\n \"\"\"Object formatter class\"\"\"\n @staticmethod\n def get_json(name, query, request, page=None):\n \"\"\"Returns formatted json\"\"\"\n objects = QueryFilter(request.args, query).get_filtered_data()\n count = objects.count()\n if not page:\n return jsonify(\n {name: [\n table_object.serialize\n for table_object in\n objects]})\n else:\n pagination = objects.paginate(page, PER_PAGE)\n return jsonify({\n name: [\n table_object.serialize\n for table_object in\n pagination.items\n ],\n 'total_pages': pagination.pages,\n 'page': pagination.page\n })\n", "path": "open_event/helpers/object_formatter.py"}], "after_files": [{"content": "\"\"\"Copyright 2015 Rafal Kowalski\"\"\"\nfrom flask import jsonify\n\nfrom .query_filter import QueryFilter\n\n\nPER_PAGE = 20\n\n\nclass ObjectFormatter(object):\n \"\"\"Object formatter class\"\"\"\n @staticmethod\n def get_json(name, query, request, page=None):\n \"\"\"Returns formatted json\"\"\"\n objects = QueryFilter(request.args, query).get_filtered_data()\n count = objects.count()\n if not page:\n return jsonify(\n {name: [\n table_object.serialize\n for table_object in\n objects]})\n else:\n if count <= ((page-1) * PER_PAGE): # no results possible\n return jsonify({})\n pagination = objects.paginate(page, PER_PAGE)\n return jsonify({\n name: [\n table_object.serialize\n for table_object in\n pagination.items\n ],\n 'total_pages': pagination.pages,\n 'page': pagination.page\n })\n", "path": "open_event/helpers/object_formatter.py"}]}
| 909 | 104 |
gh_patches_debug_2905
|
rasdani/github-patches
|
git_diff
|
mabel-dev__opteryx-1689
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
🪲 VIEWs load error should be in debug mode only
### Thank you for taking the time to report a problem with Opteryx.
_To help us to respond to your request we ask that you try to provide the below detail about the bug._
**Describe the bug** _A clear and specific description of what the bug is. What the error, incorrect or unexpected behaviour was._
**Expected behaviour** _A clear and concise description of what you expected to happen._
**Sample Code/Statement** _If you can, please submit the SQL statement or Python code snippet, or a representative example using the sample datasets._
~~~sql
~~~
**Additional context** _Add any other context about the problem here, for example what you have done to try to diagnose or workaround the problem._
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opteryx/planner/views/__init__.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License");
2 # you may not use this file except in compliance with the License.
3 # You may obtain a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS,
9 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
10 # See the License for the specific language governing permissions and
11 # limitations under the License.
12
13 import orjson
14
15 from opteryx.planner.logical_planner import LogicalPlan
16
17
18 def _load_views():
19 try:
20 with open("views.json", "rb") as defs:
21 return orjson.loads(defs.read())
22 except Exception as err:
23 print(f"[OPTERYX] Unable to open views definition file. {err}")
24 return {}
25
26
27 VIEWS = _load_views()
28
29
30 def is_view(view_name: str) -> bool:
31 return view_name in VIEWS
32
33
34 def view_as_plan(view_name: str) -> LogicalPlan:
35 from opteryx.planner.logical_planner import do_logical_planning_phase
36 from opteryx.third_party import sqloxide
37 from opteryx.utils.sql import clean_statement
38 from opteryx.utils.sql import remove_comments
39
40 operation = VIEWS.get(view_name)["statement"]
41
42 clean_sql = clean_statement(remove_comments(operation))
43 parsed_statements = sqloxide.parse_sql(clean_sql, dialect="mysql")
44 logical_plan, _, _ = next(do_logical_planning_phase(parsed_statements))
45
46 return logical_plan
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/opteryx/planner/views/__init__.py b/opteryx/planner/views/__init__.py
--- a/opteryx/planner/views/__init__.py
+++ b/opteryx/planner/views/__init__.py
@@ -20,7 +20,7 @@
with open("views.json", "rb") as defs:
return orjson.loads(defs.read())
except Exception as err:
- print(f"[OPTERYX] Unable to open views definition file. {err}")
+ # DEBUG:: log (f"[OPTERYX] Unable to open views definition file. {err}")
return {}
|
{"golden_diff": "diff --git a/opteryx/planner/views/__init__.py b/opteryx/planner/views/__init__.py\n--- a/opteryx/planner/views/__init__.py\n+++ b/opteryx/planner/views/__init__.py\n@@ -20,7 +20,7 @@\n with open(\"views.json\", \"rb\") as defs:\n return orjson.loads(defs.read())\n except Exception as err:\n- print(f\"[OPTERYX] Unable to open views definition file. {err}\")\n+ # DEBUG:: log (f\"[OPTERYX] Unable to open views definition file. {err}\")\n return {}\n", "issue": "\ud83e\udeb2 VIEWs load error should be in debug mode only\n### Thank you for taking the time to report a problem with Opteryx.\r\n_To help us to respond to your request we ask that you try to provide the below detail about the bug._\r\n\r\n**Describe the bug** _A clear and specific description of what the bug is. What the error, incorrect or unexpected behaviour was._\r\n\r\n\r\n**Expected behaviour** _A clear and concise description of what you expected to happen._\r\n\r\n\r\n**Sample Code/Statement** _If you can, please submit the SQL statement or Python code snippet, or a representative example using the sample datasets._\r\n\r\n~~~sql\r\n\r\n~~~\r\n\r\n**Additional context** _Add any other context about the problem here, for example what you have done to try to diagnose or workaround the problem._\r\n\n", "before_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport orjson\n\nfrom opteryx.planner.logical_planner import LogicalPlan\n\n\ndef _load_views():\n try:\n with open(\"views.json\", \"rb\") as defs:\n return orjson.loads(defs.read())\n except Exception as err:\n print(f\"[OPTERYX] Unable to open views definition file. 
{err}\")\n return {}\n\n\nVIEWS = _load_views()\n\n\ndef is_view(view_name: str) -> bool:\n return view_name in VIEWS\n\n\ndef view_as_plan(view_name: str) -> LogicalPlan:\n from opteryx.planner.logical_planner import do_logical_planning_phase\n from opteryx.third_party import sqloxide\n from opteryx.utils.sql import clean_statement\n from opteryx.utils.sql import remove_comments\n\n operation = VIEWS.get(view_name)[\"statement\"]\n\n clean_sql = clean_statement(remove_comments(operation))\n parsed_statements = sqloxide.parse_sql(clean_sql, dialect=\"mysql\")\n logical_plan, _, _ = next(do_logical_planning_phase(parsed_statements))\n\n return logical_plan\n", "path": "opteryx/planner/views/__init__.py"}], "after_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport orjson\n\nfrom opteryx.planner.logical_planner import LogicalPlan\n\n\ndef _load_views():\n try:\n with open(\"views.json\", \"rb\") as defs:\n return orjson.loads(defs.read())\n except Exception as err:\n # DEBUG:: log (f\"[OPTERYX] Unable to open views definition file. {err}\")\n return {}\n\n\nVIEWS = _load_views()\n\n\ndef is_view(view_name: str) -> bool:\n return view_name in VIEWS\n\n\ndef view_as_plan(view_name: str) -> LogicalPlan:\n from opteryx.planner.logical_planner import do_logical_planning_phase\n from opteryx.third_party import sqloxide\n from opteryx.utils.sql import clean_statement\n from opteryx.utils.sql import remove_comments\n\n operation = VIEWS.get(view_name)[\"statement\"]\n\n clean_sql = clean_statement(remove_comments(operation))\n parsed_statements = sqloxide.parse_sql(clean_sql, dialect=\"mysql\")\n logical_plan, _, _ = next(do_logical_planning_phase(parsed_statements))\n\n return logical_plan\n", "path": "opteryx/planner/views/__init__.py"}]}
| 874 | 137 |
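The fix above silences the load-time `print` so a missing `views.json` no longer writes to stdout; the message survives only as a debug-mode comment marker. A minimal standalone sketch of the same idea — logging the failure at debug level instead of printing — is shown below; the module layout, logger, and file name are illustrative assumptions, not Opteryx's actual code (which uses `orjson`).

```python
import json
import logging

logger = logging.getLogger(__name__)


def load_views(path: str = "views.json") -> dict:
    """Load optional view definitions; a missing or broken file is not fatal."""
    try:
        with open(path, "rb") as defs:
            return json.load(defs)
    except Exception as err:
        # Surface the problem only to developers running with debug logging.
        logger.debug("Unable to open views definition file: %s", err)
        return {}
```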
gh_patches_debug_14047
|
rasdani/github-patches
|
git_diff
|
liqd__a4-opin-1146
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
add languages to project detail page in wagtail (user manuals)
The titles of the user manuals are only in English and there are no fields for translating them in Wagtail. Is it possible to add the languages? :)


--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `home/templatetags/base_tags.py`
Content:
```
1 import bleach
2 import feedparser
3 from dateutil import parser
4 from django import template
5 from django.conf import settings
6
7 from home.models.snippets import NavigationMenu
8
9 register = template.Library()
10
11
12 @register.assignment_tag(takes_context=True)
13 def get_site_root(context):
14 return context['request'].site.root_page
15
16
17 @register.inclusion_tag('tags/top_menu.html', takes_context=True)
18 def top_menu(context, parent, calling_page=None):
19 menuitems = parent.get_children().live().in_menu().specific()
20
21 return {
22 'calling_page': calling_page,
23 'menuitems': menuitems,
24 'request': context['request'],
25 }
26
27
28 @register.inclusion_tag('includes/rss_import.html', takes_context=True)
29 def import_rss(context, rss_import):
30
31 feeds = feedparser.parse(rss_import.url)
32 entries = feeds.entries[:2]
33
34 result = []
35
36 for entry in entries:
37 try:
38 published = parser.parse(entry["published"])
39 except Exception:
40 published = ''
41
42 result.append({
43 'published': published,
44 'title': entry.title,
45 'link': entry.link,
46 'description': bleach.clean(entry.summary,
47 tags=[],
48 attributes={},
49 styles=[],
50 strip=True
51 )
52 }
53 )
54
55 return {
56 'title': rss_import.translated_rss_title,
57 'entries': result
58 }
59
60
61 @register.assignment_tag(takes_context=False)
62 def load_site_menu(menu_name):
63 menu = NavigationMenu.objects.filter(menu_name=menu_name)
64
65 if menu:
66 return menu[0].menu_items.all()
67 else:
68 return None
69
70
71 @register.filter(name='clear_class')
72 def clear_class(columns_per_row, count):
73 if (count-1) % (12/int(columns_per_row)) == 0:
74 return "m-clear"
75 else:
76 return ""
77
78
79 @register.simple_tag
80 def settings_value(name):
81 return getattr(settings, name, "")
82
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/home/templatetags/base_tags.py b/home/templatetags/base_tags.py
--- a/home/templatetags/base_tags.py
+++ b/home/templatetags/base_tags.py
@@ -3,12 +3,22 @@
from dateutil import parser
from django import template
from django.conf import settings
+from django.core.exceptions import ObjectDoesNotExist
+from home.models.manual_pages import ManualsDetailPage
from home.models.snippets import NavigationMenu
register = template.Library()
[email protected]_tag(takes_context=True)
+def get_page_name(context, page):
+ try:
+ return ManualsDetailPage.objects.get(id=page.id).translated_title
+ except ObjectDoesNotExist:
+ return page
+
+
@register.assignment_tag(takes_context=True)
def get_site_root(context):
return context['request'].site.root_page
|
{"golden_diff": "diff --git a/home/templatetags/base_tags.py b/home/templatetags/base_tags.py\n--- a/home/templatetags/base_tags.py\n+++ b/home/templatetags/base_tags.py\n@@ -3,12 +3,22 @@\n from dateutil import parser\n from django import template\n from django.conf import settings\n+from django.core.exceptions import ObjectDoesNotExist\n \n+from home.models.manual_pages import ManualsDetailPage\n from home.models.snippets import NavigationMenu\n \n register = template.Library()\n \n \[email protected]_tag(takes_context=True)\n+def get_page_name(context, page):\n+ try:\n+ return ManualsDetailPage.objects.get(id=page.id).translated_title\n+ except ObjectDoesNotExist:\n+ return page\n+\n+\n @register.assignment_tag(takes_context=True)\n def get_site_root(context):\n return context['request'].site.root_page\n", "issue": "add languages to project detail page in wagtail (user manuals)\nThe titles of the user manuals or only in english and there are no fieldsfor translating them in wagtail. Is it possible to add the languages? :)\r\n\r\n\r\n\n", "before_files": [{"content": "import bleach\nimport feedparser\nfrom dateutil import parser\nfrom django import template\nfrom django.conf import settings\n\nfrom home.models.snippets import NavigationMenu\n\nregister = template.Library()\n\n\[email protected]_tag(takes_context=True)\ndef get_site_root(context):\n return context['request'].site.root_page\n\n\[email protected]_tag('tags/top_menu.html', takes_context=True)\ndef top_menu(context, parent, calling_page=None):\n menuitems = parent.get_children().live().in_menu().specific()\n\n return {\n 'calling_page': calling_page,\n 'menuitems': menuitems,\n 'request': context['request'],\n }\n\n\[email protected]_tag('includes/rss_import.html', takes_context=True)\ndef import_rss(context, rss_import):\n\n feeds = feedparser.parse(rss_import.url)\n entries = feeds.entries[:2]\n\n result = []\n\n for entry in entries:\n try:\n published = parser.parse(entry[\"published\"])\n except Exception:\n published = ''\n\n result.append({\n 'published': published,\n 'title': entry.title,\n 'link': entry.link,\n 'description': bleach.clean(entry.summary,\n tags=[],\n attributes={},\n styles=[],\n strip=True\n )\n }\n )\n\n return {\n 'title': rss_import.translated_rss_title,\n 'entries': result\n }\n\n\[email protected]_tag(takes_context=False)\ndef load_site_menu(menu_name):\n menu = NavigationMenu.objects.filter(menu_name=menu_name)\n\n if menu:\n return menu[0].menu_items.all()\n else:\n return None\n\n\[email protected](name='clear_class')\ndef clear_class(columns_per_row, count):\n if (count-1) % (12/int(columns_per_row)) == 0:\n return \"m-clear\"\n else:\n return \"\"\n\n\[email protected]_tag\ndef settings_value(name):\n return getattr(settings, name, \"\")\n", "path": "home/templatetags/base_tags.py"}], "after_files": [{"content": "import bleach\nimport feedparser\nfrom dateutil import parser\nfrom django import template\nfrom django.conf import settings\nfrom django.core.exceptions import ObjectDoesNotExist\n\nfrom home.models.manual_pages import ManualsDetailPage\nfrom home.models.snippets import NavigationMenu\n\nregister = template.Library()\n\n\[email protected]_tag(takes_context=True)\ndef get_page_name(context, page):\n try:\n return ManualsDetailPage.objects.get(id=page.id).translated_title\n except ObjectDoesNotExist:\n return page\n\n\[email protected]_tag(takes_context=True)\ndef get_site_root(context):\n return context['request'].site.root_page\n\n\[email protected]_tag('tags/top_menu.html', 
takes_context=True)\ndef top_menu(context, parent, calling_page=None):\n menuitems = parent.get_children().live().in_menu().specific()\n\n return {\n 'calling_page': calling_page,\n 'menuitems': menuitems,\n 'request': context['request'],\n }\n\n\[email protected]_tag('includes/rss_import.html', takes_context=True)\ndef import_rss(context, rss_import):\n\n feeds = feedparser.parse(rss_import.url)\n entries = feeds.entries[:2]\n\n result = []\n\n for entry in entries:\n try:\n published = parser.parse(entry[\"published\"])\n except Exception:\n published = ''\n\n result.append({\n 'published': published,\n 'title': entry.title,\n 'link': entry.link,\n 'description': bleach.clean(entry.summary,\n tags=[],\n attributes={},\n styles=[],\n strip=True\n )\n }\n )\n\n return {\n 'title': rss_import.translated_rss_title,\n 'entries': result\n }\n\n\[email protected]_tag(takes_context=False)\ndef load_site_menu(menu_name):\n menu = NavigationMenu.objects.filter(menu_name=menu_name)\n\n if menu:\n return menu[0].menu_items.all()\n else:\n return None\n\n\[email protected](name='clear_class')\ndef clear_class(columns_per_row, count):\n if (count-1) % (12/int(columns_per_row)) == 0:\n return \"m-clear\"\n else:\n return \"\"\n\n\[email protected]_tag\ndef settings_value(name):\n return getattr(settings, name, \"\")\n", "path": "home/templatetags/base_tags.py"}]}
| 1,014 | 191 |
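The template tag added above resolves a manual page's translated title via `ManualsDetailPage` and falls back to the page object when no matching record exists. A plain-Python sketch of that lookup-with-fallback pattern (no Django required; the dataclasses are hypothetical stand-ins for the Wagtail models):

```python
from dataclasses import dataclass


@dataclass
class Page:
    id: int
    title: str


@dataclass
class DetailPage:
    id: int
    translated_title: str


def get_page_name(page, detail_pages):
    """Return the translated title for `page` when a detail record exists, else the page itself."""
    try:
        return detail_pages[page.id].translated_title
    except KeyError:
        return page


# Example: only pages with a matching detail record get the translated title.
manuals = {1: DetailPage(1, "Benutzerhandbuch")}
print(get_page_name(Page(1, "User manual"), manuals))  # -> "Benutzerhandbuch"
print(get_page_name(Page(2, "Other page"), manuals))   # -> Page(id=2, title='Other page')
```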
gh_patches_debug_10988
|
rasdani/github-patches
|
git_diff
|
chainer__chainer-7738
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve chainerx import check
#7518 has disabled the following:
1. Check out the source code and `import chainer` without pip install.
2. pip install chainer in non-editable mode (with/without chainerx) and import chainer from the source root directory.
\1. should be supported. In 2., we should give a more comprehensible error.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainerx/__init__.py`
Content:
```
1 import os
2 import warnings
3
4 from chainerx import _build_info
5
6
7 if _build_info.build_chainerx:
8 from chainerx import _core
9 _available = True
10 else:
11 _available = False
12
13
14 if _available:
15 from numpy import dtype # NOQA
16 from numpy import ( # NOQA
17 Inf, Infinity, NAN, NINF, NZERO, NaN, PINF, PZERO,
18 e, euler_gamma,
19 inf, infty, nan,
20 newaxis,
21 pi)
22 from numpy import (
23 bool_, int8, int16, int32, int64, uint8, float16, float32, float64) # NOQA
24 all_dtypes = (
25 bool_, int8, int16, int32, int64, uint8, float16, float32, float64)
26
27 from chainerx._core import * # NOQA
28 from chainerx._core import _to_cupy # NOQA
29
30 from builtins import bool, int, float # NOQA
31
32 from chainerx import _device # NOQA
33
34 from chainerx.creation.from_data import asanyarray # NOQA
35 from chainerx.creation.from_data import fromfile # NOQA
36 from chainerx.creation.from_data import fromfunction # NOQA
37 from chainerx.creation.from_data import fromiter # NOQA
38 from chainerx.creation.from_data import fromstring # NOQA
39 from chainerx.creation.from_data import loadtxt # NOQA
40
41 from chainerx.manipulation.shape import ravel # NOQA
42
43 from chainerx.math.misc import clip # NOQA
44
45 from chainerx import random # NOQA
46
47 _global_context = _core.Context()
48 _core.set_global_default_context(_global_context)
49
50 # Implements ndarray methods in Python
51 from chainerx import _ndarray
52 _ndarray.populate()
53
54 # Temporary workaround implementations that fall back to NumPy/CuPy's
55 # respective functions.
56 from chainerx import _fallback_workarounds
57 _fallback_workarounds.populate()
58
59 # Dynamically inject docstrings
60 from chainerx import _docs
61 _docs.set_docs()
62
63 from chainerx import _cuda
64 # Share memory pool with CuPy.
65 if bool(int(os.getenv('CHAINERX_CUDA_CUPY_SHARE_ALLOCATOR', '0'))):
66 _cuda.cupy_share_allocator()
67 else:
68 class ndarray(object):
69
70 """Dummy class for type testing."""
71
72 def __init__(self, *args, **kwargs):
73 raise RuntimeError('chainerx is not available.')
74
75
76 def is_available():
77 return _available
78
79
80 if _available and _core._is_debug():
81 # Warn if the ChainerX core binary is built in debug mode
82 warnings.warn(
83 'ChainerX core binary is built in debug mode.', stacklevel=2)
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/chainerx/__init__.py b/chainerx/__init__.py
--- a/chainerx/__init__.py
+++ b/chainerx/__init__.py
@@ -1,8 +1,22 @@
import os
import warnings
-from chainerx import _build_info
-
+try:
+ from chainerx import _build_info
+except ImportError:
+ raise ImportError(
+ '''\
+Cannot import chainerx because _build_info.py cannot be found.
+
+The chainer and chainerx module being imported was not correctly \
+installed by `pip install`.
+
+It may be caused by either of the following reasons.
+
+1. You are directly importing chainer source files without installing it with \
+`pip install`.
+2. You installed chainer in non-editable mode (`pip install` without -e) and \
+are importing chainer source files instead of the installed module.''')
if _build_info.build_chainerx:
from chainerx import _core
|
{"golden_diff": "diff --git a/chainerx/__init__.py b/chainerx/__init__.py\n--- a/chainerx/__init__.py\n+++ b/chainerx/__init__.py\n@@ -1,8 +1,22 @@\n import os\n import warnings\n \n-from chainerx import _build_info\n-\n+try:\n+ from chainerx import _build_info\n+except ImportError:\n+ raise ImportError(\n+ '''\\\n+Cannot import chainerx because _build_info.py cannot be found.\n+\n+The chainer and chainerx module being imported was not correctly \\\n+installed by `pip install`.\n+\n+It may be caused by either of the following reasons.\n+\n+1. You are directly importing chainer source files without installing it with \\\n+`pip install`.\n+2. You installed chainer in non-editable mode (`pip install` without -e) and \\\n+are importing chainer source files instead of the installed module.''')\n \n if _build_info.build_chainerx:\n from chainerx import _core\n", "issue": "Improve chainerx import check\n#7518 has disabled the following:\r\n1. Check out the source code and `import chainer` without pip install.\r\n2. pip install chainer in non-editable mode (with/without chainerx) and import chainer from the source root directory.\r\n\r\n\\1. should be supported. In 2., we should give more comprehensible error.\n", "before_files": [{"content": "import os\nimport warnings\n\nfrom chainerx import _build_info\n\n\nif _build_info.build_chainerx:\n from chainerx import _core\n _available = True\nelse:\n _available = False\n\n\nif _available:\n from numpy import dtype # NOQA\n from numpy import ( # NOQA\n Inf, Infinity, NAN, NINF, NZERO, NaN, PINF, PZERO,\n e, euler_gamma,\n inf, infty, nan,\n newaxis,\n pi)\n from numpy import (\n bool_, int8, int16, int32, int64, uint8, float16, float32, float64) # NOQA\n all_dtypes = (\n bool_, int8, int16, int32, int64, uint8, float16, float32, float64)\n\n from chainerx._core import * # NOQA\n from chainerx._core import _to_cupy # NOQA\n\n from builtins import bool, int, float # NOQA\n\n from chainerx import _device # NOQA\n\n from chainerx.creation.from_data import asanyarray # NOQA\n from chainerx.creation.from_data import fromfile # NOQA\n from chainerx.creation.from_data import fromfunction # NOQA\n from chainerx.creation.from_data import fromiter # NOQA\n from chainerx.creation.from_data import fromstring # NOQA\n from chainerx.creation.from_data import loadtxt # NOQA\n\n from chainerx.manipulation.shape import ravel # NOQA\n\n from chainerx.math.misc import clip # NOQA\n\n from chainerx import random # NOQA\n\n _global_context = _core.Context()\n _core.set_global_default_context(_global_context)\n\n # Implements ndarray methods in Python\n from chainerx import _ndarray\n _ndarray.populate()\n\n # Temporary workaround implementations that fall back to NumPy/CuPy's\n # respective functions.\n from chainerx import _fallback_workarounds\n _fallback_workarounds.populate()\n\n # Dynamically inject docstrings\n from chainerx import _docs\n _docs.set_docs()\n\n from chainerx import _cuda\n # Share memory pool with CuPy.\n if bool(int(os.getenv('CHAINERX_CUDA_CUPY_SHARE_ALLOCATOR', '0'))):\n _cuda.cupy_share_allocator()\nelse:\n class ndarray(object):\n\n \"\"\"Dummy class for type testing.\"\"\"\n\n def __init__(self, *args, **kwargs):\n raise RuntimeError('chainerx is not available.')\n\n\ndef is_available():\n return _available\n\n\nif _available and _core._is_debug():\n # Warn if the ChainerX core binary is built in debug mode\n warnings.warn(\n 'ChainerX core binary is built in debug mode.', stacklevel=2)\n", "path": "chainerx/__init__.py"}], "after_files": 
[{"content": "import os\nimport warnings\n\ntry:\n from chainerx import _build_info\nexcept ImportError:\n raise ImportError(\n '''\\\nCannot import chainerx because _build_info.py cannot be found.\n\nThe chainer and chainerx module being imported was not correctly \\\ninstalled by `pip install`.\n\nIt may be caused by either of the following reasons.\n\n1. You are directly importing chainer source files without installing it with \\\n`pip install`.\n2. You installed chainer in non-editable mode (`pip install` without -e) and \\\nare importing chainer source files instead of the installed module.''')\n\nif _build_info.build_chainerx:\n from chainerx import _core\n _available = True\nelse:\n _available = False\n\n\nif _available:\n from numpy import dtype # NOQA\n from numpy import ( # NOQA\n Inf, Infinity, NAN, NINF, NZERO, NaN, PINF, PZERO,\n e, euler_gamma,\n inf, infty, nan,\n newaxis,\n pi)\n from numpy import (\n bool_, int8, int16, int32, int64, uint8, float16, float32, float64) # NOQA\n all_dtypes = (\n bool_, int8, int16, int32, int64, uint8, float16, float32, float64)\n\n from chainerx._core import * # NOQA\n from chainerx._core import _to_cupy # NOQA\n\n from builtins import bool, int, float # NOQA\n\n from chainerx import _device # NOQA\n\n from chainerx.creation.from_data import asanyarray # NOQA\n from chainerx.creation.from_data import fromfile # NOQA\n from chainerx.creation.from_data import fromfunction # NOQA\n from chainerx.creation.from_data import fromiter # NOQA\n from chainerx.creation.from_data import fromstring # NOQA\n from chainerx.creation.from_data import loadtxt # NOQA\n\n from chainerx.manipulation.shape import ravel # NOQA\n\n from chainerx.math.misc import clip # NOQA\n\n from chainerx import random # NOQA\n\n _global_context = _core.Context()\n _core.set_global_default_context(_global_context)\n\n # Implements ndarray methods in Python\n from chainerx import _ndarray\n _ndarray.populate()\n\n # Temporary workaround implementations that fall back to NumPy/CuPy's\n # respective functions.\n from chainerx import _fallback_workarounds\n _fallback_workarounds.populate()\n\n # Dynamically inject docstrings\n from chainerx import _docs\n _docs.set_docs()\n\n from chainerx import _cuda\n # Share memory pool with CuPy.\n if bool(int(os.getenv('CHAINERX_CUDA_CUPY_SHARE_ALLOCATOR', '0'))):\n _cuda.cupy_share_allocator()\nelse:\n class ndarray(object):\n\n \"\"\"Dummy class for type testing.\"\"\"\n\n def __init__(self, *args, **kwargs):\n raise RuntimeError('chainerx is not available.')\n\n\ndef is_available():\n return _available\n\n\nif _available and _core._is_debug():\n # Warn if the ChainerX core binary is built in debug mode\n warnings.warn(\n 'ChainerX core binary is built in debug mode.', stacklevel=2)\n", "path": "chainerx/__init__.py"}]}
| 1,172 | 222 |
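The patch above turns a bare `ImportError` on the generated `_build_info` module into an actionable message that explains the two likely installation mistakes. The same pattern, reduced to a reusable helper — the package name is a placeholder for illustration, not ChainerX's real layout:

```python
import importlib


def load_build_info(package: str):
    """Import `<package>._build_info`, re-raising with a clearer message if it is missing."""
    try:
        return importlib.import_module(package + "._build_info")
    except ImportError as err:
        raise ImportError(
            f"Cannot import {package} because _build_info.py cannot be found. "
            "This usually means the source tree is being imported without running "
            "`pip install`, or a non-editable install is being shadowed by the source checkout."
        ) from err
```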
gh_patches_debug_28311
|
rasdani/github-patches
|
git_diff
|
liqd__a4-meinberlin-3152
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
when bplan is archived, office worker update email is sent
**URL:**
**user:**
**expected behaviour:** Office workers only get an update email when bplan is "published, changed or created as draft"
**behaviour:** The automatic archiving also changes the bplan and thereby triggers the email. Thinking about it, it should also trigger the get-point thing (and might result in failure errors being thrown).
**important screensize:**
**device & browser:**
**Comment/Question:**
Screenshot?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/bplan/signals.py`
Content:
```
1 from django.db.models.signals import post_save
2 from django.dispatch import receiver
3
4 from . import emails
5 from . import tasks
6 from .models import Bplan
7 from .models import Statement
8
9
10 @receiver(post_save, sender=Bplan)
11 def get_location(sender, instance, update_fields, **kwargs):
12 if instance.identifier and (not update_fields or
13 'point' not in update_fields):
14 tasks.get_location_information(instance.pk)
15
16
17 @receiver(post_save, sender=Statement)
18 def send_notification(sender, instance, created, **kwargs):
19 if created:
20 emails.OfficeWorkerNotification.send(instance)
21
22 if instance.email:
23 emails.SubmitterConfirmation.send(instance)
24
25
26 @receiver(post_save, sender=Bplan)
27 def send_update(sender, instance, update_fields, **kwargs):
28 if not update_fields or 'point' not in update_fields:
29 emails.OfficeWorkerUpdateConfirmation.send(instance)
30
```
Path: `meinberlin/apps/bplan/management/commands/bplan_auto_archive.py`
Content:
```
1 from datetime import timedelta
2
3 from django.core.management.base import BaseCommand
4 from django.utils import timezone
5
6 from meinberlin.apps.bplan import models as bplan_models
7
8
9 class Command(BaseCommand):
10 help = 'Archive finished bplan projects and delete old statements.'
11
12 def handle(self, *args, **options):
13 bplans = bplan_models.Bplan.objects.filter(is_draft=False)
14 for bplan in bplans:
15 if bplan.has_finished and not bplan.is_archived:
16 bplan.is_archived = True
17 bplan.save()
18 self.stdout.write('Archived bplan {}.'.format(bplan.name))
19
20 # Delete statements of archived projects
21 # To prevent deleting statements that have not been sent by mail yet
22 # only statements older then 48h are deleted.
23 num_deleted, _ = bplan_models.Statement.objects\
24 .filter(module__project__is_archived=True)\
25 .filter(created__lt=timezone.now() - timedelta(hours=48))\
26 .delete()
27 if num_deleted:
28 self.stdout.write('Deleted {} statements from archived bplans.'
29 .format(num_deleted))
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py b/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py
--- a/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py
+++ b/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py
@@ -14,7 +14,9 @@
for bplan in bplans:
if bplan.has_finished and not bplan.is_archived:
bplan.is_archived = True
- bplan.save()
+ bplan.save(
+ update_fields=['is_archived']
+ )
self.stdout.write('Archived bplan {}.'.format(bplan.name))
# Delete statements of archived projects
diff --git a/meinberlin/apps/bplan/signals.py b/meinberlin/apps/bplan/signals.py
--- a/meinberlin/apps/bplan/signals.py
+++ b/meinberlin/apps/bplan/signals.py
@@ -10,7 +10,8 @@
@receiver(post_save, sender=Bplan)
def get_location(sender, instance, update_fields, **kwargs):
if instance.identifier and (not update_fields or
- 'point' not in update_fields):
+ ('point' not in update_fields and
+ 'is_archived' not in update_fields)):
tasks.get_location_information(instance.pk)
@@ -25,5 +26,6 @@
@receiver(post_save, sender=Bplan)
def send_update(sender, instance, update_fields, **kwargs):
- if not update_fields or 'point' not in update_fields:
+ if (not update_fields or ('point' not in update_fields and
+ 'is_archived' not in update_fields)):
emails.OfficeWorkerUpdateConfirmation.send(instance)
|
{"golden_diff": "diff --git a/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py b/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py\n--- a/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py\n+++ b/meinberlin/apps/bplan/management/commands/bplan_auto_archive.py\n@@ -14,7 +14,9 @@\n for bplan in bplans:\n if bplan.has_finished and not bplan.is_archived:\n bplan.is_archived = True\n- bplan.save()\n+ bplan.save(\n+ update_fields=['is_archived']\n+ )\n self.stdout.write('Archived bplan {}.'.format(bplan.name))\n \n # Delete statements of archived projects\ndiff --git a/meinberlin/apps/bplan/signals.py b/meinberlin/apps/bplan/signals.py\n--- a/meinberlin/apps/bplan/signals.py\n+++ b/meinberlin/apps/bplan/signals.py\n@@ -10,7 +10,8 @@\n @receiver(post_save, sender=Bplan)\n def get_location(sender, instance, update_fields, **kwargs):\n if instance.identifier and (not update_fields or\n- 'point' not in update_fields):\n+ ('point' not in update_fields and\n+ 'is_archived' not in update_fields)):\n tasks.get_location_information(instance.pk)\n \n \n@@ -25,5 +26,6 @@\n \n @receiver(post_save, sender=Bplan)\n def send_update(sender, instance, update_fields, **kwargs):\n- if not update_fields or 'point' not in update_fields:\n+ if (not update_fields or ('point' not in update_fields and\n+ 'is_archived' not in update_fields)):\n emails.OfficeWorkerUpdateConfirmation.send(instance)\n", "issue": "when bplan is archived, office worker update email is sent\n**URL:** \r\n**user:** \r\n**expected behaviour:** Office workers only get an update email when bplan is \"published, changed or created as draft\"\r\n**behaviour:** The automatic archiving also changes the bplan and by that triggers the email. Thinking about it, it should also trigger the get-point thing (and might result in failure errors being thrown).\r\n**important screensize:**\r\n**device & browser:** \r\n**Comment/Question:** \r\n\r\nScreenshot?\r\n\n", "before_files": [{"content": "from django.db.models.signals import post_save\nfrom django.dispatch import receiver\n\nfrom . import emails\nfrom . 
import tasks\nfrom .models import Bplan\nfrom .models import Statement\n\n\n@receiver(post_save, sender=Bplan)\ndef get_location(sender, instance, update_fields, **kwargs):\n if instance.identifier and (not update_fields or\n 'point' not in update_fields):\n tasks.get_location_information(instance.pk)\n\n\n@receiver(post_save, sender=Statement)\ndef send_notification(sender, instance, created, **kwargs):\n if created:\n emails.OfficeWorkerNotification.send(instance)\n\n if instance.email:\n emails.SubmitterConfirmation.send(instance)\n\n\n@receiver(post_save, sender=Bplan)\ndef send_update(sender, instance, update_fields, **kwargs):\n if not update_fields or 'point' not in update_fields:\n emails.OfficeWorkerUpdateConfirmation.send(instance)\n", "path": "meinberlin/apps/bplan/signals.py"}, {"content": "from datetime import timedelta\n\nfrom django.core.management.base import BaseCommand\nfrom django.utils import timezone\n\nfrom meinberlin.apps.bplan import models as bplan_models\n\n\nclass Command(BaseCommand):\n help = 'Archive finished bplan projects and delete old statements.'\n\n def handle(self, *args, **options):\n bplans = bplan_models.Bplan.objects.filter(is_draft=False)\n for bplan in bplans:\n if bplan.has_finished and not bplan.is_archived:\n bplan.is_archived = True\n bplan.save()\n self.stdout.write('Archived bplan {}.'.format(bplan.name))\n\n # Delete statements of archived projects\n # To prevent deleting statements that have not been sent by mail yet\n # only statements older then 48h are deleted.\n num_deleted, _ = bplan_models.Statement.objects\\\n .filter(module__project__is_archived=True)\\\n .filter(created__lt=timezone.now() - timedelta(hours=48))\\\n .delete()\n if num_deleted:\n self.stdout.write('Deleted {} statements from archived bplans.'\n .format(num_deleted))\n", "path": "meinberlin/apps/bplan/management/commands/bplan_auto_archive.py"}], "after_files": [{"content": "from django.db.models.signals import post_save\nfrom django.dispatch import receiver\n\nfrom . import emails\nfrom . 
import tasks\nfrom .models import Bplan\nfrom .models import Statement\n\n\n@receiver(post_save, sender=Bplan)\ndef get_location(sender, instance, update_fields, **kwargs):\n if instance.identifier and (not update_fields or\n ('point' not in update_fields and\n 'is_archived' not in update_fields)):\n tasks.get_location_information(instance.pk)\n\n\n@receiver(post_save, sender=Statement)\ndef send_notification(sender, instance, created, **kwargs):\n if created:\n emails.OfficeWorkerNotification.send(instance)\n\n if instance.email:\n emails.SubmitterConfirmation.send(instance)\n\n\n@receiver(post_save, sender=Bplan)\ndef send_update(sender, instance, update_fields, **kwargs):\n if (not update_fields or ('point' not in update_fields and\n 'is_archived' not in update_fields)):\n emails.OfficeWorkerUpdateConfirmation.send(instance)\n", "path": "meinberlin/apps/bplan/signals.py"}, {"content": "from datetime import timedelta\n\nfrom django.core.management.base import BaseCommand\nfrom django.utils import timezone\n\nfrom meinberlin.apps.bplan import models as bplan_models\n\n\nclass Command(BaseCommand):\n help = 'Archive finished bplan projects and delete old statements.'\n\n def handle(self, *args, **options):\n bplans = bplan_models.Bplan.objects.filter(is_draft=False)\n for bplan in bplans:\n if bplan.has_finished and not bplan.is_archived:\n bplan.is_archived = True\n bplan.save(\n update_fields=['is_archived']\n )\n self.stdout.write('Archived bplan {}.'.format(bplan.name))\n\n # Delete statements of archived projects\n # To prevent deleting statements that have not been sent by mail yet\n # only statements older then 48h are deleted.\n num_deleted, _ = bplan_models.Statement.objects\\\n .filter(module__project__is_archived=True)\\\n .filter(created__lt=timezone.now() - timedelta(hours=48))\\\n .delete()\n if num_deleted:\n self.stdout.write('Deleted {} statements from archived bplans.'\n .format(num_deleted))\n", "path": "meinberlin/apps/bplan/management/commands/bplan_auto_archive.py"}]}
| 932 | 395 |
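The key idea in the diff above is that the archiving command now saves with `update_fields=['is_archived']`, and both `post_save` receivers skip their side effects when only `point` or `is_archived` changed. The guard itself is just a predicate on `update_fields`; a minimal sketch mirroring the patched condition:

```python
def should_trigger_side_effects(update_fields) -> bool:
    """Mirror of the patched guard: act unless the save flagged 'point' or 'is_archived'."""
    return not update_fields or (
        "point" not in update_fields and "is_archived" not in update_fields
    )


# Archiving: bplan.save(update_fields=['is_archived']) -> no email, no geocoding task.
print(should_trigger_side_effects(["is_archived"]))  # False
# Regular edit (full save, update_fields is None) -> office worker email is sent.
print(should_trigger_side_effects(None))             # True
```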
gh_patches_debug_31063
|
rasdani/github-patches
|
git_diff
|
aws-cloudformation__cfn-lint-781
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Single-subnet ELB detection
*cfn-lint version: 0.10.3*
*Description of issue.*
When configuring `AWS::ElasticLoadBalancingV2::LoadBalancer`, if you specify only 1 subnet, you get back from AWS:
> 2019-02-05 12:22:39 +1100 Elb AWS::ElasticLoadBalancingV2::LoadBalancer CREATE_FAILED At least two subnets in two different Availability Zones must be specified (Service: AmazonElasticLoadBalancingV2; Status Code: 400; Error Code: ValidationError; Request ID: ...)
This could be covered by a cfn-lint check.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cfnlint/rules/resources/elb/Elb.py`
Content:
```
1 """
2 Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
3
4 Permission is hereby granted, free of charge, to any person obtaining a copy of this
5 software and associated documentation files (the "Software"), to deal in the Software
6 without restriction, including without limitation the rights to use, copy, modify,
7 merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
8 permit persons to whom the Software is furnished to do so.
9
10 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
11 INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
12 PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
13 HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
14 OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
15 SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
16 """
17 import six
18 from cfnlint import CloudFormationLintRule
19 from cfnlint import RuleMatch
20
21
22 class Elb(CloudFormationLintRule):
23 """Check if Elb Resource Properties"""
24 id = 'E2503'
25 shortdesc = 'Resource ELB Properties'
26 description = 'See if Elb Resource Properties are set correctly \
27 HTTPS has certificate HTTP has no certificate'
28 source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-ec2-elb-listener.html'
29 tags = ['properties', 'elb']
30
31 def check_protocol_value(self, value, path, **kwargs):
32 """
33 Check Protocol Value
34 """
35 matches = []
36 if isinstance(value, six.string_types):
37 if value.upper() not in kwargs['accepted_protocols']:
38 message = 'Protocol must be {0} is invalid at {1}'
39 matches.append(RuleMatch(path, message.format((', '.join(kwargs['accepted_protocols'])), ('/'.join(map(str, path))))))
40 elif value.upper() in kwargs['certificate_protocols']:
41 if not kwargs['certificates']:
42 message = 'Certificates should be specified when using HTTPS for {0}'
43 matches.append(RuleMatch(path, message.format(('/'.join(map(str, path))))))
44
45 return matches
46
47 def match(self, cfn):
48 """Check ELB Resource Parameters"""
49
50 matches = []
51
52 results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::Listener'])
53 for result in results:
54 matches.extend(
55 cfn.check_value(
56 result['Value'], 'Protocol', result['Path'],
57 check_value=self.check_protocol_value,
58 accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'TLS'],
59 certificate_protocols=['HTTPS', 'TLS'],
60 certificates=result['Value'].get('Certificates')))
61
62 results = cfn.get_resource_properties(['AWS::ElasticLoadBalancing::LoadBalancer', 'Listeners'])
63 for result in results:
64 if isinstance(result['Value'], list):
65 for index, listener in enumerate(result['Value']):
66 matches.extend(
67 cfn.check_value(
68 listener, 'Protocol', result['Path'] + [index],
69 check_value=self.check_protocol_value,
70 accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'SSL'],
71 certificate_protocols=['HTTPS', 'SSL'],
72 certificates=listener.get('SSLCertificateId')))
73
74 results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::LoadBalancer'])
75 for result in results:
76 properties = result['Value']
77 if 'Type' in properties and properties['Type'] == 'network':
78 if 'SecurityGroups' in properties:
79 path = result['Path'] + ['SecurityGroups']
80 matches.append(RuleMatch(path, 'Security groups are not supported for load balancers with type "network"'))
81
82 return matches
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cfnlint/rules/resources/elb/Elb.py b/src/cfnlint/rules/resources/elb/Elb.py
--- a/src/cfnlint/rules/resources/elb/Elb.py
+++ b/src/cfnlint/rules/resources/elb/Elb.py
@@ -44,6 +44,24 @@
return matches
+ def check_alb_subnets(self, props, path):
+ """ Validate at least two subnets with ALBs"""
+ matches = []
+ elb_type = props.get('Type')
+ if elb_type == 'application':
+ subnets = props.get('Subnets')
+ if isinstance(subnets, list):
+ if len(subnets) < 2:
+ path = path + ['Subnets']
+ matches.append(RuleMatch(path, 'You must specify at least two Subnets for load balancers with type "application"'))
+ subnet_mappings = props.get('SubnetMappings')
+ if isinstance(subnet_mappings, list):
+ if len(subnet_mappings) < 2:
+ path = path + ['SubnetMappings']
+ matches.append(RuleMatch(path, 'You must specify at least two SubnetMappings for load balancers with type "application"'))
+
+ return matches
+
def match(self, cfn):
"""Check ELB Resource Parameters"""
@@ -74,9 +92,12 @@
results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::LoadBalancer'])
for result in results:
properties = result['Value']
- if 'Type' in properties and properties['Type'] == 'network':
+ elb_type = properties.get('Type')
+ if elb_type == 'network':
if 'SecurityGroups' in properties:
path = result['Path'] + ['SecurityGroups']
matches.append(RuleMatch(path, 'Security groups are not supported for load balancers with type "network"'))
+ matches.extend(self.check_alb_subnets(properties, result['Path']))
+
return matches
|
{"golden_diff": "diff --git a/src/cfnlint/rules/resources/elb/Elb.py b/src/cfnlint/rules/resources/elb/Elb.py\n--- a/src/cfnlint/rules/resources/elb/Elb.py\n+++ b/src/cfnlint/rules/resources/elb/Elb.py\n@@ -44,6 +44,24 @@\n \n return matches\n \n+ def check_alb_subnets(self, props, path):\n+ \"\"\" Validate at least two subnets with ALBs\"\"\"\n+ matches = []\n+ elb_type = props.get('Type')\n+ if elb_type == 'application':\n+ subnets = props.get('Subnets')\n+ if isinstance(subnets, list):\n+ if len(subnets) < 2:\n+ path = path + ['Subnets']\n+ matches.append(RuleMatch(path, 'You must specify at least two Subnets for load balancers with type \"application\"'))\n+ subnet_mappings = props.get('SubnetMappings')\n+ if isinstance(subnet_mappings, list):\n+ if len(subnet_mappings) < 2:\n+ path = path + ['SubnetMappings']\n+ matches.append(RuleMatch(path, 'You must specify at least two SubnetMappings for load balancers with type \"application\"'))\n+\n+ return matches\n+\n def match(self, cfn):\n \"\"\"Check ELB Resource Parameters\"\"\"\n \n@@ -74,9 +92,12 @@\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::LoadBalancer'])\n for result in results:\n properties = result['Value']\n- if 'Type' in properties and properties['Type'] == 'network':\n+ elb_type = properties.get('Type')\n+ if elb_type == 'network':\n if 'SecurityGroups' in properties:\n path = result['Path'] + ['SecurityGroups']\n matches.append(RuleMatch(path, 'Security groups are not supported for load balancers with type \"network\"'))\n \n+ matches.extend(self.check_alb_subnets(properties, result['Path']))\n+\n return matches\n", "issue": "Single-subnet ELB detection\n*cfn-lint version: 0.10.3*\r\n\r\n*Description of issue.*\r\n\r\nWhen configuring `AWS::ElasticLoadBalancingV2::LoadBalancer`, if you specify only 1 subnet, you get back from AWS:\r\n\r\n> 2019-02-05 12:22:39 +1100 Elb AWS::ElasticLoadBalancingV2::LoadBalancer CREATE_FAILED At least two subnets in two different Availability Zones must be specified (Service: AmazonElasticLoadBalancingV2; Status Code: 400; Error Code: ValidationError; Request ID: ...)\r\n\r\nThis could be covered by a cfn-lint check.\n", "before_files": [{"content": "\"\"\"\n Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport six\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass Elb(CloudFormationLintRule):\n \"\"\"Check if Elb Resource Properties\"\"\"\n id = 'E2503'\n shortdesc = 'Resource ELB Properties'\n description = 'See if Elb Resource Properties are set correctly \\\nHTTPS has certificate HTTP has no certificate'\n source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-ec2-elb-listener.html'\n tags = ['properties', 'elb']\n\n def check_protocol_value(self, value, path, **kwargs):\n \"\"\"\n Check Protocol Value\n \"\"\"\n matches = []\n if isinstance(value, six.string_types):\n if value.upper() not in kwargs['accepted_protocols']:\n message = 'Protocol must be {0} is invalid at {1}'\n matches.append(RuleMatch(path, message.format((', '.join(kwargs['accepted_protocols'])), ('/'.join(map(str, path))))))\n elif value.upper() in kwargs['certificate_protocols']:\n if not kwargs['certificates']:\n message = 'Certificates should be specified when using HTTPS for {0}'\n matches.append(RuleMatch(path, message.format(('/'.join(map(str, path))))))\n\n return matches\n\n def match(self, cfn):\n \"\"\"Check ELB Resource Parameters\"\"\"\n\n matches = []\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::Listener'])\n for result in results:\n matches.extend(\n cfn.check_value(\n result['Value'], 'Protocol', result['Path'],\n check_value=self.check_protocol_value,\n accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'TLS'],\n certificate_protocols=['HTTPS', 'TLS'],\n certificates=result['Value'].get('Certificates')))\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancing::LoadBalancer', 'Listeners'])\n for result in results:\n if isinstance(result['Value'], list):\n for index, listener in enumerate(result['Value']):\n matches.extend(\n cfn.check_value(\n listener, 'Protocol', result['Path'] + [index],\n check_value=self.check_protocol_value,\n accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'SSL'],\n certificate_protocols=['HTTPS', 'SSL'],\n certificates=listener.get('SSLCertificateId')))\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::LoadBalancer'])\n for result in results:\n properties = result['Value']\n if 'Type' in properties and properties['Type'] == 'network':\n if 'SecurityGroups' in properties:\n path = result['Path'] + ['SecurityGroups']\n matches.append(RuleMatch(path, 'Security groups are not supported for load balancers with type \"network\"'))\n\n return matches\n", "path": "src/cfnlint/rules/resources/elb/Elb.py"}], "after_files": [{"content": "\"\"\"\n Copyright 2019 Amazon.com, Inc. or its affiliates. 
All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport six\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass Elb(CloudFormationLintRule):\n \"\"\"Check if Elb Resource Properties\"\"\"\n id = 'E2503'\n shortdesc = 'Resource ELB Properties'\n description = 'See if Elb Resource Properties are set correctly \\\nHTTPS has certificate HTTP has no certificate'\n source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-ec2-elb-listener.html'\n tags = ['properties', 'elb']\n\n def check_protocol_value(self, value, path, **kwargs):\n \"\"\"\n Check Protocol Value\n \"\"\"\n matches = []\n if isinstance(value, six.string_types):\n if value.upper() not in kwargs['accepted_protocols']:\n message = 'Protocol must be {0} is invalid at {1}'\n matches.append(RuleMatch(path, message.format((', '.join(kwargs['accepted_protocols'])), ('/'.join(map(str, path))))))\n elif value.upper() in kwargs['certificate_protocols']:\n if not kwargs['certificates']:\n message = 'Certificates should be specified when using HTTPS for {0}'\n matches.append(RuleMatch(path, message.format(('/'.join(map(str, path))))))\n\n return matches\n\n def check_alb_subnets(self, props, path):\n \"\"\" Validate at least two subnets with ALBs\"\"\"\n matches = []\n elb_type = props.get('Type')\n if elb_type == 'application':\n subnets = props.get('Subnets')\n if isinstance(subnets, list):\n if len(subnets) < 2:\n path = path + ['Subnets']\n matches.append(RuleMatch(path, 'You must specify at least two Subnets for load balancers with type \"application\"'))\n subnet_mappings = props.get('SubnetMappings')\n if isinstance(subnet_mappings, list):\n if len(subnet_mappings) < 2:\n path = path + ['SubnetMappings']\n matches.append(RuleMatch(path, 'You must specify at least two SubnetMappings for load balancers with type \"application\"'))\n\n return matches\n\n def match(self, cfn):\n \"\"\"Check ELB Resource Parameters\"\"\"\n\n matches = []\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::Listener'])\n for result in results:\n matches.extend(\n cfn.check_value(\n result['Value'], 'Protocol', result['Path'],\n check_value=self.check_protocol_value,\n accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'TLS'],\n certificate_protocols=['HTTPS', 'TLS'],\n certificates=result['Value'].get('Certificates')))\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancing::LoadBalancer', 'Listeners'])\n for result in results:\n if isinstance(result['Value'], list):\n for index, listener in enumerate(result['Value']):\n matches.extend(\n cfn.check_value(\n listener, 'Protocol', result['Path'] + [index],\n 
check_value=self.check_protocol_value,\n accepted_protocols=['HTTP', 'HTTPS', 'TCP', 'SSL'],\n certificate_protocols=['HTTPS', 'SSL'],\n certificates=listener.get('SSLCertificateId')))\n\n results = cfn.get_resource_properties(['AWS::ElasticLoadBalancingV2::LoadBalancer'])\n for result in results:\n properties = result['Value']\n elb_type = properties.get('Type')\n if elb_type == 'network':\n if 'SecurityGroups' in properties:\n path = result['Path'] + ['SecurityGroups']\n matches.append(RuleMatch(path, 'Security groups are not supported for load balancers with type \"network\"'))\n\n matches.extend(self.check_alb_subnets(properties, result['Path']))\n\n return matches\n", "path": "src/cfnlint/rules/resources/elb/Elb.py"}]}
| 1,400 | 451 |
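The new `check_alb_subnets` rule above flags application load balancers declared with fewer than two `Subnets` or `SubnetMappings`, catching at lint time the validation error quoted in the issue. Stripped of cfn-lint's `RuleMatch` and path plumbing, the check reduces to roughly the following sketch:

```python
def check_alb_subnets(props: dict) -> list:
    """Return findings for application load balancers with fewer than two subnets."""
    findings = []
    if props.get("Type") != "application":
        return findings
    for key in ("Subnets", "SubnetMappings"):
        value = props.get(key)
        if isinstance(value, list) and len(value) < 2:
            findings.append(
                f'You must specify at least two {key} for load balancers with type "application"'
            )
    return findings


# A single-subnet ALB is reported before CloudFormation ever calls the ELB API.
print(check_alb_subnets({"Type": "application", "Subnets": ["subnet-0abc"]}))
```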
gh_patches_debug_19382
|
rasdani/github-patches
|
git_diff
|
pymodbus-dev__pymodbus-931
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
callback_server.py example error
<!--
Please use the Pymodbus gitter channel at https://gitter.im/pymodbus_dev/Lobby or Stack Overflow(tag [pymodbus](https://stackoverflow.com/questions/tagged/pymodbus) for
support questions.
Before opening a new issue, make sure you do the following:
* check that your issue isn't already filed: https://github.com/riptideio/pymodbus/issues
* prepare a short, runnable example that reproduce the issue with the latest development version of Pymodbus
-->
### Versions
* Python: 3.7
* OS: Ubuntu 20
* Pymodbus: 2.5.0
* Modbus Hardware (if used):
### Pymodbus Specific
* Server: rtu - sync
* Client: rtu - sync
### Description
When I adapt the "callback_server.py" example, I make the queue client modify the datastore, but my client doesn't see the changes.
After some work, it turned out to be due to the use of Process creation for the queue listener, which makes its changes in its own scope.
### Correction
change the queue Process listener creation :
```python
p = Process(target=device_writer, args=(queue,))
p.start()
```
with a Thread-based queue listener, whose changes can be seen by my client because the modifications are made in the same server scope:
```python
from threading import Thread
...
t = Thread(target = device_writer, args=(queue,))
t.start()
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/common/callback_server.py`
Content:
```
1 #!/usr/bin/env python3
2 # pylint: disable=missing-type-doc,missing-param-doc,differing-param-doc
3 """Pymodbus Server With Callbacks.
4
5 This is an example of adding callbacks to a running modbus server
6 when a value is written to it. In order for this to work, it needs
7 a device-mapping file.
8 """
9 import logging
10 from multiprocessing import Queue, Process
11
12 # --------------------------------------------------------------------------- #
13 # import the modbus libraries we need
14 # --------------------------------------------------------------------------- #
15 from pymodbus.version import version
16 from pymodbus.server.asynchronous import StartTcpServer
17 from pymodbus.device import ModbusDeviceIdentification
18 from pymodbus.datastore import ModbusSparseDataBlock
19 from pymodbus.datastore import ModbusSlaveContext, ModbusServerContext
20
21 # from pymodbus.transaction import ModbusRtuFramer, ModbusAsciiFramer
22
23
24 # --------------------------------------------------------------------------- #
25 # configure the service logging
26 # --------------------------------------------------------------------------- #
27 log = logging.getLogger()
28 log.setLevel(logging.DEBUG)
29
30 # --------------------------------------------------------------------------- #
31 # create your custom data block with callbacks
32 # --------------------------------------------------------------------------- #
33
34
35 class CallbackDataBlock(ModbusSparseDataBlock):
36 """A datablock that stores the new value in memory,
37
38 and passes the operation to a message queue for further processing.
39 """
40
41 def __init__(self, devices, queue):
42 """Initialize."""
43 self.devices = devices
44 self.queue = queue
45
46 values = {k: 0 for k in devices.keys()}
47 values[0xBEEF] = len(values) # the number of devices
48 super().__init__(values)
49
50 def setValues(self, address, value): # pylint: disable=arguments-differ
51 """Set the requested values of the datastore
52
53 :param address: The starting address
54 :param values: The new values to be set
55 """
56 super().setValues(address, value)
57 self.queue.put((self.devices.get(address, None), value))
58
59
60 # --------------------------------------------------------------------------- #
61 # define your callback process
62 # --------------------------------------------------------------------------- #
63
64
65 def rescale_value(value):
66 """Rescale the input value from the range of 0..100 to -3200..3200.
67
68 :param value: The input value to scale
69 :returns: The rescaled value
70 """
71 scale = 1 if value >= 50 else -1
72 cur = value if value < 50 else (value - 50)
73 return scale * (cur * 64)
74
75
76 def device_writer(queue):
77 """Process new messages from a queue to write to device outputs
78
79 :param queue: The queue to get new messages from
80 """
81 while True:
82 device, value = queue.get()
83 rescale_value(value[0])
84 txt = f"Write({device}) = {value}"
85 log.debug(txt)
86 if not device:
87 continue
88 # do any logic here to update your devices
89
90
91 # --------------------------------------------------------------------------- #
92 # initialize your device map
93 # --------------------------------------------------------------------------- #
94
95
96 def read_device_map(path):
97 """Read the device path to address mapping from file::
98
99 0x0001,/dev/device1
100 0x0002,/dev/device2
101
102 :param path: The path to the input file
103 :returns: The input mapping file
104 """
105 devices = {}
106 with open(path, "r") as stream: # pylint: disable=unspecified-encoding
107 for line in stream:
108 piece = line.strip().split(",")
109 devices[int(piece[0], 16)] = piece[1]
110 return devices
111
112
113 def run_callback_server():
114 """Run callback server."""
115 # ----------------------------------------------------------------------- #
116 # initialize your data store
117 # ----------------------------------------------------------------------- #
118 queue = Queue()
119 devices = read_device_map("device-mapping")
120 block = CallbackDataBlock(devices, queue)
121 store = ModbusSlaveContext(di=block, co=block, hr=block, ir=block)
122 context = ModbusServerContext(slaves=store, single=True)
123
124 # ----------------------------------------------------------------------- #
125 # initialize the server information
126 # ----------------------------------------------------------------------- #
127 identity = ModbusDeviceIdentification(
128 info_name={
129 "VendorName": "pymodbus",
130 "ProductCode": "PM",
131 "VendorUrl": "http://github.com/riptideio/pymodbus/",
132 "ProductName": "pymodbus Server",
133 "ModelName": "pymodbus Server",
134 "MajorMinorRevision": version.short(),
135 }
136 )
137
138 # ----------------------------------------------------------------------- #
139 # run the server you want
140 # ----------------------------------------------------------------------- #
141 proc = Process(target=device_writer, args=(queue,))
142 proc.start()
143 StartTcpServer(context, identity=identity, address=("localhost", 5020))
144
145
146 if __name__ == "__main__":
147 run_callback_server()
148
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/common/callback_server.py b/examples/common/callback_server.py
--- a/examples/common/callback_server.py
+++ b/examples/common/callback_server.py
@@ -7,7 +7,8 @@
a device-mapping file.
"""
import logging
-from multiprocessing import Queue, Process
+from multiprocessing import Queue
+from threading import Thread
# --------------------------------------------------------------------------- #
# import the modbus libraries we need
@@ -138,8 +139,8 @@
# ----------------------------------------------------------------------- #
# run the server you want
# ----------------------------------------------------------------------- #
- proc = Process(target=device_writer, args=(queue,))
- proc.start()
+ thread = Thread(target=device_writer, args=(queue,))
+ thread.start()
StartTcpServer(context, identity=identity, address=("localhost", 5020))
|
{"golden_diff": "diff --git a/examples/common/callback_server.py b/examples/common/callback_server.py\n--- a/examples/common/callback_server.py\n+++ b/examples/common/callback_server.py\n@@ -7,7 +7,8 @@\n a device-mapping file.\n \"\"\"\n import logging\n-from multiprocessing import Queue, Process\n+from multiprocessing import Queue\n+from threading import Thread\n \n # --------------------------------------------------------------------------- #\n # import the modbus libraries we need\n@@ -138,8 +139,8 @@\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n- proc = Process(target=device_writer, args=(queue,))\n- proc.start()\n+ thread = Thread(target=device_writer, args=(queue,))\n+ thread.start()\n StartTcpServer(context, identity=identity, address=(\"localhost\", 5020))\n", "issue": "callback_server.py example error\n<!--\r\nPlease use the Pymodbus gitter channel at https://gitter.im/pymodbus_dev/Lobby or Stack Overflow(tag [pymodbus](https://stackoverflow.com/questions/tagged/pymodbus) for\r\nsupport questions.\r\n\r\nBefore opening a new issue, make sure you do the following:\r\n * check that your issue isn't already filed: https://github.com/riptideio/pymodbus/issues\r\n * prepare a short, runnable example that reproduce the issue with the latest development version of Pymodbus\r\n-->\r\n\r\n### Versions\r\n\r\n* Python: 3.7\r\n* OS: Ubuntu 20\r\n* Pymodbus: 2.5.0\r\n* Modbus Hardware (if used): \r\n\r\n### Pymodbus Specific\r\n* Server: rtu - sync\r\n* Client: rtu - sync\r\n\r\n### Description\r\n\r\nWhen I adapt the \"callback_server.py\" example, I make the queue client modify the datastore, but my client doesn't see the changes.\r\nafter some work, it's due to the use of Process creation for the queue listener that makes changes in it's own scope.\r\n\r\n### Correction\r\nchange the queue Process listener creation :\r\n```python\r\n p = Process(target=device_writer, args=(queue,))\r\n p.start()\r\n```\r\nby a Thread queue listenrer makes changes could be seen by my client because of modification made in the same server scope:\r\n```python\r\nfrom threading import Thread\r\n...\r\n t = Thread(target = device_writer, args=(queue,))\r\n t.start()\r\n```\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n# pylint: disable=missing-type-doc,missing-param-doc,differing-param-doc\n\"\"\"Pymodbus Server With Callbacks.\n\nThis is an example of adding callbacks to a running modbus server\nwhen a value is written to it. 
In order for this to work, it needs\na device-mapping file.\n\"\"\"\nimport logging\nfrom multiprocessing import Queue, Process\n\n# --------------------------------------------------------------------------- #\n# import the modbus libraries we need\n# --------------------------------------------------------------------------- #\nfrom pymodbus.version import version\nfrom pymodbus.server.asynchronous import StartTcpServer\nfrom pymodbus.device import ModbusDeviceIdentification\nfrom pymodbus.datastore import ModbusSparseDataBlock\nfrom pymodbus.datastore import ModbusSlaveContext, ModbusServerContext\n\n# from pymodbus.transaction import ModbusRtuFramer, ModbusAsciiFramer\n\n\n# --------------------------------------------------------------------------- #\n# configure the service logging\n# --------------------------------------------------------------------------- #\nlog = logging.getLogger()\nlog.setLevel(logging.DEBUG)\n\n# --------------------------------------------------------------------------- #\n# create your custom data block with callbacks\n# --------------------------------------------------------------------------- #\n\n\nclass CallbackDataBlock(ModbusSparseDataBlock):\n \"\"\"A datablock that stores the new value in memory,\n\n and passes the operation to a message queue for further processing.\n \"\"\"\n\n def __init__(self, devices, queue):\n \"\"\"Initialize.\"\"\"\n self.devices = devices\n self.queue = queue\n\n values = {k: 0 for k in devices.keys()}\n values[0xBEEF] = len(values) # the number of devices\n super().__init__(values)\n\n def setValues(self, address, value): # pylint: disable=arguments-differ\n \"\"\"Set the requested values of the datastore\n\n :param address: The starting address\n :param values: The new values to be set\n \"\"\"\n super().setValues(address, value)\n self.queue.put((self.devices.get(address, None), value))\n\n\n# --------------------------------------------------------------------------- #\n# define your callback process\n# --------------------------------------------------------------------------- #\n\n\ndef rescale_value(value):\n \"\"\"Rescale the input value from the range of 0..100 to -3200..3200.\n\n :param value: The input value to scale\n :returns: The rescaled value\n \"\"\"\n scale = 1 if value >= 50 else -1\n cur = value if value < 50 else (value - 50)\n return scale * (cur * 64)\n\n\ndef device_writer(queue):\n \"\"\"Process new messages from a queue to write to device outputs\n\n :param queue: The queue to get new messages from\n \"\"\"\n while True:\n device, value = queue.get()\n rescale_value(value[0])\n txt = f\"Write({device}) = {value}\"\n log.debug(txt)\n if not device:\n continue\n # do any logic here to update your devices\n\n\n# --------------------------------------------------------------------------- #\n# initialize your device map\n# --------------------------------------------------------------------------- #\n\n\ndef read_device_map(path):\n \"\"\"Read the device path to address mapping from file::\n\n 0x0001,/dev/device1\n 0x0002,/dev/device2\n\n :param path: The path to the input file\n :returns: The input mapping file\n \"\"\"\n devices = {}\n with open(path, \"r\") as stream: # pylint: disable=unspecified-encoding\n for line in stream:\n piece = line.strip().split(\",\")\n devices[int(piece[0], 16)] = piece[1]\n return devices\n\n\ndef run_callback_server():\n \"\"\"Run callback server.\"\"\"\n # ----------------------------------------------------------------------- #\n # initialize your data store\n # 
----------------------------------------------------------------------- #\n queue = Queue()\n devices = read_device_map(\"device-mapping\")\n block = CallbackDataBlock(devices, queue)\n store = ModbusSlaveContext(di=block, co=block, hr=block, ir=block)\n context = ModbusServerContext(slaves=store, single=True)\n\n # ----------------------------------------------------------------------- #\n # initialize the server information\n # ----------------------------------------------------------------------- #\n identity = ModbusDeviceIdentification(\n info_name={\n \"VendorName\": \"pymodbus\",\n \"ProductCode\": \"PM\",\n \"VendorUrl\": \"http://github.com/riptideio/pymodbus/\",\n \"ProductName\": \"pymodbus Server\",\n \"ModelName\": \"pymodbus Server\",\n \"MajorMinorRevision\": version.short(),\n }\n )\n\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n proc = Process(target=device_writer, args=(queue,))\n proc.start()\n StartTcpServer(context, identity=identity, address=(\"localhost\", 5020))\n\n\nif __name__ == \"__main__\":\n run_callback_server()\n", "path": "examples/common/callback_server.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# pylint: disable=missing-type-doc,missing-param-doc,differing-param-doc\n\"\"\"Pymodbus Server With Callbacks.\n\nThis is an example of adding callbacks to a running modbus server\nwhen a value is written to it. In order for this to work, it needs\na device-mapping file.\n\"\"\"\nimport logging\nfrom multiprocessing import Queue\nfrom threading import Thread\n\n# --------------------------------------------------------------------------- #\n# import the modbus libraries we need\n# --------------------------------------------------------------------------- #\nfrom pymodbus.version import version\nfrom pymodbus.server.asynchronous import StartTcpServer\nfrom pymodbus.device import ModbusDeviceIdentification\nfrom pymodbus.datastore import ModbusSparseDataBlock\nfrom pymodbus.datastore import ModbusSlaveContext, ModbusServerContext\n\n# from pymodbus.transaction import ModbusRtuFramer, ModbusAsciiFramer\n\n\n# --------------------------------------------------------------------------- #\n# configure the service logging\n# --------------------------------------------------------------------------- #\nlog = logging.getLogger()\nlog.setLevel(logging.DEBUG)\n\n# --------------------------------------------------------------------------- #\n# create your custom data block with callbacks\n# --------------------------------------------------------------------------- #\n\n\nclass CallbackDataBlock(ModbusSparseDataBlock):\n \"\"\"A datablock that stores the new value in memory,\n\n and passes the operation to a message queue for further processing.\n \"\"\"\n\n def __init__(self, devices, queue):\n \"\"\"Initialize.\"\"\"\n self.devices = devices\n self.queue = queue\n\n values = {k: 0 for k in devices.keys()}\n values[0xBEEF] = len(values) # the number of devices\n super().__init__(values)\n\n def setValues(self, address, value): # pylint: disable=arguments-differ\n \"\"\"Set the requested values of the datastore\n\n :param address: The starting address\n :param values: The new values to be set\n \"\"\"\n super().setValues(address, value)\n self.queue.put((self.devices.get(address, None), value))\n\n\n# --------------------------------------------------------------------------- #\n# define your callback process\n# 
--------------------------------------------------------------------------- #\n\n\ndef rescale_value(value):\n \"\"\"Rescale the input value from the range of 0..100 to -3200..3200.\n\n :param value: The input value to scale\n :returns: The rescaled value\n \"\"\"\n scale = 1 if value >= 50 else -1\n cur = value if value < 50 else (value - 50)\n return scale * (cur * 64)\n\n\ndef device_writer(queue):\n \"\"\"Process new messages from a queue to write to device outputs\n\n :param queue: The queue to get new messages from\n \"\"\"\n while True:\n device, value = queue.get()\n rescale_value(value[0])\n txt = f\"Write({device}) = {value}\"\n log.debug(txt)\n if not device:\n continue\n # do any logic here to update your devices\n\n\n# --------------------------------------------------------------------------- #\n# initialize your device map\n# --------------------------------------------------------------------------- #\n\n\ndef read_device_map(path):\n \"\"\"Read the device path to address mapping from file::\n\n 0x0001,/dev/device1\n 0x0002,/dev/device2\n\n :param path: The path to the input file\n :returns: The input mapping file\n \"\"\"\n devices = {}\n with open(path, \"r\") as stream: # pylint: disable=unspecified-encoding\n for line in stream:\n piece = line.strip().split(\",\")\n devices[int(piece[0], 16)] = piece[1]\n return devices\n\n\ndef run_callback_server():\n \"\"\"Run callback server.\"\"\"\n # ----------------------------------------------------------------------- #\n # initialize your data store\n # ----------------------------------------------------------------------- #\n queue = Queue()\n devices = read_device_map(\"device-mapping\")\n block = CallbackDataBlock(devices, queue)\n store = ModbusSlaveContext(di=block, co=block, hr=block, ir=block)\n context = ModbusServerContext(slaves=store, single=True)\n\n # ----------------------------------------------------------------------- #\n # initialize the server information\n # ----------------------------------------------------------------------- #\n identity = ModbusDeviceIdentification(\n info_name={\n \"VendorName\": \"pymodbus\",\n \"ProductCode\": \"PM\",\n \"VendorUrl\": \"http://github.com/riptideio/pymodbus/\",\n \"ProductName\": \"pymodbus Server\",\n \"ModelName\": \"pymodbus Server\",\n \"MajorMinorRevision\": version.short(),\n }\n )\n\n # ----------------------------------------------------------------------- #\n # run the server you want\n # ----------------------------------------------------------------------- #\n thread = Thread(target=device_writer, args=(queue,))\n thread.start()\n StartTcpServer(context, identity=identity, address=(\"localhost\", 5020))\n\n\nif __name__ == \"__main__\":\n run_callback_server()\n", "path": "examples/common/callback_server.py"}]}
| 1,972 | 172 |
gh_patches_debug_42674
|
rasdani/github-patches
|
git_diff
|
python-telegram-bot__python-telegram-bot-2517
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[FEATURE] Add a pattern for result_id of ChosenInlineResultHandler
In this way you can separate the results of your inline queries and redirect them to specific function as it happens for callback queries.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `telegram/ext/choseninlineresulthandler.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # A library that provides a Python interface to the Telegram Bot API
4 # Copyright (C) 2015-2021
5 # Leandro Toledo de Souza <[email protected]>
6 #
7 # This program is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU Lesser Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # This program is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU Lesser Public License for more details.
16 #
17 # You should have received a copy of the GNU Lesser Public License
18 # along with this program. If not, see [http://www.gnu.org/licenses/].
19 """This module contains the ChosenInlineResultHandler class."""
20
21 from typing import Optional, TypeVar, Union
22
23 from telegram import Update
24
25 from .handler import Handler
26
27 RT = TypeVar('RT')
28
29
30 class ChosenInlineResultHandler(Handler[Update]):
31 """Handler class to handle Telegram updates that contain a chosen inline result.
32
33 Note:
34 :attr:`pass_user_data` and :attr:`pass_chat_data` determine whether a ``dict`` you
35 can use to keep any data in will be sent to the :attr:`callback` function. Related to
36 either the user or the chat that the update was sent in. For each update from the same user
37 or in the same chat, it will be the same ``dict``.
38
39 Note that this is DEPRECATED, and you should use context based callbacks. See
40 https://git.io/fxJuV for more info.
41
42 Warning:
43 When setting ``run_async`` to :obj:`True`, you cannot rely on adding custom
44 attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.
45
46 Args:
47 callback (:obj:`callable`): The callback function for this handler. Will be called when
48 :attr:`check_update` has determined that an update should be processed by this handler.
49 Callback signature for context based API:
50
51 ``def callback(update: Update, context: CallbackContext)``
52
53 The return value of the callback is usually ignored except for the special case of
54 :class:`telegram.ext.ConversationHandler`.
55 pass_update_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called
56 ``update_queue`` will be passed to the callback function. It will be the ``Queue``
57 instance used by the :class:`telegram.ext.Updater` and :class:`telegram.ext.Dispatcher`
58 that contains new updates which can be used to insert updates. Default is :obj:`False`.
59 DEPRECATED: Please switch to context based callbacks.
60 pass_job_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called
61 ``job_queue`` will be passed to the callback function. It will be a
62 :class:`telegram.ext.JobQueue` instance created by the :class:`telegram.ext.Updater`
63 which can be used to schedule new jobs. Default is :obj:`False`.
64 DEPRECATED: Please switch to context based callbacks.
65 pass_user_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called
66 ``user_data`` will be passed to the callback function. Default is :obj:`False`.
67 DEPRECATED: Please switch to context based callbacks.
68 pass_chat_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called
69 ``chat_data`` will be passed to the callback function. Default is :obj:`False`.
70 DEPRECATED: Please switch to context based callbacks.
71 run_async (:obj:`bool`): Determines whether the callback will run asynchronously.
72 Defaults to :obj:`False`.
73
74 Attributes:
75 callback (:obj:`callable`): The callback function for this handler.
76 pass_update_queue (:obj:`bool`): Determines whether ``update_queue`` will be
77 passed to the callback function.
78 pass_job_queue (:obj:`bool`): Determines whether ``job_queue`` will be passed to
79 the callback function.
80 pass_user_data (:obj:`bool`): Determines whether ``user_data`` will be passed to
81 the callback function.
82 pass_chat_data (:obj:`bool`): Determines whether ``chat_data`` will be passed to
83 the callback function.
84 run_async (:obj:`bool`): Determines whether the callback will run asynchronously.
85
86 """
87
88 def check_update(self, update: object) -> Optional[Union[bool, object]]:
89 """Determines whether an update should be passed to this handlers :attr:`callback`.
90
91 Args:
92 update (:class:`telegram.Update` | :obj:`object`): Incoming update.
93
94 Returns:
95 :obj:`bool`
96
97 """
98 return isinstance(update, Update) and update.chosen_inline_result
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/telegram/ext/choseninlineresulthandler.py b/telegram/ext/choseninlineresulthandler.py
--- a/telegram/ext/choseninlineresulthandler.py
+++ b/telegram/ext/choseninlineresulthandler.py
@@ -17,15 +17,19 @@
# You should have received a copy of the GNU Lesser Public License
# along with this program. If not, see [http://www.gnu.org/licenses/].
"""This module contains the ChosenInlineResultHandler class."""
-
-from typing import Optional, TypeVar, Union
+import re
+from typing import Optional, TypeVar, Union, Callable, TYPE_CHECKING, Pattern, Match, cast
from telegram import Update
+from telegram.utils.helpers import DefaultValue, DEFAULT_FALSE
from .handler import Handler
RT = TypeVar('RT')
+if TYPE_CHECKING:
+ from telegram.ext import CallbackContext, Dispatcher
+
class ChosenInlineResultHandler(Handler[Update]):
"""Handler class to handle Telegram updates that contain a chosen inline result.
@@ -70,6 +74,12 @@
DEPRECATED: Please switch to context based callbacks.
run_async (:obj:`bool`): Determines whether the callback will run asynchronously.
Defaults to :obj:`False`.
+ pattern (:obj:`str` | `Pattern`, optional): Regex pattern. If not :obj:`None`, ``re.match``
+ is used on :attr:`telegram.ChosenInlineResult.result_id` to determine if an update
+ should be handled by this handler. This is accessible in the callback as
+ :attr:`telegram.ext.CallbackContext.matches`.
+
+ .. versionadded:: 13.6
Attributes:
callback (:obj:`callable`): The callback function for this handler.
@@ -82,9 +92,37 @@
pass_chat_data (:obj:`bool`): Determines whether ``chat_data`` will be passed to
the callback function.
run_async (:obj:`bool`): Determines whether the callback will run asynchronously.
+ pattern (`Pattern`): Optional. Regex pattern to test
+ :attr:`telegram.ChosenInlineResult.result_id` against.
+
+ .. versionadded:: 13.6
"""
+ def __init__(
+ self,
+ callback: Callable[[Update, 'CallbackContext'], RT],
+ pass_update_queue: bool = False,
+ pass_job_queue: bool = False,
+ pass_user_data: bool = False,
+ pass_chat_data: bool = False,
+ run_async: Union[bool, DefaultValue] = DEFAULT_FALSE,
+ pattern: Union[str, Pattern] = None,
+ ):
+ super().__init__(
+ callback,
+ pass_update_queue=pass_update_queue,
+ pass_job_queue=pass_job_queue,
+ pass_user_data=pass_user_data,
+ pass_chat_data=pass_chat_data,
+ run_async=run_async,
+ )
+
+ if isinstance(pattern, str):
+ pattern = re.compile(pattern)
+
+ self.pattern = pattern
+
def check_update(self, update: object) -> Optional[Union[bool, object]]:
"""Determines whether an update should be passed to this handlers :attr:`callback`.
@@ -95,4 +133,24 @@
:obj:`bool`
"""
- return isinstance(update, Update) and update.chosen_inline_result
+ if isinstance(update, Update) and update.chosen_inline_result:
+ if self.pattern:
+ match = re.match(self.pattern, update.chosen_inline_result.result_id)
+ if match:
+ return match
+ else:
+ return True
+ return None
+
+ def collect_additional_context(
+ self,
+ context: 'CallbackContext',
+ update: Update,
+ dispatcher: 'Dispatcher',
+ check_result: Union[bool, Match],
+ ) -> None:
+ """This function adds the matched regex pattern result to
+ :attr:`telegram.ext.CallbackContext.matches`."""
+ if self.pattern:
+ check_result = cast(Match, check_result)
+ context.matches = [check_result]
|
{"golden_diff": "diff --git a/telegram/ext/choseninlineresulthandler.py b/telegram/ext/choseninlineresulthandler.py\n--- a/telegram/ext/choseninlineresulthandler.py\n+++ b/telegram/ext/choseninlineresulthandler.py\n@@ -17,15 +17,19 @@\n # You should have received a copy of the GNU Lesser Public License\n # along with this program. If not, see [http://www.gnu.org/licenses/].\n \"\"\"This module contains the ChosenInlineResultHandler class.\"\"\"\n-\n-from typing import Optional, TypeVar, Union\n+import re\n+from typing import Optional, TypeVar, Union, Callable, TYPE_CHECKING, Pattern, Match, cast\n \n from telegram import Update\n \n+from telegram.utils.helpers import DefaultValue, DEFAULT_FALSE\n from .handler import Handler\n \n RT = TypeVar('RT')\n \n+if TYPE_CHECKING:\n+ from telegram.ext import CallbackContext, Dispatcher\n+\n \n class ChosenInlineResultHandler(Handler[Update]):\n \"\"\"Handler class to handle Telegram updates that contain a chosen inline result.\n@@ -70,6 +74,12 @@\n DEPRECATED: Please switch to context based callbacks.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n Defaults to :obj:`False`.\n+ pattern (:obj:`str` | `Pattern`, optional): Regex pattern. If not :obj:`None`, ``re.match``\n+ is used on :attr:`telegram.ChosenInlineResult.result_id` to determine if an update\n+ should be handled by this handler. This is accessible in the callback as\n+ :attr:`telegram.ext.CallbackContext.matches`.\n+\n+ .. versionadded:: 13.6\n \n Attributes:\n callback (:obj:`callable`): The callback function for this handler.\n@@ -82,9 +92,37 @@\n pass_chat_data (:obj:`bool`): Determines whether ``chat_data`` will be passed to\n the callback function.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n+ pattern (`Pattern`): Optional. Regex pattern to test\n+ :attr:`telegram.ChosenInlineResult.result_id` against.\n+\n+ .. 
versionadded:: 13.6\n \n \"\"\"\n \n+ def __init__(\n+ self,\n+ callback: Callable[[Update, 'CallbackContext'], RT],\n+ pass_update_queue: bool = False,\n+ pass_job_queue: bool = False,\n+ pass_user_data: bool = False,\n+ pass_chat_data: bool = False,\n+ run_async: Union[bool, DefaultValue] = DEFAULT_FALSE,\n+ pattern: Union[str, Pattern] = None,\n+ ):\n+ super().__init__(\n+ callback,\n+ pass_update_queue=pass_update_queue,\n+ pass_job_queue=pass_job_queue,\n+ pass_user_data=pass_user_data,\n+ pass_chat_data=pass_chat_data,\n+ run_async=run_async,\n+ )\n+\n+ if isinstance(pattern, str):\n+ pattern = re.compile(pattern)\n+\n+ self.pattern = pattern\n+\n def check_update(self, update: object) -> Optional[Union[bool, object]]:\n \"\"\"Determines whether an update should be passed to this handlers :attr:`callback`.\n \n@@ -95,4 +133,24 @@\n :obj:`bool`\n \n \"\"\"\n- return isinstance(update, Update) and update.chosen_inline_result\n+ if isinstance(update, Update) and update.chosen_inline_result:\n+ if self.pattern:\n+ match = re.match(self.pattern, update.chosen_inline_result.result_id)\n+ if match:\n+ return match\n+ else:\n+ return True\n+ return None\n+\n+ def collect_additional_context(\n+ self,\n+ context: 'CallbackContext',\n+ update: Update,\n+ dispatcher: 'Dispatcher',\n+ check_result: Union[bool, Match],\n+ ) -> None:\n+ \"\"\"This function adds the matched regex pattern result to\n+ :attr:`telegram.ext.CallbackContext.matches`.\"\"\"\n+ if self.pattern:\n+ check_result = cast(Match, check_result)\n+ context.matches = [check_result]\n", "issue": "[FEATURE] Add a pattern for result_id of ChosenInlineResultHandler\nIn this way you can separate the results of your inline queries and redirect them to specific function as it happens for callback queries.\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2021\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains the ChosenInlineResultHandler class.\"\"\"\n\nfrom typing import Optional, TypeVar, Union\n\nfrom telegram import Update\n\nfrom .handler import Handler\n\nRT = TypeVar('RT')\n\n\nclass ChosenInlineResultHandler(Handler[Update]):\n \"\"\"Handler class to handle Telegram updates that contain a chosen inline result.\n\n Note:\n :attr:`pass_user_data` and :attr:`pass_chat_data` determine whether a ``dict`` you\n can use to keep any data in will be sent to the :attr:`callback` function. Related to\n either the user or the chat that the update was sent in. For each update from the same user\n or in the same chat, it will be the same ``dict``.\n\n Note that this is DEPRECATED, and you should use context based callbacks. 
See\n https://git.io/fxJuV for more info.\n\n Warning:\n When setting ``run_async`` to :obj:`True`, you cannot rely on adding custom\n attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.\n\n Args:\n callback (:obj:`callable`): The callback function for this handler. Will be called when\n :attr:`check_update` has determined that an update should be processed by this handler.\n Callback signature for context based API:\n\n ``def callback(update: Update, context: CallbackContext)``\n\n The return value of the callback is usually ignored except for the special case of\n :class:`telegram.ext.ConversationHandler`.\n pass_update_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``update_queue`` will be passed to the callback function. It will be the ``Queue``\n instance used by the :class:`telegram.ext.Updater` and :class:`telegram.ext.Dispatcher`\n that contains new updates which can be used to insert updates. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_job_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``job_queue`` will be passed to the callback function. It will be a\n :class:`telegram.ext.JobQueue` instance created by the :class:`telegram.ext.Updater`\n which can be used to schedule new jobs. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_user_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``user_data`` will be passed to the callback function. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_chat_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``chat_data`` will be passed to the callback function. 
Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n Defaults to :obj:`False`.\n\n Attributes:\n callback (:obj:`callable`): The callback function for this handler.\n pass_update_queue (:obj:`bool`): Determines whether ``update_queue`` will be\n passed to the callback function.\n pass_job_queue (:obj:`bool`): Determines whether ``job_queue`` will be passed to\n the callback function.\n pass_user_data (:obj:`bool`): Determines whether ``user_data`` will be passed to\n the callback function.\n pass_chat_data (:obj:`bool`): Determines whether ``chat_data`` will be passed to\n the callback function.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n\n \"\"\"\n\n def check_update(self, update: object) -> Optional[Union[bool, object]]:\n \"\"\"Determines whether an update should be passed to this handlers :attr:`callback`.\n\n Args:\n update (:class:`telegram.Update` | :obj:`object`): Incoming update.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n return isinstance(update, Update) and update.chosen_inline_result\n", "path": "telegram/ext/choseninlineresulthandler.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2021\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains the ChosenInlineResultHandler class.\"\"\"\nimport re\nfrom typing import Optional, TypeVar, Union, Callable, TYPE_CHECKING, Pattern, Match, cast\n\nfrom telegram import Update\n\nfrom telegram.utils.helpers import DefaultValue, DEFAULT_FALSE\nfrom .handler import Handler\n\nRT = TypeVar('RT')\n\nif TYPE_CHECKING:\n from telegram.ext import CallbackContext, Dispatcher\n\n\nclass ChosenInlineResultHandler(Handler[Update]):\n \"\"\"Handler class to handle Telegram updates that contain a chosen inline result.\n\n Note:\n :attr:`pass_user_data` and :attr:`pass_chat_data` determine whether a ``dict`` you\n can use to keep any data in will be sent to the :attr:`callback` function. Related to\n either the user or the chat that the update was sent in. For each update from the same user\n or in the same chat, it will be the same ``dict``.\n\n Note that this is DEPRECATED, and you should use context based callbacks. See\n https://git.io/fxJuV for more info.\n\n Warning:\n When setting ``run_async`` to :obj:`True`, you cannot rely on adding custom\n attributes to :class:`telegram.ext.CallbackContext`. See its docs for more info.\n\n Args:\n callback (:obj:`callable`): The callback function for this handler. 
Will be called when\n :attr:`check_update` has determined that an update should be processed by this handler.\n Callback signature for context based API:\n\n ``def callback(update: Update, context: CallbackContext)``\n\n The return value of the callback is usually ignored except for the special case of\n :class:`telegram.ext.ConversationHandler`.\n pass_update_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``update_queue`` will be passed to the callback function. It will be the ``Queue``\n instance used by the :class:`telegram.ext.Updater` and :class:`telegram.ext.Dispatcher`\n that contains new updates which can be used to insert updates. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_job_queue (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``job_queue`` will be passed to the callback function. It will be a\n :class:`telegram.ext.JobQueue` instance created by the :class:`telegram.ext.Updater`\n which can be used to schedule new jobs. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_user_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``user_data`` will be passed to the callback function. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n pass_chat_data (:obj:`bool`, optional): If set to :obj:`True`, a keyword argument called\n ``chat_data`` will be passed to the callback function. Default is :obj:`False`.\n DEPRECATED: Please switch to context based callbacks.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n Defaults to :obj:`False`.\n pattern (:obj:`str` | `Pattern`, optional): Regex pattern. If not :obj:`None`, ``re.match``\n is used on :attr:`telegram.ChosenInlineResult.result_id` to determine if an update\n should be handled by this handler. This is accessible in the callback as\n :attr:`telegram.ext.CallbackContext.matches`.\n\n .. versionadded:: 13.6\n\n Attributes:\n callback (:obj:`callable`): The callback function for this handler.\n pass_update_queue (:obj:`bool`): Determines whether ``update_queue`` will be\n passed to the callback function.\n pass_job_queue (:obj:`bool`): Determines whether ``job_queue`` will be passed to\n the callback function.\n pass_user_data (:obj:`bool`): Determines whether ``user_data`` will be passed to\n the callback function.\n pass_chat_data (:obj:`bool`): Determines whether ``chat_data`` will be passed to\n the callback function.\n run_async (:obj:`bool`): Determines whether the callback will run asynchronously.\n pattern (`Pattern`): Optional. Regex pattern to test\n :attr:`telegram.ChosenInlineResult.result_id` against.\n\n .. 
versionadded:: 13.6\n\n \"\"\"\n\n def __init__(\n self,\n callback: Callable[[Update, 'CallbackContext'], RT],\n pass_update_queue: bool = False,\n pass_job_queue: bool = False,\n pass_user_data: bool = False,\n pass_chat_data: bool = False,\n run_async: Union[bool, DefaultValue] = DEFAULT_FALSE,\n pattern: Union[str, Pattern] = None,\n ):\n super().__init__(\n callback,\n pass_update_queue=pass_update_queue,\n pass_job_queue=pass_job_queue,\n pass_user_data=pass_user_data,\n pass_chat_data=pass_chat_data,\n run_async=run_async,\n )\n\n if isinstance(pattern, str):\n pattern = re.compile(pattern)\n\n self.pattern = pattern\n\n def check_update(self, update: object) -> Optional[Union[bool, object]]:\n \"\"\"Determines whether an update should be passed to this handlers :attr:`callback`.\n\n Args:\n update (:class:`telegram.Update` | :obj:`object`): Incoming update.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n if isinstance(update, Update) and update.chosen_inline_result:\n if self.pattern:\n match = re.match(self.pattern, update.chosen_inline_result.result_id)\n if match:\n return match\n else:\n return True\n return None\n\n def collect_additional_context(\n self,\n context: 'CallbackContext',\n update: Update,\n dispatcher: 'Dispatcher',\n check_result: Union[bool, Match],\n ) -> None:\n \"\"\"This function adds the matched regex pattern result to\n :attr:`telegram.ext.CallbackContext.matches`.\"\"\"\n if self.pattern:\n check_result = cast(Match, check_result)\n context.matches = [check_result]\n", "path": "telegram/ext/choseninlineresulthandler.py"}]}
| 1,573 | 913 |
gh_patches_debug_6980
|
rasdani/github-patches
|
git_diff
|
sopel-irc__sopel-2174
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
UTF-8 Check fails on windows in Powershell
### Description
The current version of sopel in pip (py3.9.6) and from git master both prompt this message upon running sopel even with a utf8 configuration:
> WARNING!!! You are running with a non-UTF8 locale environment variable (e.g. LC_ALL is set to "C"), which makes Python 3 do stupid things. If you get strange errors, please set it to something like "en_US.UTF-8".
Despite having a "English_United States", "utf8" locale from python.
### Reproduction steps
Pull into a fresh python dev environment on windows 10.
Install sopel via pip or from source.
Run sopel.
### Expected behavior
To not get a warning about UTF-8 since it's configured.
### Logs
```
(.venv) PS C:\Users\Michael\Documents\Visual Studio Projects\Python\sopel> $PSVersionTable
Name Value
---- -----
PSVersion 7.1.4
PSEdition Core
GitCommitId 7.1.4
OS Microsoft Windows 10.0.19041
Platform Win32NT
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0…}
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
WSManStackVersion 3.0
(.venv) PS C:\Users\Michael\Documents\Visual Studio Projects\Python\sopel> Get-WinSystemLocale
LCID Name DisplayName
---- ---- -----------
1033 en-US English (United States)
(.venv) PS C:\Users\Michael\Documents\Visual Studio Projects\Python\sopel> echo $OutputEncoding
Preamble :
BodyName : utf-8
EncodingName : Unicode (UTF-8)
HeaderName : utf-8
WebName : utf-8
WindowsCodePage : 1200
IsBrowserDisplay : True
IsBrowserSave : True
IsMailNewsDisplay : True
IsMailNewsSave : True
IsSingleByte : False
EncoderFallback : System.Text.EncoderReplacementFallback
DecoderFallback : System.Text.DecoderReplacementFallback
IsReadOnly : True
CodePage : 65001
(.venv) PS C:\Users\Michael\Documents\Visual Studio Projects\Python\sopel> py
Python 3.9.6 (tags/v3.9.6:db3ff76, Jun 28 2021, 15:26:21) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import locale
>>> locale.getlocale()
('English_United States', 'utf8')
>>> exit()
```
### Environment
- Sopel `.version`: 8.0.0
- Sopel installed via: pip && source
- Python version: 3.9.6
- Operating system: Windows 10
- IRCd `/version`: N/A
- Relevant plugins: N/A
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/__init__.py`
Content:
```
1 # ASCII ONLY IN THIS FILE THOUGH!!!!!!!
2 # Python does some stupid bullshit of respecting LC_ALL over the encoding on the
3 # file, so in order to undo Python's ridiculous fucking idiocy, we have to have
4 # our own check.
5
6 # Copyright 2008, Sean B. Palmer, inamidst.com
7 # Copyright 2012, Elsie Powell, http://embolalia.com
8 # Copyright 2012, Elad Alfassa <[email protected]>
9 #
10 # Licensed under the Eiffel Forum License 2.
11
12 from __future__ import generator_stop
13
14 from collections import namedtuple
15 import locale
16 import re
17 import sys
18
19 import pkg_resources
20
21 __all__ = [
22 'bot',
23 'config',
24 'db',
25 'formatting',
26 'irc',
27 'loader',
28 'logger',
29 'module', # deprecated in 7.1, removed in 9.0
30 'plugin',
31 'tools',
32 'trigger',
33 'version_info',
34 ]
35
36 loc = locale.getlocale()
37 if not loc[1] or 'UTF-8' not in loc[1]:
38 print('WARNING!!! You are running with a non-UTF8 locale environment '
39 'variable (e.g. LC_ALL is set to "C"), which makes Python 3 do '
40 'stupid things. If you get strange errors, please set it to '
41 'something like "en_US.UTF-8".', file=sys.stderr)
42
43
44 __version__ = pkg_resources.get_distribution('sopel').version
45
46
47 def _version_info(version=__version__):
48 regex = re.compile(r'(\d+)\.(\d+)\.(\d+)(?:[\-\.]?(a|b|rc)(\d+))?.*')
49 version_groups = regex.match(version).groups()
50 major, minor, micro = (int(piece) for piece in version_groups[0:3])
51 level = version_groups[3]
52 serial = int(version_groups[4] or 0)
53 if level == 'a':
54 level = 'alpha'
55 elif level == 'b':
56 level = 'beta'
57 elif level == 'rc':
58 level = 'candidate'
59 elif not level and version_groups[4] is None:
60 level = 'final'
61 else:
62 level = 'alpha'
63 version_type = namedtuple('version_info',
64 'major, minor, micro, releaselevel, serial')
65 return version_type(major, minor, micro, level, serial)
66
67
68 version_info = _version_info()
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sopel/__init__.py b/sopel/__init__.py
--- a/sopel/__init__.py
+++ b/sopel/__init__.py
@@ -34,7 +34,7 @@
]
loc = locale.getlocale()
-if not loc[1] or 'UTF-8' not in loc[1]:
+if not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):
print('WARNING!!! You are running with a non-UTF8 locale environment '
'variable (e.g. LC_ALL is set to "C"), which makes Python 3 do '
'stupid things. If you get strange errors, please set it to '
|
{"golden_diff": "diff --git a/sopel/__init__.py b/sopel/__init__.py\n--- a/sopel/__init__.py\n+++ b/sopel/__init__.py\n@@ -34,7 +34,7 @@\n ]\n \n loc = locale.getlocale()\n-if not loc[1] or 'UTF-8' not in loc[1]:\n+if not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):\n print('WARNING!!! You are running with a non-UTF8 locale environment '\n 'variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do '\n 'stupid things. If you get strange errors, please set it to '\n", "issue": "UTF-8 Check fails on windows in Powershell\n### Description\r\nThe current version of sopel in pip (py3.9.6) and from git master both prompt this message upon running sopel even with a utf8 configuration:\r\n\r\n> WARNING!!! You are running with a non-UTF8 locale environment variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do stupid things. If you get strange errors, please set it to something like \"en_US.UTF-8\".\r\n\r\nDespite having a \"English_United States\", \"utf8\" locale from python.\r\n\r\n### Reproduction steps\r\nPull into a fresh python dev environment on windows 10.\r\nInstall sopel via pip or from source.\r\nRun sopel.\r\n\r\n### Expected behavior\r\nTo not get a warning about UTF-8 since it's configured.\r\n\r\n### Logs\r\n```\r\n(.venv) PS C:\\Users\\Michael\\Documents\\Visual Studio Projects\\Python\\sopel> $PSVersionTable \r\n\r\nName Value\r\n---- -----\r\nPSVersion 7.1.4\r\nPSEdition Core\r\nGitCommitId 7.1.4\r\nOS Microsoft Windows 10.0.19041\r\nPlatform Win32NT\r\nPSCompatibleVersions {1.0, 2.0, 3.0, 4.0\u2026}\r\nPSRemotingProtocolVersion 2.3\r\nSerializationVersion 1.1.0.1\r\nWSManStackVersion 3.0\r\n\r\n(.venv) PS C:\\Users\\Michael\\Documents\\Visual Studio Projects\\Python\\sopel> Get-WinSystemLocale \r\n\r\nLCID Name DisplayName\r\n---- ---- -----------\r\n1033 en-US English (United States)\r\n\r\n(.venv) PS C:\\Users\\Michael\\Documents\\Visual Studio Projects\\Python\\sopel> echo $OutputEncoding\r\n\r\nPreamble : \r\nBodyName : utf-8\r\nEncodingName : Unicode (UTF-8)\r\nHeaderName : utf-8\r\nWebName : utf-8\r\nWindowsCodePage : 1200\r\nIsBrowserDisplay : True\r\nIsBrowserSave : True\r\nIsMailNewsDisplay : True\r\nIsMailNewsSave : True\r\nIsSingleByte : False\r\nEncoderFallback : System.Text.EncoderReplacementFallback\r\nDecoderFallback : System.Text.DecoderReplacementFallback\r\nIsReadOnly : True\r\nCodePage : 65001\r\n\r\n\r\n(.venv) PS C:\\Users\\Michael\\Documents\\Visual Studio Projects\\Python\\sopel> py\r\nPython 3.9.6 (tags/v3.9.6:db3ff76, Jun 28 2021, 15:26:21) [MSC v.1929 64 bit (AMD64)] on win32\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import locale\r\n>>> locale.getlocale()\r\n('English_United States', 'utf8')\r\n>>> exit()\r\n```\r\n\r\n### Environment\r\n- Sopel `.version`: 8.0.0\r\n- Sopel installed via: pip && source\r\n- Python version: 3.9.6\r\n- Operating system: Windows 10\r\n- IRCd `/version`: N/A\r\n- Relevant plugins: N/A\r\n\n", "before_files": [{"content": "# ASCII ONLY IN THIS FILE THOUGH!!!!!!!\n# Python does some stupid bullshit of respecting LC_ALL over the encoding on the\n# file, so in order to undo Python's ridiculous fucking idiocy, we have to have\n# our own check.\n\n# Copyright 2008, Sean B. 
Palmer, inamidst.com\n# Copyright 2012, Elsie Powell, http://embolalia.com\n# Copyright 2012, Elad Alfassa <[email protected]>\n#\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import generator_stop\n\nfrom collections import namedtuple\nimport locale\nimport re\nimport sys\n\nimport pkg_resources\n\n__all__ = [\n 'bot',\n 'config',\n 'db',\n 'formatting',\n 'irc',\n 'loader',\n 'logger',\n 'module', # deprecated in 7.1, removed in 9.0\n 'plugin',\n 'tools',\n 'trigger',\n 'version_info',\n]\n\nloc = locale.getlocale()\nif not loc[1] or 'UTF-8' not in loc[1]:\n print('WARNING!!! You are running with a non-UTF8 locale environment '\n 'variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do '\n 'stupid things. If you get strange errors, please set it to '\n 'something like \"en_US.UTF-8\".', file=sys.stderr)\n\n\n__version__ = pkg_resources.get_distribution('sopel').version\n\n\ndef _version_info(version=__version__):\n regex = re.compile(r'(\\d+)\\.(\\d+)\\.(\\d+)(?:[\\-\\.]?(a|b|rc)(\\d+))?.*')\n version_groups = regex.match(version).groups()\n major, minor, micro = (int(piece) for piece in version_groups[0:3])\n level = version_groups[3]\n serial = int(version_groups[4] or 0)\n if level == 'a':\n level = 'alpha'\n elif level == 'b':\n level = 'beta'\n elif level == 'rc':\n level = 'candidate'\n elif not level and version_groups[4] is None:\n level = 'final'\n else:\n level = 'alpha'\n version_type = namedtuple('version_info',\n 'major, minor, micro, releaselevel, serial')\n return version_type(major, minor, micro, level, serial)\n\n\nversion_info = _version_info()\n", "path": "sopel/__init__.py"}], "after_files": [{"content": "# ASCII ONLY IN THIS FILE THOUGH!!!!!!!\n# Python does some stupid bullshit of respecting LC_ALL over the encoding on the\n# file, so in order to undo Python's ridiculous fucking idiocy, we have to have\n# our own check.\n\n# Copyright 2008, Sean B. Palmer, inamidst.com\n# Copyright 2012, Elsie Powell, http://embolalia.com\n# Copyright 2012, Elad Alfassa <[email protected]>\n#\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import generator_stop\n\nfrom collections import namedtuple\nimport locale\nimport re\nimport sys\n\nimport pkg_resources\n\n__all__ = [\n 'bot',\n 'config',\n 'db',\n 'formatting',\n 'irc',\n 'loader',\n 'logger',\n 'module', # deprecated in 7.1, removed in 9.0\n 'plugin',\n 'tools',\n 'trigger',\n 'version_info',\n]\n\nloc = locale.getlocale()\nif not loc[1] or ('UTF-8' not in loc[1] and 'utf8' not in loc[1]):\n print('WARNING!!! You are running with a non-UTF8 locale environment '\n 'variable (e.g. LC_ALL is set to \"C\"), which makes Python 3 do '\n 'stupid things. 
If you get strange errors, please set it to '\n 'something like \"en_US.UTF-8\".', file=sys.stderr)\n\n\n__version__ = pkg_resources.get_distribution('sopel').version\n\n\ndef _version_info(version=__version__):\n regex = re.compile(r'(\\d+)\\.(\\d+)\\.(\\d+)(?:[\\-\\.]?(a|b|rc)(\\d+))?.*')\n version_groups = regex.match(version).groups()\n major, minor, micro = (int(piece) for piece in version_groups[0:3])\n level = version_groups[3]\n serial = int(version_groups[4] or 0)\n if level == 'a':\n level = 'alpha'\n elif level == 'b':\n level = 'beta'\n elif level == 'rc':\n level = 'candidate'\n elif not level and version_groups[4] is None:\n level = 'final'\n else:\n level = 'alpha'\n version_type = namedtuple('version_info',\n 'major, minor, micro, releaselevel, serial')\n return version_type(major, minor, micro, level, serial)\n\n\nversion_info = _version_info()\n", "path": "sopel/__init__.py"}]}
| 1,666 | 164 |