Column schema and value statistics:

| column | type | values |
|---|---|---|
| problem_id | string | lengths 18 to 22 |
| source | string | 1 distinct value (`rasdani/github-patches`) |
| task_type | string | 1 distinct value (`git_diff`) |
| in_source_id | string | lengths 13 to 58 |
| prompt | string | lengths 1.1k to 10.2k |
| golden_diff | string | lengths 151 to 4.94k |
| verification_info | string | lengths 582 to 21k |
| num_tokens | int64 | 271 to 2.05k |
| num_tokens_diff | int64 | 47 to 1.02k |
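
For orientation, here is a minimal sketch of loading and inspecting rows with the Hugging Face `datasets` library. The hub path is an assumption inferred from the `source` column, not something this preview confirms:

```python
from datasets import load_dataset

# Hypothetical hub path, inferred from the `source` column above.
ds = load_dataset("rasdani/github-patches", split="train")

row = ds[0]
print(row["problem_id"])         # e.g. "gh_patches_debug_28357"
print(row["in_source_id"])       # e.g. "pyca__cryptography-4200"
print(row["prompt"][:200])       # issue text plus relevant file contents
print(row["golden_diff"][:200])  # reference patch in git-diff format
```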

---

**problem_id:** gh_patches_debug_28357 | **source:** rasdani/github-patches | **task_type:** git_diff | **in_source_id:** pyca__cryptography-4200

**prompt:**

We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Some OpenSSL bignum arithmetic operations are not in constant time
### What's wrong:
1. Some arithmetic operations exposed in the [OpenSSL bignum binding](https://github.com/pyca/cryptography/blob/master/src/_cffi_src/openssl/bignum.py) are not in constant time.
2. These functions are specifically:
1. `BN_div`
2. `BN_mod_inverse`
3. `BN_mod_exp`
### How to resolve:
1. The OpenSSL solution to this is to call `BN_set_flags` on the secret BIGNUM and set the `BN_FLG_CONSTTIME` flag. [The OpenSSL docs reference this here.](https://github.com/openssl/openssl/blob/master/include/openssl/bn.h#L61)
2. Expose the needed operations:
1. `BN_MONT_CTX` operations.
2. `BN_mod_exp_mont`, and `BN_mod_exp_mont_consttime`
3. `BN_set_flags`, `BN_get_flags`, and the `BN_FLG_CONSTTIME` flag.
### Notes:
1. Cryptography.io isn't affected by these timing issues. I have clarified with @alex and @reaperhulk.
2. If you are a downstream user utilizing these operations on private/secret values, then _you are affected_.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/_cffi_src/openssl/bignum.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 from __future__ import absolute_import, division, print_function
6
7 INCLUDES = """
8 #include <openssl/bn.h>
9 """
10
11 TYPES = """
12 typedef ... BN_CTX;
13 typedef ... BIGNUM;
14 typedef int... BN_ULONG;
15 """
16
17 FUNCTIONS = """
18 BIGNUM *BN_new(void);
19 void BN_free(BIGNUM *);
20 void BN_clear_free(BIGNUM *);
21
22 int BN_rand(BIGNUM *, int, int, int);
23 int BN_rand_range(BIGNUM *, BIGNUM *);
24
25 BN_CTX *BN_CTX_new(void);
26 void BN_CTX_free(BN_CTX *);
27
28 void BN_CTX_start(BN_CTX *);
29 BIGNUM *BN_CTX_get(BN_CTX *);
30 void BN_CTX_end(BN_CTX *);
31
32 BIGNUM *BN_copy(BIGNUM *, const BIGNUM *);
33 BIGNUM *BN_dup(const BIGNUM *);
34
35 int BN_set_word(BIGNUM *, BN_ULONG);
36 BN_ULONG BN_get_word(const BIGNUM *);
37
38 const BIGNUM *BN_value_one(void);
39
40 char *BN_bn2hex(const BIGNUM *);
41 int BN_hex2bn(BIGNUM **, const char *);
42 int BN_dec2bn(BIGNUM **, const char *);
43
44 int BN_bn2bin(const BIGNUM *, unsigned char *);
45 BIGNUM *BN_bin2bn(const unsigned char *, int, BIGNUM *);
46
47 int BN_num_bits(const BIGNUM *);
48
49 int BN_cmp(const BIGNUM *, const BIGNUM *);
50 int BN_add(BIGNUM *, const BIGNUM *, const BIGNUM *);
51 int BN_sub(BIGNUM *, const BIGNUM *, const BIGNUM *);
52 int BN_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
53 int BN_sqr(BIGNUM *, const BIGNUM *, BN_CTX *);
54 int BN_div(BIGNUM *, BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
55 int BN_nnmod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
56 int BN_mod_add(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
57 BN_CTX *);
58 int BN_mod_sub(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
59 BN_CTX *);
60 int BN_mod_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
61 BN_CTX *);
62 int BN_mod_sqr(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
63 int BN_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
64 int BN_mod_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
65 BN_CTX *);
66 int BN_gcd(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
67 BIGNUM *BN_mod_inverse(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
68
69 int BN_set_bit(BIGNUM *, int);
70 int BN_clear_bit(BIGNUM *, int);
71
72 int BN_is_bit_set(const BIGNUM *, int);
73
74 int BN_mask_bits(BIGNUM *, int);
75
76 int BN_num_bytes(const BIGNUM *);
77
78 int BN_zero(BIGNUM *);
79 int BN_one(BIGNUM *);
80 int BN_mod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
81
82 int BN_lshift(BIGNUM *, const BIGNUM *, int);
83 int BN_lshift1(BIGNUM *, BIGNUM *);
84
85 int BN_rshift(BIGNUM *, BIGNUM *, int);
86 int BN_rshift1(BIGNUM *, BIGNUM *);
87 """
88
89 CUSTOMIZATIONS = """
90 """
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```

**golden_diff:**

```diff
diff --git a/src/_cffi_src/openssl/bignum.py b/src/_cffi_src/openssl/bignum.py
--- a/src/_cffi_src/openssl/bignum.py
+++ b/src/_cffi_src/openssl/bignum.py
@@ -10,11 +10,17 @@
TYPES = """
typedef ... BN_CTX;
+typedef ... BN_MONT_CTX;
typedef ... BIGNUM;
typedef int... BN_ULONG;
"""
FUNCTIONS = """
+#define BN_FLG_CONSTTIME ...
+
+void BN_set_flags(BIGNUM *, int);
+int BN_get_flags(const BIGNUM *, int);
+
BIGNUM *BN_new(void);
void BN_free(BIGNUM *);
void BN_clear_free(BIGNUM *);
@@ -29,6 +35,10 @@
BIGNUM *BN_CTX_get(BN_CTX *);
void BN_CTX_end(BN_CTX *);
+BN_MONT_CTX *BN_MONT_CTX_new(void);
+int BN_MONT_CTX_set(BN_MONT_CTX *, BIGNUM *, BN_CTX *);
+void BN_MONT_CTX_free(BN_MONT_CTX *);
+
BIGNUM *BN_copy(BIGNUM *, const BIGNUM *);
BIGNUM *BN_dup(const BIGNUM *);
@@ -63,6 +73,10 @@
int BN_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
int BN_mod_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
BN_CTX *);
+int BN_mod_exp_mont(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,
+ BN_CTX *, BN_MONT_CTX *);
+int BN_mod_exp_mont_consttime(BIGNUM *, const BIGNUM *, const BIGNUM *,
+ const BIGNUM *, BN_CTX *, BN_MONT_CTX *);
int BN_gcd(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
BIGNUM *BN_mod_inverse(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);
```

**verification_info:**

```json
{"golden_diff": "diff --git a/src/_cffi_src/openssl/bignum.py b/src/_cffi_src/openssl/bignum.py\n--- a/src/_cffi_src/openssl/bignum.py\n+++ b/src/_cffi_src/openssl/bignum.py\n@@ -10,11 +10,17 @@\n \n TYPES = \"\"\"\n typedef ... BN_CTX;\n+typedef ... BN_MONT_CTX;\n typedef ... BIGNUM;\n typedef int... BN_ULONG;\n \"\"\"\n \n FUNCTIONS = \"\"\"\n+#define BN_FLG_CONSTTIME ...\n+\n+void BN_set_flags(BIGNUM *, int);\n+int BN_get_flags(const BIGNUM *, int);\n+\n BIGNUM *BN_new(void);\n void BN_free(BIGNUM *);\n void BN_clear_free(BIGNUM *);\n@@ -29,6 +35,10 @@\n BIGNUM *BN_CTX_get(BN_CTX *);\n void BN_CTX_end(BN_CTX *);\n \n+BN_MONT_CTX *BN_MONT_CTX_new(void);\n+int BN_MONT_CTX_set(BN_MONT_CTX *, BIGNUM *, BN_CTX *);\n+void BN_MONT_CTX_free(BN_MONT_CTX *);\n+\n BIGNUM *BN_copy(BIGNUM *, const BIGNUM *);\n BIGNUM *BN_dup(const BIGNUM *);\n \n@@ -63,6 +73,10 @@\n int BN_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n int BN_mod_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\n+int BN_mod_exp_mont(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n+ BN_CTX *, BN_MONT_CTX *);\n+int BN_mod_exp_mont_consttime(BIGNUM *, const BIGNUM *, const BIGNUM *,\n+ const BIGNUM *, BN_CTX *, BN_MONT_CTX *);\n int BN_gcd(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n BIGNUM *BN_mod_inverse(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n", "issue": "Some OpenSSL bignum arithmetic operations are not in constant time\n### What's wrong:\r\n1. Some arithmetic operations exposed in the [OpenSSL bignum binding](https://github.com/pyca/cryptography/blob/master/src/_cffi_src/openssl/bignum.py) are not in constant time.\r\n2. These functions are specifically:\r\n 1. `BN_div`\r\n 2. `BN_mod_inverse`\r\n 3. `BN_mod_exp`\r\n\r\n### How to resolve:\r\n1. The OpenSSL solution to this is to call `BN_set_flags` on the secret BIGNUM and set the `BN_FLG_CONSTTIME` flag. [The OpenSSL docs reference this here.](https://github.com/openssl/openssl/blob/master/include/openssl/bn.h#L61)\r\n2. Expose the needed operations:\r\n 1. `BN_MONT_CTX` operations.\r\n 2. `BN_mod_exp_mont`, and `BN_mod_exp_mont_consttime` \r\n 3. `BN_set_flags`, `BN_get_flags`, and the `BN_FLG_CONSTTIME` flag.\r\n\r\n### Notes:\r\n1. Cryptography.io isn't affected by these timing issues. I have clarified with @alex and @reaperhulk.\r\n2. If you are a downstream user utilizing these operations on private/secret values, then _you are affected_.\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nINCLUDES = \"\"\"\n#include <openssl/bn.h>\n\"\"\"\n\nTYPES = \"\"\"\ntypedef ... BN_CTX;\ntypedef ... BIGNUM;\ntypedef int... 
BN_ULONG;\n\"\"\"\n\nFUNCTIONS = \"\"\"\nBIGNUM *BN_new(void);\nvoid BN_free(BIGNUM *);\nvoid BN_clear_free(BIGNUM *);\n\nint BN_rand(BIGNUM *, int, int, int);\nint BN_rand_range(BIGNUM *, BIGNUM *);\n\nBN_CTX *BN_CTX_new(void);\nvoid BN_CTX_free(BN_CTX *);\n\nvoid BN_CTX_start(BN_CTX *);\nBIGNUM *BN_CTX_get(BN_CTX *);\nvoid BN_CTX_end(BN_CTX *);\n\nBIGNUM *BN_copy(BIGNUM *, const BIGNUM *);\nBIGNUM *BN_dup(const BIGNUM *);\n\nint BN_set_word(BIGNUM *, BN_ULONG);\nBN_ULONG BN_get_word(const BIGNUM *);\n\nconst BIGNUM *BN_value_one(void);\n\nchar *BN_bn2hex(const BIGNUM *);\nint BN_hex2bn(BIGNUM **, const char *);\nint BN_dec2bn(BIGNUM **, const char *);\n\nint BN_bn2bin(const BIGNUM *, unsigned char *);\nBIGNUM *BN_bin2bn(const unsigned char *, int, BIGNUM *);\n\nint BN_num_bits(const BIGNUM *);\n\nint BN_cmp(const BIGNUM *, const BIGNUM *);\nint BN_add(BIGNUM *, const BIGNUM *, const BIGNUM *);\nint BN_sub(BIGNUM *, const BIGNUM *, const BIGNUM *);\nint BN_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_sqr(BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_div(BIGNUM *, BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_nnmod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_mod_add(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_sub(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_sqr(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_mod_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_gcd(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nBIGNUM *BN_mod_inverse(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n\nint BN_set_bit(BIGNUM *, int);\nint BN_clear_bit(BIGNUM *, int);\n\nint BN_is_bit_set(const BIGNUM *, int);\n\nint BN_mask_bits(BIGNUM *, int);\n\nint BN_num_bytes(const BIGNUM *);\n\nint BN_zero(BIGNUM *);\nint BN_one(BIGNUM *);\nint BN_mod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n\nint BN_lshift(BIGNUM *, const BIGNUM *, int);\nint BN_lshift1(BIGNUM *, BIGNUM *);\n\nint BN_rshift(BIGNUM *, BIGNUM *, int);\nint BN_rshift1(BIGNUM *, BIGNUM *);\n\"\"\"\n\nCUSTOMIZATIONS = \"\"\"\n\"\"\"\n", "path": "src/_cffi_src/openssl/bignum.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nINCLUDES = \"\"\"\n#include <openssl/bn.h>\n\"\"\"\n\nTYPES = \"\"\"\ntypedef ... BN_CTX;\ntypedef ... BN_MONT_CTX;\ntypedef ... BIGNUM;\ntypedef int... 
BN_ULONG;\n\"\"\"\n\nFUNCTIONS = \"\"\"\n#define BN_FLG_CONSTTIME ...\n\nvoid BN_set_flags(BIGNUM *, int);\nint BN_get_flags(const BIGNUM *, int);\n\nBIGNUM *BN_new(void);\nvoid BN_free(BIGNUM *);\nvoid BN_clear_free(BIGNUM *);\n\nint BN_rand(BIGNUM *, int, int, int);\nint BN_rand_range(BIGNUM *, BIGNUM *);\n\nBN_CTX *BN_CTX_new(void);\nvoid BN_CTX_free(BN_CTX *);\n\nvoid BN_CTX_start(BN_CTX *);\nBIGNUM *BN_CTX_get(BN_CTX *);\nvoid BN_CTX_end(BN_CTX *);\n\nBN_MONT_CTX *BN_MONT_CTX_new(void);\nint BN_MONT_CTX_set(BN_MONT_CTX *, BIGNUM *, BN_CTX *);\nvoid BN_MONT_CTX_free(BN_MONT_CTX *);\n\nBIGNUM *BN_copy(BIGNUM *, const BIGNUM *);\nBIGNUM *BN_dup(const BIGNUM *);\n\nint BN_set_word(BIGNUM *, BN_ULONG);\nBN_ULONG BN_get_word(const BIGNUM *);\n\nconst BIGNUM *BN_value_one(void);\n\nchar *BN_bn2hex(const BIGNUM *);\nint BN_hex2bn(BIGNUM **, const char *);\nint BN_dec2bn(BIGNUM **, const char *);\n\nint BN_bn2bin(const BIGNUM *, unsigned char *);\nBIGNUM *BN_bin2bn(const unsigned char *, int, BIGNUM *);\n\nint BN_num_bits(const BIGNUM *);\n\nint BN_cmp(const BIGNUM *, const BIGNUM *);\nint BN_add(BIGNUM *, const BIGNUM *, const BIGNUM *);\nint BN_sub(BIGNUM *, const BIGNUM *, const BIGNUM *);\nint BN_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_sqr(BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_div(BIGNUM *, BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_nnmod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_mod_add(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_sub(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_mul(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_sqr(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nint BN_mod_exp(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *);\nint BN_mod_exp_mont(BIGNUM *, const BIGNUM *, const BIGNUM *, const BIGNUM *,\n BN_CTX *, BN_MONT_CTX *);\nint BN_mod_exp_mont_consttime(BIGNUM *, const BIGNUM *, const BIGNUM *,\n const BIGNUM *, BN_CTX *, BN_MONT_CTX *);\nint BN_gcd(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\nBIGNUM *BN_mod_inverse(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n\nint BN_set_bit(BIGNUM *, int);\nint BN_clear_bit(BIGNUM *, int);\n\nint BN_is_bit_set(const BIGNUM *, int);\n\nint BN_mask_bits(BIGNUM *, int);\n\nint BN_num_bytes(const BIGNUM *);\n\nint BN_zero(BIGNUM *);\nint BN_one(BIGNUM *);\nint BN_mod(BIGNUM *, const BIGNUM *, const BIGNUM *, BN_CTX *);\n\nint BN_lshift(BIGNUM *, const BIGNUM *, int);\nint BN_lshift1(BIGNUM *, BIGNUM *);\n\nint BN_rshift(BIGNUM *, BIGNUM *, int);\nint BN_rshift1(BIGNUM *, BIGNUM *);\n\"\"\"\n\nCUSTOMIZATIONS = \"\"\"\n\"\"\"\n", "path": "src/_cffi_src/openssl/bignum.py"}]}
```

**num_tokens:** 1,531 | **num_tokens_diff:** 443
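
As a companion to this row, a hedged sketch of how a downstream user could exercise the newly exposed constant-time flag. `Binding` is a real `cryptography` API, but `BN_set_flags`, `BN_get_flags`, and `BN_FLG_CONSTTIME` are only available in builds that include the golden patch:

```python
# Sketch only: assumes a `cryptography` build that ships the patched
# bignum bindings (BN_set_flags / BN_get_flags / BN_FLG_CONSTTIME).
from cryptography.hazmat.bindings.openssl.binding import Binding

lib = Binding().lib

bn = lib.BN_new()
lib.BN_set_flags(bn, lib.BN_FLG_CONSTTIME)   # mark the value as secret
assert lib.BN_get_flags(bn, lib.BN_FLG_CONSTTIME) != 0
lib.BN_clear_free(bn)                        # zero and free the secret BIGNUM
```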

---

**problem_id:** gh_patches_debug_19759 | **source:** rasdani/github-patches | **task_type:** git_diff | **in_source_id:** sql-machine-learning__elasticdl-761

**prompt:**

We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Provide default gpu resource name and validation
Currently users can only pass "gpu" as part of the resource name. However, k8s requires it to be either `"nvidia.com/gpu"` or `"amd.com/gpu"` if AMD plugin is enabled. There are other different vendors as well but a pattern to use for validation would be `"<vendor>.com/gpu"`.
We should consider adding `"nvidia.com/gpu"` as the default and validate for user provided gpu resource name based on the pattern `"<vendor>.com/gpu"`.
Reference: https://kubernetes.io/docs/tasks/manage-gpus/scheduling-gpus/
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticdl/python/elasticdl/common/k8s_utils.py`
Content:
```
1 import re
2
3
4 _ALLOWED_RESOURCE_TYPES = ["memory", "disk", "ephemeral-storage", "cpu", "gpu"]
5
6
7 def _is_numeric(n):
8 try:
9 float(n)
10 except ValueError:
11 return False
12 return True
13
14
15 def _valid_gpu_spec(gpu_str):
16 if not gpu_str.isnumeric():
17 raise ValueError("invalid gpu request spec: " + gpu_str)
18 return gpu_str
19
20
21 def _valid_cpu_spec(cpu_str):
22 regexp = re.compile("([1-9]{1})([0-9]*)m$")
23 if not regexp.match(cpu_str) and not _is_numeric(cpu_str):
24 raise ValueError("invalid cpu request spec: " + cpu_str)
25 return cpu_str
26
27
28 def _valid_mem_spec(mem_str):
29 regexp = re.compile("([1-9]{1})([0-9]*)(E|P|T|G|M|K|Ei|Pi|Ti|Gi|Mi|Ki)$")
30 if not regexp.match(mem_str):
31 raise ValueError("invalid memory request spec: " + mem_str)
32 return mem_str
33
34
35 def parse_resource(resource_str):
36 """Parse combined k8s resource string into a dict.
37
38 Args:
39 resource_str: The string representation for k8s resource,
40 e.g. "cpu=250m,memory=32Mi,disk=64Mi,gpu=1,ephemeral-storage=32Mi".
41
42 Return:
43 A Python dictionary parsed from the given resource string.
44 """
45 kvs = resource_str.split(",")
46 resource_names = []
47 parsed_res_dict = {}
48 for kv in kvs:
49 k, v = kv.split("=")
50 if k not in resource_names:
51 resource_names.append(k)
52 else:
53 raise ValueError(
54 "The resource string contains duplicate resource names: %s" % k
55 )
56 if k in ["memory", "disk", "ephemeral-storage"]:
57 _valid_mem_spec(v)
58 elif k == "cpu":
59 _valid_cpu_spec(v)
60 elif k == "gpu":
61 _valid_gpu_spec(v)
62 else:
63 raise ValueError(
64 "%s is not in the allowed list of resource types: %s"
65 % (k, _ALLOWED_RESOURCE_TYPES)
66 )
67 parsed_res_dict[k] = v
68 return parsed_res_dict
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```

**golden_diff:**

```diff
diff --git a/elasticdl/python/elasticdl/common/k8s_utils.py b/elasticdl/python/elasticdl/common/k8s_utils.py
--- a/elasticdl/python/elasticdl/common/k8s_utils.py
+++ b/elasticdl/python/elasticdl/common/k8s_utils.py
@@ -2,6 +2,10 @@
_ALLOWED_RESOURCE_TYPES = ["memory", "disk", "ephemeral-storage", "cpu", "gpu"]
+# Any domain name is (syntactically) valid if it's a dot-separated list of
+# identifiers, each no longer than 63 characters, and made up of letters,
+# digits and dashes (no underscores).
+_GPU_VENDOR_REGEX_STR = r"^[a-zA-Z\d-]{,63}(\.[a-zA-Z\d-]{,63})*/gpu$"
def _is_numeric(n):
@@ -57,7 +61,14 @@
_valid_mem_spec(v)
elif k == "cpu":
_valid_cpu_spec(v)
- elif k == "gpu":
+ elif "gpu" in k:
+ if k == "gpu":
+ k = "nvidia.com/gpu"
+ elif not re.compile(_GPU_VENDOR_REGEX_STR).match(k):
+ raise ValueError(
+ "gpu resource name does not have a valid vendor name: %s"
+ % k
+ )
_valid_gpu_spec(v)
else:
raise ValueError(
```

**verification_info:**

```json
{"golden_diff": "diff --git a/elasticdl/python/elasticdl/common/k8s_utils.py b/elasticdl/python/elasticdl/common/k8s_utils.py\n--- a/elasticdl/python/elasticdl/common/k8s_utils.py\n+++ b/elasticdl/python/elasticdl/common/k8s_utils.py\n@@ -2,6 +2,10 @@\n \n \n _ALLOWED_RESOURCE_TYPES = [\"memory\", \"disk\", \"ephemeral-storage\", \"cpu\", \"gpu\"]\n+# Any domain name is (syntactically) valid if it's a dot-separated list of\n+# identifiers, each no longer than 63 characters, and made up of letters,\n+# digits and dashes (no underscores).\n+_GPU_VENDOR_REGEX_STR = r\"^[a-zA-Z\\d-]{,63}(\\.[a-zA-Z\\d-]{,63})*/gpu$\"\n \n \n def _is_numeric(n):\n@@ -57,7 +61,14 @@\n _valid_mem_spec(v)\n elif k == \"cpu\":\n _valid_cpu_spec(v)\n- elif k == \"gpu\":\n+ elif \"gpu\" in k:\n+ if k == \"gpu\":\n+ k = \"nvidia.com/gpu\"\n+ elif not re.compile(_GPU_VENDOR_REGEX_STR).match(k):\n+ raise ValueError(\n+ \"gpu resource name does not have a valid vendor name: %s\"\n+ % k\n+ )\n _valid_gpu_spec(v)\n else:\n raise ValueError(\n", "issue": "Provide default gpu resource name and validation\nCurrently users can only pass \"gpu\" as part of the resource name. However, k8s requires it to be either `\"nvidia.com/gpu\"` or `\"amd.com/gpu\"` if AMD plugin is enabled. There are other different vendors as well but a pattern to use for validation would be `\"<vendor>.com/gpu\"`.\r\n\r\nWe should consider adding `\"nvidia.com/gpu\"` as the default and validate for user provided gpu resource name based on the pattern `\"<vendor>.com/gpu\"`.\r\n\r\nReference: https://kubernetes.io/docs/tasks/manage-gpus/scheduling-gpus/\n", "before_files": [{"content": "import re\n\n\n_ALLOWED_RESOURCE_TYPES = [\"memory\", \"disk\", \"ephemeral-storage\", \"cpu\", \"gpu\"]\n\n\ndef _is_numeric(n):\n try:\n float(n)\n except ValueError:\n return False\n return True\n\n\ndef _valid_gpu_spec(gpu_str):\n if not gpu_str.isnumeric():\n raise ValueError(\"invalid gpu request spec: \" + gpu_str)\n return gpu_str\n\n\ndef _valid_cpu_spec(cpu_str):\n regexp = re.compile(\"([1-9]{1})([0-9]*)m$\")\n if not regexp.match(cpu_str) and not _is_numeric(cpu_str):\n raise ValueError(\"invalid cpu request spec: \" + cpu_str)\n return cpu_str\n\n\ndef _valid_mem_spec(mem_str):\n regexp = re.compile(\"([1-9]{1})([0-9]*)(E|P|T|G|M|K|Ei|Pi|Ti|Gi|Mi|Ki)$\")\n if not regexp.match(mem_str):\n raise ValueError(\"invalid memory request spec: \" + mem_str)\n return mem_str\n\n\ndef parse_resource(resource_str):\n \"\"\"Parse combined k8s resource string into a dict.\n\n Args:\n resource_str: The string representation for k8s resource,\n e.g. 
\"cpu=250m,memory=32Mi,disk=64Mi,gpu=1,ephemeral-storage=32Mi\".\n\n Return:\n A Python dictionary parsed from the given resource string.\n \"\"\"\n kvs = resource_str.split(\",\")\n resource_names = []\n parsed_res_dict = {}\n for kv in kvs:\n k, v = kv.split(\"=\")\n if k not in resource_names:\n resource_names.append(k)\n else:\n raise ValueError(\n \"The resource string contains duplicate resource names: %s\" % k\n )\n if k in [\"memory\", \"disk\", \"ephemeral-storage\"]:\n _valid_mem_spec(v)\n elif k == \"cpu\":\n _valid_cpu_spec(v)\n elif k == \"gpu\":\n _valid_gpu_spec(v)\n else:\n raise ValueError(\n \"%s is not in the allowed list of resource types: %s\"\n % (k, _ALLOWED_RESOURCE_TYPES)\n )\n parsed_res_dict[k] = v\n return parsed_res_dict\n", "path": "elasticdl/python/elasticdl/common/k8s_utils.py"}], "after_files": [{"content": "import re\n\n\n_ALLOWED_RESOURCE_TYPES = [\"memory\", \"disk\", \"ephemeral-storage\", \"cpu\", \"gpu\"]\n# Any domain name is (syntactically) valid if it's a dot-separated list of\n# identifiers, each no longer than 63 characters, and made up of letters,\n# digits and dashes (no underscores).\n_GPU_VENDOR_REGEX_STR = r\"^[a-zA-Z\\d-]{,63}(\\.[a-zA-Z\\d-]{,63})*/gpu$\"\n\n\ndef _is_numeric(n):\n try:\n float(n)\n except ValueError:\n return False\n return True\n\n\ndef _valid_gpu_spec(gpu_str):\n if not gpu_str.isnumeric():\n raise ValueError(\"invalid gpu request spec: \" + gpu_str)\n return gpu_str\n\n\ndef _valid_cpu_spec(cpu_str):\n regexp = re.compile(\"([1-9]{1})([0-9]*)m$\")\n if not regexp.match(cpu_str) and not _is_numeric(cpu_str):\n raise ValueError(\"invalid cpu request spec: \" + cpu_str)\n return cpu_str\n\n\ndef _valid_mem_spec(mem_str):\n regexp = re.compile(\"([1-9]{1})([0-9]*)(E|P|T|G|M|K|Ei|Pi|Ti|Gi|Mi|Ki)$\")\n if not regexp.match(mem_str):\n raise ValueError(\"invalid memory request spec: \" + mem_str)\n return mem_str\n\n\ndef parse_resource(resource_str):\n \"\"\"Parse combined k8s resource string into a dict.\n\n Args:\n resource_str: The string representation for k8s resource,\n e.g. \"cpu=250m,memory=32Mi,disk=64Mi,gpu=1,ephemeral-storage=32Mi\".\n\n Return:\n A Python dictionary parsed from the given resource string.\n \"\"\"\n kvs = resource_str.split(\",\")\n resource_names = []\n parsed_res_dict = {}\n for kv in kvs:\n k, v = kv.split(\"=\")\n if k not in resource_names:\n resource_names.append(k)\n else:\n raise ValueError(\n \"The resource string contains duplicate resource names: %s\" % k\n )\n if k in [\"memory\", \"disk\", \"ephemeral-storage\"]:\n _valid_mem_spec(v)\n elif k == \"cpu\":\n _valid_cpu_spec(v)\n elif \"gpu\" in k:\n if k == \"gpu\":\n k = \"nvidia.com/gpu\"\n elif not re.compile(_GPU_VENDOR_REGEX_STR).match(k):\n raise ValueError(\n \"gpu resource name does not have a valid vendor name: %s\"\n % k\n )\n _valid_gpu_spec(v)\n else:\n raise ValueError(\n \"%s is not in the allowed list of resource types: %s\"\n % (k, _ALLOWED_RESOURCE_TYPES)\n )\n parsed_res_dict[k] = v\n return parsed_res_dict\n", "path": "elasticdl/python/elasticdl/common/k8s_utils.py"}]}
```

**num_tokens:** 1,033 | **num_tokens_diff:** 318
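
The validation rule the golden diff adds is self-contained enough to check in isolation. A small sketch with the regex copied from the patch; the helper name is made up for illustration:

```python
import re

# Regex copied from the patch: "<vendor domain>/gpu", dot-separated labels
# of letters, digits and dashes, each at most 63 characters.
_GPU_VENDOR_REGEX_STR = r"^[a-zA-Z\d-]{,63}(\.[a-zA-Z\d-]{,63})*/gpu$"

def normalize_gpu_resource_name(name):
    # Hypothetical helper: bare "gpu" defaults to the NVIDIA plugin name.
    if name == "gpu":
        return "nvidia.com/gpu"
    if not re.match(_GPU_VENDOR_REGEX_STR, name):
        raise ValueError(
            "gpu resource name does not have a valid vendor name: %s" % name
        )
    return name

assert normalize_gpu_resource_name("gpu") == "nvidia.com/gpu"
assert normalize_gpu_resource_name("amd.com/gpu") == "amd.com/gpu"
```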

---

**problem_id:** gh_patches_debug_12859 | **source:** rasdani/github-patches | **task_type:** git_diff | **in_source_id:** secdev__scapy-2556

**prompt:**

We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
plain_str not consistent between Py2 and Py3
**Python 2 (expected result)**
```python
>>> plain_str(b'\x3e\xb4\xef\x2c\x11')
'>\xb4\xef,\x11'
```
**Python 3 (unexpected result)**
```python
>>> plain_str(b'\x3e\xb4\xef\x2c\x11')
'>,\x11'
```
**Cause** is `errors="ignore"`
```python
def plain_str(x):
"""Convert basic byte objects to str"""
if isinstance(x, bytes):
return x.decode(errors="ignore")
return str(x)
```
It just ignores values that are not in a valid range for the codec.
**Possible fix**
```python
def plain_str(x):
"""Convert basic byte objects to str"""
if isinstance(x, bytes):
return x.decode(errors="backslashreplace")
return str(x)
```
But this breaks compatibility for Python 3.4
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scapy/compat.py`
Content:
```
1 # This file is part of Scapy
2 # See http://www.secdev.org/projects/scapy for more information
3 # Copyright (C) Philippe Biondi <[email protected]>
4 # Copyright (C) Gabriel Potter <[email protected]>
5 # This program is published under a GPLv2 license
6
7 """
8 Python 2 and 3 link classes.
9 """
10
11 from __future__ import absolute_import
12 import base64
13 import binascii
14 import gzip
15 import struct
16
17 import scapy.modules.six as six
18
19 ###########
20 # Python3 #
21 ###########
22
23
24 def lambda_tuple_converter(func):
25 """
26 Converts a Python 2 function as
27 lambda (x,y): x + y
28 In the Python 3 format:
29 lambda x,y : x + y
30 """
31 if func is not None and func.__code__.co_argcount == 1:
32 return lambda *args: func(args[0] if len(args) == 1 else args)
33 else:
34 return func
35
36
37 if six.PY2:
38 bytes_encode = plain_str = str
39 chb = lambda x: x if isinstance(x, str) else chr(x)
40 orb = ord
41
42 def raw(x):
43 """Builds a packet and returns its bytes representation.
44 This function is and always be cross-version compatible"""
45 if hasattr(x, "__bytes__"):
46 return x.__bytes__()
47 return bytes(x)
48 else:
49 def raw(x):
50 """Builds a packet and returns its bytes representation.
51 This function is and always be cross-version compatible"""
52 return bytes(x)
53
54 def bytes_encode(x):
55 """Ensure that the given object is bytes.
56 If the parameter is a packet, raw() should be preferred.
57 """
58 if isinstance(x, str):
59 return x.encode()
60 return bytes(x)
61
62 def plain_str(x):
63 """Convert basic byte objects to str"""
64 if isinstance(x, bytes):
65 return x.decode(errors="ignore")
66 return str(x)
67
68 def chb(x):
69 """Same than chr() but encode as bytes."""
70 return struct.pack("!B", x)
71
72 def orb(x):
73 """Return ord(x) when not already an int."""
74 if isinstance(x, int):
75 return x
76 return ord(x)
77
78
79 def bytes_hex(x):
80 """Hexify a str or a bytes object"""
81 return binascii.b2a_hex(bytes_encode(x))
82
83
84 def hex_bytes(x):
85 """De-hexify a str or a byte object"""
86 return binascii.a2b_hex(bytes_encode(x))
87
88
89 def base64_bytes(x):
90 """Turn base64 into bytes"""
91 if six.PY2:
92 return base64.decodestring(x)
93 return base64.decodebytes(bytes_encode(x))
94
95
96 def bytes_base64(x):
97 """Turn bytes into base64"""
98 if six.PY2:
99 return base64.encodestring(x).replace('\n', '')
100 return base64.encodebytes(bytes_encode(x)).replace(b'\n', b'')
101
102
103 if six.PY2:
104 from StringIO import StringIO
105
106 def gzip_decompress(x):
107 """Decompress using gzip"""
108 with gzip.GzipFile(fileobj=StringIO(x), mode='rb') as fdesc:
109 return fdesc.read()
110
111 def gzip_compress(x):
112 """Compress using gzip"""
113 buf = StringIO()
114 with gzip.GzipFile(fileobj=buf, mode='wb') as fdesc:
115 fdesc.write(x)
116 return buf.getvalue()
117 else:
118 gzip_decompress = gzip.decompress
119 gzip_compress = gzip.compress
120
121 # Typing compatibility
122
123 try:
124 # Only required if using mypy-lang for static typing
125 from typing import Optional, List, Union, Callable, Any, AnyStr, Tuple, \
126 Sized, Dict, Pattern, cast
127 except ImportError:
128 # Let's make some fake ones.
129
130 def cast(_type, obj):
131 return obj
132
133 class _FakeType(object):
134 # make the objects subscriptable indefinetly
135 def __getitem__(self, item):
136 return _FakeType()
137
138 Optional = _FakeType()
139 Union = _FakeType()
140 Callable = _FakeType()
141 List = _FakeType()
142 Dict = _FakeType()
143 Any = _FakeType()
144 AnyStr = _FakeType()
145 Tuple = _FakeType()
146 Pattern = _FakeType()
147
148 class Sized(object):
149 pass
150
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```

**golden_diff:**

```diff
diff --git a/scapy/compat.py b/scapy/compat.py
--- a/scapy/compat.py
+++ b/scapy/compat.py
@@ -59,11 +59,19 @@
return x.encode()
return bytes(x)
- def plain_str(x):
- """Convert basic byte objects to str"""
- if isinstance(x, bytes):
- return x.decode(errors="ignore")
- return str(x)
+ if six.PY34:
+ def plain_str(x):
+ """Convert basic byte objects to str"""
+ if isinstance(x, bytes):
+ return x.decode(errors="ignore")
+ return str(x)
+ else:
+ # Python 3.5+
+ def plain_str(x):
+ """Convert basic byte objects to str"""
+ if isinstance(x, bytes):
+ return x.decode(errors="backslashreplace")
+ return str(x)
def chb(x):
"""Same than chr() but encode as bytes."""
```

**verification_info:**

```json
{"golden_diff": "diff --git a/scapy/compat.py b/scapy/compat.py\n--- a/scapy/compat.py\n+++ b/scapy/compat.py\n@@ -59,11 +59,19 @@\n return x.encode()\n return bytes(x)\n \n- def plain_str(x):\n- \"\"\"Convert basic byte objects to str\"\"\"\n- if isinstance(x, bytes):\n- return x.decode(errors=\"ignore\")\n- return str(x)\n+ if six.PY34:\n+ def plain_str(x):\n+ \"\"\"Convert basic byte objects to str\"\"\"\n+ if isinstance(x, bytes):\n+ return x.decode(errors=\"ignore\")\n+ return str(x)\n+ else:\n+ # Python 3.5+\n+ def plain_str(x):\n+ \"\"\"Convert basic byte objects to str\"\"\"\n+ if isinstance(x, bytes):\n+ return x.decode(errors=\"backslashreplace\")\n+ return str(x)\n \n def chb(x):\n \"\"\"Same than chr() but encode as bytes.\"\"\"\n", "issue": "plain_str not consistent between Py2 and Py3\n**Python 2 (expected result)**\r\n```python\r\n>>> plain_str(b'\\x3e\\xb4\\xef\\x2c\\x11')\r\n'>\\xb4\\xef,\\x11'\r\n```\r\n\r\n**Python 3 (unexpected result)**\r\n```python\r\n>>> plain_str(b'\\x3e\\xb4\\xef\\x2c\\x11') \r\n'>,\\x11'\r\n```\r\n\r\n**Cause** is `errors=\"ignore\"`\r\n```python\r\ndef plain_str(x): \r\n \"\"\"Convert basic byte objects to str\"\"\"\r\n if isinstance(x, bytes): \r\n return x.decode(errors=\"ignore\") \r\n return str(x) \r\n```\r\n\r\nIt just ignores values that are not in a valid range for the codec.\r\n\r\n**Possible fix**\r\n```python\r\ndef plain_str(x): \r\n \"\"\"Convert basic byte objects to str\"\"\"\r\n if isinstance(x, bytes): \r\n return x.decode(errors=\"backslashreplace\") \r\n return str(x) \r\n``` \r\nBut this breaks compatibility for Python 3.4\r\n\r\n\n", "before_files": [{"content": "# This file is part of Scapy\n# See http://www.secdev.org/projects/scapy for more information\n# Copyright (C) Philippe Biondi <[email protected]>\n# Copyright (C) Gabriel Potter <[email protected]>\n# This program is published under a GPLv2 license\n\n\"\"\"\nPython 2 and 3 link classes.\n\"\"\"\n\nfrom __future__ import absolute_import\nimport base64\nimport binascii\nimport gzip\nimport struct\n\nimport scapy.modules.six as six\n\n###########\n# Python3 #\n###########\n\n\ndef lambda_tuple_converter(func):\n \"\"\"\n Converts a Python 2 function as\n lambda (x,y): x + y\n In the Python 3 format:\n lambda x,y : x + y\n \"\"\"\n if func is not None and func.__code__.co_argcount == 1:\n return lambda *args: func(args[0] if len(args) == 1 else args)\n else:\n return func\n\n\nif six.PY2:\n bytes_encode = plain_str = str\n chb = lambda x: x if isinstance(x, str) else chr(x)\n orb = ord\n\n def raw(x):\n \"\"\"Builds a packet and returns its bytes representation.\n This function is and always be cross-version compatible\"\"\"\n if hasattr(x, \"__bytes__\"):\n return x.__bytes__()\n return bytes(x)\nelse:\n def raw(x):\n \"\"\"Builds a packet and returns its bytes representation.\n This function is and always be cross-version compatible\"\"\"\n return bytes(x)\n\n def bytes_encode(x):\n \"\"\"Ensure that the given object is bytes.\n If the parameter is a packet, raw() should be preferred.\n \"\"\"\n if isinstance(x, str):\n return x.encode()\n return bytes(x)\n\n def plain_str(x):\n \"\"\"Convert basic byte objects to str\"\"\"\n if isinstance(x, bytes):\n return x.decode(errors=\"ignore\")\n return str(x)\n\n def chb(x):\n \"\"\"Same than chr() but encode as bytes.\"\"\"\n return struct.pack(\"!B\", x)\n\n def orb(x):\n \"\"\"Return ord(x) when not already an int.\"\"\"\n if isinstance(x, int):\n return x\n return ord(x)\n\n\ndef bytes_hex(x):\n \"\"\"Hexify a str or a 
bytes object\"\"\"\n return binascii.b2a_hex(bytes_encode(x))\n\n\ndef hex_bytes(x):\n \"\"\"De-hexify a str or a byte object\"\"\"\n return binascii.a2b_hex(bytes_encode(x))\n\n\ndef base64_bytes(x):\n \"\"\"Turn base64 into bytes\"\"\"\n if six.PY2:\n return base64.decodestring(x)\n return base64.decodebytes(bytes_encode(x))\n\n\ndef bytes_base64(x):\n \"\"\"Turn bytes into base64\"\"\"\n if six.PY2:\n return base64.encodestring(x).replace('\\n', '')\n return base64.encodebytes(bytes_encode(x)).replace(b'\\n', b'')\n\n\nif six.PY2:\n from StringIO import StringIO\n\n def gzip_decompress(x):\n \"\"\"Decompress using gzip\"\"\"\n with gzip.GzipFile(fileobj=StringIO(x), mode='rb') as fdesc:\n return fdesc.read()\n\n def gzip_compress(x):\n \"\"\"Compress using gzip\"\"\"\n buf = StringIO()\n with gzip.GzipFile(fileobj=buf, mode='wb') as fdesc:\n fdesc.write(x)\n return buf.getvalue()\nelse:\n gzip_decompress = gzip.decompress\n gzip_compress = gzip.compress\n\n# Typing compatibility\n\ntry:\n # Only required if using mypy-lang for static typing\n from typing import Optional, List, Union, Callable, Any, AnyStr, Tuple, \\\n Sized, Dict, Pattern, cast\nexcept ImportError:\n # Let's make some fake ones.\n\n def cast(_type, obj):\n return obj\n\n class _FakeType(object):\n # make the objects subscriptable indefinetly\n def __getitem__(self, item):\n return _FakeType()\n\n Optional = _FakeType()\n Union = _FakeType()\n Callable = _FakeType()\n List = _FakeType()\n Dict = _FakeType()\n Any = _FakeType()\n AnyStr = _FakeType()\n Tuple = _FakeType()\n Pattern = _FakeType()\n\n class Sized(object):\n pass\n", "path": "scapy/compat.py"}], "after_files": [{"content": "# This file is part of Scapy\n# See http://www.secdev.org/projects/scapy for more information\n# Copyright (C) Philippe Biondi <[email protected]>\n# Copyright (C) Gabriel Potter <[email protected]>\n# This program is published under a GPLv2 license\n\n\"\"\"\nPython 2 and 3 link classes.\n\"\"\"\n\nfrom __future__ import absolute_import\nimport base64\nimport binascii\nimport gzip\nimport struct\n\nimport scapy.modules.six as six\n\n###########\n# Python3 #\n###########\n\n\ndef lambda_tuple_converter(func):\n \"\"\"\n Converts a Python 2 function as\n lambda (x,y): x + y\n In the Python 3 format:\n lambda x,y : x + y\n \"\"\"\n if func is not None and func.__code__.co_argcount == 1:\n return lambda *args: func(args[0] if len(args) == 1 else args)\n else:\n return func\n\n\nif six.PY2:\n bytes_encode = plain_str = str\n chb = lambda x: x if isinstance(x, str) else chr(x)\n orb = ord\n\n def raw(x):\n \"\"\"Builds a packet and returns its bytes representation.\n This function is and always be cross-version compatible\"\"\"\n if hasattr(x, \"__bytes__\"):\n return x.__bytes__()\n return bytes(x)\nelse:\n def raw(x):\n \"\"\"Builds a packet and returns its bytes representation.\n This function is and always be cross-version compatible\"\"\"\n return bytes(x)\n\n def bytes_encode(x):\n \"\"\"Ensure that the given object is bytes.\n If the parameter is a packet, raw() should be preferred.\n \"\"\"\n if isinstance(x, str):\n return x.encode()\n return bytes(x)\n\n if six.PY34:\n def plain_str(x):\n \"\"\"Convert basic byte objects to str\"\"\"\n if isinstance(x, bytes):\n return x.decode(errors=\"ignore\")\n return str(x)\n else:\n # Python 3.5+\n def plain_str(x):\n \"\"\"Convert basic byte objects to str\"\"\"\n if isinstance(x, bytes):\n return x.decode(errors=\"backslashreplace\")\n return str(x)\n\n def chb(x):\n \"\"\"Same than chr() but 
encode as bytes.\"\"\"\n return struct.pack(\"!B\", x)\n\n def orb(x):\n \"\"\"Return ord(x) when not already an int.\"\"\"\n if isinstance(x, int):\n return x\n return ord(x)\n\n\ndef bytes_hex(x):\n \"\"\"Hexify a str or a bytes object\"\"\"\n return binascii.b2a_hex(bytes_encode(x))\n\n\ndef hex_bytes(x):\n \"\"\"De-hexify a str or a byte object\"\"\"\n return binascii.a2b_hex(bytes_encode(x))\n\n\ndef base64_bytes(x):\n \"\"\"Turn base64 into bytes\"\"\"\n if six.PY2:\n return base64.decodestring(x)\n return base64.decodebytes(bytes_encode(x))\n\n\ndef bytes_base64(x):\n \"\"\"Turn bytes into base64\"\"\"\n if six.PY2:\n return base64.encodestring(x).replace('\\n', '')\n return base64.encodebytes(bytes_encode(x)).replace(b'\\n', b'')\n\n\nif six.PY2:\n from StringIO import StringIO\n\n def gzip_decompress(x):\n \"\"\"Decompress using gzip\"\"\"\n with gzip.GzipFile(fileobj=StringIO(x), mode='rb') as fdesc:\n return fdesc.read()\n\n def gzip_compress(x):\n \"\"\"Compress using gzip\"\"\"\n buf = StringIO()\n with gzip.GzipFile(fileobj=buf, mode='wb') as fdesc:\n fdesc.write(x)\n return buf.getvalue()\nelse:\n gzip_decompress = gzip.decompress\n gzip_compress = gzip.compress\n\n# Typing compatibility\n\ntry:\n # Only required if using mypy-lang for static typing\n from typing import Optional, List, Union, Callable, Any, AnyStr, Tuple, \\\n Sized, Dict, Pattern, cast\nexcept ImportError:\n # Let's make some fake ones.\n\n def cast(_type, obj):\n return obj\n\n class _FakeType(object):\n # make the objects subscriptable indefinetly\n def __getitem__(self, item):\n return _FakeType()\n\n Optional = _FakeType()\n Union = _FakeType()\n Callable = _FakeType()\n List = _FakeType()\n Dict = _FakeType()\n Any = _FakeType()\n AnyStr = _FakeType()\n Tuple = _FakeType()\n Pattern = _FakeType()\n\n class Sized(object):\n pass\n", "path": "scapy/compat.py"}]}
| 1,792 | 219 |
gh_patches_debug_8552
|
rasdani/github-patches
|
git_diff
|
Gallopsled__pwntools-323
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Importing shellcraft submodules doesn't work
```
$ python
>>> import pwnlib.shellcraft.arm
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named arm
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwnlib/shellcraft/__init__.py`
Content:
```
1 from types import ModuleType
2 import sys, os, re
3 from . import internal
4 from ..context import context
5
6 class module(ModuleType):
7 def __init__(self, name, directory):
8 super(module, self).__init__(name)
9
10 # Insert nice properties
11 self.__dict__.update({
12 '__file__': __file__,
13 '__package__': __package__,
14 '__path__': __path__,
15 })
16
17 # Save the shellcode directory
18 self._dir = directory
19
20 # Find the absolute path of the directory
21 self._absdir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates', self._dir)
22
23 # Get the docstring
24 with open(os.path.join(self._absdir, "__doc__")) as fd:
25 self.__doc__ = fd.read()
26
27 # Insert into the module list
28 sys.modules[self.__name__] = self
29
30 def __lazyinit__(self):
31
32 # Create a dictionary of submodules
33 self._submodules = {}
34 self._shellcodes = {}
35 for name in os.listdir(self._absdir):
36 path = os.path.join(self._absdir, name)
37 if os.path.isdir(path):
38 self._submodules[name] = module(self.__name__ + '.' + name, os.path.join(self._dir, name))
39 elif os.path.isfile(path) and name != '__doc__' and name[0] != '.':
40 funcname, _ext = os.path.splitext(name)
41 if not re.match('^[a-zA-Z][a-zA-Z0-9_]*$', funcname):
42 raise ValueError("found illegal filename, %r" % name)
43 self._shellcodes[funcname] = name
44
45 # Put the submodules into toplevel
46 self.__dict__.update(self._submodules)
47
48 # These are exported
49 self.__all__ = sorted(self._shellcodes.keys() + self._submodules.keys())
50
51 # Make sure this is not called again
52 self.__lazyinit__ = None
53
54 def __getattr__(self, key):
55 self.__lazyinit__ and self.__lazyinit__()
56
57 # Maybe the lazyinit added it
58 if key in self.__dict__:
59 return self.__dict__[key]
60
61 # This function lazy-loads the shellcodes
62 if key in self._shellcodes:
63 real = internal.make_function(key, self._shellcodes[key], self._dir)
64 setattr(self, key, real)
65 return real
66
67 for m in self._context_modules():
68 try:
69 return getattr(m, key)
70 except AttributeError:
71 pass
72
73 raise AttributeError("'module' object has no attribute '%s'" % key)
74
75 def __dir__(self):
76 # This function lists the available submodules, available shellcodes
77 # and potentially shellcodes available in submodules that should be
78 # avilable because of the context
79 self.__lazyinit__ and self.__lazyinit__()
80
81 result = list(self._submodules.keys())
82 result.extend(('__file__', '__package__', '__path__',
83 '__all__', '__name__'))
84 result.extend(self.__shellcodes__())
85
86 return result
87
88 def _context_modules(self):
89 self.__lazyinit__ and self.__lazyinit__()
90 for k, m in self._submodules.items():
91 if k in [context.arch, context.os]:
92 yield m
93
94 def __shellcodes__(self):
95 self.__lazyinit__ and self.__lazyinit__()
96 result = self._shellcodes.keys()
97 for m in self._context_modules():
98 result.extend(m.__shellcodes__())
99 return result
100
101 # To prevent garbage collection
102 tether = sys.modules[__name__]
103
104 # Create the module structure
105 module(__name__, '')
106
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```

**golden_diff:**

```diff
diff --git a/pwnlib/shellcraft/__init__.py b/pwnlib/shellcraft/__init__.py
--- a/pwnlib/shellcraft/__init__.py
+++ b/pwnlib/shellcraft/__init__.py
@@ -102,4 +102,22 @@
tether = sys.modules[__name__]
# Create the module structure
-module(__name__, '')
+shellcraft = module(__name__, '')
+
+class LazyImporter:
+ def find_module(self, fullname, path):
+ if not fullname.startswith('pwnlib.shellcraft.'):
+ return None
+
+ parts = fullname.split('.')[2:]
+ cur = shellcraft
+ for part in parts:
+ cur = getattr(cur, part, None)
+ if not isinstance(cur, ModuleType):
+ return None
+
+ return self
+
+ def load_module(self, fullname):
+ return sys.modules[fullname]
+sys.meta_path.append(LazyImporter())
```

**verification_info:**

```json
{"golden_diff": "diff --git a/pwnlib/shellcraft/__init__.py b/pwnlib/shellcraft/__init__.py\n--- a/pwnlib/shellcraft/__init__.py\n+++ b/pwnlib/shellcraft/__init__.py\n@@ -102,4 +102,22 @@\n tether = sys.modules[__name__]\n \n # Create the module structure\n-module(__name__, '')\n+shellcraft = module(__name__, '')\n+\n+class LazyImporter:\n+ def find_module(self, fullname, path):\n+ if not fullname.startswith('pwnlib.shellcraft.'):\n+ return None\n+\n+ parts = fullname.split('.')[2:]\n+ cur = shellcraft\n+ for part in parts:\n+ cur = getattr(cur, part, None)\n+ if not isinstance(cur, ModuleType):\n+ return None\n+\n+ return self\n+\n+ def load_module(self, fullname):\n+ return sys.modules[fullname]\n+sys.meta_path.append(LazyImporter())\n", "issue": "Importing shellcraft submodules doesn't work\n```\n$ python\n>>> import pwnlib.shellcraft.arm\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nImportError: No module named arm\n```\n\n", "before_files": [{"content": "from types import ModuleType\nimport sys, os, re\nfrom . import internal\nfrom ..context import context\n\nclass module(ModuleType):\n def __init__(self, name, directory):\n super(module, self).__init__(name)\n\n # Insert nice properties\n self.__dict__.update({\n '__file__': __file__,\n '__package__': __package__,\n '__path__': __path__,\n })\n\n # Save the shellcode directory\n self._dir = directory\n\n # Find the absolute path of the directory\n self._absdir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates', self._dir)\n\n # Get the docstring\n with open(os.path.join(self._absdir, \"__doc__\")) as fd:\n self.__doc__ = fd.read()\n\n # Insert into the module list\n sys.modules[self.__name__] = self\n\n def __lazyinit__(self):\n\n # Create a dictionary of submodules\n self._submodules = {}\n self._shellcodes = {}\n for name in os.listdir(self._absdir):\n path = os.path.join(self._absdir, name)\n if os.path.isdir(path):\n self._submodules[name] = module(self.__name__ + '.' 
+ name, os.path.join(self._dir, name))\n elif os.path.isfile(path) and name != '__doc__' and name[0] != '.':\n funcname, _ext = os.path.splitext(name)\n if not re.match('^[a-zA-Z][a-zA-Z0-9_]*$', funcname):\n raise ValueError(\"found illegal filename, %r\" % name)\n self._shellcodes[funcname] = name\n\n # Put the submodules into toplevel\n self.__dict__.update(self._submodules)\n\n # These are exported\n self.__all__ = sorted(self._shellcodes.keys() + self._submodules.keys())\n\n # Make sure this is not called again\n self.__lazyinit__ = None\n\n def __getattr__(self, key):\n self.__lazyinit__ and self.__lazyinit__()\n\n # Maybe the lazyinit added it\n if key in self.__dict__:\n return self.__dict__[key]\n\n # This function lazy-loads the shellcodes\n if key in self._shellcodes:\n real = internal.make_function(key, self._shellcodes[key], self._dir)\n setattr(self, key, real)\n return real\n\n for m in self._context_modules():\n try:\n return getattr(m, key)\n except AttributeError:\n pass\n\n raise AttributeError(\"'module' object has no attribute '%s'\" % key)\n\n def __dir__(self):\n # This function lists the available submodules, available shellcodes\n # and potentially shellcodes available in submodules that should be\n # avilable because of the context\n self.__lazyinit__ and self.__lazyinit__()\n\n result = list(self._submodules.keys())\n result.extend(('__file__', '__package__', '__path__',\n '__all__', '__name__'))\n result.extend(self.__shellcodes__())\n\n return result\n\n def _context_modules(self):\n self.__lazyinit__ and self.__lazyinit__()\n for k, m in self._submodules.items():\n if k in [context.arch, context.os]:\n yield m\n\n def __shellcodes__(self):\n self.__lazyinit__ and self.__lazyinit__()\n result = self._shellcodes.keys()\n for m in self._context_modules():\n result.extend(m.__shellcodes__())\n return result\n\n# To prevent garbage collection\ntether = sys.modules[__name__]\n\n# Create the module structure\nmodule(__name__, '')\n", "path": "pwnlib/shellcraft/__init__.py"}], "after_files": [{"content": "from types import ModuleType\nimport sys, os, re\nfrom . import internal\nfrom ..context import context\n\nclass module(ModuleType):\n def __init__(self, name, directory):\n super(module, self).__init__(name)\n\n # Insert nice properties\n self.__dict__.update({\n '__file__': __file__,\n '__package__': __package__,\n '__path__': __path__,\n })\n\n # Save the shellcode directory\n self._dir = directory\n\n # Find the absolute path of the directory\n self._absdir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'templates', self._dir)\n\n # Get the docstring\n with open(os.path.join(self._absdir, \"__doc__\")) as fd:\n self.__doc__ = fd.read()\n\n # Insert into the module list\n sys.modules[self.__name__] = self\n\n def __lazyinit__(self):\n\n # Create a dictionary of submodules\n self._submodules = {}\n self._shellcodes = {}\n for name in os.listdir(self._absdir):\n path = os.path.join(self._absdir, name)\n if os.path.isdir(path):\n self._submodules[name] = module(self.__name__ + '.' 
+ name, os.path.join(self._dir, name))\n elif os.path.isfile(path) and name != '__doc__' and name[0] != '.':\n funcname, _ext = os.path.splitext(name)\n if not re.match('^[a-zA-Z][a-zA-Z0-9_]*$', funcname):\n raise ValueError(\"found illegal filename, %r\" % name)\n self._shellcodes[funcname] = name\n\n # Put the submodules into toplevel\n self.__dict__.update(self._submodules)\n\n # These are exported\n self.__all__ = sorted(self._shellcodes.keys() + self._submodules.keys())\n\n # Make sure this is not called again\n self.__lazyinit__ = None\n\n def __getattr__(self, key):\n self.__lazyinit__ and self.__lazyinit__()\n\n # Maybe the lazyinit added it\n if key in self.__dict__:\n return self.__dict__[key]\n\n # This function lazy-loads the shellcodes\n if key in self._shellcodes:\n real = internal.make_function(key, self._shellcodes[key], self._dir)\n setattr(self, key, real)\n return real\n\n for m in self._context_modules():\n try:\n return getattr(m, key)\n except AttributeError:\n pass\n\n raise AttributeError(\"'module' object has no attribute '%s'\" % key)\n\n def __dir__(self):\n # This function lists the available submodules, available shellcodes\n # and potentially shellcodes available in submodules that should be\n # avilable because of the context\n self.__lazyinit__ and self.__lazyinit__()\n\n result = list(self._submodules.keys())\n result.extend(('__file__', '__package__', '__path__',\n '__all__', '__name__'))\n result.extend(self.__shellcodes__())\n\n return result\n\n def _context_modules(self):\n self.__lazyinit__ and self.__lazyinit__()\n for k, m in self._submodules.items():\n if k in [context.arch, context.os]:\n yield m\n\n def __shellcodes__(self):\n self.__lazyinit__ and self.__lazyinit__()\n result = self._shellcodes.keys()\n for m in self._context_modules():\n result.extend(m.__shellcodes__())\n return result\n\n# To prevent garbage collection\ntether = sys.modules[__name__]\n\n# Create the module structure\nshellcraft = module(__name__, '')\n\nclass LazyImporter:\n def find_module(self, fullname, path):\n if not fullname.startswith('pwnlib.shellcraft.'):\n return None\n\n parts = fullname.split('.')[2:]\n cur = shellcraft\n for part in parts:\n cur = getattr(cur, part, None)\n if not isinstance(cur, ModuleType):\n return None\n\n return self\n\n def load_module(self, fullname):\n return sys.modules[fullname]\nsys.meta_path.append(LazyImporter())\n", "path": "pwnlib/shellcraft/__init__.py"}]}
```

**num_tokens:** 1,349 | **num_tokens_diff:** 215
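
The fix hinges on a `sys.meta_path` hook that lets the import system find submodules the lazy `module` class registers on first attribute access. A self-contained sketch of the same mechanism under a hypothetical `myns` namespace; it uses the legacy `find_module`/`load_module` protocol that the patch itself uses (deprecated since Python 3.4 and removed in 3.12, where `find_spec` is required):

```python
import sys
from types import ModuleType

class LazyPackage(ModuleType):
    """Parent module that materializes submodules on first attribute access."""

    def __init__(self, name, subnames):
        super().__init__(name)
        self.__path__ = []             # mark this module as a package
        self._subnames = set(subnames)
        sys.modules[name] = self       # register, as the patch's module class does

    def __getattr__(self, key):
        if key in self._subnames:
            child = ModuleType(self.__name__ + "." + key)
            sys.modules[child.__name__] = child   # lazy creation + registration
            setattr(self, key, child)
            return child
        raise AttributeError(key)

class LazyImporter:
    """Meta-path hook mirroring the patch: walking getattr() forces the
    lazy package to create the child, then the registered module is returned."""

    def find_module(self, fullname, path=None):
        if not fullname.startswith("myns."):
            return None
        cur = sys.modules["myns"]
        for part in fullname.split(".")[1:]:
            cur = getattr(cur, part, None)        # triggers lazy creation
            if not isinstance(cur, ModuleType):
                return None
        return self

    def load_module(self, fullname):
        return sys.modules[fullname]

LazyPackage("myns", ["sub"])
sys.meta_path.append(LazyImporter())

import myns.sub   # resolves via the hook instead of raising ImportError
print(myns.sub)
```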

---

**problem_id:** gh_patches_debug_26773 | **source:** rasdani/github-patches | **task_type:** git_diff | **in_source_id:** optuna__optuna-4940

**prompt:**

We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix `Test` warnings from `BoTorch`
### Motivation
Resolve following warnings from test:
```
tests/terminator_tests/improvement_tests/gp_tests/test_botorch.py: 4 warnings
tests/visualization_tests/test_terminator_improvement.py: 46 warnings
/opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/site-packages/botorch/fit.py:139: DeprecationWarning:
`fit_gpytorch_model` is marked for deprecation, consider using `fit_gpytorch_mll` instead.
```
### Suggestion
Look for the proper way to fix this warning (Hopefully, just replacing `fit_gpytorch_model` with `fit_gpytorch_mll`).
### Additional context (optional)
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `optuna/terminator/improvement/gp/botorch.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import Optional
4
5 import numpy as np
6
7 from optuna._imports import try_import
8 from optuna.distributions import _is_distribution_log
9 from optuna.distributions import CategoricalDistribution
10 from optuna.distributions import FloatDistribution
11 from optuna.distributions import IntDistribution
12 from optuna.search_space import intersection_search_space
13 from optuna.terminator.improvement.gp.base import BaseGaussianProcess
14 from optuna.trial._frozen import FrozenTrial
15 from optuna.trial._state import TrialState
16
17
18 with try_import() as _imports:
19 from botorch.fit import fit_gpytorch_model
20 from botorch.models import SingleTaskGP
21 from botorch.models.transforms import Normalize
22 from botorch.models.transforms import Standardize
23 import gpytorch
24 import torch
25
26 __all__ = [
27 "fit_gpytorch_model",
28 "SingleTaskGP",
29 "Normalize",
30 "Standardize",
31 "gpytorch",
32 "torch",
33 ]
34
35
36 class _BoTorchGaussianProcess(BaseGaussianProcess):
37 def __init__(self) -> None:
38 _imports.check()
39
40 self._gp: Optional[SingleTaskGP] = None
41
42 def fit(
43 self,
44 trials: list[FrozenTrial],
45 ) -> None:
46 self._trials = trials
47
48 x, bounds = _convert_trials_to_tensors(trials)
49
50 n_params = x.shape[1]
51
52 y = torch.tensor([trial.value for trial in trials], dtype=torch.float64)
53 y = torch.unsqueeze(y, 1)
54
55 self._gp = SingleTaskGP(
56 x,
57 y,
58 input_transform=Normalize(d=n_params, bounds=bounds),
59 outcome_transform=Standardize(m=1),
60 )
61
62 mll = gpytorch.mlls.ExactMarginalLogLikelihood(self._gp.likelihood, self._gp)
63
64 fit_gpytorch_model(mll)
65
66 def predict_mean_std(
67 self,
68 trials: list[FrozenTrial],
69 ) -> tuple[np.ndarray, np.ndarray]:
70 assert self._gp is not None
71
72 x, _ = _convert_trials_to_tensors(trials)
73
74 with torch.no_grad(), gpytorch.settings.fast_pred_var():
75 posterior = self._gp.posterior(x)
76 mean = posterior.mean
77 variance = posterior.variance
78 std = variance.sqrt()
79
80 return mean.detach().numpy(), std.detach().numpy()
81
82
83 def _convert_trials_to_tensors(trials: list[FrozenTrial]) -> tuple[torch.Tensor, torch.Tensor]:
84 """Convert a list of FrozenTrial objects to tensors inputs and bounds.
85
86 This function assumes the following condition for input trials:
87 - any categorical param is converted to a float or int one;
88 - log is unscaled for any float/int distribution;
89 - the state is COMPLETE for any trial;
90 - direction is MINIMIZE for any trial.
91 """
92 search_space = intersection_search_space(trials)
93 sorted_params = sorted(search_space.keys())
94
95 x = []
96 for trial in trials:
97 assert trial.state == TrialState.COMPLETE
98 x_row = []
99 for param in sorted_params:
100 distribution = search_space[param]
101
102 assert not _is_distribution_log(distribution)
103 assert not isinstance(distribution, CategoricalDistribution)
104
105 param_value = float(trial.params[param])
106 x_row.append(param_value)
107
108 x.append(x_row)
109
110 min_bounds = []
111 max_bounds = []
112 for param, distribution in search_space.items():
113 assert isinstance(distribution, (FloatDistribution, IntDistribution))
114 min_bounds.append(distribution.low)
115 max_bounds.append(distribution.high)
116 bounds = [min_bounds, max_bounds]
117
118 return torch.tensor(x, dtype=torch.float64), torch.tensor(bounds, dtype=torch.float64)
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/optuna/terminator/improvement/gp/botorch.py b/optuna/terminator/improvement/gp/botorch.py
--- a/optuna/terminator/improvement/gp/botorch.py
+++ b/optuna/terminator/improvement/gp/botorch.py
@@ -3,6 +3,7 @@
from typing import Optional
import numpy as np
+from packaging import version
from optuna._imports import try_import
from optuna.distributions import _is_distribution_log
@@ -16,15 +17,20 @@
with try_import() as _imports:
- from botorch.fit import fit_gpytorch_model
+ import botorch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize
from botorch.models.transforms import Standardize
import gpytorch
import torch
+ if version.parse(botorch.version.version) < version.parse("0.8.0"):
+ from botorch.fit import fit_gpytorch_model as fit_gpytorch_mll
+ else:
+ from botorch.fit import fit_gpytorch_mll
+
__all__ = [
- "fit_gpytorch_model",
+ "fit_gpytorch_mll",
"SingleTaskGP",
"Normalize",
"Standardize",
@@ -61,7 +67,7 @@
mll = gpytorch.mlls.ExactMarginalLogLikelihood(self._gp.likelihood, self._gp)
- fit_gpytorch_model(mll)
+ fit_gpytorch_mll(mll)
def predict_mean_std(
self,
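For a quick smoke check of the guarded import introduced above, something like the following sketch could be run; it assumes `packaging` is installed and that `botorch.version.version` is a PEP 440 string, which is what the diff itself relies on:
```python
# Mirrors the guard from the patch and confirms the symbol resolves.
from packaging import version

import botorch

if version.parse(botorch.version.version) < version.parse("0.8.0"):
    from botorch.fit import fit_gpytorch_model as fit_gpytorch_mll
else:
    from botorch.fit import fit_gpytorch_mll

print(fit_gpytorch_mll.__name__)
```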
|
{"golden_diff": "diff --git a/optuna/terminator/improvement/gp/botorch.py b/optuna/terminator/improvement/gp/botorch.py\n--- a/optuna/terminator/improvement/gp/botorch.py\n+++ b/optuna/terminator/improvement/gp/botorch.py\n@@ -3,6 +3,7 @@\n from typing import Optional\n \n import numpy as np\n+from packaging import version\n \n from optuna._imports import try_import\n from optuna.distributions import _is_distribution_log\n@@ -16,15 +17,20 @@\n \n \n with try_import() as _imports:\n- from botorch.fit import fit_gpytorch_model\n+ import botorch\n from botorch.models import SingleTaskGP\n from botorch.models.transforms import Normalize\n from botorch.models.transforms import Standardize\n import gpytorch\n import torch\n \n+ if version.parse(botorch.version.version) < version.parse(\"0.8.0\"):\n+ from botorch.fit import fit_gpytorch_model as fit_gpytorch_mll\n+ else:\n+ from botorch.fit import fit_gpytorch_mll\n+\n __all__ = [\n- \"fit_gpytorch_model\",\n+ \"fit_gpytorch_mll\",\n \"SingleTaskGP\",\n \"Normalize\",\n \"Standardize\",\n@@ -61,7 +67,7 @@\n \n mll = gpytorch.mlls.ExactMarginalLogLikelihood(self._gp.likelihood, self._gp)\n \n- fit_gpytorch_model(mll)\n+ fit_gpytorch_mll(mll)\n \n def predict_mean_std(\n self,\n", "issue": "Fix `Test` warnings from `BoTorch`\n### Motivation\n\nResolve following warnings from test:\r\n```\r\ntests/terminator_tests/improvement_tests/gp_tests/test_botorch.py: 4 warnings\r\ntests/visualization_tests/test_terminator_improvement.py: 46 warnings\r\n /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/site-packages/botorch/fit.py:139: DeprecationWarning:\r\n \r\n `fit_gpytorch_model` is marked for deprecation, consider using `fit_gpytorch_mll` instead.\r\n```\n\n### Suggestion\n\nLook for the proper way to fix this warning (Hopefully, just replacing `fit_gpytorch_model` with `fit_gpytorch_mll`).\n\n### Additional context (optional)\n\n_No response_\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import Optional\n\nimport numpy as np\n\nfrom optuna._imports import try_import\nfrom optuna.distributions import _is_distribution_log\nfrom optuna.distributions import CategoricalDistribution\nfrom optuna.distributions import FloatDistribution\nfrom optuna.distributions import IntDistribution\nfrom optuna.search_space import intersection_search_space\nfrom optuna.terminator.improvement.gp.base import BaseGaussianProcess\nfrom optuna.trial._frozen import FrozenTrial\nfrom optuna.trial._state import TrialState\n\n\nwith try_import() as _imports:\n from botorch.fit import fit_gpytorch_model\n from botorch.models import SingleTaskGP\n from botorch.models.transforms import Normalize\n from botorch.models.transforms import Standardize\n import gpytorch\n import torch\n\n__all__ = [\n \"fit_gpytorch_model\",\n \"SingleTaskGP\",\n \"Normalize\",\n \"Standardize\",\n \"gpytorch\",\n \"torch\",\n]\n\n\nclass _BoTorchGaussianProcess(BaseGaussianProcess):\n def __init__(self) -> None:\n _imports.check()\n\n self._gp: Optional[SingleTaskGP] = None\n\n def fit(\n self,\n trials: list[FrozenTrial],\n ) -> None:\n self._trials = trials\n\n x, bounds = _convert_trials_to_tensors(trials)\n\n n_params = x.shape[1]\n\n y = torch.tensor([trial.value for trial in trials], dtype=torch.float64)\n y = torch.unsqueeze(y, 1)\n\n self._gp = SingleTaskGP(\n x,\n y,\n input_transform=Normalize(d=n_params, bounds=bounds),\n outcome_transform=Standardize(m=1),\n )\n\n mll = gpytorch.mlls.ExactMarginalLogLikelihood(self._gp.likelihood, self._gp)\n\n 
fit_gpytorch_model(mll)\n\n def predict_mean_std(\n self,\n trials: list[FrozenTrial],\n ) -> tuple[np.ndarray, np.ndarray]:\n assert self._gp is not None\n\n x, _ = _convert_trials_to_tensors(trials)\n\n with torch.no_grad(), gpytorch.settings.fast_pred_var():\n posterior = self._gp.posterior(x)\n mean = posterior.mean\n variance = posterior.variance\n std = variance.sqrt()\n\n return mean.detach().numpy(), std.detach().numpy()\n\n\ndef _convert_trials_to_tensors(trials: list[FrozenTrial]) -> tuple[torch.Tensor, torch.Tensor]:\n \"\"\"Convert a list of FrozenTrial objects to tensors inputs and bounds.\n\n This function assumes the following condition for input trials:\n - any categorical param is converted to a float or int one;\n - log is unscaled for any float/int distribution;\n - the state is COMPLETE for any trial;\n - direction is MINIMIZE for any trial.\n \"\"\"\n search_space = intersection_search_space(trials)\n sorted_params = sorted(search_space.keys())\n\n x = []\n for trial in trials:\n assert trial.state == TrialState.COMPLETE\n x_row = []\n for param in sorted_params:\n distribution = search_space[param]\n\n assert not _is_distribution_log(distribution)\n assert not isinstance(distribution, CategoricalDistribution)\n\n param_value = float(trial.params[param])\n x_row.append(param_value)\n\n x.append(x_row)\n\n min_bounds = []\n max_bounds = []\n for param, distribution in search_space.items():\n assert isinstance(distribution, (FloatDistribution, IntDistribution))\n min_bounds.append(distribution.low)\n max_bounds.append(distribution.high)\n bounds = [min_bounds, max_bounds]\n\n return torch.tensor(x, dtype=torch.float64), torch.tensor(bounds, dtype=torch.float64)\n", "path": "optuna/terminator/improvement/gp/botorch.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Optional\n\nimport numpy as np\nfrom packaging import version\n\nfrom optuna._imports import try_import\nfrom optuna.distributions import _is_distribution_log\nfrom optuna.distributions import CategoricalDistribution\nfrom optuna.distributions import FloatDistribution\nfrom optuna.distributions import IntDistribution\nfrom optuna.search_space import intersection_search_space\nfrom optuna.terminator.improvement.gp.base import BaseGaussianProcess\nfrom optuna.trial._frozen import FrozenTrial\nfrom optuna.trial._state import TrialState\n\n\nwith try_import() as _imports:\n import botorch\n from botorch.models import SingleTaskGP\n from botorch.models.transforms import Normalize\n from botorch.models.transforms import Standardize\n import gpytorch\n import torch\n\n if version.parse(botorch.version.version) < version.parse(\"0.8.0\"):\n from botorch.fit import fit_gpytorch_model as fit_gpytorch_mll\n else:\n from botorch.fit import fit_gpytorch_mll\n\n__all__ = [\n \"fit_gpytorch_mll\",\n \"SingleTaskGP\",\n \"Normalize\",\n \"Standardize\",\n \"gpytorch\",\n \"torch\",\n]\n\n\nclass _BoTorchGaussianProcess(BaseGaussianProcess):\n def __init__(self) -> None:\n _imports.check()\n\n self._gp: Optional[SingleTaskGP] = None\n\n def fit(\n self,\n trials: list[FrozenTrial],\n ) -> None:\n self._trials = trials\n\n x, bounds = _convert_trials_to_tensors(trials)\n\n n_params = x.shape[1]\n\n y = torch.tensor([trial.value for trial in trials], dtype=torch.float64)\n y = torch.unsqueeze(y, 1)\n\n self._gp = SingleTaskGP(\n x,\n y,\n input_transform=Normalize(d=n_params, bounds=bounds),\n outcome_transform=Standardize(m=1),\n )\n\n mll = 
gpytorch.mlls.ExactMarginalLogLikelihood(self._gp.likelihood, self._gp)\n\n fit_gpytorch_mll(mll)\n\n def predict_mean_std(\n self,\n trials: list[FrozenTrial],\n ) -> tuple[np.ndarray, np.ndarray]:\n assert self._gp is not None\n\n x, _ = _convert_trials_to_tensors(trials)\n\n with torch.no_grad(), gpytorch.settings.fast_pred_var():\n posterior = self._gp.posterior(x)\n mean = posterior.mean\n variance = posterior.variance\n std = variance.sqrt()\n\n return mean.detach().numpy(), std.detach().numpy()\n\n\ndef _convert_trials_to_tensors(trials: list[FrozenTrial]) -> tuple[torch.Tensor, torch.Tensor]:\n \"\"\"Convert a list of FrozenTrial objects to tensors inputs and bounds.\n\n This function assumes the following condition for input trials:\n - any categorical param is converted to a float or int one;\n - log is unscaled for any float/int distribution;\n - the state is COMPLETE for any trial;\n - direction is MINIMIZE for any trial.\n \"\"\"\n search_space = intersection_search_space(trials)\n sorted_params = sorted(search_space.keys())\n\n x = []\n for trial in trials:\n assert trial.state == TrialState.COMPLETE\n x_row = []\n for param in sorted_params:\n distribution = search_space[param]\n\n assert not _is_distribution_log(distribution)\n assert not isinstance(distribution, CategoricalDistribution)\n\n param_value = float(trial.params[param])\n x_row.append(param_value)\n\n x.append(x_row)\n\n min_bounds = []\n max_bounds = []\n for param, distribution in search_space.items():\n assert isinstance(distribution, (FloatDistribution, IntDistribution))\n min_bounds.append(distribution.low)\n max_bounds.append(distribution.high)\n bounds = [min_bounds, max_bounds]\n\n return torch.tensor(x, dtype=torch.float64), torch.tensor(bounds, dtype=torch.float64)\n", "path": "optuna/terminator/improvement/gp/botorch.py"}]}
| 1,523 | 373 |
gh_patches_debug_8875
|
rasdani/github-patches
|
git_diff
|
microsoft__botbuilder-python-1401
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add tests for SkillHttpClient
See the .NET and JavaScript implementations.
[enhancement]
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 from logging import Logger
5
6 from botbuilder.core import InvokeResponse
7 from botbuilder.integration.aiohttp import BotFrameworkHttpClient
8 from botbuilder.core.skills import (
9 ConversationIdFactoryBase,
10 SkillConversationIdFactoryOptions,
11 BotFrameworkSkill,
12 )
13 from botbuilder.schema import Activity
14 from botframework.connector.auth import (
15 AuthenticationConstants,
16 ChannelProvider,
17 GovernmentConstants,
18 SimpleCredentialProvider,
19 )
20
21
22 class SkillHttpClient(BotFrameworkHttpClient):
23 def __init__(
24 self,
25 credential_provider: SimpleCredentialProvider,
26 skill_conversation_id_factory: ConversationIdFactoryBase,
27 channel_provider: ChannelProvider = None,
28 logger: Logger = None,
29 ):
30 if not skill_conversation_id_factory:
31 raise TypeError(
32 "SkillHttpClient(): skill_conversation_id_factory can't be None"
33 )
34
35 super().__init__(credential_provider)
36
37 self._skill_conversation_id_factory = skill_conversation_id_factory
38 self._channel_provider = channel_provider
39
40 async def post_activity_to_skill(
41 self,
42 from_bot_id: str,
43 to_skill: BotFrameworkSkill,
44 service_url: str,
45 activity: Activity,
46 originating_audience: str = None,
47 ) -> InvokeResponse:
48
49 if originating_audience is None:
50 originating_audience = (
51 GovernmentConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE
52 if self._channel_provider is not None
53 and self._channel_provider.IsGovernment()
54 else AuthenticationConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE
55 )
56
57 options = SkillConversationIdFactoryOptions(
58 from_bot_oauth_scope=originating_audience,
59 from_bot_id=from_bot_id,
60 activity=activity,
61 bot_framework_skill=to_skill,
62 )
63
64 skill_conversation_id = await self._skill_conversation_id_factory.create_skill_conversation_id(
65 options
66 )
67
68 return await super().post_activity(
69 from_bot_id,
70 to_skill.app_id,
71 to_skill.skill_endpoint,
72 service_url,
73 skill_conversation_id,
74 activity,
75 )
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py b/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py
--- a/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py
+++ b/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py
@@ -50,7 +50,7 @@
originating_audience = (
GovernmentConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE
if self._channel_provider is not None
- and self._channel_provider.IsGovernment()
+ and self._channel_provider.is_government()
else AuthenticationConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE
)
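Since the issue asks for `SkillHttpClient` tests, a minimal starting point is sketched below; it only exercises the constructor guard visible in the file above, and it assumes the class is re-exported from `botbuilder.integration.aiohttp.skills`:
```python
# Hedged sketch of a first SkillHttpClient test.
import pytest

from botbuilder.integration.aiohttp.skills import SkillHttpClient
from botframework.connector.auth import SimpleCredentialProvider


def test_constructor_rejects_missing_conversation_id_factory():
    credentials = SimpleCredentialProvider("app-id", "password")
    with pytest.raises(TypeError):
        SkillHttpClient(credentials, skill_conversation_id_factory=None)
```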
|
{"golden_diff": "diff --git a/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py b/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py\n--- a/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py\n+++ b/libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py\n@@ -50,7 +50,7 @@\n originating_audience = (\n GovernmentConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n if self._channel_provider is not None\n- and self._channel_provider.IsGovernment()\n+ and self._channel_provider.is_government()\n else AuthenticationConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n )\n", "issue": "Add tests for SkillHttpClient\nsee dotnet and javascript imp\r\n\r\n[enhancement]\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom logging import Logger\n\nfrom botbuilder.core import InvokeResponse\nfrom botbuilder.integration.aiohttp import BotFrameworkHttpClient\nfrom botbuilder.core.skills import (\n ConversationIdFactoryBase,\n SkillConversationIdFactoryOptions,\n BotFrameworkSkill,\n)\nfrom botbuilder.schema import Activity\nfrom botframework.connector.auth import (\n AuthenticationConstants,\n ChannelProvider,\n GovernmentConstants,\n SimpleCredentialProvider,\n)\n\n\nclass SkillHttpClient(BotFrameworkHttpClient):\n def __init__(\n self,\n credential_provider: SimpleCredentialProvider,\n skill_conversation_id_factory: ConversationIdFactoryBase,\n channel_provider: ChannelProvider = None,\n logger: Logger = None,\n ):\n if not skill_conversation_id_factory:\n raise TypeError(\n \"SkillHttpClient(): skill_conversation_id_factory can't be None\"\n )\n\n super().__init__(credential_provider)\n\n self._skill_conversation_id_factory = skill_conversation_id_factory\n self._channel_provider = channel_provider\n\n async def post_activity_to_skill(\n self,\n from_bot_id: str,\n to_skill: BotFrameworkSkill,\n service_url: str,\n activity: Activity,\n originating_audience: str = None,\n ) -> InvokeResponse:\n\n if originating_audience is None:\n originating_audience = (\n GovernmentConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n if self._channel_provider is not None\n and self._channel_provider.IsGovernment()\n else AuthenticationConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n )\n\n options = SkillConversationIdFactoryOptions(\n from_bot_oauth_scope=originating_audience,\n from_bot_id=from_bot_id,\n activity=activity,\n bot_framework_skill=to_skill,\n )\n\n skill_conversation_id = await self._skill_conversation_id_factory.create_skill_conversation_id(\n options\n )\n\n return await super().post_activity(\n from_bot_id,\n to_skill.app_id,\n to_skill.skill_endpoint,\n service_url,\n skill_conversation_id,\n activity,\n )\n", "path": "libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the MIT License.\n\nfrom logging import Logger\n\nfrom botbuilder.core import InvokeResponse\nfrom botbuilder.integration.aiohttp import BotFrameworkHttpClient\nfrom botbuilder.core.skills import (\n ConversationIdFactoryBase,\n SkillConversationIdFactoryOptions,\n BotFrameworkSkill,\n)\nfrom botbuilder.schema import Activity\nfrom botframework.connector.auth import (\n AuthenticationConstants,\n ChannelProvider,\n GovernmentConstants,\n SimpleCredentialProvider,\n)\n\n\nclass SkillHttpClient(BotFrameworkHttpClient):\n def __init__(\n self,\n credential_provider: SimpleCredentialProvider,\n skill_conversation_id_factory: ConversationIdFactoryBase,\n channel_provider: ChannelProvider = None,\n logger: Logger = None,\n ):\n if not skill_conversation_id_factory:\n raise TypeError(\n \"SkillHttpClient(): skill_conversation_id_factory can't be None\"\n )\n\n super().__init__(credential_provider)\n\n self._skill_conversation_id_factory = skill_conversation_id_factory\n self._channel_provider = channel_provider\n\n async def post_activity_to_skill(\n self,\n from_bot_id: str,\n to_skill: BotFrameworkSkill,\n service_url: str,\n activity: Activity,\n originating_audience: str = None,\n ) -> InvokeResponse:\n\n if originating_audience is None:\n originating_audience = (\n GovernmentConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n if self._channel_provider is not None\n and self._channel_provider.is_government()\n else AuthenticationConstants.TO_CHANNEL_FROM_BOT_OAUTH_SCOPE\n )\n\n options = SkillConversationIdFactoryOptions(\n from_bot_oauth_scope=originating_audience,\n from_bot_id=from_bot_id,\n activity=activity,\n bot_framework_skill=to_skill,\n )\n\n skill_conversation_id = await self._skill_conversation_id_factory.create_skill_conversation_id(\n options\n )\n\n return await super().post_activity(\n from_bot_id,\n to_skill.app_id,\n to_skill.skill_endpoint,\n service_url,\n skill_conversation_id,\n activity,\n )\n", "path": "libraries/botbuilder-integration-aiohttp/botbuilder/integration/aiohttp/skills/skill_http_client.py"}]}
| 894 | 189 |
gh_patches_debug_803
|
rasdani/github-patches
|
git_diff
|
googleapis__google-api-python-client-871
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AttributeError: module 'googleapiclient' has no attribute '__version__'
When importing the new version of google-api-python-client with
`from apiclient import discovery`
I'm getting the error
`AttributeError: module 'googleapiclient' has no attribute '__version__'`
https://github.com/googleapis/google-api-python-client/blob/84d45619d753cb04d957651886231034194058b6/apiclient/__init__.py#L22
I guess this happens since you have removed the `__version__` var from `googleapiclient/__init__.py`
https://github.com/googleapis/google-api-python-client/commit/f706cfd821ab7457e5db37abfc3619772657dd0e#diff-b926d296d4c856bcbf877809e4523562L15
Can you please fix this? @busunkim96 @mik-laj @crwilcox
Traceback:
```
from apiclient import discovery
  File "/usr/local/lib/python3.7/site-packages/apiclient/__init__.py", line 22, in <module>
    __version__ = googleapiclient.__version__
AttributeError: module 'googleapiclient' has no attribute '__version__'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apiclient/__init__.py`
Content:
```
1 """Retain apiclient as an alias for googleapiclient."""
2
3 from six import iteritems
4
5 import googleapiclient
6
7 from googleapiclient import channel
8 from googleapiclient import discovery
9 from googleapiclient import errors
10 from googleapiclient import http
11 from googleapiclient import mimeparse
12 from googleapiclient import model
13
14 try:
15 from googleapiclient import sample_tools
16 except ImportError:
17 # Silently ignore, because the vast majority of consumers won't use it and
18 # it has deep dependence on oauth2client, an optional dependency.
19 sample_tools = None
20 from googleapiclient import schema
21
22 __version__ = googleapiclient.__version__
23
24 _SUBMODULES = {
25 "channel": channel,
26 "discovery": discovery,
27 "errors": errors,
28 "http": http,
29 "mimeparse": mimeparse,
30 "model": model,
31 "sample_tools": sample_tools,
32 "schema": schema,
33 }
34
35 import sys
36
37 for module_name, module in iteritems(_SUBMODULES):
38 sys.modules["apiclient.%s" % module_name] = module
39
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/apiclient/__init__.py b/apiclient/__init__.py
--- a/apiclient/__init__.py
+++ b/apiclient/__init__.py
@@ -19,8 +19,6 @@
sample_tools = None
from googleapiclient import schema
-__version__ = googleapiclient.__version__
-
_SUBMODULES = {
"channel": channel,
"discovery": discovery,
|
{"golden_diff": "diff --git a/apiclient/__init__.py b/apiclient/__init__.py\n--- a/apiclient/__init__.py\n+++ b/apiclient/__init__.py\n@@ -19,8 +19,6 @@\n sample_tools = None\n from googleapiclient import schema\n \n-__version__ = googleapiclient.__version__\n-\n _SUBMODULES = {\n \"channel\": channel,\n \"discovery\": discovery,\n", "issue": "AttributeError: module 'googleapiclient' has no attribute '__version__'\nWhen importing new version of google-api-python-client\r\n`from apiclient import discovery`\r\ni'm getting the error\r\n`AttributeError: module 'googleapiclient' has no attribute '__version__'`\r\n\r\nhttps://github.com/googleapis/google-api-python-client/blob/84d45619d753cb04d957651886231034194058b6/apiclient/__init__.py#L22\r\n\r\ni guess this happens since you have removed `__version__` var from `googleapiclient/__init__.py`\r\n\r\nhttps://github.com/googleapis/google-api-python-client/commit/f706cfd821ab7457e5db37abfc3619772657dd0e#diff-b926d296d4c856bcbf877809e4523562L15\r\n\r\ncan you please fix? @busunkim96 @mik-laj @crwilcox \r\n\r\nTraceback:\r\n``` from apiclient import discovery\r\n File \"/usr/local/lib/python3.7/site-packages/apiclient/__init__.py\", line 22, in <module>\r\n __version__ = googleapiclient.__version__\r\nAttributeError: module 'googleapiclient' has no attribute '__version__'\n", "before_files": [{"content": "\"\"\"Retain apiclient as an alias for googleapiclient.\"\"\"\n\nfrom six import iteritems\n\nimport googleapiclient\n\nfrom googleapiclient import channel\nfrom googleapiclient import discovery\nfrom googleapiclient import errors\nfrom googleapiclient import http\nfrom googleapiclient import mimeparse\nfrom googleapiclient import model\n\ntry:\n from googleapiclient import sample_tools\nexcept ImportError:\n # Silently ignore, because the vast majority of consumers won't use it and\n # it has deep dependence on oauth2client, an optional dependency.\n sample_tools = None\nfrom googleapiclient import schema\n\n__version__ = googleapiclient.__version__\n\n_SUBMODULES = {\n \"channel\": channel,\n \"discovery\": discovery,\n \"errors\": errors,\n \"http\": http,\n \"mimeparse\": mimeparse,\n \"model\": model,\n \"sample_tools\": sample_tools,\n \"schema\": schema,\n}\n\nimport sys\n\nfor module_name, module in iteritems(_SUBMODULES):\n sys.modules[\"apiclient.%s\" % module_name] = module\n", "path": "apiclient/__init__.py"}], "after_files": [{"content": "\"\"\"Retain apiclient as an alias for googleapiclient.\"\"\"\n\nfrom six import iteritems\n\nimport googleapiclient\n\nfrom googleapiclient import channel\nfrom googleapiclient import discovery\nfrom googleapiclient import errors\nfrom googleapiclient import http\nfrom googleapiclient import mimeparse\nfrom googleapiclient import model\n\ntry:\n from googleapiclient import sample_tools\nexcept ImportError:\n # Silently ignore, because the vast majority of consumers won't use it and\n # it has deep dependence on oauth2client, an optional dependency.\n sample_tools = None\nfrom googleapiclient import schema\n\n_SUBMODULES = {\n \"channel\": channel,\n \"discovery\": discovery,\n \"errors\": errors,\n \"http\": http,\n \"mimeparse\": mimeparse,\n \"model\": model,\n \"sample_tools\": sample_tools,\n \"schema\": schema,\n}\n\nimport sys\n\nfor module_name, module in iteritems(_SUBMODULES):\n sys.modules[\"apiclient.%s\" % module_name] = module\n", "path": "apiclient/__init__.py"}]}
| 904 | 98 |
gh_patches_debug_19691
|
rasdani/github-patches
|
git_diff
|
aws-cloudformation__cfn-lint-2967
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Invalid Aliases when using SSM dynamic references
### CloudFormation Lint Version
0.83.0
### What operating system are you using?
Mac
### Describe the bug
When using a dynamic reference to resolve the Alias domain, cfn-lint fails, flagging it as an invalid alias. Shouldn't the code in https://github.com/aws-cloudformation/cfn-lint/blob/main/src/cfnlint/rules/resources/cloudfront/Aliases.py check whether the value matches `REGEX_DYN_REF` and ignore it if so?
A workaround is to use `!Sub`, which apparently is ignored already (via `FUNCTIONS`). Shouldn't values matching `REGEX_DYN_REF` be ignored as well?
### Expected behavior
E3013 shouldn't be reported, since there's no way to validate the dynamic-reference value from cfn-lint's perspective.
### Reproduction template
```
CloudFront:
Type: AWS::CloudFront::Distribution
Properties:
DistributionConfig:
Enabled: true
Aliases:
- "{{resolve:ssm:/env/fqdns/certifier}}"
DefaultRootObject: index.html
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cfnlint/rules/resources/cloudfront/Aliases.py`
Content:
```
1 """
2 Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
3 SPDX-License-Identifier: MIT-0
4 """
5 import regex as re
6
7 from cfnlint.helpers import FUNCTIONS
8 from cfnlint.rules import CloudFormationLintRule, RuleMatch
9
10
11 class Aliases(CloudFormationLintRule):
12 """Check if CloudFront Aliases are valid domain names"""
13
14 id = "E3013"
15 shortdesc = "CloudFront Aliases"
16 description = "CloudFront aliases should contain valid domain names"
17 source_url = "https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-cloudfront-distribution-distributionconfig.html#cfn-cloudfront-distribution-distributionconfig-aliases"
18 tags = ["properties", "cloudfront"]
19
20 def match(self, cfn):
21 """Check cloudfront Resource Parameters"""
22
23 matches = []
24
25 valid_domain = re.compile(
26 r"^(?:[a-z0-9\*](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z0-9][a-z0-9-]{0,61}[a-z0-9]$"
27 )
28
29 results = cfn.get_resource_properties(
30 ["AWS::CloudFront::Distribution", "DistributionConfig"]
31 )
32 for result in results:
33 aliases = result["Value"].get("Aliases")
34 if aliases:
35 for alias in aliases:
36 if isinstance(alias, str) and alias not in FUNCTIONS:
37 wildcard = alias.split(".")
38 if "*" in wildcard[1:]:
39 path = result["Path"] + ["Aliases"]
40 message = f'Invalid use of wildcards: {alias} at {"/".join(result["Path"])}'
41 matches.append(RuleMatch(path, message))
42 if not re.match(valid_domain, alias):
43 path = result["Path"] + ["Aliases"]
44 message = f'Invalid alias found: {alias} at {"/".join(result["Path"])}'
45 matches.append(RuleMatch(path, message))
46
47 return matches
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cfnlint/rules/resources/cloudfront/Aliases.py b/src/cfnlint/rules/resources/cloudfront/Aliases.py
--- a/src/cfnlint/rules/resources/cloudfront/Aliases.py
+++ b/src/cfnlint/rules/resources/cloudfront/Aliases.py
@@ -4,7 +4,7 @@
"""
import regex as re
-from cfnlint.helpers import FUNCTIONS
+from cfnlint.helpers import FUNCTIONS, REGEX_DYN_REF
from cfnlint.rules import CloudFormationLintRule, RuleMatch
@@ -35,6 +35,8 @@
for alias in aliases:
if isinstance(alias, str) and alias not in FUNCTIONS:
wildcard = alias.split(".")
+ if re.match(REGEX_DYN_REF, alias):
+ continue
if "*" in wildcard[1:]:
path = result["Path"] + ["Aliases"]
message = f'Invalid use of wildcards: {alias} at {"/".join(result["Path"])}'
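A quick sanity check of the new guard is sketched below; it assumes `REGEX_DYN_REF` in `cfnlint.helpers` matches the `{{resolve:...}}` syntax, which is exactly what the patch relies on:
```python
# Hedged sketch: confirm which alias values the dynamic-reference guard
# would skip.
import regex as re

from cfnlint.helpers import REGEX_DYN_REF

assert re.match(REGEX_DYN_REF, "{{resolve:ssm:/env/fqdns/certifier}}")
assert not re.match(REGEX_DYN_REF, "example.com")
```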
|
{"golden_diff": "diff --git a/src/cfnlint/rules/resources/cloudfront/Aliases.py b/src/cfnlint/rules/resources/cloudfront/Aliases.py\n--- a/src/cfnlint/rules/resources/cloudfront/Aliases.py\n+++ b/src/cfnlint/rules/resources/cloudfront/Aliases.py\n@@ -4,7 +4,7 @@\n \"\"\"\n import regex as re\n \n-from cfnlint.helpers import FUNCTIONS\n+from cfnlint.helpers import FUNCTIONS, REGEX_DYN_REF\n from cfnlint.rules import CloudFormationLintRule, RuleMatch\n \n \n@@ -35,6 +35,8 @@\n for alias in aliases:\n if isinstance(alias, str) and alias not in FUNCTIONS:\n wildcard = alias.split(\".\")\n+ if re.match(REGEX_DYN_REF, alias):\n+ continue\n if \"*\" in wildcard[1:]:\n path = result[\"Path\"] + [\"Aliases\"]\n message = f'Invalid use of wildcards: {alias} at {\"/\".join(result[\"Path\"])}'\n", "issue": "Invalid Aliases when using SSM dynamic references\n### CloudFormation Lint Version\r\n\r\n0.83.0\r\n\r\n### What operating system are you using?\r\n\r\nMac\r\n\r\n### Describe the bug\r\n\r\nWhen using a dynamic reference to resolve the Alias domain, cfn-lint fails indicating it's an invalid alias. Shouldn't the code check if this is a `REGEX_DYN_REF` in https://github.com/aws-cloudformation/cfn-lint/blob/main/src/cfnlint/rules/resources/cloudfront/Aliases.py and ignore if so?\r\n\r\nA workaround would be to use \"!Sub\" which apparently is ignored already (`FUNCTIONS`). Shouldn't we also ignore when `REGEX_DYN_REF`?\r\n\r\n### Expected behavior\r\n\r\nE3013 shouldn't be informed, since there's no way to validate the dynamic-reference value from cfn-lint perspective (?)\r\n\r\n### Reproduction template\r\n\r\n```\r\nCloudFront:\r\n Type: AWS::CloudFront::Distribution\r\n Properties:\r\n DistributionConfig:\r\n Enabled: true\r\n Aliases:\r\n - \"{{resolve:ssm:/env/fqdns/certifier}}\"\r\n DefaultRootObject: index.html\r\n``` \n", "before_files": [{"content": "\"\"\"\nCopyright Amazon.com, Inc. or its affiliates. 
All Rights Reserved.\nSPDX-License-Identifier: MIT-0\n\"\"\"\nimport regex as re\n\nfrom cfnlint.helpers import FUNCTIONS\nfrom cfnlint.rules import CloudFormationLintRule, RuleMatch\n\n\nclass Aliases(CloudFormationLintRule):\n \"\"\"Check if CloudFront Aliases are valid domain names\"\"\"\n\n id = \"E3013\"\n shortdesc = \"CloudFront Aliases\"\n description = \"CloudFront aliases should contain valid domain names\"\n source_url = \"https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-cloudfront-distribution-distributionconfig.html#cfn-cloudfront-distribution-distributionconfig-aliases\"\n tags = [\"properties\", \"cloudfront\"]\n\n def match(self, cfn):\n \"\"\"Check cloudfront Resource Parameters\"\"\"\n\n matches = []\n\n valid_domain = re.compile(\n r\"^(?:[a-z0-9\\*](?:[a-z0-9-]{0,61}[a-z0-9])?\\.)+[a-z0-9][a-z0-9-]{0,61}[a-z0-9]$\"\n )\n\n results = cfn.get_resource_properties(\n [\"AWS::CloudFront::Distribution\", \"DistributionConfig\"]\n )\n for result in results:\n aliases = result[\"Value\"].get(\"Aliases\")\n if aliases:\n for alias in aliases:\n if isinstance(alias, str) and alias not in FUNCTIONS:\n wildcard = alias.split(\".\")\n if \"*\" in wildcard[1:]:\n path = result[\"Path\"] + [\"Aliases\"]\n message = f'Invalid use of wildcards: {alias} at {\"/\".join(result[\"Path\"])}'\n matches.append(RuleMatch(path, message))\n if not re.match(valid_domain, alias):\n path = result[\"Path\"] + [\"Aliases\"]\n message = f'Invalid alias found: {alias} at {\"/\".join(result[\"Path\"])}'\n matches.append(RuleMatch(path, message))\n\n return matches\n", "path": "src/cfnlint/rules/resources/cloudfront/Aliases.py"}], "after_files": [{"content": "\"\"\"\nCopyright Amazon.com, Inc. or its affiliates. All Rights Reserved.\nSPDX-License-Identifier: MIT-0\n\"\"\"\nimport regex as re\n\nfrom cfnlint.helpers import FUNCTIONS, REGEX_DYN_REF\nfrom cfnlint.rules import CloudFormationLintRule, RuleMatch\n\n\nclass Aliases(CloudFormationLintRule):\n \"\"\"Check if CloudFront Aliases are valid domain names\"\"\"\n\n id = \"E3013\"\n shortdesc = \"CloudFront Aliases\"\n description = \"CloudFront aliases should contain valid domain names\"\n source_url = \"https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-cloudfront-distribution-distributionconfig.html#cfn-cloudfront-distribution-distributionconfig-aliases\"\n tags = [\"properties\", \"cloudfront\"]\n\n def match(self, cfn):\n \"\"\"Check cloudfront Resource Parameters\"\"\"\n\n matches = []\n\n valid_domain = re.compile(\n r\"^(?:[a-z0-9\\*](?:[a-z0-9-]{0,61}[a-z0-9])?\\.)+[a-z0-9][a-z0-9-]{0,61}[a-z0-9]$\"\n )\n\n results = cfn.get_resource_properties(\n [\"AWS::CloudFront::Distribution\", \"DistributionConfig\"]\n )\n for result in results:\n aliases = result[\"Value\"].get(\"Aliases\")\n if aliases:\n for alias in aliases:\n if isinstance(alias, str) and alias not in FUNCTIONS:\n wildcard = alias.split(\".\")\n if re.match(REGEX_DYN_REF, alias):\n continue\n if \"*\" in wildcard[1:]:\n path = result[\"Path\"] + [\"Aliases\"]\n message = f'Invalid use of wildcards: {alias} at {\"/\".join(result[\"Path\"])}'\n matches.append(RuleMatch(path, message))\n if not re.match(valid_domain, alias):\n path = result[\"Path\"] + [\"Aliases\"]\n message = f'Invalid alias found: {alias} at {\"/\".join(result[\"Path\"])}'\n matches.append(RuleMatch(path, message))\n\n return matches\n", "path": "src/cfnlint/rules/resources/cloudfront/Aliases.py"}]}
| 1,033 | 213 |
gh_patches_debug_13562
|
rasdani/github-patches
|
git_diff
|
projectmesa__mesa-1984
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
JupyterViz space view limits
**What's the problem this feature will solve?**
At the moment, when drawing spaces with JupyterViz, the axis limits are left to be determined automatically by Matplotlib. Would it not make more sense (in particular for continuous spaces) to derive the axis limits from the space itself?
**Describe the solution you'd like**
The simplest fix would be to adapt `_draw_grid` to use `space.width` and `space.height`, while `_draw_continuous_space` uses `space.x_min`, `space.x_max`, `space.y_min`, and `space.y_max`. Are there cases where the user might want to rely on matplotlib's automatic determination of the bounds? If so, a slightly more sophisticated solution would be needed where kwargs are passed from jupyterviz to the underlying space_drawer function, i.e. you would add a `space_drawer_kwargs` keyword argument; a sketch of that variant follows.
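A hedged sketch of that variant (the `space_drawer_kwargs` name and the call site are hypothetical, not existing API):
```python
# Hypothetical sketch: let callers override the axis limits while the
# space bounds remain the default.
def _draw_continuous_space(space, space_ax, agent_portrayal, **space_drawer_kwargs):
    xlim = space_drawer_kwargs.get("xlim", (space.x_min, space.x_max))
    ylim = space_drawer_kwargs.get("ylim", (space.y_min, space.y_max))
    space_ax.set_xlim(*xlim)
    space_ax.set_ylim(*ylim)
    # ... existing agent-scatter logic would follow here
```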
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mesa/experimental/components/matplotlib.py`
Content:
```
1 from typing import Optional
2
3 import networkx as nx
4 import solara
5 from matplotlib.figure import Figure
6 from matplotlib.ticker import MaxNLocator
7
8 import mesa
9
10
11 @solara.component
12 def SpaceMatplotlib(model, agent_portrayal, dependencies: Optional[list[any]] = None):
13 space_fig = Figure()
14 space_ax = space_fig.subplots()
15 space = getattr(model, "grid", None)
16 if space is None:
17 # Sometimes the space is defined as model.space instead of model.grid
18 space = model.space
19 if isinstance(space, mesa.space.NetworkGrid):
20 _draw_network_grid(space, space_ax, agent_portrayal)
21 elif isinstance(space, mesa.space.ContinuousSpace):
22 _draw_continuous_space(space, space_ax, agent_portrayal)
23 else:
24 _draw_grid(space, space_ax, agent_portrayal)
25 space_ax.set_axis_off()
26 solara.FigureMatplotlib(space_fig, format="png", dependencies=dependencies)
27
28
29 def _draw_grid(space, space_ax, agent_portrayal):
30 def portray(g):
31 x = []
32 y = []
33 s = [] # size
34 c = [] # color
35 for i in range(g.width):
36 for j in range(g.height):
37 content = g._grid[i][j]
38 if not content:
39 continue
40 if not hasattr(content, "__iter__"):
41 # Is a single grid
42 content = [content]
43 for agent in content:
44 data = agent_portrayal(agent)
45 x.append(i)
46 y.append(j)
47 if "size" in data:
48 s.append(data["size"])
49 if "color" in data:
50 c.append(data["color"])
51 out = {"x": x, "y": y}
52 if len(s) > 0:
53 out["s"] = s
54 if len(c) > 0:
55 out["c"] = c
56 return out
57
58 space_ax.scatter(**portray(space))
59
60
61 def _draw_network_grid(space, space_ax, agent_portrayal):
62 graph = space.G
63 pos = nx.spring_layout(graph, seed=0)
64 nx.draw(
65 graph,
66 ax=space_ax,
67 pos=pos,
68 **agent_portrayal(graph),
69 )
70
71
72 def _draw_continuous_space(space, space_ax, agent_portrayal):
73 def portray(space):
74 x = []
75 y = []
76 s = [] # size
77 c = [] # color
78 for agent in space._agent_to_index:
79 data = agent_portrayal(agent)
80 _x, _y = agent.pos
81 x.append(_x)
82 y.append(_y)
83 if "size" in data:
84 s.append(data["size"])
85 if "color" in data:
86 c.append(data["color"])
87 out = {"x": x, "y": y}
88 if len(s) > 0:
89 out["s"] = s
90 if len(c) > 0:
91 out["c"] = c
92 return out
93
94 space_ax.scatter(**portray(space))
95
96
97 def make_plot(model, measure):
98 fig = Figure()
99 ax = fig.subplots()
100 df = model.datacollector.get_model_vars_dataframe()
101 if isinstance(measure, str):
102 ax.plot(df.loc[:, measure])
103 ax.set_ylabel(measure)
104 elif isinstance(measure, dict):
105 for m, color in measure.items():
106 ax.plot(df.loc[:, m], label=m, color=color)
107 fig.legend()
108 elif isinstance(measure, (list, tuple)):
109 for m in measure:
110 ax.plot(df.loc[:, m], label=m)
111 fig.legend()
112 # Set integer x axis
113 ax.xaxis.set_major_locator(MaxNLocator(integer=True))
114 solara.FigureMatplotlib(fig)
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mesa/experimental/components/matplotlib.py b/mesa/experimental/components/matplotlib.py
--- a/mesa/experimental/components/matplotlib.py
+++ b/mesa/experimental/components/matplotlib.py
@@ -55,6 +55,8 @@
out["c"] = c
return out
+ space_ax.set_xlim(-1, space.width)
+ space_ax.set_ylim(-1, space.height)
space_ax.scatter(**portray(space))
@@ -91,6 +93,14 @@
out["c"] = c
return out
+ width = space.x_max - space.x_min
+ x_padding = width / 20
+ height = space.y_max - space.y_min
+ y_padding = height / 20
+ space_ax.set_xlim(space.x_min - x_padding, space.x_max + x_padding)
+ space_ax.set_ylim(space.y_min - y_padding, space.y_max + y_padding)
+ space_ax.scatter(**portray(space))
+
space_ax.scatter(**portray(space))
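Illustration only: the 1/20 padding arithmetic from the hunk above, factored into a helper so both axes share the same computation:
```python
def padded_limits(vmin, vmax, frac=1 / 20):
    """Return (lo, hi) limits expanded by `frac` of the span on each side."""
    pad = (vmax - vmin) * frac
    return vmin - pad, vmax + pad

# e.g. space_ax.set_xlim(*padded_limits(space.x_min, space.x_max))
```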
|
{"golden_diff": "diff --git a/mesa/experimental/components/matplotlib.py b/mesa/experimental/components/matplotlib.py\n--- a/mesa/experimental/components/matplotlib.py\n+++ b/mesa/experimental/components/matplotlib.py\n@@ -55,6 +55,8 @@\n out[\"c\"] = c\n return out\n \n+ space_ax.set_xlim(-1, space.width)\n+ space_ax.set_ylim(-1, space.height)\n space_ax.scatter(**portray(space))\n \n \n@@ -91,6 +93,14 @@\n out[\"c\"] = c\n return out\n \n+ width = space.x_max - space.x_min\n+ x_padding = width / 20\n+ height = space.y_max - space.y_min\n+ y_padding = height / 20\n+ space_ax.set_xlim(space.x_min - x_padding, space.x_max + x_padding)\n+ space_ax.set_ylim(space.y_min - y_padding, space.y_max + y_padding)\n+ space_ax.scatter(**portray(space))\n+\n space_ax.scatter(**portray(space))\n", "issue": "JupyterViz space view limits\n**What's the problem this feature will solve?**\r\nAt the moment, when drawing spaces with JupyterViz, the axes limits are left to be automatically determined by Matplotlib. Would it not make more sense (in particular of continuous spaces, to derive xlim and xmax from the space itself?\r\n\r\n**Describe the solution you'd like**\r\nThe simplest fix would be to adapt `_draw_grid` to use `space.width` and `space.height` while `_draw_continuos_space` uses `space.x_min`, `space.x_max`, `space.y_min`, and `space.y_max`. Are there reasons where the user might want to rely on matpltolib's automatic determination of the bounds? If so, then a slightly more sophisticated solution would be needed where kwargs are passed to from jupyterviz to the underlying space_drawer function. So you would add a `space_drawer_kwargs` keyword argument.\r\n\r\n\n", "before_files": [{"content": "from typing import Optional\n\nimport networkx as nx\nimport solara\nfrom matplotlib.figure import Figure\nfrom matplotlib.ticker import MaxNLocator\n\nimport mesa\n\n\[email protected]\ndef SpaceMatplotlib(model, agent_portrayal, dependencies: Optional[list[any]] = None):\n space_fig = Figure()\n space_ax = space_fig.subplots()\n space = getattr(model, \"grid\", None)\n if space is None:\n # Sometimes the space is defined as model.space instead of model.grid\n space = model.space\n if isinstance(space, mesa.space.NetworkGrid):\n _draw_network_grid(space, space_ax, agent_portrayal)\n elif isinstance(space, mesa.space.ContinuousSpace):\n _draw_continuous_space(space, space_ax, agent_portrayal)\n else:\n _draw_grid(space, space_ax, agent_portrayal)\n space_ax.set_axis_off()\n solara.FigureMatplotlib(space_fig, format=\"png\", dependencies=dependencies)\n\n\ndef _draw_grid(space, space_ax, agent_portrayal):\n def portray(g):\n x = []\n y = []\n s = [] # size\n c = [] # color\n for i in range(g.width):\n for j in range(g.height):\n content = g._grid[i][j]\n if not content:\n continue\n if not hasattr(content, \"__iter__\"):\n # Is a single grid\n content = [content]\n for agent in content:\n data = agent_portrayal(agent)\n x.append(i)\n y.append(j)\n if \"size\" in data:\n s.append(data[\"size\"])\n if \"color\" in data:\n c.append(data[\"color\"])\n out = {\"x\": x, \"y\": y}\n if len(s) > 0:\n out[\"s\"] = s\n if len(c) > 0:\n out[\"c\"] = c\n return out\n\n space_ax.scatter(**portray(space))\n\n\ndef _draw_network_grid(space, space_ax, agent_portrayal):\n graph = space.G\n pos = nx.spring_layout(graph, seed=0)\n nx.draw(\n graph,\n ax=space_ax,\n pos=pos,\n **agent_portrayal(graph),\n )\n\n\ndef _draw_continuous_space(space, space_ax, agent_portrayal):\n def portray(space):\n x = []\n y = []\n s = [] # size\n 
c = [] # color\n for agent in space._agent_to_index:\n data = agent_portrayal(agent)\n _x, _y = agent.pos\n x.append(_x)\n y.append(_y)\n if \"size\" in data:\n s.append(data[\"size\"])\n if \"color\" in data:\n c.append(data[\"color\"])\n out = {\"x\": x, \"y\": y}\n if len(s) > 0:\n out[\"s\"] = s\n if len(c) > 0:\n out[\"c\"] = c\n return out\n\n space_ax.scatter(**portray(space))\n\n\ndef make_plot(model, measure):\n fig = Figure()\n ax = fig.subplots()\n df = model.datacollector.get_model_vars_dataframe()\n if isinstance(measure, str):\n ax.plot(df.loc[:, measure])\n ax.set_ylabel(measure)\n elif isinstance(measure, dict):\n for m, color in measure.items():\n ax.plot(df.loc[:, m], label=m, color=color)\n fig.legend()\n elif isinstance(measure, (list, tuple)):\n for m in measure:\n ax.plot(df.loc[:, m], label=m)\n fig.legend()\n # Set integer x axis\n ax.xaxis.set_major_locator(MaxNLocator(integer=True))\n solara.FigureMatplotlib(fig)\n", "path": "mesa/experimental/components/matplotlib.py"}], "after_files": [{"content": "from typing import Optional\n\nimport networkx as nx\nimport solara\nfrom matplotlib.figure import Figure\nfrom matplotlib.ticker import MaxNLocator\n\nimport mesa\n\n\[email protected]\ndef SpaceMatplotlib(model, agent_portrayal, dependencies: Optional[list[any]] = None):\n space_fig = Figure()\n space_ax = space_fig.subplots()\n space = getattr(model, \"grid\", None)\n if space is None:\n # Sometimes the space is defined as model.space instead of model.grid\n space = model.space\n if isinstance(space, mesa.space.NetworkGrid):\n _draw_network_grid(space, space_ax, agent_portrayal)\n elif isinstance(space, mesa.space.ContinuousSpace):\n _draw_continuous_space(space, space_ax, agent_portrayal)\n else:\n _draw_grid(space, space_ax, agent_portrayal)\n space_ax.set_axis_off()\n solara.FigureMatplotlib(space_fig, format=\"png\", dependencies=dependencies)\n\n\ndef _draw_grid(space, space_ax, agent_portrayal):\n def portray(g):\n x = []\n y = []\n s = [] # size\n c = [] # color\n for i in range(g.width):\n for j in range(g.height):\n content = g._grid[i][j]\n if not content:\n continue\n if not hasattr(content, \"__iter__\"):\n # Is a single grid\n content = [content]\n for agent in content:\n data = agent_portrayal(agent)\n x.append(i)\n y.append(j)\n if \"size\" in data:\n s.append(data[\"size\"])\n if \"color\" in data:\n c.append(data[\"color\"])\n out = {\"x\": x, \"y\": y}\n if len(s) > 0:\n out[\"s\"] = s\n if len(c) > 0:\n out[\"c\"] = c\n return out\n\n space_ax.set_xlim(-1, space.width)\n space_ax.set_ylim(-1, space.height)\n space_ax.scatter(**portray(space))\n\n\ndef _draw_network_grid(space, space_ax, agent_portrayal):\n graph = space.G\n pos = nx.spring_layout(graph, seed=0)\n nx.draw(\n graph,\n ax=space_ax,\n pos=pos,\n **agent_portrayal(graph),\n )\n\n\ndef _draw_continuous_space(space, space_ax, agent_portrayal):\n def portray(space):\n x = []\n y = []\n s = [] # size\n c = [] # color\n for agent in space._agent_to_index:\n data = agent_portrayal(agent)\n _x, _y = agent.pos\n x.append(_x)\n y.append(_y)\n if \"size\" in data:\n s.append(data[\"size\"])\n if \"color\" in data:\n c.append(data[\"color\"])\n out = {\"x\": x, \"y\": y}\n if len(s) > 0:\n out[\"s\"] = s\n if len(c) > 0:\n out[\"c\"] = c\n return out\n\n width = space.x_max - space.x_min\n x_padding = width / 20\n height = space.y_max - space.y_min\n y_padding = height / 20\n space_ax.set_xlim(space.x_min - x_padding, space.x_max + x_padding)\n space_ax.set_ylim(space.y_min - y_padding, 
space.y_max + y_padding)\n space_ax.scatter(**portray(space))\n\n space_ax.scatter(**portray(space))\n\n\ndef make_plot(model, measure):\n fig = Figure()\n ax = fig.subplots()\n df = model.datacollector.get_model_vars_dataframe()\n if isinstance(measure, str):\n ax.plot(df.loc[:, measure])\n ax.set_ylabel(measure)\n elif isinstance(measure, dict):\n for m, color in measure.items():\n ax.plot(df.loc[:, m], label=m, color=color)\n fig.legend()\n elif isinstance(measure, (list, tuple)):\n for m in measure:\n ax.plot(df.loc[:, m], label=m)\n fig.legend()\n # Set integer x axis\n ax.xaxis.set_major_locator(MaxNLocator(integer=True))\n solara.FigureMatplotlib(fig)\n", "path": "mesa/experimental/components/matplotlib.py"}]}
| 1,507 | 233 |
gh_patches_debug_27391
|
rasdani/github-patches
|
git_diff
|
jupyterhub__jupyterhub-250
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Admin panel - Add ability to import multiple users at once
It would be nice to be able to add users in bulk, pasted as a newline-delimited list; a parsing sketch follows.
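On the front end this mostly amounts to splitting the pasted block; a hypothetical sketch:
```python
# Hypothetical sketch: normalize a pasted newline-delimited block into
# the username list a bulk-create request would need.
def parse_usernames(raw):
    return [line.strip() for line in raw.splitlines() if line.strip()]
```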
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `jupyterhub/apihandlers/users.py`
Content:
```
1 """User handlers"""
2
3 # Copyright (c) Jupyter Development Team.
4 # Distributed under the terms of the Modified BSD License.
5
6 import json
7
8 from tornado import gen, web
9
10 from .. import orm
11 from ..utils import admin_only
12 from .base import APIHandler
13
14
15 class UserListAPIHandler(APIHandler):
16 @admin_only
17 def get(self):
18 users = self.db.query(orm.User)
19 data = [ self.user_model(u) for u in users ]
20 self.write(json.dumps(data))
21
22
23 def admin_or_self(method):
24 """Decorator for restricting access to either the target user or admin"""
25 def m(self, name):
26 current = self.get_current_user()
27 if current is None:
28 raise web.HTTPError(403)
29 if not (current.name == name or current.admin):
30 raise web.HTTPError(403)
31
32 # raise 404 if not found
33 if not self.find_user(name):
34 raise web.HTTPError(404)
35 return method(self, name)
36 return m
37
38 class UserAPIHandler(APIHandler):
39
40 @admin_or_self
41 def get(self, name):
42 user = self.find_user(name)
43 self.write(json.dumps(self.user_model(user)))
44
45 @admin_only
46 @gen.coroutine
47 def post(self, name):
48 data = self.get_json_body()
49 user = self.find_user(name)
50 if user is not None:
51 raise web.HTTPError(400, "User %s already exists" % name)
52
53 user = self.user_from_username(name)
54 if data:
55 self._check_user_model(data)
56 if 'admin' in data:
57 user.admin = data['admin']
58 self.db.commit()
59
60 try:
61 yield gen.maybe_future(self.authenticator.add_user(user))
62 except Exception:
63 self.log.error("Failed to create user: %s" % name, exc_info=True)
64 self.db.delete(user)
65 self.db.commit()
66 raise web.HTTPError(400, "Failed to create user: %s" % name)
67
68 self.write(json.dumps(self.user_model(user)))
69 self.set_status(201)
70
71 @admin_only
72 @gen.coroutine
73 def delete(self, name):
74 user = self.find_user(name)
75 if user is None:
76 raise web.HTTPError(404)
77 if user.name == self.get_current_user().name:
78 raise web.HTTPError(400, "Cannot delete yourself!")
79 if user.stop_pending:
80 raise web.HTTPError(400, "%s's server is in the process of stopping, please wait." % name)
81 if user.running:
82 yield self.stop_single_user(user)
83 if user.stop_pending:
84 raise web.HTTPError(400, "%s's server is in the process of stopping, please wait." % name)
85
86 yield gen.maybe_future(self.authenticator.delete_user(user))
87
88 # remove from the db
89 self.db.delete(user)
90 self.db.commit()
91
92 self.set_status(204)
93
94 @admin_only
95 def patch(self, name):
96 user = self.find_user(name)
97 if user is None:
98 raise web.HTTPError(404)
99 data = self.get_json_body()
100 self._check_user_model(data)
101 for key, value in data.items():
102 setattr(user, key, value)
103 self.db.commit()
104 self.write(json.dumps(self.user_model(user)))
105
106
107 class UserServerAPIHandler(APIHandler):
108 @gen.coroutine
109 @admin_or_self
110 def post(self, name):
111 user = self.find_user(name)
112 if user.spawner:
113 state = yield user.spawner.poll()
114 if state is None:
115 raise web.HTTPError(400, "%s's server is already running" % name)
116
117 yield self.spawn_single_user(user)
118 status = 202 if user.spawn_pending else 201
119 self.set_status(status)
120
121 @gen.coroutine
122 @admin_or_self
123 def delete(self, name):
124 user = self.find_user(name)
125 if user.stop_pending:
126 self.set_status(202)
127 return
128 if not user.running:
129 raise web.HTTPError(400, "%s's server is not running" % name)
130 status = yield user.spawner.poll()
131 if status is not None:
132 raise web.HTTPError(400, "%s's server is not running" % name)
133 yield self.stop_single_user(user)
134 status = 202 if user.stop_pending else 204
135 self.set_status(status)
136
137 class UserAdminAccessAPIHandler(APIHandler):
138 """Grant admins access to single-user servers
139
140 This handler sets the necessary cookie for an admin to login to a single-user server.
141 """
142 @admin_only
143 def post(self, name):
144 current = self.get_current_user()
145 self.log.warn("Admin user %s has requested access to %s's server",
146 current.name, name,
147 )
148 if not self.settings.get('admin_access', False):
149 raise web.HTTPError(403, "admin access to user servers disabled")
150 user = self.find_user(name)
151 if user is None:
152 raise web.HTTPError(404)
153 if not user.running:
154 raise web.HTTPError(400, "%s's server is not running" % name)
155 self.set_server_cookie(user)
156
157
158 default_handlers = [
159 (r"/api/users", UserListAPIHandler),
160 (r"/api/users/([^/]+)", UserAPIHandler),
161 (r"/api/users/([^/]+)/server", UserServerAPIHandler),
162 (r"/api/users/([^/]+)/admin-access", UserAdminAccessAPIHandler),
163 ]
164
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/jupyterhub/apihandlers/users.py b/jupyterhub/apihandlers/users.py
--- a/jupyterhub/apihandlers/users.py
+++ b/jupyterhub/apihandlers/users.py
@@ -18,6 +18,49 @@
users = self.db.query(orm.User)
data = [ self.user_model(u) for u in users ]
self.write(json.dumps(data))
+
+ @admin_only
+ @gen.coroutine
+ def post(self):
+ data = self.get_json_body()
+ if not data or not isinstance(data, dict) or not data.get('usernames'):
+ raise web.HTTPError(400, "Must specify at least one user to create")
+
+ usernames = data.pop('usernames')
+ self._check_user_model(data)
+ # admin is set for all users
+ # to create admin and non-admin users requires at least two API requests
+ admin = data.get('admin', False)
+
+ to_create = []
+ for name in usernames:
+ user = self.find_user(name)
+ if user is not None:
+ self.log.warn("User %s already exists" % name)
+ else:
+ to_create.append(name)
+
+ if not to_create:
+ raise web.HTTPError(400, "All %i users already exist" % len(usernames))
+
+ created = []
+ for name in to_create:
+ user = self.user_from_username(name)
+ if admin:
+ user.admin = True
+ self.db.commit()
+ try:
+ yield gen.maybe_future(self.authenticator.add_user(user))
+ except Exception:
+ self.log.error("Failed to create user: %s" % name, exc_info=True)
+ self.db.delete(user)
+ self.db.commit()
+ raise web.HTTPError(400, "Failed to create user: %s" % name)
+ else:
+ created.append(user)
+
+ self.write(json.dumps([ self.user_model(u) for u in created ]))
+ self.set_status(201)
def admin_or_self(method):
|
{"golden_diff": "diff --git a/jupyterhub/apihandlers/users.py b/jupyterhub/apihandlers/users.py\n--- a/jupyterhub/apihandlers/users.py\n+++ b/jupyterhub/apihandlers/users.py\n@@ -18,6 +18,49 @@\n users = self.db.query(orm.User)\n data = [ self.user_model(u) for u in users ]\n self.write(json.dumps(data))\n+ \n+ @admin_only\n+ @gen.coroutine\n+ def post(self):\n+ data = self.get_json_body()\n+ if not data or not isinstance(data, dict) or not data.get('usernames'):\n+ raise web.HTTPError(400, \"Must specify at least one user to create\")\n+ \n+ usernames = data.pop('usernames')\n+ self._check_user_model(data)\n+ # admin is set for all users\n+ # to create admin and non-admin users requires at least two API requests\n+ admin = data.get('admin', False)\n+ \n+ to_create = []\n+ for name in usernames:\n+ user = self.find_user(name)\n+ if user is not None:\n+ self.log.warn(\"User %s already exists\" % name)\n+ else:\n+ to_create.append(name)\n+ \n+ if not to_create:\n+ raise web.HTTPError(400, \"All %i users already exist\" % len(usernames))\n+ \n+ created = []\n+ for name in to_create:\n+ user = self.user_from_username(name)\n+ if admin:\n+ user.admin = True\n+ self.db.commit()\n+ try:\n+ yield gen.maybe_future(self.authenticator.add_user(user))\n+ except Exception:\n+ self.log.error(\"Failed to create user: %s\" % name, exc_info=True)\n+ self.db.delete(user)\n+ self.db.commit()\n+ raise web.HTTPError(400, \"Failed to create user: %s\" % name)\n+ else:\n+ created.append(user)\n+ \n+ self.write(json.dumps([ self.user_model(u) for u in created ]))\n+ self.set_status(201)\n \n \n def admin_or_self(method):\n", "issue": "Admin panel - Add ability to import multiple users at once\nIt would be nice to add a list of users as a bulk list of newline delimited users.\n\n", "before_files": [{"content": "\"\"\"User handlers\"\"\"\n\n# Copyright (c) Jupyter Development Team.\n# Distributed under the terms of the Modified BSD License.\n\nimport json\n\nfrom tornado import gen, web\n\nfrom .. 
import orm\nfrom ..utils import admin_only\nfrom .base import APIHandler\n\n\nclass UserListAPIHandler(APIHandler):\n @admin_only\n def get(self):\n users = self.db.query(orm.User)\n data = [ self.user_model(u) for u in users ]\n self.write(json.dumps(data))\n\n\ndef admin_or_self(method):\n \"\"\"Decorator for restricting access to either the target user or admin\"\"\"\n def m(self, name):\n current = self.get_current_user()\n if current is None:\n raise web.HTTPError(403)\n if not (current.name == name or current.admin):\n raise web.HTTPError(403)\n \n # raise 404 if not found\n if not self.find_user(name):\n raise web.HTTPError(404)\n return method(self, name)\n return m\n\nclass UserAPIHandler(APIHandler):\n \n @admin_or_self\n def get(self, name):\n user = self.find_user(name)\n self.write(json.dumps(self.user_model(user)))\n \n @admin_only\n @gen.coroutine\n def post(self, name):\n data = self.get_json_body()\n user = self.find_user(name)\n if user is not None:\n raise web.HTTPError(400, \"User %s already exists\" % name)\n \n user = self.user_from_username(name)\n if data:\n self._check_user_model(data)\n if 'admin' in data:\n user.admin = data['admin']\n self.db.commit()\n \n try:\n yield gen.maybe_future(self.authenticator.add_user(user))\n except Exception:\n self.log.error(\"Failed to create user: %s\" % name, exc_info=True)\n self.db.delete(user)\n self.db.commit()\n raise web.HTTPError(400, \"Failed to create user: %s\" % name)\n \n self.write(json.dumps(self.user_model(user)))\n self.set_status(201)\n \n @admin_only\n @gen.coroutine\n def delete(self, name):\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n if user.name == self.get_current_user().name:\n raise web.HTTPError(400, \"Cannot delete yourself!\")\n if user.stop_pending:\n raise web.HTTPError(400, \"%s's server is in the process of stopping, please wait.\" % name)\n if user.running:\n yield self.stop_single_user(user)\n if user.stop_pending:\n raise web.HTTPError(400, \"%s's server is in the process of stopping, please wait.\" % name)\n \n yield gen.maybe_future(self.authenticator.delete_user(user))\n \n # remove from the db\n self.db.delete(user)\n self.db.commit()\n \n self.set_status(204)\n \n @admin_only\n def patch(self, name):\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n data = self.get_json_body()\n self._check_user_model(data)\n for key, value in data.items():\n setattr(user, key, value)\n self.db.commit()\n self.write(json.dumps(self.user_model(user)))\n\n\nclass UserServerAPIHandler(APIHandler):\n @gen.coroutine\n @admin_or_self\n def post(self, name):\n user = self.find_user(name)\n if user.spawner:\n state = yield user.spawner.poll()\n if state is None:\n raise web.HTTPError(400, \"%s's server is already running\" % name)\n\n yield self.spawn_single_user(user)\n status = 202 if user.spawn_pending else 201\n self.set_status(status)\n\n @gen.coroutine\n @admin_or_self\n def delete(self, name):\n user = self.find_user(name)\n if user.stop_pending:\n self.set_status(202)\n return\n if not user.running:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n status = yield user.spawner.poll()\n if status is not None:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n yield self.stop_single_user(user)\n status = 202 if user.stop_pending else 204\n self.set_status(status)\n\nclass UserAdminAccessAPIHandler(APIHandler):\n \"\"\"Grant admins access to single-user servers\n \n This handler sets the necessary cookie for an 
admin to login to a single-user server.\n \"\"\"\n @admin_only\n def post(self, name):\n current = self.get_current_user()\n self.log.warn(\"Admin user %s has requested access to %s's server\",\n current.name, name,\n )\n if not self.settings.get('admin_access', False):\n raise web.HTTPError(403, \"admin access to user servers disabled\")\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n if not user.running:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n self.set_server_cookie(user)\n\n\ndefault_handlers = [\n (r\"/api/users\", UserListAPIHandler),\n (r\"/api/users/([^/]+)\", UserAPIHandler),\n (r\"/api/users/([^/]+)/server\", UserServerAPIHandler),\n (r\"/api/users/([^/]+)/admin-access\", UserAdminAccessAPIHandler),\n]\n", "path": "jupyterhub/apihandlers/users.py"}], "after_files": [{"content": "\"\"\"User handlers\"\"\"\n\n# Copyright (c) Jupyter Development Team.\n# Distributed under the terms of the Modified BSD License.\n\nimport json\n\nfrom tornado import gen, web\n\nfrom .. import orm\nfrom ..utils import admin_only\nfrom .base import APIHandler\n\n\nclass UserListAPIHandler(APIHandler):\n @admin_only\n def get(self):\n users = self.db.query(orm.User)\n data = [ self.user_model(u) for u in users ]\n self.write(json.dumps(data))\n \n @admin_only\n @gen.coroutine\n def post(self):\n data = self.get_json_body()\n if not data or not isinstance(data, dict) or not data.get('usernames'):\n raise web.HTTPError(400, \"Must specify at least one user to create\")\n \n usernames = data.pop('usernames')\n self._check_user_model(data)\n # admin is set for all users\n # to create admin and non-admin users requires at least two API requests\n admin = data.get('admin', False)\n \n to_create = []\n for name in usernames:\n user = self.find_user(name)\n if user is not None:\n self.log.warn(\"User %s already exists\" % name)\n else:\n to_create.append(name)\n \n if not to_create:\n raise web.HTTPError(400, \"All %i users already exist\" % len(usernames))\n \n created = []\n for name in to_create:\n user = self.user_from_username(name)\n if admin:\n user.admin = True\n self.db.commit()\n try:\n yield gen.maybe_future(self.authenticator.add_user(user))\n except Exception:\n self.log.error(\"Failed to create user: %s\" % name, exc_info=True)\n self.db.delete(user)\n self.db.commit()\n raise web.HTTPError(400, \"Failed to create user: %s\" % name)\n else:\n created.append(user)\n \n self.write(json.dumps([ self.user_model(u) for u in created ]))\n self.set_status(201)\n\n\ndef admin_or_self(method):\n \"\"\"Decorator for restricting access to either the target user or admin\"\"\"\n def m(self, name):\n current = self.get_current_user()\n if current is None:\n raise web.HTTPError(403)\n if not (current.name == name or current.admin):\n raise web.HTTPError(403)\n \n # raise 404 if not found\n if not self.find_user(name):\n raise web.HTTPError(404)\n return method(self, name)\n return m\n\nclass UserAPIHandler(APIHandler):\n \n @admin_or_self\n def get(self, name):\n user = self.find_user(name)\n self.write(json.dumps(self.user_model(user)))\n \n @admin_only\n @gen.coroutine\n def post(self, name):\n data = self.get_json_body()\n user = self.find_user(name)\n if user is not None:\n raise web.HTTPError(400, \"User %s already exists\" % name)\n \n user = self.user_from_username(name)\n if data:\n self._check_user_model(data)\n if 'admin' in data:\n user.admin = data['admin']\n self.db.commit()\n \n try:\n yield 
gen.maybe_future(self.authenticator.add_user(user))\n except Exception:\n self.log.error(\"Failed to create user: %s\" % name, exc_info=True)\n self.db.delete(user)\n self.db.commit()\n raise web.HTTPError(400, \"Failed to create user: %s\" % name)\n \n self.write(json.dumps(self.user_model(user)))\n self.set_status(201)\n \n @admin_only\n @gen.coroutine\n def delete(self, name):\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n if user.name == self.get_current_user().name:\n raise web.HTTPError(400, \"Cannot delete yourself!\")\n if user.stop_pending:\n raise web.HTTPError(400, \"%s's server is in the process of stopping, please wait.\" % name)\n if user.running:\n yield self.stop_single_user(user)\n if user.stop_pending:\n raise web.HTTPError(400, \"%s's server is in the process of stopping, please wait.\" % name)\n \n yield gen.maybe_future(self.authenticator.delete_user(user))\n \n # remove from the db\n self.db.delete(user)\n self.db.commit()\n \n self.set_status(204)\n \n @admin_only\n def patch(self, name):\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n data = self.get_json_body()\n self._check_user_model(data)\n for key, value in data.items():\n setattr(user, key, value)\n self.db.commit()\n self.write(json.dumps(self.user_model(user)))\n\n\nclass UserServerAPIHandler(APIHandler):\n @gen.coroutine\n @admin_or_self\n def post(self, name):\n user = self.find_user(name)\n if user.spawner:\n state = yield user.spawner.poll()\n if state is None:\n raise web.HTTPError(400, \"%s's server is already running\" % name)\n\n yield self.spawn_single_user(user)\n status = 202 if user.spawn_pending else 201\n self.set_status(status)\n\n @gen.coroutine\n @admin_or_self\n def delete(self, name):\n user = self.find_user(name)\n if user.stop_pending:\n self.set_status(202)\n return\n if not user.running:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n status = yield user.spawner.poll()\n if status is not None:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n yield self.stop_single_user(user)\n status = 202 if user.stop_pending else 204\n self.set_status(status)\n\nclass UserAdminAccessAPIHandler(APIHandler):\n \"\"\"Grant admins access to single-user servers\n \n This handler sets the necessary cookie for an admin to login to a single-user server.\n \"\"\"\n @admin_only\n def post(self, name):\n current = self.get_current_user()\n self.log.warn(\"Admin user %s has requested access to %s's server\",\n current.name, name,\n )\n if not self.settings.get('admin_access', False):\n raise web.HTTPError(403, \"admin access to user servers disabled\")\n user = self.find_user(name)\n if user is None:\n raise web.HTTPError(404)\n if not user.running:\n raise web.HTTPError(400, \"%s's server is not running\" % name)\n self.set_server_cookie(user)\n\n\ndefault_handlers = [\n (r\"/api/users\", UserListAPIHandler),\n (r\"/api/users/([^/]+)\", UserAPIHandler),\n (r\"/api/users/([^/]+)/server\", UserServerAPIHandler),\n (r\"/api/users/([^/]+)/admin-access\", UserAdminAccessAPIHandler),\n]\n", "path": "jupyterhub/apihandlers/users.py"}]}
| 1,914 | 475 |
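For concreteness, the bulk-create endpoint added by the golden diff above can be exercised with a client sketch like the following. The hub URL, token, and usernames are illustrative assumptions; the request/response shape follows the handler itself (POST `/api/users` with a `usernames` list, `201` and the created user models on success).

```python
import json
import requests  # any HTTP client works; requests is assumed for brevity

HUB_API = "http://127.0.0.1:8081/hub/api"   # hypothetical hub endpoint
TOKEN = "<admin-api-token>"                 # hypothetical admin token

resp = requests.post(
    HUB_API + "/users",
    headers={"Authorization": "token %s" % TOKEN},
    data=json.dumps({"usernames": ["alice", "bob"], "admin": False}),
)
resp.raise_for_status()                  # expect 201 Created
print([u["name"] for u in resp.json()])  # names of the users just created
```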
gh_patches_debug_26616
|
rasdani/github-patches
|
git_diff
|
kivy__kivy-3859
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Button behavior has a property MIN_STATE_TIME
This property can't be used in kv because of the uppercase name.
--- END ISSUE ---
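A minimal sketch of the failure being reported, assuming the usual kv parsing rules (identifiers beginning with an uppercase letter denote widget classes, so they cannot name a property inside a rule):

```python
from kivy.lang import Builder

kv = """
Button:
    MIN_STATE_TIME: 0.5  # uppercase-initial key: kv does not accept this as a property
"""
# Builder.load_string(kv)  # fails to apply as a property assignment
```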
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kivy/uix/behaviors/button.py`
Content:
```
1 '''See :class:`ButtonBehavior` for details.
2 '''
3
4 __all__ = ('ButtonBehavior', )
5
6 from kivy.clock import Clock
7 from kivy.properties import OptionProperty, ObjectProperty, BooleanProperty
8 from time import time
9
10
11 class ButtonBehavior(object):
12 '''
13 This `mixin <https://en.wikipedia.org/wiki/Mixin>`_ class provides
14 :class:`~kivy.uix.button.Button` behavior.
15
16 :Events:
17 `on_press`
18 Fired when the button is pressed.
19 `on_release`
20 Fired when the button is released (i.e. the touch/click that
21 pressed the button goes away).
22 '''
23
24 state = OptionProperty('normal', options=('normal', 'down'))
25 '''The state of the button, must be one of 'normal' or 'down'.
26 The state is 'down' only when the button is currently touched/clicked,
27 otherwise its 'normal'.
28
29 :attr:`state` is an :class:`~kivy.properties.OptionProperty` and defaults
30 to 'normal'.
31 '''
32
33 last_touch = ObjectProperty(None)
34 '''Contains the last relevant touch received by the Button. This can
35 be used in `on_press` or `on_release` in order to know which touch
36 dispatched the event.
37
38 .. versionadded:: 1.8.0
39
40 :attr:`last_touch` is a :class:`~kivy.properties.ObjectProperty` and
41 defaults to `None`.
42 '''
43
44 MIN_STATE_TIME = 0.035
45 '''The minimum period of time which the widget must remain in the
46 `'down'` state.
47
48 :attr:`MIN_STATE_TIME` is a float and defaults to 0.035.
49 '''
50
51 always_release = BooleanProperty(True)
52 '''This determines whether or not the widget fires an `on_release` event if
53 the touch_up is outside the widget.
54
55 .. versionadded:: 1.9.0
56
57 :attr:`always_release` is a :class:`~kivy.properties.BooleanProperty` and
58 defaults to `True`.
59 '''
60
61 def __init__(self, **kwargs):
62 self.register_event_type('on_press')
63 self.register_event_type('on_release')
64 super(ButtonBehavior, self).__init__(**kwargs)
65 self.__state_event = None
66 self.__touch_time = None
67 self.fbind('state', self.cancel_event)
68
69 def _do_press(self):
70 self.state = 'down'
71
72 def _do_release(self, *args):
73 self.state = 'normal'
74
75 def cancel_event(self, *args):
76 if self.__state_event:
77 self.__state_event.cancel()
78 self.__state_event = None
79
80 def on_touch_down(self, touch):
81 if super(ButtonBehavior, self).on_touch_down(touch):
82 return True
83 if touch.is_mouse_scrolling:
84 return False
85 if not self.collide_point(touch.x, touch.y):
86 return False
87 if self in touch.ud:
88 return False
89 touch.grab(self)
90 touch.ud[self] = True
91 self.last_touch = touch
92 self.__touch_time = time()
93 self._do_press()
94 self.dispatch('on_press')
95 return True
96
97 def on_touch_move(self, touch):
98 if touch.grab_current is self:
99 return True
100 if super(ButtonBehavior, self).on_touch_move(touch):
101 return True
102 return self in touch.ud
103
104 def on_touch_up(self, touch):
105 if touch.grab_current is not self:
106 return super(ButtonBehavior, self).on_touch_up(touch)
107 assert(self in touch.ud)
108 touch.ungrab(self)
109 self.last_touch = touch
110
111 if (not self.always_release
112 and not self.collide_point(*touch.pos)):
113 self.state = 'normal'
114 return
115
116 touchtime = time() - self.__touch_time
117 if touchtime < self.MIN_STATE_TIME:
118 self.__state_event = Clock.schedule_once(
119 self._do_release, self.MIN_STATE_TIME - touchtime)
120 else:
121 self._do_release()
122 self.dispatch('on_release')
123 return True
124
125 def on_press(self):
126 pass
127
128 def on_release(self):
129 pass
130
131 def trigger_action(self, duration=0.1):
132 '''Trigger whatever action(s) have been bound to the button by calling
133 both the on_press and on_release callbacks.
134
135 This simulates a quick button press without using any touch events.
136
137 Duration is the length of the press in seconds. Pass 0 if you want
138 the action to happen instantly.
139
140 .. versionadded:: 1.8.0
141 '''
142 self._do_press()
143 self.dispatch('on_press')
144
145 def trigger_release(dt):
146 self._do_release()
147 self.dispatch('on_release')
148 if not duration:
149 trigger_release(0)
150 else:
151 Clock.schedule_once(trigger_release, duration)
152
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kivy/uix/behaviors/button.py b/kivy/uix/behaviors/button.py
--- a/kivy/uix/behaviors/button.py
+++ b/kivy/uix/behaviors/button.py
@@ -4,8 +4,10 @@
__all__ = ('ButtonBehavior', )
from kivy.clock import Clock
-from kivy.properties import OptionProperty, ObjectProperty, BooleanProperty
+from kivy.properties import OptionProperty, ObjectProperty, \
+ BooleanProperty, NumericProperty, AliasProperty
from time import time
+from kivy.logger import Logger
class ButtonBehavior(object):
@@ -41,13 +43,26 @@
defaults to `None`.
'''
- MIN_STATE_TIME = 0.035
+ min_state_time = NumericProperty(0.035)
'''The minimum period of time which the widget must remain in the
`'down'` state.
- :attr:`MIN_STATE_TIME` is a float and defaults to 0.035.
+ :attr:`min_state_time` is a float and defaults to 0.035.
'''
+ def _set_min_state_time(self, value):
+ Logger.warning(
+ 'MIN_STATE_TIME is deprecated, use min_state_time instead')
+ self.min_state_time = value
+
+ def _get_min_state_time(self):
+ Logger.warning(
+ 'MIN_STATE_TIME is deprecated, use min_state_time instead')
+ return self.min_state_time
+
+ MIN_STATE_TIME = AliasProperty(
+ _get_min_state_time, _set_min_state_time, bind=('min_state_time', ))
+
always_release = BooleanProperty(True)
'''This determines whether or not the widget fires an `on_release` event if
the touch_up is outside the widget.
|
{"golden_diff": "diff --git a/kivy/uix/behaviors/button.py b/kivy/uix/behaviors/button.py\n--- a/kivy/uix/behaviors/button.py\n+++ b/kivy/uix/behaviors/button.py\n@@ -4,8 +4,10 @@\n __all__ = ('ButtonBehavior', )\n \n from kivy.clock import Clock\n-from kivy.properties import OptionProperty, ObjectProperty, BooleanProperty\n+from kivy.properties import OptionProperty, ObjectProperty, \\\n+ BooleanProperty, NumericProperty, AliasProperty\n from time import time\n+from kivy.logger import Logger\n \n \n class ButtonBehavior(object):\n@@ -41,13 +43,26 @@\n defaults to `None`.\n '''\n \n- MIN_STATE_TIME = 0.035\n+ min_state_time = NumericProperty(0.035)\n '''The minimum period of time which the widget must remain in the\n `'down'` state.\n \n- :attr:`MIN_STATE_TIME` is a float and defaults to 0.035.\n+ :attr:`min_state_time` is a float and defaults to 0.035.\n '''\n \n+ def _set_min_state_time(self, value):\n+ Logger.warning(\n+ 'MIN_STATE_TIME is deprecated, use min_state_time instead')\n+ self.min_state_time = value\n+\n+ def _get_min_state_time(self):\n+ Logger.warning(\n+ 'MIN_STATE_TIME is deprecated, use min_state_time instead')\n+ return self.min_state_time\n+\n+ MIN_STATE_TIME = AliasProperty(\n+ _get_min_state_time, _set_min_state_time, bind=('min_state_time', ))\n+\n always_release = BooleanProperty(True)\n '''This determines whether or not the widget fires an `on_release` event if\n the touch_up is outside the widget.\n", "issue": "Button behavior has a property MIN_STATE_TIME\nThis property can't be used in kv cause of the uppercase.\n\n", "before_files": [{"content": "'''See :class:`ButtonBehavior` for details.\n'''\n\n__all__ = ('ButtonBehavior', )\n\nfrom kivy.clock import Clock\nfrom kivy.properties import OptionProperty, ObjectProperty, BooleanProperty\nfrom time import time\n\n\nclass ButtonBehavior(object):\n '''\n This `mixin <https://en.wikipedia.org/wiki/Mixin>`_ class provides\n :class:`~kivy.uix.button.Button` behavior.\n\n :Events:\n `on_press`\n Fired when the button is pressed.\n `on_release`\n Fired when the button is released (i.e. the touch/click that\n pressed the button goes away).\n '''\n\n state = OptionProperty('normal', options=('normal', 'down'))\n '''The state of the button, must be one of 'normal' or 'down'.\n The state is 'down' only when the button is currently touched/clicked,\n otherwise its 'normal'.\n\n :attr:`state` is an :class:`~kivy.properties.OptionProperty` and defaults\n to 'normal'.\n '''\n\n last_touch = ObjectProperty(None)\n '''Contains the last relevant touch received by the Button. This can\n be used in `on_press` or `on_release` in order to know which touch\n dispatched the event.\n\n .. versionadded:: 1.8.0\n\n :attr:`last_touch` is a :class:`~kivy.properties.ObjectProperty` and\n defaults to `None`.\n '''\n\n MIN_STATE_TIME = 0.035\n '''The minimum period of time which the widget must remain in the\n `'down'` state.\n\n :attr:`MIN_STATE_TIME` is a float and defaults to 0.035.\n '''\n\n always_release = BooleanProperty(True)\n '''This determines whether or not the widget fires an `on_release` event if\n the touch_up is outside the widget.\n\n .. 
versionadded:: 1.9.0\n\n :attr:`always_release` is a :class:`~kivy.properties.BooleanProperty` and\n defaults to `True`.\n '''\n\n def __init__(self, **kwargs):\n self.register_event_type('on_press')\n self.register_event_type('on_release')\n super(ButtonBehavior, self).__init__(**kwargs)\n self.__state_event = None\n self.__touch_time = None\n self.fbind('state', self.cancel_event)\n\n def _do_press(self):\n self.state = 'down'\n\n def _do_release(self, *args):\n self.state = 'normal'\n\n def cancel_event(self, *args):\n if self.__state_event:\n self.__state_event.cancel()\n self.__state_event = None\n\n def on_touch_down(self, touch):\n if super(ButtonBehavior, self).on_touch_down(touch):\n return True\n if touch.is_mouse_scrolling:\n return False\n if not self.collide_point(touch.x, touch.y):\n return False\n if self in touch.ud:\n return False\n touch.grab(self)\n touch.ud[self] = True\n self.last_touch = touch\n self.__touch_time = time()\n self._do_press()\n self.dispatch('on_press')\n return True\n\n def on_touch_move(self, touch):\n if touch.grab_current is self:\n return True\n if super(ButtonBehavior, self).on_touch_move(touch):\n return True\n return self in touch.ud\n\n def on_touch_up(self, touch):\n if touch.grab_current is not self:\n return super(ButtonBehavior, self).on_touch_up(touch)\n assert(self in touch.ud)\n touch.ungrab(self)\n self.last_touch = touch\n\n if (not self.always_release\n and not self.collide_point(*touch.pos)):\n self.state = 'normal'\n return\n\n touchtime = time() - self.__touch_time\n if touchtime < self.MIN_STATE_TIME:\n self.__state_event = Clock.schedule_once(\n self._do_release, self.MIN_STATE_TIME - touchtime)\n else:\n self._do_release()\n self.dispatch('on_release')\n return True\n\n def on_press(self):\n pass\n\n def on_release(self):\n pass\n\n def trigger_action(self, duration=0.1):\n '''Trigger whatever action(s) have been bound to the button by calling\n both the on_press and on_release callbacks.\n\n This simulates a quick button press without using any touch events.\n\n Duration is the length of the press in seconds. Pass 0 if you want\n the action to happen instantly.\n\n .. versionadded:: 1.8.0\n '''\n self._do_press()\n self.dispatch('on_press')\n\n def trigger_release(dt):\n self._do_release()\n self.dispatch('on_release')\n if not duration:\n trigger_release(0)\n else:\n Clock.schedule_once(trigger_release, duration)\n", "path": "kivy/uix/behaviors/button.py"}], "after_files": [{"content": "'''See :class:`ButtonBehavior` for details.\n'''\n\n__all__ = ('ButtonBehavior', )\n\nfrom kivy.clock import Clock\nfrom kivy.properties import OptionProperty, ObjectProperty, \\\n BooleanProperty, NumericProperty, AliasProperty\nfrom time import time\nfrom kivy.logger import Logger\n\n\nclass ButtonBehavior(object):\n '''\n This `mixin <https://en.wikipedia.org/wiki/Mixin>`_ class provides\n :class:`~kivy.uix.button.Button` behavior.\n\n :Events:\n `on_press`\n Fired when the button is pressed.\n `on_release`\n Fired when the button is released (i.e. the touch/click that\n pressed the button goes away).\n '''\n\n state = OptionProperty('normal', options=('normal', 'down'))\n '''The state of the button, must be one of 'normal' or 'down'.\n The state is 'down' only when the button is currently touched/clicked,\n otherwise its 'normal'.\n\n :attr:`state` is an :class:`~kivy.properties.OptionProperty` and defaults\n to 'normal'.\n '''\n\n last_touch = ObjectProperty(None)\n '''Contains the last relevant touch received by the Button. 
This can\n be used in `on_press` or `on_release` in order to know which touch\n dispatched the event.\n\n .. versionadded:: 1.8.0\n\n :attr:`last_touch` is a :class:`~kivy.properties.ObjectProperty` and\n defaults to `None`.\n '''\n\n min_state_time = NumericProperty(0.035)\n '''The minimum period of time which the widget must remain in the\n `'down'` state.\n\n :attr:`min_state_time` is a float and defaults to 0.035.\n '''\n\n def _set_min_state_time(self, value):\n Logger.warning(\n 'MIN_STATE_TIME is deprecated, use min_state_time instead')\n self.min_state_time = value\n\n def _get_min_state_time(self):\n Logger.warning(\n 'MIN_STATE_TIME is deprecated, use min_state_time instead')\n return self.min_state_time\n\n MIN_STATE_TIME = AliasProperty(\n _get_min_state_time, _set_min_state_time, bind=('min_state_time', ))\n\n always_release = BooleanProperty(True)\n '''This determines whether or not the widget fires an `on_release` event if\n the touch_up is outside the widget.\n\n .. versionadded:: 1.9.0\n\n :attr:`always_release` is a :class:`~kivy.properties.BooleanProperty` and\n defaults to `True`.\n '''\n\n def __init__(self, **kwargs):\n self.register_event_type('on_press')\n self.register_event_type('on_release')\n super(ButtonBehavior, self).__init__(**kwargs)\n self.__state_event = None\n self.__touch_time = None\n self.fbind('state', self.cancel_event)\n\n def _do_press(self):\n self.state = 'down'\n\n def _do_release(self, *args):\n self.state = 'normal'\n\n def cancel_event(self, *args):\n if self.__state_event:\n self.__state_event.cancel()\n self.__state_event = None\n\n def on_touch_down(self, touch):\n if super(ButtonBehavior, self).on_touch_down(touch):\n return True\n if touch.is_mouse_scrolling:\n return False\n if not self.collide_point(touch.x, touch.y):\n return False\n if self in touch.ud:\n return False\n touch.grab(self)\n touch.ud[self] = True\n self.last_touch = touch\n self.__touch_time = time()\n self._do_press()\n self.dispatch('on_press')\n return True\n\n def on_touch_move(self, touch):\n if touch.grab_current is self:\n return True\n if super(ButtonBehavior, self).on_touch_move(touch):\n return True\n return self in touch.ud\n\n def on_touch_up(self, touch):\n if touch.grab_current is not self:\n return super(ButtonBehavior, self).on_touch_up(touch)\n assert(self in touch.ud)\n touch.ungrab(self)\n self.last_touch = touch\n\n if (not self.always_release\n and not self.collide_point(*touch.pos)):\n self.state = 'normal'\n return\n\n touchtime = time() - self.__touch_time\n if touchtime < self.MIN_STATE_TIME:\n self.__state_event = Clock.schedule_once(\n self._do_release, self.MIN_STATE_TIME - touchtime)\n else:\n self._do_release()\n self.dispatch('on_release')\n return True\n\n def on_press(self):\n pass\n\n def on_release(self):\n pass\n\n def trigger_action(self, duration=0.1):\n '''Trigger whatever action(s) have been bound to the button by calling\n both the on_press and on_release callbacks.\n\n This simulates a quick button press without using any touch events.\n\n Duration is the length of the press in seconds. Pass 0 if you want\n the action to happen instantly.\n\n .. versionadded:: 1.8.0\n '''\n self._do_press()\n self.dispatch('on_press')\n\n def trigger_release(dt):\n self._do_release()\n self.dispatch('on_release')\n if not duration:\n trigger_release(0)\n else:\n Clock.schedule_once(trigger_release, duration)\n", "path": "kivy/uix/behaviors/button.py"}]}
| 1,702 | 401 |
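With the patch applied, the snake_case property is addressable from kv in the ordinary way. A minimal usage sketch follows; the widget tree and the timing value are illustrative:

```python
from kivy.lang import Builder

root = Builder.load_string("""
Button:
    text: 'hold me'
    min_state_time: 0.5  # button stays in the 'down' state for at least 0.5s
""")
# from kivy.base import runTouchApp; runTouchApp(root)  # uncomment to try it
```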
gh_patches_debug_14403
|
rasdani/github-patches
|
git_diff
|
dbt-labs__dbt-core-4359
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Ensure that all dbt invocations have an invocation_id
This is a sub-issue of #4260.
In order to make sure we can distinguish logs from different runs, we should always have an invocation_id. If tracking is off, it isn't created today, so we would need to create it. (To add it to logging, add it as a class property on the superclass.) (Previously considered: preserving the process id as today, a global run_id reported at start, or allowing some sort of user markers.)
--- END ISSUE ---
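The parenthetical above suggests a class property on the logging superclass; one lazy-initialization sketch of that idea (the names are hypothetical, not dbt's actual event classes):

```python
import uuid
from typing import Optional

_INVOCATION_ID: Optional[str] = None

def get_invocation_id() -> str:
    """Create the process-wide invocation id on first use, tracking or not."""
    global _INVOCATION_ID
    if _INVOCATION_ID is None:
        _INVOCATION_ID = str(uuid.uuid4())
    return _INVOCATION_ID

class Event:  # hypothetical superclass shared by all structured log events
    @property
    def invocation_id(self) -> str:
        return get_invocation_id()
```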
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/dbt/lib.py`
Content:
```
1 # TODO: this file is one big TODO
2 import os
3 from dbt.exceptions import RuntimeException
4 from dbt import flags
5 from collections import namedtuple
6
7 RuntimeArgs = namedtuple(
8 'RuntimeArgs', 'project_dir profiles_dir single_threaded profile_name'
9 )
10
11
12 def get_dbt_config(project_dir, single_threaded=False):
13 from dbt.config.runtime import RuntimeConfig
14 import dbt.adapters.factory
15
16 if os.getenv('DBT_PROFILES_DIR'):
17 profiles_dir = os.getenv('DBT_PROFILES_DIR')
18 else:
19 profiles_dir = os.path.expanduser("~/.dbt")
20
21 # Construct a phony config
22 config = RuntimeConfig.from_args(RuntimeArgs(
23 project_dir, profiles_dir, single_threaded, 'user'
24 ))
25 # Clear previously registered adapters--
26 # this fixes cacheing behavior on the dbt-server
27 flags.set_from_args('', config)
28 dbt.adapters.factory.reset_adapters()
29 # Load the relevant adapter
30 dbt.adapters.factory.register_adapter(config)
31
32 return config
33
34
35 def get_task_by_type(type):
36 # TODO: we need to tell dbt-server what tasks are available
37 from dbt.task.run import RunTask
38 from dbt.task.list import ListTask
39 from dbt.task.seed import SeedTask
40 from dbt.task.test import TestTask
41 from dbt.task.build import BuildTask
42 from dbt.task.snapshot import SnapshotTask
43 from dbt.task.run_operation import RunOperationTask
44
45 if type == 'run':
46 return RunTask
47 elif type == 'test':
48 return TestTask
49 elif type == 'list':
50 return ListTask
51 elif type == 'seed':
52 return SeedTask
53 elif type == 'build':
54 return BuildTask
55 elif type == 'snapshot':
56 return SnapshotTask
57 elif type == 'run_operation':
58 return RunOperationTask
59
60 raise RuntimeException('not a valid task')
61
62
63 def create_task(type, args, manifest, config):
64 task = get_task_by_type(type)
65
66 def no_op(*args, **kwargs):
67 pass
68
69 # TODO: yuck, let's rethink tasks a little
70 task = task(args, config)
71
72 # Wow! We can monkeypatch taskCls.load_manifest to return _our_ manifest
73 task.load_manifest = no_op
74 task.manifest = manifest
75 return task
76
77
78 def _get_operation_node(manifest, project_path, sql):
79 from dbt.parser.manifest import process_node
80 from dbt.parser.sql import SqlBlockParser
81 import dbt.adapters.factory
82
83 config = get_dbt_config(project_path)
84 block_parser = SqlBlockParser(
85 project=config,
86 manifest=manifest,
87 root_project=config,
88 )
89
90 adapter = dbt.adapters.factory.get_adapter(config)
91 # TODO : This needs a real name?
92 sql_node = block_parser.parse_remote(sql, 'name')
93 process_node(config, manifest, sql_node)
94 return config, sql_node, adapter
95
96
97 def compile_sql(manifest, project_path, sql):
98 from dbt.task.sql import SqlCompileRunner
99
100 config, node, adapter = _get_operation_node(manifest, project_path, sql)
101 runner = SqlCompileRunner(config, adapter, node, 1, 1)
102 return runner.safe_run(manifest)
103
104
105 def execute_sql(manifest, project_path, sql):
106 from dbt.task.sql import SqlExecuteRunner
107
108 config, node, adapter = _get_operation_node(manifest, project_path, sql)
109 runner = SqlExecuteRunner(config, adapter, node, 1, 1)
110 # TODO: use same interface for runner
111 return runner.safe_run(manifest)
112
113
114 def parse_to_manifest(config):
115 from dbt.parser.manifest import ManifestLoader
116
117 return ManifestLoader.get_full_manifest(config)
118
119
120 def deserialize_manifest(manifest_msgpack):
121 from dbt.contracts.graph.manifest import Manifest
122
123 return Manifest.from_msgpack(manifest_msgpack)
124
125
126 def serialize_manifest(manifest):
127 # TODO: what should this take as an arg?
128 return manifest.to_msgpack()
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/core/dbt/lib.py b/core/dbt/lib.py
--- a/core/dbt/lib.py
+++ b/core/dbt/lib.py
@@ -12,6 +12,7 @@
def get_dbt_config(project_dir, single_threaded=False):
from dbt.config.runtime import RuntimeConfig
import dbt.adapters.factory
+ import dbt.events.functions
if os.getenv('DBT_PROFILES_DIR'):
profiles_dir = os.getenv('DBT_PROFILES_DIR')
@@ -28,6 +29,8 @@
dbt.adapters.factory.reset_adapters()
# Load the relevant adapter
dbt.adapters.factory.register_adapter(config)
+ # Set invocation id
+ dbt.events.functions.set_invocation_id()
return config
|
{"golden_diff": "diff --git a/core/dbt/lib.py b/core/dbt/lib.py\n--- a/core/dbt/lib.py\n+++ b/core/dbt/lib.py\n@@ -12,6 +12,7 @@\n def get_dbt_config(project_dir, single_threaded=False):\n from dbt.config.runtime import RuntimeConfig\n import dbt.adapters.factory\n+ import dbt.events.functions\n \n if os.getenv('DBT_PROFILES_DIR'):\n profiles_dir = os.getenv('DBT_PROFILES_DIR')\n@@ -28,6 +29,8 @@\n dbt.adapters.factory.reset_adapters()\n # Load the relevant adapter\n dbt.adapters.factory.register_adapter(config)\n+ # Set invocation id\n+ dbt.events.functions.set_invocation_id()\n \n return config\n", "issue": "Ensure that all dbt invocations have an invocation_id\nThis is a sub-issue of #4260. \n\nIn order to make sure we can distinguish logs from different runs we should always have an invocation_id. If tracking is off, it isn't created today so we would need to create it. (to add it to logging, add it as a class property on the superclass) (previously considered: preserve process id like it is today, global run_id reported at start, or allow some sort of user markers).\n", "before_files": [{"content": "# TODO: this file is one big TODO\nimport os\nfrom dbt.exceptions import RuntimeException\nfrom dbt import flags\nfrom collections import namedtuple\n\nRuntimeArgs = namedtuple(\n 'RuntimeArgs', 'project_dir profiles_dir single_threaded profile_name'\n)\n\n\ndef get_dbt_config(project_dir, single_threaded=False):\n from dbt.config.runtime import RuntimeConfig\n import dbt.adapters.factory\n\n if os.getenv('DBT_PROFILES_DIR'):\n profiles_dir = os.getenv('DBT_PROFILES_DIR')\n else:\n profiles_dir = os.path.expanduser(\"~/.dbt\")\n\n # Construct a phony config\n config = RuntimeConfig.from_args(RuntimeArgs(\n project_dir, profiles_dir, single_threaded, 'user'\n ))\n # Clear previously registered adapters--\n # this fixes cacheing behavior on the dbt-server\n flags.set_from_args('', config)\n dbt.adapters.factory.reset_adapters()\n # Load the relevant adapter\n dbt.adapters.factory.register_adapter(config)\n\n return config\n\n\ndef get_task_by_type(type):\n # TODO: we need to tell dbt-server what tasks are available\n from dbt.task.run import RunTask\n from dbt.task.list import ListTask\n from dbt.task.seed import SeedTask\n from dbt.task.test import TestTask\n from dbt.task.build import BuildTask\n from dbt.task.snapshot import SnapshotTask\n from dbt.task.run_operation import RunOperationTask\n\n if type == 'run':\n return RunTask\n elif type == 'test':\n return TestTask\n elif type == 'list':\n return ListTask\n elif type == 'seed':\n return SeedTask\n elif type == 'build':\n return BuildTask\n elif type == 'snapshot':\n return SnapshotTask\n elif type == 'run_operation':\n return RunOperationTask\n\n raise RuntimeException('not a valid task')\n\n\ndef create_task(type, args, manifest, config):\n task = get_task_by_type(type)\n\n def no_op(*args, **kwargs):\n pass\n\n # TODO: yuck, let's rethink tasks a little\n task = task(args, config)\n\n # Wow! 
We can monkeypatch taskCls.load_manifest to return _our_ manifest\n task.load_manifest = no_op\n task.manifest = manifest\n return task\n\n\ndef _get_operation_node(manifest, project_path, sql):\n from dbt.parser.manifest import process_node\n from dbt.parser.sql import SqlBlockParser\n import dbt.adapters.factory\n\n config = get_dbt_config(project_path)\n block_parser = SqlBlockParser(\n project=config,\n manifest=manifest,\n root_project=config,\n )\n\n adapter = dbt.adapters.factory.get_adapter(config)\n # TODO : This needs a real name?\n sql_node = block_parser.parse_remote(sql, 'name')\n process_node(config, manifest, sql_node)\n return config, sql_node, adapter\n\n\ndef compile_sql(manifest, project_path, sql):\n from dbt.task.sql import SqlCompileRunner\n\n config, node, adapter = _get_operation_node(manifest, project_path, sql)\n runner = SqlCompileRunner(config, adapter, node, 1, 1)\n return runner.safe_run(manifest)\n\n\ndef execute_sql(manifest, project_path, sql):\n from dbt.task.sql import SqlExecuteRunner\n\n config, node, adapter = _get_operation_node(manifest, project_path, sql)\n runner = SqlExecuteRunner(config, adapter, node, 1, 1)\n # TODO: use same interface for runner\n return runner.safe_run(manifest)\n\n\ndef parse_to_manifest(config):\n from dbt.parser.manifest import ManifestLoader\n\n return ManifestLoader.get_full_manifest(config)\n\n\ndef deserialize_manifest(manifest_msgpack):\n from dbt.contracts.graph.manifest import Manifest\n\n return Manifest.from_msgpack(manifest_msgpack)\n\n\ndef serialize_manifest(manifest):\n # TODO: what should this take as an arg?\n return manifest.to_msgpack()\n", "path": "core/dbt/lib.py"}], "after_files": [{"content": "# TODO: this file is one big TODO\nimport os\nfrom dbt.exceptions import RuntimeException\nfrom dbt import flags\nfrom collections import namedtuple\n\nRuntimeArgs = namedtuple(\n 'RuntimeArgs', 'project_dir profiles_dir single_threaded profile_name'\n)\n\n\ndef get_dbt_config(project_dir, single_threaded=False):\n from dbt.config.runtime import RuntimeConfig\n import dbt.adapters.factory\n import dbt.events.functions\n\n if os.getenv('DBT_PROFILES_DIR'):\n profiles_dir = os.getenv('DBT_PROFILES_DIR')\n else:\n profiles_dir = os.path.expanduser(\"~/.dbt\")\n\n # Construct a phony config\n config = RuntimeConfig.from_args(RuntimeArgs(\n project_dir, profiles_dir, single_threaded, 'user'\n ))\n # Clear previously registered adapters--\n # this fixes cacheing behavior on the dbt-server\n flags.set_from_args('', config)\n dbt.adapters.factory.reset_adapters()\n # Load the relevant adapter\n dbt.adapters.factory.register_adapter(config)\n # Set invocation id\n dbt.events.functions.set_invocation_id()\n\n return config\n\n\ndef get_task_by_type(type):\n # TODO: we need to tell dbt-server what tasks are available\n from dbt.task.run import RunTask\n from dbt.task.list import ListTask\n from dbt.task.seed import SeedTask\n from dbt.task.test import TestTask\n from dbt.task.build import BuildTask\n from dbt.task.snapshot import SnapshotTask\n from dbt.task.run_operation import RunOperationTask\n\n if type == 'run':\n return RunTask\n elif type == 'test':\n return TestTask\n elif type == 'list':\n return ListTask\n elif type == 'seed':\n return SeedTask\n elif type == 'build':\n return BuildTask\n elif type == 'snapshot':\n return SnapshotTask\n elif type == 'run_operation':\n return RunOperationTask\n\n raise RuntimeException('not a valid task')\n\n\ndef create_task(type, args, manifest, config):\n task = 
get_task_by_type(type)\n\n def no_op(*args, **kwargs):\n pass\n\n # TODO: yuck, let's rethink tasks a little\n task = task(args, config)\n\n # Wow! We can monkeypatch taskCls.load_manifest to return _our_ manifest\n task.load_manifest = no_op\n task.manifest = manifest\n return task\n\n\ndef _get_operation_node(manifest, project_path, sql):\n from dbt.parser.manifest import process_node\n from dbt.parser.sql import SqlBlockParser\n import dbt.adapters.factory\n\n config = get_dbt_config(project_path)\n block_parser = SqlBlockParser(\n project=config,\n manifest=manifest,\n root_project=config,\n )\n\n adapter = dbt.adapters.factory.get_adapter(config)\n # TODO : This needs a real name?\n sql_node = block_parser.parse_remote(sql, 'name')\n process_node(config, manifest, sql_node)\n return config, sql_node, adapter\n\n\ndef compile_sql(manifest, project_path, sql):\n from dbt.task.sql import SqlCompileRunner\n\n config, node, adapter = _get_operation_node(manifest, project_path, sql)\n runner = SqlCompileRunner(config, adapter, node, 1, 1)\n return runner.safe_run(manifest)\n\n\ndef execute_sql(manifest, project_path, sql):\n from dbt.task.sql import SqlExecuteRunner\n\n config, node, adapter = _get_operation_node(manifest, project_path, sql)\n runner = SqlExecuteRunner(config, adapter, node, 1, 1)\n # TODO: use same interface for runner\n return runner.safe_run(manifest)\n\n\ndef parse_to_manifest(config):\n from dbt.parser.manifest import ManifestLoader\n\n return ManifestLoader.get_full_manifest(config)\n\n\ndef deserialize_manifest(manifest_msgpack):\n from dbt.contracts.graph.manifest import Manifest\n\n return Manifest.from_msgpack(manifest_msgpack)\n\n\ndef serialize_manifest(manifest):\n # TODO: what should this take as an arg?\n return manifest.to_msgpack()\n", "path": "core/dbt/lib.py"}]}
| 1,531 | 166 |
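The golden diff above relies on `dbt.events.functions.set_invocation_id()`. A sketch of the shape such a helper could take — eager stamping at config load, in contrast to the lazy getter sketched earlier; this is illustrative, not dbt's actual implementation:

```python
import uuid

invocation_id = None  # module-level state, one id per dbt invocation

def set_invocation_id() -> None:
    """Stamp a fresh invocation id; the patch calls this from get_dbt_config()."""
    global invocation_id
    invocation_id = str(uuid.uuid4())
```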
gh_patches_debug_5450
|
rasdani/github-patches
|
git_diff
|
mozilla__bugbug-1713
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Set up Sentry for bugbug
Adding it will help investigate issues (like the Mercurial woes).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `http_service/bugbug_http/worker.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 # This Source Code Form is subject to the terms of the Mozilla Public
4 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
5 # You can obtain one at http://mozilla.org/MPL/2.0/.
6
7 import os
8 import sys
9
10 from redis import Redis
11 from rq import Connection, Worker
12
13 import bugbug_http.boot
14
15
16 def main():
17 # Bootstrap the worker assets
18 bugbug_http.boot.boot_worker()
19
20 # Provide queue names to listen to as arguments to this script,
21 # similar to rq worker
22 redis_url = os.environ.get("REDIS_URL", "redis://localhost/0")
23 redis_conn = Redis.from_url(redis_url)
24 with Connection(connection=redis_conn):
25 qs = sys.argv[1:] or ["default"]
26
27 w = Worker(qs)
28 w.work()
29
30
31 if __name__ == "__main__":
32 main()
33
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/http_service/bugbug_http/worker.py b/http_service/bugbug_http/worker.py
--- a/http_service/bugbug_http/worker.py
+++ b/http_service/bugbug_http/worker.py
@@ -7,10 +7,20 @@
import os
import sys
+import sentry_sdk
from redis import Redis
from rq import Connection, Worker
+from sentry_sdk.integrations.rq import RqIntegration
import bugbug_http.boot
+from bugbug import get_bugbug_version
+
+if os.environ.get("SENTRY_DSN"):
+ sentry_sdk.init(
+ os.environ.get("SENTRY_DSN"),
+ integrations=[RqIntegration()],
+ release=get_bugbug_version(),
+ )
def main():
|
{"golden_diff": "diff --git a/http_service/bugbug_http/worker.py b/http_service/bugbug_http/worker.py\n--- a/http_service/bugbug_http/worker.py\n+++ b/http_service/bugbug_http/worker.py\n@@ -7,10 +7,20 @@\n import os\n import sys\n \n+import sentry_sdk\n from redis import Redis\n from rq import Connection, Worker\n+from sentry_sdk.integrations.rq import RqIntegration\n \n import bugbug_http.boot\n+from bugbug import get_bugbug_version\n+\n+if os.environ.get(\"SENTRY_DSN\"):\n+ sentry_sdk.init(\n+ os.environ.get(\"SENTRY_DSN\"),\n+ integrations=[RqIntegration()],\n+ release=get_bugbug_version(),\n+ )\n \n \n def main():\n", "issue": "Set up Sentry for bugbug\nAdding it will help investigate issues (like the Mercurial woes).\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport os\nimport sys\n\nfrom redis import Redis\nfrom rq import Connection, Worker\n\nimport bugbug_http.boot\n\n\ndef main():\n # Bootstrap the worker assets\n bugbug_http.boot.boot_worker()\n\n # Provide queue names to listen to as arguments to this script,\n # similar to rq worker\n redis_url = os.environ.get(\"REDIS_URL\", \"redis://localhost/0\")\n redis_conn = Redis.from_url(redis_url)\n with Connection(connection=redis_conn):\n qs = sys.argv[1:] or [\"default\"]\n\n w = Worker(qs)\n w.work()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "http_service/bugbug_http/worker.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport os\nimport sys\n\nimport sentry_sdk\nfrom redis import Redis\nfrom rq import Connection, Worker\nfrom sentry_sdk.integrations.rq import RqIntegration\n\nimport bugbug_http.boot\nfrom bugbug import get_bugbug_version\n\nif os.environ.get(\"SENTRY_DSN\"):\n sentry_sdk.init(\n os.environ.get(\"SENTRY_DSN\"),\n integrations=[RqIntegration()],\n release=get_bugbug_version(),\n )\n\n\ndef main():\n # Bootstrap the worker assets\n bugbug_http.boot.boot_worker()\n\n # Provide queue names to listen to as arguments to this script,\n # similar to rq worker\n redis_url = os.environ.get(\"REDIS_URL\", \"redis://localhost/0\")\n redis_conn = Redis.from_url(redis_url)\n with Connection(connection=redis_conn):\n qs = sys.argv[1:] or [\"default\"]\n\n w = Worker(qs)\n w.work()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "http_service/bugbug_http/worker.py"}]}
| 552 | 169 |
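The guarded initialization added by this patch can be exercised standalone. `sentry_sdk.init` and `RqIntegration` are real sentry-sdk APIs; the DSN value and release string here are placeholders:

```python
import os

import sentry_sdk
from sentry_sdk.integrations.rq import RqIntegration

dsn = os.environ.get("SENTRY_DSN")
if dsn:
    sentry_sdk.init(
        dsn,
        integrations=[RqIntegration()],  # report failures raised inside RQ jobs
        release="bugbug@<version>",      # the patch derives this from get_bugbug_version()
    )
# Without SENTRY_DSN in the environment, nothing is initialized and the
# worker behaves exactly as before the patch.
```

Running the worker with something like `SENTRY_DSN=https://<key>@sentry.example/1 python worker.py default` (placeholder DSN) then reports exceptions from queued jobs to Sentry.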
gh_patches_debug_58376
|
rasdani/github-patches
|
git_diff
|
deis__deis-323
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`deis run` generates ugly error if app code not yet pushed
I ran `deis run ls -la` after I had created the app, but before I had pushed the code with `git push deis master`. Here is the error I received:
```
ben$ example-python-flask > deis run ls -la
Warning: non-zero return code 255
lxc-start: No such file or directory - failed to mount '/opt/deis/runtime/slugs/hushed-sailfish-1/app' on '/usr/lib/lxc/root///app'
lxc-start: failed to setup the mount entries for '5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'
lxc-start: failed to setup the container
lxc-start: invalid sequence number 1. expected 2
lxc-start: failed to spawn '5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'
lxc-start: Device or resource busy - failed to remove cgroup '/sys/fs/cgroup/cpuset//lxc/5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `api/tasks.py`
Content:
```
1
2 from __future__ import unicode_literals
3 import importlib
4
5 from celery import task
6 from celery.canvas import group
7
8 from deis import settings
9 from provider import import_provider_module
10
11 # import user-defined config management module
12 CM = importlib.import_module(settings.CM_MODULE)
13
14
15 @task
16 def build_layer(layer):
17 provider = import_provider_module(layer.flavor.provider.type)
18 provider.build_layer(layer.flat())
19
20
21 @task
22 def destroy_layer(layer):
23 provider = import_provider_module(layer.flavor.provider.type)
24 provider.destroy_layer(layer.flat())
25 layer.delete()
26
27
28 @task
29 def build_node(node):
30 provider = import_provider_module(node.layer.flavor.provider.type)
31 provider_id, fqdn, metadata = provider.build_node(node.flat())
32 node.provider_id = provider_id
33 node.fqdn = fqdn
34 node.metadata = metadata
35 node.save()
36 CM.bootstrap_node(node.flat())
37
38
39 @task
40 def destroy_node(node):
41 provider = import_provider_module(node.layer.flavor.provider.type)
42 provider.destroy_node(node.flat())
43 CM.purge_node(node.flat())
44 node.delete()
45
46
47 @task
48 def converge_node(node):
49 output, rc = CM.converge_node(node.flat())
50 return output, rc
51
52
53 @task
54 def run_node(node, command):
55 output, rc = CM.run_node(node.flat(), command)
56 return output, rc
57
58
59 @task
60 def build_formation(formation):
61 return
62
63
64 @task
65 def destroy_formation(formation):
66 app_tasks = [destroy_app.si(a) for a in formation.app_set.all()]
67 node_tasks = [destroy_node.si(n) for n in formation.node_set.all()]
68 layer_tasks = [destroy_layer.si(l) for l in formation.layer_set.all()]
69 group(app_tasks + node_tasks).apply_async().join()
70 group(layer_tasks).apply_async().join()
71 CM.purge_formation(formation.flat())
72 formation.delete()
73
74
75 @task
76 def converge_formation(formation):
77 nodes = formation.node_set.all()
78 subtasks = []
79 for n in nodes:
80 subtask = converge_node.si(n)
81 subtasks.append(subtask)
82 group(*subtasks).apply_async().join()
83
84
85 @task
86 def build_app(app):
87 return
88
89
90 @task
91 def destroy_app(app):
92 CM.purge_app(app.flat())
93 app.delete()
94 app.formation.publish()
95
96
97 @task
98 def converge_controller():
99 CM.converge_controller()
100 return None
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/api/tasks.py b/api/tasks.py
--- a/api/tasks.py
+++ b/api/tasks.py
@@ -53,6 +53,8 @@
@task
def run_node(node, command):
output, rc = CM.run_node(node.flat(), command)
+ if rc != 0 and 'failed to setup the container' in output:
+ output = '\033[35mPlease run `git push deis master` first.\033[0m\n' + output
return output, rc
|
{"golden_diff": "diff --git a/api/tasks.py b/api/tasks.py\n--- a/api/tasks.py\n+++ b/api/tasks.py\n@@ -53,6 +53,8 @@\n @task\n def run_node(node, command):\n output, rc = CM.run_node(node.flat(), command)\n+ if rc != 0 and 'failed to setup the container' in output:\n+ output = '\\033[35mPlease run `git push deis master` first.\\033[0m\\n' + output\n return output, rc\n", "issue": "`deis run` generates ugly error if app code not yet pushed\nI ran `deis run ls -la` after I had created the app, but before I had pushed the code with `git push deis master`. Here is the error I received:\n\n```\nben$ example-python-flask > deis run ls -la\nWarning: non-zero return code 255\nlxc-start: No such file or directory - failed to mount '/opt/deis/runtime/slugs/hushed-sailfish-1/app' on '/usr/lib/lxc/root///app'\nlxc-start: failed to setup the mount entries for '5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'\nlxc-start: failed to setup the container\nlxc-start: invalid sequence number 1. expected 2\nlxc-start: failed to spawn '5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'\nlxc-start: Device or resource busy - failed to remove cgroup '/sys/fs/cgroup/cpuset//lxc/5f4f4d932501338fa2062d52e5893dfbd3933fa09102c67493a169a2a87ee479'\n```\n\n", "before_files": [{"content": "\nfrom __future__ import unicode_literals\nimport importlib\n\nfrom celery import task\nfrom celery.canvas import group\n\nfrom deis import settings\nfrom provider import import_provider_module\n\n# import user-defined config management module\nCM = importlib.import_module(settings.CM_MODULE)\n\n\n@task\ndef build_layer(layer):\n provider = import_provider_module(layer.flavor.provider.type)\n provider.build_layer(layer.flat())\n\n\n@task\ndef destroy_layer(layer):\n provider = import_provider_module(layer.flavor.provider.type)\n provider.destroy_layer(layer.flat())\n layer.delete()\n\n\n@task\ndef build_node(node):\n provider = import_provider_module(node.layer.flavor.provider.type)\n provider_id, fqdn, metadata = provider.build_node(node.flat())\n node.provider_id = provider_id\n node.fqdn = fqdn\n node.metadata = metadata\n node.save()\n CM.bootstrap_node(node.flat())\n\n\n@task\ndef destroy_node(node):\n provider = import_provider_module(node.layer.flavor.provider.type)\n provider.destroy_node(node.flat())\n CM.purge_node(node.flat())\n node.delete()\n\n\n@task\ndef converge_node(node):\n output, rc = CM.converge_node(node.flat())\n return output, rc\n\n\n@task\ndef run_node(node, command):\n output, rc = CM.run_node(node.flat(), command)\n return output, rc\n\n\n@task\ndef build_formation(formation):\n return\n\n\n@task\ndef destroy_formation(formation):\n app_tasks = [destroy_app.si(a) for a in formation.app_set.all()]\n node_tasks = [destroy_node.si(n) for n in formation.node_set.all()]\n layer_tasks = [destroy_layer.si(l) for l in formation.layer_set.all()]\n group(app_tasks + node_tasks).apply_async().join()\n group(layer_tasks).apply_async().join()\n CM.purge_formation(formation.flat())\n formation.delete()\n\n\n@task\ndef converge_formation(formation):\n nodes = formation.node_set.all()\n subtasks = []\n for n in nodes:\n subtask = converge_node.si(n)\n subtasks.append(subtask)\n group(*subtasks).apply_async().join()\n\n\n@task\ndef build_app(app):\n return\n\n\n@task\ndef destroy_app(app):\n CM.purge_app(app.flat())\n app.delete()\n app.formation.publish()\n\n\n@task\ndef converge_controller():\n CM.converge_controller()\n return None\n", "path": "api/tasks.py"}], "after_files": [{"content": 
"\nfrom __future__ import unicode_literals\nimport importlib\n\nfrom celery import task\nfrom celery.canvas import group\n\nfrom deis import settings\nfrom provider import import_provider_module\n\n# import user-defined config management module\nCM = importlib.import_module(settings.CM_MODULE)\n\n\n@task\ndef build_layer(layer):\n provider = import_provider_module(layer.flavor.provider.type)\n provider.build_layer(layer.flat())\n\n\n@task\ndef destroy_layer(layer):\n provider = import_provider_module(layer.flavor.provider.type)\n provider.destroy_layer(layer.flat())\n layer.delete()\n\n\n@task\ndef build_node(node):\n provider = import_provider_module(node.layer.flavor.provider.type)\n provider_id, fqdn, metadata = provider.build_node(node.flat())\n node.provider_id = provider_id\n node.fqdn = fqdn\n node.metadata = metadata\n node.save()\n CM.bootstrap_node(node.flat())\n\n\n@task\ndef destroy_node(node):\n provider = import_provider_module(node.layer.flavor.provider.type)\n provider.destroy_node(node.flat())\n CM.purge_node(node.flat())\n node.delete()\n\n\n@task\ndef converge_node(node):\n output, rc = CM.converge_node(node.flat())\n return output, rc\n\n\n@task\ndef run_node(node, command):\n output, rc = CM.run_node(node.flat(), command)\n if rc != 0 and 'failed to setup the container' in output:\n output = '\\033[35mPlease run `git push deis master` first.\\033[0m\\n' + output\n return output, rc\n\n\n@task\ndef build_formation(formation):\n return\n\n\n@task\ndef destroy_formation(formation):\n app_tasks = [destroy_app.si(a) for a in formation.app_set.all()]\n node_tasks = [destroy_node.si(n) for n in formation.node_set.all()]\n layer_tasks = [destroy_layer.si(l) for l in formation.layer_set.all()]\n group(app_tasks + node_tasks).apply_async().join()\n group(layer_tasks).apply_async().join()\n CM.purge_formation(formation.flat())\n formation.delete()\n\n\n@task\ndef converge_formation(formation):\n nodes = formation.node_set.all()\n subtasks = []\n for n in nodes:\n subtask = converge_node.si(n)\n subtasks.append(subtask)\n group(*subtasks).apply_async().join()\n\n\n@task\ndef build_app(app):\n return\n\n\n@task\ndef destroy_app(app):\n CM.purge_app(app.flat())\n app.delete()\n app.formation.publish()\n\n\n@task\ndef converge_controller():\n CM.converge_controller()\n return None\n", "path": "api/tasks.py"}]}
| 1,356 | 117 |
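A minimal sketch of the pattern this record's golden diff applies: scan a failed command's output for a known error string and prepend an ANSI-colored hint. The helper name and constants are illustrative, not part of the Deis codebase.

```python
# Hypothetical helper mirroring the run_node fix: on failure, detect the
# "app code not pushed yet" symptom and prepend a readable hint.
MAGENTA, RESET = "\033[35m", "\033[0m"


def annotate_failure(output, rc):
    if rc != 0 and "failed to setup the container" in output:
        # The escape codes render the hint in magenta on ANSI terminals.
        hint = "Please run `git push deis master` first."
        output = MAGENTA + hint + RESET + "\n" + output
    return output, rc
```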
gh_patches_debug_15911
|
rasdani/github-patches
|
git_diff
|
pallets__click-2599
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
deprecate `__version__` attribute
The `__version__` attribute is an old pattern from early in Python packaging. Setuptools eventually made it easier to use the pattern by allowing reading the value from the attribute at build time, and some other build backends have done the same.
However, there's no reason to expose this directly in code anymore. It's usually easier to use feature detection (`hasattr`, `try/except`) instead. `importlib.metadata.version("click")` can be used to get the version at runtime in a standard way, if it's really needed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/click/__init__.py`
Content:
```
1 """
2 Click is a simple Python module inspired by the stdlib optparse to make
3 writing command line scripts fun. Unlike other modules, it's based
4 around a simple API that does not come with too much magic and is
5 composable.
6 """
7 from __future__ import annotations
8
9 from .core import Argument as Argument
10 from .core import Command as Command
11 from .core import CommandCollection as CommandCollection
12 from .core import Context as Context
13 from .core import Group as Group
14 from .core import Option as Option
15 from .core import Parameter as Parameter
16 from .decorators import argument as argument
17 from .decorators import command as command
18 from .decorators import confirmation_option as confirmation_option
19 from .decorators import group as group
20 from .decorators import help_option as help_option
21 from .decorators import make_pass_decorator as make_pass_decorator
22 from .decorators import option as option
23 from .decorators import pass_context as pass_context
24 from .decorators import pass_obj as pass_obj
25 from .decorators import password_option as password_option
26 from .decorators import version_option as version_option
27 from .exceptions import Abort as Abort
28 from .exceptions import BadArgumentUsage as BadArgumentUsage
29 from .exceptions import BadOptionUsage as BadOptionUsage
30 from .exceptions import BadParameter as BadParameter
31 from .exceptions import ClickException as ClickException
32 from .exceptions import FileError as FileError
33 from .exceptions import MissingParameter as MissingParameter
34 from .exceptions import NoSuchOption as NoSuchOption
35 from .exceptions import UsageError as UsageError
36 from .formatting import HelpFormatter as HelpFormatter
37 from .formatting import wrap_text as wrap_text
38 from .globals import get_current_context as get_current_context
39 from .termui import clear as clear
40 from .termui import confirm as confirm
41 from .termui import echo_via_pager as echo_via_pager
42 from .termui import edit as edit
43 from .termui import getchar as getchar
44 from .termui import launch as launch
45 from .termui import pause as pause
46 from .termui import progressbar as progressbar
47 from .termui import prompt as prompt
48 from .termui import secho as secho
49 from .termui import style as style
50 from .termui import unstyle as unstyle
51 from .types import BOOL as BOOL
52 from .types import Choice as Choice
53 from .types import DateTime as DateTime
54 from .types import File as File
55 from .types import FLOAT as FLOAT
56 from .types import FloatRange as FloatRange
57 from .types import INT as INT
58 from .types import IntRange as IntRange
59 from .types import ParamType as ParamType
60 from .types import Path as Path
61 from .types import STRING as STRING
62 from .types import Tuple as Tuple
63 from .types import UNPROCESSED as UNPROCESSED
64 from .types import UUID as UUID
65 from .utils import echo as echo
66 from .utils import format_filename as format_filename
67 from .utils import get_app_dir as get_app_dir
68 from .utils import get_binary_stream as get_binary_stream
69 from .utils import get_text_stream as get_text_stream
70 from .utils import open_file as open_file
71
72 __version__ = "8.2.0.dev0"
73
74
75 def __getattr__(name: str) -> object:
76 import warnings
77
78 if name == "BaseCommand":
79 from .core import _BaseCommand
80
81 warnings.warn(
82 "'BaseCommand' is deprecated and will be removed in Click 9.0. Use"
83 " 'Command' instead.",
84 DeprecationWarning,
85 stacklevel=2,
86 )
87 return _BaseCommand
88
89 if name == "MultiCommand":
90 from .core import _MultiCommand
91
92 warnings.warn(
93 "'MultiCommand' is deprecated and will be removed in Click 9.0. Use"
94 " 'Group' instead.",
95 DeprecationWarning,
96 stacklevel=2,
97 )
98 return _MultiCommand
99
100 if name == "OptionParser":
101 from .parser import _OptionParser
102
103 warnings.warn(
104 "'OptionParser' is deprecated and will be removed in Click 9.0. The"
105 " old parser is available in 'optparse'.",
106 DeprecationWarning,
107 stacklevel=2,
108 )
109 return _OptionParser
110
111 raise AttributeError(name)
112
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/click/__init__.py b/src/click/__init__.py
--- a/src/click/__init__.py
+++ b/src/click/__init__.py
@@ -69,8 +69,6 @@
from .utils import get_text_stream as get_text_stream
from .utils import open_file as open_file
-__version__ = "8.2.0.dev0"
-
def __getattr__(name: str) -> object:
import warnings
@@ -108,4 +106,17 @@
)
return _OptionParser
+ if name == "__version__":
+ import importlib.metadata
+ import warnings
+
+ warnings.warn(
+ "The '__version__' attribute is deprecated and will be removed in"
+ " Click 9.1. Use feature detection or"
+ " 'importlib.metadata.version(\"click\")' instead.",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ return importlib.metadata.version("click")
+
raise AttributeError(name)
|
{"golden_diff": "diff --git a/src/click/__init__.py b/src/click/__init__.py\n--- a/src/click/__init__.py\n+++ b/src/click/__init__.py\n@@ -69,8 +69,6 @@\n from .utils import get_text_stream as get_text_stream\n from .utils import open_file as open_file\n \n-__version__ = \"8.2.0.dev0\"\n-\n \n def __getattr__(name: str) -> object:\n import warnings\n@@ -108,4 +106,17 @@\n )\n return _OptionParser\n \n+ if name == \"__version__\":\n+ import importlib.metadata\n+ import warnings\n+\n+ warnings.warn(\n+ \"The '__version__' attribute is deprecated and will be removed in\"\n+ \" Click 9.1. Use feature detection or\"\n+ \" 'importlib.metadata.version(\\\"click\\\")' instead.\",\n+ DeprecationWarning,\n+ stacklevel=2,\n+ )\n+ return importlib.metadata.version(\"click\")\n+\n raise AttributeError(name)\n", "issue": "deprecate `__version__` attribute\nThe `__version__` attribute is an old pattern from early in Python packaging. Setuptools eventually made it easier to use the pattern by allowing reading the value from the attribute at build time, and some other build backends have done the same.\r\n\r\nHowever, there's no reason to expose this directly in code anymore. It's usually easier to use feature detection (`hasattr`, `try/except`) instead. `importlib.metadata.version(\"click\")` can be used to get the version at runtime in a standard way, if it's really needed.\n", "before_files": [{"content": "\"\"\"\nClick is a simple Python module inspired by the stdlib optparse to make\nwriting command line scripts fun. Unlike other modules, it's based\naround a simple API that does not come with too much magic and is\ncomposable.\n\"\"\"\nfrom __future__ import annotations\n\nfrom .core import Argument as Argument\nfrom .core import Command as Command\nfrom .core import CommandCollection as CommandCollection\nfrom .core import Context as Context\nfrom .core import Group as Group\nfrom .core import Option as Option\nfrom .core import Parameter as Parameter\nfrom .decorators import argument as argument\nfrom .decorators import command as command\nfrom .decorators import confirmation_option as confirmation_option\nfrom .decorators import group as group\nfrom .decorators import help_option as help_option\nfrom .decorators import make_pass_decorator as make_pass_decorator\nfrom .decorators import option as option\nfrom .decorators import pass_context as pass_context\nfrom .decorators import pass_obj as pass_obj\nfrom .decorators import password_option as password_option\nfrom .decorators import version_option as version_option\nfrom .exceptions import Abort as Abort\nfrom .exceptions import BadArgumentUsage as BadArgumentUsage\nfrom .exceptions import BadOptionUsage as BadOptionUsage\nfrom .exceptions import BadParameter as BadParameter\nfrom .exceptions import ClickException as ClickException\nfrom .exceptions import FileError as FileError\nfrom .exceptions import MissingParameter as MissingParameter\nfrom .exceptions import NoSuchOption as NoSuchOption\nfrom .exceptions import UsageError as UsageError\nfrom .formatting import HelpFormatter as HelpFormatter\nfrom .formatting import wrap_text as wrap_text\nfrom .globals import get_current_context as get_current_context\nfrom .termui import clear as clear\nfrom .termui import confirm as confirm\nfrom .termui import echo_via_pager as echo_via_pager\nfrom .termui import edit as edit\nfrom .termui import getchar as getchar\nfrom .termui import launch as launch\nfrom .termui import pause as pause\nfrom .termui import progressbar as progressbar\nfrom .termui 
import prompt as prompt\nfrom .termui import secho as secho\nfrom .termui import style as style\nfrom .termui import unstyle as unstyle\nfrom .types import BOOL as BOOL\nfrom .types import Choice as Choice\nfrom .types import DateTime as DateTime\nfrom .types import File as File\nfrom .types import FLOAT as FLOAT\nfrom .types import FloatRange as FloatRange\nfrom .types import INT as INT\nfrom .types import IntRange as IntRange\nfrom .types import ParamType as ParamType\nfrom .types import Path as Path\nfrom .types import STRING as STRING\nfrom .types import Tuple as Tuple\nfrom .types import UNPROCESSED as UNPROCESSED\nfrom .types import UUID as UUID\nfrom .utils import echo as echo\nfrom .utils import format_filename as format_filename\nfrom .utils import get_app_dir as get_app_dir\nfrom .utils import get_binary_stream as get_binary_stream\nfrom .utils import get_text_stream as get_text_stream\nfrom .utils import open_file as open_file\n\n__version__ = \"8.2.0.dev0\"\n\n\ndef __getattr__(name: str) -> object:\n import warnings\n\n if name == \"BaseCommand\":\n from .core import _BaseCommand\n\n warnings.warn(\n \"'BaseCommand' is deprecated and will be removed in Click 9.0. Use\"\n \" 'Command' instead.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _BaseCommand\n\n if name == \"MultiCommand\":\n from .core import _MultiCommand\n\n warnings.warn(\n \"'MultiCommand' is deprecated and will be removed in Click 9.0. Use\"\n \" 'Group' instead.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _MultiCommand\n\n if name == \"OptionParser\":\n from .parser import _OptionParser\n\n warnings.warn(\n \"'OptionParser' is deprecated and will be removed in Click 9.0. The\"\n \" old parser is available in 'optparse'.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _OptionParser\n\n raise AttributeError(name)\n", "path": "src/click/__init__.py"}], "after_files": [{"content": "\"\"\"\nClick is a simple Python module inspired by the stdlib optparse to make\nwriting command line scripts fun. 
Unlike other modules, it's based\naround a simple API that does not come with too much magic and is\ncomposable.\n\"\"\"\nfrom __future__ import annotations\n\nfrom .core import Argument as Argument\nfrom .core import Command as Command\nfrom .core import CommandCollection as CommandCollection\nfrom .core import Context as Context\nfrom .core import Group as Group\nfrom .core import Option as Option\nfrom .core import Parameter as Parameter\nfrom .decorators import argument as argument\nfrom .decorators import command as command\nfrom .decorators import confirmation_option as confirmation_option\nfrom .decorators import group as group\nfrom .decorators import help_option as help_option\nfrom .decorators import make_pass_decorator as make_pass_decorator\nfrom .decorators import option as option\nfrom .decorators import pass_context as pass_context\nfrom .decorators import pass_obj as pass_obj\nfrom .decorators import password_option as password_option\nfrom .decorators import version_option as version_option\nfrom .exceptions import Abort as Abort\nfrom .exceptions import BadArgumentUsage as BadArgumentUsage\nfrom .exceptions import BadOptionUsage as BadOptionUsage\nfrom .exceptions import BadParameter as BadParameter\nfrom .exceptions import ClickException as ClickException\nfrom .exceptions import FileError as FileError\nfrom .exceptions import MissingParameter as MissingParameter\nfrom .exceptions import NoSuchOption as NoSuchOption\nfrom .exceptions import UsageError as UsageError\nfrom .formatting import HelpFormatter as HelpFormatter\nfrom .formatting import wrap_text as wrap_text\nfrom .globals import get_current_context as get_current_context\nfrom .termui import clear as clear\nfrom .termui import confirm as confirm\nfrom .termui import echo_via_pager as echo_via_pager\nfrom .termui import edit as edit\nfrom .termui import getchar as getchar\nfrom .termui import launch as launch\nfrom .termui import pause as pause\nfrom .termui import progressbar as progressbar\nfrom .termui import prompt as prompt\nfrom .termui import secho as secho\nfrom .termui import style as style\nfrom .termui import unstyle as unstyle\nfrom .types import BOOL as BOOL\nfrom .types import Choice as Choice\nfrom .types import DateTime as DateTime\nfrom .types import File as File\nfrom .types import FLOAT as FLOAT\nfrom .types import FloatRange as FloatRange\nfrom .types import INT as INT\nfrom .types import IntRange as IntRange\nfrom .types import ParamType as ParamType\nfrom .types import Path as Path\nfrom .types import STRING as STRING\nfrom .types import Tuple as Tuple\nfrom .types import UNPROCESSED as UNPROCESSED\nfrom .types import UUID as UUID\nfrom .utils import echo as echo\nfrom .utils import format_filename as format_filename\nfrom .utils import get_app_dir as get_app_dir\nfrom .utils import get_binary_stream as get_binary_stream\nfrom .utils import get_text_stream as get_text_stream\nfrom .utils import open_file as open_file\n\n\ndef __getattr__(name: str) -> object:\n import warnings\n\n if name == \"BaseCommand\":\n from .core import _BaseCommand\n\n warnings.warn(\n \"'BaseCommand' is deprecated and will be removed in Click 9.0. Use\"\n \" 'Command' instead.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _BaseCommand\n\n if name == \"MultiCommand\":\n from .core import _MultiCommand\n\n warnings.warn(\n \"'MultiCommand' is deprecated and will be removed in Click 9.0. 
Use\"\n \" 'Group' instead.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _MultiCommand\n\n if name == \"OptionParser\":\n from .parser import _OptionParser\n\n warnings.warn(\n \"'OptionParser' is deprecated and will be removed in Click 9.0. The\"\n \" old parser is available in 'optparse'.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return _OptionParser\n\n if name == \"__version__\":\n import importlib.metadata\n import warnings\n\n warnings.warn(\n \"The '__version__' attribute is deprecated and will be removed in\"\n \" Click 9.1. Use feature detection or\"\n \" 'importlib.metadata.version(\\\"click\\\")' instead.\",\n DeprecationWarning,\n stacklevel=2,\n )\n return importlib.metadata.version(\"click\")\n\n raise AttributeError(name)\n", "path": "src/click/__init__.py"}]}
| 1,518 | 232 |
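A hedged sketch of the two techniques combined in the patch above: reading an installed distribution's version via `importlib.metadata`, and deprecating a module attribute with the module-level `__getattr__` hook from PEP 562. The distribution name `example-pkg` is a placeholder; callers can then feature-detect with a plain `try/except` around the attribute access.

```python
import importlib.metadata
import warnings


def __getattr__(name: str) -> object:
    # PEP 562: this module-level hook only runs for attributes not found
    # normally, so the warning fires exactly when code reads __version__.
    if name == "__version__":
        warnings.warn(
            "'__version__' is deprecated; use"
            " importlib.metadata.version('example-pkg') instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return importlib.metadata.version("example-pkg")
    raise AttributeError(name)
```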
gh_patches_debug_1752
|
rasdani/github-patches
|
git_diff
|
spack__spack-4809
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
problem with xproto
The xproto package could be compiled properly, but while installing I come across the following problem:
```
==> 'make' '-j8'
/Applications/Xcode.app/Contents/Developer/usr/bin/make all-recursive
Making all in specs
Making all in SIAddresses
make[3]: Nothing to be done for `all'.
make[3]: Nothing to be done for `all-am'.
make[2]: Nothing to be done for `all-am'.
==> 'make' '-j8' 'install'
Making install in specs
Making install in SIAddresses
make[3]: Nothing to be done for `install-exec-am'.
make[3]: Nothing to be done for `install-data-am'.
make[3]: Nothing to be done for `install-exec-am'.
/spack/var/spack/stage/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/xproto-7.0.31/install-sh -c -d '/spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto'
/spack/var/spack/stage/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/xproto-7.0.31/install-sh -c -d '/spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto'
mkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share: File exists
mkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc: File exists
mkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto: File exists
make[3]: *** [install-dist_shelfDATA] Error 1
make[3]: *** Waiting for unfinished jobs....
make[2]: *** [install-am] Error 2
make[1]: *** [install-recursive] Error 1
make: *** [install-recursive] Error 1
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/xproto/package.py`
Content:
```
1 ##############################################################################
2 # Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
3 # Produced at the Lawrence Livermore National Laboratory.
4 #
5 # This file is part of Spack.
6 # Created by Todd Gamblin, [email protected], All rights reserved.
7 # LLNL-CODE-647188
8 #
9 # For details, see https://github.com/llnl/spack
10 # Please also see the NOTICE and LICENSE files for our notice and the LGPL.
11 #
12 # This program is free software; you can redistribute it and/or modify
13 # it under the terms of the GNU Lesser General Public License (as
14 # published by the Free Software Foundation) version 2.1, February 1999.
15 #
16 # This program is distributed in the hope that it will be useful, but
17 # WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
18 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
19 # conditions of the GNU Lesser General Public License for more details.
20 #
21 # You should have received a copy of the GNU Lesser General Public
22 # License along with this program; if not, write to the Free Software
23 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
24 ##############################################################################
25 from spack import *
26
27
28 class Xproto(AutotoolsPackage):
29 """X Window System Core Protocol.
30
31 This package provides the headers and specification documents defining
32 the X Window System Core Protocol, Version 11.
33
34 It also includes a number of headers that aren't purely protocol related,
35 but are depended upon by many other X Window System packages to provide
36 common definitions and porting layer."""
37
38 homepage = "http://cgit.freedesktop.org/xorg/proto/x11proto"
39 url = "https://www.x.org/archive/individual/proto/xproto-7.0.31.tar.gz"
40
41 version('7.0.31', '04b925bf9e472c80f9212615cd684f1e')
42 version('7.0.29', '16a78dd2c5ad73011105c96235f6a0af')
43
44 depends_on('[email protected]:', type='build')
45 depends_on('util-macros', type='build')
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/var/spack/repos/builtin/packages/xproto/package.py b/var/spack/repos/builtin/packages/xproto/package.py
--- a/var/spack/repos/builtin/packages/xproto/package.py
+++ b/var/spack/repos/builtin/packages/xproto/package.py
@@ -43,3 +43,8 @@
depends_on('[email protected]:', type='build')
depends_on('util-macros', type='build')
+
+ def install(self, spec, prefix):
+ # Installation fails in parallel
+ # See https://github.com/LLNL/spack/issues/4805
+ make('install', parallel=False)
|
{"golden_diff": "diff --git a/var/spack/repos/builtin/packages/xproto/package.py b/var/spack/repos/builtin/packages/xproto/package.py\n--- a/var/spack/repos/builtin/packages/xproto/package.py\n+++ b/var/spack/repos/builtin/packages/xproto/package.py\n@@ -43,3 +43,8 @@\n \n depends_on('[email protected]:', type='build')\n depends_on('util-macros', type='build')\n+\n+ def install(self, spec, prefix):\n+ # Installation fails in parallel\n+ # See https://github.com/LLNL/spack/issues/4805\n+ make('install', parallel=False)\n", "issue": "problem with xproto\nthe xproto could be compile properly but while installing, i come across the following problem\r\n\r\n```\r\n==> 'make' '-j8'\r\n/Applications/Xcode.app/Contents/Developer/usr/bin/make all-recursive\r\nMaking all in specs\r\nMaking all in SIAddresses\r\nmake[3]: Nothing to be done for `all'.\r\nmake[3]: Nothing to be done for `all-am'.\r\nmake[2]: Nothing to be done for `all-am'.\r\n==> 'make' '-j8' 'install'\r\nMaking install in specs\r\nMaking install in SIAddresses\r\nmake[3]: Nothing to be done for `install-exec-am'.\r\nmake[3]: Nothing to be done for `install-data-am'.\r\nmake[3]: Nothing to be done for `install-exec-am'.\r\n /spack/var/spack/stage/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/xproto-7.0.31/install-sh -c -d '/spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto'\r\n /spack/var/spack/stage/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/xproto-7.0.31/install-sh -c -d '/spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto'\r\nmkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share: File exists\r\nmkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc: File exists\r\nmkdir: /spack/opt/spack/darwin-sierra-x86_64/clang-3.9.0/xproto-7.0.31-rgclwhmbundqudnhjaavmnvuzceqlgsh/share/doc/xproto: File exists\r\nmake[3]: *** [install-dist_shelfDATA] Error 1\r\nmake[3]: *** Waiting for unfinished jobs....\r\nmake[2]: *** [install-am] Error 2\r\nmake[1]: *** [install-recursive] Error 1\r\nmake: *** [install-recursive] Error 1\r\n```\n", "before_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Xproto(AutotoolsPackage):\n \"\"\"X Window System Core Protocol.\n\n This package provides the headers and specification documents defining\n the X Window System Core Protocol, Version 11.\n\n It also includes a number of headers that aren't purely protocol related,\n but are depended upon by many other X Window System packages to provide\n common definitions and porting layer.\"\"\"\n\n homepage = \"http://cgit.freedesktop.org/xorg/proto/x11proto\"\n url = \"https://www.x.org/archive/individual/proto/xproto-7.0.31.tar.gz\"\n\n version('7.0.31', '04b925bf9e472c80f9212615cd684f1e')\n version('7.0.29', '16a78dd2c5ad73011105c96235f6a0af')\n\n depends_on('[email protected]:', type='build')\n depends_on('util-macros', type='build')\n", "path": "var/spack/repos/builtin/packages/xproto/package.py"}], "after_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Xproto(AutotoolsPackage):\n \"\"\"X Window System Core Protocol.\n\n This package provides the headers and specification documents defining\n the X Window System Core Protocol, Version 11.\n\n It also includes a number of headers that aren't purely protocol related,\n but are depended upon by many other X Window System packages to provide\n common definitions and porting layer.\"\"\"\n\n homepage = \"http://cgit.freedesktop.org/xorg/proto/x11proto\"\n url = \"https://www.x.org/archive/individual/proto/xproto-7.0.31.tar.gz\"\n\n version('7.0.31', '04b925bf9e472c80f9212615cd684f1e')\n version('7.0.29', '16a78dd2c5ad73011105c96235f6a0af')\n\n depends_on('[email protected]:', type='build')\n depends_on('util-macros', type='build')\n\n def install(self, spec, prefix):\n # Installation fails in parallel\n # See https://github.com/LLNL/spack/issues/4805\n make('install', parallel=False)\n", "path": "var/spack/repos/builtin/packages/xproto/package.py"}]}
| 1,522 | 147 |
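A minimal sketch of the Spack idiom the fix relies on: overriding the `install` phase so the `install` target runs serially when a package's install rules are not parallel-safe. It assumes Spack's `AutotoolsPackage` base class, where `make` is injected into the package module's namespace; the package shown is hypothetical.

```python
from spack import *


class Example(AutotoolsPackage):
    """Hypothetical package whose `make -j install` races on mkdir."""

    homepage = "https://example.org"
    url = "https://example.org/example-1.0.tar.gz"

    version('1.0', '00000000000000000000000000000000')

    def install(self, spec, prefix):
        # Serial install avoids concurrent install-sh mkdir calls that
        # fail with "File exists" under a parallel make.
        make('install', parallel=False)
```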
gh_patches_debug_932
|
rasdani/github-patches
|
git_diff
|
praw-dev__praw-1304
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Sphinx stops emitting warnings if it encounters only one
**Describe the bug**
<!-- A clear and concise description of what the bug is. --> When running pre_push, if Sphinx runs into a warning, it does not print any more warnings. When there are lots of warnings, it takes a lot of time to re-run pre_push once per warning.
I recommend adding the command line argument `--keep-going`. This will cause it to print all warnings.
**System Info**
- PRAW Version: Latest
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pre_push.py`
Content:
```
1 #!/usr/bin/env python3
2 """Run static analysis on the project."""
3
4 import argparse
5 import sys
6 from os import path
7 from shutil import rmtree
8 from subprocess import CalledProcessError, check_call
9 from tempfile import mkdtemp
10
11 current_directory = path.abspath(path.join(__file__, ".."))
12
13
14 def do_process(args, shell=False):
15 """Run program provided by args.
16
17 Return True on success.
18
19 Output failed message on non-zero exit and return False.
20
21 Exit if command is not found.
22 """
23 print("Running: {}".format(" ".join(args)))
24 try:
25 check_call(args, shell=shell)
26 except CalledProcessError:
27 print("\nFailed: {}".format(" ".join(args)))
28 return False
29 except Exception as exc:
30 sys.stderr.write(str(exc) + "\n")
31 sys.exit(1)
32 return True
33
34
35 def run_static():
36 """Runs the static tests.
37
38 Returns a statuscode of 0 if everything ran correctly.
39 Otherwise, it will return statuscode 1
40 """
41 success = True
42 success &= do_process(
43 [
44 sys.executable,
45 path.join(current_directory, "tools", "static_word_checks.py"),
46 "--replace",
47 ]
48 )
49 success &= do_process(["black ."], shell=True)
50 success &= do_process(["flake8", "--exclude=.eggs,build,docs"])
51 success &= do_process(["pydocstyle", "praw"])
52 # success &= do_process(["pylint", "--rcfile=.pylintrc", "praw"])
53
54 tmp_dir = mkdtemp()
55 try:
56 success &= do_process(["sphinx-build", "-W", "docs", tmp_dir])
57 finally:
58 rmtree(tmp_dir)
59
60 return success
61
62
63 def run_unit():
64 """Runs the unit-tests.
65
66 Follows the behavior of the static tests,
67 where any failed tests cause pre_push.py to fail.
68 """
69 return do_process(
70 [sys.executable, path.join(current_directory, "setup.py"), "test"]
71 )
72
73
74 def main():
75 """Runs the main function.
76
77 usage: pre_push.py [-h] [-n] [-u] [-a]
78
79 Run static and/or unit-tests
80 """
81 parser = argparse.ArgumentParser(
82 description="Run static and/or unit-tests"
83 )
84 parser.add_argument(
85 "-n",
86 "--unstatic",
87 action="store_true",
88 help="Do not run static tests (black/flake8/pydocstyle/sphinx-build)",
89 default=False,
90 )
91 parser.add_argument(
92 "-u",
93 "--unit-tests",
94 "--unit",
95 action="store_true",
96 default=False,
97 help="Run the unit tests",
98 )
99 parser.add_argument(
100 "-a",
101 "--all",
102 action="store_true",
103 default=False,
104 help="Run all of the tests (static and unit). "
105 "Overrides the unstatic argument.",
106 )
107 args = parser.parse_args()
108 success = True
109 try:
110 if not args.unstatic or args.all:
111 success &= run_static()
112 if args.all or args.unit_tests:
113 success &= run_unit()
114 except KeyboardInterrupt:
115 return int(not False)
116 return int(not success)
117
118
119 if __name__ == "__main__":
120 exit_code = main()
121 print(
122 "\npre_push.py: Success!" if not exit_code else "\npre_push.py: Fail"
123 )
124 sys.exit(exit_code)
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pre_push.py b/pre_push.py
--- a/pre_push.py
+++ b/pre_push.py
@@ -53,7 +53,9 @@
tmp_dir = mkdtemp()
try:
- success &= do_process(["sphinx-build", "-W", "docs", tmp_dir])
+ success &= do_process(
+ ["sphinx-build", "-W", "--keep-going", "docs", tmp_dir]
+ )
finally:
rmtree(tmp_dir)
|
{"golden_diff": "diff --git a/pre_push.py b/pre_push.py\n--- a/pre_push.py\n+++ b/pre_push.py\n@@ -53,7 +53,9 @@\n \n tmp_dir = mkdtemp()\n try:\n- success &= do_process([\"sphinx-build\", \"-W\", \"docs\", tmp_dir])\n+ success &= do_process(\n+ [\"sphinx-build\", \"-W\", \"--keep-going\", \"docs\", tmp_dir]\n+ )\n finally:\n rmtree(tmp_dir)\n", "issue": "Sphinx stops emitting warnings if it encounters only one\n**Describe the bug**\r\n<!-- A clear and concise description of what the bug is. --> When running pre_push, if Sphinx runs into an warning, it does does print any more. When there are lots of warnings, it takes a lot of time to re-run pre_push per warning\r\n\r\nI recommend adding the command line argument `--keep-going`. This will cause it to print all warnings.\r\n\r\n**System Info**\r\n - PRAW Version: Latest\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\"\"\"Run static analysis on the project.\"\"\"\n\nimport argparse\nimport sys\nfrom os import path\nfrom shutil import rmtree\nfrom subprocess import CalledProcessError, check_call\nfrom tempfile import mkdtemp\n\ncurrent_directory = path.abspath(path.join(__file__, \"..\"))\n\n\ndef do_process(args, shell=False):\n \"\"\"Run program provided by args.\n\n Return True on success.\n\n Output failed message on non-zero exit and return False.\n\n Exit if command is not found.\n \"\"\"\n print(\"Running: {}\".format(\" \".join(args)))\n try:\n check_call(args, shell=shell)\n except CalledProcessError:\n print(\"\\nFailed: {}\".format(\" \".join(args)))\n return False\n except Exception as exc:\n sys.stderr.write(str(exc) + \"\\n\")\n sys.exit(1)\n return True\n\n\ndef run_static():\n \"\"\"Runs the static tests.\n\n Returns a statuscode of 0 if everything ran correctly.\n Otherwise, it will return statuscode 1\n \"\"\"\n success = True\n success &= do_process(\n [\n sys.executable,\n path.join(current_directory, \"tools\", \"static_word_checks.py\"),\n \"--replace\",\n ]\n )\n success &= do_process([\"black .\"], shell=True)\n success &= do_process([\"flake8\", \"--exclude=.eggs,build,docs\"])\n success &= do_process([\"pydocstyle\", \"praw\"])\n # success &= do_process([\"pylint\", \"--rcfile=.pylintrc\", \"praw\"])\n\n tmp_dir = mkdtemp()\n try:\n success &= do_process([\"sphinx-build\", \"-W\", \"docs\", tmp_dir])\n finally:\n rmtree(tmp_dir)\n\n return success\n\n\ndef run_unit():\n \"\"\"Runs the unit-tests.\n\n Follows the behavior of the static tests,\n where any failed tests cause pre_push.py to fail.\n \"\"\"\n return do_process(\n [sys.executable, path.join(current_directory, \"setup.py\"), \"test\"]\n )\n\n\ndef main():\n \"\"\"Runs the main function.\n\n usage: pre_push.py [-h] [-n] [-u] [-a]\n\n Run static and/or unit-tests\n \"\"\"\n parser = argparse.ArgumentParser(\n description=\"Run static and/or unit-tests\"\n )\n parser.add_argument(\n \"-n\",\n \"--unstatic\",\n action=\"store_true\",\n help=\"Do not run static tests (black/flake8/pydocstyle/sphinx-build)\",\n default=False,\n )\n parser.add_argument(\n \"-u\",\n \"--unit-tests\",\n \"--unit\",\n action=\"store_true\",\n default=False,\n help=\"Run the unit tests\",\n )\n parser.add_argument(\n \"-a\",\n \"--all\",\n action=\"store_true\",\n default=False,\n help=\"Run all of the tests (static and unit). 
\"\n \"Overrides the unstatic argument.\",\n )\n args = parser.parse_args()\n success = True\n try:\n if not args.unstatic or args.all:\n success &= run_static()\n if args.all or args.unit_tests:\n success &= run_unit()\n except KeyboardInterrupt:\n return int(not False)\n return int(not success)\n\n\nif __name__ == \"__main__\":\n exit_code = main()\n print(\n \"\\npre_push.py: Success!\" if not exit_code else \"\\npre_push.py: Fail\"\n )\n sys.exit(exit_code)\n", "path": "pre_push.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\"\"\"Run static analysis on the project.\"\"\"\n\nimport argparse\nimport sys\nfrom os import path\nfrom shutil import rmtree\nfrom subprocess import CalledProcessError, check_call\nfrom tempfile import mkdtemp\n\ncurrent_directory = path.abspath(path.join(__file__, \"..\"))\n\n\ndef do_process(args, shell=False):\n \"\"\"Run program provided by args.\n\n Return True on success.\n\n Output failed message on non-zero exit and return False.\n\n Exit if command is not found.\n \"\"\"\n print(\"Running: {}\".format(\" \".join(args)))\n try:\n check_call(args, shell=shell)\n except CalledProcessError:\n print(\"\\nFailed: {}\".format(\" \".join(args)))\n return False\n except Exception as exc:\n sys.stderr.write(str(exc) + \"\\n\")\n sys.exit(1)\n return True\n\n\ndef run_static():\n \"\"\"Runs the static tests.\n\n Returns a statuscode of 0 if everything ran correctly.\n Otherwise, it will return statuscode 1\n \"\"\"\n success = True\n success &= do_process(\n [\n sys.executable,\n path.join(current_directory, \"tools\", \"static_word_checks.py\"),\n \"--replace\",\n ]\n )\n success &= do_process([\"black .\"], shell=True)\n success &= do_process([\"flake8\", \"--exclude=.eggs,build,docs\"])\n success &= do_process([\"pydocstyle\", \"praw\"])\n # success &= do_process([\"pylint\", \"--rcfile=.pylintrc\", \"praw\"])\n\n tmp_dir = mkdtemp()\n try:\n success &= do_process(\n [\"sphinx-build\", \"-W\", \"--keep-going\", \"docs\", tmp_dir]\n )\n finally:\n rmtree(tmp_dir)\n\n return success\n\n\ndef run_unit():\n \"\"\"Runs the unit-tests.\n\n Follows the behavior of the static tests,\n where any failed tests cause pre_push.py to fail.\n \"\"\"\n return do_process(\n [sys.executable, path.join(current_directory, \"setup.py\"), \"test\"]\n )\n\n\ndef main():\n \"\"\"Runs the main function.\n\n usage: pre_push.py [-h] [-n] [-u] [-a]\n\n Run static and/or unit-tests\n \"\"\"\n parser = argparse.ArgumentParser(\n description=\"Run static and/or unit-tests\"\n )\n parser.add_argument(\n \"-n\",\n \"--unstatic\",\n action=\"store_true\",\n help=\"Do not run static tests (black/flake8/pydocstyle/sphinx-build)\",\n default=False,\n )\n parser.add_argument(\n \"-u\",\n \"--unit-tests\",\n \"--unit\",\n action=\"store_true\",\n default=False,\n help=\"Run the unit tests\",\n )\n parser.add_argument(\n \"-a\",\n \"--all\",\n action=\"store_true\",\n default=False,\n help=\"Run all of the tests (static and unit). \"\n \"Overrides the unstatic argument.\",\n )\n args = parser.parse_args()\n success = True\n try:\n if not args.unstatic or args.all:\n success &= run_static()\n if args.all or args.unit_tests:\n success &= run_unit()\n except KeyboardInterrupt:\n return int(not False)\n return int(not success)\n\n\nif __name__ == \"__main__\":\n exit_code = main()\n print(\n \"\\npre_push.py: Success!\" if not exit_code else \"\\npre_push.py: Fail\"\n )\n sys.exit(exit_code)\n", "path": "pre_push.py"}]}
| 1,372 | 107 |
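A short sketch of the sphinx-build invocation the patch settles on: `-W` promotes warnings to errors, while `--keep-going` makes Sphinx report every warning before exiting non-zero instead of stopping at the first one. The wrapper function and paths are illustrative.

```python
import subprocess
import sys
from shutil import rmtree
from tempfile import mkdtemp


def check_docs(source_dir="docs"):
    build_dir = mkdtemp()
    try:
        # -W: fail the build on any warning; --keep-going: report every
        # warning before failing, so one run surfaces all of them.
        result = subprocess.run(
            ["sphinx-build", "-W", "--keep-going", source_dir, build_dir]
        )
        return result.returncode == 0
    finally:
        rmtree(build_dir)


if __name__ == "__main__":
    sys.exit(0 if check_docs() else 1)
```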
gh_patches_debug_314
|
rasdani/github-patches
|
git_diff
|
zulip__zulip-20788
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
"Pan and zoom" cuts off images instead of using the available space
If you have a tall image and a wide monitor (and wide browser viewport), and you try to zoom… the image stays trapped inside the same box it occupied before you even tried to zoom. If the image is super wide instead of tall, the same thing happens the other way around.
This leads to a lot of frustrating panning around, to look at the different parts of the image through this narrow keyhole, while tons of screen space next to it doesn't get used.
This is the biggest of the issues described by @vanclute in #18939. It was reported again by @alexanderglueck as #19837, and I just ran into it myself ([chat](https://chat.zulip.org/#narrow/stream/6-frontend/topic/pan.2Fzoom/near/1308717)). Here's a nice illustration from #19837:

Instead, when zooming we should use the full space available. This may be bigger than the area the image occupied when it was scaled down to fit completely in the space available, because the available box may have a different aspect ratio from the image.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `version.py`
Content:
```
1 import os
2
3 ZULIP_VERSION = "5.0-dev+git"
4
5 # Add information on number of commits and commit hash to version, if available
6 zulip_git_version_file = os.path.join(
7 os.path.dirname(os.path.abspath(__file__)), "zulip-git-version"
8 )
9 lines = [ZULIP_VERSION, ""]
10 if os.path.exists(zulip_git_version_file):
11 with open(zulip_git_version_file) as f:
12 lines = f.readlines() + ["", ""]
13 ZULIP_VERSION = lines.pop(0).strip()
14 ZULIP_MERGE_BASE = lines.pop(0).strip()
15
16 LATEST_MAJOR_VERSION = "4.0"
17 LATEST_RELEASE_VERSION = "4.9"
18 LATEST_RELEASE_ANNOUNCEMENT = "https://blog.zulip.com/2021/05/13/zulip-4-0-released/"
19
20 # Versions of the desktop app below DESKTOP_MINIMUM_VERSION will be
21 # prevented from connecting to the Zulip server. Versions above
22 # DESKTOP_MINIMUM_VERSION but below DESKTOP_WARNING_VERSION will have
23 # a banner at the top of the page asking the user to upgrade.
24 DESKTOP_MINIMUM_VERSION = "5.2.0"
25 DESKTOP_WARNING_VERSION = "5.4.3"
26
27 # Bump the API_FEATURE_LEVEL whenever an API change is made
28 # that clients might want to condition on. If we forget at
29 # the time we make the change, then bump it later as soon
30 # as we notice; clients using API_FEATURE_LEVEL will just not
31 # use the new feature/API until the bump.
32 #
33 # Changes should be accompanied by documentation explaining what the
34 # new level means in templates/zerver/api/changelog.md, as well as
35 # "**Changes**" entries in the endpoint's documentation in `zulip.yaml`.
36 API_FEATURE_LEVEL = 115
37
38 # Bump the minor PROVISION_VERSION to indicate that folks should provision
39 # only when going from an old version of the code to a newer version. Bump
40 # the major version to indicate that folks should provision in both
41 # directions.
42
43 # Typically,
44 # * adding a dependency only requires a minor version bump;
45 # * removing a dependency requires a major version bump;
46 # * upgrading a dependency requires a major version bump, unless the
47 # upgraded dependency is backwards compatible with all of our
48 # historical commits sharing the same major version, in which case a
49 # minor version bump suffices.
50
51 PROVISION_VERSION = "173.3"
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/version.py b/version.py
--- a/version.py
+++ b/version.py
@@ -48,4 +48,4 @@
# historical commits sharing the same major version, in which case a
# minor version bump suffices.
-PROVISION_VERSION = "173.3"
+PROVISION_VERSION = "173.4"
|
{"golden_diff": "diff --git a/version.py b/version.py\n--- a/version.py\n+++ b/version.py\n@@ -48,4 +48,4 @@\n # historical commits sharing the same major version, in which case a\n # minor version bump suffices.\n \n-PROVISION_VERSION = \"173.3\"\n+PROVISION_VERSION = \"173.4\"\n", "issue": "\"Pan and zoom\" cuts off images instead of using the available space\nIf you have a tall image and a wide monitor (and wide browser viewport), and you try to zoom\u2026 the image stays trapped inside the same box it occupied before you even tried to zoom. If the image is super wide instead of tall, the same thing happens the other way around.\r\n\r\nThis leads to a lot of frustrating panning around, to look at the different parts of the image through this narrow keyhole, while tons of screen space next to it doesn't get used.\r\n\r\nThis is the biggest of the issues described by @vanclute in #18939. It was reported again by @alexanderglueck as #19837, and I just ran into it myself ([chat](https://chat.zulip.org/#narrow/stream/6-frontend/topic/pan.2Fzoom/near/1308717)). Here's a nice illustration from #19837: \r\n\r\n\r\nInstead, when zooming we should use the full space available. This may be bigger than the area the image occupied when it was scaled down to fit completely in the space available, because the available box may have a different aspect ratio from the image.\r\n\n", "before_files": [{"content": "import os\n\nZULIP_VERSION = \"5.0-dev+git\"\n\n# Add information on number of commits and commit hash to version, if available\nzulip_git_version_file = os.path.join(\n os.path.dirname(os.path.abspath(__file__)), \"zulip-git-version\"\n)\nlines = [ZULIP_VERSION, \"\"]\nif os.path.exists(zulip_git_version_file):\n with open(zulip_git_version_file) as f:\n lines = f.readlines() + [\"\", \"\"]\nZULIP_VERSION = lines.pop(0).strip()\nZULIP_MERGE_BASE = lines.pop(0).strip()\n\nLATEST_MAJOR_VERSION = \"4.0\"\nLATEST_RELEASE_VERSION = \"4.9\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.com/2021/05/13/zulip-4-0-released/\"\n\n# Versions of the desktop app below DESKTOP_MINIMUM_VERSION will be\n# prevented from connecting to the Zulip server. Versions above\n# DESKTOP_MINIMUM_VERSION but below DESKTOP_WARNING_VERSION will have\n# a banner at the top of the page asking the user to upgrade.\nDESKTOP_MINIMUM_VERSION = \"5.2.0\"\nDESKTOP_WARNING_VERSION = \"5.4.3\"\n\n# Bump the API_FEATURE_LEVEL whenever an API change is made\n# that clients might want to condition on. If we forget at\n# the time we make the change, then bump it later as soon\n# as we notice; clients using API_FEATURE_LEVEL will just not\n# use the new feature/API until the bump.\n#\n# Changes should be accompanied by documentation explaining what the\n# new level means in templates/zerver/api/changelog.md, as well as\n# \"**Changes**\" entries in the endpoint's documentation in `zulip.yaml`.\nAPI_FEATURE_LEVEL = 115\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. 
Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically,\n# * adding a dependency only requires a minor version bump;\n# * removing a dependency requires a major version bump;\n# * upgrading a dependency requires a major version bump, unless the\n# upgraded dependency is backwards compatible with all of our\n# historical commits sharing the same major version, in which case a\n# minor version bump suffices.\n\nPROVISION_VERSION = \"173.3\"\n", "path": "version.py"}], "after_files": [{"content": "import os\n\nZULIP_VERSION = \"5.0-dev+git\"\n\n# Add information on number of commits and commit hash to version, if available\nzulip_git_version_file = os.path.join(\n os.path.dirname(os.path.abspath(__file__)), \"zulip-git-version\"\n)\nlines = [ZULIP_VERSION, \"\"]\nif os.path.exists(zulip_git_version_file):\n with open(zulip_git_version_file) as f:\n lines = f.readlines() + [\"\", \"\"]\nZULIP_VERSION = lines.pop(0).strip()\nZULIP_MERGE_BASE = lines.pop(0).strip()\n\nLATEST_MAJOR_VERSION = \"4.0\"\nLATEST_RELEASE_VERSION = \"4.9\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.com/2021/05/13/zulip-4-0-released/\"\n\n# Versions of the desktop app below DESKTOP_MINIMUM_VERSION will be\n# prevented from connecting to the Zulip server. Versions above\n# DESKTOP_MINIMUM_VERSION but below DESKTOP_WARNING_VERSION will have\n# a banner at the top of the page asking the user to upgrade.\nDESKTOP_MINIMUM_VERSION = \"5.2.0\"\nDESKTOP_WARNING_VERSION = \"5.4.3\"\n\n# Bump the API_FEATURE_LEVEL whenever an API change is made\n# that clients might want to condition on. If we forget at\n# the time we make the change, then bump it later as soon\n# as we notice; clients using API_FEATURE_LEVEL will just not\n# use the new feature/API until the bump.\n#\n# Changes should be accompanied by documentation explaining what the\n# new level means in templates/zerver/api/changelog.md, as well as\n# \"**Changes**\" entries in the endpoint's documentation in `zulip.yaml`.\nAPI_FEATURE_LEVEL = 115\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically,\n# * adding a dependency only requires a minor version bump;\n# * removing a dependency requires a major version bump;\n# * upgrading a dependency requires a major version bump, unless the\n# upgraded dependency is backwards compatible with all of our\n# historical commits sharing the same major version, in which case a\n# minor version bump suffices.\n\nPROVISION_VERSION = \"173.4\"\n", "path": "version.py"}]}
| 1,224 | 80 |
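For context, a hedged sketch of the version-stamping idiom used in `version.py` above: read an optional file written at build time, padding the line list with blanks so the two `pop` calls never raise even when the file has fewer lines than expected. The filename is illustrative.

```python
import os

VERSION = "1.0-dev+git"
_version_file = os.path.join(
    os.path.dirname(os.path.abspath(__file__)), "git-version"
)

lines = [VERSION, ""]
if os.path.exists(_version_file):
    with open(_version_file) as f:
        # The two trailing blanks guarantee both pop(0) calls succeed
        # even if the file has fewer than two lines.
        lines = f.readlines() + ["", ""]
VERSION = lines.pop(0).strip()
MERGE_BASE = lines.pop(0).strip()
```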
gh_patches_debug_11329
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-center-index-4675
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[package] odbc/2.3.7: sysconfdir is in Conan local cache
<!--
Please don't forget to update the issue title.
Include all applicable information to help us reproduce your problem.
-->
### Package and Environment Details (include every applicable attribute)
* Package Name/Version: **odbc/2.3.7**
### Steps to reproduce (Include if Applicable)
The odbc library has a configuration option called `--sysconfdir` that specifies the default location for configuration files. By default the value is `${prefix}/etc`. The problem is that `${prefix}` ends up being a location in the Conan local cache, for example:
```
/userhome/epederson/.conan/data/odbc/2.3.7/bincrafters/stable/package/761432f3453a17c960259b5b2605153e3bcffd7d/etc
```
This location is only valid on the machine that the package was built on.
The typical value for `sysconfdir` is `/etc`.
I will provide a PR that adds `--sysconfdir=/etc` to the configuration settings.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/odbc/all/conanfile.py`
Content:
```
1 import glob
2 import os
3 from conans import ConanFile, AutoToolsBuildEnvironment, tools
4 from conans.errors import ConanInvalidConfiguration
5
6
7 class OdbcConan(ConanFile):
8 name = "odbc"
9 description = "Package providing unixODBC"
10 topics = ("odbc", "database", "dbms", "data-access")
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "http://www.unixodbc.org"
13 license = ("LGPL-2.1", "GPL-2.1")
14 exports_sources = "patches/**"
15 settings = "os", "compiler", "build_type", "arch"
16 options = {
17 "shared": [True, False],
18 "fPIC": [True, False],
19 "with_libiconv": [True, False]
20 }
21 default_options = {
22 "shared": False,
23 "fPIC": True,
24 "with_libiconv": True
25 }
26
27 _autotools = None
28
29 @property
30 def _source_subfolder(self):
31 return "source_subfolder"
32
33 def configure(self):
34 if self.options.shared:
35 del self.options.fPIC
36 del self.settings.compiler.libcxx
37 del self.settings.compiler.cppstd
38 if self.settings.os == "Windows":
39 raise ConanInvalidConfiguration("Windows not supported yet. Please, open an issue if you need such support")
40
41 def requirements(self):
42 if self.options.with_libiconv:
43 self.requires("libiconv/1.16")
44
45 def source(self):
46 tools.get(**self.conan_data["sources"][self.version])
47 extracted_dir = "unixODBC-%s" % self.version
48 os.rename(extracted_dir, self._source_subfolder)
49
50 def _configure_autotools(self):
51 if self._autotools:
52 return self._autotools
53 self._autotools = AutoToolsBuildEnvironment(self)
54 static_flag = "no" if self.options.shared else "yes"
55 shared_flag = "yes" if self.options.shared else "no"
56 libiconv_flag = "yes" if self.options.with_libiconv else "no"
57 args = ["--enable-static=%s" % static_flag,
58 "--enable-shared=%s" % shared_flag,
59 "--enable-ltdl-install",
60 "--enable-iconv=%s" % libiconv_flag]
61 if self.options.with_libiconv:
62 libiconv_prefix = self.deps_cpp_info["libiconv"].rootpath
63 args.append("--with-libiconv-prefix=%s" % libiconv_prefix)
64 self._autotools.configure(configure_dir=self._source_subfolder, args=args)
65 return self._autotools
66
67 def build(self):
68 for patch in self.conan_data.get("patches", {}).get(self.version, []):
69 tools.patch(**patch)
70 autotools = self._configure_autotools()
71 autotools.make()
72
73 def package(self):
74 self.copy("COPYING", src=self._source_subfolder, dst="licenses")
75 autotools = self._configure_autotools()
76 autotools.install()
77 tools.rmdir(os.path.join(self.package_folder, "share"))
78 tools.rmdir(os.path.join(self.package_folder, "etc"))
79 tools.rmdir(os.path.join(self.package_folder, "lib", "pkgconfig"))
80 for la_file in glob.glob(os.path.join(self.package_folder, "lib", "*.la")):
81 os.remove(la_file)
82
83 def package_info(self):
84 self.cpp_info.names["cmake_find_package"] = "ODBC"
85 self.cpp_info.names["cmake_find_package_multi"] = "ODBC"
86 # odbc
87 self.cpp_info.components["_odbc"].names["pkg_config"] = "odbc"
88 self.cpp_info.components["_odbc"].libs = ["odbc"]
89 self.cpp_info.components["_odbc"].requires = ["odbcltdl"]
90 if self.options.with_libiconv:
91 self.cpp_info.components["_odbc"].requires.append("libiconv::libiconv")
92 # odbcinst
93 self.cpp_info.components["odbcinst"].names["pkg_config"] = "odbcinst"
94 self.cpp_info.components["odbcinst"].libs = ["odbcinst"]
95 self.cpp_info.components["odbcinst"].requires = ["odbcltdl"]
96 # odbccr
97 self.cpp_info.components["odbccr"].names["pkg_config"] = "odbccr"
98 self.cpp_info.components["odbccr"].libs = ["odbccr"]
99
100 self.cpp_info.components["odbcltdl"].libs = ["ltdl"]
101
102 if self.settings.os == "Linux":
103 self.cpp_info.components["_odbc"].system_libs = ["pthread"]
104 self.cpp_info.components["odbcinst"].system_libs = ["pthread"]
105 self.cpp_info.components["odbcltdl"].system_libs = ["dl"]
106
107 bin_path = os.path.join(self.package_folder, "bin")
108 self.output.info("Appending PATH environment variable: {}".format(bin_path))
109 self.env_info.PATH.append(bin_path)
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/recipes/odbc/all/conanfile.py b/recipes/odbc/all/conanfile.py
--- a/recipes/odbc/all/conanfile.py
+++ b/recipes/odbc/all/conanfile.py
@@ -57,7 +57,8 @@
args = ["--enable-static=%s" % static_flag,
"--enable-shared=%s" % shared_flag,
"--enable-ltdl-install",
- "--enable-iconv=%s" % libiconv_flag]
+ "--enable-iconv=%s" % libiconv_flag,
+ "--sysconfdir=/etc"]
if self.options.with_libiconv:
libiconv_prefix = self.deps_cpp_info["libiconv"].rootpath
args.append("--with-libiconv-prefix=%s" % libiconv_prefix)
|
{"golden_diff": "diff --git a/recipes/odbc/all/conanfile.py b/recipes/odbc/all/conanfile.py\n--- a/recipes/odbc/all/conanfile.py\n+++ b/recipes/odbc/all/conanfile.py\n@@ -57,7 +57,8 @@\n args = [\"--enable-static=%s\" % static_flag,\n \"--enable-shared=%s\" % shared_flag,\n \"--enable-ltdl-install\",\n- \"--enable-iconv=%s\" % libiconv_flag]\n+ \"--enable-iconv=%s\" % libiconv_flag,\n+ \"--sysconfdir=/etc\"]\n if self.options.with_libiconv:\n libiconv_prefix = self.deps_cpp_info[\"libiconv\"].rootpath\n args.append(\"--with-libiconv-prefix=%s\" % libiconv_prefix)\n", "issue": "[package] odbc/2.3.7: sysconfdir is in Conan local cache\n<!-- \r\n Please don't forget to update the issue title.\r\n Include all applicable information to help us reproduce your problem.\r\n-->\r\n\r\n### Package and Environment Details (include every applicable attribute)\r\n * Package Name/Version: **odbc/2.3.7**\r\n\r\n### Steps to reproduce (Include if Applicable)\r\n\r\nThe odbc library has a configuration option called `--sysconfdir` that specifies the default location for configuration files. By default the value is `${prefix}/etc`. The problem is that `${prefix}` ends up being a location in the Conan local cache, for example:\r\n```\r\n/userhome/epederson/.conan/data/odbc/2.3.7/bincrafters/stable/package/761432f3453a17c960259b5b2605153e3bcffd7d/etc\r\n```\r\nThis location is only valid on the machine that the package was built on. \r\n\r\nThe typical value for `sysconfdir` is `/etc`. \r\n\r\nI will provide a PR that adds `--sysconfdir=/etc` to the configuration settings.\n", "before_files": [{"content": "import glob\nimport os\nfrom conans import ConanFile, AutoToolsBuildEnvironment, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass OdbcConan(ConanFile):\n name = \"odbc\"\n description = \"Package providing unixODBC\"\n topics = (\"odbc\", \"database\", \"dbms\", \"data-access\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"http://www.unixodbc.org\"\n license = (\"LGPL-2.1\", \"GPL-2.1\")\n exports_sources = \"patches/**\"\n settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"with_libiconv\": [True, False]\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"with_libiconv\": True\n }\n\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n if self.settings.os == \"Windows\":\n raise ConanInvalidConfiguration(\"Windows not supported yet. 
Please, open an issue if you need such support\")\n\n def requirements(self):\n if self.options.with_libiconv:\n self.requires(\"libiconv/1.16\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = \"unixODBC-%s\" % self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def _configure_autotools(self):\n if self._autotools:\n return self._autotools\n self._autotools = AutoToolsBuildEnvironment(self)\n static_flag = \"no\" if self.options.shared else \"yes\"\n shared_flag = \"yes\" if self.options.shared else \"no\"\n libiconv_flag = \"yes\" if self.options.with_libiconv else \"no\"\n args = [\"--enable-static=%s\" % static_flag,\n \"--enable-shared=%s\" % shared_flag,\n \"--enable-ltdl-install\",\n \"--enable-iconv=%s\" % libiconv_flag]\n if self.options.with_libiconv:\n libiconv_prefix = self.deps_cpp_info[\"libiconv\"].rootpath\n args.append(\"--with-libiconv-prefix=%s\" % libiconv_prefix)\n self._autotools.configure(configure_dir=self._source_subfolder, args=args)\n return self._autotools\n\n def build(self):\n for patch in self.conan_data.get(\"patches\", {}).get(self.version, []):\n tools.patch(**patch)\n autotools = self._configure_autotools()\n autotools.make()\n\n def package(self):\n self.copy(\"COPYING\", src=self._source_subfolder, dst=\"licenses\")\n autotools = self._configure_autotools()\n autotools.install()\n tools.rmdir(os.path.join(self.package_folder, \"share\"))\n tools.rmdir(os.path.join(self.package_folder, \"etc\"))\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n for la_file in glob.glob(os.path.join(self.package_folder, \"lib\", \"*.la\")):\n os.remove(la_file)\n\n def package_info(self):\n self.cpp_info.names[\"cmake_find_package\"] = \"ODBC\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"ODBC\"\n # odbc\n self.cpp_info.components[\"_odbc\"].names[\"pkg_config\"] = \"odbc\"\n self.cpp_info.components[\"_odbc\"].libs = [\"odbc\"]\n self.cpp_info.components[\"_odbc\"].requires = [\"odbcltdl\"]\n if self.options.with_libiconv:\n self.cpp_info.components[\"_odbc\"].requires.append(\"libiconv::libiconv\")\n # odbcinst\n self.cpp_info.components[\"odbcinst\"].names[\"pkg_config\"] = \"odbcinst\"\n self.cpp_info.components[\"odbcinst\"].libs = [\"odbcinst\"]\n self.cpp_info.components[\"odbcinst\"].requires = [\"odbcltdl\"]\n # odbccr\n self.cpp_info.components[\"odbccr\"].names[\"pkg_config\"] = \"odbccr\"\n self.cpp_info.components[\"odbccr\"].libs = [\"odbccr\"]\n\n self.cpp_info.components[\"odbcltdl\"].libs = [\"ltdl\"]\n\n if self.settings.os == \"Linux\":\n self.cpp_info.components[\"_odbc\"].system_libs = [\"pthread\"]\n self.cpp_info.components[\"odbcinst\"].system_libs = [\"pthread\"]\n self.cpp_info.components[\"odbcltdl\"].system_libs = [\"dl\"]\n\n bin_path = os.path.join(self.package_folder, \"bin\")\n self.output.info(\"Appending PATH environment variable: {}\".format(bin_path))\n self.env_info.PATH.append(bin_path)\n", "path": "recipes/odbc/all/conanfile.py"}], "after_files": [{"content": "import glob\nimport os\nfrom conans import ConanFile, AutoToolsBuildEnvironment, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass OdbcConan(ConanFile):\n name = \"odbc\"\n description = \"Package providing unixODBC\"\n topics = (\"odbc\", \"database\", \"dbms\", \"data-access\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"http://www.unixodbc.org\"\n license = (\"LGPL-2.1\", \"GPL-2.1\")\n exports_sources = \"patches/**\"\n 
settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"with_libiconv\": [True, False]\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"with_libiconv\": True\n }\n\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n if self.settings.os == \"Windows\":\n raise ConanInvalidConfiguration(\"Windows not supported yet. Please, open an issue if you need such support\")\n\n def requirements(self):\n if self.options.with_libiconv:\n self.requires(\"libiconv/1.16\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = \"unixODBC-%s\" % self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def _configure_autotools(self):\n if self._autotools:\n return self._autotools\n self._autotools = AutoToolsBuildEnvironment(self)\n static_flag = \"no\" if self.options.shared else \"yes\"\n shared_flag = \"yes\" if self.options.shared else \"no\"\n libiconv_flag = \"yes\" if self.options.with_libiconv else \"no\"\n args = [\"--enable-static=%s\" % static_flag,\n \"--enable-shared=%s\" % shared_flag,\n \"--enable-ltdl-install\",\n \"--enable-iconv=%s\" % libiconv_flag,\n \"--sysconfdir=/etc\"]\n if self.options.with_libiconv:\n libiconv_prefix = self.deps_cpp_info[\"libiconv\"].rootpath\n args.append(\"--with-libiconv-prefix=%s\" % libiconv_prefix)\n self._autotools.configure(configure_dir=self._source_subfolder, args=args)\n return self._autotools\n\n def build(self):\n for patch in self.conan_data.get(\"patches\", {}).get(self.version, []):\n tools.patch(**patch)\n autotools = self._configure_autotools()\n autotools.make()\n\n def package(self):\n self.copy(\"COPYING\", src=self._source_subfolder, dst=\"licenses\")\n autotools = self._configure_autotools()\n autotools.install()\n tools.rmdir(os.path.join(self.package_folder, \"share\"))\n tools.rmdir(os.path.join(self.package_folder, \"etc\"))\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n for la_file in glob.glob(os.path.join(self.package_folder, \"lib\", \"*.la\")):\n os.remove(la_file)\n\n def package_info(self):\n self.cpp_info.names[\"cmake_find_package\"] = \"ODBC\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"ODBC\"\n # odbc\n self.cpp_info.components[\"_odbc\"].names[\"pkg_config\"] = \"odbc\"\n self.cpp_info.components[\"_odbc\"].libs = [\"odbc\"]\n self.cpp_info.components[\"_odbc\"].requires = [\"odbcltdl\"]\n if self.options.with_libiconv:\n self.cpp_info.components[\"_odbc\"].requires.append(\"libiconv::libiconv\")\n # odbcinst\n self.cpp_info.components[\"odbcinst\"].names[\"pkg_config\"] = \"odbcinst\"\n self.cpp_info.components[\"odbcinst\"].libs = [\"odbcinst\"]\n self.cpp_info.components[\"odbcinst\"].requires = [\"odbcltdl\"]\n # odbccr\n self.cpp_info.components[\"odbccr\"].names[\"pkg_config\"] = \"odbccr\"\n self.cpp_info.components[\"odbccr\"].libs = [\"odbccr\"]\n\n self.cpp_info.components[\"odbcltdl\"].libs = [\"ltdl\"]\n\n if self.settings.os == \"Linux\":\n self.cpp_info.components[\"_odbc\"].system_libs = [\"pthread\"]\n self.cpp_info.components[\"odbcinst\"].system_libs = [\"pthread\"]\n self.cpp_info.components[\"odbcltdl\"].system_libs = [\"dl\"]\n\n bin_path = os.path.join(self.package_folder, \"bin\")\n self.output.info(\"Appending PATH environment 
variable: {}\".format(bin_path))\n self.env_info.PATH.append(bin_path)\n", "path": "recipes/odbc/all/conanfile.py"}]}
| 1,859 | 182 |
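A minimal, self-contained sketch of the fix above. The `build_configure_args` helper and its signature are hypothetical stand-ins (the real recipe assembles `args` inline inside `_configure_autotools`); the point is only that `--sysconfdir=/etc` is appended unconditionally, so unixODBC looks for its configuration under `/etc` instead of a path baked into the per-machine Conan cache:

```python
# Hypothetical helper mirroring the patched argument list; not part of the recipe.
def build_configure_args(shared: bool, with_libiconv: bool) -> list[str]:
    static_flag = "no" if shared else "yes"
    shared_flag = "yes" if shared else "no"
    libiconv_flag = "yes" if with_libiconv else "no"
    return [
        "--enable-static=%s" % static_flag,
        "--enable-shared=%s" % shared_flag,
        "--enable-ltdl-install",
        "--enable-iconv=%s" % libiconv_flag,
        "--sysconfdir=/etc",  # keep config lookup out of the local-cache prefix
    ]


if __name__ == "__main__":
    args = build_configure_args(shared=False, with_libiconv=True)
    assert "--sysconfdir=/etc" in args
    print(args)
```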
gh_patches_debug_10406
|
rasdani/github-patches
|
git_diff
|
iterative__dvc-9925
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
unnecessary remote repository fetches
DVC has started cloning external repositories even in cases where it's not strictly required. 
Anything that uses `outs_graph` has started cloning external repositories.
Commands like `add/plots modify/pull/push/fetch/move` are affected, and potentially others.
### Reproduce
```bash
git clone https://github.com/iterative/example-get-started.git
cd example-get-started
dvc add foo
```
### Expected
No cloning occurs.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dvc/repo/graph.py`
Content:
```
1 from typing import TYPE_CHECKING, Any, Iterator, List, Optional, Set, TypeVar
2
3 from dvc.fs import localfs
4 from dvc.utils.fs import path_isin
5
6 if TYPE_CHECKING:
7 from networkx import DiGraph
8
9 from dvc.stage import Stage
10
11 T = TypeVar("T")
12
13
14 def check_acyclic(graph: "DiGraph") -> None:
15 import networkx as nx
16
17 from dvc.exceptions import CyclicGraphError
18
19 try:
20 edges = nx.find_cycle(graph, orientation="original")
21 except nx.NetworkXNoCycle:
22 return
23
24 stages: Set["Stage"] = set()
25 for from_node, to_node, _ in edges:
26 stages.add(from_node)
27 stages.add(to_node)
28
29 raise CyclicGraphError(list(stages))
30
31
32 def get_pipeline(pipelines, node):
33 found = [i for i in pipelines if i.has_node(node)]
34 if not found:
35 return None
36
37 assert len(found) == 1
38 return found[0]
39
40
41 def get_pipelines(graph: "DiGraph"):
42 import networkx as nx
43
44 return [graph.subgraph(c).copy() for c in nx.weakly_connected_components(graph)]
45
46
47 def get_subgraph_of_nodes(
48 graph: "DiGraph", sources: Optional[List[Any]] = None, downstream: bool = False
49 ) -> "DiGraph":
50 from networkx import dfs_postorder_nodes, reverse_view
51
52 if not sources:
53 return graph
54
55 g = reverse_view(graph) if downstream else graph
56 nodes = []
57 for source in sources:
58 nodes.extend(dfs_postorder_nodes(g, source))
59 return graph.subgraph(nodes)
60
61
62 def collect_pipeline(stage: "Stage", graph: "DiGraph") -> Iterator["Stage"]:
63 import networkx as nx
64
65 pipeline = get_pipeline(get_pipelines(graph), stage)
66 if not pipeline:
67 return iter([])
68
69 return nx.dfs_postorder_nodes(pipeline, stage)
70
71
72 def collect_inside_path(path: str, graph: "DiGraph") -> List["Stage"]:
73 import networkx as nx
74
75 stages = nx.dfs_postorder_nodes(graph)
76 return [stage for stage in stages if path_isin(stage.path, path)]
77
78
79 def build_graph(stages, outs_trie=None):
80 """Generate a graph by using the given stages on the given directory
81
82 The nodes of the graph are the stage's path relative to the root.
83
84 Edges are created when the output of one stage is used as a
85 dependency in other stage.
86
87 The direction of the edges goes from the stage to its dependency:
88
89 For example, running the following:
90
91 $ dvc run -o A "echo A > A"
92 $ dvc run -d A -o B "echo B > B"
93 $ dvc run -d B -o C "echo C > C"
94
95 Will create the following graph:
96
97 ancestors <--
98 |
99 C.dvc -> B.dvc -> A.dvc
100 | |
101 | --> descendants
102 |
103 ------- pipeline ------>
104 |
105 v
106 (weakly connected components)
107
108 Args:
109 stages (list): used to build a graph from
110
111 Raises:
112 OutputDuplicationError: two outputs with the same path
113 StagePathAsOutputError: stage inside an output directory
114 OverlappingOutputPathsError: output inside output directory
115 CyclicGraphError: resulting graph has cycles
116 """
117 import networkx as nx
118
119 from dvc.exceptions import StagePathAsOutputError
120
121 from .trie import build_outs_trie
122
123 graph = nx.DiGraph()
124
125 # Use trie to efficiently find overlapping outs and deps
126 outs_trie = outs_trie or build_outs_trie(stages)
127
128 for stage in stages:
129 out = outs_trie.shortest_prefix(localfs.path.parts(stage.path)).value
130 if out:
131 raise StagePathAsOutputError(stage, str(out))
132
133 # Building graph
134 graph.add_nodes_from(stages)
135 for stage in stages:
136 if stage.is_repo_import:
137 continue
138
139 for dep in stage.deps:
140 dep_key = dep.fs.path.parts(dep.fs_path)
141 overlapping = [n.value for n in outs_trie.prefixes(dep_key)]
142 if outs_trie.has_subtrie(dep_key):
143 overlapping.extend(outs_trie.values(prefix=dep_key))
144
145 graph.add_edges_from((stage, out.stage) for out in overlapping)
146 check_acyclic(graph)
147
148 return graph
149
150
151 # NOTE: using stage graph instead of just list of stages to make sure that it
152 # has already passed all the sanity checks like cycles/overlapping outputs and
153 # so on.
154 def build_outs_graph(graph, outs_trie):
155 import networkx as nx
156
157 outs_graph = nx.DiGraph()
158
159 outs_graph.add_nodes_from(outs_trie.values())
160 for stage in graph.nodes():
161 for dep in stage.deps:
162 if dep.fs_path is None:
163 # RepoDependency don't have a path
164 continue
165 dep_key = dep.fs.path.parts(dep.fs_path)
166 overlapping = [n.value for n in outs_trie.prefixes(dep_key)]
167 if outs_trie.has_subtrie(dep_key):
168 overlapping.extend(outs_trie.values(prefix=dep_key))
169
170 for from_out in stage.outs:
171 outs_graph.add_edges_from((from_out, out) for out in overlapping)
172 return outs_graph
173
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/dvc/repo/graph.py b/dvc/repo/graph.py
--- a/dvc/repo/graph.py
+++ b/dvc/repo/graph.py
@@ -158,10 +158,9 @@
outs_graph.add_nodes_from(outs_trie.values())
for stage in graph.nodes():
+ if stage.is_repo_import:
+ continue
for dep in stage.deps:
- if dep.fs_path is None:
- # RepoDependency don't have a path
- continue
dep_key = dep.fs.path.parts(dep.fs_path)
overlapping = [n.value for n in outs_trie.prefixes(dep_key)]
if outs_trie.has_subtrie(dep_key):
|
{"golden_diff": "diff --git a/dvc/repo/graph.py b/dvc/repo/graph.py\n--- a/dvc/repo/graph.py\n+++ b/dvc/repo/graph.py\n@@ -158,10 +158,9 @@\n \n outs_graph.add_nodes_from(outs_trie.values())\n for stage in graph.nodes():\n+ if stage.is_repo_import:\n+ continue\n for dep in stage.deps:\n- if dep.fs_path is None:\n- # RepoDependency don't have a path\n- continue\n dep_key = dep.fs.path.parts(dep.fs_path)\n overlapping = [n.value for n in outs_trie.prefixes(dep_key)]\n if outs_trie.has_subtrie(dep_key):\n", "issue": "unnecessary remote repository fetches\nDVC has started cloning external repositories even on cases where it's not strictly required. \r\nAnything that uses `outs_graph` has started cloning external repositories.\r\n\r\nCommands like `add/plots modify/pull/push/fetch/move` are affected, and maybe potentially others.\r\n\r\n### Reproduce\r\n\r\n```bash\r\ngit clone https://github.com/iterative/example-get-started.git\r\ncd example-get-started\r\ndvc add foo\r\n```\r\n\r\n### Expected\r\n\r\nNo cloning occurs.\r\n\r\n\r\n\n", "before_files": [{"content": "from typing import TYPE_CHECKING, Any, Iterator, List, Optional, Set, TypeVar\n\nfrom dvc.fs import localfs\nfrom dvc.utils.fs import path_isin\n\nif TYPE_CHECKING:\n from networkx import DiGraph\n\n from dvc.stage import Stage\n\nT = TypeVar(\"T\")\n\n\ndef check_acyclic(graph: \"DiGraph\") -> None:\n import networkx as nx\n\n from dvc.exceptions import CyclicGraphError\n\n try:\n edges = nx.find_cycle(graph, orientation=\"original\")\n except nx.NetworkXNoCycle:\n return\n\n stages: Set[\"Stage\"] = set()\n for from_node, to_node, _ in edges:\n stages.add(from_node)\n stages.add(to_node)\n\n raise CyclicGraphError(list(stages))\n\n\ndef get_pipeline(pipelines, node):\n found = [i for i in pipelines if i.has_node(node)]\n if not found:\n return None\n\n assert len(found) == 1\n return found[0]\n\n\ndef get_pipelines(graph: \"DiGraph\"):\n import networkx as nx\n\n return [graph.subgraph(c).copy() for c in nx.weakly_connected_components(graph)]\n\n\ndef get_subgraph_of_nodes(\n graph: \"DiGraph\", sources: Optional[List[Any]] = None, downstream: bool = False\n) -> \"DiGraph\":\n from networkx import dfs_postorder_nodes, reverse_view\n\n if not sources:\n return graph\n\n g = reverse_view(graph) if downstream else graph\n nodes = []\n for source in sources:\n nodes.extend(dfs_postorder_nodes(g, source))\n return graph.subgraph(nodes)\n\n\ndef collect_pipeline(stage: \"Stage\", graph: \"DiGraph\") -> Iterator[\"Stage\"]:\n import networkx as nx\n\n pipeline = get_pipeline(get_pipelines(graph), stage)\n if not pipeline:\n return iter([])\n\n return nx.dfs_postorder_nodes(pipeline, stage)\n\n\ndef collect_inside_path(path: str, graph: \"DiGraph\") -> List[\"Stage\"]:\n import networkx as nx\n\n stages = nx.dfs_postorder_nodes(graph)\n return [stage for stage in stages if path_isin(stage.path, path)]\n\n\ndef build_graph(stages, outs_trie=None):\n \"\"\"Generate a graph by using the given stages on the given directory\n\n The nodes of the graph are the stage's path relative to the root.\n\n Edges are created when the output of one stage is used as a\n dependency in other stage.\n\n The direction of the edges goes from the stage to its dependency:\n\n For example, running the following:\n\n $ dvc run -o A \"echo A > A\"\n $ dvc run -d A -o B \"echo B > B\"\n $ dvc run -d B -o C \"echo C > C\"\n\n Will create the following graph:\n\n ancestors <--\n |\n C.dvc -> B.dvc -> A.dvc\n | |\n | --> descendants\n |\n ------- pipeline ------>\n |\n 
v\n (weakly connected components)\n\n Args:\n stages (list): used to build a graph from\n\n Raises:\n OutputDuplicationError: two outputs with the same path\n StagePathAsOutputError: stage inside an output directory\n OverlappingOutputPathsError: output inside output directory\n CyclicGraphError: resulting graph has cycles\n \"\"\"\n import networkx as nx\n\n from dvc.exceptions import StagePathAsOutputError\n\n from .trie import build_outs_trie\n\n graph = nx.DiGraph()\n\n # Use trie to efficiently find overlapping outs and deps\n outs_trie = outs_trie or build_outs_trie(stages)\n\n for stage in stages:\n out = outs_trie.shortest_prefix(localfs.path.parts(stage.path)).value\n if out:\n raise StagePathAsOutputError(stage, str(out))\n\n # Building graph\n graph.add_nodes_from(stages)\n for stage in stages:\n if stage.is_repo_import:\n continue\n\n for dep in stage.deps:\n dep_key = dep.fs.path.parts(dep.fs_path)\n overlapping = [n.value for n in outs_trie.prefixes(dep_key)]\n if outs_trie.has_subtrie(dep_key):\n overlapping.extend(outs_trie.values(prefix=dep_key))\n\n graph.add_edges_from((stage, out.stage) for out in overlapping)\n check_acyclic(graph)\n\n return graph\n\n\n# NOTE: using stage graph instead of just list of stages to make sure that it\n# has already passed all the sanity checks like cycles/overlapping outputs and\n# so on.\ndef build_outs_graph(graph, outs_trie):\n import networkx as nx\n\n outs_graph = nx.DiGraph()\n\n outs_graph.add_nodes_from(outs_trie.values())\n for stage in graph.nodes():\n for dep in stage.deps:\n if dep.fs_path is None:\n # RepoDependency don't have a path\n continue\n dep_key = dep.fs.path.parts(dep.fs_path)\n overlapping = [n.value for n in outs_trie.prefixes(dep_key)]\n if outs_trie.has_subtrie(dep_key):\n overlapping.extend(outs_trie.values(prefix=dep_key))\n\n for from_out in stage.outs:\n outs_graph.add_edges_from((from_out, out) for out in overlapping)\n return outs_graph\n", "path": "dvc/repo/graph.py"}], "after_files": [{"content": "from typing import TYPE_CHECKING, Any, Iterator, List, Optional, Set, TypeVar\n\nfrom dvc.fs import localfs\nfrom dvc.utils.fs import path_isin\n\nif TYPE_CHECKING:\n from networkx import DiGraph\n\n from dvc.stage import Stage\n\nT = TypeVar(\"T\")\n\n\ndef check_acyclic(graph: \"DiGraph\") -> None:\n import networkx as nx\n\n from dvc.exceptions import CyclicGraphError\n\n try:\n edges = nx.find_cycle(graph, orientation=\"original\")\n except nx.NetworkXNoCycle:\n return\n\n stages: Set[\"Stage\"] = set()\n for from_node, to_node, _ in edges:\n stages.add(from_node)\n stages.add(to_node)\n\n raise CyclicGraphError(list(stages))\n\n\ndef get_pipeline(pipelines, node):\n found = [i for i in pipelines if i.has_node(node)]\n if not found:\n return None\n\n assert len(found) == 1\n return found[0]\n\n\ndef get_pipelines(graph: \"DiGraph\"):\n import networkx as nx\n\n return [graph.subgraph(c).copy() for c in nx.weakly_connected_components(graph)]\n\n\ndef get_subgraph_of_nodes(\n graph: \"DiGraph\", sources: Optional[List[Any]] = None, downstream: bool = False\n) -> \"DiGraph\":\n from networkx import dfs_postorder_nodes, reverse_view\n\n if not sources:\n return graph\n\n g = reverse_view(graph) if downstream else graph\n nodes = []\n for source in sources:\n nodes.extend(dfs_postorder_nodes(g, source))\n return graph.subgraph(nodes)\n\n\ndef collect_pipeline(stage: \"Stage\", graph: \"DiGraph\") -> Iterator[\"Stage\"]:\n import networkx as nx\n\n pipeline = get_pipeline(get_pipelines(graph), stage)\n if not 
pipeline:\n return iter([])\n\n return nx.dfs_postorder_nodes(pipeline, stage)\n\n\ndef collect_inside_path(path: str, graph: \"DiGraph\") -> List[\"Stage\"]:\n import networkx as nx\n\n stages = nx.dfs_postorder_nodes(graph)\n return [stage for stage in stages if path_isin(stage.path, path)]\n\n\ndef build_graph(stages, outs_trie=None):\n \"\"\"Generate a graph by using the given stages on the given directory\n\n The nodes of the graph are the stage's path relative to the root.\n\n Edges are created when the output of one stage is used as a\n dependency in other stage.\n\n The direction of the edges goes from the stage to its dependency:\n\n For example, running the following:\n\n $ dvc run -o A \"echo A > A\"\n $ dvc run -d A -o B \"echo B > B\"\n $ dvc run -d B -o C \"echo C > C\"\n\n Will create the following graph:\n\n ancestors <--\n |\n C.dvc -> B.dvc -> A.dvc\n | |\n | --> descendants\n |\n ------- pipeline ------>\n |\n v\n (weakly connected components)\n\n Args:\n stages (list): used to build a graph from\n\n Raises:\n OutputDuplicationError: two outputs with the same path\n StagePathAsOutputError: stage inside an output directory\n OverlappingOutputPathsError: output inside output directory\n CyclicGraphError: resulting graph has cycles\n \"\"\"\n import networkx as nx\n\n from dvc.exceptions import StagePathAsOutputError\n\n from .trie import build_outs_trie\n\n graph = nx.DiGraph()\n\n # Use trie to efficiently find overlapping outs and deps\n outs_trie = outs_trie or build_outs_trie(stages)\n\n for stage in stages:\n out = outs_trie.shortest_prefix(localfs.path.parts(stage.path)).value\n if out:\n raise StagePathAsOutputError(stage, str(out))\n\n # Building graph\n graph.add_nodes_from(stages)\n for stage in stages:\n if stage.is_repo_import:\n continue\n\n for dep in stage.deps:\n dep_key = dep.fs.path.parts(dep.fs_path)\n overlapping = [n.value for n in outs_trie.prefixes(dep_key)]\n if outs_trie.has_subtrie(dep_key):\n overlapping.extend(outs_trie.values(prefix=dep_key))\n\n graph.add_edges_from((stage, out.stage) for out in overlapping)\n check_acyclic(graph)\n\n return graph\n\n\n# NOTE: using stage graph instead of just list of stages to make sure that it\n# has already passed all the sanity checks like cycles/overlapping outputs and\n# so on.\ndef build_outs_graph(graph, outs_trie):\n import networkx as nx\n\n outs_graph = nx.DiGraph()\n\n outs_graph.add_nodes_from(outs_trie.values())\n for stage in graph.nodes():\n if stage.is_repo_import:\n continue\n for dep in stage.deps:\n dep_key = dep.fs.path.parts(dep.fs_path)\n overlapping = [n.value for n in outs_trie.prefixes(dep_key)]\n if outs_trie.has_subtrie(dep_key):\n overlapping.extend(outs_trie.values(prefix=dep_key))\n\n for from_out in stage.outs:\n outs_graph.add_edges_from((from_out, out) for out in overlapping)\n return outs_graph\n", "path": "dvc/repo/graph.py"}]}
| 1,973 | 155 |
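Why moving the check matters, shown as a toy sketch (all names below are hypothetical, not DVC classes): on a repo-import dependency, merely evaluating `dep.fs` can materialize the external repository, so the fix filters on the cheap `stage.is_repo_import` flag before any dependency attribute is touched:

```python
class FakeRepoDep:
    """Stand-in for a RepoDependency whose `fs` property is expensive."""
    fs_path = None

    @property
    def fs(self):
        raise RuntimeError("would clone the external repository here")


class FakeStage:
    def __init__(self, is_repo_import, deps):
        self.is_repo_import = is_repo_import
        self.deps = deps


def deps_to_walk(stages):
    for stage in stages:
        if stage.is_repo_import:  # cheap flag check; `dep.fs` is never touched
            continue
        yield from stage.deps


stages = [FakeStage(True, [FakeRepoDep()]), FakeStage(False, [])]
assert list(deps_to_walk(stages)) == []  # no clone, no deps from the import
```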
gh_patches_debug_7854
|
rasdani/github-patches
|
git_diff
|
coala__coala-4969
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add docstring for coala_modes.py mode_normal
This function should have a proper docstring and indicate that `log_printer` is unused and deprecated.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `coalib/coala_modes.py`
Content:
```
1 def mode_normal(console_printer, log_printer, args, debug=False):
2 import functools
3
4 from coalib.coala_main import run_coala
5 from coalib.output.ConsoleInteraction import (
6 acquire_settings, nothing_done,
7 print_results, print_section_beginning)
8
9 partial_print_sec_beg = functools.partial(
10 print_section_beginning,
11 console_printer)
12 results, exitcode, _ = run_coala(
13 print_results=print_results,
14 acquire_settings=acquire_settings,
15 print_section_beginning=partial_print_sec_beg,
16 nothing_done=nothing_done,
17 console_printer=console_printer,
18 args=args,
19 debug=debug)
20
21 return exitcode
22
23
24 def mode_non_interactive(console_printer, args, debug=False):
25 import functools
26
27 from coalib.coala_main import run_coala
28 from coalib.output.ConsoleInteraction import (
29 print_results_no_input, print_section_beginning)
30
31 partial_print_sec_beg = functools.partial(
32 print_section_beginning,
33 console_printer)
34 results, exitcode, _ = run_coala(
35 print_results=print_results_no_input,
36 print_section_beginning=partial_print_sec_beg,
37 force_show_patch=True,
38 console_printer=console_printer,
39 args=args,
40 debug=debug)
41
42 return exitcode
43
44
45 def mode_json(args, debug=False):
46 import json
47
48 from coalib.coala_main import run_coala
49 from coalib.output.Logging import configure_json_logging
50 from coalib.output.JSONEncoder import create_json_encoder
51
52 if args.log_json:
53 log_stream = configure_json_logging()
54
55 JSONEncoder = create_json_encoder(use_relpath=args.relpath)
56
57 results, exitcode, _ = run_coala(args=args, debug=debug)
58
59 retval = {'results': results}
60
61 if args.log_json:
62 retval['logs'] = [json.loads(line) for line in
63 log_stream.getvalue().splitlines()]
64
65 if args.output:
66 filename = str(args.output[0])
67 with open(filename, 'w') as fp:
68 json.dump(retval, fp,
69 cls=JSONEncoder,
70 sort_keys=True,
71 indent=2,
72 separators=(',', ': '))
73 else:
74 print(json.dumps(retval,
75 cls=JSONEncoder,
76 sort_keys=True,
77 indent=2,
78 separators=(',', ': ')))
79
80 return 0 if args.show_bears else exitcode
81
82
83 def mode_format(args, debug=False):
84 from coalib.coala_main import run_coala
85 from coalib.output.ConsoleInteraction import print_results_formatted
86
87 _, exitcode, _ = run_coala(
88 print_results=print_results_formatted, args=args, debug=debug)
89 return exitcode
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/coalib/coala_modes.py b/coalib/coala_modes.py
--- a/coalib/coala_modes.py
+++ b/coalib/coala_modes.py
@@ -1,4 +1,13 @@
def mode_normal(console_printer, log_printer, args, debug=False):
+ """
+ This is the default coala mode. User interaction is allowed in this mode.
+
+ :param console_printer: Object to print messages on the console.
+ :param log_printer: Deprecated.
+ :param args: Alternative pre-parsed CLI arguments.
+ :param debug: Run in debug mode, bypassing multiprocessing,
+ and not catching any exceptions.
+ """
import functools
from coalib.coala_main import run_coala
|
{"golden_diff": "diff --git a/coalib/coala_modes.py b/coalib/coala_modes.py\n--- a/coalib/coala_modes.py\n+++ b/coalib/coala_modes.py\n@@ -1,4 +1,13 @@\n def mode_normal(console_printer, log_printer, args, debug=False):\n+ \"\"\"\n+ This is the default coala mode. User interaction is allowed in this mode.\n+\n+ :param console_printer: Object to print messages on the console.\n+ :param log_printer: Deprecated.\n+ :param args: Alternative pre-parsed CLI arguments.\n+ :param debug: Run in debug mode, bypassing multiprocessing,\n+ and not catching any exceptions.\n+ \"\"\"\n import functools\n \n from coalib.coala_main import run_coala\n", "issue": "Add docstring for coala_modes.py mode_normal\nThis function should have a proper docstring, and indicate that `log_printer` is unused & deprecated.\n", "before_files": [{"content": "def mode_normal(console_printer, log_printer, args, debug=False):\n import functools\n\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import (\n acquire_settings, nothing_done,\n print_results, print_section_beginning)\n\n partial_print_sec_beg = functools.partial(\n print_section_beginning,\n console_printer)\n results, exitcode, _ = run_coala(\n print_results=print_results,\n acquire_settings=acquire_settings,\n print_section_beginning=partial_print_sec_beg,\n nothing_done=nothing_done,\n console_printer=console_printer,\n args=args,\n debug=debug)\n\n return exitcode\n\n\ndef mode_non_interactive(console_printer, args, debug=False):\n import functools\n\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import (\n print_results_no_input, print_section_beginning)\n\n partial_print_sec_beg = functools.partial(\n print_section_beginning,\n console_printer)\n results, exitcode, _ = run_coala(\n print_results=print_results_no_input,\n print_section_beginning=partial_print_sec_beg,\n force_show_patch=True,\n console_printer=console_printer,\n args=args,\n debug=debug)\n\n return exitcode\n\n\ndef mode_json(args, debug=False):\n import json\n\n from coalib.coala_main import run_coala\n from coalib.output.Logging import configure_json_logging\n from coalib.output.JSONEncoder import create_json_encoder\n\n if args.log_json:\n log_stream = configure_json_logging()\n\n JSONEncoder = create_json_encoder(use_relpath=args.relpath)\n\n results, exitcode, _ = run_coala(args=args, debug=debug)\n\n retval = {'results': results}\n\n if args.log_json:\n retval['logs'] = [json.loads(line) for line in\n log_stream.getvalue().splitlines()]\n\n if args.output:\n filename = str(args.output[0])\n with open(filename, 'w') as fp:\n json.dump(retval, fp,\n cls=JSONEncoder,\n sort_keys=True,\n indent=2,\n separators=(',', ': '))\n else:\n print(json.dumps(retval,\n cls=JSONEncoder,\n sort_keys=True,\n indent=2,\n separators=(',', ': ')))\n\n return 0 if args.show_bears else exitcode\n\n\ndef mode_format(args, debug=False):\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import print_results_formatted\n\n _, exitcode, _ = run_coala(\n print_results=print_results_formatted, args=args, debug=debug)\n return exitcode\n", "path": "coalib/coala_modes.py"}], "after_files": [{"content": "def mode_normal(console_printer, log_printer, args, debug=False):\n \"\"\"\n This is the default coala mode. 
User interaction is allowed in this mode.\n\n :param console_printer: Object to print messages on the console.\n :param log_printer: Deprecated.\n :param args: Alternative pre-parsed CLI arguments.\n :param debug: Run in debug mode, bypassing multiprocessing,\n and not catching any exceptions.\n \"\"\"\n import functools\n\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import (\n acquire_settings, nothing_done,\n print_results, print_section_beginning)\n\n partial_print_sec_beg = functools.partial(\n print_section_beginning,\n console_printer)\n results, exitcode, _ = run_coala(\n print_results=print_results,\n acquire_settings=acquire_settings,\n print_section_beginning=partial_print_sec_beg,\n nothing_done=nothing_done,\n console_printer=console_printer,\n args=args,\n debug=debug)\n\n return exitcode\n\n\ndef mode_non_interactive(console_printer, args, debug=False):\n import functools\n\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import (\n print_results_no_input, print_section_beginning)\n\n partial_print_sec_beg = functools.partial(\n print_section_beginning,\n console_printer)\n results, exitcode, _ = run_coala(\n print_results=print_results_no_input,\n print_section_beginning=partial_print_sec_beg,\n force_show_patch=True,\n console_printer=console_printer,\n args=args,\n debug=debug)\n\n return exitcode\n\n\ndef mode_json(args, debug=False):\n import json\n\n from coalib.coala_main import run_coala\n from coalib.output.Logging import configure_json_logging\n from coalib.output.JSONEncoder import create_json_encoder\n\n if args.log_json:\n log_stream = configure_json_logging()\n\n JSONEncoder = create_json_encoder(use_relpath=args.relpath)\n\n results, exitcode, _ = run_coala(args=args, debug=debug)\n\n retval = {'results': results}\n\n if args.log_json:\n retval['logs'] = [json.loads(line) for line in\n log_stream.getvalue().splitlines()]\n\n if args.output:\n filename = str(args.output[0])\n with open(filename, 'w') as fp:\n json.dump(retval, fp,\n cls=JSONEncoder,\n sort_keys=True,\n indent=2,\n separators=(',', ': '))\n else:\n print(json.dumps(retval,\n cls=JSONEncoder,\n sort_keys=True,\n indent=2,\n separators=(',', ': ')))\n\n return 0 if args.show_bears else exitcode\n\n\ndef mode_format(args, debug=False):\n from coalib.coala_main import run_coala\n from coalib.output.ConsoleInteraction import print_results_formatted\n\n _, exitcode, _ = run_coala(\n print_results=print_results_formatted, args=args, debug=debug)\n return exitcode\n", "path": "coalib/coala_modes.py"}]}
| 1,043 | 169 |
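For reference, a hedged illustration of the reST field-list docstring style the golden diff uses, applied to a made-up function with one deprecated parameter (nothing below is coala API):

```python
def frobnicate(printer, log_printer, retries=3):
    """
    Run the frobnication loop, printing progress as it goes.

    :param printer:     Object used to print messages on the console.
    :param log_printer: Deprecated and unused; kept for API compatibility.
    :param retries:     Number of attempts before giving up.
    """
    del log_printer  # accepted but intentionally ignored
    for attempt in range(retries):
        printer("attempt %d" % (attempt + 1))


frobnicate(print, None)
```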
gh_patches_debug_26041
|
rasdani/github-patches
|
git_diff
|
comic__grand-challenge.org-2410
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add a time column to challenge requests table
**Is your feature request related to a problem? Please describe.**
Its not clear right away when each challenge request was submitted.
**Describe the solution you'd like**
I would like to add a column that shows the time when each challenge request was submitted. It would be handy. 
**Describe alternatives you've considered**
Looking at a specific challenge request page to see the time.
**Additional context**
Idea shown in the picture below. Color coding is not necessary, but could be nice.

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/grandchallenge/challenges/admin.py`
Content:
```
1 from django.contrib import admin, messages
2 from django.contrib.admin import ModelAdmin
3 from django.core.exceptions import ValidationError
4
5 from grandchallenge.challenges.emails import send_challenge_status_update_email
6 from grandchallenge.challenges.models import (
7 Challenge,
8 ChallengeRequest,
9 ChallengeSeries,
10 ExternalChallenge,
11 )
12
13
14 class ChallengeAdmin(ModelAdmin):
15 readonly_fields = ("creator",)
16 autocomplete_fields = ("publications",)
17 ordering = ("-created",)
18 list_display = ("short_name", "created")
19 search_fields = ("short_name",)
20
21
22 class ExternalChallengeAdmin(ModelAdmin):
23 readonly_fields = ("creator",)
24 autocomplete_fields = ("publications",)
25
26
27 class ChallengeRequestAdmin(ModelAdmin):
28 readonly_fields = ("creator",)
29 ordering = ("-created",)
30 list_display = ("title", "short_name", "creator", "created", "status")
31 actions = ["create_challenge", "send_status_update_email"]
32
33 @admin.action(description="Create challenge for this request")
34 def create_challenge(self, request, queryset):
35 for challengerequest in queryset:
36 try:
37 challengerequest.create_challenge()
38 except ValidationError:
39 self.message_user(
40 request,
41 f"There already is a challenge with short "
42 f"name: {challengerequest.short_name}",
43 messages.WARNING,
44 )
45
46 @admin.action(description="Send status update email to requester")
47 def send_status_update_email(self, request, queryset):
48 for challengerequest in queryset:
49 if (
50 challengerequest.status
51 == challengerequest.ChallengeRequestStatusChoices.ACCEPTED
52 ):
53 try:
54 challenge = Challenge.objects.get(
55 short_name=challengerequest.short_name
56 )
57 except Challenge.DoesNotExist:
58 challenge = challengerequest.create_challenge()
59 else:
60 challenge = None
61 send_challenge_status_update_email(
62 challengerequest=challengerequest, challenge=challenge
63 )
64
65
66 admin.site.register(Challenge, ChallengeAdmin)
67 admin.site.register(ExternalChallenge, ExternalChallengeAdmin)
68 admin.site.register(ChallengeSeries)
69 admin.site.register(ChallengeRequest, ChallengeRequestAdmin)
70
```
Path: `app/grandchallenge/core/templatetags/naturaldelta.py`
Content:
```
1 import humanize
2 from django import template
3
4 register = template.Library()
5
6
7 @register.filter
8 def naturaldelta(value):
9 return humanize.naturaldelta(value, months=False)
10
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/app/grandchallenge/challenges/admin.py b/app/grandchallenge/challenges/admin.py
--- a/app/grandchallenge/challenges/admin.py
+++ b/app/grandchallenge/challenges/admin.py
@@ -27,8 +27,22 @@
class ChallengeRequestAdmin(ModelAdmin):
readonly_fields = ("creator",)
ordering = ("-created",)
- list_display = ("title", "short_name", "creator", "created", "status")
+ list_display = (
+ "title",
+ "short_name",
+ "creator",
+ "created",
+ "status",
+ "challenge_type",
+ "total_cost",
+ "budget_for_hosting_challenge",
+ )
actions = ["create_challenge", "send_status_update_email"]
+ list_filter = ["status", "challenge_type"]
+
+ @admin.display(description="Total cost")
+ def total_cost(self, obj):
+ return "%s" % (obj.budget["Total"])
@admin.action(description="Create challenge for this request")
def create_challenge(self, request, queryset):
diff --git a/app/grandchallenge/core/templatetags/naturaldelta.py b/app/grandchallenge/core/templatetags/naturaldelta.py
--- a/app/grandchallenge/core/templatetags/naturaldelta.py
+++ b/app/grandchallenge/core/templatetags/naturaldelta.py
@@ -1,5 +1,6 @@
import humanize
from django import template
+from django.utils import timezone
register = template.Library()
@@ -7,3 +8,8 @@
@register.filter
def naturaldelta(value):
return humanize.naturaldelta(value, months=False)
+
+
[email protected]
+def timedifference(value):
+ return (timezone.now() - value).days
|
{"golden_diff": "diff --git a/app/grandchallenge/challenges/admin.py b/app/grandchallenge/challenges/admin.py\n--- a/app/grandchallenge/challenges/admin.py\n+++ b/app/grandchallenge/challenges/admin.py\n@@ -27,8 +27,22 @@\n class ChallengeRequestAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n ordering = (\"-created\",)\n- list_display = (\"title\", \"short_name\", \"creator\", \"created\", \"status\")\n+ list_display = (\n+ \"title\",\n+ \"short_name\",\n+ \"creator\",\n+ \"created\",\n+ \"status\",\n+ \"challenge_type\",\n+ \"total_cost\",\n+ \"budget_for_hosting_challenge\",\n+ )\n actions = [\"create_challenge\", \"send_status_update_email\"]\n+ list_filter = [\"status\", \"challenge_type\"]\n+\n+ @admin.display(description=\"Total cost\")\n+ def total_cost(self, obj):\n+ return \"%s\" % (obj.budget[\"Total\"])\n \n @admin.action(description=\"Create challenge for this request\")\n def create_challenge(self, request, queryset):\ndiff --git a/app/grandchallenge/core/templatetags/naturaldelta.py b/app/grandchallenge/core/templatetags/naturaldelta.py\n--- a/app/grandchallenge/core/templatetags/naturaldelta.py\n+++ b/app/grandchallenge/core/templatetags/naturaldelta.py\n@@ -1,5 +1,6 @@\n import humanize\n from django import template\n+from django.utils import timezone\n \n register = template.Library()\n \n@@ -7,3 +8,8 @@\n @register.filter\n def naturaldelta(value):\n return humanize.naturaldelta(value, months=False)\n+\n+\[email protected]\n+def timedifference(value):\n+ return (timezone.now() - value).days\n", "issue": "Add a time column to challenge requests table\n**Is your feature request related to a problem? Please describe.**\r\nIts not clear right away when each challenge request was submitted.\r\n\r\n\r\n**Describe the solution you'd like**\r\nI would like to add a column that shows the time when challenge request was submitted. It would be handy. \r\n\r\n**Describe alternatives you've considered**\r\nLooking at a specific challenge request page to see the time.\r\n\r\n**Additional context**\r\nIdea shown on the picture below. 
Color coding is not necessary, but could be nice.\r\n\n", "before_files": [{"content": "from django.contrib import admin, messages\nfrom django.contrib.admin import ModelAdmin\nfrom django.core.exceptions import ValidationError\n\nfrom grandchallenge.challenges.emails import send_challenge_status_update_email\nfrom grandchallenge.challenges.models import (\n Challenge,\n ChallengeRequest,\n ChallengeSeries,\n ExternalChallenge,\n)\n\n\nclass ChallengeAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n autocomplete_fields = (\"publications\",)\n ordering = (\"-created\",)\n list_display = (\"short_name\", \"created\")\n search_fields = (\"short_name\",)\n\n\nclass ExternalChallengeAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n autocomplete_fields = (\"publications\",)\n\n\nclass ChallengeRequestAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n ordering = (\"-created\",)\n list_display = (\"title\", \"short_name\", \"creator\", \"created\", \"status\")\n actions = [\"create_challenge\", \"send_status_update_email\"]\n\n @admin.action(description=\"Create challenge for this request\")\n def create_challenge(self, request, queryset):\n for challengerequest in queryset:\n try:\n challengerequest.create_challenge()\n except ValidationError:\n self.message_user(\n request,\n f\"There already is a challenge with short \"\n f\"name: {challengerequest.short_name}\",\n messages.WARNING,\n )\n\n @admin.action(description=\"Send status update email to requester\")\n def send_status_update_email(self, request, queryset):\n for challengerequest in queryset:\n if (\n challengerequest.status\n == challengerequest.ChallengeRequestStatusChoices.ACCEPTED\n ):\n try:\n challenge = Challenge.objects.get(\n short_name=challengerequest.short_name\n )\n except Challenge.DoesNotExist:\n challenge = challengerequest.create_challenge()\n else:\n challenge = None\n send_challenge_status_update_email(\n challengerequest=challengerequest, challenge=challenge\n )\n\n\nadmin.site.register(Challenge, ChallengeAdmin)\nadmin.site.register(ExternalChallenge, ExternalChallengeAdmin)\nadmin.site.register(ChallengeSeries)\nadmin.site.register(ChallengeRequest, ChallengeRequestAdmin)\n", "path": "app/grandchallenge/challenges/admin.py"}, {"content": "import humanize\nfrom django import template\n\nregister = template.Library()\n\n\[email protected]\ndef naturaldelta(value):\n return humanize.naturaldelta(value, months=False)\n", "path": "app/grandchallenge/core/templatetags/naturaldelta.py"}], "after_files": [{"content": "from django.contrib import admin, messages\nfrom django.contrib.admin import ModelAdmin\nfrom django.core.exceptions import ValidationError\n\nfrom grandchallenge.challenges.emails import send_challenge_status_update_email\nfrom grandchallenge.challenges.models import (\n Challenge,\n ChallengeRequest,\n ChallengeSeries,\n ExternalChallenge,\n)\n\n\nclass ChallengeAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n autocomplete_fields = (\"publications\",)\n ordering = (\"-created\",)\n list_display = (\"short_name\", \"created\")\n search_fields = (\"short_name\",)\n\n\nclass ExternalChallengeAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n autocomplete_fields = (\"publications\",)\n\n\nclass ChallengeRequestAdmin(ModelAdmin):\n readonly_fields = (\"creator\",)\n ordering = (\"-created\",)\n list_display = (\n \"title\",\n \"short_name\",\n \"creator\",\n \"created\",\n \"status\",\n \"challenge_type\",\n \"total_cost\",\n \"budget_for_hosting_challenge\",\n )\n actions = 
[\"create_challenge\", \"send_status_update_email\"]\n list_filter = [\"status\", \"challenge_type\"]\n\n @admin.display(description=\"Total cost\")\n def total_cost(self, obj):\n return \"%s\" % (obj.budget[\"Total\"])\n\n @admin.action(description=\"Create challenge for this request\")\n def create_challenge(self, request, queryset):\n for challengerequest in queryset:\n try:\n challengerequest.create_challenge()\n except ValidationError:\n self.message_user(\n request,\n f\"There already is a challenge with short \"\n f\"name: {challengerequest.short_name}\",\n messages.WARNING,\n )\n\n @admin.action(description=\"Send status update email to requester\")\n def send_status_update_email(self, request, queryset):\n for challengerequest in queryset:\n if (\n challengerequest.status\n == challengerequest.ChallengeRequestStatusChoices.ACCEPTED\n ):\n try:\n challenge = Challenge.objects.get(\n short_name=challengerequest.short_name\n )\n except Challenge.DoesNotExist:\n challenge = challengerequest.create_challenge()\n else:\n challenge = None\n send_challenge_status_update_email(\n challengerequest=challengerequest, challenge=challenge\n )\n\n\nadmin.site.register(Challenge, ChallengeAdmin)\nadmin.site.register(ExternalChallenge, ExternalChallengeAdmin)\nadmin.site.register(ChallengeSeries)\nadmin.site.register(ChallengeRequest, ChallengeRequestAdmin)\n", "path": "app/grandchallenge/challenges/admin.py"}, {"content": "import humanize\nfrom django import template\nfrom django.utils import timezone\n\nregister = template.Library()\n\n\[email protected]\ndef naturaldelta(value):\n return humanize.naturaldelta(value, months=False)\n\n\[email protected]\ndef timedifference(value):\n return (timezone.now() - value).days\n", "path": "app/grandchallenge/core/templatetags/naturaldelta.py"}]}
| 1,084 | 394 |
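A dependency-free sketch of the two pieces this patch adds: a computed admin column and a days-since template filter. The Django admin and template plumbing are stubbed away here, and the function bodies mirror the golden diff (treat this as an approximation, not the exact runtime behavior):

```python
from datetime import datetime, timedelta, timezone


def total_cost(budget: dict) -> str:
    # Mirrors ChallengeRequestAdmin.total_cost, which formats obj.budget["Total"].
    return "%s" % budget["Total"]


def timedifference(value: datetime) -> int:
    # Mirrors the new template filter: whole days elapsed since `value`.
    return (datetime.now(timezone.utc) - value).days


created = datetime.now(timezone.utc) - timedelta(days=12)
assert timedifference(created) == 12
print(total_cost({"Total": 5000}), timedifference(created))
```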
gh_patches_debug_32324
|
rasdani/github-patches
|
git_diff
|
modal-labs__modal-examples-695
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
remove duplicate webscraper example once #669 is merged
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `10_integrations/webscraper.py`
Content:
```
1 # ---
2 # runtimes: ["runc", "gvisor"]
3 # ---
4 import os
5
6 import modal
7
8 stub = modal.Stub("example-linkscraper")
9
10
11 playwright_image = modal.Image.debian_slim(
12 python_version="3.10"
13 ).run_commands( # Doesn't work with 3.11 yet
14 "apt-get update",
15 "apt-get install -y software-properties-common",
16 "apt-add-repository non-free",
17 "apt-add-repository contrib",
18 "pip install playwright==1.30.0",
19 "playwright install-deps chromium",
20 "playwright install chromium",
21 )
22
23
24 @stub.function(image=playwright_image)
25 async def get_links(url: str) -> set[str]:
26 from playwright.async_api import async_playwright
27
28 async with async_playwright() as p:
29 browser = await p.chromium.launch()
30 page = await browser.new_page()
31 await page.goto(url)
32 links = await page.eval_on_selector_all(
33 "a[href]", "elements => elements.map(element => element.href)"
34 )
35 await browser.close()
36
37 return set(links)
38
39
40 slack_sdk_image = modal.Image.debian_slim().pip_install("slack-sdk")
41
42
43 @stub.function(
44 image=slack_sdk_image,
45 secrets=[modal.Secret.from_name("scraper-slack-secret")],
46 )
47 def bot_token_msg(channel, message):
48 import slack_sdk
49
50 print(f"Posting {message} to #{channel}")
51 client = slack_sdk.WebClient(token=os.environ["SLACK_BOT_TOKEN"])
52 client.chat_postMessage(channel=channel, text=message)
53
54
55 @stub.function()
56 def scrape():
57 links_of_interest = ["http://modal.com"]
58
59 for links in get_links.map(links_of_interest):
60 for link in links:
61 bot_token_msg.remote("scraped-links", link)
62
63
64 @stub.function(schedule=modal.Period(days=1))
65 def daily_scrape():
66 scrape.remote()
67
68
69 @stub.local_entrypoint()
70 def run():
71 scrape.remote()
72
```
Path: `misc/webscraper.py`
Content:
```
1 # ---
2 # runtimes: ["runc", "gvisor"]
3 # ---
4 import os
5
6 import modal
7
8 stub = modal.Stub("example-linkscraper")
9
10
11 playwright_image = modal.Image.debian_slim(
12 python_version="3.10"
13 ).run_commands( # Doesn't work with 3.11 yet
14 "apt-get update",
15 "apt-get install -y software-properties-common",
16 "apt-add-repository non-free",
17 "apt-add-repository contrib",
18 "pip install playwright==1.30.0",
19 "playwright install-deps chromium",
20 "playwright install chromium",
21 )
22
23
24 @stub.function(image=playwright_image)
25 async def get_links(url: str) -> set[str]:
26 from playwright.async_api import async_playwright
27
28 async with async_playwright() as p:
29 browser = await p.chromium.launch()
30 page = await browser.new_page()
31 await page.goto(url)
32 links = await page.eval_on_selector_all(
33 "a[href]", "elements => elements.map(element => element.href)"
34 )
35 await browser.close()
36
37 return set(links)
38
39
40 slack_sdk_image = modal.Image.debian_slim().pip_install("slack-sdk")
41
42
43 @stub.function(
44 image=slack_sdk_image,
45 secrets=[modal.Secret.from_name("scraper-slack-secret")],
46 )
47 def bot_token_msg(channel, message):
48 import slack_sdk
49
50 print(f"Posting {message} to #{channel}")
51 client = slack_sdk.WebClient(token=os.environ["SLACK_BOT_TOKEN"])
52 client.chat_postMessage(channel=channel, text=message)
53
54
55 @stub.function()
56 def scrape():
57 links_of_interest = ["http://modal.com"]
58
59 for links in get_links.map(links_of_interest):
60 for link in links:
61 bot_token_msg.remote("scraped-links", link)
62
63
64 @stub.function(schedule=modal.Period(days=1))
65 def daily_scrape():
66 scrape.remote()
67
68
69 @stub.local_entrypoint()
70 def run():
71 scrape.remote()
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/10_integrations/webscraper.py b/10_integrations/webscraper.py
--- a/10_integrations/webscraper.py
+++ b/10_integrations/webscraper.py
@@ -15,7 +15,7 @@
"apt-get install -y software-properties-common",
"apt-add-repository non-free",
"apt-add-repository contrib",
- "pip install playwright==1.30.0",
+ "pip install playwright==1.42.0",
"playwright install-deps chromium",
"playwright install chromium",
)
diff --git a/misc/webscraper.py b/misc/webscraper.py
deleted file mode 100644
--- a/misc/webscraper.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# ---
-# runtimes: ["runc", "gvisor"]
-# ---
-import os
-
-import modal
-
-stub = modal.Stub("example-linkscraper")
-
-
-playwright_image = modal.Image.debian_slim(
- python_version="3.10"
-).run_commands( # Doesn't work with 3.11 yet
- "apt-get update",
- "apt-get install -y software-properties-common",
- "apt-add-repository non-free",
- "apt-add-repository contrib",
- "pip install playwright==1.30.0",
- "playwright install-deps chromium",
- "playwright install chromium",
-)
-
-
[email protected](image=playwright_image)
-async def get_links(url: str) -> set[str]:
- from playwright.async_api import async_playwright
-
- async with async_playwright() as p:
- browser = await p.chromium.launch()
- page = await browser.new_page()
- await page.goto(url)
- links = await page.eval_on_selector_all(
- "a[href]", "elements => elements.map(element => element.href)"
- )
- await browser.close()
-
- return set(links)
-
-
-slack_sdk_image = modal.Image.debian_slim().pip_install("slack-sdk")
-
-
[email protected](
- image=slack_sdk_image,
- secrets=[modal.Secret.from_name("scraper-slack-secret")],
-)
-def bot_token_msg(channel, message):
- import slack_sdk
-
- print(f"Posting {message} to #{channel}")
- client = slack_sdk.WebClient(token=os.environ["SLACK_BOT_TOKEN"])
- client.chat_postMessage(channel=channel, text=message)
-
-
[email protected]()
-def scrape():
- links_of_interest = ["http://modal.com"]
-
- for links in get_links.map(links_of_interest):
- for link in links:
- bot_token_msg.remote("scraped-links", link)
-
-
[email protected](schedule=modal.Period(days=1))
-def daily_scrape():
- scrape.remote()
-
-
[email protected]_entrypoint()
-def run():
- scrape.remote()
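One note on the version bump above: the pip-installed `playwright` package must match the browser binaries fetched by `playwright install`, which is why both live in the same `run_commands` chain. As a minimal sanity-check sketch (verifying at import time is a suggestion, not part of the patch), the pinned version can be asserted inside the image:

```python
# Hypothetical guard: fail fast if the image was rebuilt with a drifting pin.
from importlib.metadata import version

assert version("playwright") == "1.42.0", version("playwright")
```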
|
{"golden_diff": "diff --git a/10_integrations/webscraper.py b/10_integrations/webscraper.py\n--- a/10_integrations/webscraper.py\n+++ b/10_integrations/webscraper.py\n@@ -15,7 +15,7 @@\n \"apt-get install -y software-properties-common\",\n \"apt-add-repository non-free\",\n \"apt-add-repository contrib\",\n- \"pip install playwright==1.30.0\",\n+ \"pip install playwright==1.42.0\",\n \"playwright install-deps chromium\",\n \"playwright install chromium\",\n )\ndiff --git a/misc/webscraper.py b/misc/webscraper.py\ndeleted file mode 100644\n--- a/misc/webscraper.py\n+++ /dev/null\n@@ -1,71 +0,0 @@\n-# ---\n-# runtimes: [\"runc\", \"gvisor\"]\n-# ---\n-import os\n-\n-import modal\n-\n-stub = modal.Stub(\"example-linkscraper\")\n-\n-\n-playwright_image = modal.Image.debian_slim(\n- python_version=\"3.10\"\n-).run_commands( # Doesn't work with 3.11 yet\n- \"apt-get update\",\n- \"apt-get install -y software-properties-common\",\n- \"apt-add-repository non-free\",\n- \"apt-add-repository contrib\",\n- \"pip install playwright==1.30.0\",\n- \"playwright install-deps chromium\",\n- \"playwright install chromium\",\n-)\n-\n-\[email protected](image=playwright_image)\n-async def get_links(url: str) -> set[str]:\n- from playwright.async_api import async_playwright\n-\n- async with async_playwright() as p:\n- browser = await p.chromium.launch()\n- page = await browser.new_page()\n- await page.goto(url)\n- links = await page.eval_on_selector_all(\n- \"a[href]\", \"elements => elements.map(element => element.href)\"\n- )\n- await browser.close()\n-\n- return set(links)\n-\n-\n-slack_sdk_image = modal.Image.debian_slim().pip_install(\"slack-sdk\")\n-\n-\[email protected](\n- image=slack_sdk_image,\n- secrets=[modal.Secret.from_name(\"scraper-slack-secret\")],\n-)\n-def bot_token_msg(channel, message):\n- import slack_sdk\n-\n- print(f\"Posting {message} to #{channel}\")\n- client = slack_sdk.WebClient(token=os.environ[\"SLACK_BOT_TOKEN\"])\n- client.chat_postMessage(channel=channel, text=message)\n-\n-\[email protected]()\n-def scrape():\n- links_of_interest = [\"http://modal.com\"]\n-\n- for links in get_links.map(links_of_interest):\n- for link in links:\n- bot_token_msg.remote(\"scraped-links\", link)\n-\n-\[email protected](schedule=modal.Period(days=1))\n-def daily_scrape():\n- scrape.remote()\n-\n-\[email protected]_entrypoint()\n-def run():\n- scrape.remote()\n", "issue": "remove duplicate webscraper example once #669 is merged\n\n", "before_files": [{"content": "# ---\n# runtimes: [\"runc\", \"gvisor\"]\n# ---\nimport os\n\nimport modal\n\nstub = modal.Stub(\"example-linkscraper\")\n\n\nplaywright_image = modal.Image.debian_slim(\n python_version=\"3.10\"\n).run_commands( # Doesn't work with 3.11 yet\n \"apt-get update\",\n \"apt-get install -y software-properties-common\",\n \"apt-add-repository non-free\",\n \"apt-add-repository contrib\",\n \"pip install playwright==1.30.0\",\n \"playwright install-deps chromium\",\n \"playwright install chromium\",\n)\n\n\[email protected](image=playwright_image)\nasync def get_links(url: str) -> set[str]:\n from playwright.async_api import async_playwright\n\n async with async_playwright() as p:\n browser = await p.chromium.launch()\n page = await browser.new_page()\n await page.goto(url)\n links = await page.eval_on_selector_all(\n \"a[href]\", \"elements => elements.map(element => element.href)\"\n )\n await browser.close()\n\n return set(links)\n\n\nslack_sdk_image = modal.Image.debian_slim().pip_install(\"slack-sdk\")\n\n\[email protected](\n 
image=slack_sdk_image,\n secrets=[modal.Secret.from_name(\"scraper-slack-secret\")],\n)\ndef bot_token_msg(channel, message):\n import slack_sdk\n\n print(f\"Posting {message} to #{channel}\")\n client = slack_sdk.WebClient(token=os.environ[\"SLACK_BOT_TOKEN\"])\n client.chat_postMessage(channel=channel, text=message)\n\n\[email protected]()\ndef scrape():\n links_of_interest = [\"http://modal.com\"]\n\n for links in get_links.map(links_of_interest):\n for link in links:\n bot_token_msg.remote(\"scraped-links\", link)\n\n\[email protected](schedule=modal.Period(days=1))\ndef daily_scrape():\n scrape.remote()\n\n\[email protected]_entrypoint()\ndef run():\n scrape.remote()\n", "path": "10_integrations/webscraper.py"}, {"content": "# ---\n# runtimes: [\"runc\", \"gvisor\"]\n# ---\nimport os\n\nimport modal\n\nstub = modal.Stub(\"example-linkscraper\")\n\n\nplaywright_image = modal.Image.debian_slim(\n python_version=\"3.10\"\n).run_commands( # Doesn't work with 3.11 yet\n \"apt-get update\",\n \"apt-get install -y software-properties-common\",\n \"apt-add-repository non-free\",\n \"apt-add-repository contrib\",\n \"pip install playwright==1.30.0\",\n \"playwright install-deps chromium\",\n \"playwright install chromium\",\n)\n\n\[email protected](image=playwright_image)\nasync def get_links(url: str) -> set[str]:\n from playwright.async_api import async_playwright\n\n async with async_playwright() as p:\n browser = await p.chromium.launch()\n page = await browser.new_page()\n await page.goto(url)\n links = await page.eval_on_selector_all(\n \"a[href]\", \"elements => elements.map(element => element.href)\"\n )\n await browser.close()\n\n return set(links)\n\n\nslack_sdk_image = modal.Image.debian_slim().pip_install(\"slack-sdk\")\n\n\[email protected](\n image=slack_sdk_image,\n secrets=[modal.Secret.from_name(\"scraper-slack-secret\")],\n)\ndef bot_token_msg(channel, message):\n import slack_sdk\n\n print(f\"Posting {message} to #{channel}\")\n client = slack_sdk.WebClient(token=os.environ[\"SLACK_BOT_TOKEN\"])\n client.chat_postMessage(channel=channel, text=message)\n\n\[email protected]()\ndef scrape():\n links_of_interest = [\"http://modal.com\"]\n\n for links in get_links.map(links_of_interest):\n for link in links:\n bot_token_msg.remote(\"scraped-links\", link)\n\n\[email protected](schedule=modal.Period(days=1))\ndef daily_scrape():\n scrape.remote()\n\n\[email protected]_entrypoint()\ndef run():\n scrape.remote()\n", "path": "misc/webscraper.py"}], "after_files": [{"content": "# ---\n# runtimes: [\"runc\", \"gvisor\"]\n# ---\nimport os\n\nimport modal\n\nstub = modal.Stub(\"example-linkscraper\")\n\n\nplaywright_image = modal.Image.debian_slim(\n python_version=\"3.10\"\n).run_commands( # Doesn't work with 3.11 yet\n \"apt-get update\",\n \"apt-get install -y software-properties-common\",\n \"apt-add-repository non-free\",\n \"apt-add-repository contrib\",\n \"pip install playwright==1.42.0\",\n \"playwright install-deps chromium\",\n \"playwright install chromium\",\n)\n\n\[email protected](image=playwright_image)\nasync def get_links(url: str) -> set[str]:\n from playwright.async_api import async_playwright\n\n async with async_playwright() as p:\n browser = await p.chromium.launch()\n page = await browser.new_page()\n await page.goto(url)\n links = await page.eval_on_selector_all(\n \"a[href]\", \"elements => elements.map(element => element.href)\"\n )\n await browser.close()\n\n return set(links)\n\n\nslack_sdk_image = 
modal.Image.debian_slim().pip_install(\"slack-sdk\")\n\n\[email protected](\n image=slack_sdk_image,\n secrets=[modal.Secret.from_name(\"scraper-slack-secret\")],\n)\ndef bot_token_msg(channel, message):\n import slack_sdk\n\n print(f\"Posting {message} to #{channel}\")\n client = slack_sdk.WebClient(token=os.environ[\"SLACK_BOT_TOKEN\"])\n client.chat_postMessage(channel=channel, text=message)\n\n\[email protected]()\ndef scrape():\n links_of_interest = [\"http://modal.com\"]\n\n for links in get_links.map(links_of_interest):\n for link in links:\n bot_token_msg.remote(\"scraped-links\", link)\n\n\[email protected](schedule=modal.Period(days=1))\ndef daily_scrape():\n scrape.remote()\n\n\[email protected]_entrypoint()\ndef run():\n scrape.remote()\n", "path": "10_integrations/webscraper.py"}, {"content": null, "path": "misc/webscraper.py"}]}
| 1,426 | 665 |
gh_patches_debug_14271
|
rasdani/github-patches
|
git_diff
|
getsentry__snuba-3697
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
About the sessions-subscription-results subscription issue
### Environment
- sentry | snuba version: 23.1.1

https://github.com/getsentry/snuba/pull/2737, @lynnagara: hello, I have a question about this PR and I hope to get your answer, thank you very much.
- After removing subscriptions-scheduler-executor-session support in snuba, how can data be written to the sessions-subscription-results topic? I ask because the crash-rate alerting code in sentry is still there and has not changed, for example:
- https://github.com/getsentry/sentry/pull/28526
https://github.com/getsentry/sentry/blob/8e00dcdf463d916b9ca79ddbe13e99f161d58db1/src/sentry/snuba/query_subscription_consumer.py#L61-L61
My original question is as follows: I have enabled the organizations:incidents feature in sentry and subscribed to sessions-subscription-results through the following script:
```bash
sentry run query-subscription-consumer --topic=sessions-subscription-results
```
Because there is no data in the sessions-subscription-results topic, the crash-rate alert cannot work.
<img width="1568" alt="image" src="https://user-images.githubusercontent.com/18591662/216570393-64748a25-1cd4-4980-966c-f7665dc8482b.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `snuba/cli/subscriptions_scheduler_executor.py`
Content:
```
1 import signal
2 from contextlib import contextmanager
3 from typing import Any, Iterator, Optional, Sequence
4
5 import click
6 from arroyo import configure_metrics
7 from arroyo.backends.kafka import KafkaProducer
8
9 from snuba import environment, state
10 from snuba.attribution.log import flush_attribution_producer
11 from snuba.datasets.entities.entity_key import EntityKey
12 from snuba.datasets.entities.factory import get_entity
13 from snuba.environment import setup_logging, setup_sentry
14 from snuba.subscriptions.combined_scheduler_executor import (
15 build_scheduler_executor_consumer,
16 )
17 from snuba.utils.metrics.wrapper import MetricsWrapper
18 from snuba.utils.streams.configuration_builder import build_kafka_producer_configuration
19 from snuba.utils.streams.metrics_adapter import StreamMetricsAdapter
20
21
22 @click.command()
23 @click.option(
24 "--dataset",
25 "dataset_name",
26 required=True,
27 type=click.Choice(["events", "transactions", "metrics"]),
28 help="The dataset to target.",
29 )
30 @click.option(
31 "--entity",
32 "entity_names",
33 required=True,
34 multiple=True,
35 type=click.Choice(["events", "transactions", "metrics_counters", "metrics_sets"]),
36 help="The entity to target.",
37 )
38 @click.option(
39 "--consumer-group",
40 default="snuba-subscription-executor",
41 help="Consumer group used for consuming the scheduled subscription topic/s.",
42 )
43 @click.option(
44 "--followed-consumer-group",
45 required=True,
46 help="Name of the consumer group to follow",
47 )
48 @click.option(
49 "--total-concurrent-queries",
50 default=64,
51 type=int,
52 help="Total max number of concurrent queries for all replicas. Used to calculate max_concurrent_queries.",
53 )
54 @click.option(
55 "--auto-offset-reset",
56 default="error",
57 type=click.Choice(["error", "earliest", "latest"]),
58 help="Kafka consumer auto offset reset.",
59 )
60 @click.option(
61 "--no-strict-offset-reset",
62 is_flag=True,
63 help="Forces the kafka consumer auto offset reset.",
64 )
65 @click.option("--schedule-ttl", type=int, default=60 * 5)
66 @click.option("--delay-seconds", type=int)
67 @click.option(
68 "--stale-threshold-seconds",
69 type=int,
70 help="Skip scheduling if timestamp is beyond this threshold compared to the system time",
71 )
72 @click.option("--log-level", help="Logging level to use.")
73 def subscriptions_scheduler_executor(
74 *,
75 dataset_name: str,
76 entity_names: Sequence[str],
77 consumer_group: str,
78 followed_consumer_group: str,
79 total_concurrent_queries: int,
80 auto_offset_reset: str,
81 no_strict_offset_reset: bool,
82 schedule_ttl: int,
83 delay_seconds: Optional[int],
84 stale_threshold_seconds: Optional[int],
85 log_level: Optional[str],
86 ) -> None:
87 """
88 Combined subscriptions scheduler and executor. Alternative to the separate scheduler and executor processes.
89 """
90 setup_logging(log_level)
91 setup_sentry()
92
93 metrics = MetricsWrapper(
94 environment.metrics,
95 "subscriptions.scheduler_executor",
96 tags={"dataset": dataset_name},
97 )
98
99 configure_metrics(StreamMetricsAdapter(metrics))
100
101 # Just get the result topic configuration from the first entity. Later we
102 # check they all have the same result topic anyway before building the consumer.
103 entity_key = EntityKey(entity_names[0])
104
105 storage = get_entity(entity_key).get_writable_storage()
106 assert storage is not None
107 stream_loader = storage.get_table_writer().get_stream_loader()
108 result_topic_spec = stream_loader.get_subscription_scheduled_topic_spec()
109 assert result_topic_spec is not None
110
111 producer = KafkaProducer(
112 build_kafka_producer_configuration(
113 result_topic_spec.topic,
114 override_params={"partitioner": "consistent"},
115 )
116 )
117
118 processor = build_scheduler_executor_consumer(
119 dataset_name,
120 entity_names,
121 consumer_group,
122 followed_consumer_group,
123 producer,
124 auto_offset_reset,
125 not no_strict_offset_reset,
126 schedule_ttl,
127 delay_seconds,
128 stale_threshold_seconds,
129 total_concurrent_queries,
130 metrics,
131 )
132
133 def handler(signum: int, frame: Any) -> None:
134 processor.signal_shutdown()
135
136 signal.signal(signal.SIGINT, handler)
137 signal.signal(signal.SIGTERM, handler)
138
139 with closing(producer), flush_querylog(), flush_attribution_producer():
140 processor.run()
141
142
143 @contextmanager
144 def closing(producer: KafkaProducer) -> Iterator[Optional[KafkaProducer]]:
145 try:
146 yield producer
147 finally:
148 producer.close().result()
149
150
151 @contextmanager
152 def flush_querylog() -> Iterator[None]:
153 try:
154 yield
155 finally:
156 state.flush_producer()
157
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/snuba/cli/subscriptions_scheduler_executor.py b/snuba/cli/subscriptions_scheduler_executor.py
--- a/snuba/cli/subscriptions_scheduler_executor.py
+++ b/snuba/cli/subscriptions_scheduler_executor.py
@@ -24,7 +24,7 @@
"--dataset",
"dataset_name",
required=True,
- type=click.Choice(["events", "transactions", "metrics"]),
+ type=click.Choice(["events", "transactions", "metrics", "sessions"]),
help="The dataset to target.",
)
@click.option(
@@ -32,7 +32,9 @@
"entity_names",
required=True,
multiple=True,
- type=click.Choice(["events", "transactions", "metrics_counters", "metrics_sets"]),
+ type=click.Choice(
+ ["events", "transactions", "metrics_counters", "metrics_sets", "sessions"]
+ ),
help="The entity to target.",
)
@click.option(
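The patch itself only widens the CLI's `click.Choice` lists so the combined scheduler/executor can be started for sessions again; whether results then appear on sessions-subscription-results depends on running that process. As an illustrative way to inspect the topic from the consuming side (the broker address, group id, and the use of confluent-kafka are all assumptions, not part of snuba or sentry):

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumption: local broker
    "group.id": "debug-sessions-results",   # assumption: throwaway group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["sessions-subscription-results"])

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue  # no message within the timeout
    if msg.error():
        raise RuntimeError(msg.error())
    print(msg.value())  # raw subscription result payload
```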
|
{"golden_diff": "diff --git a/snuba/cli/subscriptions_scheduler_executor.py b/snuba/cli/subscriptions_scheduler_executor.py\n--- a/snuba/cli/subscriptions_scheduler_executor.py\n+++ b/snuba/cli/subscriptions_scheduler_executor.py\n@@ -24,7 +24,7 @@\n \"--dataset\",\n \"dataset_name\",\n required=True,\n- type=click.Choice([\"events\", \"transactions\", \"metrics\"]),\n+ type=click.Choice([\"events\", \"transactions\", \"metrics\", \"sessions\"]),\n help=\"The dataset to target.\",\n )\n @click.option(\n@@ -32,7 +32,9 @@\n \"entity_names\",\n required=True,\n multiple=True,\n- type=click.Choice([\"events\", \"transactions\", \"metrics_counters\", \"metrics_sets\"]),\n+ type=click.Choice(\n+ [\"events\", \"transactions\", \"metrics_counters\", \"metrics_sets\", \"sessions\"]\n+ ),\n help=\"The entity to target.\",\n )\n @click.option(\n", "issue": "About the sessions-subscription-results subscription issue\n### Environment\r\n\r\n- sentry | snuba version \uff1a23.1.1\r\n\r\nhttps://github.com/getsentry/snuba/pull/2737 \uff0c@lynnagara Hello, I have a question about this pr, hope to get your answer, thank you very much\r\n\r\n- After removing subscriptions-scheduler-executor-session support in snuba, how to write data to the topic of sessions-subscription-results? Because I see that the crash rate warning code in sentry is still there and has not changed, for example\r\n\r\n- https://github.com/getsentry/sentry/pull/28526\r\n\r\nhttps://github.com/getsentry/sentry/blob/8e00dcdf463d916b9ca79ddbe13e99f161d58db1/src/sentry/snuba/query_subscription_consumer.py#L61-L61\r\n\r\nMy original question is as follows, I have enabled the organizations:incidents function in sentry and subscribed to sessions-results through the following script\r\n```bash\r\nsentry\r\nrun\r\nquery-subscription-consumer\r\n--topic=sessions-subscription-results\r\n```\r\nBecause there is no data in the sessions-subscription-results topic, the crash rate alarm cannot work\r\n<img width=\"1568\" alt=\"image\" src=\"https://user-images.githubusercontent.com/18591662/216570393-64748a25-1cd4-4980-966c-f7665dc8482b.png\">\r\n\r\n\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "import signal\nfrom contextlib import contextmanager\nfrom typing import Any, Iterator, Optional, Sequence\n\nimport click\nfrom arroyo import configure_metrics\nfrom arroyo.backends.kafka import KafkaProducer\n\nfrom snuba import environment, state\nfrom snuba.attribution.log import flush_attribution_producer\nfrom snuba.datasets.entities.entity_key import EntityKey\nfrom snuba.datasets.entities.factory import get_entity\nfrom snuba.environment import setup_logging, setup_sentry\nfrom snuba.subscriptions.combined_scheduler_executor import (\n build_scheduler_executor_consumer,\n)\nfrom snuba.utils.metrics.wrapper import MetricsWrapper\nfrom snuba.utils.streams.configuration_builder import build_kafka_producer_configuration\nfrom snuba.utils.streams.metrics_adapter import StreamMetricsAdapter\n\n\[email protected]()\[email protected](\n \"--dataset\",\n \"dataset_name\",\n required=True,\n type=click.Choice([\"events\", \"transactions\", \"metrics\"]),\n help=\"The dataset to target.\",\n)\[email protected](\n \"--entity\",\n \"entity_names\",\n required=True,\n multiple=True,\n type=click.Choice([\"events\", \"transactions\", \"metrics_counters\", \"metrics_sets\"]),\n help=\"The entity to target.\",\n)\[email protected](\n \"--consumer-group\",\n default=\"snuba-subscription-executor\",\n help=\"Consumer group used for consuming the scheduled subscription 
topic/s.\",\n)\[email protected](\n \"--followed-consumer-group\",\n required=True,\n help=\"Name of the consumer group to follow\",\n)\[email protected](\n \"--total-concurrent-queries\",\n default=64,\n type=int,\n help=\"Total max number of concurrent queries for all replicas. Used to calculate max_concurrent_queries.\",\n)\[email protected](\n \"--auto-offset-reset\",\n default=\"error\",\n type=click.Choice([\"error\", \"earliest\", \"latest\"]),\n help=\"Kafka consumer auto offset reset.\",\n)\[email protected](\n \"--no-strict-offset-reset\",\n is_flag=True,\n help=\"Forces the kafka consumer auto offset reset.\",\n)\[email protected](\"--schedule-ttl\", type=int, default=60 * 5)\[email protected](\"--delay-seconds\", type=int)\[email protected](\n \"--stale-threshold-seconds\",\n type=int,\n help=\"Skip scheduling if timestamp is beyond this threshold compared to the system time\",\n)\[email protected](\"--log-level\", help=\"Logging level to use.\")\ndef subscriptions_scheduler_executor(\n *,\n dataset_name: str,\n entity_names: Sequence[str],\n consumer_group: str,\n followed_consumer_group: str,\n total_concurrent_queries: int,\n auto_offset_reset: str,\n no_strict_offset_reset: bool,\n schedule_ttl: int,\n delay_seconds: Optional[int],\n stale_threshold_seconds: Optional[int],\n log_level: Optional[str],\n) -> None:\n \"\"\"\n Combined subscriptions scheduler and executor. Alternative to the separate scheduler and executor processes.\n \"\"\"\n setup_logging(log_level)\n setup_sentry()\n\n metrics = MetricsWrapper(\n environment.metrics,\n \"subscriptions.scheduler_executor\",\n tags={\"dataset\": dataset_name},\n )\n\n configure_metrics(StreamMetricsAdapter(metrics))\n\n # Just get the result topic configuration from the first entity. Later we\n # check they all have the same result topic anyway before building the consumer.\n entity_key = EntityKey(entity_names[0])\n\n storage = get_entity(entity_key).get_writable_storage()\n assert storage is not None\n stream_loader = storage.get_table_writer().get_stream_loader()\n result_topic_spec = stream_loader.get_subscription_scheduled_topic_spec()\n assert result_topic_spec is not None\n\n producer = KafkaProducer(\n build_kafka_producer_configuration(\n result_topic_spec.topic,\n override_params={\"partitioner\": \"consistent\"},\n )\n )\n\n processor = build_scheduler_executor_consumer(\n dataset_name,\n entity_names,\n consumer_group,\n followed_consumer_group,\n producer,\n auto_offset_reset,\n not no_strict_offset_reset,\n schedule_ttl,\n delay_seconds,\n stale_threshold_seconds,\n total_concurrent_queries,\n metrics,\n )\n\n def handler(signum: int, frame: Any) -> None:\n processor.signal_shutdown()\n\n signal.signal(signal.SIGINT, handler)\n signal.signal(signal.SIGTERM, handler)\n\n with closing(producer), flush_querylog(), flush_attribution_producer():\n processor.run()\n\n\n@contextmanager\ndef closing(producer: KafkaProducer) -> Iterator[Optional[KafkaProducer]]:\n try:\n yield producer\n finally:\n producer.close().result()\n\n\n@contextmanager\ndef flush_querylog() -> Iterator[None]:\n try:\n yield\n finally:\n state.flush_producer()\n", "path": "snuba/cli/subscriptions_scheduler_executor.py"}], "after_files": [{"content": "import signal\nfrom contextlib import contextmanager\nfrom typing import Any, Iterator, Optional, Sequence\n\nimport click\nfrom arroyo import configure_metrics\nfrom arroyo.backends.kafka import KafkaProducer\n\nfrom snuba import environment, state\nfrom snuba.attribution.log import 
flush_attribution_producer\nfrom snuba.datasets.entities.entity_key import EntityKey\nfrom snuba.datasets.entities.factory import get_entity\nfrom snuba.environment import setup_logging, setup_sentry\nfrom snuba.subscriptions.combined_scheduler_executor import (\n build_scheduler_executor_consumer,\n)\nfrom snuba.utils.metrics.wrapper import MetricsWrapper\nfrom snuba.utils.streams.configuration_builder import build_kafka_producer_configuration\nfrom snuba.utils.streams.metrics_adapter import StreamMetricsAdapter\n\n\[email protected]()\[email protected](\n \"--dataset\",\n \"dataset_name\",\n required=True,\n type=click.Choice([\"events\", \"transactions\", \"metrics\", \"sessions\"]),\n help=\"The dataset to target.\",\n)\[email protected](\n \"--entity\",\n \"entity_names\",\n required=True,\n multiple=True,\n type=click.Choice(\n [\"events\", \"transactions\", \"metrics_counters\", \"metrics_sets\", \"sessions\"]\n ),\n help=\"The entity to target.\",\n)\[email protected](\n \"--consumer-group\",\n default=\"snuba-subscription-executor\",\n help=\"Consumer group used for consuming the scheduled subscription topic/s.\",\n)\[email protected](\n \"--followed-consumer-group\",\n required=True,\n help=\"Name of the consumer group to follow\",\n)\[email protected](\n \"--total-concurrent-queries\",\n default=64,\n type=int,\n help=\"Total max number of concurrent queries for all replicas. Used to calculate max_concurrent_queries.\",\n)\[email protected](\n \"--auto-offset-reset\",\n default=\"error\",\n type=click.Choice([\"error\", \"earliest\", \"latest\"]),\n help=\"Kafka consumer auto offset reset.\",\n)\[email protected](\n \"--no-strict-offset-reset\",\n is_flag=True,\n help=\"Forces the kafka consumer auto offset reset.\",\n)\[email protected](\"--schedule-ttl\", type=int, default=60 * 5)\[email protected](\"--delay-seconds\", type=int)\[email protected](\n \"--stale-threshold-seconds\",\n type=int,\n help=\"Skip scheduling if timestamp is beyond this threshold compared to the system time\",\n)\[email protected](\"--log-level\", help=\"Logging level to use.\")\ndef subscriptions_scheduler_executor(\n *,\n dataset_name: str,\n entity_names: Sequence[str],\n consumer_group: str,\n followed_consumer_group: str,\n total_concurrent_queries: int,\n auto_offset_reset: str,\n no_strict_offset_reset: bool,\n schedule_ttl: int,\n delay_seconds: Optional[int],\n stale_threshold_seconds: Optional[int],\n log_level: Optional[str],\n) -> None:\n \"\"\"\n Combined subscriptions scheduler and executor. Alternative to the separate scheduler and executor processes.\n \"\"\"\n setup_logging(log_level)\n setup_sentry()\n\n metrics = MetricsWrapper(\n environment.metrics,\n \"subscriptions.scheduler_executor\",\n tags={\"dataset\": dataset_name},\n )\n\n configure_metrics(StreamMetricsAdapter(metrics))\n\n # Just get the result topic configuration from the first entity. 
Later we\n # check they all have the same result topic anyway before building the consumer.\n entity_key = EntityKey(entity_names[0])\n\n storage = get_entity(entity_key).get_writable_storage()\n assert storage is not None\n stream_loader = storage.get_table_writer().get_stream_loader()\n result_topic_spec = stream_loader.get_subscription_scheduled_topic_spec()\n assert result_topic_spec is not None\n\n producer = KafkaProducer(\n build_kafka_producer_configuration(\n result_topic_spec.topic,\n override_params={\"partitioner\": \"consistent\"},\n )\n )\n\n processor = build_scheduler_executor_consumer(\n dataset_name,\n entity_names,\n consumer_group,\n followed_consumer_group,\n producer,\n auto_offset_reset,\n not no_strict_offset_reset,\n schedule_ttl,\n delay_seconds,\n stale_threshold_seconds,\n total_concurrent_queries,\n metrics,\n )\n\n def handler(signum: int, frame: Any) -> None:\n processor.signal_shutdown()\n\n signal.signal(signal.SIGINT, handler)\n signal.signal(signal.SIGTERM, handler)\n\n with closing(producer), flush_querylog(), flush_attribution_producer():\n processor.run()\n\n\n@contextmanager\ndef closing(producer: KafkaProducer) -> Iterator[Optional[KafkaProducer]]:\n try:\n yield producer\n finally:\n producer.close().result()\n\n\n@contextmanager\ndef flush_querylog() -> Iterator[None]:\n try:\n yield\n finally:\n state.flush_producer()\n", "path": "snuba/cli/subscriptions_scheduler_executor.py"}]}
| 1,973 | 206 |
gh_patches_debug_13891
|
rasdani/github-patches
|
git_diff
|
Mailu__Mailu-1862
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Radicale password check fails (length limited?)
So far, the WebDAV Radicale endpoint is protected with basic auth (apparently htaccess or similar).
If the user password is longer than 8 characters, the connection fails.
We should remove this limitation to let users choose strong passwords.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/admin/mailu/internal/views/auth.py`
Content:
```
1 from mailu import models, utils
2 from mailu.internal import internal, nginx
3 from flask import current_app as app
4
5 import flask
6 import flask_login
7 import base64
8 import ipaddress
9
10
11 @internal.route("/auth/email")
12 def nginx_authentication():
13 """ Main authentication endpoint for Nginx email server
14 """
15 limiter = utils.limiter.get_limiter(app.config["AUTH_RATELIMIT"], "auth-ip")
16 client_ip = flask.request.headers["Client-Ip"]
17 if not limiter.test(client_ip):
18 response = flask.Response()
19 response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'
20 response.headers['Auth-Error-Code'] = '451 4.3.2'
21 if int(flask.request.headers['Auth-Login-Attempt']) < 10:
22 response.headers['Auth-Wait'] = '3'
23 return response
24 headers = nginx.handle_authentication(flask.request.headers)
25 response = flask.Response()
26 for key, value in headers.items():
27 response.headers[key] = str(value)
28 if ("Auth-Status" not in headers) or (headers["Auth-Status"] != "OK"):
29 limit_subnet = str(app.config["AUTH_RATELIMIT_SUBNET"]) != 'False'
30 subnet = ipaddress.ip_network(app.config["SUBNET"])
31 if limit_subnet or ipaddress.ip_address(client_ip) not in subnet:
32 limiter.hit(flask.request.headers["Client-Ip"])
33 return response
34
35
36 @internal.route("/auth/admin")
37 def admin_authentication():
38 """ Fails if the user is not an authenticated admin.
39 """
40 if (not flask_login.current_user.is_anonymous
41 and flask_login.current_user.global_admin
42 and flask_login.current_user.enabled):
43 return ""
44 return flask.abort(403)
45
46 @internal.route("/auth/user")
47 def user_authentication():
48 """ Fails if the user is not authenticated.
49 """
50 if (not flask_login.current_user.is_anonymous
51 and flask_login.current_user.enabled):
52 response = flask.Response()
53 response.headers["X-User"] = flask_login.current_user.get_id()
54 response.headers["X-User-Token"] = models.User.get_temp_token(flask_login.current_user.get_id())
55 return response
56 return flask.abort(403)
57
58
59 @internal.route("/auth/basic")
60 def basic_authentication():
61 """ Tries to authenticate using the Authorization header.
62 """
63 authorization = flask.request.headers.get("Authorization")
64 if authorization and authorization.startswith("Basic "):
65 encoded = authorization.replace("Basic ", "")
66 user_email, password = base64.b64decode(encoded).split(b":")
67 user = models.User.query.get(user_email.decode("utf8"))
68 if nginx.check_credentials(user, password.decode('utf-8'), flask.request.remote_addr, "web"):
69 response = flask.Response()
70 response.headers["X-User"] = user.email
71 return response
72 response = flask.Response(status=401)
73 response.headers["WWW-Authenticate"] = 'Basic realm="Login Required"'
74 return response
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/core/admin/mailu/internal/views/auth.py b/core/admin/mailu/internal/views/auth.py
--- a/core/admin/mailu/internal/views/auth.py
+++ b/core/admin/mailu/internal/views/auth.py
@@ -63,7 +63,7 @@
authorization = flask.request.headers.get("Authorization")
if authorization and authorization.startswith("Basic "):
encoded = authorization.replace("Basic ", "")
- user_email, password = base64.b64decode(encoded).split(b":")
+ user_email, password = base64.b64decode(encoded).split(b":", 1)
user = models.User.query.get(user_email.decode("utf8"))
if nginx.check_credentials(user, password.decode('utf-8'), flask.request.remote_addr, "web"):
response = flask.Response()
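A reading of this one-line patch: the issue reports a password-length limit, but `split(b":")` actually breaks whenever the password itself contains a colon, because the Authorization header decodes to `user:pass` and every `:` becomes a split point. A standard-library repro of both behaviors (the sample credentials are made up):

```python
import base64

# Hypothetical credentials; note the colons inside the password.
header = base64.b64encode(b"user@example.com:pa:ss:word")
decoded = base64.b64decode(header)

print(decoded.split(b":"))     # [b'user@example.com', b'pa', b'ss', b'word']
print(decoded.split(b":", 1))  # [b'user@example.com', b'pa:ss:word']

# Unpacking the first result into (user_email, password) raises
# "ValueError: too many values to unpack", which is the failure the fix removes.
```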
|
{"golden_diff": "diff --git a/core/admin/mailu/internal/views/auth.py b/core/admin/mailu/internal/views/auth.py\n--- a/core/admin/mailu/internal/views/auth.py\n+++ b/core/admin/mailu/internal/views/auth.py\n@@ -63,7 +63,7 @@\n authorization = flask.request.headers.get(\"Authorization\")\n if authorization and authorization.startswith(\"Basic \"):\n encoded = authorization.replace(\"Basic \", \"\")\n- user_email, password = base64.b64decode(encoded).split(b\":\")\n+ user_email, password = base64.b64decode(encoded).split(b\":\", 1)\n user = models.User.query.get(user_email.decode(\"utf8\"))\n if nginx.check_credentials(user, password.decode('utf-8'), flask.request.remote_addr, \"web\"):\n response = flask.Response()\n", "issue": "Radicale password check fails (length limited?)\nSo far, the webdav radicale is protected with basic auth (apparently htaccess or similar).\r\nIf user password > 8 then it fails to connect. \r\nwe should remove this limitation to let use stong passwords\nRadicale password check fails (length limited?)\nSo far, the webdav radicale is protected with basic auth (apparently htaccess or similar).\r\nIf user password > 8 then it fails to connect. \r\nwe should remove this limitation to let use stong passwords\n", "before_files": [{"content": "from mailu import models, utils\nfrom mailu.internal import internal, nginx\nfrom flask import current_app as app\n\nimport flask\nimport flask_login\nimport base64\nimport ipaddress\n\n\[email protected](\"/auth/email\")\ndef nginx_authentication():\n \"\"\" Main authentication endpoint for Nginx email server\n \"\"\"\n limiter = utils.limiter.get_limiter(app.config[\"AUTH_RATELIMIT\"], \"auth-ip\")\n client_ip = flask.request.headers[\"Client-Ip\"]\n if not limiter.test(client_ip):\n response = flask.Response()\n response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'\n response.headers['Auth-Error-Code'] = '451 4.3.2'\n if int(flask.request.headers['Auth-Login-Attempt']) < 10:\n response.headers['Auth-Wait'] = '3'\n return response\n headers = nginx.handle_authentication(flask.request.headers)\n response = flask.Response()\n for key, value in headers.items():\n response.headers[key] = str(value)\n if (\"Auth-Status\" not in headers) or (headers[\"Auth-Status\"] != \"OK\"):\n limit_subnet = str(app.config[\"AUTH_RATELIMIT_SUBNET\"]) != 'False'\n subnet = ipaddress.ip_network(app.config[\"SUBNET\"])\n if limit_subnet or ipaddress.ip_address(client_ip) not in subnet:\n limiter.hit(flask.request.headers[\"Client-Ip\"])\n return response\n\n\[email protected](\"/auth/admin\")\ndef admin_authentication():\n \"\"\" Fails if the user is not an authenticated admin.\n \"\"\"\n if (not flask_login.current_user.is_anonymous\n and flask_login.current_user.global_admin\n and flask_login.current_user.enabled):\n return \"\"\n return flask.abort(403)\n\[email protected](\"/auth/user\")\ndef user_authentication():\n \"\"\" Fails if the user is not authenticated.\n \"\"\"\n if (not flask_login.current_user.is_anonymous\n and flask_login.current_user.enabled):\n response = flask.Response()\n response.headers[\"X-User\"] = flask_login.current_user.get_id()\n response.headers[\"X-User-Token\"] = models.User.get_temp_token(flask_login.current_user.get_id())\n return response\n return flask.abort(403)\n\n\[email protected](\"/auth/basic\")\ndef basic_authentication():\n \"\"\" Tries to authenticate using the Authorization header.\n \"\"\"\n authorization = flask.request.headers.get(\"Authorization\")\n if authorization and 
authorization.startswith(\"Basic \"):\n encoded = authorization.replace(\"Basic \", \"\")\n user_email, password = base64.b64decode(encoded).split(b\":\")\n user = models.User.query.get(user_email.decode(\"utf8\"))\n if nginx.check_credentials(user, password.decode('utf-8'), flask.request.remote_addr, \"web\"):\n response = flask.Response()\n response.headers[\"X-User\"] = user.email\n return response\n response = flask.Response(status=401)\n response.headers[\"WWW-Authenticate\"] = 'Basic realm=\"Login Required\"'\n return response\n", "path": "core/admin/mailu/internal/views/auth.py"}], "after_files": [{"content": "from mailu import models, utils\nfrom mailu.internal import internal, nginx\nfrom flask import current_app as app\n\nimport flask\nimport flask_login\nimport base64\nimport ipaddress\n\n\[email protected](\"/auth/email\")\ndef nginx_authentication():\n \"\"\" Main authentication endpoint for Nginx email server\n \"\"\"\n limiter = utils.limiter.get_limiter(app.config[\"AUTH_RATELIMIT\"], \"auth-ip\")\n client_ip = flask.request.headers[\"Client-Ip\"]\n if not limiter.test(client_ip):\n response = flask.Response()\n response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'\n response.headers['Auth-Error-Code'] = '451 4.3.2'\n if int(flask.request.headers['Auth-Login-Attempt']) < 10:\n response.headers['Auth-Wait'] = '3'\n return response\n headers = nginx.handle_authentication(flask.request.headers)\n response = flask.Response()\n for key, value in headers.items():\n response.headers[key] = str(value)\n if (\"Auth-Status\" not in headers) or (headers[\"Auth-Status\"] != \"OK\"):\n limit_subnet = str(app.config[\"AUTH_RATELIMIT_SUBNET\"]) != 'False'\n subnet = ipaddress.ip_network(app.config[\"SUBNET\"])\n if limit_subnet or ipaddress.ip_address(client_ip) not in subnet:\n limiter.hit(flask.request.headers[\"Client-Ip\"])\n return response\n\n\[email protected](\"/auth/admin\")\ndef admin_authentication():\n \"\"\" Fails if the user is not an authenticated admin.\n \"\"\"\n if (not flask_login.current_user.is_anonymous\n and flask_login.current_user.global_admin\n and flask_login.current_user.enabled):\n return \"\"\n return flask.abort(403)\n\[email protected](\"/auth/user\")\ndef user_authentication():\n \"\"\" Fails if the user is not authenticated.\n \"\"\"\n if (not flask_login.current_user.is_anonymous\n and flask_login.current_user.enabled):\n response = flask.Response()\n response.headers[\"X-User\"] = flask_login.current_user.get_id()\n response.headers[\"X-User-Token\"] = models.User.get_temp_token(flask_login.current_user.get_id())\n return response\n return flask.abort(403)\n\n\[email protected](\"/auth/basic\")\ndef basic_authentication():\n \"\"\" Tries to authenticate using the Authorization header.\n \"\"\"\n authorization = flask.request.headers.get(\"Authorization\")\n if authorization and authorization.startswith(\"Basic \"):\n encoded = authorization.replace(\"Basic \", \"\")\n user_email, password = base64.b64decode(encoded).split(b\":\", 1)\n user = models.User.query.get(user_email.decode(\"utf8\"))\n if nginx.check_credentials(user, password.decode('utf-8'), flask.request.remote_addr, \"web\"):\n response = flask.Response()\n response.headers[\"X-User\"] = user.email\n return response\n response = flask.Response(status=401)\n response.headers[\"WWW-Authenticate\"] = 'Basic realm=\"Login Required\"'\n return response\n", "path": "core/admin/mailu/internal/views/auth.py"}]}
| 1,168 | 170 |
gh_patches_debug_11556
|
rasdani/github-patches
|
git_diff
|
zestedesavoir__zds-site-6174
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Duplicate colored logs in the dev environment
On my dev environment, the colored logs show up twice: once in their colored version and then again in their standard version.

OS: Ubuntu Budgie 20.04
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zds/settings/dev.py`
Content:
```
1 from colorlog import ColoredFormatter
2
3 from .abstract_base import *
4
5 DEBUG = True
6
7 # NOTE: Can be removed once Django 3 is used
8 ALLOWED_HOSTS = [".localhost", "127.0.0.1", "[::1]"]
9
10 INSTALLED_APPS += (
11 "debug_toolbar",
12 "django_extensions",
13 )
14
15 MIDDLEWARE = (
16 "debug_toolbar.middleware.DebugToolbarMiddleware",
17 "zds.middlewares.nocacheindevmiddleware.NoCacheInDevMiddleware",
18 ) + MIDDLEWARE
19
20 LOGGING = {
21 "version": 1,
22 "disable_existing_loggers": False,
23 "formatters": {
24 "verbose": {
25 "()": ColoredFormatter,
26 "format": "%(log_color)s %(levelname)s %(reset)s %(bold_black)s%(name)s%(reset)s %(message)s",
27 "log_colors": {
28 "DEBUG": "fg_white,bg_black",
29 "INFO": "fg_black,bg_bold_white",
30 "WARNING": "fg_black,bg_bold_yellow",
31 "ERROR": "fg_bold_white,bg_bold_red",
32 "CRITICAL": "fg_bold_white,bg_bold_red",
33 },
34 },
35 "django.server": {
36 "()": ColoredFormatter,
37 "format": "%(log_color)s%(message)s",
38 "log_colors": {
39 "INFO": "bold_black",
40 "WARNING": "bold_yellow",
41 "ERROR": "bold_red",
42 "CRITICAL": "bold_red",
43 },
44 },
45 },
46 "handlers": {
47 "console": {
48 "level": "DEBUG",
49 "class": "logging.StreamHandler",
50 "formatter": "verbose",
51 },
52 "django.server": {
53 "level": "DEBUG",
54 "class": "logging.StreamHandler",
55 "formatter": "django.server",
56 },
57 },
58 "loggers": {
59 "django": {
60 "level": "INFO",
61 "handlers": ["console"],
62 },
63 "django.server": {
64 "level": "INFO",
65 "handlers": ["django.server"],
66 "propagate": False,
67 },
68 "zds": {
69 "level": "INFO",
70 "handlers": ["console"],
71 },
72 },
73 }
74
75 ZDS_APP["site"]["url"] = "http://127.0.0.1:8000"
76 ZDS_APP["site"]["dns"] = "127.0.0.1:8000"
77
78 ZDS_APP["very_top_banner"] = {
79 "background_color": "#666",
80 "border_color": "#353535",
81 "color": "white",
82 "message": "Version locale",
83 "slug": "version-locale",
84 }
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/zds/settings/dev.py b/zds/settings/dev.py
--- a/zds/settings/dev.py
+++ b/zds/settings/dev.py
@@ -55,10 +55,12 @@
"formatter": "django.server",
},
},
+ "root": {
+ "handlers": ["console"],
+ },
"loggers": {
"django": {
"level": "INFO",
- "handlers": ["console"],
},
"django.server": {
"level": "INFO",
@@ -67,7 +69,6 @@
},
"zds": {
"level": "INFO",
- "handlers": ["console"],
},
},
}
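The mechanism behind the fix: a record logged to `zds` is handled by that logger's own console handler and then propagates up to the root logger, so if anything attaches a handler to root as well (an assumption about the dev environment, since the config above defines none), every line prints twice. Moving the console handler to `root` and relying on propagation gives exactly one emission. A minimal stdlib sketch of the duplication:

```python
import logging

logging.getLogger().addHandler(logging.StreamHandler())  # stand-in for a root handler added elsewhere

zds = logging.getLogger("zds")
zds.addHandler(logging.StreamHandler())  # logger-level handler, as in the old LOGGING config

zds.warning("hello")  # printed twice: once per handler, because propagate=True
```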
|
{"golden_diff": "diff --git a/zds/settings/dev.py b/zds/settings/dev.py\n--- a/zds/settings/dev.py\n+++ b/zds/settings/dev.py\n@@ -55,10 +55,12 @@\n \"formatter\": \"django.server\",\n },\n },\n+ \"root\": {\n+ \"handlers\": [\"console\"],\n+ },\n \"loggers\": {\n \"django\": {\n \"level\": \"INFO\",\n- \"handlers\": [\"console\"],\n },\n \"django.server\": {\n \"level\": \"INFO\",\n@@ -67,7 +69,6 @@\n },\n \"zds\": {\n \"level\": \"INFO\",\n- \"handlers\": [\"console\"],\n },\n },\n }\n", "issue": "Doublon pour les logs en couleur dans l'env de dev\nSur mon environnement de dev, les logs en couleur apparaissent en double : dans leur version couleur et puis dans leur version standard.\r\n\r\n\r\n\r\nOS : Ubuntu Budgie 20.04\n", "before_files": [{"content": "from colorlog import ColoredFormatter\n\nfrom .abstract_base import *\n\nDEBUG = True\n\n# NOTE: Can be removed once Django 3 is used\nALLOWED_HOSTS = [\".localhost\", \"127.0.0.1\", \"[::1]\"]\n\nINSTALLED_APPS += (\n \"debug_toolbar\",\n \"django_extensions\",\n)\n\nMIDDLEWARE = (\n \"debug_toolbar.middleware.DebugToolbarMiddleware\",\n \"zds.middlewares.nocacheindevmiddleware.NoCacheInDevMiddleware\",\n) + MIDDLEWARE\n\nLOGGING = {\n \"version\": 1,\n \"disable_existing_loggers\": False,\n \"formatters\": {\n \"verbose\": {\n \"()\": ColoredFormatter,\n \"format\": \"%(log_color)s %(levelname)s %(reset)s %(bold_black)s%(name)s%(reset)s %(message)s\",\n \"log_colors\": {\n \"DEBUG\": \"fg_white,bg_black\",\n \"INFO\": \"fg_black,bg_bold_white\",\n \"WARNING\": \"fg_black,bg_bold_yellow\",\n \"ERROR\": \"fg_bold_white,bg_bold_red\",\n \"CRITICAL\": \"fg_bold_white,bg_bold_red\",\n },\n },\n \"django.server\": {\n \"()\": ColoredFormatter,\n \"format\": \"%(log_color)s%(message)s\",\n \"log_colors\": {\n \"INFO\": \"bold_black\",\n \"WARNING\": \"bold_yellow\",\n \"ERROR\": \"bold_red\",\n \"CRITICAL\": \"bold_red\",\n },\n },\n },\n \"handlers\": {\n \"console\": {\n \"level\": \"DEBUG\",\n \"class\": \"logging.StreamHandler\",\n \"formatter\": \"verbose\",\n },\n \"django.server\": {\n \"level\": \"DEBUG\",\n \"class\": \"logging.StreamHandler\",\n \"formatter\": \"django.server\",\n },\n },\n \"loggers\": {\n \"django\": {\n \"level\": \"INFO\",\n \"handlers\": [\"console\"],\n },\n \"django.server\": {\n \"level\": \"INFO\",\n \"handlers\": [\"django.server\"],\n \"propagate\": False,\n },\n \"zds\": {\n \"level\": \"INFO\",\n \"handlers\": [\"console\"],\n },\n },\n}\n\nZDS_APP[\"site\"][\"url\"] = \"http://127.0.0.1:8000\"\nZDS_APP[\"site\"][\"dns\"] = \"127.0.0.1:8000\"\n\nZDS_APP[\"very_top_banner\"] = {\n \"background_color\": \"#666\",\n \"border_color\": \"#353535\",\n \"color\": \"white\",\n \"message\": \"Version locale\",\n \"slug\": \"version-locale\",\n}\n", "path": "zds/settings/dev.py"}], "after_files": [{"content": "from colorlog import ColoredFormatter\n\nfrom .abstract_base import *\n\nDEBUG = True\n\n# NOTE: Can be removed once Django 3 is used\nALLOWED_HOSTS = [\".localhost\", \"127.0.0.1\", \"[::1]\"]\n\nINSTALLED_APPS += (\n \"debug_toolbar\",\n \"django_extensions\",\n)\n\nMIDDLEWARE = (\n \"debug_toolbar.middleware.DebugToolbarMiddleware\",\n \"zds.middlewares.nocacheindevmiddleware.NoCacheInDevMiddleware\",\n) + MIDDLEWARE\n\nLOGGING = {\n \"version\": 1,\n \"disable_existing_loggers\": False,\n \"formatters\": {\n \"verbose\": {\n \"()\": ColoredFormatter,\n \"format\": \"%(log_color)s %(levelname)s %(reset)s %(bold_black)s%(name)s%(reset)s %(message)s\",\n \"log_colors\": {\n \"DEBUG\": 
\"fg_white,bg_black\",\n \"INFO\": \"fg_black,bg_bold_white\",\n \"WARNING\": \"fg_black,bg_bold_yellow\",\n \"ERROR\": \"fg_bold_white,bg_bold_red\",\n \"CRITICAL\": \"fg_bold_white,bg_bold_red\",\n },\n },\n \"django.server\": {\n \"()\": ColoredFormatter,\n \"format\": \"%(log_color)s%(message)s\",\n \"log_colors\": {\n \"INFO\": \"bold_black\",\n \"WARNING\": \"bold_yellow\",\n \"ERROR\": \"bold_red\",\n \"CRITICAL\": \"bold_red\",\n },\n },\n },\n \"handlers\": {\n \"console\": {\n \"level\": \"DEBUG\",\n \"class\": \"logging.StreamHandler\",\n \"formatter\": \"verbose\",\n },\n \"django.server\": {\n \"level\": \"DEBUG\",\n \"class\": \"logging.StreamHandler\",\n \"formatter\": \"django.server\",\n },\n },\n \"root\": {\n \"handlers\": [\"console\"],\n },\n \"loggers\": {\n \"django\": {\n \"level\": \"INFO\",\n },\n \"django.server\": {\n \"level\": \"INFO\",\n \"handlers\": [\"django.server\"],\n \"propagate\": False,\n },\n \"zds\": {\n \"level\": \"INFO\",\n },\n },\n}\n\nZDS_APP[\"site\"][\"url\"] = \"http://127.0.0.1:8000\"\nZDS_APP[\"site\"][\"dns\"] = \"127.0.0.1:8000\"\n\nZDS_APP[\"very_top_banner\"] = {\n \"background_color\": \"#666\",\n \"border_color\": \"#353535\",\n \"color\": \"white\",\n \"message\": \"Version locale\",\n \"slug\": \"version-locale\",\n}\n", "path": "zds/settings/dev.py"}]}
| 1,123 | 156 |
gh_patches_debug_242
|
rasdani/github-patches
|
git_diff
|
sanic-org__sanic-1559
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
2 failed tests when tox is not used (missing fixture "benchmark")
`pytest-benchmark` is not present in `tests_require`, so there are 2 failed tests in `tests/benchmark/test_route_resolution_benchmark.py` when tox is not used.
This requirement is present in `tox.ini`, so tox and Travis CI are working fine.
(I don't know which is the better fix: disable the benchmark tests or add `pytest-benchmark` to `tests_require`, so I didn't create a PR.)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """
2 Sanic
3 """
4 import codecs
5 import os
6 import re
7 import sys
8 from distutils.util import strtobool
9
10 from setuptools import setup
11 from setuptools.command.test import test as TestCommand
12
13
14 class PyTest(TestCommand):
15 """
16 Provide a Test runner to be used from setup.py to run unit tests
17 """
18
19 user_options = [("pytest-args=", "a", "Arguments to pass to pytest")]
20
21 def initialize_options(self):
22 TestCommand.initialize_options(self)
23 self.pytest_args = ""
24
25 def run_tests(self):
26 import shlex
27 import pytest
28
29 errno = pytest.main(shlex.split(self.pytest_args))
30 sys.exit(errno)
31
32
33 def open_local(paths, mode="r", encoding="utf8"):
34 path = os.path.join(os.path.abspath(os.path.dirname(__file__)), *paths)
35
36 return codecs.open(path, mode, encoding)
37
38
39 with open_local(["sanic", "__init__.py"], encoding="latin1") as fp:
40 try:
41 version = re.findall(
42 r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M
43 )[0]
44 except IndexError:
45 raise RuntimeError("Unable to determine version.")
46
47 with open_local(["README.rst"]) as rm:
48 long_description = rm.read()
49
50 setup_kwargs = {
51 "name": "sanic",
52 "version": version,
53 "url": "http://github.com/channelcat/sanic/",
54 "license": "MIT",
55 "author": "Channel Cat",
56 "author_email": "[email protected]",
57 "description": (
58 "A microframework based on uvloop, httptools, and learnings of flask"
59 ),
60 "long_description": long_description,
61 "packages": ["sanic"],
62 "platforms": "any",
63 "classifiers": [
64 "Development Status :: 4 - Beta",
65 "Environment :: Web Environment",
66 "License :: OSI Approved :: MIT License",
67 "Programming Language :: Python :: 3.5",
68 "Programming Language :: Python :: 3.6",
69 "Programming Language :: Python :: 3.7",
70 ],
71 }
72
73 env_dependency = (
74 '; sys_platform != "win32" ' 'and implementation_name == "cpython"'
75 )
76 ujson = "ujson>=1.35" + env_dependency
77 uvloop = "uvloop>=0.5.3" + env_dependency
78
79 requirements = [
80 "httptools>=0.0.10",
81 uvloop,
82 ujson,
83 "aiofiles>=0.3.0",
84 "websockets>=6.0,<7.0",
85 "multidict>=4.0,<5.0",
86 ]
87
88 tests_require = [
89 "pytest==4.1.0",
90 "multidict>=4.0,<5.0",
91 "gunicorn",
92 "pytest-cov",
93 "aiohttp>=2.3.0,<=3.2.1",
94 "beautifulsoup4",
95 uvloop,
96 ujson,
97 "pytest-sanic",
98 "pytest-sugar",
99 ]
100
101 if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
102 print("Installing without uJSON")
103 requirements.remove(ujson)
104 tests_require.remove(ujson)
105
106 # 'nt' means windows OS
107 if strtobool(os.environ.get("SANIC_NO_UVLOOP", "no")):
108 print("Installing without uvLoop")
109 requirements.remove(uvloop)
110 tests_require.remove(uvloop)
111
112 extras_require = {
113 "test": tests_require,
114 "dev": tests_require + ["aiofiles", "tox", "black", "flake8"],
115 "docs": [
116 "sphinx",
117 "sphinx_rtd_theme",
118 "recommonmark",
119 "sphinxcontrib-asyncio",
120 "docutils",
121 "pygments"
122 ],
123 }
124
125 setup_kwargs["install_requires"] = requirements
126 setup_kwargs["tests_require"] = tests_require
127 setup_kwargs["extras_require"] = extras_require
128 setup_kwargs["cmdclass"] = {"test": PyTest}
129 setup(**setup_kwargs)
130
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -96,6 +96,7 @@
ujson,
"pytest-sanic",
"pytest-sugar",
+ "pytest-benchmark",
]
if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
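For context, the missing fixture comes from pytest-benchmark's `benchmark` fixture; with the dependency added above, a test receives it automatically. A toy usage sketch (the workload is made up, not taken from the sanic test suite):

```python
def test_sum_benchmark(benchmark):
    # pytest-benchmark runs the callable many times and records timing stats.
    result = benchmark(sum, range(100))
    assert result == 4950
```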
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -96,6 +96,7 @@\n ujson,\n \"pytest-sanic\",\n \"pytest-sugar\",\n+ \"pytest-benchmark\",\n ]\n \n if strtobool(os.environ.get(\"SANIC_NO_UJSON\", \"no\")):\n", "issue": "2 failed tests when tox is not used (missing fixture \"benchmark\")\n`pytest-benchmark` is not present in `tests_require`, so there are 2 failed tests in `tests/benchmark/test_route_resolution_benchmark.py` when tox is not used.\r\n\r\nThis requirement is present in `tox.ini` so tox and Travis CI are working fine.\r\n\r\n(I don't know what's a better fix \u2014 disable the benchmark tests or add `pytest-benchmark` to `tests_require`, so I didn't create a PR)\n", "before_files": [{"content": "\"\"\"\nSanic\n\"\"\"\nimport codecs\nimport os\nimport re\nimport sys\nfrom distutils.util import strtobool\n\nfrom setuptools import setup\nfrom setuptools.command.test import test as TestCommand\n\n\nclass PyTest(TestCommand):\n \"\"\"\n Provide a Test runner to be used from setup.py to run unit tests\n \"\"\"\n\n user_options = [(\"pytest-args=\", \"a\", \"Arguments to pass to pytest\")]\n\n def initialize_options(self):\n TestCommand.initialize_options(self)\n self.pytest_args = \"\"\n\n def run_tests(self):\n import shlex\n import pytest\n\n errno = pytest.main(shlex.split(self.pytest_args))\n sys.exit(errno)\n\n\ndef open_local(paths, mode=\"r\", encoding=\"utf8\"):\n path = os.path.join(os.path.abspath(os.path.dirname(__file__)), *paths)\n\n return codecs.open(path, mode, encoding)\n\n\nwith open_local([\"sanic\", \"__init__.py\"], encoding=\"latin1\") as fp:\n try:\n version = re.findall(\n r\"^__version__ = \\\"([^']+)\\\"\\r?$\", fp.read(), re.M\n )[0]\n except IndexError:\n raise RuntimeError(\"Unable to determine version.\")\n\nwith open_local([\"README.rst\"]) as rm:\n long_description = rm.read()\n\nsetup_kwargs = {\n \"name\": \"sanic\",\n \"version\": version,\n \"url\": \"http://github.com/channelcat/sanic/\",\n \"license\": \"MIT\",\n \"author\": \"Channel Cat\",\n \"author_email\": \"[email protected]\",\n \"description\": (\n \"A microframework based on uvloop, httptools, and learnings of flask\"\n ),\n \"long_description\": long_description,\n \"packages\": [\"sanic\"],\n \"platforms\": \"any\",\n \"classifiers\": [\n \"Development Status :: 4 - Beta\",\n \"Environment :: Web Environment\",\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n ],\n}\n\nenv_dependency = (\n '; sys_platform != \"win32\" ' 'and implementation_name == \"cpython\"'\n)\nujson = \"ujson>=1.35\" + env_dependency\nuvloop = \"uvloop>=0.5.3\" + env_dependency\n\nrequirements = [\n \"httptools>=0.0.10\",\n uvloop,\n ujson,\n \"aiofiles>=0.3.0\",\n \"websockets>=6.0,<7.0\",\n \"multidict>=4.0,<5.0\",\n]\n\ntests_require = [\n \"pytest==4.1.0\",\n \"multidict>=4.0,<5.0\",\n \"gunicorn\",\n \"pytest-cov\",\n \"aiohttp>=2.3.0,<=3.2.1\",\n \"beautifulsoup4\",\n uvloop,\n ujson,\n \"pytest-sanic\",\n \"pytest-sugar\",\n]\n\nif strtobool(os.environ.get(\"SANIC_NO_UJSON\", \"no\")):\n print(\"Installing without uJSON\")\n requirements.remove(ujson)\n tests_require.remove(ujson)\n\n# 'nt' means windows OS\nif strtobool(os.environ.get(\"SANIC_NO_UVLOOP\", \"no\")):\n print(\"Installing without uvLoop\")\n requirements.remove(uvloop)\n tests_require.remove(uvloop)\n\nextras_require = {\n \"test\": tests_require,\n \"dev\": tests_require + 
[\"aiofiles\", \"tox\", \"black\", \"flake8\"],\n \"docs\": [\n \"sphinx\",\n \"sphinx_rtd_theme\",\n \"recommonmark\",\n \"sphinxcontrib-asyncio\",\n \"docutils\",\n \"pygments\"\n ],\n}\n\nsetup_kwargs[\"install_requires\"] = requirements\nsetup_kwargs[\"tests_require\"] = tests_require\nsetup_kwargs[\"extras_require\"] = extras_require\nsetup_kwargs[\"cmdclass\"] = {\"test\": PyTest}\nsetup(**setup_kwargs)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"\nSanic\n\"\"\"\nimport codecs\nimport os\nimport re\nimport sys\nfrom distutils.util import strtobool\n\nfrom setuptools import setup\nfrom setuptools.command.test import test as TestCommand\n\n\nclass PyTest(TestCommand):\n \"\"\"\n Provide a Test runner to be used from setup.py to run unit tests\n \"\"\"\n\n user_options = [(\"pytest-args=\", \"a\", \"Arguments to pass to pytest\")]\n\n def initialize_options(self):\n TestCommand.initialize_options(self)\n self.pytest_args = \"\"\n\n def run_tests(self):\n import shlex\n import pytest\n\n errno = pytest.main(shlex.split(self.pytest_args))\n sys.exit(errno)\n\n\ndef open_local(paths, mode=\"r\", encoding=\"utf8\"):\n path = os.path.join(os.path.abspath(os.path.dirname(__file__)), *paths)\n\n return codecs.open(path, mode, encoding)\n\n\nwith open_local([\"sanic\", \"__init__.py\"], encoding=\"latin1\") as fp:\n try:\n version = re.findall(\n r\"^__version__ = \\\"([^']+)\\\"\\r?$\", fp.read(), re.M\n )[0]\n except IndexError:\n raise RuntimeError(\"Unable to determine version.\")\n\nwith open_local([\"README.rst\"]) as rm:\n long_description = rm.read()\n\nsetup_kwargs = {\n \"name\": \"sanic\",\n \"version\": version,\n \"url\": \"http://github.com/channelcat/sanic/\",\n \"license\": \"MIT\",\n \"author\": \"Channel Cat\",\n \"author_email\": \"[email protected]\",\n \"description\": (\n \"A microframework based on uvloop, httptools, and learnings of flask\"\n ),\n \"long_description\": long_description,\n \"packages\": [\"sanic\"],\n \"platforms\": \"any\",\n \"classifiers\": [\n \"Development Status :: 4 - Beta\",\n \"Environment :: Web Environment\",\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n ],\n}\n\nenv_dependency = (\n '; sys_platform != \"win32\" ' 'and implementation_name == \"cpython\"'\n)\nujson = \"ujson>=1.35\" + env_dependency\nuvloop = \"uvloop>=0.5.3\" + env_dependency\n\nrequirements = [\n \"httptools>=0.0.10\",\n uvloop,\n ujson,\n \"aiofiles>=0.3.0\",\n \"websockets>=6.0,<7.0\",\n \"multidict>=4.0,<5.0\",\n]\n\ntests_require = [\n \"pytest==4.1.0\",\n \"multidict>=4.0,<5.0\",\n \"gunicorn\",\n \"pytest-cov\",\n \"aiohttp>=2.3.0,<=3.2.1\",\n \"beautifulsoup4\",\n uvloop,\n ujson,\n \"pytest-sanic\",\n \"pytest-sugar\",\n \"pytest-benchmark\",\n]\n\nif strtobool(os.environ.get(\"SANIC_NO_UJSON\", \"no\")):\n print(\"Installing without uJSON\")\n requirements.remove(ujson)\n tests_require.remove(ujson)\n\n# 'nt' means windows OS\nif strtobool(os.environ.get(\"SANIC_NO_UVLOOP\", \"no\")):\n print(\"Installing without uvLoop\")\n requirements.remove(uvloop)\n tests_require.remove(uvloop)\n\nextras_require = {\n \"test\": tests_require,\n \"dev\": tests_require + [\"aiofiles\", \"tox\", \"black\", \"flake8\"],\n \"docs\": [\n \"sphinx\",\n \"sphinx_rtd_theme\",\n \"recommonmark\",\n \"sphinxcontrib-asyncio\",\n \"docutils\",\n \"pygments\"\n ],\n}\n\nsetup_kwargs[\"install_requires\"] = 
requirements\nsetup_kwargs[\"tests_require\"] = tests_require\nsetup_kwargs[\"extras_require\"] = extras_require\nsetup_kwargs[\"cmdclass\"] = {\"test\": PyTest}\nsetup(**setup_kwargs)\n", "path": "setup.py"}]}
| 1,535 | 75 |
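
Since the golden diff above only adds the dependency, a minimal sketch of what a pytest-benchmark test looks like may be useful; the `benchmark` fixture is provided by the plugin, and the `parse_qs` workload and test name below are illustrative stand-ins, not Sanic's actual route-resolution benchmark.

```python
# Hypothetical pytest-benchmark usage; the `benchmark` fixture exists only
# once the pytest-benchmark plugin is installed.
from urllib.parse import parse_qs


def test_parse_qs_benchmark(benchmark):
    # benchmark(fn, *args) calls fn repeatedly, records timing statistics,
    # and returns the result of one call.
    result = benchmark(parse_qs, "a=1&b=2")
    assert result["a"] == ["1"]
```

Without the plugin installed, pytest reports exactly the failure from the issue: a missing `benchmark` fixture.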
gh_patches_debug_40285
|
rasdani/github-patches
|
git_diff
|
optuna__optuna-3115
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[RFC] Show progress bar based on `timeout`
<!-- Please write a clear and concise description of the feature proposal. -->
## Motivation
The current [`study.optimize`](https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.Study.html#optuna.study.Study.optimize) shows the progress bar when we specify the number of trials. We can show the progress bar based on the `timeout` value as well.
Note that before implementing this feature, we need to decide the priority of `n_trials` and `timeout`. When we specify both values as not `None` values, there are two ways to show the progress bar.
Moreover, we need to discuss the case when both `None`. I suppose not showing the progress bar makes sense when both are `None`.
## Description
<!-- Please write a detailed description of the new feature. -->
- Show progress bar using `timeout` value.
- Add test by following https://github.com/optuna/optuna/pull/2900
## Alternatives (optional)
<!-- Please write a clear and concise description of any alternative solutions or features you've considered. -->
## Additional context (optional)
We might need to support the `n_jobs!=1` case depending on https://github.com/optuna/optuna/issues/2957.
<!-- Please add any other context or screenshots about the feature request here. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `optuna/progress_bar.py`
Content:
```
1 import logging
2 from typing import Any
3 from typing import Optional
4
5 from tqdm.auto import tqdm
6
7 from optuna import logging as optuna_logging
8 from optuna._experimental import experimental
9
10
11 _tqdm_handler: Optional["_TqdmLoggingHandler"] = None
12
13
14 # Reference: https://gist.github.com/hvy/8b80c2cedf02b15c24f85d1fa17ebe02
15 class _TqdmLoggingHandler(logging.StreamHandler):
16 def emit(self, record: Any) -> None:
17 try:
18 msg = self.format(record)
19 tqdm.write(msg)
20 self.flush()
21 except (KeyboardInterrupt, SystemExit):
22 raise
23 except Exception:
24 self.handleError(record)
25
26
27 class _ProgressBar(object):
28 """Progress Bar implementation for :func:`~optuna.study.Study.optimize` on the top of `tqdm`.
29
30 Args:
31 is_valid:
32 Whether to show progress bars in :func:`~optuna.study.Study.optimize`.
33 n_trials:
34 The number of trials.
35 timeout:
36 Stop study after the given number of second(s).
37 """
38
39 def __init__(
40 self, is_valid: bool, n_trials: Optional[int] = None, timeout: Optional[float] = None
41 ) -> None:
42 self._is_valid = is_valid
43 self._n_trials = n_trials
44 self._timeout = timeout
45
46 if self._is_valid:
47 self._init_valid()
48
49 # TODO(hvy): Remove initialization indirection via this method when the progress bar is no
50 # longer experimental.
51 @experimental("1.2.0", name="Progress bar")
52 def _init_valid(self) -> None:
53 self._progress_bar = tqdm(range(self._n_trials) if self._n_trials is not None else None)
54 global _tqdm_handler
55
56 _tqdm_handler = _TqdmLoggingHandler()
57 _tqdm_handler.setLevel(logging.INFO)
58 _tqdm_handler.setFormatter(optuna_logging.create_default_formatter())
59 optuna_logging.disable_default_handler()
60 optuna_logging._get_library_root_logger().addHandler(_tqdm_handler)
61
62 def update(self, elapsed_seconds: Optional[float]) -> None:
63 """Update the progress bars if ``is_valid`` is :obj:`True`.
64
65 Args:
66 elapsed_seconds:
67 The time past since :func:`~optuna.study.Study.optimize` started.
68 """
69 if self._is_valid:
70 self._progress_bar.update(1)
71 if self._timeout is not None and elapsed_seconds is not None:
72 self._progress_bar.set_postfix_str(
73 "{:.02f}/{} seconds".format(elapsed_seconds, self._timeout)
74 )
75
76 def close(self) -> None:
77 """Close progress bars."""
78 if self._is_valid:
79 self._progress_bar.close()
80 assert _tqdm_handler is not None
81 optuna_logging._get_library_root_logger().removeHandler(_tqdm_handler)
82 optuna_logging.enable_default_handler()
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/optuna/progress_bar.py b/optuna/progress_bar.py
--- a/optuna/progress_bar.py
+++ b/optuna/progress_bar.py
@@ -39,9 +39,11 @@
def __init__(
self, is_valid: bool, n_trials: Optional[int] = None, timeout: Optional[float] = None
) -> None:
- self._is_valid = is_valid
+
+ self._is_valid = is_valid and (n_trials or timeout) is not None
self._n_trials = n_trials
self._timeout = timeout
+ self._last_elapsed_seconds = 0.0
if self._is_valid:
self._init_valid()
@@ -50,7 +52,21 @@
# longer experimental.
@experimental("1.2.0", name="Progress bar")
def _init_valid(self) -> None:
- self._progress_bar = tqdm(range(self._n_trials) if self._n_trials is not None else None)
+
+ if self._n_trials is not None:
+ self._progress_bar = tqdm(total=self._n_trials)
+
+ else:
+ fmt = "{percentage:3.0f}%|{bar}| {elapsed}/{desc}"
+ self._progress_bar = tqdm(total=self._timeout, bar_format=fmt)
+
+ # Using description string instead postfix string
+ # to display formatted timeout, since postfix carries
+ # extra comma space auto-format.
+ # https://github.com/tqdm/tqdm/issues/712
+ total = tqdm.format_interval(self._timeout)
+ self._progress_bar.set_description_str(total)
+
global _tqdm_handler
_tqdm_handler = _TqdmLoggingHandler()
@@ -59,22 +75,37 @@
optuna_logging.disable_default_handler()
optuna_logging._get_library_root_logger().addHandler(_tqdm_handler)
- def update(self, elapsed_seconds: Optional[float]) -> None:
+ def update(self, elapsed_seconds: float) -> None:
"""Update the progress bars if ``is_valid`` is :obj:`True`.
Args:
elapsed_seconds:
The time past since :func:`~optuna.study.Study.optimize` started.
"""
+
if self._is_valid:
- self._progress_bar.update(1)
- if self._timeout is not None and elapsed_seconds is not None:
- self._progress_bar.set_postfix_str(
- "{:.02f}/{} seconds".format(elapsed_seconds, self._timeout)
- )
+ if self._n_trials is not None:
+ self._progress_bar.update(1)
+ if self._timeout is not None:
+ self._progress_bar.set_postfix_str(
+ "{:.02f}/{} seconds".format(elapsed_seconds, self._timeout)
+ )
+
+ elif self._timeout is not None:
+ time_diff = elapsed_seconds - self._last_elapsed_seconds
+ if elapsed_seconds > self._timeout:
+ # Clip elapsed time to avoid tqdm warnings.
+ time_diff -= elapsed_seconds - self._timeout
+
+ self._progress_bar.update(time_diff)
+ self._last_elapsed_seconds = elapsed_seconds
+
+ else:
+ assert False
def close(self) -> None:
"""Close progress bars."""
+
if self._is_valid:
self._progress_bar.close()
assert _tqdm_handler is not None
|
{"golden_diff": "diff --git a/optuna/progress_bar.py b/optuna/progress_bar.py\n--- a/optuna/progress_bar.py\n+++ b/optuna/progress_bar.py\n@@ -39,9 +39,11 @@\n def __init__(\n self, is_valid: bool, n_trials: Optional[int] = None, timeout: Optional[float] = None\n ) -> None:\n- self._is_valid = is_valid\n+\n+ self._is_valid = is_valid and (n_trials or timeout) is not None\n self._n_trials = n_trials\n self._timeout = timeout\n+ self._last_elapsed_seconds = 0.0\n \n if self._is_valid:\n self._init_valid()\n@@ -50,7 +52,21 @@\n # longer experimental.\n @experimental(\"1.2.0\", name=\"Progress bar\")\n def _init_valid(self) -> None:\n- self._progress_bar = tqdm(range(self._n_trials) if self._n_trials is not None else None)\n+\n+ if self._n_trials is not None:\n+ self._progress_bar = tqdm(total=self._n_trials)\n+\n+ else:\n+ fmt = \"{percentage:3.0f}%|{bar}| {elapsed}/{desc}\"\n+ self._progress_bar = tqdm(total=self._timeout, bar_format=fmt)\n+\n+ # Using description string instead postfix string\n+ # to display formatted timeout, since postfix carries\n+ # extra comma space auto-format.\n+ # https://github.com/tqdm/tqdm/issues/712\n+ total = tqdm.format_interval(self._timeout)\n+ self._progress_bar.set_description_str(total)\n+\n global _tqdm_handler\n \n _tqdm_handler = _TqdmLoggingHandler()\n@@ -59,22 +75,37 @@\n optuna_logging.disable_default_handler()\n optuna_logging._get_library_root_logger().addHandler(_tqdm_handler)\n \n- def update(self, elapsed_seconds: Optional[float]) -> None:\n+ def update(self, elapsed_seconds: float) -> None:\n \"\"\"Update the progress bars if ``is_valid`` is :obj:`True`.\n \n Args:\n elapsed_seconds:\n The time past since :func:`~optuna.study.Study.optimize` started.\n \"\"\"\n+\n if self._is_valid:\n- self._progress_bar.update(1)\n- if self._timeout is not None and elapsed_seconds is not None:\n- self._progress_bar.set_postfix_str(\n- \"{:.02f}/{} seconds\".format(elapsed_seconds, self._timeout)\n- )\n+ if self._n_trials is not None:\n+ self._progress_bar.update(1)\n+ if self._timeout is not None:\n+ self._progress_bar.set_postfix_str(\n+ \"{:.02f}/{} seconds\".format(elapsed_seconds, self._timeout)\n+ )\n+\n+ elif self._timeout is not None:\n+ time_diff = elapsed_seconds - self._last_elapsed_seconds\n+ if elapsed_seconds > self._timeout:\n+ # Clip elapsed time to avoid tqdm warnings.\n+ time_diff -= elapsed_seconds - self._timeout\n+\n+ self._progress_bar.update(time_diff)\n+ self._last_elapsed_seconds = elapsed_seconds\n+\n+ else:\n+ assert False\n \n def close(self) -> None:\n \"\"\"Close progress bars.\"\"\"\n+\n if self._is_valid:\n self._progress_bar.close()\n assert _tqdm_handler is not None\n", "issue": "[RFC] Show progress bar based on `timeout`\n<!-- Please write a clear and concise description of the feature proposal. -->\r\n\r\n## Motivation\r\n\r\nThe current [`study.optimize`](https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.Study.html#optuna.study.Study.optimize) shows the progress bar when we specify the number of trials. We can show the progress bar based on the `timeout` value as well.\r\n\r\nNote that before implementing this feature, we need to decide the priority of `n_trials` and `timeout`. When we specify both values as not `None` values, there are two ways to show the progress bar. \r\n\r\nMoreover, we need to discuss the case when both `None`. I suppose not showing the progress bar makes sense when both are `None`.\r\n\r\n## Description\r\n\r\n<!-- Please write a detailed description of the new feature. 
-->\r\n\r\n\r\n- Show progress bar using `timeout` value.\r\n- Add test by following https://github.com/optuna/optuna/pull/2900\r\n\r\n## Alternatives (optional)\r\n\r\n\r\n<!-- Please write a clear and concise description of any alternative solutions or features you've considered. -->\r\n\r\n## Additional context (optional)\r\n\r\nWe might need to support the `n_jobs!=1` case depending on https://github.com/optuna/optuna/issues/2957.\r\n\r\n<!-- Please add any other context or screenshots about the feature request here. -->\r\n\n", "before_files": [{"content": "import logging\nfrom typing import Any\nfrom typing import Optional\n\nfrom tqdm.auto import tqdm\n\nfrom optuna import logging as optuna_logging\nfrom optuna._experimental import experimental\n\n\n_tqdm_handler: Optional[\"_TqdmLoggingHandler\"] = None\n\n\n# Reference: https://gist.github.com/hvy/8b80c2cedf02b15c24f85d1fa17ebe02\nclass _TqdmLoggingHandler(logging.StreamHandler):\n def emit(self, record: Any) -> None:\n try:\n msg = self.format(record)\n tqdm.write(msg)\n self.flush()\n except (KeyboardInterrupt, SystemExit):\n raise\n except Exception:\n self.handleError(record)\n\n\nclass _ProgressBar(object):\n \"\"\"Progress Bar implementation for :func:`~optuna.study.Study.optimize` on the top of `tqdm`.\n\n Args:\n is_valid:\n Whether to show progress bars in :func:`~optuna.study.Study.optimize`.\n n_trials:\n The number of trials.\n timeout:\n Stop study after the given number of second(s).\n \"\"\"\n\n def __init__(\n self, is_valid: bool, n_trials: Optional[int] = None, timeout: Optional[float] = None\n ) -> None:\n self._is_valid = is_valid\n self._n_trials = n_trials\n self._timeout = timeout\n\n if self._is_valid:\n self._init_valid()\n\n # TODO(hvy): Remove initialization indirection via this method when the progress bar is no\n # longer experimental.\n @experimental(\"1.2.0\", name=\"Progress bar\")\n def _init_valid(self) -> None:\n self._progress_bar = tqdm(range(self._n_trials) if self._n_trials is not None else None)\n global _tqdm_handler\n\n _tqdm_handler = _TqdmLoggingHandler()\n _tqdm_handler.setLevel(logging.INFO)\n _tqdm_handler.setFormatter(optuna_logging.create_default_formatter())\n optuna_logging.disable_default_handler()\n optuna_logging._get_library_root_logger().addHandler(_tqdm_handler)\n\n def update(self, elapsed_seconds: Optional[float]) -> None:\n \"\"\"Update the progress bars if ``is_valid`` is :obj:`True`.\n\n Args:\n elapsed_seconds:\n The time past since :func:`~optuna.study.Study.optimize` started.\n \"\"\"\n if self._is_valid:\n self._progress_bar.update(1)\n if self._timeout is not None and elapsed_seconds is not None:\n self._progress_bar.set_postfix_str(\n \"{:.02f}/{} seconds\".format(elapsed_seconds, self._timeout)\n )\n\n def close(self) -> None:\n \"\"\"Close progress bars.\"\"\"\n if self._is_valid:\n self._progress_bar.close()\n assert _tqdm_handler is not None\n optuna_logging._get_library_root_logger().removeHandler(_tqdm_handler)\n optuna_logging.enable_default_handler()\n", "path": "optuna/progress_bar.py"}], "after_files": [{"content": "import logging\nfrom typing import Any\nfrom typing import Optional\n\nfrom tqdm.auto import tqdm\n\nfrom optuna import logging as optuna_logging\nfrom optuna._experimental import experimental\n\n\n_tqdm_handler: Optional[\"_TqdmLoggingHandler\"] = None\n\n\n# Reference: https://gist.github.com/hvy/8b80c2cedf02b15c24f85d1fa17ebe02\nclass _TqdmLoggingHandler(logging.StreamHandler):\n def emit(self, record: Any) -> None:\n try:\n msg = 
self.format(record)\n tqdm.write(msg)\n self.flush()\n except (KeyboardInterrupt, SystemExit):\n raise\n except Exception:\n self.handleError(record)\n\n\nclass _ProgressBar(object):\n \"\"\"Progress Bar implementation for :func:`~optuna.study.Study.optimize` on the top of `tqdm`.\n\n Args:\n is_valid:\n Whether to show progress bars in :func:`~optuna.study.Study.optimize`.\n n_trials:\n The number of trials.\n timeout:\n Stop study after the given number of second(s).\n \"\"\"\n\n def __init__(\n self, is_valid: bool, n_trials: Optional[int] = None, timeout: Optional[float] = None\n ) -> None:\n\n self._is_valid = is_valid and (n_trials or timeout) is not None\n self._n_trials = n_trials\n self._timeout = timeout\n self._last_elapsed_seconds = 0.0\n\n if self._is_valid:\n self._init_valid()\n\n # TODO(hvy): Remove initialization indirection via this method when the progress bar is no\n # longer experimental.\n @experimental(\"1.2.0\", name=\"Progress bar\")\n def _init_valid(self) -> None:\n\n if self._n_trials is not None:\n self._progress_bar = tqdm(total=self._n_trials)\n\n else:\n fmt = \"{percentage:3.0f}%|{bar}| {elapsed}/{desc}\"\n self._progress_bar = tqdm(total=self._timeout, bar_format=fmt)\n\n # Using description string instead postfix string\n # to display formatted timeout, since postfix carries\n # extra comma space auto-format.\n # https://github.com/tqdm/tqdm/issues/712\n total = tqdm.format_interval(self._timeout)\n self._progress_bar.set_description_str(total)\n\n global _tqdm_handler\n\n _tqdm_handler = _TqdmLoggingHandler()\n _tqdm_handler.setLevel(logging.INFO)\n _tqdm_handler.setFormatter(optuna_logging.create_default_formatter())\n optuna_logging.disable_default_handler()\n optuna_logging._get_library_root_logger().addHandler(_tqdm_handler)\n\n def update(self, elapsed_seconds: float) -> None:\n \"\"\"Update the progress bars if ``is_valid`` is :obj:`True`.\n\n Args:\n elapsed_seconds:\n The time past since :func:`~optuna.study.Study.optimize` started.\n \"\"\"\n\n if self._is_valid:\n if self._n_trials is not None:\n self._progress_bar.update(1)\n if self._timeout is not None:\n self._progress_bar.set_postfix_str(\n \"{:.02f}/{} seconds\".format(elapsed_seconds, self._timeout)\n )\n\n elif self._timeout is not None:\n time_diff = elapsed_seconds - self._last_elapsed_seconds\n if elapsed_seconds > self._timeout:\n # Clip elapsed time to avoid tqdm warnings.\n time_diff -= elapsed_seconds - self._timeout\n\n self._progress_bar.update(time_diff)\n self._last_elapsed_seconds = elapsed_seconds\n\n else:\n assert False\n\n def close(self) -> None:\n \"\"\"Close progress bars.\"\"\"\n\n if self._is_valid:\n self._progress_bar.close()\n assert _tqdm_handler is not None\n optuna_logging._get_library_root_logger().removeHandler(_tqdm_handler)\n optuna_logging.enable_default_handler()\n", "path": "optuna/progress_bar.py"}]}
| 1,363 | 768 |
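
A standalone sketch of the timeout-driven bar that the golden diff implements, assuming only `tqdm` is installed; the sleep loop below is a stand-in for Optuna's trial loop, not Optuna code.

```python
# total is measured in seconds; update() advances the bar by elapsed-time
# deltas, clipped so the bar never overshoots the total (avoids tqdm warnings).
import time

from tqdm.auto import tqdm

timeout = 3.0
bar = tqdm(total=timeout, bar_format="{percentage:3.0f}%|{bar}| {elapsed}/{desc}")
bar.set_description_str(tqdm.format_interval(timeout))  # e.g. "00:03"

start = time.monotonic()
last = 0.0
while last < timeout:
    time.sleep(0.2)  # stand-in for running one trial
    elapsed = min(time.monotonic() - start, timeout)  # clip like the patch
    bar.update(elapsed - last)
    last = elapsed
bar.close()
```

The description string rather than a postfix is used for the formatted total because tqdm auto-formats postfixes with an extra comma, as the patch comments note.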
gh_patches_debug_27844
|
rasdani/github-patches
|
git_diff
|
Zeroto521__my-data-toolkit-713
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ENH: New geoaccessor for GeoSeries to return tuple of coordinates `(x, y)`
<!--
Thanks for contributing a pull request!
Please follow these standard acronyms to start the commit message:
- ENH: enhancement
- BUG: bug fix
- DOC: documentation
- TYP: type annotations
- TST: addition or modification of tests
- MAINT: maintenance commit (refactoring, typos, etc.)
- BLD: change related to building
- REL: related to releasing
- API: an (incompatible) API change
- DEP: deprecate something, or remove a deprecated object
- DEV: development tool or utility
- REV: revert an earlier commit
- PERF: performance improvement
- BOT: always commit via a bot
- CI: related to CI or CD
- CLN: Code cleanup
-->
- [ ] closes #xxxx
- [x] whatsnew entry
as title
```python
>>> import dtoolkit.geoaccessor
>>> import geopandas as gpd
>>> from shapely.geometry import Point
>>> s = geopandas.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])
>>> s.xy
0 (1.0, 1.0)
1 (2.0, 2.0)
2 (3.0, 3.0)
dtype: object
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dtoolkit/geoaccessor/geoseries/xy.py`
Content:
```
1 import geopandas as gpd
2 import pandas as pd
3
4 from dtoolkit.geoaccessor.register import register_geoseries_method
5
6
7 @register_geoseries_method
8 def xy(s: gpd.GeoSeries, /) -> pd.Series:
9 """
10 Return the x and y location of Point geometries in a GeoSeries.
11
12 Returns
13 -------
14 Series
15 tuple of x and y coordinates.
16
17 See Also
18 --------
19 geopandas.GeoSeries.x
20 geopandas.GeoSeries.y
21
22 Examples
23 --------
24 >>> import dtoolkit.geoaccessor
25 >>> import geopandas as gpd
26 >>> from shapely.geometry import Point
27 >>> s = gpd.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])
28 >>> s
29 0 POINT (1.00000 1.00000)
30 1 POINT (2.00000 2.00000)
31 2 POINT (3.00000 3.00000)
32 dtype: geometry
33 >>> s.xy()
34 0 (1.0, 1.0)
35 1 (2.0, 2.0)
36 2 (3.0, 3.0)
37 dtype: object
38 """
39
40 return pd.concat((s.x, s.y), axis=1).apply(tuple, axis=1)
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/dtoolkit/geoaccessor/geoseries/xy.py b/dtoolkit/geoaccessor/geoseries/xy.py
--- a/dtoolkit/geoaccessor/geoseries/xy.py
+++ b/dtoolkit/geoaccessor/geoseries/xy.py
@@ -5,14 +5,19 @@
@register_geoseries_method
-def xy(s: gpd.GeoSeries, /) -> pd.Series:
+def xy(s: gpd.GeoSeries, /, reverse: bool = False) -> pd.Series:
"""
Return the x and y location of Point geometries in a GeoSeries.
+ Parameters
+ ----------
+ reverse : bool, default False
+ If True, return (y, x) instead.
+
Returns
-------
Series
- tuple of x and y coordinates.
+ tuple of coordinate.
See Also
--------
@@ -24,17 +29,26 @@
>>> import dtoolkit.geoaccessor
>>> import geopandas as gpd
>>> from shapely.geometry import Point
- >>> s = gpd.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])
+ >>> s = gpd.GeoSeries([Point(0, 1), Point(0, 2), Point(0, 3)])
>>> s
- 0 POINT (1.00000 1.00000)
- 1 POINT (2.00000 2.00000)
- 2 POINT (3.00000 3.00000)
+ 0 POINT (0.00000 1.00000)
+ 1 POINT (0.00000 2.00000)
+ 2 POINT (0.00000 3.00000)
dtype: geometry
>>> s.xy()
- 0 (1.0, 1.0)
- 1 (2.0, 2.0)
- 2 (3.0, 3.0)
+ 0 (0.0, 1.0)
+ 1 (0.0, 2.0)
+ 2 (0.0, 3.0)
+ dtype: object
+
+ Set ``reverse=True`` to return (y, x).
+
+ >>> s.xy(True)
+ 0 (1.0, 0.0)
+ 1 (2.0, 0.0)
+ 2 (3.0, 0.0)
dtype: object
"""
- return pd.concat((s.x, s.y), axis=1).apply(tuple, axis=1)
+ coordinates = (s.y, s.x) if reverse else (s.x, s.y)
+ return pd.concat(coordinates, axis=1).apply(tuple, axis=1)
|
{"golden_diff": "diff --git a/dtoolkit/geoaccessor/geoseries/xy.py b/dtoolkit/geoaccessor/geoseries/xy.py\n--- a/dtoolkit/geoaccessor/geoseries/xy.py\n+++ b/dtoolkit/geoaccessor/geoseries/xy.py\n@@ -5,14 +5,19 @@\n \n \n @register_geoseries_method\n-def xy(s: gpd.GeoSeries, /) -> pd.Series:\n+def xy(s: gpd.GeoSeries, /, reverse: bool = False) -> pd.Series:\n \"\"\"\n Return the x and y location of Point geometries in a GeoSeries.\n \n+ Parameters\n+ ----------\n+ reverse : bool, default False\n+ If True, return (y, x) instead.\n+\n Returns\n -------\n Series\n- tuple of x and y coordinates.\n+ tuple of coordinate.\n \n See Also\n --------\n@@ -24,17 +29,26 @@\n >>> import dtoolkit.geoaccessor\n >>> import geopandas as gpd\n >>> from shapely.geometry import Point\n- >>> s = gpd.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])\n+ >>> s = gpd.GeoSeries([Point(0, 1), Point(0, 2), Point(0, 3)])\n >>> s\n- 0 POINT (1.00000 1.00000)\n- 1 POINT (2.00000 2.00000)\n- 2 POINT (3.00000 3.00000)\n+ 0 POINT (0.00000 1.00000)\n+ 1 POINT (0.00000 2.00000)\n+ 2 POINT (0.00000 3.00000)\n dtype: geometry\n >>> s.xy()\n- 0 (1.0, 1.0)\n- 1 (2.0, 2.0)\n- 2 (3.0, 3.0)\n+ 0 (0.0, 1.0)\n+ 1 (0.0, 2.0)\n+ 2 (0.0, 3.0)\n+ dtype: object\n+\n+ Set ``reverse=True`` to return (y, x).\n+\n+ >>> s.xy(True)\n+ 0 (1.0, 0.0)\n+ 1 (2.0, 0.0)\n+ 2 (3.0, 0.0)\n dtype: object\n \"\"\"\n \n- return pd.concat((s.x, s.y), axis=1).apply(tuple, axis=1)\n+ coordinates = (s.y, s.x) if reverse else (s.x, s.y)\n+ return pd.concat(coordinates, axis=1).apply(tuple, axis=1)\n", "issue": "ENH: New geoaccessor for GeoSeries to return tuple of coordinates `(x, y)`\n<!--\r\nThanks for contributing a pull request!\r\n\r\nPlease follow these standard acronyms to start the commit message:\r\n\r\n- ENH: enhancement\r\n- BUG: bug fix\r\n- DOC: documentation\r\n- TYP: type annotations\r\n- TST: addition or modification of tests\r\n- MAINT: maintenance commit (refactoring, typos, etc.)\r\n- BLD: change related to building\r\n- REL: related to releasing\r\n- API: an (incompatible) API change\r\n- DEP: deprecate something, or remove a deprecated object\r\n- DEV: development tool or utility\r\n- REV: revert an earlier commit\r\n- PERF: performance improvement\r\n- BOT: always commit via a bot\r\n- CI: related to CI or CD\r\n- CLN: Code cleanup\r\n-->\r\n\r\n- [ ] closes #xxxx\r\n- [x] whatsnew entry\r\n\r\nas title\r\n\r\n```python\r\n >>> import dtoolkit.geoaccessor\r\n >>> import geopandas as gpd\r\n >>> from shapely.geometry import Point\r\n >>> s = geopandas.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])\r\n >>> s.xy\r\n 0 (1.0, 1.0)\r\n 1 (2.0, 2.0)\r\n 2 (3.0, 3.0)\r\n dtype: object\r\n```\n", "before_files": [{"content": "import geopandas as gpd\nimport pandas as pd\n\nfrom dtoolkit.geoaccessor.register import register_geoseries_method\n\n\n@register_geoseries_method\ndef xy(s: gpd.GeoSeries, /) -> pd.Series:\n \"\"\"\n Return the x and y location of Point geometries in a GeoSeries.\n\n Returns\n -------\n Series\n tuple of x and y coordinates.\n\n See Also\n --------\n geopandas.GeoSeries.x\n geopandas.GeoSeries.y\n\n Examples\n --------\n >>> import dtoolkit.geoaccessor\n >>> import geopandas as gpd\n >>> from shapely.geometry import Point\n >>> s = gpd.GeoSeries([Point(1, 1), Point(2, 2), Point(3, 3)])\n >>> s\n 0 POINT (1.00000 1.00000)\n 1 POINT (2.00000 2.00000)\n 2 POINT (3.00000 3.00000)\n dtype: geometry\n >>> s.xy()\n 0 (1.0, 1.0)\n 1 (2.0, 2.0)\n 2 (3.0, 3.0)\n dtype: object\n \"\"\"\n\n return pd.concat((s.x, s.y), 
axis=1).apply(tuple, axis=1)\n", "path": "dtoolkit/geoaccessor/geoseries/xy.py"}], "after_files": [{"content": "import geopandas as gpd\nimport pandas as pd\n\nfrom dtoolkit.geoaccessor.register import register_geoseries_method\n\n\n@register_geoseries_method\ndef xy(s: gpd.GeoSeries, /, reverse: bool = False) -> pd.Series:\n \"\"\"\n Return the x and y location of Point geometries in a GeoSeries.\n\n Parameters\n ----------\n reverse : bool, default False\n If True, return (y, x) instead.\n\n Returns\n -------\n Series\n tuple of coordinate.\n\n See Also\n --------\n geopandas.GeoSeries.x\n geopandas.GeoSeries.y\n\n Examples\n --------\n >>> import dtoolkit.geoaccessor\n >>> import geopandas as gpd\n >>> from shapely.geometry import Point\n >>> s = gpd.GeoSeries([Point(0, 1), Point(0, 2), Point(0, 3)])\n >>> s\n 0 POINT (0.00000 1.00000)\n 1 POINT (0.00000 2.00000)\n 2 POINT (0.00000 3.00000)\n dtype: geometry\n >>> s.xy()\n 0 (0.0, 1.0)\n 1 (0.0, 2.0)\n 2 (0.0, 3.0)\n dtype: object\n\n Set ``reverse=True`` to return (y, x).\n\n >>> s.xy(True)\n 0 (1.0, 0.0)\n 1 (2.0, 0.0)\n 2 (3.0, 0.0)\n dtype: object\n \"\"\"\n\n coordinates = (s.y, s.x) if reverse else (s.x, s.y)\n return pd.concat(coordinates, axis=1).apply(tuple, axis=1)\n", "path": "dtoolkit/geoaccessor/geoseries/xy.py"}]}
| 995 | 696 |
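
A usage sketch of the patched accessor, assuming the post-patch dtoolkit plus geopandas and shapely are installed; expected outputs follow the docstring in the golden diff above.

```python
# Importing dtoolkit.geoaccessor is needed only for its side effect of
# registering the .xy() method on GeoSeries.
import dtoolkit.geoaccessor  # noqa: F401
import geopandas as gpd
from shapely.geometry import Point

s = gpd.GeoSeries([Point(0, 1), Point(0, 2)])
print(s.xy())      # (0.0, 1.0) and (0.0, 2.0): (x, y) tuples
print(s.xy(True))  # (1.0, 0.0) and (2.0, 0.0): reverse=True gives (y, x)
```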
gh_patches_debug_18990
|
rasdani/github-patches
|
git_diff
|
qutebrowser__qutebrowser-5916
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Port completion.models.listcategory to QRegularExpression
In `qutebrowser/completion/models/listcategory.py`, we use `QRegExp`, which is going to be removed in Qt 6:
```python
rx = QRegExp(val, Qt.CaseInsensitive)
self.setFilterRegExp(rx)
```
We should instead [port](https://doc-snapshots.qt.io/qt6-dev/qtcore-changes-qt6.html#regular-expression-classes) this to use [QRegularExpression](https://doc.qt.io/qt-5/qregularexpression.html) (or, perhaps, Python's `re`?) instead.
~~(Also, we should probably call `qtutils.ensure_valid(rx)`)~~ (done in `master` already)
cc @rcorre
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qutebrowser/completion/models/listcategory.py`
Content:
```
1 # vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:
2
3 # Copyright 2017-2020 Ryan Roden-Corrent (rcorre) <[email protected]>
4 #
5 # This file is part of qutebrowser.
6 #
7 # qutebrowser is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU General Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # qutebrowser is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU General Public License for more details.
16 #
17 # You should have received a copy of the GNU General Public License
18 # along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.
19
20 """Completion category that uses a list of tuples as a data source."""
21
22 import re
23 from typing import Iterable, Tuple
24
25 from PyQt5.QtCore import Qt, QSortFilterProxyModel, QRegExp
26 from PyQt5.QtGui import QStandardItem, QStandardItemModel
27 from PyQt5.QtWidgets import QWidget
28
29 from qutebrowser.completion.models import util
30 from qutebrowser.utils import qtutils, log
31
32
33 class ListCategory(QSortFilterProxyModel):
34
35 """Expose a list of items as a category for the CompletionModel."""
36
37 def __init__(self,
38 name: str,
39 items: Iterable[Tuple[str, ...]],
40 sort: bool = True,
41 delete_func: util.DeleteFuncType = None,
42 parent: QWidget = None):
43 super().__init__(parent)
44 self.name = name
45 self.srcmodel = QStandardItemModel(parent=self)
46 self._pattern = ''
47 # ListCategory filters all columns
48 self.columns_to_filter = [0, 1, 2]
49 self.setFilterKeyColumn(-1)
50 for item in items:
51 self.srcmodel.appendRow([QStandardItem(x) for x in item])
52 self.setSourceModel(self.srcmodel)
53 self.delete_func = delete_func
54 self._sort = sort
55
56 def set_pattern(self, val):
57 """Setter for pattern.
58
59 Args:
60 val: The value to set.
61 """
62 self._pattern = val
63 val = re.sub(r' +', r' ', val) # See #1919
64 val = re.escape(val)
65 val = val.replace(r'\ ', '.*')
66 rx = QRegExp(val, Qt.CaseInsensitive)
67 qtutils.ensure_valid(rx)
68 self.setFilterRegExp(rx)
69 self.invalidate()
70 sortcol = 0
71 self.sort(sortcol)
72
73 def lessThan(self, lindex, rindex):
74 """Custom sorting implementation.
75
76 Prefers all items which start with self._pattern. Other than that, uses
77 normal Python string sorting.
78
79 Args:
80 lindex: The QModelIndex of the left item (*left* < right)
81 rindex: The QModelIndex of the right item (left < *right*)
82
83 Return:
84 True if left < right, else False
85 """
86 qtutils.ensure_valid(lindex)
87 qtutils.ensure_valid(rindex)
88
89 left = self.srcmodel.data(lindex)
90 right = self.srcmodel.data(rindex)
91
92 if left is None or right is None: # pragma: no cover
93 log.completion.warning("Got unexpected None value, "
94 "left={!r} right={!r} "
95 "lindex={!r} rindex={!r}"
96 .format(left, right, lindex, rindex))
97 return False
98
99 leftstart = left.startswith(self._pattern)
100 rightstart = right.startswith(self._pattern)
101
102 if leftstart and not rightstart:
103 return True
104 elif rightstart and not leftstart:
105 return False
106 elif self._sort:
107 return left < right
108 else:
109 return False
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/qutebrowser/completion/models/listcategory.py b/qutebrowser/completion/models/listcategory.py
--- a/qutebrowser/completion/models/listcategory.py
+++ b/qutebrowser/completion/models/listcategory.py
@@ -22,7 +22,7 @@
import re
from typing import Iterable, Tuple
-from PyQt5.QtCore import Qt, QSortFilterProxyModel, QRegExp
+from PyQt5.QtCore import QSortFilterProxyModel, QRegularExpression
from PyQt5.QtGui import QStandardItem, QStandardItemModel
from PyQt5.QtWidgets import QWidget
@@ -63,9 +63,9 @@
val = re.sub(r' +', r' ', val) # See #1919
val = re.escape(val)
val = val.replace(r'\ ', '.*')
- rx = QRegExp(val, Qt.CaseInsensitive)
+ rx = QRegularExpression(val, QRegularExpression.CaseInsensitiveOption)
qtutils.ensure_valid(rx)
- self.setFilterRegExp(rx)
+ self.setFilterRegularExpression(rx)
self.invalidate()
sortcol = 0
self.sort(sortcol)
|
{"golden_diff": "diff --git a/qutebrowser/completion/models/listcategory.py b/qutebrowser/completion/models/listcategory.py\n--- a/qutebrowser/completion/models/listcategory.py\n+++ b/qutebrowser/completion/models/listcategory.py\n@@ -22,7 +22,7 @@\n import re\n from typing import Iterable, Tuple\n \n-from PyQt5.QtCore import Qt, QSortFilterProxyModel, QRegExp\n+from PyQt5.QtCore import QSortFilterProxyModel, QRegularExpression\n from PyQt5.QtGui import QStandardItem, QStandardItemModel\n from PyQt5.QtWidgets import QWidget\n \n@@ -63,9 +63,9 @@\n val = re.sub(r' +', r' ', val) # See #1919\n val = re.escape(val)\n val = val.replace(r'\\ ', '.*')\n- rx = QRegExp(val, Qt.CaseInsensitive)\n+ rx = QRegularExpression(val, QRegularExpression.CaseInsensitiveOption)\n qtutils.ensure_valid(rx)\n- self.setFilterRegExp(rx)\n+ self.setFilterRegularExpression(rx)\n self.invalidate()\n sortcol = 0\n self.sort(sortcol)\n", "issue": "Port completion.models.listcategory to QRegularExpression\nIn `qutebrowser/completion/models/listcategory.py`, we use `QRegExp`, which is going to be removed in Qt 6:\r\n\r\n```python\r\n rx = QRegExp(val, Qt.CaseInsensitive)\r\n self.setFilterRegExp(rx)\r\n```\r\n\r\nWe should instead [port](https://doc-snapshots.qt.io/qt6-dev/qtcore-changes-qt6.html#regular-expression-classes) this to use [QRegularExpression](https://doc.qt.io/qt-5/qregularexpression.html) (or, perhaps, Python's `re`?) instead.\r\n\r\n~~(Also, we should probably call `qtutils.ensure_valid(rx)`)~~ (done in `master` already)\r\n\r\ncc @rcorre \n", "before_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017-2020 Ryan Roden-Corrent (rcorre) <[email protected]>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. 
If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"Completion category that uses a list of tuples as a data source.\"\"\"\n\nimport re\nfrom typing import Iterable, Tuple\n\nfrom PyQt5.QtCore import Qt, QSortFilterProxyModel, QRegExp\nfrom PyQt5.QtGui import QStandardItem, QStandardItemModel\nfrom PyQt5.QtWidgets import QWidget\n\nfrom qutebrowser.completion.models import util\nfrom qutebrowser.utils import qtutils, log\n\n\nclass ListCategory(QSortFilterProxyModel):\n\n \"\"\"Expose a list of items as a category for the CompletionModel.\"\"\"\n\n def __init__(self,\n name: str,\n items: Iterable[Tuple[str, ...]],\n sort: bool = True,\n delete_func: util.DeleteFuncType = None,\n parent: QWidget = None):\n super().__init__(parent)\n self.name = name\n self.srcmodel = QStandardItemModel(parent=self)\n self._pattern = ''\n # ListCategory filters all columns\n self.columns_to_filter = [0, 1, 2]\n self.setFilterKeyColumn(-1)\n for item in items:\n self.srcmodel.appendRow([QStandardItem(x) for x in item])\n self.setSourceModel(self.srcmodel)\n self.delete_func = delete_func\n self._sort = sort\n\n def set_pattern(self, val):\n \"\"\"Setter for pattern.\n\n Args:\n val: The value to set.\n \"\"\"\n self._pattern = val\n val = re.sub(r' +', r' ', val) # See #1919\n val = re.escape(val)\n val = val.replace(r'\\ ', '.*')\n rx = QRegExp(val, Qt.CaseInsensitive)\n qtutils.ensure_valid(rx)\n self.setFilterRegExp(rx)\n self.invalidate()\n sortcol = 0\n self.sort(sortcol)\n\n def lessThan(self, lindex, rindex):\n \"\"\"Custom sorting implementation.\n\n Prefers all items which start with self._pattern. Other than that, uses\n normal Python string sorting.\n\n Args:\n lindex: The QModelIndex of the left item (*left* < right)\n rindex: The QModelIndex of the right item (left < *right*)\n\n Return:\n True if left < right, else False\n \"\"\"\n qtutils.ensure_valid(lindex)\n qtutils.ensure_valid(rindex)\n\n left = self.srcmodel.data(lindex)\n right = self.srcmodel.data(rindex)\n\n if left is None or right is None: # pragma: no cover\n log.completion.warning(\"Got unexpected None value, \"\n \"left={!r} right={!r} \"\n \"lindex={!r} rindex={!r}\"\n .format(left, right, lindex, rindex))\n return False\n\n leftstart = left.startswith(self._pattern)\n rightstart = right.startswith(self._pattern)\n\n if leftstart and not rightstart:\n return True\n elif rightstart and not leftstart:\n return False\n elif self._sort:\n return left < right\n else:\n return False\n", "path": "qutebrowser/completion/models/listcategory.py"}], "after_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017-2020 Ryan Roden-Corrent (rcorre) <[email protected]>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. 
If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"Completion category that uses a list of tuples as a data source.\"\"\"\n\nimport re\nfrom typing import Iterable, Tuple\n\nfrom PyQt5.QtCore import QSortFilterProxyModel, QRegularExpression\nfrom PyQt5.QtGui import QStandardItem, QStandardItemModel\nfrom PyQt5.QtWidgets import QWidget\n\nfrom qutebrowser.completion.models import util\nfrom qutebrowser.utils import qtutils, log\n\n\nclass ListCategory(QSortFilterProxyModel):\n\n \"\"\"Expose a list of items as a category for the CompletionModel.\"\"\"\n\n def __init__(self,\n name: str,\n items: Iterable[Tuple[str, ...]],\n sort: bool = True,\n delete_func: util.DeleteFuncType = None,\n parent: QWidget = None):\n super().__init__(parent)\n self.name = name\n self.srcmodel = QStandardItemModel(parent=self)\n self._pattern = ''\n # ListCategory filters all columns\n self.columns_to_filter = [0, 1, 2]\n self.setFilterKeyColumn(-1)\n for item in items:\n self.srcmodel.appendRow([QStandardItem(x) for x in item])\n self.setSourceModel(self.srcmodel)\n self.delete_func = delete_func\n self._sort = sort\n\n def set_pattern(self, val):\n \"\"\"Setter for pattern.\n\n Args:\n val: The value to set.\n \"\"\"\n self._pattern = val\n val = re.sub(r' +', r' ', val) # See #1919\n val = re.escape(val)\n val = val.replace(r'\\ ', '.*')\n rx = QRegularExpression(val, QRegularExpression.CaseInsensitiveOption)\n qtutils.ensure_valid(rx)\n self.setFilterRegularExpression(rx)\n self.invalidate()\n sortcol = 0\n self.sort(sortcol)\n\n def lessThan(self, lindex, rindex):\n \"\"\"Custom sorting implementation.\n\n Prefers all items which start with self._pattern. Other than that, uses\n normal Python string sorting.\n\n Args:\n lindex: The QModelIndex of the left item (*left* < right)\n rindex: The QModelIndex of the right item (left < *right*)\n\n Return:\n True if left < right, else False\n \"\"\"\n qtutils.ensure_valid(lindex)\n qtutils.ensure_valid(rindex)\n\n left = self.srcmodel.data(lindex)\n right = self.srcmodel.data(rindex)\n\n if left is None or right is None: # pragma: no cover\n log.completion.warning(\"Got unexpected None value, \"\n \"left={!r} right={!r} \"\n \"lindex={!r} rindex={!r}\"\n .format(left, right, lindex, rindex))\n return False\n\n leftstart = left.startswith(self._pattern)\n rightstart = right.startswith(self._pattern)\n\n if leftstart and not rightstart:\n return True\n elif rightstart and not leftstart:\n return False\n elif self._sort:\n return left < right\n else:\n return False\n", "path": "qutebrowser/completion/models/listcategory.py"}]}
| 1,507 | 245 |
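
A standalone sketch of the pattern handling from the patched `set_pattern`, assuming PyQt5 ≥ 5.12; it reproduces the escaping logic without the qutebrowser model plumbing, and the sample pattern and subject strings are illustrative.

```python
import re

from PyQt5.QtCore import QRegularExpression

val = re.sub(r" +", r" ", "open  t")  # collapse repeated spaces (see #1919)
val = re.escape(val).replace(r"\ ", ".*")  # escaped spaces become wildcards
rx = QRegularExpression(val, QRegularExpression.CaseInsensitiveOption)
assert rx.isValid()  # roughly what qtutils.ensure_valid(rx) checks
print(rx.match("Open -t").hasMatch())  # True: case-insensitive, matches anywhere
```

Note that `re.escape` still escapes whitespace on Python 3.7+, so the `r"\ "` to `".*"` replacement keeps working after the QRegExp to QRegularExpression port.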
gh_patches_debug_1842
|
rasdani/github-patches
|
git_diff
|
DistrictDataLabs__yellowbrick-1162
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pytest-runner is deprecated
pytest-runner is deprecated: https://github.com/pytest-dev/pytest-runner/#deprecation-notice
If I find time, then I can make a PR, but I thought I'd let you know in the meantime.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # setup
3 # Setup script for installing yellowbrick
4 #
5 # Author: Benjamin Bengfort
6 # Created: Wed May 18 14:33:26 2016 -0400
7 #
8 # Copyright (C) 2016 The scikit-yb developers
9 # For license information, see LICENSE.txt and NOTICE.md
10 #
11 # ID: setup.py [c4f3ba7] [email protected] $
12
13 """
14 Setup script for installing yellowbrick.
15 See http://bbengfort.github.io/programmer/2016/01/20/packaging-with-pypi.html
16 """
17
18 ##########################################################################
19 ## Imports
20 ##########################################################################
21
22 import os
23 import codecs
24
25 from setuptools import setup
26 from setuptools import find_packages
27
28 ##########################################################################
29 ## Package Information
30 ##########################################################################
31
32 ## Basic information
33 ## Basic information
34 NAME = "yellowbrick"
35 DESCRIPTION = "A suite of visual analysis and diagnostic tools for machine learning."
36 AUTHOR = "The scikit-yb developers"
37 EMAIL = "[email protected]"
38 MAINTAINER = "The scikit-yb developers"
39 LICENSE = "Apache 2"
40 REPOSITORY = "https://github.com/DistrictDataLabs/yellowbrick"
41 PACKAGE = "yellowbrick"
42 URL = "http://scikit-yb.org/"
43
44 ## Define the keywords
45 KEYWORDS = (
46 "visualization",
47 "machine learning",
48 "scikit-learn",
49 "matplotlib",
50 "data science",
51 )
52
53 ## Define the classifiers
54 ## See https://pypi.python.org/pypi?%3Aaction=list_classifiers
55 CLASSIFIERS = (
56 "Development Status :: 5 - Production/Stable",
57 "Intended Audience :: Developers",
58 "Intended Audience :: Science/Research",
59 "License :: OSI Approved :: Apache Software License",
60 "Natural Language :: English",
61 "Operating System :: OS Independent",
62 "Programming Language :: Python",
63 "Programming Language :: Python :: 3.5",
64 "Programming Language :: Python :: 3.6",
65 "Programming Language :: Python :: 3.7",
66 "Programming Language :: Python :: 3.8",
67 "Topic :: Software Development",
68 "Topic :: Software Development :: Libraries :: Python Modules",
69 "Topic :: Scientific/Engineering :: Visualization",
70 )
71
72 ## Important Paths
73 PROJECT = os.path.abspath(os.path.dirname(__file__))
74 REQUIRE_PATH = "requirements.txt"
75 VERSION_PATH = os.path.join(PACKAGE, "version.py")
76 PKG_DESCRIBE = "DESCRIPTION.md"
77
78 ## Directories to ignore in find_packages
79 EXCLUDES = (
80 "tests", "tests.*",
81 "bin",
82 "docs", "docs.*",
83 "fixtures",
84 "register",
85 "notebooks", "notebooks.*",
86 "examples", "examples.*",
87 "binder", "binder.*",
88 "paper",
89 )
90
91 ##########################################################################
92 ## Helper Functions
93 ##########################################################################
94
95
96 def read(*parts):
97 """
98 Assume UTF-8 encoding and return the contents of the file located at the
99 absolute path from the REPOSITORY joined with *parts.
100 """
101 with codecs.open(os.path.join(PROJECT, *parts), "rb", "utf-8") as f:
102 return f.read()
103
104
105 def get_version(path=VERSION_PATH):
106 """
107 Reads the python file defined in the VERSION_PATH to find the get_version
108 function, and executes it to ensure that it is loaded correctly. Separating
109 the version in this way ensures no additional code is executed.
110 """
111 namespace = {}
112 exec(read(path), namespace)
113 return namespace["get_version"](short=True)
114
115
116 def get_requires(path=REQUIRE_PATH):
117 """
118 Yields a generator of requirements as defined by the REQUIRE_PATH which
119 should point to a requirements.txt output by `pip freeze`.
120 """
121 for line in read(path).splitlines():
122 line = line.strip()
123 if line and not line.startswith("#"):
124 yield line
125
126
127 def get_description_type(path=PKG_DESCRIBE):
128 """
129 Returns the long_description_content_type based on the extension of the
130 package describe path (e.g. .txt, .rst, or .md).
131 """
132 _, ext = os.path.splitext(path)
133 return {".rst": "text/x-rst", ".txt": "text/plain", ".md": "text/markdown"}[ext]
134
135
136 ##########################################################################
137 ## Define the configuration
138 ##########################################################################
139
140 config = {
141 "name": NAME,
142 "version": get_version(),
143 "description": DESCRIPTION,
144 "long_description": read(PKG_DESCRIBE),
145 "long_description_content_type": get_description_type(PKG_DESCRIBE),
146 "classifiers": CLASSIFIERS,
147 "keywords": KEYWORDS,
148 "license": LICENSE,
149 "author": AUTHOR,
150 "author_email": EMAIL,
151 "url": URL,
152 "maintainer": MAINTAINER,
153 "maintainer_email": EMAIL,
154 "project_urls": {
155 "Documentation": URL,
156 "Download": "{}/tarball/v{}".format(REPOSITORY, get_version()),
157 "Source": REPOSITORY,
158 "Tracker": "{}/issues".format(REPOSITORY),
159 },
160 "download_url": "{}/tarball/v{}".format(REPOSITORY, get_version()),
161 "packages": find_packages(where=PROJECT, exclude=EXCLUDES),
162 "package_data": {"yellowbrick": ["datasets/manifest.json"]},
163 "zip_safe": False,
164 "entry_points": {"console_scripts": []},
165 "install_requires": list(get_requires()),
166 "python_requires": ">=3.4, <4",
167 "setup_requires": ["pytest-runner"],
168 "tests_require": ["pytest"],
169 }
170
171
172 ##########################################################################
173 ## Run setup script
174 ##########################################################################
175
176 if __name__ == "__main__":
177 setup(**config)
178
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -163,9 +163,7 @@
"zip_safe": False,
"entry_points": {"console_scripts": []},
"install_requires": list(get_requires()),
- "python_requires": ">=3.4, <4",
- "setup_requires": ["pytest-runner"],
- "tests_require": ["pytest"],
+ "python_requires": ">=3.4, <4"
}
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -163,9 +163,7 @@\n \"zip_safe\": False,\n \"entry_points\": {\"console_scripts\": []},\n \"install_requires\": list(get_requires()),\n- \"python_requires\": \">=3.4, <4\",\n- \"setup_requires\": [\"pytest-runner\"],\n- \"tests_require\": [\"pytest\"],\n+ \"python_requires\": \">=3.4, <4\"\n }\n", "issue": "pytest-runner is deprecated\npytest-runner is deprecated: https://github.com/pytest-dev/pytest-runner/#deprecation-notice\r\n\r\nIf I find time, then I can make a PR, but I thought I'd let you know in the meantime.\n", "before_files": [{"content": "#!/usr/bin/env python\n# setup\n# Setup script for installing yellowbrick\n#\n# Author: Benjamin Bengfort\n# Created: Wed May 18 14:33:26 2016 -0400\n#\n# Copyright (C) 2016 The scikit-yb developers\n# For license information, see LICENSE.txt and NOTICE.md\n#\n# ID: setup.py [c4f3ba7] [email protected] $\n\n\"\"\"\nSetup script for installing yellowbrick.\nSee http://bbengfort.github.io/programmer/2016/01/20/packaging-with-pypi.html\n\"\"\"\n\n##########################################################################\n## Imports\n##########################################################################\n\nimport os\nimport codecs\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\n##########################################################################\n## Package Information\n##########################################################################\n\n## Basic information\n## Basic information\nNAME = \"yellowbrick\"\nDESCRIPTION = \"A suite of visual analysis and diagnostic tools for machine learning.\"\nAUTHOR = \"The scikit-yb developers\"\nEMAIL = \"[email protected]\"\nMAINTAINER = \"The scikit-yb developers\"\nLICENSE = \"Apache 2\"\nREPOSITORY = \"https://github.com/DistrictDataLabs/yellowbrick\"\nPACKAGE = \"yellowbrick\"\nURL = \"http://scikit-yb.org/\"\n\n## Define the keywords\nKEYWORDS = (\n \"visualization\",\n \"machine learning\",\n \"scikit-learn\",\n \"matplotlib\",\n \"data science\",\n)\n\n## Define the classifiers\n## See https://pypi.python.org/pypi?%3Aaction=list_classifiers\nCLASSIFIERS = (\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Topic :: Software Development\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n \"Topic :: Scientific/Engineering :: Visualization\",\n)\n\n## Important Paths\nPROJECT = os.path.abspath(os.path.dirname(__file__))\nREQUIRE_PATH = \"requirements.txt\"\nVERSION_PATH = os.path.join(PACKAGE, \"version.py\")\nPKG_DESCRIBE = \"DESCRIPTION.md\"\n\n## Directories to ignore in find_packages\nEXCLUDES = (\n \"tests\", \"tests.*\",\n \"bin\",\n \"docs\", \"docs.*\",\n \"fixtures\",\n \"register\",\n \"notebooks\", \"notebooks.*\",\n \"examples\", \"examples.*\",\n \"binder\", \"binder.*\",\n \"paper\",\n)\n\n##########################################################################\n## Helper Functions\n##########################################################################\n\n\ndef read(*parts):\n \"\"\"\n Assume UTF-8 encoding 
and return the contents of the file located at the\n absolute path from the REPOSITORY joined with *parts.\n \"\"\"\n with codecs.open(os.path.join(PROJECT, *parts), \"rb\", \"utf-8\") as f:\n return f.read()\n\n\ndef get_version(path=VERSION_PATH):\n \"\"\"\n Reads the python file defined in the VERSION_PATH to find the get_version\n function, and executes it to ensure that it is loaded correctly. Separating\n the version in this way ensures no additional code is executed.\n \"\"\"\n namespace = {}\n exec(read(path), namespace)\n return namespace[\"get_version\"](short=True)\n\n\ndef get_requires(path=REQUIRE_PATH):\n \"\"\"\n Yields a generator of requirements as defined by the REQUIRE_PATH which\n should point to a requirements.txt output by `pip freeze`.\n \"\"\"\n for line in read(path).splitlines():\n line = line.strip()\n if line and not line.startswith(\"#\"):\n yield line\n\n\ndef get_description_type(path=PKG_DESCRIBE):\n \"\"\"\n Returns the long_description_content_type based on the extension of the\n package describe path (e.g. .txt, .rst, or .md).\n \"\"\"\n _, ext = os.path.splitext(path)\n return {\".rst\": \"text/x-rst\", \".txt\": \"text/plain\", \".md\": \"text/markdown\"}[ext]\n\n\n##########################################################################\n## Define the configuration\n##########################################################################\n\nconfig = {\n \"name\": NAME,\n \"version\": get_version(),\n \"description\": DESCRIPTION,\n \"long_description\": read(PKG_DESCRIBE),\n \"long_description_content_type\": get_description_type(PKG_DESCRIBE),\n \"classifiers\": CLASSIFIERS,\n \"keywords\": KEYWORDS,\n \"license\": LICENSE,\n \"author\": AUTHOR,\n \"author_email\": EMAIL,\n \"url\": URL,\n \"maintainer\": MAINTAINER,\n \"maintainer_email\": EMAIL,\n \"project_urls\": {\n \"Documentation\": URL,\n \"Download\": \"{}/tarball/v{}\".format(REPOSITORY, get_version()),\n \"Source\": REPOSITORY,\n \"Tracker\": \"{}/issues\".format(REPOSITORY),\n },\n \"download_url\": \"{}/tarball/v{}\".format(REPOSITORY, get_version()),\n \"packages\": find_packages(where=PROJECT, exclude=EXCLUDES),\n \"package_data\": {\"yellowbrick\": [\"datasets/manifest.json\"]},\n \"zip_safe\": False,\n \"entry_points\": {\"console_scripts\": []},\n \"install_requires\": list(get_requires()),\n \"python_requires\": \">=3.4, <4\",\n \"setup_requires\": [\"pytest-runner\"],\n \"tests_require\": [\"pytest\"],\n}\n\n\n##########################################################################\n## Run setup script\n##########################################################################\n\nif __name__ == \"__main__\":\n setup(**config)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# setup\n# Setup script for installing yellowbrick\n#\n# Author: Benjamin Bengfort\n# Created: Wed May 18 14:33:26 2016 -0400\n#\n# Copyright (C) 2016 The scikit-yb developers\n# For license information, see LICENSE.txt and NOTICE.md\n#\n# ID: setup.py [c4f3ba7] [email protected] $\n\n\"\"\"\nSetup script for installing yellowbrick.\nSee http://bbengfort.github.io/programmer/2016/01/20/packaging-with-pypi.html\n\"\"\"\n\n##########################################################################\n## Imports\n##########################################################################\n\nimport os\nimport codecs\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\n##########################################################################\n## Package 
Information\n##########################################################################\n\n## Basic information\n## Basic information\nNAME = \"yellowbrick\"\nDESCRIPTION = \"A suite of visual analysis and diagnostic tools for machine learning.\"\nAUTHOR = \"The scikit-yb developers\"\nEMAIL = \"[email protected]\"\nMAINTAINER = \"The scikit-yb developers\"\nLICENSE = \"Apache 2\"\nREPOSITORY = \"https://github.com/DistrictDataLabs/yellowbrick\"\nPACKAGE = \"yellowbrick\"\nURL = \"http://scikit-yb.org/\"\n\n## Define the keywords\nKEYWORDS = (\n \"visualization\",\n \"machine learning\",\n \"scikit-learn\",\n \"matplotlib\",\n \"data science\",\n)\n\n## Define the classifiers\n## See https://pypi.python.org/pypi?%3Aaction=list_classifiers\nCLASSIFIERS = (\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Topic :: Software Development\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n \"Topic :: Scientific/Engineering :: Visualization\",\n)\n\n## Important Paths\nPROJECT = os.path.abspath(os.path.dirname(__file__))\nREQUIRE_PATH = \"requirements.txt\"\nVERSION_PATH = os.path.join(PACKAGE, \"version.py\")\nPKG_DESCRIBE = \"DESCRIPTION.md\"\n\n## Directories to ignore in find_packages\nEXCLUDES = (\n \"tests\", \"tests.*\",\n \"bin\",\n \"docs\", \"docs.*\",\n \"fixtures\",\n \"register\",\n \"notebooks\", \"notebooks.*\",\n \"examples\", \"examples.*\",\n \"binder\", \"binder.*\",\n \"paper\",\n)\n\n##########################################################################\n## Helper Functions\n##########################################################################\n\n\ndef read(*parts):\n \"\"\"\n Assume UTF-8 encoding and return the contents of the file located at the\n absolute path from the REPOSITORY joined with *parts.\n \"\"\"\n with codecs.open(os.path.join(PROJECT, *parts), \"rb\", \"utf-8\") as f:\n return f.read()\n\n\ndef get_version(path=VERSION_PATH):\n \"\"\"\n Reads the python file defined in the VERSION_PATH to find the get_version\n function, and executes it to ensure that it is loaded correctly. Separating\n the version in this way ensures no additional code is executed.\n \"\"\"\n namespace = {}\n exec(read(path), namespace)\n return namespace[\"get_version\"](short=True)\n\n\ndef get_requires(path=REQUIRE_PATH):\n \"\"\"\n Yields a generator of requirements as defined by the REQUIRE_PATH which\n should point to a requirements.txt output by `pip freeze`.\n \"\"\"\n for line in read(path).splitlines():\n line = line.strip()\n if line and not line.startswith(\"#\"):\n yield line\n\n\ndef get_description_type(path=PKG_DESCRIBE):\n \"\"\"\n Returns the long_description_content_type based on the extension of the\n package describe path (e.g. 
.txt, .rst, or .md).\n \"\"\"\n _, ext = os.path.splitext(path)\n return {\".rst\": \"text/x-rst\", \".txt\": \"text/plain\", \".md\": \"text/markdown\"}[ext]\n\n\n##########################################################################\n## Define the configuration\n##########################################################################\n\nconfig = {\n \"name\": NAME,\n \"version\": get_version(),\n \"description\": DESCRIPTION,\n \"long_description\": read(PKG_DESCRIBE),\n \"long_description_content_type\": get_description_type(PKG_DESCRIBE),\n \"classifiers\": CLASSIFIERS,\n \"keywords\": KEYWORDS,\n \"license\": LICENSE,\n \"author\": AUTHOR,\n \"author_email\": EMAIL,\n \"url\": URL,\n \"maintainer\": MAINTAINER,\n \"maintainer_email\": EMAIL,\n \"project_urls\": {\n \"Documentation\": URL,\n \"Download\": \"{}/tarball/v{}\".format(REPOSITORY, get_version()),\n \"Source\": REPOSITORY,\n \"Tracker\": \"{}/issues\".format(REPOSITORY),\n },\n \"download_url\": \"{}/tarball/v{}\".format(REPOSITORY, get_version()),\n \"packages\": find_packages(where=PROJECT, exclude=EXCLUDES),\n \"package_data\": {\"yellowbrick\": [\"datasets/manifest.json\"]},\n \"zip_safe\": False,\n \"entry_points\": {\"console_scripts\": []},\n \"install_requires\": list(get_requires()),\n \"python_requires\": \">=3.4, <4\"\n}\n\n\n##########################################################################\n## Run setup script\n##########################################################################\n\nif __name__ == \"__main__\":\n setup(**config)\n", "path": "setup.py"}]}
| 1,983 | 111 |
gh_patches_debug_36242
|
rasdani/github-patches
|
git_diff
|
pytorch__ignite-1756
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve `ignite.contrib.metrics` implementation
For these metrics in `ignite.contrib.metrics`:
- [Average Precision](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/average_precision.py)
- [Precision Recall Curve](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/precision_recall_curve.py)
- [Roc Auc](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/roc_auc.py)
We need to make the implementation simpler. Similar to the one here #1690. New arg `device` will be added, and we need to add the necessary distributed tests as well. See here for the tests https://github.com/pytorch/ignite/blob/master/tests/ignite/contrib/metrics/regression/test_canberra_metric.py#L99
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ignite/contrib/metrics/average_precision.py`
Content:
```
1 from typing import Callable
2
3 import torch
4
5 from ignite.metrics import EpochMetric
6
7
8 def average_precision_compute_fn(y_preds: torch.Tensor, y_targets: torch.Tensor) -> float:
9 try:
10 from sklearn.metrics import average_precision_score
11 except ImportError:
12 raise RuntimeError("This contrib module requires sklearn to be installed.")
13
14 y_true = y_targets.numpy()
15 y_pred = y_preds.numpy()
16 return average_precision_score(y_true, y_pred)
17
18
19 class AveragePrecision(EpochMetric):
20 """Computes Average Precision accumulating predictions and the ground-truth during an epoch
21 and applying `sklearn.metrics.average_precision_score <https://scikit-learn.org/stable/modules/generated/
22 sklearn.metrics.average_precision_score.html#sklearn.metrics.average_precision_score>`_ .
23
24 Args:
25 output_transform: a callable that is used to transform the
26 :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the
27 form expected by the metric. This can be useful if, for example, you have a multi-output model and
28 you want to compute the metric with respect to one of the outputs.
29 check_compute_fn: Default False. If True, `average_precision_score
30 <https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html
31 #sklearn.metrics.average_precision_score>`_ is run on the first batch of data to ensure there are
32 no issues. User will be warned in case there are any issues computing the function.
33
34 AveragePrecision expects y to be comprised of 0's and 1's. y_pred must either be probability estimates or
35 confidence values. To apply an activation to y_pred, use output_transform as shown below:
36
37 .. code-block:: python
38
39 def activated_output_transform(output):
40 y_pred, y = output
41 y_pred = torch.softmax(y_pred, dim=1)
42 return y_pred, y
43
44 avg_precision = AveragePrecision(activated_output_transform)
45
46 """
47
48 def __init__(self, output_transform: Callable = lambda x: x, check_compute_fn: bool = False) -> None:
49 super(AveragePrecision, self).__init__(
50 average_precision_compute_fn, output_transform=output_transform, check_compute_fn=check_compute_fn
51 )
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ignite/contrib/metrics/average_precision.py b/ignite/contrib/metrics/average_precision.py
--- a/ignite/contrib/metrics/average_precision.py
+++ b/ignite/contrib/metrics/average_precision.py
@@ -1,4 +1,4 @@
-from typing import Callable
+from typing import Callable, Optional, Union
import torch
@@ -6,13 +6,10 @@
def average_precision_compute_fn(y_preds: torch.Tensor, y_targets: torch.Tensor) -> float:
- try:
- from sklearn.metrics import average_precision_score
- except ImportError:
- raise RuntimeError("This contrib module requires sklearn to be installed.")
+ from sklearn.metrics import average_precision_score
- y_true = y_targets.numpy()
- y_pred = y_preds.numpy()
+ y_true = y_targets.cpu().numpy()
+ y_pred = y_preds.cpu().numpy()
return average_precision_score(y_true, y_pred)
@@ -30,6 +27,7 @@
<https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html
#sklearn.metrics.average_precision_score>`_ is run on the first batch of data to ensure there are
no issues. User will be warned in case there are any issues computing the function.
+ device: optional device specification for internal storage.
AveragePrecision expects y to be comprised of 0's and 1's. y_pred must either be probability estimates or
confidence values. To apply an activation to y_pred, use output_transform as shown below:
@@ -45,7 +43,21 @@
"""
- def __init__(self, output_transform: Callable = lambda x: x, check_compute_fn: bool = False) -> None:
+ def __init__(
+ self,
+ output_transform: Callable = lambda x: x,
+ check_compute_fn: bool = False,
+ device: Union[str, torch.device] = torch.device("cpu"),
+ ):
+
+ try:
+ from sklearn.metrics import average_precision_score
+ except ImportError:
+ raise RuntimeError("This contrib module requires sklearn to be installed.")
+
super(AveragePrecision, self).__init__(
- average_precision_compute_fn, output_transform=output_transform, check_compute_fn=check_compute_fn
+ average_precision_compute_fn,
+ output_transform=output_transform,
+ check_compute_fn=check_compute_fn,
+ device=device,
)
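
A minimal usage sketch of the patched metric follows; the engine, step function, and sample tensors are illustrative, and it assumes ignite and scikit-learn are installed. The `device` keyword is the argument added by this diff.

```python
# Minimal usage sketch of the patched AveragePrecision metric.
import torch
from ignite.engine import Engine
from ignite.contrib.metrics import AveragePrecision

def eval_step(engine, batch):
    # Each batch is assumed to already be a (y_pred, y) pair of tensors.
    return batch

evaluator = Engine(eval_step)

# `device` controls where predictions are accumulated; the compute_fn
# still moves tensors to CPU before calling scikit-learn.
avg_precision = AveragePrecision(device="cpu")
avg_precision.attach(evaluator, "average_precision")

y_pred = torch.rand(8)
y_true = torch.tensor([0, 1, 0, 1, 1, 0, 1, 0])
state = evaluator.run([(y_pred, y_true)])
print(state.metrics["average_precision"])
```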
|
{"golden_diff": "diff --git a/ignite/contrib/metrics/average_precision.py b/ignite/contrib/metrics/average_precision.py\n--- a/ignite/contrib/metrics/average_precision.py\n+++ b/ignite/contrib/metrics/average_precision.py\n@@ -1,4 +1,4 @@\n-from typing import Callable\n+from typing import Callable, Optional, Union\n \n import torch\n \n@@ -6,13 +6,10 @@\n \n \n def average_precision_compute_fn(y_preds: torch.Tensor, y_targets: torch.Tensor) -> float:\n- try:\n- from sklearn.metrics import average_precision_score\n- except ImportError:\n- raise RuntimeError(\"This contrib module requires sklearn to be installed.\")\n+ from sklearn.metrics import average_precision_score\n \n- y_true = y_targets.numpy()\n- y_pred = y_preds.numpy()\n+ y_true = y_targets.cpu().numpy()\n+ y_pred = y_preds.cpu().numpy()\n return average_precision_score(y_true, y_pred)\n \n \n@@ -30,6 +27,7 @@\n <https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html\n #sklearn.metrics.average_precision_score>`_ is run on the first batch of data to ensure there are\n no issues. User will be warned in case there are any issues computing the function.\n+ device: optional device specification for internal storage.\n \n AveragePrecision expects y to be comprised of 0's and 1's. y_pred must either be probability estimates or\n confidence values. To apply an activation to y_pred, use output_transform as shown below:\n@@ -45,7 +43,21 @@\n \n \"\"\"\n \n- def __init__(self, output_transform: Callable = lambda x: x, check_compute_fn: bool = False) -> None:\n+ def __init__(\n+ self,\n+ output_transform: Callable = lambda x: x,\n+ check_compute_fn: bool = False,\n+ device: Union[str, torch.device] = torch.device(\"cpu\"),\n+ ):\n+\n+ try:\n+ from sklearn.metrics import average_precision_score\n+ except ImportError:\n+ raise RuntimeError(\"This contrib module requires sklearn to be installed.\")\n+\n super(AveragePrecision, self).__init__(\n- average_precision_compute_fn, output_transform=output_transform, check_compute_fn=check_compute_fn\n+ average_precision_compute_fn,\n+ output_transform=output_transform,\n+ check_compute_fn=check_compute_fn,\n+ device=device,\n )\n", "issue": "Improve `ignite.contirb.metrics` implementation \nFor these metrics in `ignite.contrib.metrics` :\r\n- [Average Precision](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/average_precision.py)\r\n- [Precision Recall Curve](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/precision_recall_curve.py)\r\n- [Roc Auc](https://github.com/pytorch/ignite/blob/master/ignite/contrib/metrics/roc_auc.py) \r\nWe need to make the implementation simpler. Similar to the one here #1690. New arg `device` will be added, and we need to add the necessary distributed tests as well. 
See here for the tests https://github.com/pytorch/ignite/blob/master/tests/ignite/contrib/metrics/regression/test_canberra_metric.py#L99\r\n\n", "before_files": [{"content": "from typing import Callable\n\nimport torch\n\nfrom ignite.metrics import EpochMetric\n\n\ndef average_precision_compute_fn(y_preds: torch.Tensor, y_targets: torch.Tensor) -> float:\n try:\n from sklearn.metrics import average_precision_score\n except ImportError:\n raise RuntimeError(\"This contrib module requires sklearn to be installed.\")\n\n y_true = y_targets.numpy()\n y_pred = y_preds.numpy()\n return average_precision_score(y_true, y_pred)\n\n\nclass AveragePrecision(EpochMetric):\n \"\"\"Computes Average Precision accumulating predictions and the ground-truth during an epoch\n and applying `sklearn.metrics.average_precision_score <https://scikit-learn.org/stable/modules/generated/\n sklearn.metrics.average_precision_score.html#sklearn.metrics.average_precision_score>`_ .\n\n Args:\n output_transform: a callable that is used to transform the\n :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the\n form expected by the metric. This can be useful if, for example, you have a multi-output model and\n you want to compute the metric with respect to one of the outputs.\n check_compute_fn: Default False. If True, `average_precision_score\n <https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html\n #sklearn.metrics.average_precision_score>`_ is run on the first batch of data to ensure there are\n no issues. User will be warned in case there are any issues computing the function.\n\n AveragePrecision expects y to be comprised of 0's and 1's. y_pred must either be probability estimates or\n confidence values. To apply an activation to y_pred, use output_transform as shown below:\n\n .. code-block:: python\n\n def activated_output_transform(output):\n y_pred, y = output\n y_pred = torch.softmax(y_pred, dim=1)\n return y_pred, y\n\n avg_precision = AveragePrecision(activated_output_transform)\n\n \"\"\"\n\n def __init__(self, output_transform: Callable = lambda x: x, check_compute_fn: bool = False) -> None:\n super(AveragePrecision, self).__init__(\n average_precision_compute_fn, output_transform=output_transform, check_compute_fn=check_compute_fn\n )\n", "path": "ignite/contrib/metrics/average_precision.py"}], "after_files": [{"content": "from typing import Callable, Optional, Union\n\nimport torch\n\nfrom ignite.metrics import EpochMetric\n\n\ndef average_precision_compute_fn(y_preds: torch.Tensor, y_targets: torch.Tensor) -> float:\n from sklearn.metrics import average_precision_score\n\n y_true = y_targets.cpu().numpy()\n y_pred = y_preds.cpu().numpy()\n return average_precision_score(y_true, y_pred)\n\n\nclass AveragePrecision(EpochMetric):\n \"\"\"Computes Average Precision accumulating predictions and the ground-truth during an epoch\n and applying `sklearn.metrics.average_precision_score <https://scikit-learn.org/stable/modules/generated/\n sklearn.metrics.average_precision_score.html#sklearn.metrics.average_precision_score>`_ .\n\n Args:\n output_transform: a callable that is used to transform the\n :class:`~ignite.engine.engine.Engine`'s ``process_function``'s output into the\n form expected by the metric. This can be useful if, for example, you have a multi-output model and\n you want to compute the metric with respect to one of the outputs.\n check_compute_fn: Default False. 
If True, `average_precision_score\n <https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html\n #sklearn.metrics.average_precision_score>`_ is run on the first batch of data to ensure there are\n no issues. User will be warned in case there are any issues computing the function.\n device: optional device specification for internal storage.\n\n AveragePrecision expects y to be comprised of 0's and 1's. y_pred must either be probability estimates or\n confidence values. To apply an activation to y_pred, use output_transform as shown below:\n\n .. code-block:: python\n\n def activated_output_transform(output):\n y_pred, y = output\n y_pred = torch.softmax(y_pred, dim=1)\n return y_pred, y\n\n avg_precision = AveragePrecision(activated_output_transform)\n\n \"\"\"\n\n def __init__(\n self,\n output_transform: Callable = lambda x: x,\n check_compute_fn: bool = False,\n device: Union[str, torch.device] = torch.device(\"cpu\"),\n ):\n\n try:\n from sklearn.metrics import average_precision_score\n except ImportError:\n raise RuntimeError(\"This contrib module requires sklearn to be installed.\")\n\n super(AveragePrecision, self).__init__(\n average_precision_compute_fn,\n output_transform=output_transform,\n check_compute_fn=check_compute_fn,\n device=device,\n )\n", "path": "ignite/contrib/metrics/average_precision.py"}]}
| 1,021 | 533 |
gh_patches_debug_37409
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-2875
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider tacocabana is broken
During the global build at 2021-05-26-14-42-23, spider **tacocabana** failed with **0 features** and **1 error**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/tacocabana.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/tacocabana.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/tacocabana.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/tacocabana.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import scrapy
3 import json
4 import re
5
6 from locations.items import GeojsonPointItem
7
8 class TacocabanaSpider(scrapy.Spider):
9 name = "tacocabana"
10 item_attributes = { 'brand': "Taco Cabana" }
11 allowed_domains = ["www.tacocabana.com"]
12 start_urls = (
13 "http://www.tacocabana.com/wp-admin/admin-ajax.php?action=get_ajax_processor&processor=get-locations&queryType=&postID=816",
14 )
15
16 def parse(self, response):
17 data = json.loads(re.sub(r"\s<.*?>.*<.*?>\s", "", response.body_as_unicode()))
18
19 for store in data:
20 properties = {
21 "phone" : store["phone_number"],
22 "ref" : str(store["locator_store_number"]),
23 "name" : store["post_title"],
24 "opening_hours" : store["hours"],
25 "website" : store["permalink"],
26 "lat" : store["x_coordinate"],
27 "lon" : store["y_coordinate"],
28 "street" : store["street_address_1"] + store["street_address_2"],
29 "city" : store["city"],
30 "state" : store["state"],
31 "postcode" : store["zip_code"]
32 }
33
34 yield GeojsonPointItem(**properties)
35
36 else:
37 self.logger.info("No results")
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/tacocabana.py b/locations/spiders/tacocabana.py
--- a/locations/spiders/tacocabana.py
+++ b/locations/spiders/tacocabana.py
@@ -1,37 +1,55 @@
# -*- coding: utf-8 -*-
import scrapy
-import json
-import re
from locations.items import GeojsonPointItem
+
class TacocabanaSpider(scrapy.Spider):
name = "tacocabana"
- item_attributes = { 'brand': "Taco Cabana" }
- allowed_domains = ["www.tacocabana.com"]
- start_urls = (
- "http://www.tacocabana.com/wp-admin/admin-ajax.php?action=get_ajax_processor&processor=get-locations&queryType=&postID=816",
- )
-
+ item_attributes = {"brand": "Taco Cabana"}
+ allowed_domains = ["api.koala.fuzzhq.com"]
+
+ def start_requests(self):
+ yield scrapy.http.JsonRequest(
+ "https://api.koala.fuzzhq.com/oauth/access_token",
+ data={
+ "client_id": "3nA4STkGif0fZGApqxMlVewy3h8HN6Fsy7jVOACP",
+ "client_secret": "8oBU5gWiNg04zYzz61hN3ETrTIzvmbGyeLCX0F1s",
+ "grant_type": "ordering_app_credentials",
+ "scope": "group:ordering_app",
+ },
+ callback=self.fetch_locations,
+ )
+
+ def fetch_locations(self, response):
+ self.access_token = response.json()["access_token"]
+ yield self.request(
+ "https://api.koala.fuzzhq.com/v1/ordering/store-locations/?include[]=operating_hours&include[]=attributes&per_page=50"
+ )
+
+ def request(self, url):
+ return scrapy.Request(
+ url, headers={"Authorization": f"Bearer {self.access_token}"}
+ )
+
def parse(self, response):
- data = json.loads(re.sub(r"\s<.*?>.*<.*?>\s", "", response.body_as_unicode()))
+ data = response.json()
- for store in data:
+ for store in data["data"]:
properties = {
- "phone" : store["phone_number"],
- "ref" : str(store["locator_store_number"]),
- "name" : store["post_title"],
- "opening_hours" : store["hours"],
- "website" : store["permalink"],
- "lat" : store["x_coordinate"],
- "lon" : store["y_coordinate"],
- "street" : store["street_address_1"] + store["street_address_2"],
- "city" : store["city"],
- "state" : store["state"],
- "postcode" : store["zip_code"]
+ "website": f'https://olo.tacocabana.com/menu/{store["slug"]}?showInfoModal=true',
+ "ref": store["brand_id"],
+ "lat": store["latitude"],
+ "lon": store["longitude"],
+ "addr_full": store["street_address"],
+ "city": store["city"],
+ "state": store["cached_data"]["state"],
+ "country": store["country"],
+ "postcode": store["zip_code"],
+ "phone": store["phone_number"],
}
-
yield GeojsonPointItem(**properties)
-
- else:
- self.logger.info("No results")
+
+ next_url = data["meta"]["pagination"]["links"]["next"]
+ if next_url:
+ yield self.request(next_url)
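
The same token-plus-pagination flow can be sketched outside scrapy with the `requests` library. The endpoints, credentials, and response field names below are copied from the diff above and should be treated as assumptions about the Koala ordering API.

```python
# Rough sketch of the OAuth token fetch and paginated location crawl.
import requests

def fetch_all_locations():
    token = requests.post(
        "https://api.koala.fuzzhq.com/oauth/access_token",
        json={
            "client_id": "3nA4STkGif0fZGApqxMlVewy3h8HN6Fsy7jVOACP",
            "client_secret": "8oBU5gWiNg04zYzz61hN3ETrTIzvmbGyeLCX0F1s",
            "grant_type": "ordering_app_credentials",
            "scope": "group:ordering_app",
        },
    ).json()["access_token"]

    url = (
        "https://api.koala.fuzzhq.com/v1/ordering/store-locations/"
        "?include[]=operating_hours&include[]=attributes&per_page=50"
    )
    headers = {"Authorization": f"Bearer {token}"}
    while url:
        page = requests.get(url, headers=headers).json()
        yield from page["data"]
        # The API is assumed to return null when there is no next page.
        url = page["meta"]["pagination"]["links"]["next"]
```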
|
{"golden_diff": "diff --git a/locations/spiders/tacocabana.py b/locations/spiders/tacocabana.py\n--- a/locations/spiders/tacocabana.py\n+++ b/locations/spiders/tacocabana.py\n@@ -1,37 +1,55 @@\n # -*- coding: utf-8 -*-\n import scrapy\n-import json\n-import re\n \n from locations.items import GeojsonPointItem\n \n+\n class TacocabanaSpider(scrapy.Spider):\n name = \"tacocabana\"\n- item_attributes = { 'brand': \"Taco Cabana\" }\n- allowed_domains = [\"www.tacocabana.com\"]\n- start_urls = (\n- \"http://www.tacocabana.com/wp-admin/admin-ajax.php?action=get_ajax_processor&processor=get-locations&queryType=&postID=816\",\n- )\n- \n+ item_attributes = {\"brand\": \"Taco Cabana\"}\n+ allowed_domains = [\"api.koala.fuzzhq.com\"]\n+\n+ def start_requests(self):\n+ yield scrapy.http.JsonRequest(\n+ \"https://api.koala.fuzzhq.com/oauth/access_token\",\n+ data={\n+ \"client_id\": \"3nA4STkGif0fZGApqxMlVewy3h8HN6Fsy7jVOACP\",\n+ \"client_secret\": \"8oBU5gWiNg04zYzz61hN3ETrTIzvmbGyeLCX0F1s\",\n+ \"grant_type\": \"ordering_app_credentials\",\n+ \"scope\": \"group:ordering_app\",\n+ },\n+ callback=self.fetch_locations,\n+ )\n+\n+ def fetch_locations(self, response):\n+ self.access_token = response.json()[\"access_token\"]\n+ yield self.request(\n+ \"https://api.koala.fuzzhq.com/v1/ordering/store-locations/?include[]=operating_hours&include[]=attributes&per_page=50\"\n+ )\n+\n+ def request(self, url):\n+ return scrapy.Request(\n+ url, headers={\"Authorization\": f\"Bearer {self.access_token}\"}\n+ )\n+\n def parse(self, response):\n- data = json.loads(re.sub(r\"\\s<.*?>.*<.*?>\\s\", \"\", response.body_as_unicode()))\n+ data = response.json()\n \n- for store in data:\n+ for store in data[\"data\"]:\n properties = {\n- \"phone\" : store[\"phone_number\"],\n- \"ref\" : str(store[\"locator_store_number\"]),\n- \"name\" : store[\"post_title\"],\n- \"opening_hours\" : store[\"hours\"],\n- \"website\" : store[\"permalink\"],\n- \"lat\" : store[\"x_coordinate\"],\n- \"lon\" : store[\"y_coordinate\"],\n- \"street\" : store[\"street_address_1\"] + store[\"street_address_2\"],\n- \"city\" : store[\"city\"],\n- \"state\" : store[\"state\"],\n- \"postcode\" : store[\"zip_code\"]\n+ \"website\": f'https://olo.tacocabana.com/menu/{store[\"slug\"]}?showInfoModal=true',\n+ \"ref\": store[\"brand_id\"],\n+ \"lat\": store[\"latitude\"],\n+ \"lon\": store[\"longitude\"],\n+ \"addr_full\": store[\"street_address\"],\n+ \"city\": store[\"city\"],\n+ \"state\": store[\"cached_data\"][\"state\"],\n+ \"country\": store[\"country\"],\n+ \"postcode\": store[\"zip_code\"],\n+ \"phone\": store[\"phone_number\"],\n }\n- \n yield GeojsonPointItem(**properties)\n- \n- else:\n- self.logger.info(\"No results\")\n+\n+ next_url = data[\"meta\"][\"pagination\"][\"links\"][\"next\"]\n+ if next_url:\n+ yield self.request(next_url)\n", "issue": "Spider tacocabana is broken\nDuring the global build at 2021-05-26-14-42-23, spider **tacocabana** failed with **0 features** and **1 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/tacocabana.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/tacocabana.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/tacocabana.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\n\nfrom locations.items import GeojsonPointItem\n\nclass TacocabanaSpider(scrapy.Spider):\n name = \"tacocabana\"\n item_attributes = 
{ 'brand': \"Taco Cabana\" }\n allowed_domains = [\"www.tacocabana.com\"]\n start_urls = (\n \"http://www.tacocabana.com/wp-admin/admin-ajax.php?action=get_ajax_processor&processor=get-locations&queryType=&postID=816\",\n )\n \n def parse(self, response):\n data = json.loads(re.sub(r\"\\s<.*?>.*<.*?>\\s\", \"\", response.body_as_unicode()))\n\n for store in data:\n properties = {\n \"phone\" : store[\"phone_number\"],\n \"ref\" : str(store[\"locator_store_number\"]),\n \"name\" : store[\"post_title\"],\n \"opening_hours\" : store[\"hours\"],\n \"website\" : store[\"permalink\"],\n \"lat\" : store[\"x_coordinate\"],\n \"lon\" : store[\"y_coordinate\"],\n \"street\" : store[\"street_address_1\"] + store[\"street_address_2\"],\n \"city\" : store[\"city\"],\n \"state\" : store[\"state\"],\n \"postcode\" : store[\"zip_code\"]\n }\n \n yield GeojsonPointItem(**properties)\n \n else:\n self.logger.info(\"No results\")\n", "path": "locations/spiders/tacocabana.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\n\n\nclass TacocabanaSpider(scrapy.Spider):\n name = \"tacocabana\"\n item_attributes = {\"brand\": \"Taco Cabana\"}\n allowed_domains = [\"api.koala.fuzzhq.com\"]\n\n def start_requests(self):\n yield scrapy.http.JsonRequest(\n \"https://api.koala.fuzzhq.com/oauth/access_token\",\n data={\n \"client_id\": \"3nA4STkGif0fZGApqxMlVewy3h8HN6Fsy7jVOACP\",\n \"client_secret\": \"8oBU5gWiNg04zYzz61hN3ETrTIzvmbGyeLCX0F1s\",\n \"grant_type\": \"ordering_app_credentials\",\n \"scope\": \"group:ordering_app\",\n },\n callback=self.fetch_locations,\n )\n\n def fetch_locations(self, response):\n self.access_token = response.json()[\"access_token\"]\n yield self.request(\n \"https://api.koala.fuzzhq.com/v1/ordering/store-locations/?include[]=operating_hours&include[]=attributes&per_page=50\"\n )\n\n def request(self, url):\n return scrapy.Request(\n url, headers={\"Authorization\": f\"Bearer {self.access_token}\"}\n )\n\n def parse(self, response):\n data = response.json()\n\n for store in data[\"data\"]:\n properties = {\n \"website\": f'https://olo.tacocabana.com/menu/{store[\"slug\"]}?showInfoModal=true',\n \"ref\": store[\"brand_id\"],\n \"lat\": store[\"latitude\"],\n \"lon\": store[\"longitude\"],\n \"addr_full\": store[\"street_address\"],\n \"city\": store[\"city\"],\n \"state\": store[\"cached_data\"][\"state\"],\n \"country\": store[\"country\"],\n \"postcode\": store[\"zip_code\"],\n \"phone\": store[\"phone_number\"],\n }\n yield GeojsonPointItem(**properties)\n\n next_url = data[\"meta\"][\"pagination\"][\"links\"][\"next\"]\n if next_url:\n yield self.request(next_url)\n", "path": "locations/spiders/tacocabana.py"}]}
| 832 | 839 |
gh_patches_debug_2630
|
rasdani/github-patches
|
git_diff
|
joke2k__faker-826
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pt_BR email not returning valid email addresses
When creating a fake Factory with the pt_BR locale, it does not return valid email addresses.
Example:
```
melocauã@bol.com.br
joã[email protected]
laví[email protected]
vitó[email protected]
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/internet/pt_BR/__init__.py`
Content:
```
1 # coding=utf-8
2 from __future__ import unicode_literals
3 from .. import Provider as InternetProvider
4
5
6 class Provider(InternetProvider):
7 safe_email_tlds = ('com', 'net', 'br', 'br')
8 free_email_domains = (
9 'gmail.com',
10 'hotmail.com',
11 'yahoo.com.br',
12 'uol.com.br',
13 'bol.com.br',
14 'ig.com.br')
15 tlds = ('com', 'com', 'com', 'net', 'org', 'br', 'br', 'br')
16
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/faker/providers/internet/pt_BR/__init__.py b/faker/providers/internet/pt_BR/__init__.py
--- a/faker/providers/internet/pt_BR/__init__.py
+++ b/faker/providers/internet/pt_BR/__init__.py
@@ -13,3 +13,11 @@
'bol.com.br',
'ig.com.br')
tlds = ('com', 'com', 'com', 'net', 'org', 'br', 'br', 'br')
+ replacements = (
+ ('à', 'a'), ('â', 'a'), ('ã', 'a'),
+ ('ç', 'c'),
+ ('é', 'e'), ('ê', 'e'),
+ ('í', 'i'),
+ ('ô', 'o'), ('ö', 'o'), ('õ', 'o'),
+ ('ú', 'u'),
+ )
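
A standalone sketch of how such a `replacements` table is applied: faker's base internet provider runs essentially this loop (in a helper usually named `_to_ascii`; the exact name is an assumption about faker's internals) before building user names and email local parts.

```python
# Apply the accent-to-ASCII substitutions used for email generation.
replacements = (
    ('à', 'a'), ('â', 'a'), ('ã', 'a'),
    ('ç', 'c'),
    ('é', 'e'), ('ê', 'e'),
    ('í', 'i'),
    ('ô', 'o'), ('ö', 'o'), ('õ', 'o'),
    ('ú', 'u'),
)

def to_ascii(string):
    for search, replace in replacements:
        string = string.replace(search, replace)
    return string

print(to_ascii('joão.oliveira'))    # joao.oliveira
print(to_ascii('vitória_fogaça'))   # vitoria_fogaca
```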
|
{"golden_diff": "diff --git a/faker/providers/internet/pt_BR/__init__.py b/faker/providers/internet/pt_BR/__init__.py\n--- a/faker/providers/internet/pt_BR/__init__.py\n+++ b/faker/providers/internet/pt_BR/__init__.py\n@@ -13,3 +13,11 @@\n 'bol.com.br',\n 'ig.com.br')\n tlds = ('com', 'com', 'com', 'net', 'org', 'br', 'br', 'br')\n+ replacements = (\n+ ('\u00e0', 'a'), ('\u00e2', 'a'), ('\u00e3', 'a'),\n+ ('\u00e7', 'c'),\n+ ('\u00e9', 'e'), ('\u00ea', 'e'),\n+ ('\u00ed', 'i'),\n+ ('\u00f4', 'o'), ('\u00f6', 'o'), ('\u00f5', 'o'),\n+ ('\u00fa', 'u'),\n+ )\n", "issue": "pt_BR email not returning valid email addresses\nWhen creating a fake Factory with the pt_BR it is not returning valid email addresses.\r\nExample:\r\n```\r\nmelocau\[email protected]\r\njo\[email protected]\r\nlav\[email protected]\r\nvit\[email protected]\r\n```\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom .. import Provider as InternetProvider\n\n\nclass Provider(InternetProvider):\n safe_email_tlds = ('com', 'net', 'br', 'br')\n free_email_domains = (\n 'gmail.com',\n 'hotmail.com',\n 'yahoo.com.br',\n 'uol.com.br',\n 'bol.com.br',\n 'ig.com.br')\n tlds = ('com', 'com', 'com', 'net', 'org', 'br', 'br', 'br')\n", "path": "faker/providers/internet/pt_BR/__init__.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom .. import Provider as InternetProvider\n\n\nclass Provider(InternetProvider):\n safe_email_tlds = ('com', 'net', 'br', 'br')\n free_email_domains = (\n 'gmail.com',\n 'hotmail.com',\n 'yahoo.com.br',\n 'uol.com.br',\n 'bol.com.br',\n 'ig.com.br')\n tlds = ('com', 'com', 'com', 'net', 'org', 'br', 'br', 'br')\n replacements = (\n ('\u00e0', 'a'), ('\u00e2', 'a'), ('\u00e3', 'a'),\n ('\u00e7', 'c'),\n ('\u00e9', 'e'), ('\u00ea', 'e'),\n ('\u00ed', 'i'),\n ('\u00f4', 'o'), ('\u00f6', 'o'), ('\u00f5', 'o'),\n ('\u00fa', 'u'),\n )\n", "path": "faker/providers/internet/pt_BR/__init__.py"}]}
| 477 | 199 |
gh_patches_debug_8784
|
rasdani/github-patches
|
git_diff
|
digitalfabrik__integreat-cms-632
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
get_last_root_page template tag not working
### Describe the Bug
<!-- A clear and concise description of what the bug is. -->
The `get_last_root_page` template tag is not working:
https://github.com/Integreat/integreat-cms/blob/develop/src/cms/templatetags/page_filters.py#L10-L20
This is because in the page QuerySet aggregation, the difference() function is used:
https://github.com/Integreat/integreat-cms/blob/a285cb3c74b2a2c501147076338e2b2a70c89bd6/src/cms/models/regions/region.py#L177
After difference(), it is not possible to use filter(), see https://docs.djangoproject.com/en/2.2/ref/models/querysets/#union
This leads to the last drop-region in the page tree being incorrectly tied to the last page in the list, not necessarily the last root page.
### Steps to Reproduce
1. Go to Page tree
2. Drag & drop page to most bottom line
### Expected Behavior
<!-- A clear and concise description of what you expected to happen. -->
The page should be placed next to the last root page
### Actual Behavior
<!-- A clear and concise description of what actually happened. -->
The page is placed next to the last page in the list
### Additional Information
<!-- Add any other context (e.g. logs, screenshots, etc.) about the problem here. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cms/templatetags/page_filters.py`
Content:
```
1 """
2 This is a collection of tags and filters for :class:`~cms.models.pages.page.Page` objects.
3 """
4 from django import template
5
6 register = template.Library()
7
8
9 @register.simple_tag
10 def get_last_root_page(pages):
11 """
12 This tag returns the last page on the root level.
13
14 :param pages: The requested page tree
15 :type pages: ~mptt.querysets.TreeQuerySet [ ~cms.models.pages.page.Page ]
16
17 :return: The last root page of the given :class:`~django.db.models.query.QuerySet`
18 :rtype: ~cms.models.pages.page.Page
19 """
20 return pages.filter(parent=None).last()
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cms/templatetags/page_filters.py b/src/cms/templatetags/page_filters.py
--- a/src/cms/templatetags/page_filters.py
+++ b/src/cms/templatetags/page_filters.py
@@ -12,9 +12,9 @@
This tag returns the last page on the root level.
:param pages: The requested page tree
- :type pages: ~mptt.querysets.TreeQuerySet [ ~cms.models.pages.page.Page ]
+ :type pages: list [ ~cms.models.pages.page.Page ]
- :return: The last root page of the given :class:`~django.db.models.query.QuerySet`
+ :return: The last root page of the given page list
:rtype: ~cms.models.pages.page.Page
"""
- return pages.filter(parent=None).last()
+ return list(filter(lambda p: not p.parent, pages))[-1]
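
A pure-Python sketch of the fixed tag's semantics, using stand-in page objects: the last item whose `parent` is falsy is returned. Plain list filtering works even when the incoming queryset was built with difference(), where QuerySet.filter() is not supported.

```python
# Stand-in pages demonstrating the list-based root-page lookup.
from types import SimpleNamespace

pages = [
    SimpleNamespace(title="Welcome", parent=None),
    SimpleNamespace(title="Visa", parent="Welcome"),
    SimpleNamespace(title="Housing", parent=None),
]

def get_last_root_page(pages):
    return list(filter(lambda p: not p.parent, pages))[-1]

print(get_last_root_page(pages).title)  # Housing
```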
|
{"golden_diff": "diff --git a/src/cms/templatetags/page_filters.py b/src/cms/templatetags/page_filters.py\n--- a/src/cms/templatetags/page_filters.py\n+++ b/src/cms/templatetags/page_filters.py\n@@ -12,9 +12,9 @@\n This tag returns the last page on the root level.\n \n :param pages: The requested page tree\n- :type pages: ~mptt.querysets.TreeQuerySet [ ~cms.models.pages.page.Page ]\n+ :type pages: list [ ~cms.models.pages.page.Page ]\n \n- :return: The last root page of the given :class:`~django.db.models.query.QuerySet`\n+ :return: The last root page of the given page list\n :rtype: ~cms.models.pages.page.Page\n \"\"\"\n- return pages.filter(parent=None).last()\n+ return list(filter(lambda p: not p.parent, pages))[-1]\n", "issue": "get_last_root_page template tag not working\n### Describe the Bug\r\n<!-- A clear and concise description of what the bug is. -->\r\nThe `get_last_root_page` template tag is not working:\r\nhttps://github.com/Integreat/integreat-cms/blob/develop/src/cms/templatetags/page_filters.py#L10-L20\r\n\r\nThis is because in the page QuerySet aggregation, the difference() function is used:\r\nhttps://github.com/Integreat/integreat-cms/blob/a285cb3c74b2a2c501147076338e2b2a70c89bd6/src/cms/models/regions/region.py#L177\r\n\r\nAfter difference(), it is not possible to use filter(), see https://docs.djangoproject.com/en/2.2/ref/models/querysets/#union\r\n\r\nThis leads to the last drop-region in the page tree being incorrectly tied to the last page in the list, not necessarily the last root page.\r\n\r\n### Steps to Reproduce\r\n\r\n1. Go to Page tree\r\n2. Drag & drop page to most bottom line\r\n\r\n### Expected Behavior\r\n<!-- A clear and concise description of what you expected to happen. -->\r\nThe page should be placed next to the last root page\r\n\r\n### Actual Behavior\r\n<!-- A clear and concise description of what actually happened. -->\r\nThe page is placed next to the last page in the list\r\n\r\n### Additional Information\r\n<!-- Add any other context (e.g. logs, screenshots, etc.) about the problem here. -->\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nThis is a collection of tags and filters for :class:`~cms.models.pages.page.Page` objects.\n\"\"\"\nfrom django import template\n\nregister = template.Library()\n\n\[email protected]_tag\ndef get_last_root_page(pages):\n \"\"\"\n This tag returns the last page on the root level.\n\n :param pages: The requested page tree\n :type pages: ~mptt.querysets.TreeQuerySet [ ~cms.models.pages.page.Page ]\n\n :return: The last root page of the given :class:`~django.db.models.query.QuerySet`\n :rtype: ~cms.models.pages.page.Page\n \"\"\"\n return pages.filter(parent=None).last()\n", "path": "src/cms/templatetags/page_filters.py"}], "after_files": [{"content": "\"\"\"\nThis is a collection of tags and filters for :class:`~cms.models.pages.page.Page` objects.\n\"\"\"\nfrom django import template\n\nregister = template.Library()\n\n\[email protected]_tag\ndef get_last_root_page(pages):\n \"\"\"\n This tag returns the last page on the root level.\n\n :param pages: The requested page tree\n :type pages: list [ ~cms.models.pages.page.Page ]\n\n :return: The last root page of the given page list\n :rtype: ~cms.models.pages.page.Page\n \"\"\"\n return list(filter(lambda p: not p.parent, pages))[-1]\n", "path": "src/cms/templatetags/page_filters.py"}]}
| 760 | 203 |
gh_patches_debug_21667
|
rasdani/github-patches
|
git_diff
|
fedora-infra__bodhi-2005
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
bodhi-dequeue-stable dies if any update in the queue is no longer eligible to go stable
QuLogic from Freenode reported today that batched updates didn't go stable at 03:00 UTC like they should have. I confirmed that the cron job ran, but I didn't see any notes about its output. I then ran the command by hand and received this output:
```
[bowlofeggs@bodhi-backend01 ~][PROD]$ sudo -u apache /usr/bin/bodhi-dequeue-stable
No handlers could be found for logger "bodhi.server"
This update has not yet met the minimum testing requirements defined in the <a href="https://fedoraproject.org/wiki/Package_update_acceptance_criteria">Package Update Acceptance Criteria</a>
```
The [```dequeue_stable()```](https://github.com/fedora-infra/bodhi/blob/3.0.0/bodhi/server/scripts/dequeue_stable.py#L28-L46) function runs a large transaction with only a single try/except. It seems that some update in the queue no longer meets testing requirements (probably due to receiving a -1 karma after going to batched) and is raising an Exception when the tool attempts to mark it for stable. Since there is only one try/except handler, this causes the whole transaction to be rolled back.
It should be easy to fix this - we just need a try/except around each update.
Thanks to QuLogic from Freenode for reporting this issue to me.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bodhi/server/scripts/dequeue_stable.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright © 2017 Caleigh Runge-Hottman
3 #
4 # This file is part of Bodhi.
5 #
6 # This program is free software; you can redistribute it and/or
7 # modify it under the terms of the GNU General Public License
8 # as published by the Free Software Foundation; either version 2
9 # of the License, or (at your option) any later version.
10 #
11 # This program is distributed in the hope that it will be useful,
12 # but WITHOUT ANY WARRANTY; without even the implied warranty of
13 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 # GNU General Public License for more details.
15 #
16 # You should have received a copy of the GNU General Public License
17 # along with this program; if not, write to the Free Software
18 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
19 """This script is responsible for moving all updates with a batched request to a stable request."""
20
21 import sys
22
23 import click
24
25 from bodhi.server import buildsys, config, models, Session, initialize_db
26
27
28 @click.command()
29 @click.version_option(message='%(version)s')
30 def dequeue_stable():
31 """Convert all batched requests to stable requests."""
32 initialize_db(config.config)
33 buildsys.setup_buildsystem(config.config)
34 db = Session()
35
36 try:
37 batched = db.query(models.Update).filter_by(request=models.UpdateRequest.batched).all()
38 for update in batched:
39 update.set_request(db, models.UpdateRequest.stable, u'bodhi')
40 db.commit()
41
42 except Exception as e:
43 print(str(e))
44 db.rollback()
45 Session.remove()
46 sys.exit(1)
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bodhi/server/scripts/dequeue_stable.py b/bodhi/server/scripts/dequeue_stable.py
--- a/bodhi/server/scripts/dequeue_stable.py
+++ b/bodhi/server/scripts/dequeue_stable.py
@@ -1,5 +1,5 @@
# -*- coding: utf-8 -*-
-# Copyright © 2017 Caleigh Runge-Hottman
+# Copyright © 2017 Caleigh Runge-Hottman and Red Hat, Inc.
#
# This file is part of Bodhi.
#
@@ -36,11 +36,17 @@
try:
batched = db.query(models.Update).filter_by(request=models.UpdateRequest.batched).all()
for update in batched:
- update.set_request(db, models.UpdateRequest.stable, u'bodhi')
- db.commit()
-
+ try:
+ update.set_request(db, models.UpdateRequest.stable, u'bodhi')
+ db.commit()
+ except Exception as e:
+ print('Unable to stabilize {}: {}'.format(update.alias, str(e)))
+ db.rollback()
+ msg = u"Bodhi is unable to request this update for stabilization: {}"
+ update.comment(db, msg.format(str(e)), author=u'bodhi')
+ db.commit()
except Exception as e:
print(str(e))
- db.rollback()
- Session.remove()
sys.exit(1)
+ finally:
+ Session.remove()
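
A self-contained sketch of the per-update error handling the patch introduces; `FakeSession` and `FakeUpdate` below are stand-ins for Bodhi's SQLAlchemy session and model, so one failing update rolls back only its own work instead of aborting the whole batch.

```python
# Per-item try/except: one bad update no longer aborts the whole run.
class FakeSession:
    def commit(self):
        print("commit")

    def rollback(self):
        print("rollback")

class FakeUpdate:
    def __init__(self, alias, ok=True):
        self.alias, self.ok = alias, ok

    def set_request(self, db, request, author):
        if not self.ok:
            raise ValueError("minimum testing requirements not met")

db = FakeSession()
updates = [FakeUpdate("FEDORA-1"), FakeUpdate("FEDORA-2", ok=False), FakeUpdate("FEDORA-3")]
for update in updates:
    try:
        update.set_request(db, "stable", "bodhi")
        db.commit()
    except Exception as exc:
        print("Unable to stabilize {}: {}".format(update.alias, exc))
        db.rollback()
```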
|
{"golden_diff": "diff --git a/bodhi/server/scripts/dequeue_stable.py b/bodhi/server/scripts/dequeue_stable.py\n--- a/bodhi/server/scripts/dequeue_stable.py\n+++ b/bodhi/server/scripts/dequeue_stable.py\n@@ -1,5 +1,5 @@\n # -*- coding: utf-8 -*-\n-# Copyright \u00a9 2017 Caleigh Runge-Hottman\n+# Copyright \u00a9 2017 Caleigh Runge-Hottman and Red Hat, Inc.\n #\n # This file is part of Bodhi.\n #\n@@ -36,11 +36,17 @@\n try:\n batched = db.query(models.Update).filter_by(request=models.UpdateRequest.batched).all()\n for update in batched:\n- update.set_request(db, models.UpdateRequest.stable, u'bodhi')\n- db.commit()\n-\n+ try:\n+ update.set_request(db, models.UpdateRequest.stable, u'bodhi')\n+ db.commit()\n+ except Exception as e:\n+ print('Unable to stabilize {}: {}'.format(update.alias, str(e)))\n+ db.rollback()\n+ msg = u\"Bodhi is unable to request this update for stabilization: {}\"\n+ update.comment(db, msg.format(str(e)), author=u'bodhi')\n+ db.commit()\n except Exception as e:\n print(str(e))\n- db.rollback()\n- Session.remove()\n sys.exit(1)\n+ finally:\n+ Session.remove()\n", "issue": "bodhi-dequqe-stable dies if any update in the queue is no longer eligible to go stable\nQuLogic from Freenode reported today that batched updates didn't go stable at 03:00 UTC like they should have. I confirmed that the cron job ran, but I didn't see any notes about its output. I then ran the command by hand and received this output:\r\n\r\n```\r\n[bowlofeggs@bodhi-backend01 ~][PROD]$ sudo -u apache /usr/bin/bodhi-dequeue-stable\r\nNo handlers could be found for logger \"bodhi.server\"\r\nThis update has not yet met the minimum testing requirements defined in the <a href=\"https://fedoraproject.org/wiki/Package_update_acceptance_criteria\">Package Update Acceptance Criteria</a>\r\n```\r\n\r\nThe [```dequeue_stable()```](https://github.com/fedora-infra/bodhi/blob/3.0.0/bodhi/server/scripts/dequeue_stable.py#L28-L46) function runs a large transaction with only a single try/except. It seems that some update in the queue no longer meets testing requirements (probably due to receiving a -1 karma after going to batched) and is raising an Exception when the tool attempts to mark it for stable. Since there is only one try/except handler, this causes the whole transaction to be rolled back.\r\n\r\nIt should be easy to fix this - we just need a try/except around each update.\r\n\r\nThanks to QuLogic from Freenode for reporting this issue to me.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright \u00a9 2017 Caleigh Runge-Hottman\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, write to the Free Software\n# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\"\"\"This script is responsible for moving all updates with a batched request to a stable request.\"\"\"\n\nimport sys\n\nimport click\n\nfrom bodhi.server import buildsys, config, models, Session, initialize_db\n\n\[email protected]()\[email protected]_option(message='%(version)s')\ndef dequeue_stable():\n \"\"\"Convert all batched requests to stable requests.\"\"\"\n initialize_db(config.config)\n buildsys.setup_buildsystem(config.config)\n db = Session()\n\n try:\n batched = db.query(models.Update).filter_by(request=models.UpdateRequest.batched).all()\n for update in batched:\n update.set_request(db, models.UpdateRequest.stable, u'bodhi')\n db.commit()\n\n except Exception as e:\n print(str(e))\n db.rollback()\n Session.remove()\n sys.exit(1)\n", "path": "bodhi/server/scripts/dequeue_stable.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright \u00a9 2017 Caleigh Runge-Hottman and Red Hat, Inc.\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, write to the Free Software\n# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\"\"\"This script is responsible for moving all updates with a batched request to a stable request.\"\"\"\n\nimport sys\n\nimport click\n\nfrom bodhi.server import buildsys, config, models, Session, initialize_db\n\n\[email protected]()\[email protected]_option(message='%(version)s')\ndef dequeue_stable():\n \"\"\"Convert all batched requests to stable requests.\"\"\"\n initialize_db(config.config)\n buildsys.setup_buildsystem(config.config)\n db = Session()\n\n try:\n batched = db.query(models.Update).filter_by(request=models.UpdateRequest.batched).all()\n for update in batched:\n try:\n update.set_request(db, models.UpdateRequest.stable, u'bodhi')\n db.commit()\n except Exception as e:\n print('Unable to stabilize {}: {}'.format(update.alias, str(e)))\n db.rollback()\n msg = u\"Bodhi is unable to request this update for stabilization: {}\"\n update.comment(db, msg.format(str(e)), author=u'bodhi')\n db.commit()\n except Exception as e:\n print(str(e))\n sys.exit(1)\n finally:\n Session.remove()\n", "path": "bodhi/server/scripts/dequeue_stable.py"}]}
| 1,057 | 319 |
gh_patches_debug_4008
|
rasdani/github-patches
|
git_diff
|
zestedesavoir__zds-site-2705
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Latest topics on the home page: the date display is not optimized for mobile
On mobile there is generally not much room, so displaying the full literal date should be avoided to save space. See the screenshot (landscape).

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zds/featured/forms.py`
Content:
```
1 # coding: utf-8
2 from crispy_forms.bootstrap import StrictButton
3 from crispy_forms.helper import FormHelper
4 from crispy_forms.layout import Layout, Field, ButtonHolder
5 from django import forms
6 from django.core.urlresolvers import reverse
7 from django.utils.translation import ugettext_lazy as _
8
9 from zds.featured.models import FeaturedResource, FeaturedMessage
10
11
12 class FeaturedResourceForm(forms.ModelForm):
13 class Meta:
14 model = FeaturedResource
15
16 fields = ['title', 'type', 'authors', 'image_url', 'url']
17
18 title = forms.CharField(
19 label=_(u'Titre'),
20 max_length=FeaturedResource._meta.get_field('title').max_length,
21 widget=forms.TextInput(
22 attrs={
23 'required': 'required',
24 }
25 )
26 )
27
28 type = forms.CharField(
29 label=_(u'Type'),
30 max_length=FeaturedResource._meta.get_field('type').max_length,
31 widget=forms.TextInput(
32 attrs={
33 'placeholder': _(u'ex: Un projet, un article, un tutoriel...'),
34 'required': 'required',
35 }
36 )
37 )
38
39 authors = forms.CharField(
40 label=_('Auteurs'),
41 widget=forms.TextInput(
42 attrs={
43 'placeholder': _(u'Les auteurs doivent être séparés par une virgule.'),
44 'required': 'required',
45 'data-autocomplete': '{ "type": "multiple" }'
46 }
47 )
48 )
49
50 image_url = forms.CharField(
51 label='Image URL',
52 max_length=FeaturedResource._meta.get_field('image_url').max_length,
53 widget=forms.TextInput(
54 attrs={
55 'placeholder': _(u'Lien vers l\'url de l\'image de la une.')
56 }
57 )
58 )
59
60 url = forms.CharField(
61 label='URL',
62 max_length=FeaturedResource._meta.get_field('url').max_length,
63 widget=forms.TextInput(
64 attrs={
65 'placeholder': _(u'Lien vers l\'url de la ressource.')
66 }
67 )
68 )
69
70 def __init__(self, *args, **kwargs):
71 super(FeaturedResourceForm, self).__init__(*args, **kwargs)
72 self.helper = FormHelper()
73 self.helper.form_class = 'content-wrapper'
74 self.helper.form_method = 'post'
75 self.helper.form_action = reverse('featured-resource-create')
76
77 self.helper.layout = Layout(
78 Field('title'),
79 Field('type'),
80 Field('authors'),
81 Field('image_url'),
82 Field('url'),
83 ButtonHolder(
84 StrictButton(_(u'Enregistrer'), type='submit'),
85 ),
86 )
87
88
89 class FeaturedMessageForm(forms.ModelForm):
90 class Meta:
91 model = FeaturedMessage
92
93 fields = ['message', 'url']
94
95 message = forms.CharField(
96 label=_(u'Message'),
97 max_length=FeaturedMessage._meta.get_field('message').max_length,
98 widget=forms.TextInput(
99 attrs={
100 'required': 'required',
101 }
102 )
103 )
104
105 url = forms.CharField(
106 label=_(u'URL'),
107 max_length=FeaturedMessage._meta.get_field('url').max_length,
108 widget=forms.TextInput(
109 attrs={
110 'placeholder': _(u'Lien vers l\'url du message.'),
111 'required': 'required',
112 }
113 )
114 )
115
116 def __init__(self, *args, **kwargs):
117 super(FeaturedMessageForm, self).__init__(*args, **kwargs)
118 self.helper = FormHelper()
119 self.helper.form_class = 'content-wrapper'
120 self.helper.form_method = 'post'
121 self.helper.form_action = reverse('featured-message-create')
122
123 self.helper.layout = Layout(
124 Field('message'),
125 Field('url'),
126 ButtonHolder(
127 StrictButton(_(u'Enregistrer'), type='submit'),
128 ),
129 )
130
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/zds/featured/forms.py b/zds/featured/forms.py
--- a/zds/featured/forms.py
+++ b/zds/featured/forms.py
@@ -52,7 +52,7 @@
max_length=FeaturedResource._meta.get_field('image_url').max_length,
widget=forms.TextInput(
attrs={
- 'placeholder': _(u'Lien vers l\'url de l\'image de la une.')
+ 'placeholder': _(u'Lien vers l\'url de l\'image de la une (dimensions: 228x228).')
}
)
)
|
{"golden_diff": "diff --git a/zds/featured/forms.py b/zds/featured/forms.py\n--- a/zds/featured/forms.py\n+++ b/zds/featured/forms.py\n@@ -52,7 +52,7 @@\n max_length=FeaturedResource._meta.get_field('image_url').max_length,\n widget=forms.TextInput(\n attrs={\n- 'placeholder': _(u'Lien vers l\\'url de l\\'image de la une.')\n+ 'placeholder': _(u'Lien vers l\\'url de l\\'image de la une (dimensions: 228x228).')\n }\n )\n )\n", "issue": "Derniers sujets sur la Home : la date sur mobile n'est pas optimis\u00e9e \nSur mobile on \u00e0 en g\u00e9n\u00e9ral pas beaucoup de place. Et il faudrait \u00e9viter d'afficher la date literralle pour optimiser la place. Cf screen (paysage).\n\n\n\n", "before_files": [{"content": "# coding: utf-8\nfrom crispy_forms.bootstrap import StrictButton\nfrom crispy_forms.helper import FormHelper\nfrom crispy_forms.layout import Layout, Field, ButtonHolder\nfrom django import forms\nfrom django.core.urlresolvers import reverse\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom zds.featured.models import FeaturedResource, FeaturedMessage\n\n\nclass FeaturedResourceForm(forms.ModelForm):\n class Meta:\n model = FeaturedResource\n\n fields = ['title', 'type', 'authors', 'image_url', 'url']\n\n title = forms.CharField(\n label=_(u'Titre'),\n max_length=FeaturedResource._meta.get_field('title').max_length,\n widget=forms.TextInput(\n attrs={\n 'required': 'required',\n }\n )\n )\n\n type = forms.CharField(\n label=_(u'Type'),\n max_length=FeaturedResource._meta.get_field('type').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'ex: Un projet, un article, un tutoriel...'),\n 'required': 'required',\n }\n )\n )\n\n authors = forms.CharField(\n label=_('Auteurs'),\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Les auteurs doivent \u00eatre s\u00e9par\u00e9s par une virgule.'),\n 'required': 'required',\n 'data-autocomplete': '{ \"type\": \"multiple\" }'\n }\n )\n )\n\n image_url = forms.CharField(\n label='Image URL',\n max_length=FeaturedResource._meta.get_field('image_url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url de l\\'image de la une.')\n }\n )\n )\n\n url = forms.CharField(\n label='URL',\n max_length=FeaturedResource._meta.get_field('url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url de la ressource.')\n }\n )\n )\n\n def __init__(self, *args, **kwargs):\n super(FeaturedResourceForm, self).__init__(*args, **kwargs)\n self.helper = FormHelper()\n self.helper.form_class = 'content-wrapper'\n self.helper.form_method = 'post'\n self.helper.form_action = reverse('featured-resource-create')\n\n self.helper.layout = Layout(\n Field('title'),\n Field('type'),\n Field('authors'),\n Field('image_url'),\n Field('url'),\n ButtonHolder(\n StrictButton(_(u'Enregistrer'), type='submit'),\n ),\n )\n\n\nclass FeaturedMessageForm(forms.ModelForm):\n class Meta:\n model = FeaturedMessage\n\n fields = ['message', 'url']\n\n message = forms.CharField(\n label=_(u'Message'),\n max_length=FeaturedMessage._meta.get_field('message').max_length,\n widget=forms.TextInput(\n attrs={\n 'required': 'required',\n }\n )\n )\n\n url = forms.CharField(\n label=_(u'URL'),\n max_length=FeaturedMessage._meta.get_field('url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url du message.'),\n 'required': 'required',\n }\n )\n )\n\n def __init__(self, *args, **kwargs):\n super(FeaturedMessageForm, self).__init__(*args, **kwargs)\n 
self.helper = FormHelper()\n self.helper.form_class = 'content-wrapper'\n self.helper.form_method = 'post'\n self.helper.form_action = reverse('featured-message-create')\n\n self.helper.layout = Layout(\n Field('message'),\n Field('url'),\n ButtonHolder(\n StrictButton(_(u'Enregistrer'), type='submit'),\n ),\n )\n", "path": "zds/featured/forms.py"}], "after_files": [{"content": "# coding: utf-8\nfrom crispy_forms.bootstrap import StrictButton\nfrom crispy_forms.helper import FormHelper\nfrom crispy_forms.layout import Layout, Field, ButtonHolder\nfrom django import forms\nfrom django.core.urlresolvers import reverse\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom zds.featured.models import FeaturedResource, FeaturedMessage\n\n\nclass FeaturedResourceForm(forms.ModelForm):\n class Meta:\n model = FeaturedResource\n\n fields = ['title', 'type', 'authors', 'image_url', 'url']\n\n title = forms.CharField(\n label=_(u'Titre'),\n max_length=FeaturedResource._meta.get_field('title').max_length,\n widget=forms.TextInput(\n attrs={\n 'required': 'required',\n }\n )\n )\n\n type = forms.CharField(\n label=_(u'Type'),\n max_length=FeaturedResource._meta.get_field('type').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'ex: Un projet, un article, un tutoriel...'),\n 'required': 'required',\n }\n )\n )\n\n authors = forms.CharField(\n label=_('Auteurs'),\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Les auteurs doivent \u00eatre s\u00e9par\u00e9s par une virgule.'),\n 'required': 'required',\n 'data-autocomplete': '{ \"type\": \"multiple\" }'\n }\n )\n )\n\n image_url = forms.CharField(\n label='Image URL',\n max_length=FeaturedResource._meta.get_field('image_url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url de l\\'image de la une (dimensions: 228x228).')\n }\n )\n )\n\n url = forms.CharField(\n label='URL',\n max_length=FeaturedResource._meta.get_field('url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url de la ressource.')\n }\n )\n )\n\n def __init__(self, *args, **kwargs):\n super(FeaturedResourceForm, self).__init__(*args, **kwargs)\n self.helper = FormHelper()\n self.helper.form_class = 'content-wrapper'\n self.helper.form_method = 'post'\n self.helper.form_action = reverse('featured-resource-create')\n\n self.helper.layout = Layout(\n Field('title'),\n Field('type'),\n Field('authors'),\n Field('image_url'),\n Field('url'),\n ButtonHolder(\n StrictButton(_(u'Enregistrer'), type='submit'),\n ),\n )\n\n\nclass FeaturedMessageForm(forms.ModelForm):\n class Meta:\n model = FeaturedMessage\n\n fields = ['message', 'url']\n\n message = forms.CharField(\n label=_(u'Message'),\n max_length=FeaturedMessage._meta.get_field('message').max_length,\n widget=forms.TextInput(\n attrs={\n 'required': 'required',\n }\n )\n )\n\n url = forms.CharField(\n label=_(u'URL'),\n max_length=FeaturedMessage._meta.get_field('url').max_length,\n widget=forms.TextInput(\n attrs={\n 'placeholder': _(u'Lien vers l\\'url du message.'),\n 'required': 'required',\n }\n )\n )\n\n def __init__(self, *args, **kwargs):\n super(FeaturedMessageForm, self).__init__(*args, **kwargs)\n self.helper = FormHelper()\n self.helper.form_class = 'content-wrapper'\n self.helper.form_method = 'post'\n self.helper.form_action = reverse('featured-message-create')\n\n self.helper.layout = Layout(\n Field('message'),\n Field('url'),\n ButtonHolder(\n StrictButton(_(u'Enregistrer'), type='submit'),\n ),\n )\n", "path": 
"zds/featured/forms.py"}]}
| 1,495 | 133 |
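
The golden diff for this record only changes a widget placeholder string, but widget attrs like this are easy to lock in with a regression test. Below is a minimal sketch, assuming a configured `DJANGO_SETTINGS_MODULE` for the zds project; the test name and assertion are illustrative and not part of the dataset record.

```python
# Hypothetical regression test for the placeholder change above.
# Assumes Django settings for the zds project are configured before import.
from zds.featured.forms import FeaturedResourceForm


def test_image_url_placeholder_mentions_dimensions():
    form = FeaturedResourceForm()
    placeholder = form.fields["image_url"].widget.attrs["placeholder"]
    # str() forces the lazy ugettext proxy returned by _() so we can search it.
    assert "228x228" in str(placeholder)
```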
gh_patches_debug_7194
|
rasdani/github-patches
|
git_diff
|
instadeepai__Mava-1041
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG]: Flasbax bug on LBF (type issue)
### Describe the bug
In the AgentID wrapper, the new_agents_view type is not enforced to be consistent, so for LBF with flashbax a dtype error emerges.
### To Reproduce
Steps to reproduce the behavior:
1. Run LBF with flashbax.
### Expected behavior
Expected the observation to be added easily to the buffer.
### Context (Environment)
- Updated jumanji
### Additional context
This is somewhat an exercise in opening an issue!
### Possible Solution
Will make a PR soon! Basically a cast to the agents_view type.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mava/wrappers/observation.py`
Content:
```
1 # Copyright 2022 InstaDeep Ltd. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from typing import Tuple, Union
16
17 import chex
18 import jax.numpy as jnp
19 from jumanji import specs
20 from jumanji.env import Environment
21 from jumanji.types import TimeStep
22 from jumanji.wrappers import Wrapper
23
24 from mava.types import Observation, ObservationGlobalState, State
25
26
27 class AgentIDWrapper(Wrapper):
28 """A wrapper to add a one-hot vector as agent IDs to the original observation.
29 It can be useful in multi-agent environments where agents require unique identification.
30 """
31
32 def __init__(self, env: Environment):
33 super().__init__(env)
34
35 def _add_agent_ids(
36 self, timestep: TimeStep, num_agents: int
37 ) -> Union[Observation, ObservationGlobalState]:
38 """Adds agent IDs to the observation."""
39 obs = timestep.observation
40 agent_ids = jnp.eye(num_agents)
41 agents_view = jnp.concatenate([agent_ids, obs.agents_view], axis=-1)
42
43 return obs._replace(agents_view=agents_view) # type: ignore
44
45 def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:
46 """Reset the environment."""
47 state, timestep = self._env.reset(key)
48 timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)
49
50 return state, timestep
51
52 def step(
53 self,
54 state: State,
55 action: chex.Array,
56 ) -> Tuple[State, TimeStep]:
57 """Step the environment."""
58 state, timestep = self._env.step(state, action)
59 timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)
60
61 return state, timestep
62
63 def observation_spec(
64 self,
65 ) -> Union[specs.Spec[Observation], specs.Spec[ObservationGlobalState]]:
66 """Specification of the observation of the selected environment."""
67 obs_spec = self._env.observation_spec()
68 num_obs_features = obs_spec.agents_view.shape[-1] + self._env.num_agents
69 dtype = obs_spec.agents_view.dtype
70 agents_view = specs.Array((self._env.num_agents, num_obs_features), dtype, "agents_view")
71
72 return obs_spec.replace(agents_view=agents_view)
73
74
75 class GlobalStateWrapper(Wrapper):
76 """Wrapper for adding global state to an environment that follows the mava API.
77
78 The wrapper includes a global environment state to be used by the centralised critic.
79 Note here that since most environments do not have a global state, we create one
80 by concatenating the observations of all agents.
81 """
82
83 def modify_timestep(self, timestep: TimeStep) -> TimeStep[ObservationGlobalState]:
84 global_state = jnp.concatenate(timestep.observation.agents_view, axis=0)
85 global_state = jnp.tile(global_state, (self._env.num_agents, 1))
86
87 observation = ObservationGlobalState(
88 global_state=global_state,
89 agents_view=timestep.observation.agents_view,
90 action_mask=timestep.observation.action_mask,
91 step_count=timestep.observation.step_count,
92 )
93
94 return timestep.replace(observation=observation)
95
96 def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:
97 """Reset the environment. Updates the step count."""
98 state, timestep = self._env.reset(key)
99 return state, self.modify_timestep(timestep)
100
101 def step(self, state: State, action: chex.Array) -> Tuple[State, TimeStep]:
102 """Step the environment. Updates the step count."""
103 state, timestep = self._env.step(state, action)
104 return state, self.modify_timestep(timestep)
105
106 def observation_spec(self) -> specs.Spec[ObservationGlobalState]:
107 """Specification of the observation of the selected environment."""
108
109 obs_spec = self._env.observation_spec()
110 num_obs_features = obs_spec.agents_view.shape[-1]
111 global_state = specs.Array(
112 (self._env.num_agents, self._env.num_agents * num_obs_features),
113 obs_spec.agents_view.dtype,
114 "global_state",
115 )
116
117 return specs.Spec(
118 ObservationGlobalState,
119 "ObservationSpec",
120 agents_view=obs_spec.agents_view,
121 action_mask=obs_spec.action_mask,
122 global_state=global_state,
123 step_count=obs_spec.step_count,
124 )
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mava/wrappers/observation.py b/mava/wrappers/observation.py
--- a/mava/wrappers/observation.py
+++ b/mava/wrappers/observation.py
@@ -38,7 +38,11 @@
"""Adds agent IDs to the observation."""
obs = timestep.observation
agent_ids = jnp.eye(num_agents)
- agents_view = jnp.concatenate([agent_ids, obs.agents_view], axis=-1)
+ agents_view = jnp.concatenate(
+ [agent_ids, obs.agents_view],
+ axis=-1,
+ dtype=obs.agents_view.dtype,
+ )
return obs._replace(agents_view=agents_view) # type: ignore
|
{"golden_diff": "diff --git a/mava/wrappers/observation.py b/mava/wrappers/observation.py\n--- a/mava/wrappers/observation.py\n+++ b/mava/wrappers/observation.py\n@@ -38,7 +38,11 @@\n \"\"\"Adds agent IDs to the observation.\"\"\"\n obs = timestep.observation\n agent_ids = jnp.eye(num_agents)\n- agents_view = jnp.concatenate([agent_ids, obs.agents_view], axis=-1)\n+ agents_view = jnp.concatenate(\n+ [agent_ids, obs.agents_view],\n+ axis=-1,\n+ dtype=obs.agents_view.dtype,\n+ )\n \n return obs._replace(agents_view=agents_view) # type: ignore\n", "issue": "[BUG]: Flasbax bug on LBF (type issue)\n### Describe the bug\r\nIn the AgentID wrapper, the new_agents_view type is not enforced to be consistent, so for LBF with flashbax a dtype error emerges.\r\n\r\n### To Reproduce\r\nSteps to reproduce the behavior:\r\n1. Run LBF with flashbax.\r\n\r\n### Expected behavior\r\nExpected the observation to be added easily to the buffer.\r\n\r\n### Context (Environment)\r\n - Updated jumanji\r\n\r\n### Additional context\r\nThis is somewhat an exercise in opening an issue!\r\n\r\n### Possible Solution\r\nWill make a PR soon! Basically a cast to the agents_view type.\r\n\n", "before_files": [{"content": "# Copyright 2022 InstaDeep Ltd. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Tuple, Union\n\nimport chex\nimport jax.numpy as jnp\nfrom jumanji import specs\nfrom jumanji.env import Environment\nfrom jumanji.types import TimeStep\nfrom jumanji.wrappers import Wrapper\n\nfrom mava.types import Observation, ObservationGlobalState, State\n\n\nclass AgentIDWrapper(Wrapper):\n \"\"\"A wrapper to add a one-hot vector as agent IDs to the original observation.\n It can be useful in multi-agent environments where agents require unique identification.\n \"\"\"\n\n def __init__(self, env: Environment):\n super().__init__(env)\n\n def _add_agent_ids(\n self, timestep: TimeStep, num_agents: int\n ) -> Union[Observation, ObservationGlobalState]:\n \"\"\"Adds agent IDs to the observation.\"\"\"\n obs = timestep.observation\n agent_ids = jnp.eye(num_agents)\n agents_view = jnp.concatenate([agent_ids, obs.agents_view], axis=-1)\n\n return obs._replace(agents_view=agents_view) # type: ignore\n\n def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:\n \"\"\"Reset the environment.\"\"\"\n state, timestep = self._env.reset(key)\n timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)\n\n return state, timestep\n\n def step(\n self,\n state: State,\n action: chex.Array,\n ) -> Tuple[State, TimeStep]:\n \"\"\"Step the environment.\"\"\"\n state, timestep = self._env.step(state, action)\n timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)\n\n return state, timestep\n\n def observation_spec(\n self,\n ) -> Union[specs.Spec[Observation], specs.Spec[ObservationGlobalState]]:\n \"\"\"Specification of the observation of the selected environment.\"\"\"\n obs_spec = self._env.observation_spec()\n num_obs_features = 
obs_spec.agents_view.shape[-1] + self._env.num_agents\n dtype = obs_spec.agents_view.dtype\n agents_view = specs.Array((self._env.num_agents, num_obs_features), dtype, \"agents_view\")\n\n return obs_spec.replace(agents_view=agents_view)\n\n\nclass GlobalStateWrapper(Wrapper):\n \"\"\"Wrapper for adding global state to an environment that follows the mava API.\n\n The wrapper includes a global environment state to be used by the centralised critic.\n Note here that since most environments do not have a global state, we create one\n by concatenating the observations of all agents.\n \"\"\"\n\n def modify_timestep(self, timestep: TimeStep) -> TimeStep[ObservationGlobalState]:\n global_state = jnp.concatenate(timestep.observation.agents_view, axis=0)\n global_state = jnp.tile(global_state, (self._env.num_agents, 1))\n\n observation = ObservationGlobalState(\n global_state=global_state,\n agents_view=timestep.observation.agents_view,\n action_mask=timestep.observation.action_mask,\n step_count=timestep.observation.step_count,\n )\n\n return timestep.replace(observation=observation)\n\n def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:\n \"\"\"Reset the environment. Updates the step count.\"\"\"\n state, timestep = self._env.reset(key)\n return state, self.modify_timestep(timestep)\n\n def step(self, state: State, action: chex.Array) -> Tuple[State, TimeStep]:\n \"\"\"Step the environment. Updates the step count.\"\"\"\n state, timestep = self._env.step(state, action)\n return state, self.modify_timestep(timestep)\n\n def observation_spec(self) -> specs.Spec[ObservationGlobalState]:\n \"\"\"Specification of the observation of the selected environment.\"\"\"\n\n obs_spec = self._env.observation_spec()\n num_obs_features = obs_spec.agents_view.shape[-1]\n global_state = specs.Array(\n (self._env.num_agents, self._env.num_agents * num_obs_features),\n obs_spec.agents_view.dtype,\n \"global_state\",\n )\n\n return specs.Spec(\n ObservationGlobalState,\n \"ObservationSpec\",\n agents_view=obs_spec.agents_view,\n action_mask=obs_spec.action_mask,\n global_state=global_state,\n step_count=obs_spec.step_count,\n )\n", "path": "mava/wrappers/observation.py"}], "after_files": [{"content": "# Copyright 2022 InstaDeep Ltd. 
All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Tuple, Union\n\nimport chex\nimport jax.numpy as jnp\nfrom jumanji import specs\nfrom jumanji.env import Environment\nfrom jumanji.types import TimeStep\nfrom jumanji.wrappers import Wrapper\n\nfrom mava.types import Observation, ObservationGlobalState, State\n\n\nclass AgentIDWrapper(Wrapper):\n \"\"\"A wrapper to add a one-hot vector as agent IDs to the original observation.\n It can be useful in multi-agent environments where agents require unique identification.\n \"\"\"\n\n def __init__(self, env: Environment):\n super().__init__(env)\n\n def _add_agent_ids(\n self, timestep: TimeStep, num_agents: int\n ) -> Union[Observation, ObservationGlobalState]:\n \"\"\"Adds agent IDs to the observation.\"\"\"\n obs = timestep.observation\n agent_ids = jnp.eye(num_agents)\n agents_view = jnp.concatenate(\n [agent_ids, obs.agents_view],\n axis=-1,\n dtype=obs.agents_view.dtype,\n )\n\n return obs._replace(agents_view=agents_view) # type: ignore\n\n def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:\n \"\"\"Reset the environment.\"\"\"\n state, timestep = self._env.reset(key)\n timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)\n\n return state, timestep\n\n def step(\n self,\n state: State,\n action: chex.Array,\n ) -> Tuple[State, TimeStep]:\n \"\"\"Step the environment.\"\"\"\n state, timestep = self._env.step(state, action)\n timestep.observation = self._add_agent_ids(timestep, self._env.num_agents)\n\n return state, timestep\n\n def observation_spec(\n self,\n ) -> Union[specs.Spec[Observation], specs.Spec[ObservationGlobalState]]:\n \"\"\"Specification of the observation of the selected environment.\"\"\"\n obs_spec = self._env.observation_spec()\n num_obs_features = obs_spec.agents_view.shape[-1] + self._env.num_agents\n dtype = obs_spec.agents_view.dtype\n agents_view = specs.Array((self._env.num_agents, num_obs_features), dtype, \"agents_view\")\n\n return obs_spec.replace(agents_view=agents_view)\n\n\nclass GlobalStateWrapper(Wrapper):\n \"\"\"Wrapper for adding global state to an environment that follows the mava API.\n\n The wrapper includes a global environment state to be used by the centralised critic.\n Note here that since most environments do not have a global state, we create one\n by concatenating the observations of all agents.\n \"\"\"\n\n def modify_timestep(self, timestep: TimeStep) -> TimeStep[ObservationGlobalState]:\n global_state = jnp.concatenate(timestep.observation.agents_view, axis=0)\n global_state = jnp.tile(global_state, (self._env.num_agents, 1))\n\n observation = ObservationGlobalState(\n global_state=global_state,\n agents_view=timestep.observation.agents_view,\n action_mask=timestep.observation.action_mask,\n step_count=timestep.observation.step_count,\n )\n\n return timestep.replace(observation=observation)\n\n def reset(self, key: chex.PRNGKey) -> Tuple[State, TimeStep]:\n \"\"\"Reset the environment. 
Updates the step count.\"\"\"\n state, timestep = self._env.reset(key)\n return state, self.modify_timestep(timestep)\n\n def step(self, state: State, action: chex.Array) -> Tuple[State, TimeStep]:\n \"\"\"Step the environment. Updates the step count.\"\"\"\n state, timestep = self._env.step(state, action)\n return state, self.modify_timestep(timestep)\n\n def observation_spec(self) -> specs.Spec[ObservationGlobalState]:\n \"\"\"Specification of the observation of the selected environment.\"\"\"\n\n obs_spec = self._env.observation_spec()\n num_obs_features = obs_spec.agents_view.shape[-1]\n global_state = specs.Array(\n (self._env.num_agents, self._env.num_agents * num_obs_features),\n obs_spec.agents_view.dtype,\n \"global_state\",\n )\n\n return specs.Spec(\n ObservationGlobalState,\n \"ObservationSpec\",\n agents_view=obs_spec.agents_view,\n action_mask=obs_spec.action_mask,\n global_state=global_state,\n step_count=obs_spec.step_count,\n )\n", "path": "mava/wrappers/observation.py"}]}
| 1,739 | 162 |
gh_patches_debug_62030
|
rasdani/github-patches
|
git_diff
|
fonttools__fonttools-2472
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[feaLib] "fonttools feaLib" should error out, not continue
If there's a parse/build error when using the feaLib command line tool, we currently do this:
https://github.com/fonttools/fonttools/blob/445108f735b22a5ca37f669808d47906d024fe24/Lib/fontTools/feaLib/__main__.py#L69-L73
i.e. we save the font anyway and exit with status code 0.
My Makefiles and I think this is a terrible idea, and I would like to change it. Any objections / thoughts?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Lib/fontTools/feaLib/__main__.py`
Content:
```
1 from fontTools.ttLib import TTFont
2 from fontTools.feaLib.builder import addOpenTypeFeatures, Builder
3 from fontTools.feaLib.error import FeatureLibError
4 from fontTools import configLogger
5 from fontTools.misc.cliTools import makeOutputFileName
6 import sys
7 import argparse
8 import logging
9
10
11 log = logging.getLogger("fontTools.feaLib")
12
13
14 def main(args=None):
15 """Add features from a feature file (.fea) into a OTF font"""
16 parser = argparse.ArgumentParser(
17 description="Use fontTools to compile OpenType feature files (*.fea)."
18 )
19 parser.add_argument(
20 "input_fea", metavar="FEATURES", help="Path to the feature file"
21 )
22 parser.add_argument(
23 "input_font", metavar="INPUT_FONT", help="Path to the input font"
24 )
25 parser.add_argument(
26 "-o",
27 "--output",
28 dest="output_font",
29 metavar="OUTPUT_FONT",
30 help="Path to the output font.",
31 )
32 parser.add_argument(
33 "-t",
34 "--tables",
35 metavar="TABLE_TAG",
36 choices=Builder.supportedTables,
37 nargs="+",
38 help="Specify the table(s) to be built.",
39 )
40 parser.add_argument(
41 "-d",
42 "--debug",
43 action="store_true",
44 help="Add source-level debugging information to font.",
45 )
46 parser.add_argument(
47 "-v",
48 "--verbose",
49 help="increase the logger verbosity. Multiple -v " "options are allowed.",
50 action="count",
51 default=0,
52 )
53 parser.add_argument(
54 "--traceback", help="show traceback for exceptions.", action="store_true"
55 )
56 options = parser.parse_args(args)
57
58 levels = ["WARNING", "INFO", "DEBUG"]
59 configLogger(level=levels[min(len(levels) - 1, options.verbose)])
60
61 output_font = options.output_font or makeOutputFileName(options.input_font)
62 log.info("Compiling features to '%s'" % (output_font))
63
64 font = TTFont(options.input_font)
65 try:
66 addOpenTypeFeatures(
67 font, options.input_fea, tables=options.tables, debug=options.debug
68 )
69 except FeatureLibError as e:
70 if options.traceback:
71 raise
72 log.error(e)
73 font.save(output_font)
74
75
76 if __name__ == "__main__":
77 sys.exit(main())
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/Lib/fontTools/feaLib/__main__.py b/Lib/fontTools/feaLib/__main__.py
--- a/Lib/fontTools/feaLib/__main__.py
+++ b/Lib/fontTools/feaLib/__main__.py
@@ -70,6 +70,7 @@
if options.traceback:
raise
log.error(e)
+ sys.exit(1)
font.save(output_font)
|
{"golden_diff": "diff --git a/Lib/fontTools/feaLib/__main__.py b/Lib/fontTools/feaLib/__main__.py\n--- a/Lib/fontTools/feaLib/__main__.py\n+++ b/Lib/fontTools/feaLib/__main__.py\n@@ -70,6 +70,7 @@\n if options.traceback:\n raise\n log.error(e)\n+ sys.exit(1)\n font.save(output_font)\n", "issue": "[feaLib] \"fonttools feaLib\" should error out, not continue\nIf there's a parse/build error when using the feaLib command line tool, we currently do this:\r\n\r\nhttps://github.com/fonttools/fonttools/blob/445108f735b22a5ca37f669808d47906d024fe24/Lib/fontTools/feaLib/__main__.py#L69-L73\r\n\r\ni.e. we save the font anyway and exit with status code 0.\r\n\r\nMy Makefiles and I think this is a terrible idea, and I would like to change it. Any objections / thoughts?\r\n\r\n\n", "before_files": [{"content": "from fontTools.ttLib import TTFont\nfrom fontTools.feaLib.builder import addOpenTypeFeatures, Builder\nfrom fontTools.feaLib.error import FeatureLibError\nfrom fontTools import configLogger\nfrom fontTools.misc.cliTools import makeOutputFileName\nimport sys\nimport argparse\nimport logging\n\n\nlog = logging.getLogger(\"fontTools.feaLib\")\n\n\ndef main(args=None):\n \"\"\"Add features from a feature file (.fea) into a OTF font\"\"\"\n parser = argparse.ArgumentParser(\n description=\"Use fontTools to compile OpenType feature files (*.fea).\"\n )\n parser.add_argument(\n \"input_fea\", metavar=\"FEATURES\", help=\"Path to the feature file\"\n )\n parser.add_argument(\n \"input_font\", metavar=\"INPUT_FONT\", help=\"Path to the input font\"\n )\n parser.add_argument(\n \"-o\",\n \"--output\",\n dest=\"output_font\",\n metavar=\"OUTPUT_FONT\",\n help=\"Path to the output font.\",\n )\n parser.add_argument(\n \"-t\",\n \"--tables\",\n metavar=\"TABLE_TAG\",\n choices=Builder.supportedTables,\n nargs=\"+\",\n help=\"Specify the table(s) to be built.\",\n )\n parser.add_argument(\n \"-d\",\n \"--debug\",\n action=\"store_true\",\n help=\"Add source-level debugging information to font.\",\n )\n parser.add_argument(\n \"-v\",\n \"--verbose\",\n help=\"increase the logger verbosity. 
Multiple -v \" \"options are allowed.\",\n action=\"count\",\n default=0,\n )\n parser.add_argument(\n \"--traceback\", help=\"show traceback for exceptions.\", action=\"store_true\"\n )\n options = parser.parse_args(args)\n\n levels = [\"WARNING\", \"INFO\", \"DEBUG\"]\n configLogger(level=levels[min(len(levels) - 1, options.verbose)])\n\n output_font = options.output_font or makeOutputFileName(options.input_font)\n log.info(\"Compiling features to '%s'\" % (output_font))\n\n font = TTFont(options.input_font)\n try:\n addOpenTypeFeatures(\n font, options.input_fea, tables=options.tables, debug=options.debug\n )\n except FeatureLibError as e:\n if options.traceback:\n raise\n log.error(e)\n font.save(output_font)\n\n\nif __name__ == \"__main__\":\n sys.exit(main())\n", "path": "Lib/fontTools/feaLib/__main__.py"}], "after_files": [{"content": "from fontTools.ttLib import TTFont\nfrom fontTools.feaLib.builder import addOpenTypeFeatures, Builder\nfrom fontTools.feaLib.error import FeatureLibError\nfrom fontTools import configLogger\nfrom fontTools.misc.cliTools import makeOutputFileName\nimport sys\nimport argparse\nimport logging\n\n\nlog = logging.getLogger(\"fontTools.feaLib\")\n\n\ndef main(args=None):\n \"\"\"Add features from a feature file (.fea) into a OTF font\"\"\"\n parser = argparse.ArgumentParser(\n description=\"Use fontTools to compile OpenType feature files (*.fea).\"\n )\n parser.add_argument(\n \"input_fea\", metavar=\"FEATURES\", help=\"Path to the feature file\"\n )\n parser.add_argument(\n \"input_font\", metavar=\"INPUT_FONT\", help=\"Path to the input font\"\n )\n parser.add_argument(\n \"-o\",\n \"--output\",\n dest=\"output_font\",\n metavar=\"OUTPUT_FONT\",\n help=\"Path to the output font.\",\n )\n parser.add_argument(\n \"-t\",\n \"--tables\",\n metavar=\"TABLE_TAG\",\n choices=Builder.supportedTables,\n nargs=\"+\",\n help=\"Specify the table(s) to be built.\",\n )\n parser.add_argument(\n \"-d\",\n \"--debug\",\n action=\"store_true\",\n help=\"Add source-level debugging information to font.\",\n )\n parser.add_argument(\n \"-v\",\n \"--verbose\",\n help=\"increase the logger verbosity. Multiple -v \" \"options are allowed.\",\n action=\"count\",\n default=0,\n )\n parser.add_argument(\n \"--traceback\", help=\"show traceback for exceptions.\", action=\"store_true\"\n )\n options = parser.parse_args(args)\n\n levels = [\"WARNING\", \"INFO\", \"DEBUG\"]\n configLogger(level=levels[min(len(levels) - 1, options.verbose)])\n\n output_font = options.output_font or makeOutputFileName(options.input_font)\n log.info(\"Compiling features to '%s'\" % (output_font))\n\n font = TTFont(options.input_font)\n try:\n addOpenTypeFeatures(\n font, options.input_fea, tables=options.tables, debug=options.debug\n )\n except FeatureLibError as e:\n if options.traceback:\n raise\n log.error(e)\n sys.exit(1)\n font.save(output_font)\n\n\nif __name__ == \"__main__\":\n sys.exit(main())\n", "path": "Lib/fontTools/feaLib/__main__.py"}]}
| 1,065 | 95 |
gh_patches_debug_43633
|
rasdani/github-patches
|
git_diff
|
bids-standard__pybids-17
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add get_fieldmap() method
Option 1 (simple):
```
>> layout = BIDSLayout("/data/ds")
>> layout.get_fieldmap("sub-01/func/sub-01_task-rest_bold.nii.gz")
[
"/data/ds/sub-01/fmap/sub-01_magnitude1.nii.gz",
"/data/ds/sub-01/fmap/sub-01_magnitude2.nii.gz",
"/data/ds/sub-01/fmap/sub-01_phasediff.nii.gz"
]
```
Option 2 (extensive):
```
>> layout = BIDSLayout("/data/ds")
>> layout.get_fieldmap("sub-01/func/sub-01_task-rest_bold.nii.gz")
{
'type': 'phasediff',
'metadata': {
"EchoTime1" : 0.00600,
"EchoTime2" : 0.00746,
"IntendedFor" : "func/sub01_taskmotor_bold.nii.gz"
},
'data' : [
"/data/ds/sub-01/fmap/sub-01_magnitude1.nii.gz",
"/data/ds/sub-01/fmap/sub-01_magnitude2.nii.gz",
"/data/ds/sub-01/fmap/sub-01_phasediff.nii.gz"
]
}
```
WDYT @rwblair @oesteban @yarikoptic @tyarkoni @satra?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bids/grabbids/bids_layout.py`
Content:
```
1 import os
2 import re
3 import json
4
5 from itertools import combinations
6 from os.path import dirname
7 from os.path import realpath
8 from os.path import join as pathjoin
9 from os.path import split as pathsplit
10
11 from grabbit import Layout
12
13 __all__ = ['BIDSLayout']
14
15
16 class BIDSLayout(Layout):
17 def __init__(self, path, config=None):
18 if config is None:
19 root = dirname(realpath(__file__))
20 config = pathjoin(root, 'config', 'bids.json')
21 super(BIDSLayout, self).__init__(path, config)
22
23 def get_metadata(self, path):
24 sidecarJSON = path.replace(".nii.gz", ".json").replace(".nii", ".json")
25 path_components = pathsplit(sidecarJSON)
26 filename_components = path_components[-1].split("_")
27 ses = None
28 suffix = filename_components[-1]
29
30 sub = filename_components[0]
31 keyword_components = filename_components[1:-1]
32 if filename_components[1][:3] == "ses":
33 ses = filename_components[1]
34 keyword_components = filename_components[2:-1]
35
36 potentialJSONs = []
37 for prefixes, conditional in ( # Levels
38 (tuple(), True), # top
39 ((sub,), True), # subject
40 ((sub, ses), ses) # session
41 ):
42 if not conditional:
43 continue
44 for k in range(len(keyword_components) + 1):
45 # print(k)
46 for components in combinations(keyword_components, k):
47 # print(components)
48 potentialJSONs.append(
49 pathjoin(
50 self.root,
51 *(prefixes +
52 ("_".join(prefixes + components + (suffix,)),))))
53
54 merged_param_dict = {}
55 for json_file_path in potentialJSONs:
56 if os.path.exists(json_file_path):
57 param_dict = json.load(open(json_file_path, "r"))
58 merged_param_dict.update(param_dict)
59
60 return merged_param_dict
61
62 def find_match(self, target, source=None):
63
64 # Try to take the easy way out
65 if source is not None:
66 _target = source.split('.')[0] + '.' + target
67 if os.path.exists(_target):
68 return target
69
70 if target in list(self.entities.keys()):
71 candidates = list(self.entities[target].files.keys())
72 else:
73 candidates = []
74
75 for root, directories, filenames in os.walk(self.root):
76 for f in filenames:
77 if re.search(target + '$', f):
78 if os.path.sep == "\\":
79 f = f.replace("\\", "\\\\")
80 candidates.append(f)
81
82 if source is None:
83 return candidates
84
85 # Walk up the file hierarchy from source, find first match
86 if not os.path.exists(source):
87 raise OSError("The file '%s' doesn't exist." % source)
88 elif not source.startswith(self.root):
89 raise ValueError("The file '%s' is not contained "
90 "within the current project "
91 "directory (%s)." % (source, self.root))
92 rel = os.path.relpath(dirname(source), self.root)
93 sep = os.path.sep
94 chunks = rel.split(sep)
95 n_chunks = len(chunks)
96 for i in range(n_chunks, -1, -1):
97 path = pathjoin(self.root, *chunks[:i])
98 patt = path + '\%s[^\%s]+$' % (sep, sep)
99 if sep == "\\":
100 patt = path + '\\[^\\]+$'
101 patt = patt.replace("\\", "\\\\")
102 matches = [x for x in candidates if re.search(patt, x)]
103 if matches:
104 if len(matches) == 1:
105 return matches[0]
106 else:
107 raise ValueError("Ambiguous target: more than one "
108 "candidate file found in "
109 "directory '%s'." % path)
110 return None
111
```
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 import os
4 from setuptools import setup, find_packages
5 PACKAGES = find_packages()
6
7 # Get version and release info, which is all stored in bids/version.py
8 ver_file = os.path.join('bids', 'version.py')
9 with open(ver_file) as f:
10 exec(f.read())
11
12 opts = dict(name=NAME,
13 maintainer=MAINTAINER,
14 maintainer_email=MAINTAINER_EMAIL,
15 description=DESCRIPTION,
16 long_description=LONG_DESCRIPTION,
17 url=URL,
18 download_url=DOWNLOAD_URL,
19 license=LICENSE,
20 classifiers=CLASSIFIERS,
21 author=AUTHOR,
22 author_email=AUTHOR_EMAIL,
23 platforms=PLATFORMS,
24 version=VERSION,
25 packages=PACKAGES,
26 requires=REQUIRES,
27 install_requires=REQUIRES)
28
29
30 if __name__ == '__main__':
31 setup(**opts)
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bids/grabbids/bids_layout.py b/bids/grabbids/bids_layout.py
--- a/bids/grabbids/bids_layout.py
+++ b/bids/grabbids/bids_layout.py
@@ -18,7 +18,7 @@
if config is None:
root = dirname(realpath(__file__))
config = pathjoin(root, 'config', 'bids.json')
- super(BIDSLayout, self).__init__(path, config)
+ super(BIDSLayout, self).__init__(path, config, dynamic_getters=True)
def get_metadata(self, path):
sidecarJSON = path.replace(".nii.gz", ".json").replace(".nii", ".json")
@@ -34,21 +34,21 @@
keyword_components = filename_components[2:-1]
potentialJSONs = []
- for prefixes, conditional in ( # Levels
- (tuple(), True), # top
- ((sub,), True), # subject
- ((sub, ses), ses) # session
+ for prefixes, midlayer, conditional in ( # Levels
+ (tuple(), tuple(), True), # top
+ ((sub,), tuple(), True), # subject
+ ((sub, ), (pathsplit(path_components[-2])[-1],), True),
+ ((sub, ses), tuple(), ses), # session
+ ((sub, ses), (pathsplit(path_components[-2])[-1],), ses)
):
if not conditional:
continue
for k in range(len(keyword_components) + 1):
- # print(k)
for components in combinations(keyword_components, k):
- # print(components)
potentialJSONs.append(
pathjoin(
self.root,
- *(prefixes +
+ *(prefixes + midlayer +
("_".join(prefixes + components + (suffix,)),))))
merged_param_dict = {}
@@ -59,6 +59,45 @@
return merged_param_dict
+ def get_fieldmap(self, path):
+ sub = os.path.split(path)[1].split("_")[0].split("sub-")[1]
+ fieldmap_set = {}
+ for file in self.get(subject=sub,
+ type='(phase1|phase2|phasediff|epi|fieldmap)',
+ extensions=['nii.gz', 'nii']):
+ metadata = self.get_metadata(file.filename)
+ if metadata and "IntendedFor" in metadata.keys():
+ if path.endswith(metadata["IntendedFor"]):
+ if file.type == "phasediff":
+ fieldmap_set = {"phasediff": file.filename,
+ "magnitude1": file.filename.replace(
+ "phasediff", "magnitude1"),
+ "magnitude2": file.filename.replace(
+ "phasediff", "magnitude2"),
+ "type": "phasediff"}
+ break
+ elif file.type == "phase1":
+ fieldmap_set["phase1"] = file.filename
+ fieldmap_set["magnitude1"] = \
+ file.filename.replace("phase1", "magnitude1")
+ fieldmap_set["type"] = "phase"
+ elif file.type == "phase2":
+ fieldmap_set["phase2"] = file.filename
+ fieldmap_set["magnitude2"] = \
+ file.filename.replace("phase2", "magnitude2")
+ fieldmap_set["type"] = "phase"
+ elif file.type == "epi":
+ if "epi" not in fieldmap_set.keys():
+ fieldmap_set["epi"] = []
+ fieldmap_set["epi"].append(file.filename)
+ fieldmap_set["type"] = "epi"
+ elif file.type == "fieldmap":
+ fieldmap_set["fieldmap"] = file.filename
+ fieldmap_set["magnitude"] = \
+ file.filename.replace("fieldmap", "magnitude")
+ fieldmap_set["type"] = "fieldmap"
+ return fieldmap_set
+
def find_match(self, target, source=None):
# Try to take the easy way out
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -23,6 +23,7 @@
platforms=PLATFORMS,
version=VERSION,
packages=PACKAGES,
+ package_data={'bids': ['grabbids/config/bids.json']},
requires=REQUIRES,
install_requires=REQUIRES)
|
{"golden_diff": "diff --git a/bids/grabbids/bids_layout.py b/bids/grabbids/bids_layout.py\n--- a/bids/grabbids/bids_layout.py\n+++ b/bids/grabbids/bids_layout.py\n@@ -18,7 +18,7 @@\n if config is None:\n root = dirname(realpath(__file__))\n config = pathjoin(root, 'config', 'bids.json')\n- super(BIDSLayout, self).__init__(path, config)\n+ super(BIDSLayout, self).__init__(path, config, dynamic_getters=True)\n \n def get_metadata(self, path):\n sidecarJSON = path.replace(\".nii.gz\", \".json\").replace(\".nii\", \".json\")\n@@ -34,21 +34,21 @@\n keyword_components = filename_components[2:-1]\n \n potentialJSONs = []\n- for prefixes, conditional in ( # Levels\n- (tuple(), True), # top\n- ((sub,), True), # subject\n- ((sub, ses), ses) # session\n+ for prefixes, midlayer, conditional in ( # Levels\n+ (tuple(), tuple(), True), # top\n+ ((sub,), tuple(), True), # subject\n+ ((sub, ), (pathsplit(path_components[-2])[-1],), True),\n+ ((sub, ses), tuple(), ses), # session\n+ ((sub, ses), (pathsplit(path_components[-2])[-1],), ses)\n ):\n if not conditional:\n continue\n for k in range(len(keyword_components) + 1):\n- # print(k)\n for components in combinations(keyword_components, k):\n- # print(components)\n potentialJSONs.append(\n pathjoin(\n self.root,\n- *(prefixes +\n+ *(prefixes + midlayer +\n (\"_\".join(prefixes + components + (suffix,)),))))\n \n merged_param_dict = {}\n@@ -59,6 +59,45 @@\n \n return merged_param_dict\n \n+ def get_fieldmap(self, path):\n+ sub = os.path.split(path)[1].split(\"_\")[0].split(\"sub-\")[1]\n+ fieldmap_set = {}\n+ for file in self.get(subject=sub,\n+ type='(phase1|phase2|phasediff|epi|fieldmap)',\n+ extensions=['nii.gz', 'nii']):\n+ metadata = self.get_metadata(file.filename)\n+ if metadata and \"IntendedFor\" in metadata.keys():\n+ if path.endswith(metadata[\"IntendedFor\"]):\n+ if file.type == \"phasediff\":\n+ fieldmap_set = {\"phasediff\": file.filename,\n+ \"magnitude1\": file.filename.replace(\n+ \"phasediff\", \"magnitude1\"),\n+ \"magnitude2\": file.filename.replace(\n+ \"phasediff\", \"magnitude2\"),\n+ \"type\": \"phasediff\"}\n+ break\n+ elif file.type == \"phase1\":\n+ fieldmap_set[\"phase1\"] = file.filename\n+ fieldmap_set[\"magnitude1\"] = \\\n+ file.filename.replace(\"phase1\", \"magnitude1\")\n+ fieldmap_set[\"type\"] = \"phase\"\n+ elif file.type == \"phase2\":\n+ fieldmap_set[\"phase2\"] = file.filename\n+ fieldmap_set[\"magnitude2\"] = \\\n+ file.filename.replace(\"phase2\", \"magnitude2\")\n+ fieldmap_set[\"type\"] = \"phase\"\n+ elif file.type == \"epi\":\n+ if \"epi\" not in fieldmap_set.keys():\n+ fieldmap_set[\"epi\"] = []\n+ fieldmap_set[\"epi\"].append(file.filename)\n+ fieldmap_set[\"type\"] = \"epi\"\n+ elif file.type == \"fieldmap\":\n+ fieldmap_set[\"fieldmap\"] = file.filename\n+ fieldmap_set[\"magnitude\"] = \\\n+ file.filename.replace(\"fieldmap\", \"magnitude\")\n+ fieldmap_set[\"type\"] = \"fieldmap\"\n+ return fieldmap_set\n+\n def find_match(self, target, source=None):\n \n # Try to take the easy way out\ndiff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -23,6 +23,7 @@\n platforms=PLATFORMS,\n version=VERSION,\n packages=PACKAGES,\n+ package_data={'bids': ['grabbids/config/bids.json']},\n requires=REQUIRES,\n install_requires=REQUIRES)\n", "issue": "Add get_fieldmap() method\nOption 1 (simple):\n\n```\n>> layout = BIDSLayout(\"/data/ds\")\n>> layout.get_fieldmap(\"sub-01/func/sub-01_task-rest_bold.nii.gz\")\n[\n \"/data/ds/sub-01/fmap/sub-01_magnitude1.nii.gz\", \n 
\"/data/ds/sub-01/fmap/sub-01_magnitude2.nii.gz\", \n \"/data/ds/sub-01/fmap/sub-01_phasediff.nii.gz\"\n]\n```\n\nOption 2 (extensive):\n\n```\n>> layout = BIDSLayout(\"/data/ds\")\n>> layout.get_fieldmap(\"sub-01/func/sub-01_task-rest_bold.nii.gz\")\n{\n'type': 'phasediff',\n'metadata': {\n \"EchoTime1\" : 0.00600,\n \"EchoTime2\" : 0.00746,\n \"IntendedFor\" : \"func/sub\u00ad01_task\u00admotor_bold.nii.gz\" \n },\n'data' : [\n \"/data/ds/sub-01/fmap/sub-01_magnitude1.nii.gz\", \n \"/data/ds/sub-01/fmap/sub-01_magnitude2.nii.gz\", \n \"/data/ds/sub-01/fmap/sub-01_phasediff.nii.gz\"\n ]\n}\n```\n\nWDYT @rwblair @oesteban @yarikoptic @tyarkoni @satra?\n\n", "before_files": [{"content": "import os\nimport re\nimport json\n\nfrom itertools import combinations\nfrom os.path import dirname\nfrom os.path import realpath\nfrom os.path import join as pathjoin\nfrom os.path import split as pathsplit\n\nfrom grabbit import Layout\n\n__all__ = ['BIDSLayout']\n\n\nclass BIDSLayout(Layout):\n def __init__(self, path, config=None):\n if config is None:\n root = dirname(realpath(__file__))\n config = pathjoin(root, 'config', 'bids.json')\n super(BIDSLayout, self).__init__(path, config)\n\n def get_metadata(self, path):\n sidecarJSON = path.replace(\".nii.gz\", \".json\").replace(\".nii\", \".json\")\n path_components = pathsplit(sidecarJSON)\n filename_components = path_components[-1].split(\"_\")\n ses = None\n suffix = filename_components[-1]\n\n sub = filename_components[0]\n keyword_components = filename_components[1:-1]\n if filename_components[1][:3] == \"ses\":\n ses = filename_components[1]\n keyword_components = filename_components[2:-1]\n\n potentialJSONs = []\n for prefixes, conditional in ( # Levels\n (tuple(), True), # top\n ((sub,), True), # subject\n ((sub, ses), ses) # session\n ):\n if not conditional:\n continue\n for k in range(len(keyword_components) + 1):\n # print(k)\n for components in combinations(keyword_components, k):\n # print(components)\n potentialJSONs.append(\n pathjoin(\n self.root,\n *(prefixes +\n (\"_\".join(prefixes + components + (suffix,)),))))\n\n merged_param_dict = {}\n for json_file_path in potentialJSONs:\n if os.path.exists(json_file_path):\n param_dict = json.load(open(json_file_path, \"r\"))\n merged_param_dict.update(param_dict)\n\n return merged_param_dict\n\n def find_match(self, target, source=None):\n\n # Try to take the easy way out\n if source is not None:\n _target = source.split('.')[0] + '.' 
+ target\n if os.path.exists(_target):\n return target\n\n if target in list(self.entities.keys()):\n candidates = list(self.entities[target].files.keys())\n else:\n candidates = []\n\n for root, directories, filenames in os.walk(self.root):\n for f in filenames:\n if re.search(target + '$', f):\n if os.path.sep == \"\\\\\":\n f = f.replace(\"\\\\\", \"\\\\\\\\\")\n candidates.append(f)\n\n if source is None:\n return candidates\n\n # Walk up the file hierarchy from source, find first match\n if not os.path.exists(source):\n raise OSError(\"The file '%s' doesn't exist.\" % source)\n elif not source.startswith(self.root):\n raise ValueError(\"The file '%s' is not contained \"\n \"within the current project \"\n \"directory (%s).\" % (source, self.root))\n rel = os.path.relpath(dirname(source), self.root)\n sep = os.path.sep\n chunks = rel.split(sep)\n n_chunks = len(chunks)\n for i in range(n_chunks, -1, -1):\n path = pathjoin(self.root, *chunks[:i])\n patt = path + '\\%s[^\\%s]+$' % (sep, sep)\n if sep == \"\\\\\":\n patt = path + '\\\\[^\\\\]+$'\n patt = patt.replace(\"\\\\\", \"\\\\\\\\\")\n matches = [x for x in candidates if re.search(patt, x)]\n if matches:\n if len(matches) == 1:\n return matches[0]\n else:\n raise ValueError(\"Ambiguous target: more than one \"\n \"candidate file found in \"\n \"directory '%s'.\" % path)\n return None\n", "path": "bids/grabbids/bids_layout.py"}, {"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup, find_packages\nPACKAGES = find_packages()\n\n# Get version and release info, which is all stored in bids/version.py\nver_file = os.path.join('bids', 'version.py')\nwith open(ver_file) as f:\n exec(f.read())\n\nopts = dict(name=NAME,\n maintainer=MAINTAINER,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=URL,\n download_url=DOWNLOAD_URL,\n license=LICENSE,\n classifiers=CLASSIFIERS,\n author=AUTHOR,\n author_email=AUTHOR_EMAIL,\n platforms=PLATFORMS,\n version=VERSION,\n packages=PACKAGES,\n requires=REQUIRES,\n install_requires=REQUIRES)\n\n\nif __name__ == '__main__':\n setup(**opts)\n", "path": "setup.py"}], "after_files": [{"content": "import os\nimport re\nimport json\n\nfrom itertools import combinations\nfrom os.path import dirname\nfrom os.path import realpath\nfrom os.path import join as pathjoin\nfrom os.path import split as pathsplit\n\nfrom grabbit import Layout\n\n__all__ = ['BIDSLayout']\n\n\nclass BIDSLayout(Layout):\n def __init__(self, path, config=None):\n if config is None:\n root = dirname(realpath(__file__))\n config = pathjoin(root, 'config', 'bids.json')\n super(BIDSLayout, self).__init__(path, config, dynamic_getters=True)\n\n def get_metadata(self, path):\n sidecarJSON = path.replace(\".nii.gz\", \".json\").replace(\".nii\", \".json\")\n path_components = pathsplit(sidecarJSON)\n filename_components = path_components[-1].split(\"_\")\n ses = None\n suffix = filename_components[-1]\n\n sub = filename_components[0]\n keyword_components = filename_components[1:-1]\n if filename_components[1][:3] == \"ses\":\n ses = filename_components[1]\n keyword_components = filename_components[2:-1]\n\n potentialJSONs = []\n for prefixes, midlayer, conditional in ( # Levels\n (tuple(), tuple(), True), # top\n ((sub,), tuple(), True), # subject\n ((sub, ), (pathsplit(path_components[-2])[-1],), True),\n ((sub, ses), tuple(), ses), # session\n ((sub, ses), (pathsplit(path_components[-2])[-1],), ses)\n ):\n if not conditional:\n continue\n for k in 
range(len(keyword_components) + 1):\n for components in combinations(keyword_components, k):\n potentialJSONs.append(\n pathjoin(\n self.root,\n *(prefixes + midlayer +\n (\"_\".join(prefixes + components + (suffix,)),))))\n\n merged_param_dict = {}\n for json_file_path in potentialJSONs:\n if os.path.exists(json_file_path):\n param_dict = json.load(open(json_file_path, \"r\"))\n merged_param_dict.update(param_dict)\n\n return merged_param_dict\n\n def get_fieldmap(self, path):\n sub = os.path.split(path)[1].split(\"_\")[0].split(\"sub-\")[1]\n fieldmap_set = {}\n for file in self.get(subject=sub,\n type='(phase1|phase2|phasediff|epi|fieldmap)',\n extensions=['nii.gz', 'nii']):\n metadata = self.get_metadata(file.filename)\n if metadata and \"IntendedFor\" in metadata.keys():\n if path.endswith(metadata[\"IntendedFor\"]):\n if file.type == \"phasediff\":\n fieldmap_set = {\"phasediff\": file.filename,\n \"magnitude1\": file.filename.replace(\n \"phasediff\", \"magnitude1\"),\n \"magnitude2\": file.filename.replace(\n \"phasediff\", \"magnitude2\"),\n \"type\": \"phasediff\"}\n break\n elif file.type == \"phase1\":\n fieldmap_set[\"phase1\"] = file.filename\n fieldmap_set[\"magnitude1\"] = \\\n file.filename.replace(\"phase1\", \"magnitude1\")\n fieldmap_set[\"type\"] = \"phase\"\n elif file.type == \"phase2\":\n fieldmap_set[\"phase2\"] = file.filename\n fieldmap_set[\"magnitude2\"] = \\\n file.filename.replace(\"phase2\", \"magnitude2\")\n fieldmap_set[\"type\"] = \"phase\"\n elif file.type == \"epi\":\n if \"epi\" not in fieldmap_set.keys():\n fieldmap_set[\"epi\"] = []\n fieldmap_set[\"epi\"].append(file.filename)\n fieldmap_set[\"type\"] = \"epi\"\n elif file.type == \"fieldmap\":\n fieldmap_set[\"fieldmap\"] = file.filename\n fieldmap_set[\"magnitude\"] = \\\n file.filename.replace(\"fieldmap\", \"magnitude\")\n fieldmap_set[\"type\"] = \"fieldmap\"\n return fieldmap_set\n\n def find_match(self, target, source=None):\n\n # Try to take the easy way out\n if source is not None:\n _target = source.split('.')[0] + '.' 
+ target\n if os.path.exists(_target):\n return target\n\n if target in list(self.entities.keys()):\n candidates = list(self.entities[target].files.keys())\n else:\n candidates = []\n\n for root, directories, filenames in os.walk(self.root):\n for f in filenames:\n if re.search(target + '$', f):\n if os.path.sep == \"\\\\\":\n f = f.replace(\"\\\\\", \"\\\\\\\\\")\n candidates.append(f)\n\n if source is None:\n return candidates\n\n # Walk up the file hierarchy from source, find first match\n if not os.path.exists(source):\n raise OSError(\"The file '%s' doesn't exist.\" % source)\n elif not source.startswith(self.root):\n raise ValueError(\"The file '%s' is not contained \"\n \"within the current project \"\n \"directory (%s).\" % (source, self.root))\n rel = os.path.relpath(dirname(source), self.root)\n sep = os.path.sep\n chunks = rel.split(sep)\n n_chunks = len(chunks)\n for i in range(n_chunks, -1, -1):\n path = pathjoin(self.root, *chunks[:i])\n patt = path + '\\%s[^\\%s]+$' % (sep, sep)\n if sep == \"\\\\\":\n patt = path + '\\\\[^\\\\]+$'\n patt = patt.replace(\"\\\\\", \"\\\\\\\\\")\n matches = [x for x in candidates if re.search(patt, x)]\n if matches:\n if len(matches) == 1:\n return matches[0]\n else:\n raise ValueError(\"Ambiguous target: more than one \"\n \"candidate file found in \"\n \"directory '%s'.\" % path)\n return None\n", "path": "bids/grabbids/bids_layout.py"}, {"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup, find_packages\nPACKAGES = find_packages()\n\n# Get version and release info, which is all stored in bids/version.py\nver_file = os.path.join('bids', 'version.py')\nwith open(ver_file) as f:\n exec(f.read())\n\nopts = dict(name=NAME,\n maintainer=MAINTAINER,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=URL,\n download_url=DOWNLOAD_URL,\n license=LICENSE,\n classifiers=CLASSIFIERS,\n author=AUTHOR,\n author_email=AUTHOR_EMAIL,\n platforms=PLATFORMS,\n version=VERSION,\n packages=PACKAGES,\n package_data={'bids': ['grabbids/config/bids.json']},\n requires=REQUIRES,\n install_requires=REQUIRES)\n\n\nif __name__ == '__main__':\n setup(**opts)\n", "path": "setup.py"}]}
| 1,904 | 995 |
gh_patches_debug_51
|
rasdani/github-patches
|
git_diff
|
magenta__magenta-1254
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pip installation fails due to librosa dependency
Hi,
I'm trying to install magenta-gpu, but when I ran `pip install magenta-gpu` I got:
**librosa 0.6.2 has requirement joblib>=0.12, but you'll have joblib 0.11 which is incompatible.**
--- END ISSUE ---
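To make the failure concrete before looking at the files, the conflict can be reproduced in isolation. The following is a minimal sketch, assuming only that librosa 0.6.2 and joblib 0.11 are installed side by side; it is not taken from the repository.
```python
# Minimal sketch reproducing the resolver complaint quoted above:
# librosa 0.6.2 declares joblib>=0.12, so an environment pinned to
# joblib 0.11 cannot satisfy it. Assumes both packages are installed.
import pkg_resources

try:
    # Resolves librosa and its transitive requirements against the
    # installed working set; raises VersionConflict on joblib 0.11.
    pkg_resources.require("librosa==0.6.2")
    print("no conflict detected")
except pkg_resources.VersionConflict as err:
    print("dependency conflict:", err)
```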
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `magenta/version.py`
Content:
```
1 # Copyright 2016 Google Inc. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 r"""Separate file for storing the current version of Magenta.
15
16 Stored in a separate file so that setup.py can reference the version without
17 pulling in all the dependencies in __init__.py.
18 """
19
20 __version__ = '0.3.10'
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/magenta/version.py b/magenta/version.py
--- a/magenta/version.py
+++ b/magenta/version.py
@@ -17,4 +17,4 @@
pulling in all the dependencies in __init__.py.
"""
-__version__ = '0.3.10'
+__version__ = '0.3.11'
|
{"golden_diff": "diff --git a/magenta/version.py b/magenta/version.py\n--- a/magenta/version.py\n+++ b/magenta/version.py\n@@ -17,4 +17,4 @@\n pulling in all the dependencies in __init__.py.\n \"\"\"\n \n-__version__ = '0.3.10'\n+__version__ = '0.3.11'\n", "issue": "Pip installation fails due to librosa dependency\nHi,\r\n\r\nI'm trying to install the magenta-gpu but when I did a pip install magenta-gpu:\r\n\r\n**librosa 0.6.2 has requirement joblib>=0.12, but you'll have joblib 0.11 which is incompatible.**\r\n\n", "before_files": [{"content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"Separate file for storing the current version of Magenta.\n\nStored in a separate file so that setup.py can reference the version without\npulling in all the dependencies in __init__.py.\n\"\"\"\n\n__version__ = '0.3.10'\n", "path": "magenta/version.py"}], "after_files": [{"content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"Separate file for storing the current version of Magenta.\n\nStored in a separate file so that setup.py can reference the version without\npulling in all the dependencies in __init__.py.\n\"\"\"\n\n__version__ = '0.3.11'\n", "path": "magenta/version.py"}]}
| 552 | 79 |
gh_patches_debug_2350
|
rasdani/github-patches
|
git_diff
|
mirumee__ariadne-184
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update setup.py to include html and py.typed files in published package
Ariadne now includes the `graphql_playground.html` Django template and a `py.typed` file for enabling typing. We should make sure those two get published together with the rest of the project.
--- END ISSUE ---
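As background, here is a sketch of the usual setuptools approach for shipping such files; the MANIFEST.in mechanism is assumed, and the exact file list is not taken from the repository.
```python
# Sketch: setuptools bundles non-Python files only when asked to.
# include_package_data=True makes it ship every package file matched by
# MANIFEST.in, which is broader than a hand-maintained package_data dict.
from setuptools import setup

setup(
    name="ariadne",
    packages=["ariadne"],
    include_package_data=True,  # picks up py.typed and the HTML template
)
```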
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #! /usr/bin/env python
2 import os
3 from setuptools import setup
4
5 CLASSIFIERS = [
6 "Development Status :: 4 - Beta",
7 "Intended Audience :: Developers",
8 "License :: OSI Approved :: BSD License",
9 "Operating System :: OS Independent",
10 "Programming Language :: Python",
11 "Programming Language :: Python :: 3.6",
12 "Programming Language :: Python :: 3.7",
13 "Topic :: Software Development :: Libraries :: Python Modules",
14 ]
15
16 README_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "README.md")
17 with open(README_PATH, "r") as f:
18 README = f.read()
19
20 setup(
21 name="ariadne",
22 author="Mirumee Software",
23 author_email="[email protected]",
24 description="Ariadne is a Python library for implementing GraphQL servers.",
25 long_description=README,
26 long_description_content_type="text/markdown",
27 license="BSD",
28 version="0.4.0",
29 url="https://github.com/mirumee/ariadne",
30 packages=["ariadne"],
31 package_data={"ariadne": ["py.typed"]},
32 install_requires=[
33 "graphql-core-next>=1.0.4",
34 "starlette<0.13",
35 "typing_extensions>=3.6.0",
36 ],
37 classifiers=CLASSIFIERS,
38 platforms=["any"],
39 zip_safe=False,
40 )
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
old mode 100644
new mode 100755
--- a/setup.py
+++ b/setup.py
@@ -28,7 +28,7 @@
version="0.4.0",
url="https://github.com/mirumee/ariadne",
packages=["ariadne"],
- package_data={"ariadne": ["py.typed"]},
+ include_package_data=True,
install_requires=[
"graphql-core-next>=1.0.4",
"starlette<0.13",
|
{"golden_diff": "diff --git a/setup.py b/setup.py\nold mode 100644\nnew mode 100755\n--- a/setup.py\n+++ b/setup.py\n@@ -28,7 +28,7 @@\n version=\"0.4.0\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n- package_data={\"ariadne\": [\"py.typed\"]},\n+ include_package_data=True,\n install_requires=[\n \"graphql-core-next>=1.0.4\",\n \"starlette<0.13\",\n", "issue": "Update setup.py to include html and py.typed files in published package\nAriadne now includes `graphql_playground.html` django template and `py.typed` file for enabling typing. We should make sure those two get published together with rest of the project.\n", "before_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.4.0\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n package_data={\"ariadne\": [\"py.typed\"]},\n install_requires=[\n \"graphql-core-next>=1.0.4\",\n \"starlette<0.13\",\n \"typing_extensions>=3.6.0\",\n ],\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.4.0\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n \"graphql-core-next>=1.0.4\",\n \"starlette<0.13\",\n \"typing_extensions>=3.6.0\",\n ],\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}]}
| 692 | 132 |
gh_patches_debug_36798
|
rasdani/github-patches
|
git_diff
|
Mailu__Mailu-3025
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[SUGG] A small validation check in the Mailu setup
Hi, and thanks for Mailu; it is a good project.
I am submitting this suggestion because I made this error myself and it took me a long time to find it.
In step 4 of the Mailu setup for Docker Compose:
**Subnet of the docker network**: it would be nice to verify that the last octet of the IPv4 address is equal to 0, because if it is not, SMTP won't work.

Regards
--- END ISSUE ---
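The check the reporter asks for already exists in the standard library. A minimal sketch follows, with example subnets only:
```python
# ipaddress.IPv4Network is strict by default: a subnet whose host bits
# are set (last octet != 0 for a /24) raises ValueError, which is
# exactly the misconfiguration described above.
import ipaddress

for subnet in ("192.168.203.0/24", "192.168.203.1/24"):
    try:
        ipaddress.IPv4Network(subnet)  # strict=True is the default
        print(subnet, "-> valid docker subnet")
    except ValueError as err:
        print(subnet, "-> rejected:", err)
```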
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup/server.py`
Content:
```
1 import flask
2 import flask_bootstrap
3 import redis
4 import json
5 import os
6 import jinja2
7 import uuid
8 import string
9 import random
10 import ipaddress
11 import hashlib
12 import time
13
14
15 version = os.getenv("this_version", "master")
16 static_url_path = "/" + version + "/static"
17 app = flask.Flask(__name__, static_url_path=static_url_path)
18 flask_bootstrap.Bootstrap(app)
19 db = redis.StrictRedis(host='redis', port=6379, db=0)
20
21
22 def render_flavor(flavor, template, data):
23 return flask.render_template(
24 os.path.join(flavor, template),
25 **data
26 )
27
28
29 @app.add_template_global
30 def secret(length=16):
31 charset = string.ascii_uppercase + string.digits
32 return ''.join(
33 random.SystemRandom().choice(charset)
34 for _ in range(length)
35 )
36
37 #Original copied from https://github.com/andrewlkho/ulagen
38 def random_ipv6_subnet():
39 eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff
40 eui64_canon = "-".join([format(eui64, "02X")[i:i+2] for i in range(0, 18, 2)])
41
42 h = hashlib.sha1()
43 h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))
44 globalid = h.hexdigest()[0:10]
45
46 prefix = ":".join(("fd" + globalid[0:2], globalid[2:6], globalid[6:10]))
47 return prefix
48
49 def build_app(path):
50
51 app.jinja_env.trim_blocks = True
52 app.jinja_env.lstrip_blocks = True
53
54 @app.context_processor
55 def app_context():
56 return dict(
57 versions=os.getenv("VERSIONS","master").split(','),
58 stable_version = os.getenv("stable_version", "master")
59 )
60
61 prefix_bp = flask.Blueprint(version.replace(".", "_"), __name__)
62 prefix_bp.jinja_loader = jinja2.ChoiceLoader([
63 jinja2.FileSystemLoader(os.path.join(path, "templates")),
64 jinja2.FileSystemLoader(os.path.join(path, "flavors"))
65 ])
66
67 root_bp = flask.Blueprint("root", __name__)
68 root_bp.jinja_loader = jinja2.ChoiceLoader([
69 jinja2.FileSystemLoader(os.path.join(path, "templates")),
70 jinja2.FileSystemLoader(os.path.join(path, "flavors"))
71 ])
72
73 @prefix_bp.context_processor
74 @root_bp.context_processor
75 def bp_context(version=version):
76 return dict(version=version)
77
78 @prefix_bp.route("/")
79 @root_bp.route("/")
80 def wizard():
81 return flask.render_template(
82 'wizard.html',
83 flavor="compose",
84 steps=sorted(os.listdir(os.path.join(path, "templates", "steps", "compose"))),
85 subnet6=random_ipv6_subnet()
86 )
87
88 @prefix_bp.route("/submit", methods=["POST"])
89 @root_bp.route("/submit", methods=["POST"])
90 def submit():
91 data = flask.request.form.copy()
92 data['uid'] = str(uuid.uuid4())
93 try:
94 data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])
95 except ValueError as err:
96 return "Error while generating files: " + str(err)
97 db.set(data['uid'], json.dumps(data))
98 return flask.redirect(flask.url_for('.setup', uid=data['uid']))
99
100 @prefix_bp.route("/setup/<uid>", methods=["GET"])
101 @root_bp.route("/setup/<uid>", methods=["GET"])
102 def setup(uid):
103 data = json.loads(db.get(uid))
104 flavor = data.get("flavor", "compose")
105 rendered = render_flavor(flavor, "setup.html", data)
106 return flask.render_template("setup.html", contents=rendered)
107
108 @prefix_bp.route("/file/<uid>/<filepath>", methods=["GET"])
109 @root_bp.route("/file/<uid>/<filepath>", methods=["GET"])
110 def file(uid, filepath):
111 data = json.loads(db.get(uid))
112 flavor = data.get("flavor", "compose")
113 return flask.Response(
114 render_flavor(flavor, filepath, data),
115 mimetype="application/text"
116 )
117
118 app.register_blueprint(prefix_bp, url_prefix="/{}".format(version))
119 app.register_blueprint(root_bp)
120
121
122 if __name__ == "__main__":
123 build_app("/tmp/mailutest")
124 app.run(debug=True)
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup/server.py b/setup/server.py
--- a/setup/server.py
+++ b/setup/server.py
@@ -10,12 +10,16 @@
import ipaddress
import hashlib
import time
-
+import secrets
+from flask_bootstrap import StaticCDN
version = os.getenv("this_version", "master")
static_url_path = "/" + version + "/static"
app = flask.Flask(__name__, static_url_path=static_url_path)
+app.secret_key = secrets.token_hex(16)
flask_bootstrap.Bootstrap(app)
+# Load our jQuery. Do not use jQuery 1.
+app.extensions['bootstrap']['cdns']['jquery'] = StaticCDN()
db = redis.StrictRedis(host='redis', port=6379, db=0)
@@ -90,12 +94,47 @@
def submit():
data = flask.request.form.copy()
data['uid'] = str(uuid.uuid4())
+ valid = True
+ try:
+ ipaddress.IPv4Address(data['bind4'])
+ except:
+ flask.flash('Configured IPv4 address is invalid', 'error')
+ valid = False
+ try:
+ ipaddress.IPv6Address(data['bind6'])
+ except:
+ flask.flash('Configured IPv6 address is invalid', 'error')
+ valid = False
+ try:
+ ipaddress.IPv4Network(data['subnet'])
+ except:
+ flask.flash('Configured subnet(IPv4) is invalid', 'error')
+ valid = False
+ try:
+ ipaddress.IPv6Network(data['subnet6'])
+ except:
+ flask.flash('Configured subnet(IPv6) is invalid', 'error')
+ valid = False
try:
data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])
except ValueError as err:
- return "Error while generating files: " + str(err)
- db.set(data['uid'], json.dumps(data))
- return flask.redirect(flask.url_for('.setup', uid=data['uid']))
+ flask.flash('Invalid configuration: ' + str(err))
+ valid = False
+ if 'api_enabled' in data:
+ if (data['api_enabled'] == 'true'):
+ if data['api_token'] == '':
+ flask.flash('API token cannot be empty when API is enabled', 'error')
+ valid = False
+ if valid:
+ db.set(data['uid'], json.dumps(data))
+ return flask.redirect(flask.url_for('.setup', uid=data['uid']))
+ else:
+ return flask.render_template(
+ 'wizard.html',
+ flavor="compose",
+ steps=sorted(os.listdir(os.path.join(path, "templates", "steps", "compose"))),
+ subnet6=random_ipv6_subnet()
+ )
@prefix_bp.route("/setup/<uid>", methods=["GET"])
@root_bp.route("/setup/<uid>", methods=["GET"])
|
{"golden_diff": "diff --git a/setup/server.py b/setup/server.py\n--- a/setup/server.py\n+++ b/setup/server.py\n@@ -10,12 +10,16 @@\n import ipaddress\n import hashlib\n import time\n-\n+import secrets\n+from flask_bootstrap import StaticCDN\n \n version = os.getenv(\"this_version\", \"master\")\n static_url_path = \"/\" + version + \"/static\"\n app = flask.Flask(__name__, static_url_path=static_url_path)\n+app.secret_key = secrets.token_hex(16)\n flask_bootstrap.Bootstrap(app)\n+# Load our jQuery. Do not use jQuery 1.\n+app.extensions['bootstrap']['cdns']['jquery'] = StaticCDN()\n db = redis.StrictRedis(host='redis', port=6379, db=0)\n \n \n@@ -90,12 +94,47 @@\n def submit():\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n+ valid = True\n+ try:\n+ ipaddress.IPv4Address(data['bind4'])\n+ except:\n+ flask.flash('Configured IPv4 address is invalid', 'error')\n+ valid = False\n+ try:\n+ ipaddress.IPv6Address(data['bind6'])\n+ except:\n+ flask.flash('Configured IPv6 address is invalid', 'error')\n+ valid = False\n+ try:\n+ ipaddress.IPv4Network(data['subnet'])\n+ except:\n+ flask.flash('Configured subnet(IPv4) is invalid', 'error')\n+ valid = False\n+ try:\n+ ipaddress.IPv6Network(data['subnet6'])\n+ except:\n+ flask.flash('Configured subnet(IPv6) is invalid', 'error')\n+ valid = False\n try:\n data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])\n except ValueError as err:\n- return \"Error while generating files: \" + str(err)\n- db.set(data['uid'], json.dumps(data))\n- return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n+ flask.flash('Invalid configuration: ' + str(err))\n+ valid = False\n+ if 'api_enabled' in data:\n+ if (data['api_enabled'] == 'true'):\n+ if data['api_token'] == '':\n+ flask.flash('API token cannot be empty when API is enabled', 'error')\n+ valid = False\n+ if valid:\n+ db.set(data['uid'], json.dumps(data))\n+ return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n+ else:\n+ return flask.render_template(\n+ 'wizard.html',\n+ flavor=\"compose\",\n+ steps=sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", \"compose\"))),\n+ subnet6=random_ipv6_subnet()\n+ )\n \n @prefix_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n @root_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n", "issue": "[SUGG] Little verification in Mailu setup\nHi thanks for Mailu it is a good project.\r\nI submit this suggestion because i made the error and take many time to find it.\r\nIn step 4 of Mailu Setup for Docker compose : \r\n**Subnet of the docker network** it could be nice to verify if the last octet of the IP4 address is equal to 0 because if it is not the SMTP wont work.\r\n\r\n\r\n\r\nRegards \r\n\n", "before_files": [{"content": "import flask\nimport flask_bootstrap\nimport redis\nimport json\nimport os\nimport jinja2\nimport uuid\nimport string\nimport random\nimport ipaddress\nimport hashlib\nimport time\n\n\nversion = os.getenv(\"this_version\", \"master\")\nstatic_url_path = \"/\" + version + \"/static\"\napp = flask.Flask(__name__, static_url_path=static_url_path)\nflask_bootstrap.Bootstrap(app)\ndb = redis.StrictRedis(host='redis', port=6379, db=0)\n\n\ndef render_flavor(flavor, template, data):\n return flask.render_template(\n os.path.join(flavor, template),\n **data\n )\n\n\[email protected]_template_global\ndef secret(length=16):\n charset = string.ascii_uppercase + string.digits\n return ''.join(\n random.SystemRandom().choice(charset)\n for _ in range(length)\n )\n\n#Original copied from 
https://github.com/andrewlkho/ulagen\ndef random_ipv6_subnet():\n eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff\n eui64_canon = \"-\".join([format(eui64, \"02X\")[i:i+2] for i in range(0, 18, 2)])\n\n h = hashlib.sha1()\n h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))\n globalid = h.hexdigest()[0:10]\n\n prefix = \":\".join((\"fd\" + globalid[0:2], globalid[2:6], globalid[6:10]))\n return prefix\n\ndef build_app(path):\n\n app.jinja_env.trim_blocks = True\n app.jinja_env.lstrip_blocks = True\n\n @app.context_processor\n def app_context():\n return dict(\n versions=os.getenv(\"VERSIONS\",\"master\").split(','),\n stable_version = os.getenv(\"stable_version\", \"master\")\n )\n\n prefix_bp = flask.Blueprint(version.replace(\".\", \"_\"), __name__)\n prefix_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n root_bp = flask.Blueprint(\"root\", __name__)\n root_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n @prefix_bp.context_processor\n @root_bp.context_processor\n def bp_context(version=version):\n return dict(version=version)\n\n @prefix_bp.route(\"/\")\n @root_bp.route(\"/\")\n def wizard():\n return flask.render_template(\n 'wizard.html',\n flavor=\"compose\",\n steps=sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", \"compose\"))),\n subnet6=random_ipv6_subnet()\n )\n\n @prefix_bp.route(\"/submit\", methods=[\"POST\"])\n @root_bp.route(\"/submit\", methods=[\"POST\"])\n def submit():\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n try:\n data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])\n except ValueError as err:\n return \"Error while generating files: \" + str(err)\n db.set(data['uid'], json.dumps(data))\n return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n\n @prefix_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n @root_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n def setup(uid):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n rendered = render_flavor(flavor, \"setup.html\", data)\n return flask.render_template(\"setup.html\", contents=rendered)\n\n @prefix_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n @root_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n def file(uid, filepath):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n return flask.Response(\n render_flavor(flavor, filepath, data),\n mimetype=\"application/text\"\n )\n\n app.register_blueprint(prefix_bp, url_prefix=\"/{}\".format(version))\n app.register_blueprint(root_bp)\n\n\nif __name__ == \"__main__\":\n build_app(\"/tmp/mailutest\")\n app.run(debug=True)\n", "path": "setup/server.py"}], "after_files": [{"content": "import flask\nimport flask_bootstrap\nimport redis\nimport json\nimport os\nimport jinja2\nimport uuid\nimport string\nimport random\nimport ipaddress\nimport hashlib\nimport time\nimport secrets\nfrom flask_bootstrap import StaticCDN\n\nversion = os.getenv(\"this_version\", \"master\")\nstatic_url_path = \"/\" + version + \"/static\"\napp = flask.Flask(__name__, static_url_path=static_url_path)\napp.secret_key = secrets.token_hex(16)\nflask_bootstrap.Bootstrap(app)\n# Load our jQuery. 
Do not use jQuery 1.\napp.extensions['bootstrap']['cdns']['jquery'] = StaticCDN()\ndb = redis.StrictRedis(host='redis', port=6379, db=0)\n\n\ndef render_flavor(flavor, template, data):\n return flask.render_template(\n os.path.join(flavor, template),\n **data\n )\n\n\[email protected]_template_global\ndef secret(length=16):\n charset = string.ascii_uppercase + string.digits\n return ''.join(\n random.SystemRandom().choice(charset)\n for _ in range(length)\n )\n\n#Original copied from https://github.com/andrewlkho/ulagen\ndef random_ipv6_subnet():\n eui64 = uuid.getnode() >> 24 << 48 | 0xfffe000000 | uuid.getnode() & 0xffffff\n eui64_canon = \"-\".join([format(eui64, \"02X\")[i:i+2] for i in range(0, 18, 2)])\n\n h = hashlib.sha1()\n h.update((eui64_canon + str(time.time() - time.mktime((1900, 1, 1, 0, 0, 0, 0, 1, -1)))).encode('utf-8'))\n globalid = h.hexdigest()[0:10]\n\n prefix = \":\".join((\"fd\" + globalid[0:2], globalid[2:6], globalid[6:10]))\n return prefix\n\ndef build_app(path):\n\n app.jinja_env.trim_blocks = True\n app.jinja_env.lstrip_blocks = True\n\n @app.context_processor\n def app_context():\n return dict(\n versions=os.getenv(\"VERSIONS\",\"master\").split(','),\n stable_version = os.getenv(\"stable_version\", \"master\")\n )\n\n prefix_bp = flask.Blueprint(version.replace(\".\", \"_\"), __name__)\n prefix_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n root_bp = flask.Blueprint(\"root\", __name__)\n root_bp.jinja_loader = jinja2.ChoiceLoader([\n jinja2.FileSystemLoader(os.path.join(path, \"templates\")),\n jinja2.FileSystemLoader(os.path.join(path, \"flavors\"))\n ])\n\n @prefix_bp.context_processor\n @root_bp.context_processor\n def bp_context(version=version):\n return dict(version=version)\n\n @prefix_bp.route(\"/\")\n @root_bp.route(\"/\")\n def wizard():\n return flask.render_template(\n 'wizard.html',\n flavor=\"compose\",\n steps=sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", \"compose\"))),\n subnet6=random_ipv6_subnet()\n )\n\n @prefix_bp.route(\"/submit\", methods=[\"POST\"])\n @root_bp.route(\"/submit\", methods=[\"POST\"])\n def submit():\n data = flask.request.form.copy()\n data['uid'] = str(uuid.uuid4())\n valid = True\n try:\n ipaddress.IPv4Address(data['bind4'])\n except:\n flask.flash('Configured IPv4 address is invalid', 'error')\n valid = False\n try:\n ipaddress.IPv6Address(data['bind6'])\n except:\n flask.flash('Configured IPv6 address is invalid', 'error')\n valid = False\n try:\n ipaddress.IPv4Network(data['subnet'])\n except:\n flask.flash('Configured subnet(IPv4) is invalid', 'error')\n valid = False\n try:\n ipaddress.IPv6Network(data['subnet6'])\n except:\n flask.flash('Configured subnet(IPv6) is invalid', 'error')\n valid = False\n try:\n data['dns'] = str(ipaddress.IPv4Network(data['subnet'], strict=False)[-2])\n except ValueError as err:\n flask.flash('Invalid configuration: ' + str(err))\n valid = False\n if 'api_enabled' in data:\n if (data['api_enabled'] == 'true'):\n if data['api_token'] == '':\n flask.flash('API token cannot be empty when API is enabled', 'error')\n valid = False\n if valid:\n db.set(data['uid'], json.dumps(data))\n return flask.redirect(flask.url_for('.setup', uid=data['uid']))\n else:\n return flask.render_template(\n 'wizard.html',\n flavor=\"compose\",\n steps=sorted(os.listdir(os.path.join(path, \"templates\", \"steps\", \"compose\"))),\n subnet6=random_ipv6_subnet()\n )\n\n 
@prefix_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n @root_bp.route(\"/setup/<uid>\", methods=[\"GET\"])\n def setup(uid):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n rendered = render_flavor(flavor, \"setup.html\", data)\n return flask.render_template(\"setup.html\", contents=rendered)\n\n @prefix_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n @root_bp.route(\"/file/<uid>/<filepath>\", methods=[\"GET\"])\n def file(uid, filepath):\n data = json.loads(db.get(uid))\n flavor = data.get(\"flavor\", \"compose\")\n return flask.Response(\n render_flavor(flavor, filepath, data),\n mimetype=\"application/text\"\n )\n\n app.register_blueprint(prefix_bp, url_prefix=\"/{}\".format(version))\n app.register_blueprint(root_bp)\n\n\nif __name__ == \"__main__\":\n build_app(\"/tmp/mailutest\")\n app.run(debug=True)\n", "path": "setup/server.py"}]}
| 1,712 | 656 |
gh_patches_debug_10910
|
rasdani/github-patches
|
git_diff
|
learningequality__kolibri-4096
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The new learner and coach accounts can't view Kolibri 0.11.0a3 at first login
<!--
Instructions:
* Fill out the sections below, replace …'s with information about your issue
* Use the 'preview' function above this text box to verify formatting before submitting
-->
### Observed behavior
When I create a learner account and test logging in, the page shows: "The requested URL / was not found on this server."
<!--
Description of the behavior that was observed, including screenshots or other references when applicable
-->
### Expected behavior
The user must be able to log in to the Kolibri page.
<!--
Description of what behavior was expected but did not occur
-->
### Errors and logs
```
The character encoding of the HTML document was not declared. The document will render with garbled text in some browser configurations if the document contains characters from outside the US-ASCII range. The character encoding of the page must be declared in the document or in the transfer protocol. 127.0.0.1:8080
```
<!--
Relevant logs from:
* the command line
* ~/.kolibri/kolibri.log
* the browser console
Please wrap errors in triple backticks for clean formatting like this:
```
01:10 info: something happened
01:12 error: something bad happened
```
-->
…
### Steps to reproduce
1. login with admin account
2. go to facility and user tab
3. create new learner account
4. try to log in with the new learner account and see the error page.
<!--
Precise steps that someone else can follow in order to see this behavior
-->
…
### Context
Windows 7 and 10
kolibri 0.11.0a3
firefox 61.0.1(64-bit)
<!--
Tell us about your environment, including:
* Kolibri version
* Operating system
* Browser
-->
### Screenshot

--- END ISSUE ---
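In outline, the 404 comes from a role/first-login lookup that finds no registered redirect URL. Below is a hedged sketch of that lookup with a fallback; the hook objects are simplified stand-ins, not the real RoleBasedRedirectHook registry.
```python
# Simplified stand-in for the hook lookup: if no hook registered a
# first-login URL for the role, fall back to the ordinary redirect
# instead of returning None (which leads to the 404 above).
def get_url_by_role(registered_hooks, role, first_login):
    match = next(
        (h for h in registered_hooks
         if h.role == role and h.first_login == first_login),
        None,
    )
    if match is None and first_login:
        match = next(
            (h for h in registered_hooks
             if h.role == role and h.first_login is False),
            None,
        )
    return match.url if match else None
```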
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kolibri/core/views.py`
Content:
```
1 from django import http
2 from django.conf import settings
3 from django.contrib.auth import logout
4 from django.core.urlresolvers import translate_url
5 from django.http import Http404
6 from django.http import HttpResponseRedirect
7 from django.utils.http import is_safe_url
8 from django.utils.translation import check_for_language
9 from django.utils.translation import LANGUAGE_SESSION_KEY
10 from django.utils.translation import ugettext_lazy as _
11 from django.views.generic.base import View
12 from django.views.i18n import LANGUAGE_QUERY_PARAMETER
13
14 from kolibri.core.auth.constants import user_kinds
15 from kolibri.core.auth.models import Role
16 from kolibri.core.decorators import signin_redirect_exempt
17 from kolibri.core.hooks import RoleBasedRedirectHook
18
19
20 # Modified from django.views.i18n
21 @signin_redirect_exempt
22 def set_language(request):
23 """
24 Redirect to a given url while setting the chosen language in the
25 session or cookie. The url and the language code need to be
26 specified in the request parameters.
27 Since this view changes how the user will see the rest of the site, it must
28 only be accessed as a POST request. If called as a GET request, it will
29 redirect to the page in the request (the 'next' parameter) without changing
30 any state.
31 """
32 next = request.POST.get('next', request.GET.get('next'))
33 if not is_safe_url(url=next, host=request.get_host()):
34 next = request.META.get('HTTP_REFERER')
35 if not is_safe_url(url=next, host=request.get_host()):
36 next = '/'
37 response = http.HttpResponseRedirect(next)
38 if request.method == 'POST':
39 lang_code = request.POST.get(LANGUAGE_QUERY_PARAMETER)
40 if lang_code and check_for_language(lang_code):
41 next_trans = translate_url(next, lang_code)
42 if next_trans != next:
43 response = http.HttpResponseRedirect(next_trans)
44 if hasattr(request, 'session'):
45 request.session[LANGUAGE_SESSION_KEY] = lang_code
46 # Always set cookie
47 response.set_cookie(settings.LANGUAGE_COOKIE_NAME, lang_code,
48 max_age=settings.LANGUAGE_COOKIE_AGE,
49 path=settings.LANGUAGE_COOKIE_PATH,
50 domain=settings.LANGUAGE_COOKIE_DOMAIN)
51 return response
52
53
54 def logout_view(request):
55 logout(request)
56 return http.HttpResponseRedirect('/')
57
58
59 def get_url_by_role(role, first_login):
60 obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks
61 if hook.role == role and hook.first_login == first_login), None)
62 if obj:
63 return obj.url
64
65
66 class GuestRedirectView(View):
67 def get(self, request):
68 """
69 Redirects a guest user to a learner accessible page.
70 """
71 return HttpResponseRedirect(get_url_by_role(user_kinds.LEARNER, False))
72
73
74 class RootURLRedirectView(View):
75
76 def get(self, request):
77 """
78 Redirects user based on the highest role they have for which a redirect is defined.
79 """
80 first_login = request.session.get("first_login", False)
81 if request.user.is_authenticated():
82 url = None
83 if request.user.is_superuser:
84 url = url or get_url_by_role(user_kinds.SUPERUSER, first_login)
85 roles = set(Role.objects.filter(user_id=request.user.id).values_list('kind', flat=True).distinct())
86 if user_kinds.ADMIN in roles:
87 url = url or get_url_by_role(user_kinds.ADMIN, first_login)
88 if user_kinds.COACH in roles:
89 url = url or get_url_by_role(user_kinds.COACH, first_login)
90 url = url or get_url_by_role(user_kinds.LEARNER, first_login)
91 else:
92 url = get_url_by_role(user_kinds.ANONYMOUS, first_login)
93 if url:
94 return HttpResponseRedirect(url)
95 raise Http404(_("No appropriate redirect pages found, it is likely that Kolibri is badly configured"))
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kolibri/core/views.py b/kolibri/core/views.py
--- a/kolibri/core/views.py
+++ b/kolibri/core/views.py
@@ -59,6 +59,13 @@
def get_url_by_role(role, first_login):
obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks
if hook.role == role and hook.first_login == first_login), None)
+
+ if obj is None and first_login:
+ # If it is the first_login, do a fallback to find the non-first login behaviour when it is
+ # not available
+ obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks
+ if hook.role == role and hook.first_login is False), None)
+
if obj:
return obj.url
|
{"golden_diff": "diff --git a/kolibri/core/views.py b/kolibri/core/views.py\n--- a/kolibri/core/views.py\n+++ b/kolibri/core/views.py\n@@ -59,6 +59,13 @@\n def get_url_by_role(role, first_login):\n obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks\n if hook.role == role and hook.first_login == first_login), None)\n+\n+ if obj is None and first_login:\n+ # If it is the first_login, do a fallback to find the non-first login behaviour when it is\n+ # not available\n+ obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks\n+ if hook.role == role and hook.first_login is False), None)\n+\n if obj:\n return obj.url\n", "issue": "The new learners and coachs account, can't view the kolibri 0.11.0a3 at first login\n<!--\r\nInstructions:\r\n * Fill out the sections below, replace \u2026's with information about your issue\r\n * Use the 'preview' function above this text box to verify formatting before submitting\r\n-->\r\n\r\n### Observed behavior\r\nWhen I create a learners account and test to login, the page show like this \"The requested URL / was not found on this server.\"\r\n\r\n<!--\r\nDescription of the behavior that was observed, including screenshots or other references when applicable\r\n-->\r\n\r\n\r\n\r\n### Expected behavior\r\nMust login to kolibri page.\r\n<!--\r\nDescription of what behavior was expected but did not occur\r\n-->\r\n\r\n\r\n### Errors and logs\r\n```\r\nThe character encoding of the HTML document was not declared. The document will render with garbled text in some browser configurations if the document contains characters from outside the US-ASCII range. The character encoding of the page must be declared in the document or in the transfer protocol. 127.0.0.1:8080 \r\n```\r\n<!--\r\nRelevant logs from:\r\n * the command line\r\n * ~/.kolibri/kolibri.log\r\n * the browser console\r\n\r\nPlease wrap errors in triple backticks for clean formatting like this:\r\n```\r\n01:10 info: something happened\r\n01:12 error: something bad happened\r\n```\r\n-->\r\n\r\n\u2026\r\n\r\n### Steps to reproduce\r\n1. login with admin account\r\n2. go to facility and user tab\r\n3. create new learner account\r\n4. 
try to login the new learner account and see the error page.\r\n<!--\r\nPrecise steps that someone else can follow in order to see this behavior\r\n-->\r\n\r\n\u2026\r\n\r\n### Context\r\nWindows 7 and 10\r\nkolibri 0.11.0a3\r\nfirefox 61.0.1(64-bit)\r\n<!--\r\nTell us about your environment, including:\r\n * Kolibri version\r\n * Operating system\r\n * Browser\r\n-->\r\n### Screenshot\r\n\r\n\n", "before_files": [{"content": "from django import http\nfrom django.conf import settings\nfrom django.contrib.auth import logout\nfrom django.core.urlresolvers import translate_url\nfrom django.http import Http404\nfrom django.http import HttpResponseRedirect\nfrom django.utils.http import is_safe_url\nfrom django.utils.translation import check_for_language\nfrom django.utils.translation import LANGUAGE_SESSION_KEY\nfrom django.utils.translation import ugettext_lazy as _\nfrom django.views.generic.base import View\nfrom django.views.i18n import LANGUAGE_QUERY_PARAMETER\n\nfrom kolibri.core.auth.constants import user_kinds\nfrom kolibri.core.auth.models import Role\nfrom kolibri.core.decorators import signin_redirect_exempt\nfrom kolibri.core.hooks import RoleBasedRedirectHook\n\n\n# Modified from django.views.i18n\n@signin_redirect_exempt\ndef set_language(request):\n \"\"\"\n Redirect to a given url while setting the chosen language in the\n session or cookie. The url and the language code need to be\n specified in the request parameters.\n Since this view changes how the user will see the rest of the site, it must\n only be accessed as a POST request. If called as a GET request, it will\n redirect to the page in the request (the 'next' parameter) without changing\n any state.\n \"\"\"\n next = request.POST.get('next', request.GET.get('next'))\n if not is_safe_url(url=next, host=request.get_host()):\n next = request.META.get('HTTP_REFERER')\n if not is_safe_url(url=next, host=request.get_host()):\n next = '/'\n response = http.HttpResponseRedirect(next)\n if request.method == 'POST':\n lang_code = request.POST.get(LANGUAGE_QUERY_PARAMETER)\n if lang_code and check_for_language(lang_code):\n next_trans = translate_url(next, lang_code)\n if next_trans != next:\n response = http.HttpResponseRedirect(next_trans)\n if hasattr(request, 'session'):\n request.session[LANGUAGE_SESSION_KEY] = lang_code\n # Always set cookie\n response.set_cookie(settings.LANGUAGE_COOKIE_NAME, lang_code,\n max_age=settings.LANGUAGE_COOKIE_AGE,\n path=settings.LANGUAGE_COOKIE_PATH,\n domain=settings.LANGUAGE_COOKIE_DOMAIN)\n return response\n\n\ndef logout_view(request):\n logout(request)\n return http.HttpResponseRedirect('/')\n\n\ndef get_url_by_role(role, first_login):\n obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks\n if hook.role == role and hook.first_login == first_login), None)\n if obj:\n return obj.url\n\n\nclass GuestRedirectView(View):\n def get(self, request):\n \"\"\"\n Redirects a guest user to a learner accessible page.\n \"\"\"\n return HttpResponseRedirect(get_url_by_role(user_kinds.LEARNER, False))\n\n\nclass RootURLRedirectView(View):\n\n def get(self, request):\n \"\"\"\n Redirects user based on the highest role they have for which a redirect is defined.\n \"\"\"\n first_login = request.session.get(\"first_login\", False)\n if request.user.is_authenticated():\n url = None\n if request.user.is_superuser:\n url = url or get_url_by_role(user_kinds.SUPERUSER, first_login)\n roles = set(Role.objects.filter(user_id=request.user.id).values_list('kind', flat=True).distinct())\n if 
user_kinds.ADMIN in roles:\n url = url or get_url_by_role(user_kinds.ADMIN, first_login)\n if user_kinds.COACH in roles:\n url = url or get_url_by_role(user_kinds.COACH, first_login)\n url = url or get_url_by_role(user_kinds.LEARNER, first_login)\n else:\n url = get_url_by_role(user_kinds.ANONYMOUS, first_login)\n if url:\n return HttpResponseRedirect(url)\n raise Http404(_(\"No appropriate redirect pages found, it is likely that Kolibri is badly configured\"))\n", "path": "kolibri/core/views.py"}], "after_files": [{"content": "from django import http\nfrom django.conf import settings\nfrom django.contrib.auth import logout\nfrom django.core.urlresolvers import translate_url\nfrom django.http import Http404\nfrom django.http import HttpResponseRedirect\nfrom django.utils.http import is_safe_url\nfrom django.utils.translation import check_for_language\nfrom django.utils.translation import LANGUAGE_SESSION_KEY\nfrom django.utils.translation import ugettext_lazy as _\nfrom django.views.generic.base import View\nfrom django.views.i18n import LANGUAGE_QUERY_PARAMETER\n\nfrom kolibri.core.auth.constants import user_kinds\nfrom kolibri.core.auth.models import Role\nfrom kolibri.core.decorators import signin_redirect_exempt\nfrom kolibri.core.hooks import RoleBasedRedirectHook\n\n\n# Modified from django.views.i18n\n@signin_redirect_exempt\ndef set_language(request):\n \"\"\"\n Redirect to a given url while setting the chosen language in the\n session or cookie. The url and the language code need to be\n specified in the request parameters.\n Since this view changes how the user will see the rest of the site, it must\n only be accessed as a POST request. If called as a GET request, it will\n redirect to the page in the request (the 'next' parameter) without changing\n any state.\n \"\"\"\n next = request.POST.get('next', request.GET.get('next'))\n if not is_safe_url(url=next, host=request.get_host()):\n next = request.META.get('HTTP_REFERER')\n if not is_safe_url(url=next, host=request.get_host()):\n next = '/'\n response = http.HttpResponseRedirect(next)\n if request.method == 'POST':\n lang_code = request.POST.get(LANGUAGE_QUERY_PARAMETER)\n if lang_code and check_for_language(lang_code):\n next_trans = translate_url(next, lang_code)\n if next_trans != next:\n response = http.HttpResponseRedirect(next_trans)\n if hasattr(request, 'session'):\n request.session[LANGUAGE_SESSION_KEY] = lang_code\n # Always set cookie\n response.set_cookie(settings.LANGUAGE_COOKIE_NAME, lang_code,\n max_age=settings.LANGUAGE_COOKIE_AGE,\n path=settings.LANGUAGE_COOKIE_PATH,\n domain=settings.LANGUAGE_COOKIE_DOMAIN)\n return response\n\n\ndef logout_view(request):\n logout(request)\n return http.HttpResponseRedirect('/')\n\n\ndef get_url_by_role(role, first_login):\n obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks\n if hook.role == role and hook.first_login == first_login), None)\n\n if obj is None and first_login:\n # If it is the first_login, do a fallback to find the non-first login behaviour when it is\n # not available\n obj = next((hook for hook in RoleBasedRedirectHook().registered_hooks\n if hook.role == role and hook.first_login is False), None)\n\n if obj:\n return obj.url\n\n\nclass GuestRedirectView(View):\n def get(self, request):\n \"\"\"\n Redirects a guest user to a learner accessible page.\n \"\"\"\n return HttpResponseRedirect(get_url_by_role(user_kinds.LEARNER, False))\n\n\nclass RootURLRedirectView(View):\n\n def get(self, request):\n \"\"\"\n Redirects user based on the 
highest role they have for which a redirect is defined.\n \"\"\"\n first_login = request.session.get(\"first_login\", False)\n if request.user.is_authenticated():\n url = None\n if request.user.is_superuser:\n url = url or get_url_by_role(user_kinds.SUPERUSER, first_login)\n roles = set(Role.objects.filter(user_id=request.user.id).values_list('kind', flat=True).distinct())\n if user_kinds.ADMIN in roles:\n url = url or get_url_by_role(user_kinds.ADMIN, first_login)\n if user_kinds.COACH in roles:\n url = url or get_url_by_role(user_kinds.COACH, first_login)\n url = url or get_url_by_role(user_kinds.LEARNER, first_login)\n else:\n url = get_url_by_role(user_kinds.ANONYMOUS, first_login)\n if url:\n return HttpResponseRedirect(url)\n raise Http404(_(\"No appropriate redirect pages found, it is likely that Kolibri is badly configured\"))\n", "path": "kolibri/core/views.py"}]}
| 1,753 | 177 |
gh_patches_debug_35237
|
rasdani/github-patches
|
git_diff
|
scikit-hep__pyhf-1435
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Correlated background simplemodel
something like
```python
def simplemodel2(s,b_up,b_nom,b_dn):
spec = {
'channels': [
{
'name': 'singlechannel',
'samples': [
{
'name': 'signal',
'data': s,
'modifiers': [{'name': 'mu', 'type': 'normfactor', 'data': None}]
},
{'name': 'background',
'data': b_nom,
'modifiers': [
{
'name': 'uncorr_bkguncrt',
'type': 'histosys',
'data': {
'hi_data': b_up,
'lo_data': b_dn
}
}
]
}
]
}
]
}
return pyhf.Model(spec)
```
with an API like `pyhf.simplemodels.correlated_bkg`
--- END ISSUE ---
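For reference, a usage sketch of the requested helper; the `correlated_background` name, signature, and numbers follow the proposed patch below rather than a released API.
```python
# Usage sketch for the proposed correlated-background helper; the numbers
# are the docstring example from the patch, not physics results.
import pyhf

pyhf.set_backend("numpy")
model = pyhf.simplemodels.correlated_background(
    signal=[12.0, 11.0],
    bkg=[50.0, 52.0],
    bkg_up=[45.0, 57.0],
    bkg_down=[55.0, 47.0],
)
print(model.config.parameters)  # ['correlated_bkg_uncertainty', 'mu']
print(model.expected_data(model.config.suggested_init()))  # [62., 63., 0.]
```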
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pyhf/simplemodels.py`
Content:
```
1 from . import Model
2
3 __all__ = ["hepdata_like"]
4
5
6 def __dir__():
7 return __all__
8
9
10 def hepdata_like(signal_data, bkg_data, bkg_uncerts, batch_size=None):
11 """
12 Construct a simple single channel :class:`~pyhf.pdf.Model` with a
13 :class:`~pyhf.modifiers.shapesys` modifier representing an uncorrelated
14 background uncertainty.
15
16 Example:
17 >>> import pyhf
18 >>> pyhf.set_backend("numpy")
19 >>> model = pyhf.simplemodels.hepdata_like(
20 ... signal_data=[12.0, 11.0], bkg_data=[50.0, 52.0], bkg_uncerts=[3.0, 7.0]
21 ... )
22 >>> model.schema
23 'model.json'
24 >>> model.config.channels
25 ['singlechannel']
26 >>> model.config.samples
27 ['background', 'signal']
28 >>> model.config.parameters
29 ['mu', 'uncorr_bkguncrt']
30 >>> model.expected_data(model.config.suggested_init())
31 array([ 62. , 63. , 277.77777778, 55.18367347])
32
33 Args:
34 signal_data (:obj:`list`): The data in the signal sample
35 bkg_data (:obj:`list`): The data in the background sample
36 bkg_uncerts (:obj:`list`): The statistical uncertainty on the background sample counts
37 batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute
38
39 Returns:
40 ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema
41
42 """
43 spec = {
44 'channels': [
45 {
46 'name': 'singlechannel',
47 'samples': [
48 {
49 'name': 'signal',
50 'data': signal_data,
51 'modifiers': [
52 {'name': 'mu', 'type': 'normfactor', 'data': None}
53 ],
54 },
55 {
56 'name': 'background',
57 'data': bkg_data,
58 'modifiers': [
59 {
60 'name': 'uncorr_bkguncrt',
61 'type': 'shapesys',
62 'data': bkg_uncerts,
63 }
64 ],
65 },
66 ],
67 }
68 ]
69 }
70 return Model(spec, batch_size=batch_size)
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/pyhf/simplemodels.py b/src/pyhf/simplemodels.py
--- a/src/pyhf/simplemodels.py
+++ b/src/pyhf/simplemodels.py
@@ -1,12 +1,81 @@
from . import Model
-__all__ = ["hepdata_like"]
+__all__ = ["correlated_background", "hepdata_like"]
def __dir__():
return __all__
+def correlated_background(signal, bkg, bkg_up, bkg_down, batch_size=None):
+ r"""
+ Construct a simple single channel :class:`~pyhf.pdf.Model` with a
+ :class:`~pyhf.modifiers.histosys` modifier representing a background
+ with a fully correlated bin-by-bin uncertainty.
+
+ Args:
+ signal (:obj:`list`): The data in the signal sample.
+ bkg (:obj:`list`): The data in the background sample.
+ bkg_up (:obj:`list`): The background sample under an upward variation
+ corresponding to :math:`\alpha=+1`.
+ bkg_down (:obj:`list`): The background sample under a downward variation
+ corresponding to :math:`\alpha=-1`.
+ batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute.
+
+ Returns:
+ ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema.
+
+ Example:
+ >>> import pyhf
+ >>> pyhf.set_backend("numpy")
+ >>> model = pyhf.simplemodels.correlated_background(
+ ... signal=[12.0, 11.0],
+ ... bkg=[50.0, 52.0],
+ ... bkg_up=[45.0, 57.0],
+ ... bkg_down=[55.0, 47.0],
+ ... )
+ >>> model.schema
+ 'model.json'
+ >>> model.config.channels
+ ['single_channel']
+ >>> model.config.samples
+ ['background', 'signal']
+ >>> model.config.parameters
+ ['correlated_bkg_uncertainty', 'mu']
+ >>> model.expected_data(model.config.suggested_init())
+ array([62., 63., 0.])
+
+ """
+ spec = {
+ "channels": [
+ {
+ "name": "single_channel",
+ "samples": [
+ {
+ "name": "signal",
+ "data": signal,
+ "modifiers": [
+ {"name": "mu", "type": "normfactor", "data": None}
+ ],
+ },
+ {
+ "name": "background",
+ "data": bkg,
+ "modifiers": [
+ {
+ "name": "correlated_bkg_uncertainty",
+ "type": "histosys",
+ "data": {"hi_data": bkg_up, "lo_data": bkg_down},
+ }
+ ],
+ },
+ ],
+ }
+ ]
+ }
+ return Model(spec, batch_size=batch_size)
+
+
def hepdata_like(signal_data, bkg_data, bkg_uncerts, batch_size=None):
"""
Construct a simple single channel :class:`~pyhf.pdf.Model` with a
|
{"golden_diff": "diff --git a/src/pyhf/simplemodels.py b/src/pyhf/simplemodels.py\n--- a/src/pyhf/simplemodels.py\n+++ b/src/pyhf/simplemodels.py\n@@ -1,12 +1,81 @@\n from . import Model\n \n-__all__ = [\"hepdata_like\"]\n+__all__ = [\"correlated_background\", \"hepdata_like\"]\n \n \n def __dir__():\n return __all__\n \n \n+def correlated_background(signal, bkg, bkg_up, bkg_down, batch_size=None):\n+ r\"\"\"\n+ Construct a simple single channel :class:`~pyhf.pdf.Model` with a\n+ :class:`~pyhf.modifiers.histosys` modifier representing a background\n+ with a fully correlated bin-by-bin uncertainty.\n+\n+ Args:\n+ signal (:obj:`list`): The data in the signal sample.\n+ bkg (:obj:`list`): The data in the background sample.\n+ bkg_up (:obj:`list`): The background sample under an upward variation\n+ corresponding to :math:`\\alpha=+1`.\n+ bkg_down (:obj:`list`): The background sample under a downward variation\n+ corresponding to :math:`\\alpha=-1`.\n+ batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute.\n+\n+ Returns:\n+ ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema.\n+\n+ Example:\n+ >>> import pyhf\n+ >>> pyhf.set_backend(\"numpy\")\n+ >>> model = pyhf.simplemodels.correlated_background(\n+ ... signal=[12.0, 11.0],\n+ ... bkg=[50.0, 52.0],\n+ ... bkg_up=[45.0, 57.0],\n+ ... bkg_down=[55.0, 47.0],\n+ ... )\n+ >>> model.schema\n+ 'model.json'\n+ >>> model.config.channels\n+ ['single_channel']\n+ >>> model.config.samples\n+ ['background', 'signal']\n+ >>> model.config.parameters\n+ ['correlated_bkg_uncertainty', 'mu']\n+ >>> model.expected_data(model.config.suggested_init())\n+ array([62., 63., 0.])\n+\n+ \"\"\"\n+ spec = {\n+ \"channels\": [\n+ {\n+ \"name\": \"single_channel\",\n+ \"samples\": [\n+ {\n+ \"name\": \"signal\",\n+ \"data\": signal,\n+ \"modifiers\": [\n+ {\"name\": \"mu\", \"type\": \"normfactor\", \"data\": None}\n+ ],\n+ },\n+ {\n+ \"name\": \"background\",\n+ \"data\": bkg,\n+ \"modifiers\": [\n+ {\n+ \"name\": \"correlated_bkg_uncertainty\",\n+ \"type\": \"histosys\",\n+ \"data\": {\"hi_data\": bkg_up, \"lo_data\": bkg_down},\n+ }\n+ ],\n+ },\n+ ],\n+ }\n+ ]\n+ }\n+ return Model(spec, batch_size=batch_size)\n+\n+\n def hepdata_like(signal_data, bkg_data, bkg_uncerts, batch_size=None):\n \"\"\"\n Construct a simple single channel :class:`~pyhf.pdf.Model` with a\n", "issue": "Correlated background simplemodel\nsomething like\r\n\r\n```python\r\ndef simplemodel2(s,b_up,b_nom,b_dn):\r\n spec = {\r\n 'channels': [\r\n {\r\n 'name': 'singlechannel',\r\n 'samples': [\r\n {\r\n 'name': 'signal',\r\n 'data': s,\r\n 'modifiers': [{'name': 'mu', 'type': 'normfactor', 'data': None}]\r\n },\r\n {'name': 'background',\r\n 'data': b_nom,\r\n 'modifiers': [\r\n {\r\n 'name': 'uncorr_bkguncrt',\r\n 'type': 'histosys',\r\n 'data': {\r\n 'hi_data': b_up,\r\n 'lo_data': b_dn\r\n }\r\n }\r\n ]\r\n }\r\n ]\r\n }\r\n ]\r\n }\r\n return pyhf.Model(spec)\r\n\r\n```\r\n\r\nwith an API like `pyhf.simplemodels.correlated_bkg`\n", "before_files": [{"content": "from . import Model\n\n__all__ = [\"hepdata_like\"]\n\n\ndef __dir__():\n return __all__\n\n\ndef hepdata_like(signal_data, bkg_data, bkg_uncerts, batch_size=None):\n \"\"\"\n Construct a simple single channel :class:`~pyhf.pdf.Model` with a\n :class:`~pyhf.modifiers.shapesys` modifier representing an uncorrelated\n background uncertainty.\n\n Example:\n >>> import pyhf\n >>> pyhf.set_backend(\"numpy\")\n >>> model = pyhf.simplemodels.hepdata_like(\n ... 
signal_data=[12.0, 11.0], bkg_data=[50.0, 52.0], bkg_uncerts=[3.0, 7.0]\n ... )\n >>> model.schema\n 'model.json'\n >>> model.config.channels\n ['singlechannel']\n >>> model.config.samples\n ['background', 'signal']\n >>> model.config.parameters\n ['mu', 'uncorr_bkguncrt']\n >>> model.expected_data(model.config.suggested_init())\n array([ 62. , 63. , 277.77777778, 55.18367347])\n\n Args:\n signal_data (:obj:`list`): The data in the signal sample\n bkg_data (:obj:`list`): The data in the background sample\n bkg_uncerts (:obj:`list`): The statistical uncertainty on the background sample counts\n batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute\n\n Returns:\n ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema\n\n \"\"\"\n spec = {\n 'channels': [\n {\n 'name': 'singlechannel',\n 'samples': [\n {\n 'name': 'signal',\n 'data': signal_data,\n 'modifiers': [\n {'name': 'mu', 'type': 'normfactor', 'data': None}\n ],\n },\n {\n 'name': 'background',\n 'data': bkg_data,\n 'modifiers': [\n {\n 'name': 'uncorr_bkguncrt',\n 'type': 'shapesys',\n 'data': bkg_uncerts,\n }\n ],\n },\n ],\n }\n ]\n }\n return Model(spec, batch_size=batch_size)\n", "path": "src/pyhf/simplemodels.py"}], "after_files": [{"content": "from . import Model\n\n__all__ = [\"correlated_background\", \"hepdata_like\"]\n\n\ndef __dir__():\n return __all__\n\n\ndef correlated_background(signal, bkg, bkg_up, bkg_down, batch_size=None):\n r\"\"\"\n Construct a simple single channel :class:`~pyhf.pdf.Model` with a\n :class:`~pyhf.modifiers.histosys` modifier representing a background\n with a fully correlated bin-by-bin uncertainty.\n\n Args:\n signal (:obj:`list`): The data in the signal sample.\n bkg (:obj:`list`): The data in the background sample.\n bkg_up (:obj:`list`): The background sample under an upward variation\n corresponding to :math:`\\alpha=+1`.\n bkg_down (:obj:`list`): The background sample under a downward variation\n corresponding to :math:`\\alpha=-1`.\n batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute.\n\n Returns:\n ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema.\n\n Example:\n >>> import pyhf\n >>> pyhf.set_backend(\"numpy\")\n >>> model = pyhf.simplemodels.correlated_background(\n ... signal=[12.0, 11.0],\n ... bkg=[50.0, 52.0],\n ... bkg_up=[45.0, 57.0],\n ... bkg_down=[55.0, 47.0],\n ... 
)\n >>> model.schema\n 'model.json'\n >>> model.config.channels\n ['single_channel']\n >>> model.config.samples\n ['background', 'signal']\n >>> model.config.parameters\n ['correlated_bkg_uncertainty', 'mu']\n >>> model.expected_data(model.config.suggested_init())\n array([62., 63., 0.])\n\n \"\"\"\n spec = {\n \"channels\": [\n {\n \"name\": \"single_channel\",\n \"samples\": [\n {\n \"name\": \"signal\",\n \"data\": signal,\n \"modifiers\": [\n {\"name\": \"mu\", \"type\": \"normfactor\", \"data\": None}\n ],\n },\n {\n \"name\": \"background\",\n \"data\": bkg,\n \"modifiers\": [\n {\n \"name\": \"correlated_bkg_uncertainty\",\n \"type\": \"histosys\",\n \"data\": {\"hi_data\": bkg_up, \"lo_data\": bkg_down},\n }\n ],\n },\n ],\n }\n ]\n }\n return Model(spec, batch_size=batch_size)\n\n\ndef hepdata_like(signal_data, bkg_data, bkg_uncerts, batch_size=None):\n \"\"\"\n Construct a simple single channel :class:`~pyhf.pdf.Model` with a\n :class:`~pyhf.modifiers.shapesys` modifier representing an uncorrelated\n background uncertainty.\n\n Example:\n >>> import pyhf\n >>> pyhf.set_backend(\"numpy\")\n >>> model = pyhf.simplemodels.hepdata_like(\n ... signal_data=[12.0, 11.0], bkg_data=[50.0, 52.0], bkg_uncerts=[3.0, 7.0]\n ... )\n >>> model.schema\n 'model.json'\n >>> model.config.channels\n ['singlechannel']\n >>> model.config.samples\n ['background', 'signal']\n >>> model.config.parameters\n ['mu', 'uncorr_bkguncrt']\n >>> model.expected_data(model.config.suggested_init())\n array([ 62. , 63. , 277.77777778, 55.18367347])\n\n Args:\n signal_data (:obj:`list`): The data in the signal sample\n bkg_data (:obj:`list`): The data in the background sample\n bkg_uncerts (:obj:`list`): The statistical uncertainty on the background sample counts\n batch_size (:obj:`None` or :obj:`int`): Number of simultaneous (batched) Models to compute\n\n Returns:\n ~pyhf.pdf.Model: The statistical model adhering to the :obj:`model.json` schema\n\n \"\"\"\n spec = {\n 'channels': [\n {\n 'name': 'singlechannel',\n 'samples': [\n {\n 'name': 'signal',\n 'data': signal_data,\n 'modifiers': [\n {'name': 'mu', 'type': 'normfactor', 'data': None}\n ],\n },\n {\n 'name': 'background',\n 'data': bkg_data,\n 'modifiers': [\n {\n 'name': 'uncorr_bkguncrt',\n 'type': 'shapesys',\n 'data': bkg_uncerts,\n }\n ],\n },\n ],\n }\n ]\n }\n return Model(spec, batch_size=batch_size)\n", "path": "src/pyhf/simplemodels.py"}]}
| 1,117 | 746 |
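A quick usage sketch for the `correlated_background` helper introduced by the golden diff in the record above. The call, arguments, and printed values are taken from the docstring example inside the diff itself; the only assumption is that the patched pyhf is installed.

```python
import pyhf

pyhf.set_backend("numpy")

# Single-channel model whose background carries a fully correlated
# bin-by-bin (histosys) uncertainty, as wired up by the patch.
model = pyhf.simplemodels.correlated_background(
    signal=[12.0, 11.0],
    bkg=[50.0, 52.0],
    bkg_up=[45.0, 57.0],
    bkg_down=[55.0, 47.0],
)
print(model.config.parameters)  # ['correlated_bkg_uncertainty', 'mu']
print(model.expected_data(model.config.suggested_init()))  # [62. 63.  0.]
```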
gh_patches_debug_432
|
rasdani/github-patches
|
git_diff
|
HybirdCorp__creme_crm-431
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[WIP] fix python3.9.12 3.9.13 ci issue
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `creme/__init__.py`
Content:
```
1 __version__ = '2.4-alpha1'
2
3 # App registry hooking ---------------------------------------------------------
4
5 try:
6 from django.apps.config import AppConfig
7 from django.apps.registry import Apps
8 except ImportError:
9 # This error may appear with old versions of setuptools during installation
10 import sys
11
12 sys.stderr.write(
13 'Django is not installed ; '
14 'ignore this message if you are installing Creme.'
15 )
16 else:
17 AppConfig.all_apps_ready = lambda self: None
18
19 _original_populate = Apps.populate
20
21 def _hooked_populate(self, installed_apps=None):
22 if self.ready:
23 return
24
25 if getattr(self, '_all_apps_ready', False):
26 return
27
28 _original_populate(self, installed_apps)
29
30 with self._lock:
31 if getattr(self, '_all_apps_ready', False):
32 return
33
34 for app_config in self.get_app_configs():
35 app_config.all_apps_ready()
36
37 self._all_apps_ready = True
38
39 Apps.populate = _hooked_populate
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/creme/__init__.py b/creme/__init__.py
--- a/creme/__init__.py
+++ b/creme/__init__.py
@@ -1,5 +1,10 @@
__version__ = '2.4-alpha1'
+
+def get_version():
+ return __version__
+
+
# App registry hooking ---------------------------------------------------------
try:
|
{"golden_diff": "diff --git a/creme/__init__.py b/creme/__init__.py\n--- a/creme/__init__.py\n+++ b/creme/__init__.py\n@@ -1,5 +1,10 @@\n __version__ = '2.4-alpha1'\n \n+\n+def get_version():\n+ return __version__\n+\n+\n # App registry hooking ---------------------------------------------------------\n \n try:\n", "issue": "[WIP] fix python3.9.12 3.9.13 ci issue\n\n", "before_files": [{"content": "__version__ = '2.4-alpha1'\n\n# App registry hooking ---------------------------------------------------------\n\ntry:\n from django.apps.config import AppConfig\n from django.apps.registry import Apps\nexcept ImportError:\n # This error may appear with old versions of setuptools during installation\n import sys\n\n sys.stderr.write(\n 'Django is not installed ; '\n 'ignore this message if you are installing Creme.'\n )\nelse:\n AppConfig.all_apps_ready = lambda self: None\n\n _original_populate = Apps.populate\n\n def _hooked_populate(self, installed_apps=None):\n if self.ready:\n return\n\n if getattr(self, '_all_apps_ready', False):\n return\n\n _original_populate(self, installed_apps)\n\n with self._lock:\n if getattr(self, '_all_apps_ready', False):\n return\n\n for app_config in self.get_app_configs():\n app_config.all_apps_ready()\n\n self._all_apps_ready = True\n\n Apps.populate = _hooked_populate\n", "path": "creme/__init__.py"}], "after_files": [{"content": "__version__ = '2.4-alpha1'\n\n\ndef get_version():\n return __version__\n\n\n# App registry hooking ---------------------------------------------------------\n\ntry:\n from django.apps.config import AppConfig\n from django.apps.registry import Apps\nexcept ImportError:\n # This error may appear with old versions of setuptools during installation\n import sys\n\n sys.stderr.write(\n 'Django is not installed ; '\n 'ignore this message if you are installing Creme.'\n )\nelse:\n AppConfig.all_apps_ready = lambda self: None\n\n _original_populate = Apps.populate\n\n def _hooked_populate(self, installed_apps=None):\n if self.ready:\n return\n\n if getattr(self, '_all_apps_ready', False):\n return\n\n _original_populate(self, installed_apps)\n\n with self._lock:\n if getattr(self, '_all_apps_ready', False):\n return\n\n for app_config in self.get_app_configs():\n app_config.all_apps_ready()\n\n self._all_apps_ready = True\n\n Apps.populate = _hooked_populate\n", "path": "creme/__init__.py"}]}
| 573 | 87 |
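The record above only says the patch fixes a Python 3.9.12/3.9.13 CI issue, so the motivation sketched below is an assumption: exposing `get_version()` lets callers read the version without poking at module attributes directly. A minimal consumer against the patched package:

```python
# Hypothetical consumer of the accessor added by the diff above.
import creme

assert creme.get_version() == creme.__version__  # '2.4-alpha1'
print(creme.get_version())
```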
gh_patches_debug_4679
|
rasdani/github-patches
|
git_diff
|
pyg-team__pytorch_geometric-8343
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Problem with torch_geometric.transforms
### 🐛 Describe the bug
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
[<ipython-input-20-2b41d296395c>](https://localhost:8080/#) in <cell line: 7>()
5 import torch.nn as nn
6 import torch.nn.functional as F
----> 7 import torch_geometric.transforms as T
8 from tqdm.auto import tqdm
9
3 frames
[/usr/local/lib/python3.10/dist-packages/torch_geometric/__init__.py](https://localhost:8080/#) in <module>
----> 1 import torch_geometric.utils
2 import torch_geometric.data
3 import torch_geometric.sampler
4 import torch_geometric.loader
5 import torch_geometric.transforms
[/usr/local/lib/python3.10/dist-packages/torch_geometric/utils/__init__.py](https://localhost:8080/#) in <module>
1 import copy
2
----> 3 from .scatter import scatter, group_argsort
4 from .segment import segment
5 from .sort import index_sort
[/usr/local/lib/python3.10/dist-packages/torch_geometric/utils/scatter.py](https://localhost:8080/#) in <module>
5
6 import torch_geometric.typing
----> 7 from torch_geometric import warnings
8 from torch_geometric.typing import torch_scatter
9 from torch_geometric.utils.functions import cumsum
[/usr/local/lib/python3.10/dist-packages/torch_geometric/warnings.py](https://localhost:8080/#) in <module>
3 import torch_geometric
4
----> 5 if torch_geometric.typing.WITH_PT20: # pragma: no cover
6 from torch._dynamo import is_compiling as _is_compiling
7 else:
AttributeError: partially initialized module 'torch_geometric' has no attribute 'typing' (most likely due to a circular import)
### Environment
* PyG version: 2.4.0
* PyTorch version: 2.1.0+cu118
* OS: Windows
* Python version: 3.10.12
* CUDA/cuDNN version:
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
cuDNN version: 8900
* How you installed PyTorch and PyG (`conda`, `pip`, source): pip
* Any other relevant information (*e.g.*, version of `torch-scatter`):
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torch_geometric/warnings.py`
Content:
```
1 import warnings
2
3 import torch_geometric
4
5 if torch_geometric.typing.WITH_PT20: # pragma: no cover
6 from torch._dynamo import is_compiling as _is_compiling
7 else:
8
9 def _is_compiling() -> bool: # pragma: no cover
10 return False
11
12
13 def warn(message: str):
14 if _is_compiling():
15 return
16
17 warnings.warn(message)
18
19
20 def filterwarnings(action: str, message: str):
21 if _is_compiling():
22 return
23
24 warnings.filterwarnings(action, message)
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/torch_geometric/warnings.py b/torch_geometric/warnings.py
--- a/torch_geometric/warnings.py
+++ b/torch_geometric/warnings.py
@@ -1,13 +1,14 @@
import warnings
+import torch
+
import torch_geometric
-if torch_geometric.typing.WITH_PT20: # pragma: no cover
- from torch._dynamo import is_compiling as _is_compiling
-else:
- def _is_compiling() -> bool: # pragma: no cover
- return False
+def _is_compiling() -> bool: # pragma: no cover
+ if torch_geometric.typing.WITH_PT21:
+ return torch._dynamo.is_compiling()
+ return False
def warn(message: str):
|
{"golden_diff": "diff --git a/torch_geometric/warnings.py b/torch_geometric/warnings.py\n--- a/torch_geometric/warnings.py\n+++ b/torch_geometric/warnings.py\n@@ -1,13 +1,14 @@\n import warnings\n \n+import torch\n+\n import torch_geometric\n \n-if torch_geometric.typing.WITH_PT20: # pragma: no cover\n- from torch._dynamo import is_compiling as _is_compiling\n-else:\n \n- def _is_compiling() -> bool: # pragma: no cover\n- return False\n+def _is_compiling() -> bool: # pragma: no cover\n+ if torch_geometric.typing.WITH_PT21:\n+ return torch._dynamo.is_compiling()\n+ return False\n \n \n def warn(message: str):\n", "issue": "Problem with torch_geometric.transforms\n### \ud83d\udc1b Describe the bug\n\n---------------------------------------------------------------------------\r\nAttributeError Traceback (most recent call last)\r\n[<ipython-input-20-2b41d296395c>](https://localhost:8080/#) in <cell line: 7>()\r\n 5 import torch.nn as nn\r\n 6 import torch.nn.functional as F\r\n----> 7 import torch_geometric.transforms as T\r\n 8 from tqdm.auto import tqdm\r\n 9 \r\n\r\n3 frames\r\n[/usr/local/lib/python3.10/dist-packages/torch_geometric/__init__.py](https://localhost:8080/#) in <module>\r\n----> 1 import torch_geometric.utils\r\n 2 import torch_geometric.data\r\n 3 import torch_geometric.sampler\r\n 4 import torch_geometric.loader\r\n 5 import torch_geometric.transforms\r\n\r\n[/usr/local/lib/python3.10/dist-packages/torch_geometric/utils/__init__.py](https://localhost:8080/#) in <module>\r\n 1 import copy\r\n 2 \r\n----> 3 from .scatter import scatter, group_argsort\r\n 4 from .segment import segment\r\n 5 from .sort import index_sort\r\n\r\n[/usr/local/lib/python3.10/dist-packages/torch_geometric/utils/scatter.py](https://localhost:8080/#) in <module>\r\n 5 \r\n 6 import torch_geometric.typing\r\n----> 7 from torch_geometric import warnings\r\n 8 from torch_geometric.typing import torch_scatter\r\n 9 from torch_geometric.utils.functions import cumsum\r\n\r\n[/usr/local/lib/python3.10/dist-packages/torch_geometric/warnings.py](https://localhost:8080/#) in <module>\r\n 3 import torch_geometric\r\n 4 \r\n----> 5 if torch_geometric.typing.WITH_PT20: # pragma: no cover\r\n 6 from torch._dynamo import is_compiling as _is_compiling\r\n 7 else:\r\n\r\nAttributeError: partially initialized module 'torch_geometric' has no attribute 'typing' (most likely due to a circular import)\n\n### Environment\n\n* PyG version: 2.4.0\r\n* PyTorch version: 2.1.0+cu118\r\n* OS: Windows\r\n* Python version: 3.10.12\r\n* CUDA/cuDNN version:\r\nCuda compilation tools, release 11.8, V11.8.89\r\nBuild cuda_11.8.r11.8/compiler.31833905_0\r\ncuDNN version: 8900\r\n* How you installed PyTorch and PyG (`conda`, `pip`, source): pip\r\n* Any other relevant information (*e.g.*, version of `torch-scatter`):\r\n\n", "before_files": [{"content": "import warnings\n\nimport torch_geometric\n\nif torch_geometric.typing.WITH_PT20: # pragma: no cover\n from torch._dynamo import is_compiling as _is_compiling\nelse:\n\n def _is_compiling() -> bool: # pragma: no cover\n return False\n\n\ndef warn(message: str):\n if _is_compiling():\n return\n\n warnings.warn(message)\n\n\ndef filterwarnings(action: str, message: str):\n if _is_compiling():\n return\n\n warnings.filterwarnings(action, message)\n", "path": "torch_geometric/warnings.py"}], "after_files": [{"content": "import warnings\n\nimport torch\n\nimport torch_geometric\n\n\ndef _is_compiling() -> bool: # pragma: no cover\n if torch_geometric.typing.WITH_PT21:\n return 
torch._dynamo.is_compiling()\n return False\n\n\ndef warn(message: str):\n if _is_compiling():\n return\n\n warnings.warn(message)\n\n\ndef filterwarnings(action: str, message: str):\n if _is_compiling():\n return\n\n warnings.filterwarnings(action, message)\n", "path": "torch_geometric/warnings.py"}]}
| 1,061 | 182 |
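The bug in the record above was an import-time lookup of `torch_geometric.typing` while the package was still half-initialized; the patch defers that lookup into the body of `_is_compiling()`. A smoke test of the patched behavior (assumes the fixed torch_geometric is installed):

```python
import torch_geometric.transforms as T  # previously died with AttributeError

from torch_geometric.warnings import warn

# _is_compiling() now resolves torch_geometric.typing lazily, at call
# time, so ordinary (non-torch.compile) code paths just emit the warning.
warn("smoke-test warning")
```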
gh_patches_debug_18838
|
rasdani/github-patches
|
git_diff
|
ytdl-org__youtube-dl-13606
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SITE REQUEST: www.5-tv.ru
Dear coders, I just visited the 5-tv.ru site and wanted to get a news video from it, with no success.
Please add 5-tv.ru to the YDL supported sources list. Thanks in advance!
The log is:
C:\>youtube-dl -v -F http://www.5-tv.ru/programs/broadcast/509514/
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: ['-v', '-F', 'http://www.5-tv.ru/programs/broadcast/509514/']
[debug] Encodings: locale cp1251, fs mbcs, out cp866, pref cp1251
[debug] youtube-dl version 2017.06.12
[debug] Python version 3.4.4 - Windows-7-6.1.7601-SP1
[debug] exe versions: ffmpeg 3.3.1, ffprobe 3.3.1, rtmpdump 2.4
[debug] Proxy map: {}
[FiveTV] 509514: Downloading webpage
ERROR: Unable to extract video url; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
Traceback (most recent call last):
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpkyaecyzu\build\youtube_dl\YoutubeDL.py", line 762,in extract_info
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpkyaecyzu\build\youtube_dl\extractor\common.py", line 433, in extract
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpkyaecyzu\build\youtube_dl\extractor\fivetv.py", line 74, in _real_extract
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpkyaecyzu\build\youtube_dl\extractor\common.py", line 782, in _search_regex
youtube_dl.utils.RegexNotFoundError: Unable to extract video url; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `youtube_dl/extractor/fivetv.py`
Content:
```
1 # coding: utf-8
2 from __future__ import unicode_literals
3
4 import re
5
6 from .common import InfoExtractor
7 from ..utils import int_or_none
8
9
10 class FiveTVIE(InfoExtractor):
11 _VALID_URL = r'''(?x)
12 http://
13 (?:www\.)?5-tv\.ru/
14 (?:
15 (?:[^/]+/)+(?P<id>\d+)|
16 (?P<path>[^/?#]+)(?:[/?#])?
17 )
18 '''
19
20 _TESTS = [{
21 'url': 'http://5-tv.ru/news/96814/',
22 'md5': 'bbff554ad415ecf5416a2f48c22d9283',
23 'info_dict': {
24 'id': '96814',
25 'ext': 'mp4',
26 'title': 'Россияне выбрали имя для общенациональной платежной системы',
27 'description': 'md5:a8aa13e2b7ad36789e9f77a74b6de660',
28 'thumbnail': r're:^https?://.*\.jpg$',
29 'duration': 180,
30 },
31 }, {
32 'url': 'http://5-tv.ru/video/1021729/',
33 'info_dict': {
34 'id': '1021729',
35 'ext': 'mp4',
36 'title': '3D принтер',
37 'description': 'md5:d76c736d29ef7ec5c0cf7d7c65ffcb41',
38 'thumbnail': r're:^https?://.*\.jpg$',
39 'duration': 180,
40 },
41 }, {
42 'url': 'http://www.5-tv.ru/glavnoe/#itemDetails',
43 'info_dict': {
44 'id': 'glavnoe',
45 'ext': 'mp4',
46 'title': 'Итоги недели с 8 по 14 июня 2015 года',
47 'thumbnail': r're:^https?://.*\.jpg$',
48 },
49 }, {
50 'url': 'http://www.5-tv.ru/glavnoe/broadcasts/508645/',
51 'only_matching': True,
52 }, {
53 'url': 'http://5-tv.ru/films/1507502/',
54 'only_matching': True,
55 }, {
56 'url': 'http://5-tv.ru/programs/broadcast/508713/',
57 'only_matching': True,
58 }, {
59 'url': 'http://5-tv.ru/angel/',
60 'only_matching': True,
61 }, {
62 'url': 'http://www.5-tv.ru/schedule/?iframe=true&width=900&height=450',
63 'only_matching': True,
64 }]
65
66 def _real_extract(self, url):
67 mobj = re.match(self._VALID_URL, url)
68 video_id = mobj.group('id') or mobj.group('path')
69
70 webpage = self._download_webpage(url, video_id)
71
72 video_url = self._search_regex(
73 r'<a[^>]+?href="([^"]+)"[^>]+?class="videoplayer"',
74 webpage, 'video url')
75
76 title = self._og_search_title(webpage, default=None) or self._search_regex(
77 r'<title>([^<]+)</title>', webpage, 'title')
78 duration = int_or_none(self._og_search_property(
79 'video:duration', webpage, 'duration', default=None))
80
81 return {
82 'id': video_id,
83 'url': video_url,
84 'title': title,
85 'description': self._og_search_description(webpage, default=None),
86 'thumbnail': self._og_search_thumbnail(webpage, default=None),
87 'duration': duration,
88 }
89
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/youtube_dl/extractor/fivetv.py b/youtube_dl/extractor/fivetv.py
--- a/youtube_dl/extractor/fivetv.py
+++ b/youtube_dl/extractor/fivetv.py
@@ -43,7 +43,7 @@
'info_dict': {
'id': 'glavnoe',
'ext': 'mp4',
- 'title': 'Итоги недели с 8 по 14 июня 2015 года',
+ 'title': r're:^Итоги недели с \d+ по \d+ \w+ \d{4} года$',
'thumbnail': r're:^https?://.*\.jpg$',
},
}, {
@@ -70,7 +70,8 @@
webpage = self._download_webpage(url, video_id)
video_url = self._search_regex(
- r'<a[^>]+?href="([^"]+)"[^>]+?class="videoplayer"',
+ [r'<div[^>]+?class="flowplayer[^>]+?data-href="([^"]+)"',
+ r'<a[^>]+?href="([^"]+)"[^>]+?class="videoplayer"'],
webpage, 'video url')
title = self._og_search_title(webpage, default=None) or self._search_regex(
|
{"golden_diff": "diff --git a/youtube_dl/extractor/fivetv.py b/youtube_dl/extractor/fivetv.py\n--- a/youtube_dl/extractor/fivetv.py\n+++ b/youtube_dl/extractor/fivetv.py\n@@ -43,7 +43,7 @@\n 'info_dict': {\n 'id': 'glavnoe',\n 'ext': 'mp4',\n- 'title': '\u0418\u0442\u043e\u0433\u0438 \u043d\u0435\u0434\u0435\u043b\u0438 \u0441 8 \u043f\u043e 14 \u0438\u044e\u043d\u044f 2015 \u0433\u043e\u0434\u0430',\n+ 'title': r're:^\u0418\u0442\u043e\u0433\u0438 \u043d\u0435\u0434\u0435\u043b\u0438 \u0441\u00a0\\d+ \u043f\u043e\u00a0\\d+\u00a0\\w+\u00a0\\d{4}\u00a0\u0433\u043e\u0434\u0430$',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n },\n }, {\n@@ -70,7 +70,8 @@\n webpage = self._download_webpage(url, video_id)\n \n video_url = self._search_regex(\n- r'<a[^>]+?href=\"([^\"]+)\"[^>]+?class=\"videoplayer\"',\n+ [r'<div[^>]+?class=\"flowplayer[^>]+?data-href=\"([^\"]+)\"',\n+ r'<a[^>]+?href=\"([^\"]+)\"[^>]+?class=\"videoplayer\"'],\n webpage, 'video url')\n \n title = self._og_search_title(webpage, default=None) or self._search_regex(\n", "issue": "SITE REQUEST: www.5-tv.ru\nDear coders, I just visited this 5-tv.ru site, and I wanted to get news video from it, got no success.\r\nPlease, add this 5-tv.ru into YDL supported sources list. Thanks in advance!\r\nThe log is:\r\nC:\\>youtube-dl -v -F http://www.5-tv.ru/programs/broadcast/509514/\r\n[debug] System config: []\r\n[debug] User config: []\r\n[debug] Custom config: []\r\n[debug] Command-line args: ['-v', '-F', 'http://www.5-tv.ru/programs/broadcast/509514/']\r\n[debug] Encodings: locale cp1251, fs mbcs, out cp866, pref cp1251\r\n[debug] youtube-dl version 2017.06.12\r\n[debug] Python version 3.4.4 - Windows-7-6.1.7601-SP1\r\n[debug] exe versions: ffmpeg 3.3.1, ffprobe 3.3.1, rtmpdump 2.4\r\n[debug] Proxy map: {}\r\n[FiveTV] 509514: Downloading webpage\r\nERROR: Unable to extract video url; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its completeoutput.\r\nTraceback (most recent call last):\r\n File \"C:\\Users\\dst\\AppData\\Roaming\\Build archive\\youtube-dl\\rg3\\tmpkyaecyzu\\build\\youtube_dl\\YoutubeDL.py\", line 762,in extract_info\r\n File \"C:\\Users\\dst\\AppData\\Roaming\\Build archive\\youtube-dl\\rg3\\tmpkyaecyzu\\build\\youtube_dl\\extractor\\common.py\", line 433, in extract\r\n File \"C:\\Users\\dst\\AppData\\Roaming\\Build archive\\youtube-dl\\rg3\\tmpkyaecyzu\\build\\youtube_dl\\extractor\\fivetv.py\", line 74, in _real_extract\r\n File \"C:\\Users\\dst\\AppData\\Roaming\\Build archive\\youtube-dl\\rg3\\tmpkyaecyzu\\build\\youtube_dl\\extractor\\common.py\", line 782, in _search_regex\r\nyoutube_dl.utils.RegexNotFoundError: Unable to extract video url; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. 
Be sure to call youtube-dl with the --verbose flag and include its complete output.\n", "before_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals\n\nimport re\n\nfrom .common import InfoExtractor\nfrom ..utils import int_or_none\n\n\nclass FiveTVIE(InfoExtractor):\n _VALID_URL = r'''(?x)\n http://\n (?:www\\.)?5-tv\\.ru/\n (?:\n (?:[^/]+/)+(?P<id>\\d+)|\n (?P<path>[^/?#]+)(?:[/?#])?\n )\n '''\n\n _TESTS = [{\n 'url': 'http://5-tv.ru/news/96814/',\n 'md5': 'bbff554ad415ecf5416a2f48c22d9283',\n 'info_dict': {\n 'id': '96814',\n 'ext': 'mp4',\n 'title': '\u0420\u043e\u0441\u0441\u0438\u044f\u043d\u0435 \u0432\u044b\u0431\u0440\u0430\u043b\u0438 \u0438\u043c\u044f \u0434\u043b\u044f \u043e\u0431\u0449\u0435\u043d\u0430\u0446\u0438\u043e\u043d\u0430\u043b\u044c\u043d\u043e\u0439 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u043e\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b',\n 'description': 'md5:a8aa13e2b7ad36789e9f77a74b6de660',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n 'duration': 180,\n },\n }, {\n 'url': 'http://5-tv.ru/video/1021729/',\n 'info_dict': {\n 'id': '1021729',\n 'ext': 'mp4',\n 'title': '3D \u043f\u0440\u0438\u043d\u0442\u0435\u0440',\n 'description': 'md5:d76c736d29ef7ec5c0cf7d7c65ffcb41',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n 'duration': 180,\n },\n }, {\n 'url': 'http://www.5-tv.ru/glavnoe/#itemDetails',\n 'info_dict': {\n 'id': 'glavnoe',\n 'ext': 'mp4',\n 'title': '\u0418\u0442\u043e\u0433\u0438 \u043d\u0435\u0434\u0435\u043b\u0438 \u0441 8 \u043f\u043e 14 \u0438\u044e\u043d\u044f 2015 \u0433\u043e\u0434\u0430',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n },\n }, {\n 'url': 'http://www.5-tv.ru/glavnoe/broadcasts/508645/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/films/1507502/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/programs/broadcast/508713/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/angel/',\n 'only_matching': True,\n }, {\n 'url': 'http://www.5-tv.ru/schedule/?iframe=true&width=900&height=450',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n mobj = re.match(self._VALID_URL, url)\n video_id = mobj.group('id') or mobj.group('path')\n\n webpage = self._download_webpage(url, video_id)\n\n video_url = self._search_regex(\n r'<a[^>]+?href=\"([^\"]+)\"[^>]+?class=\"videoplayer\"',\n webpage, 'video url')\n\n title = self._og_search_title(webpage, default=None) or self._search_regex(\n r'<title>([^<]+)</title>', webpage, 'title')\n duration = int_or_none(self._og_search_property(\n 'video:duration', webpage, 'duration', default=None))\n\n return {\n 'id': video_id,\n 'url': video_url,\n 'title': title,\n 'description': self._og_search_description(webpage, default=None),\n 'thumbnail': self._og_search_thumbnail(webpage, default=None),\n 'duration': duration,\n }\n", "path": "youtube_dl/extractor/fivetv.py"}], "after_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals\n\nimport re\n\nfrom .common import InfoExtractor\nfrom ..utils import int_or_none\n\n\nclass FiveTVIE(InfoExtractor):\n _VALID_URL = r'''(?x)\n http://\n (?:www\\.)?5-tv\\.ru/\n (?:\n (?:[^/]+/)+(?P<id>\\d+)|\n (?P<path>[^/?#]+)(?:[/?#])?\n )\n '''\n\n _TESTS = [{\n 'url': 'http://5-tv.ru/news/96814/',\n 'md5': 'bbff554ad415ecf5416a2f48c22d9283',\n 'info_dict': {\n 'id': '96814',\n 'ext': 'mp4',\n 'title': '\u0420\u043e\u0441\u0441\u0438\u044f\u043d\u0435 \u0432\u044b\u0431\u0440\u0430\u043b\u0438 \u0438\u043c\u044f \u0434\u043b\u044f 
\u043e\u0431\u0449\u0435\u043d\u0430\u0446\u0438\u043e\u043d\u0430\u043b\u044c\u043d\u043e\u0439 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u043e\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b',\n 'description': 'md5:a8aa13e2b7ad36789e9f77a74b6de660',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n 'duration': 180,\n },\n }, {\n 'url': 'http://5-tv.ru/video/1021729/',\n 'info_dict': {\n 'id': '1021729',\n 'ext': 'mp4',\n 'title': '3D \u043f\u0440\u0438\u043d\u0442\u0435\u0440',\n 'description': 'md5:d76c736d29ef7ec5c0cf7d7c65ffcb41',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n 'duration': 180,\n },\n }, {\n 'url': 'http://www.5-tv.ru/glavnoe/#itemDetails',\n 'info_dict': {\n 'id': 'glavnoe',\n 'ext': 'mp4',\n 'title': r're:^\u0418\u0442\u043e\u0433\u0438 \u043d\u0435\u0434\u0435\u043b\u0438 \u0441\u00a0\\d+ \u043f\u043e\u00a0\\d+\u00a0\\w+\u00a0\\d{4}\u00a0\u0433\u043e\u0434\u0430$',\n 'thumbnail': r're:^https?://.*\\.jpg$',\n },\n }, {\n 'url': 'http://www.5-tv.ru/glavnoe/broadcasts/508645/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/films/1507502/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/programs/broadcast/508713/',\n 'only_matching': True,\n }, {\n 'url': 'http://5-tv.ru/angel/',\n 'only_matching': True,\n }, {\n 'url': 'http://www.5-tv.ru/schedule/?iframe=true&width=900&height=450',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n mobj = re.match(self._VALID_URL, url)\n video_id = mobj.group('id') or mobj.group('path')\n\n webpage = self._download_webpage(url, video_id)\n\n video_url = self._search_regex(\n [r'<div[^>]+?class=\"flowplayer[^>]+?data-href=\"([^\"]+)\"',\n r'<a[^>]+?href=\"([^\"]+)\"[^>]+?class=\"videoplayer\"'],\n webpage, 'video url')\n\n title = self._og_search_title(webpage, default=None) or self._search_regex(\n r'<title>([^<]+)</title>', webpage, 'title')\n duration = int_or_none(self._og_search_property(\n 'video:duration', webpage, 'duration', default=None))\n\n return {\n 'id': video_id,\n 'url': video_url,\n 'title': title,\n 'description': self._og_search_description(webpage, default=None),\n 'thumbnail': self._og_search_thumbnail(webpage, default=None),\n 'duration': duration,\n }\n", "path": "youtube_dl/extractor/fivetv.py"}]}
| 1,933 | 310 |
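The patched extractor above passes a *list* of regexes so the newer flowplayer markup is tried before the legacy anchor tag. Below is a standalone sketch of that first-match-wins idea using plain `re`; the patterns are copied from the diff, while the HTML snippet and helper name are made up for illustration.

```python
import re

PATTERNS = [
    r'<div[^>]+?class="flowplayer[^>]+?data-href="([^"]+)"',
    r'<a[^>]+?href="([^"]+)"[^>]+?class="videoplayer"',
]


def find_video_url(webpage):
    # Try each pattern in order and return the first capture, roughly
    # how InfoExtractor._search_regex treats a list of patterns.
    for pattern in PATTERNS:
        mobj = re.search(pattern, webpage)
        if mobj:
            return mobj.group(1)
    return None


html = '<div id="x" class="flowplayer" data-href="http://example.com/v.mp4">'
print(find_video_url(html))  # http://example.com/v.mp4
```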
gh_patches_debug_47839
|
rasdani/github-patches
|
git_diff
|
holoviz__panel-3157
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add files to `__init__` for autocomplete in VS Code
When writing in VS Code, the autocomplete only works for modules imported in an `__init__.py` file, e.g. `pn.widgets.IntSlider` works but `pn.viewable.Viewer` does not. See here:
https://user-images.githubusercontent.com/19758978/150685703-a235b219-6052-4e6e-b1f5-b121dc1f1558.mp4
The solution is pretty easy as `.viewable` only needs to be added to the `__init__`.
https://user-images.githubusercontent.com/19758978/150685758-3b1e5468-bcbe-4337-a62a-f3a4da8d9caf.mp4
I don't know if #3132 will fix this. When you have time @MarcSkovMadsen can you check this?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/__init__.py`
Content:
```
1 from . import layout # noqa
2 from . import links # noqa
3 from . import pane # noqa
4 from . import param # noqa
5 from . import pipeline # noqa
6 from . import widgets # noqa
7
8 from .config import config, panel_extension as extension, __version__ # noqa
9 from .depends import bind, depends # noqa
10 from .interact import interact # noqa
11 from .io import _jupyter_server_extension_paths, ipywidget, serve, state # noqa
12 from .layout import ( # noqa
13 Accordion, Card, Column, GridSpec, GridBox, FlexBox, Tabs, Row,
14 Spacer, WidgetBox
15 )
16 from .pane import panel, Pane # noqa
17 from .param import Param # noqa
18 from .template import Template # noqa
19 from .widgets import indicators # noqa
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/panel/__init__.py b/panel/__init__.py
--- a/panel/__init__.py
+++ b/panel/__init__.py
@@ -3,6 +3,8 @@
from . import pane # noqa
from . import param # noqa
from . import pipeline # noqa
+from . import reactive # noqa
+from . import viewable # noqa
from . import widgets # noqa
from .config import config, panel_extension as extension, __version__ # noqa
|
{"golden_diff": "diff --git a/panel/__init__.py b/panel/__init__.py\n--- a/panel/__init__.py\n+++ b/panel/__init__.py\n@@ -3,6 +3,8 @@\n from . import pane # noqa\n from . import param # noqa\n from . import pipeline # noqa\n+from . import reactive # noqa\n+from . import viewable # noqa\n from . import widgets # noqa\n \n from .config import config, panel_extension as extension, __version__ # noqa\n", "issue": "Add files to `__init__` for autocomplete in VS Code\nWhen writing in vs code the autocomplete only works for modules imported in an `__init__.py` file, e.g. `pn.widgets.IntSlider` work but `pn.viewable.Viewer` does not. See here:\r\n\r\nhttps://user-images.githubusercontent.com/19758978/150685703-a235b219-6052-4e6e-b1f5-b121dc1f1558.mp4\r\n\r\nThe solution is pretty easy as `.viewable` only needs to be added to the `__init__`.\r\n\r\nhttps://user-images.githubusercontent.com/19758978/150685758-3b1e5468-bcbe-4337-a62a-f3a4da8d9caf.mp4\r\n\r\nI don't know if #3132 will fix this. When you have time @MarcSkovMadsen can you check this? \n", "before_files": [{"content": "from . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import widgets # noqa\n\nfrom .config import config, panel_extension as extension, __version__ # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import _jupyter_server_extension_paths, ipywidget, serve, state # noqa\nfrom .layout import ( # noqa\n Accordion, Card, Column, GridSpec, GridBox, FlexBox, Tabs, Row,\n Spacer, WidgetBox\n)\nfrom .pane import panel, Pane # noqa\nfrom .param import Param # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators # noqa\n", "path": "panel/__init__.py"}], "after_files": [{"content": "from . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import reactive # noqa\nfrom . import viewable # noqa\nfrom . import widgets # noqa\n\nfrom .config import config, panel_extension as extension, __version__ # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import _jupyter_server_extension_paths, ipywidget, serve, state # noqa\nfrom .layout import ( # noqa\n Accordion, Card, Column, GridSpec, GridBox, FlexBox, Tabs, Row,\n Spacer, WidgetBox\n)\nfrom .pane import panel, Pane # noqa\nfrom .param import Param # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators # noqa\n", "path": "panel/__init__.py"}]}
| 694 | 108 |
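The fix above simply re-exports `reactive` and `viewable` from the package root, which is what editor autocompletion keys on. A minimal check, assuming the patched panel is installed; `pn.viewable.Viewer` is the exact attribute the issue complains about:

```python
import panel as pn

# Both submodules now resolve as attributes of the bare `pn` import,
# with no explicit `import panel.viewable` needed.
print(pn.viewable.Viewer)
print(pn.reactive)
```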
gh_patches_debug_15842
|
rasdani/github-patches
|
git_diff
|
falconry__falcon-174
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update to_query_str to intelligently handle lists
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `falcon/util.py`
Content:
```
1 """Defines Falcon utility functions
2
3 Copyright 2013 by Rackspace Hosting, Inc.
4
5 Licensed under the Apache License, Version 2.0 (the "License");
6 you may not use this file except in compliance with the License.
7 You may obtain a copy of the License at
8
9 http://www.apache.org/licenses/LICENSE-2.0
10
11 Unless required by applicable law or agreed to in writing, software
12 distributed under the License is distributed on an "AS IS" BASIS,
13 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 See the License for the specific language governing permissions and
15 limitations under the License.
16
17 """
18
19 import datetime
20 import six
21
22 if six.PY3: # pragma nocover
23 from urllib.parse import quote as url_quote
24 else: # pragma nocover
25 from urllib import quote as url_quote
26
27
28 __all__ = ('dt_to_http', 'http_date_to_dt', 'to_query_str', 'percent_escape')
29
30
31 def dt_to_http(dt):
32 """Converts a datetime instance to an HTTP date string.
33
34 Args:
35 dt: A datetime object, assumed to be UTC
36
37 Returns:
38 An HTTP date string, e.g., "Tue, 15 Nov 1994 12:45:26 GMT". See
39 also: http://goo.gl/R7So4
40 """
41
42 # Tue, 15 Nov 1994 12:45:26 GMT
43 return dt.strftime('%a, %d %b %Y %H:%M:%S GMT')
44
45
46 def http_date_to_dt(http_date):
47 """Converts an HTTP date string to a datetime instance.
48
49 Args:
50 http_date: An HTTP date string, e.g., "Tue, 15 Nov 1994 12:45:26 GMT".
51
52 Returns:
53 A UTC datetime instance corresponding to the given HTTP date.
54 """
55
56 return datetime.datetime.strptime(
57 http_date, '%a, %d %b %Y %H:%M:%S %Z')
58
59
60 def to_query_str(params):
61 """Converts a dict of params to afaln actual query string.
62
63 Args:
64 params: dict of simple key-value types, where key is a string and
65 value is a string or something that can be converted into a
66 string.
67
68 Returns:
69 A URI query string starting with '?', or and empty string if there
70 are no params (the dict is empty).
71 """
72
73 if not params:
74 return ''
75
76 # PERF: This is faster than a list comprehension and join, mainly
77 # because it allows us to inline the value transform.
78 query_str = '?'
79 for k, v in params.items():
80 if v is True:
81 v = 'true'
82 elif v is False:
83 v = 'false'
84 else:
85 v = str(v)
86
87 query_str += k + '=' + v + '&'
88
89 return query_str[:-1]
90
91
92 def percent_escape(url):
93 """Percent-escape reserved characters in the given url.
94
95 Args:
96 url: A full or relative URL.
97
98 Returns:
99 An escaped version of the URL, excluding '/', ',' and ':'
100 characters. In Python 2, unicode URL strings will be first
101 encoded to a UTF-8 byte string to work around a urllib
102 bug.
103 """
104
105 # Convert the string so that urllib.quote does not complain
106 # if it actually has Unicode chars in it.
107 if not six.PY3 and isinstance(url, six.text_type): # pragma nocover
108 url = url.encode('utf-8')
109
110 return url_quote(url, safe='/:,=?&-_')
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/falcon/util.py b/falcon/util.py
--- a/falcon/util.py
+++ b/falcon/util.py
@@ -63,7 +63,8 @@
Args:
params: dict of simple key-value types, where key is a string and
value is a string or something that can be converted into a
- string.
+ string. If value is a list, it will be converted to a comma-
+ delimited string (e.g., thing=1,2,3)
Returns:
A URI query string starting with '?', or and empty string if there
@@ -81,6 +82,8 @@
v = 'true'
elif v is False:
v = 'false'
+ elif isinstance(v, list):
+ v = ','.join([str(i) for i in v])
else:
v = str(v)
|
{"golden_diff": "diff --git a/falcon/util.py b/falcon/util.py\n--- a/falcon/util.py\n+++ b/falcon/util.py\n@@ -63,7 +63,8 @@\n Args:\n params: dict of simple key-value types, where key is a string and\n value is a string or something that can be converted into a\n- string.\n+ string. If value is a list, it will be converted to a comma-\n+ delimited string (e.g., thing=1,2,3)\n \n Returns:\n A URI query string starting with '?', or and empty string if there\n@@ -81,6 +82,8 @@\n v = 'true'\n elif v is False:\n v = 'false'\n+ elif isinstance(v, list):\n+ v = ','.join([str(i) for i in v])\n else:\n v = str(v)\n", "issue": "Update to_query_str to intelligently handle lists\n\n", "before_files": [{"content": "\"\"\"Defines Falcon utility functions\n\nCopyright 2013 by Rackspace Hosting, Inc.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n\"\"\"\n\nimport datetime\nimport six\n\nif six.PY3: # pragma nocover\n from urllib.parse import quote as url_quote\nelse: # pragma nocover\n from urllib import quote as url_quote\n\n\n__all__ = ('dt_to_http', 'http_date_to_dt', 'to_query_str', 'percent_escape')\n\n\ndef dt_to_http(dt):\n \"\"\"Converts a datetime instance to an HTTP date string.\n\n Args:\n dt: A datetime object, assumed to be UTC\n\n Returns:\n An HTTP date string, e.g., \"Tue, 15 Nov 1994 12:45:26 GMT\". See\n also: http://goo.gl/R7So4\n \"\"\"\n\n # Tue, 15 Nov 1994 12:45:26 GMT\n return dt.strftime('%a, %d %b %Y %H:%M:%S GMT')\n\n\ndef http_date_to_dt(http_date):\n \"\"\"Converts an HTTP date string to a datetime instance.\n\n Args:\n http_date: An HTTP date string, e.g., \"Tue, 15 Nov 1994 12:45:26 GMT\".\n\n Returns:\n A UTC datetime instance corresponding to the given HTTP date.\n \"\"\"\n\n return datetime.datetime.strptime(\n http_date, '%a, %d %b %Y %H:%M:%S %Z')\n\n\ndef to_query_str(params):\n \"\"\"Converts a dict of params to afaln actual query string.\n\n Args:\n params: dict of simple key-value types, where key is a string and\n value is a string or something that can be converted into a\n string.\n\n Returns:\n A URI query string starting with '?', or and empty string if there\n are no params (the dict is empty).\n \"\"\"\n\n if not params:\n return ''\n\n # PERF: This is faster than a list comprehension and join, mainly\n # because it allows us to inline the value transform.\n query_str = '?'\n for k, v in params.items():\n if v is True:\n v = 'true'\n elif v is False:\n v = 'false'\n else:\n v = str(v)\n\n query_str += k + '=' + v + '&'\n\n return query_str[:-1]\n\n\ndef percent_escape(url):\n \"\"\"Percent-escape reserved characters in the given url.\n\n Args:\n url: A full or relative URL.\n\n Returns:\n An escaped version of the URL, excluding '/', ',' and ':'\n characters. 
In Python 2, unicode URL strings will be first\n encoded to a UTF-8 byte string to work around a urllib\n bug.\n \"\"\"\n\n # Convert the string so that urllib.quote does not complain\n # if it actually has Unicode chars in it.\n if not six.PY3 and isinstance(url, six.text_type): # pragma nocover\n url = url.encode('utf-8')\n\n return url_quote(url, safe='/:,=?&-_')\n", "path": "falcon/util.py"}], "after_files": [{"content": "\"\"\"Defines Falcon utility functions\n\nCopyright 2013 by Rackspace Hosting, Inc.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n\"\"\"\n\nimport datetime\nimport six\n\nif six.PY3: # pragma nocover\n from urllib.parse import quote as url_quote\nelse: # pragma nocover\n from urllib import quote as url_quote\n\n\n__all__ = ('dt_to_http', 'http_date_to_dt', 'to_query_str', 'percent_escape')\n\n\ndef dt_to_http(dt):\n \"\"\"Converts a datetime instance to an HTTP date string.\n\n Args:\n dt: A datetime object, assumed to be UTC\n\n Returns:\n An HTTP date string, e.g., \"Tue, 15 Nov 1994 12:45:26 GMT\". See\n also: http://goo.gl/R7So4\n \"\"\"\n\n # Tue, 15 Nov 1994 12:45:26 GMT\n return dt.strftime('%a, %d %b %Y %H:%M:%S GMT')\n\n\ndef http_date_to_dt(http_date):\n \"\"\"Converts an HTTP date string to a datetime instance.\n\n Args:\n http_date: An HTTP date string, e.g., \"Tue, 15 Nov 1994 12:45:26 GMT\".\n\n Returns:\n A UTC datetime instance corresponding to the given HTTP date.\n \"\"\"\n\n return datetime.datetime.strptime(\n http_date, '%a, %d %b %Y %H:%M:%S %Z')\n\n\ndef to_query_str(params):\n \"\"\"Converts a dict of params to afaln actual query string.\n\n Args:\n params: dict of simple key-value types, where key is a string and\n value is a string or something that can be converted into a\n string. If value is a list, it will be converted to a comma-\n delimited string (e.g., thing=1,2,3)\n\n Returns:\n A URI query string starting with '?', or and empty string if there\n are no params (the dict is empty).\n \"\"\"\n\n if not params:\n return ''\n\n # PERF: This is faster than a list comprehension and join, mainly\n # because it allows us to inline the value transform.\n query_str = '?'\n for k, v in params.items():\n if v is True:\n v = 'true'\n elif v is False:\n v = 'false'\n elif isinstance(v, list):\n v = ','.join([str(i) for i in v])\n else:\n v = str(v)\n\n query_str += k + '=' + v + '&'\n\n return query_str[:-1]\n\n\ndef percent_escape(url):\n \"\"\"Percent-escape reserved characters in the given url.\n\n Args:\n url: A full or relative URL.\n\n Returns:\n An escaped version of the URL, excluding '/', ',' and ':'\n characters. In Python 2, unicode URL strings will be first\n encoded to a UTF-8 byte string to work around a urllib\n bug.\n \"\"\"\n\n # Convert the string so that urllib.quote does not complain\n # if it actually has Unicode chars in it.\n if not six.PY3 and isinstance(url, six.text_type): # pragma nocover\n url = url.encode('utf-8')\n\n return url_quote(url, safe='/:,=?&-_')\n", "path": "falcon/util.py"}]}
| 1,301 | 194 |
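With the list branch added above, `to_query_str` serializes sequences as comma-delimited values, matching the updated docstring. A quick check of the patched function:

```python
from falcon.util import to_query_str

print(to_query_str({'thing': [1, 2, 3]}))  # ?thing=1,2,3
print(to_query_str({'flag': True}))        # ?flag=true
print(to_query_str({}))                    # prints an empty string
```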
gh_patches_debug_6563
|
rasdani/github-patches
|
git_diff
|
cornellius-gp__gpytorch-1195
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] Cannot serialize/deserialize SmoothedBoxPrior when some args are broadcast
# 🐛 Bug
It seems like `SmoothedBoxPrior` for >1d doesn't work with serialization/deserialization when only some args are broadcast.
## To reproduce
```python
import torch
import gpytorch
pr = gpytorch.priors.SmoothedBoxPrior(torch.zeros(2), torch.ones(2))
pr.load_state_dict(pr.state_dict())
```
** Stack trace/error message **
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-5-6b4b2e881beb> in <module>
2 import gpytorch
3 pr = gpytorch.priors.SmoothedBoxPrior(torch.zeros(2), torch.ones(2))
----> 4 pr.load_state_dict(pr.state_dict())
<...PATH..>/torch/nn/modules/module.py in load_state_dict(self, state_dict, strict)
877 if len(error_msgs) > 0:
878 raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
--> 879 self.__class__.__name__, "\n\t".join(error_msgs)))
880 return _IncompatibleKeys(missing_keys, unexpected_keys)
881
RuntimeError: Error(s) in loading state_dict for SmoothedBoxPrior:
While copying the parameter named "sigma", whose dimensions in the model are torch.Size([2]) and whose dimensions in the checkpoint are torch.Size([2]), an exception occured : ('unsupported operation: more than one element of the written-to tensor refers to a single memory location. Please clone() the tensor before performing the operation.',).
```
Note that `SmoothedBoxPrior(a=torch.zeros(2), b=torch.ones(2), sigma=torch.ones(2)*0.01)` succeeds, as does `gpytorch.priors.GammaPrior(torch.ones(2),1)`.
## Expected Behavior
Successful load.
## System information
**Please complete the following information:**
- gpytorch version: 1.1.1
- pytorch version: 1.5.0
- OS: tested on Centos and Mac OSX.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gpytorch/priors/smoothed_box_prior.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import math
4 from numbers import Number
5
6 import torch
7 from torch.distributions import constraints
8 from torch.distributions.utils import broadcast_all
9 from torch.nn import Module as TModule
10
11 from .prior import Prior
12 from .torch_priors import NormalPrior
13
14
15 class SmoothedBoxPrior(Prior):
16 r"""A smoothed approximation of a uniform prior.
17
18 Has full support on the reals and is differentiable everywhere.
19
20 .. math::
21
22 \begin{equation*}
23 B = {x: a_i <= x_i <= b_i}
24 d(x, B) = min_{x' in B} |x - x'|
25 pdf(x) ~ exp(- d(x, B)**2 / sqrt(2 * sigma^2))
26 \end{equation*}
27
28 """
29
30 arg_constraints = {"sigma": constraints.positive, "a": constraints.real, "b": constraints.real}
31 support = constraints.real
32 _validate_args = True
33
34 def __init__(self, a, b, sigma=0.01, validate_args=False, transform=None):
35 TModule.__init__(self)
36 _a = torch.tensor(float(a)) if isinstance(a, Number) else a
37 _a = _a.view(-1) if _a.dim() < 1 else _a
38 _a, _b, _sigma = broadcast_all(_a, b, sigma)
39 if not torch.all(constraints.less_than(_b).check(_a)):
40 raise ValueError("must have that a < b (element-wise)")
41 # TODO: Proper argument validation including broadcasting
42 batch_shape, event_shape = _a.shape[:-1], _a.shape[-1:]
43 # need to assign values before registering as buffers to make argument validation work
44 self.a, self.b, self.sigma = _a, _b, _sigma
45 super(SmoothedBoxPrior, self).__init__(batch_shape, event_shape, validate_args=validate_args)
46 # now need to delete to be able to register buffer
47 del self.a, self.b, self.sigma
48 self.register_buffer("a", _a)
49 self.register_buffer("b", _b)
50 self.register_buffer("sigma", _sigma)
51 self.tails = NormalPrior(torch.zeros_like(_a), _sigma, validate_args=validate_args)
52 self._transform = transform
53
54 @property
55 def _c(self):
56 return (self.a + self.b) / 2
57
58 @property
59 def _r(self):
60 return (self.b - self.a) / 2
61
62 @property
63 def _M(self):
64 # normalization factor to make this a probability distribution
65 return torch.log(1 + (self.b - self.a) / (math.sqrt(2 * math.pi) * self.sigma))
66
67 def log_prob(self, x):
68 return self._log_prob(self.transform(x))
69
70 def _log_prob(self, x):
71 # x = "distances from box`"
72 X = ((x - self._c).abs_() - self._r).clamp(min=0)
73 return (self.tails.log_prob(X) - self._M).sum(-1)
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/gpytorch/priors/smoothed_box_prior.py b/gpytorch/priors/smoothed_box_prior.py
--- a/gpytorch/priors/smoothed_box_prior.py
+++ b/gpytorch/priors/smoothed_box_prior.py
@@ -47,7 +47,7 @@
del self.a, self.b, self.sigma
self.register_buffer("a", _a)
self.register_buffer("b", _b)
- self.register_buffer("sigma", _sigma)
+ self.register_buffer("sigma", _sigma.clone())
self.tails = NormalPrior(torch.zeros_like(_a), _sigma, validate_args=validate_args)
self._transform = transform
|
{"golden_diff": "diff --git a/gpytorch/priors/smoothed_box_prior.py b/gpytorch/priors/smoothed_box_prior.py\n--- a/gpytorch/priors/smoothed_box_prior.py\n+++ b/gpytorch/priors/smoothed_box_prior.py\n@@ -47,7 +47,7 @@\n del self.a, self.b, self.sigma\n self.register_buffer(\"a\", _a)\n self.register_buffer(\"b\", _b)\n- self.register_buffer(\"sigma\", _sigma)\n+ self.register_buffer(\"sigma\", _sigma.clone())\n self.tails = NormalPrior(torch.zeros_like(_a), _sigma, validate_args=validate_args)\n self._transform = transform\n", "issue": "[Bug] Cannot serialize/deserialize SmoothedBoxPrior when some args are broadcast\n# \ud83d\udc1b Bug\r\n\r\nIt seems like `SmoothedBoxPrior` for >1d doesn't work with serialization/deserialization when only some args are broadcast. \r\n\r\n## To reproduce\r\n\r\n```python\r\nimport torch\r\nimport gpytorch\r\npr = gpytorch.priors.SmoothedBoxPrior(torch.zeros(2), torch.ones(2))\r\npr.load_state_dict(pr.state_dict())\r\n```\r\n\r\n** Stack trace/error message **\r\n```\r\n---------------------------------------------------------------------------\r\nRuntimeError Traceback (most recent call last)\r\n<ipython-input-5-6b4b2e881beb> in <module>\r\n 2 import gpytorch\r\n 3 pr = gpytorch.priors.SmoothedBoxPrior(torch.zeros(2), torch.ones(2))\r\n----> 4 pr.load_state_dict(pr.state_dict())\r\n\r\n<...PATH..>/torch/nn/modules/module.py in load_state_dict(self, state_dict, strict)\r\n 877 if len(error_msgs) > 0:\r\n 878 raise RuntimeError('Error(s) in loading state_dict for {}:\\n\\t{}'.format(\r\n--> 879 self.__class__.__name__, \"\\n\\t\".join(error_msgs)))\r\n 880 return _IncompatibleKeys(missing_keys, unexpected_keys)\r\n 881 \r\n\r\nRuntimeError: Error(s) in loading state_dict for SmoothedBoxPrior:\r\n\tWhile copying the parameter named \"sigma\", whose dimensions in the model are torch.Size([2]) and whose dimensions in the checkpoint are torch.Size([2]), an exception occured : ('unsupported operation: more than one element of the written-to tensor refers to a single memory location. Please clone() the tensor before performing the operation.',).\r\n\r\n```\r\n\r\nNote that `SmoothedBoxPrior(a=torch.zeros(2), b=torch.ones(2), sigma=torch.ones(2)*0.01)` succeeds, as does `gpytorch.priors.GammaPrior(torch.ones(2),1)`.\r\n\r\n## Expected Behavior\r\n\r\nSuccessful load. \r\n\r\n## System information\r\n\r\n**Please complete the following information:**\r\n- gpytorch version: 1.1.1\r\n- pytorch version: 1.5.0\r\n- OS: tested on Centos and Mac OSX. \r\n\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport math\nfrom numbers import Number\n\nimport torch\nfrom torch.distributions import constraints\nfrom torch.distributions.utils import broadcast_all\nfrom torch.nn import Module as TModule\n\nfrom .prior import Prior\nfrom .torch_priors import NormalPrior\n\n\nclass SmoothedBoxPrior(Prior):\n r\"\"\"A smoothed approximation of a uniform prior.\n\n Has full support on the reals and is differentiable everywhere.\n\n .. math::\n\n \\begin{equation*}\n B = {x: a_i <= x_i <= b_i}\n d(x, B) = min_{x' in B} |x - x'|\n pdf(x) ~ exp(- d(x, B)**2 / sqrt(2 * sigma^2))\n \\end{equation*}\n\n \"\"\"\n\n arg_constraints = {\"sigma\": constraints.positive, \"a\": constraints.real, \"b\": constraints.real}\n support = constraints.real\n _validate_args = True\n\n def __init__(self, a, b, sigma=0.01, validate_args=False, transform=None):\n TModule.__init__(self)\n _a = torch.tensor(float(a)) if isinstance(a, Number) else a\n _a = _a.view(-1) if _a.dim() < 1 else _a\n _a, _b, _sigma = broadcast_all(_a, b, sigma)\n if not torch.all(constraints.less_than(_b).check(_a)):\n raise ValueError(\"must have that a < b (element-wise)\")\n # TODO: Proper argument validation including broadcasting\n batch_shape, event_shape = _a.shape[:-1], _a.shape[-1:]\n # need to assign values before registering as buffers to make argument validation work\n self.a, self.b, self.sigma = _a, _b, _sigma\n super(SmoothedBoxPrior, self).__init__(batch_shape, event_shape, validate_args=validate_args)\n # now need to delete to be able to register buffer\n del self.a, self.b, self.sigma\n self.register_buffer(\"a\", _a)\n self.register_buffer(\"b\", _b)\n self.register_buffer(\"sigma\", _sigma)\n self.tails = NormalPrior(torch.zeros_like(_a), _sigma, validate_args=validate_args)\n self._transform = transform\n\n @property\n def _c(self):\n return (self.a + self.b) / 2\n\n @property\n def _r(self):\n return (self.b - self.a) / 2\n\n @property\n def _M(self):\n # normalization factor to make this a probability distribution\n return torch.log(1 + (self.b - self.a) / (math.sqrt(2 * math.pi) * self.sigma))\n\n def log_prob(self, x):\n return self._log_prob(self.transform(x))\n\n def _log_prob(self, x):\n # x = \"distances from box`\"\n X = ((x - self._c).abs_() - self._r).clamp(min=0)\n return (self.tails.log_prob(X) - self._M).sum(-1)\n", "path": "gpytorch/priors/smoothed_box_prior.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport math\nfrom numbers import Number\n\nimport torch\nfrom torch.distributions import constraints\nfrom torch.distributions.utils import broadcast_all\nfrom torch.nn import Module as TModule\n\nfrom .prior import Prior\nfrom .torch_priors import NormalPrior\n\n\nclass SmoothedBoxPrior(Prior):\n r\"\"\"A smoothed approximation of a uniform prior.\n\n Has full support on the reals and is differentiable everywhere.\n\n .. math::\n\n \\begin{equation*}\n B = {x: a_i <= x_i <= b_i}\n d(x, B) = min_{x' in B} |x - x'|\n pdf(x) ~ exp(- d(x, B)**2 / sqrt(2 * sigma^2))\n \\end{equation*}\n\n \"\"\"\n\n arg_constraints = {\"sigma\": constraints.positive, \"a\": constraints.real, \"b\": constraints.real}\n support = constraints.real\n _validate_args = True\n\n def __init__(self, a, b, sigma=0.01, validate_args=False, transform=None):\n TModule.__init__(self)\n _a = torch.tensor(float(a)) if isinstance(a, Number) else a\n _a = _a.view(-1) if _a.dim() < 1 else _a\n _a, _b, _sigma = broadcast_all(_a, b, sigma)\n if not torch.all(constraints.less_than(_b).check(_a)):\n raise ValueError(\"must have that a < b (element-wise)\")\n # TODO: Proper argument validation including broadcasting\n batch_shape, event_shape = _a.shape[:-1], _a.shape[-1:]\n # need to assign values before registering as buffers to make argument validation work\n self.a, self.b, self.sigma = _a, _b, _sigma\n super(SmoothedBoxPrior, self).__init__(batch_shape, event_shape, validate_args=validate_args)\n # now need to delete to be able to register buffer\n del self.a, self.b, self.sigma\n self.register_buffer(\"a\", _a)\n self.register_buffer(\"b\", _b)\n self.register_buffer(\"sigma\", _sigma.clone())\n self.tails = NormalPrior(torch.zeros_like(_a), _sigma, validate_args=validate_args)\n self._transform = transform\n\n @property\n def _c(self):\n return (self.a + self.b) / 2\n\n @property\n def _r(self):\n return (self.b - self.a) / 2\n\n @property\n def _M(self):\n # normalization factor to make this a probability distribution\n return torch.log(1 + (self.b - self.a) / (math.sqrt(2 * math.pi) * self.sigma))\n\n def log_prob(self, x):\n return self._log_prob(self.transform(x))\n\n def _log_prob(self, x):\n # x = \"distances from box`\"\n X = ((x - self._c).abs_() - self._r).clamp(min=0)\n return (self.tails.log_prob(X) - self._M).sum(-1)\n", "path": "gpytorch/priors/smoothed_box_prior.py"}]}
| 1,586 | 151 |
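Editor's note on the record above: the failure stems from `broadcast_all` returning an expanded view with stride 0 when `sigma` is passed as a scalar, which `Tensor.copy_` (used by `load_state_dict`) refuses to write into. A minimal standalone sketch of that aliasing, independent of gpytorch (variable names are illustrative):

```python
import torch
from torch.distributions.utils import broadcast_all

# Broadcasting a scalar sigma against 2-element a/b yields an expanded view
# whose elements all alias one memory location (stride 0).
_a, _b, _sigma = broadcast_all(torch.zeros(2), torch.ones(2), 0.01)
print(_sigma.stride())  # (0,) -- two elements, one storage slot

src = torch.full((2,), 0.01)
try:
    _sigma.copy_(src)  # what load_state_dict does to a registered buffer
except RuntimeError as err:
    print(err)  # "more than one element of the written-to tensor refers to ..."

# The golden_diff's fix: register a contiguous copy instead of the view.
_sigma.clone().copy_(src)  # succeeds
```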
gh_patches_debug_4473
|
rasdani/github-patches
|
git_diff
|
facebookresearch__hydra-1685
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CI fails with UserWarning raised from importing nevergrad
example failure here https://app.circleci.com/pipelines/github/facebookresearch/hydra/10584/workflows/d4c57363-bb31-42f4-a7ee-29c28a577f67/jobs/95695
this can be reproduced by simply importing nevergrad
```
>>> import nevergrad as ng
/Users/jieru/opt/anaconda3/envs/testnv/lib/python3.8/site-packages/cma/s.py:13: UserWarning: Could not import matplotlib.pyplot, therefore ``cma.plot()`` etc. is not available
_warnings.warn('Could not import matplotlib.pyplot, therefore'
```
the warnings comes from one of nevergrad's dependency `cma` which just had a new release https://github.com/CMA-ES/pycma/releases
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugins/hydra_nevergrad_sweeper/setup.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 # type: ignore
3 from pathlib import Path
4
5 from read_version import read_version
6 from setuptools import find_namespace_packages, setup
7
8 setup(
9 name="hydra-nevergrad-sweeper",
10 version=read_version("hydra_plugins/hydra_nevergrad_sweeper", "__init__.py"),
11 author="Jeremy Rapin, Omry Yadan, Jieru Hu",
12 author_email="[email protected], [email protected], [email protected]",
13 description="Hydra Nevergrad Sweeper plugin",
14 long_description=(Path(__file__).parent / "README.md").read_text(),
15 long_description_content_type="text/markdown",
16 url="https://github.com/facebookresearch/hydra/",
17 packages=find_namespace_packages(include=["hydra_plugins.*"]),
18 classifiers=[
19 "License :: OSI Approved :: MIT License",
20 "Programming Language :: Python :: 3.6",
21 "Programming Language :: Python :: 3.7",
22 "Programming Language :: Python :: 3.8",
23 "Programming Language :: Python :: 3.9",
24 "Operating System :: OS Independent",
25 "Development Status :: 4 - Beta",
26 ],
27 install_requires=[
28 "hydra-core>=1.1.0.dev7",
29 "nevergrad>=0.4.3.post2",
30 "numpy<1.20.0", # remove once nevergrad is upgraded to support numpy 1.20
31 ],
32 include_package_data=True,
33 )
34
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/plugins/hydra_nevergrad_sweeper/setup.py b/plugins/hydra_nevergrad_sweeper/setup.py
--- a/plugins/hydra_nevergrad_sweeper/setup.py
+++ b/plugins/hydra_nevergrad_sweeper/setup.py
@@ -27,6 +27,7 @@
install_requires=[
"hydra-core>=1.1.0.dev7",
"nevergrad>=0.4.3.post2",
+ "cma==3.0.3", # https://github.com/facebookresearch/hydra/issues/1684
"numpy<1.20.0", # remove once nevergrad is upgraded to support numpy 1.20
],
include_package_data=True,
|
{"golden_diff": "diff --git a/plugins/hydra_nevergrad_sweeper/setup.py b/plugins/hydra_nevergrad_sweeper/setup.py\n--- a/plugins/hydra_nevergrad_sweeper/setup.py\n+++ b/plugins/hydra_nevergrad_sweeper/setup.py\n@@ -27,6 +27,7 @@\n install_requires=[\n \"hydra-core>=1.1.0.dev7\",\n \"nevergrad>=0.4.3.post2\",\n+ \"cma==3.0.3\", # https://github.com/facebookresearch/hydra/issues/1684\n \"numpy<1.20.0\", # remove once nevergrad is upgraded to support numpy 1.20\n ],\n include_package_data=True,\n", "issue": "CI fails with UserWarning raised from importing nevergrad\nexample failure here https://app.circleci.com/pipelines/github/facebookresearch/hydra/10584/workflows/d4c57363-bb31-42f4-a7ee-29c28a577f67/jobs/95695\r\n\r\nthis can be reproduced by simply importing nevergrad\r\n```\r\n>>> import nevergrad as ng\r\n/Users/jieru/opt/anaconda3/envs/testnv/lib/python3.8/site-packages/cma/s.py:13: UserWarning: Could not import matplotlib.pyplot, therefore ``cma.plot()`` etc. is not available\r\n _warnings.warn('Could not import matplotlib.pyplot, therefore'\r\n```\r\n\r\nthe warnings comes from one of nevergrad's dependency `cma` which just had a new release https://github.com/CMA-ES/pycma/releases\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n# type: ignore\nfrom pathlib import Path\n\nfrom read_version import read_version\nfrom setuptools import find_namespace_packages, setup\n\nsetup(\n name=\"hydra-nevergrad-sweeper\",\n version=read_version(\"hydra_plugins/hydra_nevergrad_sweeper\", \"__init__.py\"),\n author=\"Jeremy Rapin, Omry Yadan, Jieru Hu\",\n author_email=\"[email protected], [email protected], [email protected]\",\n description=\"Hydra Nevergrad Sweeper plugin\",\n long_description=(Path(__file__).parent / \"README.md\").read_text(),\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra/\",\n packages=find_namespace_packages(include=[\"hydra_plugins.*\"]),\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Operating System :: OS Independent\",\n \"Development Status :: 4 - Beta\",\n ],\n install_requires=[\n \"hydra-core>=1.1.0.dev7\",\n \"nevergrad>=0.4.3.post2\",\n \"numpy<1.20.0\", # remove once nevergrad is upgraded to support numpy 1.20\n ],\n include_package_data=True,\n)\n", "path": "plugins/hydra_nevergrad_sweeper/setup.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n# type: ignore\nfrom pathlib import Path\n\nfrom read_version import read_version\nfrom setuptools import find_namespace_packages, setup\n\nsetup(\n name=\"hydra-nevergrad-sweeper\",\n version=read_version(\"hydra_plugins/hydra_nevergrad_sweeper\", \"__init__.py\"),\n author=\"Jeremy Rapin, Omry Yadan, Jieru Hu\",\n author_email=\"[email protected], [email protected], [email protected]\",\n description=\"Hydra Nevergrad Sweeper plugin\",\n long_description=(Path(__file__).parent / \"README.md\").read_text(),\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra/\",\n packages=find_namespace_packages(include=[\"hydra_plugins.*\"]),\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Operating System :: OS Independent\",\n \"Development Status :: 4 - Beta\",\n ],\n install_requires=[\n \"hydra-core>=1.1.0.dev7\",\n \"nevergrad>=0.4.3.post2\",\n \"cma==3.0.3\", # https://github.com/facebookresearch/hydra/issues/1684\n \"numpy<1.20.0\", # remove once nevergrad is upgraded to support numpy 1.20\n ],\n include_package_data=True,\n)\n", "path": "plugins/hydra_nevergrad_sweeper/setup.py"}]}
| 855 | 169 |
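Editor's note: the pin above works around a warning that `cma` 3.0.4 emits at import time. If pinning is not an option downstream, the warning can also be filtered at the call site — a hedged sketch, assuming `nevergrad` is installed:

```python
import warnings

with warnings.catch_warnings():
    # Match the message quoted in the issue; `message` is a regex matched
    # against the start of the warning text.
    warnings.filterwarnings(
        "ignore",
        message="Could not import matplotlib.pyplot",
        category=UserWarning,
    )
    import nevergrad as ng  # the import that triggers cma's warning

print(ng.__version__)
```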
gh_patches_debug_2196
|
rasdani/github-patches
|
git_diff
|
getredash__redash-1119
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
User should be able to delete an Alert
Can't remove Alert with UI.
Directly run sql as below.
``` sql
delete from alerts where id = 〜
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redash/handlers/alerts.py`
Content:
```
1 import time
2
3 from flask import request
4 from funcy import project
5
6 from redash import models
7 from redash.permissions import require_access, require_admin_or_owner, view_only, require_permission
8 from redash.handlers.base import BaseResource, require_fields, get_object_or_404
9
10
11 class AlertResource(BaseResource):
12 def get(self, alert_id):
13 alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)
14 require_access(alert.groups, self.current_user, view_only)
15 return alert.to_dict()
16
17 def post(self, alert_id):
18 req = request.get_json(True)
19 params = project(req, ('options', 'name', 'query_id', 'rearm'))
20 alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)
21 require_admin_or_owner(alert.user.id)
22
23 if 'query_id' in params:
24 params['query'] = params.pop('query_id')
25
26 alert.update_instance(**params)
27
28 self.record_event({
29 'action': 'edit',
30 'timestamp': int(time.time()),
31 'object_id': alert.id,
32 'object_type': 'alert'
33 })
34
35 return alert.to_dict()
36
37
38 class AlertListResource(BaseResource):
39 def post(self):
40 req = request.get_json(True)
41 require_fields(req, ('options', 'name', 'query_id'))
42
43 query = models.Query.get_by_id_and_org(req['query_id'], self.current_org)
44 require_access(query.groups, self.current_user, view_only)
45
46 alert = models.Alert.create(
47 name=req['name'],
48 query=query,
49 user=self.current_user,
50 options=req['options']
51 )
52
53 self.record_event({
54 'action': 'create',
55 'timestamp': int(time.time()),
56 'object_id': alert.id,
57 'object_type': 'alert'
58 })
59
60 return alert.to_dict()
61
62 @require_permission('list_alerts')
63 def get(self):
64 return [alert.to_dict() for alert in models.Alert.all(groups=self.current_user.groups)]
65
66
67 class AlertSubscriptionListResource(BaseResource):
68 def post(self, alert_id):
69 req = request.get_json(True)
70
71 alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)
72 require_access(alert.groups, self.current_user, view_only)
73 kwargs = {'alert': alert, 'user': self.current_user}
74
75 if 'destination_id' in req:
76 destination = models.NotificationDestination.get_by_id_and_org(req['destination_id'], self.current_org)
77 kwargs['destination'] = destination
78
79 subscription = models.AlertSubscription.create(**kwargs)
80
81 self.record_event({
82 'action': 'subscribe',
83 'timestamp': int(time.time()),
84 'object_id': alert_id,
85 'object_type': 'alert',
86 'destination': req.get('destination_id')
87 })
88
89 return subscription.to_dict()
90
91 def get(self, alert_id):
92 alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)
93 require_access(alert.groups, self.current_user, view_only)
94
95 subscriptions = models.AlertSubscription.all(alert_id)
96 return [s.to_dict() for s in subscriptions]
97
98
99 class AlertSubscriptionResource(BaseResource):
100 def delete(self, alert_id, subscriber_id):
101
102 subscription = get_object_or_404(models.AlertSubscription.get_by_id, subscriber_id)
103 require_admin_or_owner(subscription.user.id)
104 subscription.delete_instance()
105
106 self.record_event({
107 'action': 'unsubscribe',
108 'timestamp': int(time.time()),
109 'object_id': alert_id,
110 'object_type': 'alert'
111 })
112
113
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/redash/handlers/alerts.py b/redash/handlers/alerts.py
--- a/redash/handlers/alerts.py
+++ b/redash/handlers/alerts.py
@@ -34,6 +34,11 @@
return alert.to_dict()
+ def delete(self, alert_id):
+ alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)
+ require_admin_or_owner(alert.user.id)
+ alert.delete_instance(recursive=True)
+
class AlertListResource(BaseResource):
def post(self):
|
{"golden_diff": "diff --git a/redash/handlers/alerts.py b/redash/handlers/alerts.py\n--- a/redash/handlers/alerts.py\n+++ b/redash/handlers/alerts.py\n@@ -34,6 +34,11 @@\n \n return alert.to_dict()\n \n+ def delete(self, alert_id):\n+ alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n+ require_admin_or_owner(alert.user.id)\n+ alert.delete_instance(recursive=True)\n+\n \n class AlertListResource(BaseResource):\n def post(self):\n", "issue": "User should be able to delete an Alert\nCan't remove Alert with UI.\n\nDirectly run sql as below.\n\n``` sql\ndelete from alerts where id = \u301c\n```\n\n", "before_files": [{"content": "import time\n\nfrom flask import request\nfrom funcy import project\n\nfrom redash import models\nfrom redash.permissions import require_access, require_admin_or_owner, view_only, require_permission\nfrom redash.handlers.base import BaseResource, require_fields, get_object_or_404\n\n\nclass AlertResource(BaseResource):\n def get(self, alert_id):\n alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n return alert.to_dict()\n\n def post(self, alert_id):\n req = request.get_json(True)\n params = project(req, ('options', 'name', 'query_id', 'rearm'))\n alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n require_admin_or_owner(alert.user.id)\n\n if 'query_id' in params:\n params['query'] = params.pop('query_id')\n\n alert.update_instance(**params)\n\n self.record_event({\n 'action': 'edit',\n 'timestamp': int(time.time()),\n 'object_id': alert.id,\n 'object_type': 'alert'\n })\n\n return alert.to_dict()\n\n\nclass AlertListResource(BaseResource):\n def post(self):\n req = request.get_json(True)\n require_fields(req, ('options', 'name', 'query_id'))\n\n query = models.Query.get_by_id_and_org(req['query_id'], self.current_org)\n require_access(query.groups, self.current_user, view_only)\n\n alert = models.Alert.create(\n name=req['name'],\n query=query,\n user=self.current_user,\n options=req['options']\n )\n\n self.record_event({\n 'action': 'create',\n 'timestamp': int(time.time()),\n 'object_id': alert.id,\n 'object_type': 'alert'\n })\n\n return alert.to_dict()\n\n @require_permission('list_alerts')\n def get(self):\n return [alert.to_dict() for alert in models.Alert.all(groups=self.current_user.groups)]\n\n\nclass AlertSubscriptionListResource(BaseResource):\n def post(self, alert_id):\n req = request.get_json(True)\n\n alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n kwargs = {'alert': alert, 'user': self.current_user}\n\n if 'destination_id' in req:\n destination = models.NotificationDestination.get_by_id_and_org(req['destination_id'], self.current_org)\n kwargs['destination'] = destination\n\n subscription = models.AlertSubscription.create(**kwargs)\n\n self.record_event({\n 'action': 'subscribe',\n 'timestamp': int(time.time()),\n 'object_id': alert_id,\n 'object_type': 'alert',\n 'destination': req.get('destination_id')\n })\n\n return subscription.to_dict()\n\n def get(self, alert_id):\n alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n\n subscriptions = models.AlertSubscription.all(alert_id)\n return [s.to_dict() for s in subscriptions]\n\n\nclass AlertSubscriptionResource(BaseResource):\n def delete(self, alert_id, subscriber_id):\n \n subscription = get_object_or_404(models.AlertSubscription.get_by_id, subscriber_id)\n require_admin_or_owner(subscription.user.id)\n subscription.delete_instance()\n\n self.record_event({\n 'action': 'unsubscribe',\n 'timestamp': int(time.time()),\n 'object_id': alert_id,\n 'object_type': 'alert'\n })\n\n", "path": "redash/handlers/alerts.py"}], "after_files": [{"content": "import time\n\nfrom flask import request\nfrom funcy import project\n\nfrom redash import models\nfrom redash.permissions import require_access, require_admin_or_owner, view_only, require_permission\nfrom redash.handlers.base import BaseResource, require_fields, get_object_or_404\n\n\nclass AlertResource(BaseResource):\n def get(self, alert_id):\n alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n return alert.to_dict()\n\n def post(self, alert_id):\n req = request.get_json(True)\n params = project(req, ('options', 'name', 'query_id', 'rearm'))\n alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n require_admin_or_owner(alert.user.id)\n\n if 'query_id' in params:\n params['query'] = params.pop('query_id')\n\n alert.update_instance(**params)\n\n self.record_event({\n 'action': 'edit',\n 'timestamp': int(time.time()),\n 'object_id': alert.id,\n 'object_type': 'alert'\n })\n\n return alert.to_dict()\n\n def delete(self, alert_id):\n alert = get_object_or_404(models.Alert.get_by_id_and_org, alert_id, self.current_org)\n require_admin_or_owner(alert.user.id)\n alert.delete_instance(recursive=True)\n\n\nclass AlertListResource(BaseResource):\n def post(self):\n req = request.get_json(True)\n require_fields(req, ('options', 'name', 'query_id'))\n\n query = models.Query.get_by_id_and_org(req['query_id'], self.current_org)\n require_access(query.groups, self.current_user, view_only)\n\n alert = models.Alert.create(\n name=req['name'],\n query=query,\n user=self.current_user,\n options=req['options']\n )\n\n self.record_event({\n 'action': 'create',\n 'timestamp': int(time.time()),\n 'object_id': alert.id,\n 'object_type': 'alert'\n })\n\n return alert.to_dict()\n\n @require_permission('list_alerts')\n def get(self):\n return [alert.to_dict() for alert in models.Alert.all(groups=self.current_user.groups)]\n\n\nclass AlertSubscriptionListResource(BaseResource):\n def post(self, alert_id):\n req = request.get_json(True)\n\n alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n kwargs = {'alert': alert, 'user': self.current_user}\n\n if 'destination_id' in req:\n destination = models.NotificationDestination.get_by_id_and_org(req['destination_id'], self.current_org)\n kwargs['destination'] = destination\n\n subscription = models.AlertSubscription.create(**kwargs)\n\n self.record_event({\n 'action': 'subscribe',\n 'timestamp': int(time.time()),\n 'object_id': alert_id,\n 'object_type': 'alert',\n 'destination': req.get('destination_id')\n })\n\n return subscription.to_dict()\n\n def get(self, alert_id):\n alert = models.Alert.get_by_id_and_org(alert_id, self.current_org)\n require_access(alert.groups, self.current_user, view_only)\n\n subscriptions = models.AlertSubscription.all(alert_id)\n return [s.to_dict() for s in subscriptions]\n\n\nclass AlertSubscriptionResource(BaseResource):\n def delete(self, alert_id, subscriber_id):\n \n subscription = get_object_or_404(models.AlertSubscription.get_by_id, subscriber_id)\n require_admin_or_owner(subscription.user.id)\n subscription.delete_instance()\n\n self.record_event({\n 'action': 'unsubscribe',\n 'timestamp': int(time.time()),\n 'object_id': alert_id,\n 'object_type': 'alert'\n })\n\n", "path": "redash/handlers/alerts.py"}]}
| 1,315 | 133 |
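Editor's note: with the new `AlertResource.delete` handler wired in, an alert becomes removable over the REST API instead of raw SQL. A hedged sketch of exercising it — the `/api/alerts/<id>` path and `Key` authorization header follow redash's usual conventions but are assumptions here, and the host and id are hypothetical:

```python
import requests

resp = requests.delete(
    "https://redash.example.com/api/alerts/42",  # hypothetical host and alert id
    headers={"Authorization": "Key YOUR_API_KEY"},
)
resp.raise_for_status()  # require_admin_or_owner: only the owner or an admin may delete
```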
gh_patches_debug_4407
|
rasdani/github-patches
|
git_diff
|
pwndbg__pwndbg-1222
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ipi doesn't work with multiline inputs
TL;DR:
<img width="550" alt="image" src="https://user-images.githubusercontent.com/10009354/193942063-af410d4d-3cdd-4bcb-a102-9bb87d101656.png">
```
pwndbg> ipi
In [1]: from ctypes import *
In [2]: class A(LittleEndianStructure):
...: a = LittleEndianStructure
...:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-2-814bd2a1d7ec> in <module>
----> 1 class A(LittleEndianStructure):
2 a = LittleEndianStructure
3
<ipython-input-2-814bd2a1d7ec> in A()
1 class A(LittleEndianStructure):
----> 2 a = LittleEndianStructure
3
NameError: name 'LittleEndianStructure' is not defined
In [3]: ctypes
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-3-8c9cdb26e3f3> in <module>
----> 1 ctypes
NameError: name 'ctypes' is not defined
In [4]: LittleEndianStructure
Out[4]: _ctypes.Structure
In [5]: def foo():
...: return LittleEndianStructure
...:
In [6]: foo()
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-6-c19b6d9633cf> in <module>
----> 1 foo()
<ipython-input-5-0b19aa36e370> in foo()
1 def foo():
----> 2 return LittleEndianStructure
3
NameError: name 'LittleEndianStructure' is not defined
In [7]:
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/commands/ipython_interactive.py`
Content:
```
1 """
2 Command to start an interactive IPython prompt.
3 """
4 import sys
5 from contextlib import contextmanager
6
7 import gdb
8
9 import pwndbg.color.message as M
10 import pwndbg.commands
11 import pwndbg.lib.stdio
12
13
14 @contextmanager
15 def switch_to_ipython_env():
16 """We need to change stdout/stderr to the default ones, otherwise we can't use tab or autocomplete"""
17 # Save GDB's excepthook
18 saved_excepthook = sys.excepthook
19 # Switch to default stdout/stderr
20 with pwndbg.lib.stdio.stdio:
21 yield
22 # Restore Python's default ps1, ps2, and excepthook for GDB's `pi` command
23 sys.ps1 = ">>> "
24 sys.ps2 = "... "
25 sys.excepthook = saved_excepthook
26
27
28 @pwndbg.commands.ArgparsedCommand("Start an interactive IPython prompt.")
29 def ipi():
30 with switch_to_ipython_env():
31 # Use `gdb.execute` to embed IPython into GDB's variable scope
32 try:
33 gdb.execute("pi import IPython")
34 except gdb.error:
35 print(
36 M.warn(
37 "Cannot import IPython.\n"
38 "You need to install IPython if you want to use this command.\n"
39 "Maybe you can try `pip install ipython` first."
40 )
41 )
42 return
43 code4ipython = """import jedi
44 import pwn
45 jedi.Interpreter._allow_descriptor_getattr_default = False
46 IPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False)
47 """
48 gdb.execute(f"py\n{code4ipython}")
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pwndbg/commands/ipython_interactive.py b/pwndbg/commands/ipython_interactive.py
--- a/pwndbg/commands/ipython_interactive.py
+++ b/pwndbg/commands/ipython_interactive.py
@@ -43,6 +43,6 @@
code4ipython = """import jedi
import pwn
jedi.Interpreter._allow_descriptor_getattr_default = False
-IPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False)
+IPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False, user_ns=globals())
"""
gdb.execute(f"py\n{code4ipython}")
|
{"golden_diff": "diff --git a/pwndbg/commands/ipython_interactive.py b/pwndbg/commands/ipython_interactive.py\n--- a/pwndbg/commands/ipython_interactive.py\n+++ b/pwndbg/commands/ipython_interactive.py\n@@ -43,6 +43,6 @@\n code4ipython = \"\"\"import jedi\n import pwn\n jedi.Interpreter._allow_descriptor_getattr_default = False\n-IPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False)\n+IPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False, user_ns=globals())\n \"\"\"\n gdb.execute(f\"py\\n{code4ipython}\")\n", "issue": "ipi doesn't work with multiline inputs\nTL;DR:\r\n<img width=\"550\" alt=\"image\" src=\"https://user-images.githubusercontent.com/10009354/193942063-af410d4d-3cdd-4bcb-a102-9bb87d101656.png\">\r\n\r\n```\r\npwndbg> ipi\r\n\r\nIn [1]: from ctypes import *\r\n\r\nIn [2]: class A(LittleEndianStructure):\r\n ...: a = LittleEndianStructure\r\n ...:\r\n---------------------------------------------------------------------------\r\nNameError Traceback (most recent call last)\r\n<ipython-input-2-814bd2a1d7ec> in <module>\r\n----> 1 class A(LittleEndianStructure):\r\n 2 a = LittleEndianStructure\r\n 3\r\n\r\n<ipython-input-2-814bd2a1d7ec> in A()\r\n 1 class A(LittleEndianStructure):\r\n----> 2 a = LittleEndianStructure\r\n 3\r\n\r\nNameError: name 'LittleEndianStructure' is not defined\r\n\r\nIn [3]: ctypes\r\n---------------------------------------------------------------------------\r\nNameError Traceback (most recent call last)\r\n<ipython-input-3-8c9cdb26e3f3> in <module>\r\n----> 1 ctypes\r\n\r\nNameError: name 'ctypes' is not defined\r\n\r\nIn [4]: LittleEndianStructure\r\nOut[4]: _ctypes.Structure\r\n\r\nIn [5]: def foo():\r\n ...: return LittleEndianStructure\r\n ...:\r\n\r\nIn [6]: foo()\r\n---------------------------------------------------------------------------\r\nNameError Traceback (most recent call last)\r\n<ipython-input-6-c19b6d9633cf> in <module>\r\n----> 1 foo()\r\n\r\n<ipython-input-5-0b19aa36e370> in foo()\r\n 1 def foo():\r\n----> 2 return LittleEndianStructure\r\n 3\r\n\r\nNameError: name 'LittleEndianStructure' is not defined\r\n\r\nIn [7]:\r\n```\n", "before_files": [{"content": "\"\"\"\nCommand to start an interactive IPython prompt.\n\"\"\"\nimport sys\nfrom contextlib import contextmanager\n\nimport gdb\n\nimport pwndbg.color.message as M\nimport pwndbg.commands\nimport pwndbg.lib.stdio\n\n\n@contextmanager\ndef switch_to_ipython_env():\n \"\"\"We need to change stdout/stderr to the default ones, otherwise we can't use tab or autocomplete\"\"\"\n # Save GDB's excepthook\n saved_excepthook = sys.excepthook\n # Switch to default stdout/stderr\n with pwndbg.lib.stdio.stdio:\n yield\n # Restore Python's default ps1, ps2, and excepthook for GDB's `pi` command\n sys.ps1 = \">>> \"\n sys.ps2 = \"... \"\n sys.excepthook = saved_excepthook\n\n\[email protected](\"Start an interactive IPython prompt.\")\ndef ipi():\n with switch_to_ipython_env():\n # Use `gdb.execute` to embed IPython into GDB's variable scope\n try:\n gdb.execute(\"pi import IPython\")\n except gdb.error:\n print(\n M.warn(\n \"Cannot import IPython.\\n\"\n \"You need to install IPython if you want to use this command.\\n\"\n \"Maybe you can try `pip install ipython` first.\"\n )\n )\n return\n code4ipython = \"\"\"import jedi\nimport pwn\njedi.Interpreter._allow_descriptor_getattr_default = False\nIPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False)\n\"\"\"\n gdb.execute(f\"py\\n{code4ipython}\")\n", "path": "pwndbg/commands/ipython_interactive.py"}], "after_files": [{"content": "\"\"\"\nCommand to start an interactive IPython prompt.\n\"\"\"\nimport sys\nfrom contextlib import contextmanager\n\nimport gdb\n\nimport pwndbg.color.message as M\nimport pwndbg.commands\nimport pwndbg.lib.stdio\n\n\n@contextmanager\ndef switch_to_ipython_env():\n \"\"\"We need to change stdout/stderr to the default ones, otherwise we can't use tab or autocomplete\"\"\"\n # Save GDB's excepthook\n saved_excepthook = sys.excepthook\n # Switch to default stdout/stderr\n with pwndbg.lib.stdio.stdio:\n yield\n # Restore Python's default ps1, ps2, and excepthook for GDB's `pi` command\n sys.ps1 = \">>> \"\n sys.ps2 = \"... \"\n sys.excepthook = saved_excepthook\n\n\[email protected](\"Start an interactive IPython prompt.\")\ndef ipi():\n with switch_to_ipython_env():\n # Use `gdb.execute` to embed IPython into GDB's variable scope\n try:\n gdb.execute(\"pi import IPython\")\n except gdb.error:\n print(\n M.warn(\n \"Cannot import IPython.\\n\"\n \"You need to install IPython if you want to use this command.\\n\"\n \"Maybe you can try `pip install ipython` first.\"\n )\n )\n return\n code4ipython = \"\"\"import jedi\nimport pwn\njedi.Interpreter._allow_descriptor_getattr_default = False\nIPython.embed(colors='neutral',banner1='',confirm_exit=False,simple_prompt=False, user_ns=globals())\n\"\"\"\n gdb.execute(f\"py\\n{code4ipython}\")\n", "path": "pwndbg/commands/ipython_interactive.py"}]}
| 1,168 | 153 |
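Editor's note: the one-line fix above hinges on a known `IPython.embed()` scoping quirk — when `embed()` runs inside another frame, names assigned at the prompt land in that frame's locals, so functions and class bodies *defined at the prompt* cannot resolve them. A standalone sketch of the quirk and the `user_ns=globals()` workaround (outside GDB, purely illustrative):

```python
import IPython

def start_shell():
    # Without user_ns, typing `x = 1` at the prompt and then
    # `def f(): return x` reproduces the NameError from the report;
    # passing a real module-globals dict avoids it.
    IPython.embed(user_ns=globals())

if __name__ == "__main__":
    start_shell()
```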
gh_patches_debug_8385
|
rasdani/github-patches
|
git_diff
|
explosion__spaCy-2274
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Typo in docs? multi-label text classification
The docs give an example of text classification on IMDB [here](https://spacy.io/usage/training#example-textcat). It says it's "multi-label", but this dataset is binary. So I'm not sure what you mean here.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/training/train_textcat.py`
Content:
```
1 #!/usr/bin/env python
2 # coding: utf8
3 """Train a multi-label convolutional neural network text classifier on the
4 IMDB dataset, using the TextCategorizer component. The dataset will be loaded
5 automatically via Thinc's built-in dataset loader. The model is added to
6 spacy.pipeline, and predictions are available via `doc.cats`. For more details,
7 see the documentation:
8 * Training: https://spacy.io/usage/training
9
10 Compatible with: spaCy v2.0.0+
11 """
12 from __future__ import unicode_literals, print_function
13 import plac
14 import random
15 from pathlib import Path
16 import thinc.extra.datasets
17
18 import spacy
19 from spacy.util import minibatch, compounding
20
21
22 @plac.annotations(
23 model=("Model name. Defaults to blank 'en' model.", "option", "m", str),
24 output_dir=("Optional output directory", "option", "o", Path),
25 n_texts=("Number of texts to train from", "option", "t", int),
26 n_iter=("Number of training iterations", "option", "n", int))
27 def main(model=None, output_dir=None, n_iter=20, n_texts=2000):
28 if model is not None:
29 nlp = spacy.load(model) # load existing spaCy model
30 print("Loaded model '%s'" % model)
31 else:
32 nlp = spacy.blank('en') # create blank Language class
33 print("Created blank 'en' model")
34
35 # add the text classifier to the pipeline if it doesn't exist
36 # nlp.create_pipe works for built-ins that are registered with spaCy
37 if 'textcat' not in nlp.pipe_names:
38 textcat = nlp.create_pipe('textcat')
39 nlp.add_pipe(textcat, last=True)
40 # otherwise, get it, so we can add labels to it
41 else:
42 textcat = nlp.get_pipe('textcat')
43
44 # add label to text classifier
45 textcat.add_label('POSITIVE')
46
47 # load the IMDB dataset
48 print("Loading IMDB data...")
49 (train_texts, train_cats), (dev_texts, dev_cats) = load_data(limit=n_texts)
50 print("Using {} examples ({} training, {} evaluation)"
51 .format(n_texts, len(train_texts), len(dev_texts)))
52 train_data = list(zip(train_texts,
53 [{'cats': cats} for cats in train_cats]))
54
55 # get names of other pipes to disable them during training
56 other_pipes = [pipe for pipe in nlp.pipe_names if pipe != 'textcat']
57 with nlp.disable_pipes(*other_pipes): # only train textcat
58 optimizer = nlp.begin_training()
59 print("Training the model...")
60 print('{:^5}\t{:^5}\t{:^5}\t{:^5}'.format('LOSS', 'P', 'R', 'F'))
61 for i in range(n_iter):
62 losses = {}
63 # batch up the examples using spaCy's minibatch
64 batches = minibatch(train_data, size=compounding(4., 32., 1.001))
65 for batch in batches:
66 texts, annotations = zip(*batch)
67 nlp.update(texts, annotations, sgd=optimizer, drop=0.2,
68 losses=losses)
69 with textcat.model.use_params(optimizer.averages):
70 # evaluate on the dev data split off in load_data()
71 scores = evaluate(nlp.tokenizer, textcat, dev_texts, dev_cats)
72 print('{0:.3f}\t{1:.3f}\t{2:.3f}\t{3:.3f}' # print a simple table
73 .format(losses['textcat'], scores['textcat_p'],
74 scores['textcat_r'], scores['textcat_f']))
75
76 # test the trained model
77 test_text = "This movie sucked"
78 doc = nlp(test_text)
79 print(test_text, doc.cats)
80
81 if output_dir is not None:
82 output_dir = Path(output_dir)
83 if not output_dir.exists():
84 output_dir.mkdir()
85 nlp.to_disk(output_dir)
86 print("Saved model to", output_dir)
87
88 # test the saved model
89 print("Loading from", output_dir)
90 nlp2 = spacy.load(output_dir)
91 doc2 = nlp2(test_text)
92 print(test_text, doc2.cats)
93
94
95 def load_data(limit=0, split=0.8):
96 """Load data from the IMDB dataset."""
97 # Partition off part of the train data for evaluation
98 train_data, _ = thinc.extra.datasets.imdb()
99 random.shuffle(train_data)
100 train_data = train_data[-limit:]
101 texts, labels = zip(*train_data)
102 cats = [{'POSITIVE': bool(y)} for y in labels]
103 split = int(len(train_data) * split)
104 return (texts[:split], cats[:split]), (texts[split:], cats[split:])
105
106
107 def evaluate(tokenizer, textcat, texts, cats):
108 docs = (tokenizer(text) for text in texts)
109 tp = 1e-8 # True positives
110 fp = 1e-8 # False positives
111 fn = 1e-8 # False negatives
112 tn = 1e-8 # True negatives
113 for i, doc in enumerate(textcat.pipe(docs)):
114 gold = cats[i]
115 for label, score in doc.cats.items():
116 if label not in gold:
117 continue
118 if score >= 0.5 and gold[label] >= 0.5:
119 tp += 1.
120 elif score >= 0.5 and gold[label] < 0.5:
121 fp += 1.
122 elif score < 0.5 and gold[label] < 0.5:
123 tn += 1
124 elif score < 0.5 and gold[label] >= 0.5:
125 fn += 1
126 precision = tp / (tp + fp)
127 recall = tp / (tp + fn)
128 f_score = 2 * (precision * recall) / (precision + recall)
129 return {'textcat_p': precision, 'textcat_r': recall, 'textcat_f': f_score}
130
131
132 if __name__ == '__main__':
133 plac.call(main)
134
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/training/train_textcat.py b/examples/training/train_textcat.py
--- a/examples/training/train_textcat.py
+++ b/examples/training/train_textcat.py
@@ -1,6 +1,6 @@
#!/usr/bin/env python
# coding: utf8
-"""Train a multi-label convolutional neural network text classifier on the
+"""Train a convolutional neural network text classifier on the
IMDB dataset, using the TextCategorizer component. The dataset will be loaded
automatically via Thinc's built-in dataset loader. The model is added to
spacy.pipeline, and predictions are available via `doc.cats`. For more details,
|
{"golden_diff": "diff --git a/examples/training/train_textcat.py b/examples/training/train_textcat.py\n--- a/examples/training/train_textcat.py\n+++ b/examples/training/train_textcat.py\n@@ -1,6 +1,6 @@\n #!/usr/bin/env python\n # coding: utf8\n-\"\"\"Train a multi-label convolutional neural network text classifier on the\n+\"\"\"Train a convolutional neural network text classifier on the\n IMDB dataset, using the TextCategorizer component. The dataset will be loaded\n automatically via Thinc's built-in dataset loader. The model is added to\n spacy.pipeline, and predictions are available via `doc.cats`. For more details,\n", "issue": "Typo in docs? multi-label text classification\nThe docs give an example of text classification on IMDB [here](https://spacy.io/usage/training#example-textcat). It says it's \"multi-label\", but this dataset is binary. So I'm not sure what you mean here.\n", "before_files": [{"content": "#!/usr/bin/env python\n# coding: utf8\n\"\"\"Train a multi-label convolutional neural network text classifier on the\nIMDB dataset, using the TextCategorizer component. The dataset will be loaded\nautomatically via Thinc's built-in dataset loader. The model is added to\nspacy.pipeline, and predictions are available via `doc.cats`. For more details,\nsee the documentation:\n* Training: https://spacy.io/usage/training\n\nCompatible with: spaCy v2.0.0+\n\"\"\"\nfrom __future__ import unicode_literals, print_function\nimport plac\nimport random\nfrom pathlib import Path\nimport thinc.extra.datasets\n\nimport spacy\nfrom spacy.util import minibatch, compounding\n\n\[email protected](\n model=(\"Model name. Defaults to blank 'en' model.\", \"option\", \"m\", str),\n output_dir=(\"Optional output directory\", \"option\", \"o\", Path),\n n_texts=(\"Number of texts to train from\", \"option\", \"t\", int),\n n_iter=(\"Number of training iterations\", \"option\", \"n\", int))\ndef main(model=None, output_dir=None, n_iter=20, n_texts=2000):\n if model is not None:\n nlp = spacy.load(model) # load existing spaCy model\n print(\"Loaded model '%s'\" % model)\n else:\n nlp = spacy.blank('en') # create blank Language class\n print(\"Created blank 'en' model\")\n\n # add the text classifier to the pipeline if it doesn't exist\n # nlp.create_pipe works for built-ins that are registered with spaCy\n if 'textcat' not in nlp.pipe_names:\n textcat = nlp.create_pipe('textcat')\n nlp.add_pipe(textcat, last=True)\n # otherwise, get it, so we can add labels to it\n else:\n textcat = nlp.get_pipe('textcat')\n\n # add label to text classifier\n textcat.add_label('POSITIVE')\n\n # load the IMDB dataset\n print(\"Loading IMDB data...\")\n (train_texts, train_cats), (dev_texts, dev_cats) = load_data(limit=n_texts)\n print(\"Using {} examples ({} training, {} evaluation)\"\n .format(n_texts, len(train_texts), len(dev_texts)))\n train_data = list(zip(train_texts,\n [{'cats': cats} for cats in train_cats]))\n\n # get names of other pipes to disable them during training\n other_pipes = [pipe for pipe in nlp.pipe_names if pipe != 'textcat']\n with nlp.disable_pipes(*other_pipes): # only train textcat\n optimizer = nlp.begin_training()\n print(\"Training the model...\")\n print('{:^5}\\t{:^5}\\t{:^5}\\t{:^5}'.format('LOSS', 'P', 'R', 'F'))\n for i in range(n_iter):\n losses = {}\n # batch up the examples using spaCy's minibatch\n batches = minibatch(train_data, size=compounding(4., 32., 1.001))\n for batch in batches:\n texts, annotations = zip(*batch)\n nlp.update(texts, annotations, sgd=optimizer, drop=0.2,\n losses=losses)\n with textcat.model.use_params(optimizer.averages):\n # evaluate on the dev data split off in load_data()\n scores = evaluate(nlp.tokenizer, textcat, dev_texts, dev_cats)\n print('{0:.3f}\\t{1:.3f}\\t{2:.3f}\\t{3:.3f}' # print a simple table\n .format(losses['textcat'], scores['textcat_p'],\n scores['textcat_r'], scores['textcat_f']))\n\n # test the trained model\n test_text = \"This movie sucked\"\n doc = nlp(test_text)\n print(test_text, doc.cats)\n\n if output_dir is not None:\n output_dir = Path(output_dir)\n if not output_dir.exists():\n output_dir.mkdir()\n nlp.to_disk(output_dir)\n print(\"Saved model to\", output_dir)\n\n # test the saved model\n print(\"Loading from\", output_dir)\n nlp2 = spacy.load(output_dir)\n doc2 = nlp2(test_text)\n print(test_text, doc2.cats)\n\n\ndef load_data(limit=0, split=0.8):\n \"\"\"Load data from the IMDB dataset.\"\"\"\n # Partition off part of the train data for evaluation\n train_data, _ = thinc.extra.datasets.imdb()\n random.shuffle(train_data)\n train_data = train_data[-limit:]\n texts, labels = zip(*train_data)\n cats = [{'POSITIVE': bool(y)} for y in labels]\n split = int(len(train_data) * split)\n return (texts[:split], cats[:split]), (texts[split:], cats[split:])\n\n\ndef evaluate(tokenizer, textcat, texts, cats):\n docs = (tokenizer(text) for text in texts)\n tp = 1e-8 # True positives\n fp = 1e-8 # False positives\n fn = 1e-8 # False negatives\n tn = 1e-8 # True negatives\n for i, doc in enumerate(textcat.pipe(docs)):\n gold = cats[i]\n for label, score in doc.cats.items():\n if label not in gold:\n continue\n if score >= 0.5 and gold[label] >= 0.5:\n tp += 1.\n elif score >= 0.5 and gold[label] < 0.5:\n fp += 1.\n elif score < 0.5 and gold[label] < 0.5:\n tn += 1\n elif score < 0.5 and gold[label] >= 0.5:\n fn += 1\n precision = tp / (tp + fp)\n recall = tp / (tp + fn)\n f_score = 2 * (precision * recall) / (precision + recall)\n return {'textcat_p': precision, 'textcat_r': recall, 'textcat_f': f_score}\n\n\nif __name__ == '__main__':\n plac.call(main)\n", "path": "examples/training/train_textcat.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# coding: utf8\n\"\"\"Train a convolutional neural network text classifier on the\nIMDB dataset, using the TextCategorizer component. The dataset will be loaded\nautomatically via Thinc's built-in dataset loader. The model is added to\nspacy.pipeline, and predictions are available via `doc.cats`. For more details,\nsee the documentation:\n* Training: https://spacy.io/usage/training\n\nCompatible with: spaCy v2.0.0+\n\"\"\"\nfrom __future__ import unicode_literals, print_function\nimport plac\nimport random\nfrom pathlib import Path\nimport thinc.extra.datasets\n\nimport spacy\nfrom spacy.util import minibatch, compounding\n\n\[email protected](\n model=(\"Model name. Defaults to blank 'en' model.\", \"option\", \"m\", str),\n output_dir=(\"Optional output directory\", \"option\", \"o\", Path),\n n_texts=(\"Number of texts to train from\", \"option\", \"t\", int),\n n_iter=(\"Number of training iterations\", \"option\", \"n\", int))\ndef main(model=None, output_dir=None, n_iter=20, n_texts=2000):\n if model is not None:\n nlp = spacy.load(model) # load existing spaCy model\n print(\"Loaded model '%s'\" % model)\n else:\n nlp = spacy.blank('en') # create blank Language class\n print(\"Created blank 'en' model\")\n\n # add the text classifier to the pipeline if it doesn't exist\n # nlp.create_pipe works for built-ins that are registered with spaCy\n if 'textcat' not in nlp.pipe_names:\n textcat = nlp.create_pipe('textcat')\n nlp.add_pipe(textcat, last=True)\n # otherwise, get it, so we can add labels to it\n else:\n textcat = nlp.get_pipe('textcat')\n\n # add label to text classifier\n textcat.add_label('POSITIVE')\n\n # load the IMDB dataset\n print(\"Loading IMDB data...\")\n (train_texts, train_cats), (dev_texts, dev_cats) = load_data(limit=n_texts)\n print(\"Using {} examples ({} training, {} evaluation)\"\n .format(n_texts, len(train_texts), len(dev_texts)))\n train_data = list(zip(train_texts,\n [{'cats': cats} for cats in train_cats]))\n\n # get names of other pipes to disable them during training\n other_pipes = [pipe for pipe in nlp.pipe_names if pipe != 'textcat']\n with nlp.disable_pipes(*other_pipes): # only train textcat\n optimizer = nlp.begin_training()\n print(\"Training the model...\")\n print('{:^5}\\t{:^5}\\t{:^5}\\t{:^5}'.format('LOSS', 'P', 'R', 'F'))\n for i in range(n_iter):\n losses = {}\n # batch up the examples using spaCy's minibatch\n batches = minibatch(train_data, size=compounding(4., 32., 1.001))\n for batch in batches:\n texts, annotations = zip(*batch)\n nlp.update(texts, annotations, sgd=optimizer, drop=0.2,\n losses=losses)\n with textcat.model.use_params(optimizer.averages):\n # evaluate on the dev data split off in load_data()\n scores = evaluate(nlp.tokenizer, textcat, dev_texts, dev_cats)\n print('{0:.3f}\\t{1:.3f}\\t{2:.3f}\\t{3:.3f}' # print a simple table\n .format(losses['textcat'], scores['textcat_p'],\n scores['textcat_r'], scores['textcat_f']))\n\n # test the trained model\n test_text = \"This movie sucked\"\n doc = nlp(test_text)\n print(test_text, doc.cats)\n\n if output_dir is not None:\n output_dir = Path(output_dir)\n if not output_dir.exists():\n output_dir.mkdir()\n nlp.to_disk(output_dir)\n print(\"Saved model to\", output_dir)\n\n # test the saved model\n print(\"Loading from\", output_dir)\n nlp2 = spacy.load(output_dir)\n doc2 = nlp2(test_text)\n print(test_text, doc2.cats)\n\n\ndef load_data(limit=0, split=0.8):\n \"\"\"Load data from the IMDB dataset.\"\"\"\n # Partition off part of the train data for evaluation\n train_data, _ = thinc.extra.datasets.imdb()\n random.shuffle(train_data)\n train_data = train_data[-limit:]\n texts, labels = zip(*train_data)\n cats = [{'POSITIVE': bool(y)} for y in labels]\n split = int(len(train_data) * split)\n return (texts[:split], cats[:split]), (texts[split:], cats[split:])\n\n\ndef evaluate(tokenizer, textcat, texts, cats):\n docs = (tokenizer(text) for text in texts)\n tp = 1e-8 # True positives\n fp = 1e-8 # False positives\n fn = 1e-8 # False negatives\n tn = 1e-8 # True negatives\n for i, doc in enumerate(textcat.pipe(docs)):\n gold = cats[i]\n for label, score in doc.cats.items():\n if label not in gold:\n continue\n if score >= 0.5 and gold[label] >= 0.5:\n tp += 1.\n elif score >= 0.5 and gold[label] < 0.5:\n fp += 1.\n elif score < 0.5 and gold[label] < 0.5:\n tn += 1\n elif score < 0.5 and gold[label] >= 0.5:\n fn += 1\n precision = tp / (tp + fp)\n recall = tp / (tp + fn)\n f_score = 2 * (precision * recall) / (precision + recall)\n return {'textcat_p': precision, 'textcat_r': recall, 'textcat_f': f_score}\n\n\nif __name__ == '__main__':\n plac.call(main)\n", "path": "examples/training/train_textcat.py"}]}
| 1,986 | 143 |
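Editor's note: the record above only rewords a docstring, but the distinction it fixes matters — the IMDB example scores a single `POSITIVE` category (binary classification), whereas a multi-label textcat carries several independent category scores per document. A hedged illustration (the extra labels are hypothetical):

```python
# What the example's load_data() actually emits: one binary category.
binary_annotation = {"cats": {"POSITIVE": True}}

# What a genuinely multi-label setup would look like: independent scores.
multi_label_annotation = {"cats": {"SPORTS": 1.0, "POLITICS": 0.0, "TECH": 1.0}}

for name, ann in [("binary", binary_annotation), ("multi-label", multi_label_annotation)]:
    print(name, ann["cats"])
```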
gh_patches_debug_11995
|
rasdani/github-patches
|
git_diff
|
encode__starlette-1574
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Denying WebSocket connection in WebSocketEndpoint.on_connect leads to Exception
### Discussed in https://github.com/encode/starlette/discussions/1555
<div type='discussions-op-text'>
<sup>Originally posted by **dingensundso** March 27, 2022</sup>
I created a WebSocketEndpoint class in which I want to deny the connection in certain conditions.
When the WebSocket is closed in on_connect I receive the following exception:
```
File "starlette/endpoints.py", line 83, in dispatch
close_code = int(message.get("code", status.WS_1000_NORMAL_CLOSURE))
TypeError: int() argument must be a string, a bytes-like object or a real number, not 'NoneType'
```
So code is None when the server denies the connection instead of non-existant.
https://github.com/encode/starlette/blob/e086fc2da361767b532cf690e5203619bbae98aa/starlette/endpoints.py#L72-L87
Changing line 83 to the following should fix the issue:
```python
close_code = int(message.get("code") or status.WS_1000_NORMAL_CLOSURE)
````
</div>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/endpoints.py`
Content:
```
1 import asyncio
2 import json
3 import typing
4
5 from starlette import status
6 from starlette.concurrency import run_in_threadpool
7 from starlette.exceptions import HTTPException
8 from starlette.requests import Request
9 from starlette.responses import PlainTextResponse, Response
10 from starlette.types import Message, Receive, Scope, Send
11 from starlette.websockets import WebSocket
12
13
14 class HTTPEndpoint:
15 def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:
16 assert scope["type"] == "http"
17 self.scope = scope
18 self.receive = receive
19 self.send = send
20 self._allowed_methods = [
21 method
22 for method in ("GET", "HEAD", "POST", "PUT", "PATCH", "DELETE", "OPTIONS")
23 if getattr(self, method.lower(), None) is not None
24 ]
25
26 def __await__(self) -> typing.Generator:
27 return self.dispatch().__await__()
28
29 async def dispatch(self) -> None:
30 request = Request(self.scope, receive=self.receive)
31 handler_name = (
32 "get"
33 if request.method == "HEAD" and not hasattr(self, "head")
34 else request.method.lower()
35 )
36
37 handler: typing.Callable[[Request], typing.Any] = getattr(
38 self, handler_name, self.method_not_allowed
39 )
40 is_async = asyncio.iscoroutinefunction(handler)
41 if is_async:
42 response = await handler(request)
43 else:
44 response = await run_in_threadpool(handler, request)
45 await response(self.scope, self.receive, self.send)
46
47 async def method_not_allowed(self, request: Request) -> Response:
48 # If we're running inside a starlette application then raise an
49 # exception, so that the configurable exception handler can deal with
50 # returning the response. For plain ASGI apps, just return the response.
51 headers = {"Allow": ", ".join(self._allowed_methods)}
52 if "app" in self.scope:
53 raise HTTPException(status_code=405, headers=headers)
54 return PlainTextResponse("Method Not Allowed", status_code=405, headers=headers)
55
56
57 class WebSocketEndpoint:
58
59 encoding: typing.Optional[str] = None # May be "text", "bytes", or "json".
60
61 def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:
62 assert scope["type"] == "websocket"
63 self.scope = scope
64 self.receive = receive
65 self.send = send
66
67 def __await__(self) -> typing.Generator:
68 return self.dispatch().__await__()
69
70 async def dispatch(self) -> None:
71 websocket = WebSocket(self.scope, receive=self.receive, send=self.send)
72 await self.on_connect(websocket)
73
74 close_code = status.WS_1000_NORMAL_CLOSURE
75
76 try:
77 while True:
78 message = await websocket.receive()
79 if message["type"] == "websocket.receive":
80 data = await self.decode(websocket, message)
81 await self.on_receive(websocket, data)
82 elif message["type"] == "websocket.disconnect":
83 close_code = int(message.get("code", status.WS_1000_NORMAL_CLOSURE))
84 break
85 except Exception as exc:
86 close_code = status.WS_1011_INTERNAL_ERROR
87 raise exc
88 finally:
89 await self.on_disconnect(websocket, close_code)
90
91 async def decode(self, websocket: WebSocket, message: Message) -> typing.Any:
92
93 if self.encoding == "text":
94 if "text" not in message:
95 await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)
96 raise RuntimeError("Expected text websocket messages, but got bytes")
97 return message["text"]
98
99 elif self.encoding == "bytes":
100 if "bytes" not in message:
101 await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)
102 raise RuntimeError("Expected bytes websocket messages, but got text")
103 return message["bytes"]
104
105 elif self.encoding == "json":
106 if message.get("text") is not None:
107 text = message["text"]
108 else:
109 text = message["bytes"].decode("utf-8")
110
111 try:
112 return json.loads(text)
113 except json.decoder.JSONDecodeError:
114 await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)
115 raise RuntimeError("Malformed JSON data received.")
116
117 assert (
118 self.encoding is None
119 ), f"Unsupported 'encoding' attribute {self.encoding}"
120 return message["text"] if message.get("text") else message["bytes"]
121
122 async def on_connect(self, websocket: WebSocket) -> None:
123 """Override to handle an incoming websocket connection"""
124 await websocket.accept()
125
126 async def on_receive(self, websocket: WebSocket, data: typing.Any) -> None:
127 """Override to handle an incoming websocket message"""
128
129 async def on_disconnect(self, websocket: WebSocket, close_code: int) -> None:
130 """Override to handle a disconnecting websocket"""
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/starlette/endpoints.py b/starlette/endpoints.py
--- a/starlette/endpoints.py
+++ b/starlette/endpoints.py
@@ -80,7 +80,9 @@
data = await self.decode(websocket, message)
await self.on_receive(websocket, data)
elif message["type"] == "websocket.disconnect":
- close_code = int(message.get("code", status.WS_1000_NORMAL_CLOSURE))
+ close_code = int(
+ message.get("code") or status.WS_1000_NORMAL_CLOSURE
+ )
break
except Exception as exc:
close_code = status.WS_1011_INTERNAL_ERROR
|
{"golden_diff": "diff --git a/starlette/endpoints.py b/starlette/endpoints.py\n--- a/starlette/endpoints.py\n+++ b/starlette/endpoints.py\n@@ -80,7 +80,9 @@\n data = await self.decode(websocket, message)\n await self.on_receive(websocket, data)\n elif message[\"type\"] == \"websocket.disconnect\":\n- close_code = int(message.get(\"code\", status.WS_1000_NORMAL_CLOSURE))\n+ close_code = int(\n+ message.get(\"code\") or status.WS_1000_NORMAL_CLOSURE\n+ )\n break\n except Exception as exc:\n close_code = status.WS_1011_INTERNAL_ERROR\n", "issue": "Denying WebSocket connection in WebSocketEndpoint.on_connect leads to Exception\n### Discussed in https://github.com/encode/starlette/discussions/1555\r\n\r\n<div type='discussions-op-text'>\r\n\r\n<sup>Originally posted by **dingensundso** March 27, 2022</sup>\r\nI created a WebSocketEndpoint class in which I want to deny the connection in certain conditions.\r\nWhen the WebSocket is closed in on_connect I receive the following exception:\r\n\r\n```\r\n File \"starlette/endpoints.py\", line 83, in dispatch\r\n close_code = int(message.get(\"code\", status.WS_1000_NORMAL_CLOSURE))\r\nTypeError: int() argument must be a string, a bytes-like object or a real number, not 'NoneType'\r\n```\r\nSo code is None when the server denies the connection instead of non-existant.\r\n\r\nhttps://github.com/encode/starlette/blob/e086fc2da361767b532cf690e5203619bbae98aa/starlette/endpoints.py#L72-L87\r\n\r\nChanging line 83 to the following should fix the issue:\r\n```python\r\nclose_code = int(message.get(\"code\") or status.WS_1000_NORMAL_CLOSURE) \r\n````\r\n</div>\n", "before_files": [{"content": "import asyncio\nimport json\nimport typing\n\nfrom starlette import status\nfrom starlette.concurrency import run_in_threadpool\nfrom starlette.exceptions import HTTPException\nfrom starlette.requests import Request\nfrom starlette.responses import PlainTextResponse, Response\nfrom starlette.types import Message, Receive, Scope, Send\nfrom starlette.websockets import WebSocket\n\n\nclass HTTPEndpoint:\n def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:\n assert scope[\"type\"] == \"http\"\n self.scope = scope\n self.receive = receive\n self.send = send\n self._allowed_methods = [\n method\n for method in (\"GET\", \"HEAD\", \"POST\", \"PUT\", \"PATCH\", \"DELETE\", \"OPTIONS\")\n if getattr(self, method.lower(), None) is not None\n ]\n\n def __await__(self) -> typing.Generator:\n return self.dispatch().__await__()\n\n async def dispatch(self) -> None:\n request = Request(self.scope, receive=self.receive)\n handler_name = (\n \"get\"\n if request.method == \"HEAD\" and not hasattr(self, \"head\")\n else request.method.lower()\n )\n\n handler: typing.Callable[[Request], typing.Any] = getattr(\n self, handler_name, self.method_not_allowed\n )\n is_async = asyncio.iscoroutinefunction(handler)\n if is_async:\n response = await handler(request)\n else:\n response = await run_in_threadpool(handler, request)\n await response(self.scope, self.receive, self.send)\n\n async def method_not_allowed(self, request: Request) -> Response:\n # If we're running inside a starlette application then raise an\n # exception, so that the configurable exception handler can deal with\n # returning the response. For plain ASGI apps, just return the response.\n headers = {\"Allow\": \", \".join(self._allowed_methods)}\n if \"app\" in self.scope:\n raise HTTPException(status_code=405, headers=headers)\n return PlainTextResponse(\"Method Not Allowed\", status_code=405, headers=headers)\n\n\nclass WebSocketEndpoint:\n\n encoding: typing.Optional[str] = None # May be \"text\", \"bytes\", or \"json\".\n\n def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:\n assert scope[\"type\"] == \"websocket\"\n self.scope = scope\n self.receive = receive\n self.send = send\n\n def __await__(self) -> typing.Generator:\n return self.dispatch().__await__()\n\n async def dispatch(self) -> None:\n websocket = WebSocket(self.scope, receive=self.receive, send=self.send)\n await self.on_connect(websocket)\n\n close_code = status.WS_1000_NORMAL_CLOSURE\n\n try:\n while True:\n message = await websocket.receive()\n if message[\"type\"] == \"websocket.receive\":\n data = await self.decode(websocket, message)\n await self.on_receive(websocket, data)\n elif message[\"type\"] == \"websocket.disconnect\":\n close_code = int(message.get(\"code\", status.WS_1000_NORMAL_CLOSURE))\n break\n except Exception as exc:\n close_code = status.WS_1011_INTERNAL_ERROR\n raise exc\n finally:\n await self.on_disconnect(websocket, close_code)\n\n async def decode(self, websocket: WebSocket, message: Message) -> typing.Any:\n\n if self.encoding == \"text\":\n if \"text\" not in message:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Expected text websocket messages, but got bytes\")\n return message[\"text\"]\n\n elif self.encoding == \"bytes\":\n if \"bytes\" not in message:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Expected bytes websocket messages, but got text\")\n return message[\"bytes\"]\n\n elif self.encoding == \"json\":\n if message.get(\"text\") is not None:\n text = message[\"text\"]\n else:\n text = message[\"bytes\"].decode(\"utf-8\")\n\n try:\n return json.loads(text)\n except json.decoder.JSONDecodeError:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Malformed JSON data received.\")\n\n assert (\n self.encoding is None\n ), f\"Unsupported 'encoding' attribute {self.encoding}\"\n return message[\"text\"] if message.get(\"text\") else message[\"bytes\"]\n\n async def on_connect(self, websocket: WebSocket) -> None:\n \"\"\"Override to handle an incoming websocket connection\"\"\"\n await websocket.accept()\n\n async def on_receive(self, websocket: WebSocket, data: typing.Any) -> None:\n \"\"\"Override to handle an incoming websocket message\"\"\"\n\n async def on_disconnect(self, websocket: WebSocket, close_code: int) -> None:\n \"\"\"Override to handle a disconnecting websocket\"\"\"\n", "path": "starlette/endpoints.py"}], "after_files": [{"content": "import asyncio\nimport json\nimport typing\n\nfrom starlette import status\nfrom starlette.concurrency import run_in_threadpool\nfrom starlette.exceptions import HTTPException\nfrom starlette.requests import Request\nfrom starlette.responses import PlainTextResponse, Response\nfrom starlette.types import Message, Receive, Scope, Send\nfrom starlette.websockets import WebSocket\n\n\nclass HTTPEndpoint:\n def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:\n assert scope[\"type\"] == \"http\"\n self.scope = scope\n self.receive = receive\n self.send = send\n self._allowed_methods = [\n method\n for method in (\"GET\", \"HEAD\", \"POST\", \"PUT\", \"PATCH\", \"DELETE\", \"OPTIONS\")\n if getattr(self, method.lower(), None) is not None\n ]\n\n def __await__(self) -> typing.Generator:\n return self.dispatch().__await__()\n\n async def dispatch(self) -> None:\n request = Request(self.scope, receive=self.receive)\n handler_name = (\n \"get\"\n if request.method == \"HEAD\" and not hasattr(self, \"head\")\n else request.method.lower()\n )\n\n handler: typing.Callable[[Request], typing.Any] = getattr(\n self, handler_name, self.method_not_allowed\n )\n is_async = asyncio.iscoroutinefunction(handler)\n if is_async:\n response = await handler(request)\n else:\n response = await run_in_threadpool(handler, request)\n await response(self.scope, self.receive, self.send)\n\n async def method_not_allowed(self, request: Request) -> Response:\n # If we're running inside a starlette application then raise an\n # exception, so that the configurable exception handler can deal with\n # returning the response. For plain ASGI apps, just return the response.\n headers = {\"Allow\": \", \".join(self._allowed_methods)}\n if \"app\" in self.scope:\n raise HTTPException(status_code=405, headers=headers)\n return PlainTextResponse(\"Method Not Allowed\", status_code=405, headers=headers)\n\n\nclass WebSocketEndpoint:\n\n encoding: typing.Optional[str] = None # May be \"text\", \"bytes\", or \"json\".\n\n def __init__(self, scope: Scope, receive: Receive, send: Send) -> None:\n assert scope[\"type\"] == \"websocket\"\n self.scope = scope\n self.receive = receive\n self.send = send\n\n def __await__(self) -> typing.Generator:\n return self.dispatch().__await__()\n\n async def dispatch(self) -> None:\n websocket = WebSocket(self.scope, receive=self.receive, send=self.send)\n await self.on_connect(websocket)\n\n close_code = status.WS_1000_NORMAL_CLOSURE\n\n try:\n while True:\n message = await websocket.receive()\n if message[\"type\"] == \"websocket.receive\":\n data = await self.decode(websocket, message)\n await self.on_receive(websocket, data)\n elif message[\"type\"] == \"websocket.disconnect\":\n close_code = int(\n message.get(\"code\") or status.WS_1000_NORMAL_CLOSURE\n )\n break\n except Exception as exc:\n close_code = status.WS_1011_INTERNAL_ERROR\n raise exc\n finally:\n await self.on_disconnect(websocket, close_code)\n\n async def decode(self, websocket: WebSocket, message: Message) -> typing.Any:\n\n if self.encoding == \"text\":\n if \"text\" not in message:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Expected text websocket messages, but got bytes\")\n return message[\"text\"]\n\n elif self.encoding == \"bytes\":\n if \"bytes\" not in message:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Expected bytes websocket messages, but got text\")\n return message[\"bytes\"]\n\n elif self.encoding == \"json\":\n if message.get(\"text\") is not None:\n text = message[\"text\"]\n else:\n text = message[\"bytes\"].decode(\"utf-8\")\n\n try:\n return json.loads(text)\n except json.decoder.JSONDecodeError:\n await websocket.close(code=status.WS_1003_UNSUPPORTED_DATA)\n raise RuntimeError(\"Malformed JSON data received.\")\n\n assert (\n self.encoding is None\n ), f\"Unsupported 'encoding' attribute {self.encoding}\"\n return message[\"text\"] if message.get(\"text\") else message[\"bytes\"]\n\n async def on_connect(self, websocket: WebSocket) -> None:\n \"\"\"Override to handle an incoming websocket connection\"\"\"\n await websocket.accept()\n\n async def 
on_receive(self, websocket: WebSocket, data: typing.Any) -> None:\n \"\"\"Override to handle an incoming websocket message\"\"\"\n\n async def on_disconnect(self, websocket: WebSocket, close_code: int) -> None:\n \"\"\"Override to handle a disconnecting websocket\"\"\"\n", "path": "starlette/endpoints.py"}]}
| 1,902 | 155 |
gh_patches_debug_38768
|
rasdani/github-patches
|
git_diff
|
sublimelsp__LSP-717
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Formatting adding trailing newline clears last line
* OS and language server
Linux + Gopls
* How you installed LSP (Package Control or from git?)
Package Control
* Minimal reproduction steps
```go
package main
import (
"fmt"
)
func main() {
fmt.Println("Hello, world")
} // No newline!
```
Format
```go
package main
import (
"fmt"
)
func main() {
fmt.Println("Hello, world")
```
* Log
No diagnostic output.
Initially reported to [gopls](https://github.com/golang/go/issues/33717), but they pointed out that the gopls commandline does the right thing.
Is this a LSP issue or Sublime itself?
Let me know if I can provide any other helpful information!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugin/edit.py`
Content:
```
1 import sublime
2 import sublime_plugin
3 from .core.edit import sort_by_application_order
4 try:
5 from typing import List, Dict, Optional, Any, Iterable, Tuple
6 from .core.edit import TextEdit
7 assert List and Dict and Optional and Any and Iterable and Tuple and TextEdit
8 except ImportError:
9 pass
10 from .core.logging import debug
11
12
13 class LspApplyWorkspaceEditCommand(sublime_plugin.WindowCommand):
14 def run(self, changes: 'Optional[Dict[str, List[TextEdit]]]' = None):
15 documents_changed = 0
16 if changes:
17 for path, document_changes in changes.items():
18 self.open_and_apply_edits(path, document_changes)
19 documents_changed += 1
20
21 if documents_changed > 0:
22 message = 'Applied changes to {} documents'.format(documents_changed)
23 self.window.status_message(message)
24 else:
25 self.window.status_message('No changes to apply to workspace')
26
27 def open_and_apply_edits(self, path, file_changes):
28 view = self.window.open_file(path)
29 if view:
30 if view.is_loading():
31 # TODO: wait for event instead.
32 sublime.set_timeout_async(
33 lambda: view.run_command('lsp_apply_document_edit', {'changes': file_changes}),
34 500
35 )
36 else:
37 view.run_command('lsp_apply_document_edit',
38 {'changes': file_changes})
39 else:
40 debug('view not found to apply', path, file_changes)
41
42
43 class LspApplyDocumentEditCommand(sublime_plugin.TextCommand):
44 def run(self, edit, changes: 'Optional[List[TextEdit]]' = None):
45 # Apply the changes in reverse, so that we don't invalidate the range
46 # of any change that we haven't applied yet.
47 if changes:
48 for change in sort_by_application_order(changes):
49 start, end, newText = change
50 region = sublime.Region(self.view.text_point(*start), self.view.text_point(*end))
51 self.apply_change(region, newText, edit)
52
53 def apply_change(self, region: 'sublime.Region', newText: str, edit):
54 if region.empty():
55 self.view.insert(edit, region.a, newText)
56 else:
57 if len(newText) > 0:
58 self.view.replace(edit, region, newText)
59 else:
60 self.view.erase(edit, region)
61
```
Path: `plugin/core/edit.py`
Content:
```
1 from .url import uri_to_filename
2 try:
3 from typing import List, Dict, Optional, Any, Iterable, Tuple
4 TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str]
5 assert List and Dict and Optional and Any and Iterable and Tuple
6 except ImportError:
7 pass
8
9
10 def parse_workspace_edit(workspace_edit: 'Dict[str, Any]') -> 'Dict[str, List[TextEdit]]':
11 changes = {} # type: Dict[str, List[TextEdit]]
12 if 'changes' in workspace_edit:
13 for uri, file_changes in workspace_edit.get('changes', {}).items():
14 changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)
15 if 'documentChanges' in workspace_edit:
16 for document_change in workspace_edit.get('documentChanges', []):
17 uri = document_change.get('textDocument').get('uri')
18 changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in document_change.get('edits'))
19 return changes
20
21
22 def parse_range(range: 'Dict[str, int]') -> 'Tuple[int, int]':
23 return range['line'], range['character']
24
25
26 def parse_text_edit(text_edit: 'Dict[str, Any]') -> 'TextEdit':
27 return (
28 parse_range(text_edit['range']['start']),
29 parse_range(text_edit['range']['end']),
30 text_edit.get('newText', '')
31 )
32
33
34 def sort_by_application_order(changes: 'Iterable[TextEdit]') -> 'List[TextEdit]':
35
36 def get_start_position(pair: 'Tuple[int, TextEdit]'):
37 index, change = pair
38 return change[0][0], change[0][1], index
39
40 # The spec reads:
41 # > However, it is possible that multiple edits have the same start position: multiple
42 # > inserts, or any number of inserts followed by a single remove or replace edit. If
43 # > multiple inserts have the same position, the order in the array defines the order in
44 # > which the inserted strings appear in the resulting text.
45 # So we sort by start position. But if multiple text edits start at the same position,
46 # we use the index in the array as the key.
47
48 return list(map(lambda pair: pair[1], sorted(enumerate(changes), key=get_start_position, reverse=True)))
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/plugin/core/edit.py b/plugin/core/edit.py
--- a/plugin/core/edit.py
+++ b/plugin/core/edit.py
@@ -1,4 +1,5 @@
from .url import uri_to_filename
+import operator
try:
from typing import List, Dict, Optional, Any, Iterable, Tuple
TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str]
@@ -32,11 +33,6 @@
def sort_by_application_order(changes: 'Iterable[TextEdit]') -> 'List[TextEdit]':
-
- def get_start_position(pair: 'Tuple[int, TextEdit]'):
- index, change = pair
- return change[0][0], change[0][1], index
-
# The spec reads:
# > However, it is possible that multiple edits have the same start position: multiple
# > inserts, or any number of inserts followed by a single remove or replace edit. If
@@ -45,4 +41,4 @@
# So we sort by start position. But if multiple text edits start at the same position,
# we use the index in the array as the key.
- return list(map(lambda pair: pair[1], sorted(enumerate(changes), key=get_start_position, reverse=True)))
+ return list(sorted(changes, key=operator.itemgetter(0)))
diff --git a/plugin/edit.py b/plugin/edit.py
--- a/plugin/edit.py
+++ b/plugin/edit.py
@@ -41,14 +41,24 @@
class LspApplyDocumentEditCommand(sublime_plugin.TextCommand):
+
def run(self, edit, changes: 'Optional[List[TextEdit]]' = None):
# Apply the changes in reverse, so that we don't invalidate the range
# of any change that we haven't applied yet.
if changes:
- for change in sort_by_application_order(changes):
+ last_row, last_col = self.view.rowcol(self.view.size())
+ for change in reversed(sort_by_application_order(changes)):
start, end, newText = change
region = sublime.Region(self.view.text_point(*start), self.view.text_point(*end))
- self.apply_change(region, newText, edit)
+
+ if start[0] > last_row and newText[0] != '\n':
+ # Handle when a language server (eg gopls) inserts at a row beyond the document
+ # some editors create the line automatically, sublime needs to have the newline prepended.
+ debug('adding new line for edit at line {}, document ended at line {}'.format(start[0], last_row))
+ self.apply_change(region, '\n' + newText, edit)
+ last_row, last_col = self.view.rowcol(self.view.size())
+ else:
+ self.apply_change(region, newText, edit)
def apply_change(self, region: 'sublime.Region', newText: str, edit):
if region.empty():
|
{"golden_diff": "diff --git a/plugin/core/edit.py b/plugin/core/edit.py\n--- a/plugin/core/edit.py\n+++ b/plugin/core/edit.py\n@@ -1,4 +1,5 @@\n from .url import uri_to_filename\n+import operator\n try:\n from typing import List, Dict, Optional, Any, Iterable, Tuple\n TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str]\n@@ -32,11 +33,6 @@\n \n \n def sort_by_application_order(changes: 'Iterable[TextEdit]') -> 'List[TextEdit]':\n-\n- def get_start_position(pair: 'Tuple[int, TextEdit]'):\n- index, change = pair\n- return change[0][0], change[0][1], index\n-\n # The spec reads:\n # > However, it is possible that multiple edits have the same start position: multiple\n # > inserts, or any number of inserts followed by a single remove or replace edit. If\n@@ -45,4 +41,4 @@\n # So we sort by start position. But if multiple text edits start at the same position,\n # we use the index in the array as the key.\n \n- return list(map(lambda pair: pair[1], sorted(enumerate(changes), key=get_start_position, reverse=True)))\n+ return list(sorted(changes, key=operator.itemgetter(0)))\ndiff --git a/plugin/edit.py b/plugin/edit.py\n--- a/plugin/edit.py\n+++ b/plugin/edit.py\n@@ -41,14 +41,24 @@\n \n \n class LspApplyDocumentEditCommand(sublime_plugin.TextCommand):\n+\n def run(self, edit, changes: 'Optional[List[TextEdit]]' = None):\n # Apply the changes in reverse, so that we don't invalidate the range\n # of any change that we haven't applied yet.\n if changes:\n- for change in sort_by_application_order(changes):\n+ last_row, last_col = self.view.rowcol(self.view.size())\n+ for change in reversed(sort_by_application_order(changes)):\n start, end, newText = change\n region = sublime.Region(self.view.text_point(*start), self.view.text_point(*end))\n- self.apply_change(region, newText, edit)\n+\n+ if start[0] > last_row and newText[0] != '\\n':\n+ # Handle when a language server (eg gopls) inserts at a row beyond the document\n+ # some editors create the line automatically, sublime needs to have the newline prepended.\n+ debug('adding new line for edit at line {}, document ended at line {}'.format(start[0], last_row))\n+ self.apply_change(region, '\\n' + newText, edit)\n+ last_row, last_col = self.view.rowcol(self.view.size())\n+ else:\n+ self.apply_change(region, newText, edit)\n \n def apply_change(self, region: 'sublime.Region', newText: str, edit):\n if region.empty():\n", "issue": "Formatting adding trailing newline clears last line\n* OS and language server\r\nLinux + Gopls\r\n* How you installed LSP (Package Control or from git?)\r\nPackage Control\r\n* Minimal reproduction steps\r\n```go\r\npackage main\r\n\r\nimport (\r\n\t\"fmt\"\r\n)\r\n\r\nfunc main() {\r\n\tfmt.Println(\"Hello, world\")\r\n} // No newline!\r\n```\r\n\r\nFormat\r\n\r\n```go\r\npackage main\r\n\r\nimport (\r\n\t\"fmt\"\r\n)\r\n\r\nfunc main() {\r\n\tfmt.Println(\"Hello, world\")\r\n```\r\n* Log\r\nNo diagnostic output. 
\r\n\r\nInitially reported to [gopls](https://github.com/golang/go/issues/33717), but they pointed out that the gopls commandline does the right thing.\r\n\r\nIs this a LSP issue or Sublime itself?\r\nLet me know if I can provide any other helpful information!\n", "before_files": [{"content": "import sublime\nimport sublime_plugin\nfrom .core.edit import sort_by_application_order\ntry:\n from typing import List, Dict, Optional, Any, Iterable, Tuple\n from .core.edit import TextEdit\n assert List and Dict and Optional and Any and Iterable and Tuple and TextEdit\nexcept ImportError:\n pass\nfrom .core.logging import debug\n\n\nclass LspApplyWorkspaceEditCommand(sublime_plugin.WindowCommand):\n def run(self, changes: 'Optional[Dict[str, List[TextEdit]]]' = None):\n documents_changed = 0\n if changes:\n for path, document_changes in changes.items():\n self.open_and_apply_edits(path, document_changes)\n documents_changed += 1\n\n if documents_changed > 0:\n message = 'Applied changes to {} documents'.format(documents_changed)\n self.window.status_message(message)\n else:\n self.window.status_message('No changes to apply to workspace')\n\n def open_and_apply_edits(self, path, file_changes):\n view = self.window.open_file(path)\n if view:\n if view.is_loading():\n # TODO: wait for event instead.\n sublime.set_timeout_async(\n lambda: view.run_command('lsp_apply_document_edit', {'changes': file_changes}),\n 500\n )\n else:\n view.run_command('lsp_apply_document_edit',\n {'changes': file_changes})\n else:\n debug('view not found to apply', path, file_changes)\n\n\nclass LspApplyDocumentEditCommand(sublime_plugin.TextCommand):\n def run(self, edit, changes: 'Optional[List[TextEdit]]' = None):\n # Apply the changes in reverse, so that we don't invalidate the range\n # of any change that we haven't applied yet.\n if changes:\n for change in sort_by_application_order(changes):\n start, end, newText = change\n region = sublime.Region(self.view.text_point(*start), self.view.text_point(*end))\n self.apply_change(region, newText, edit)\n\n def apply_change(self, region: 'sublime.Region', newText: str, edit):\n if region.empty():\n self.view.insert(edit, region.a, newText)\n else:\n if len(newText) > 0:\n self.view.replace(edit, region, newText)\n else:\n self.view.erase(edit, region)\n", "path": "plugin/edit.py"}, {"content": "from .url import uri_to_filename\ntry:\n from typing import List, Dict, Optional, Any, Iterable, Tuple\n TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str]\n assert List and Dict and Optional and Any and Iterable and Tuple\nexcept ImportError:\n pass\n\n\ndef parse_workspace_edit(workspace_edit: 'Dict[str, Any]') -> 'Dict[str, List[TextEdit]]':\n changes = {} # type: Dict[str, List[TextEdit]]\n if 'changes' in workspace_edit:\n for uri, file_changes in workspace_edit.get('changes', {}).items():\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n if 'documentChanges' in workspace_edit:\n for document_change in workspace_edit.get('documentChanges', []):\n uri = document_change.get('textDocument').get('uri')\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in document_change.get('edits'))\n return changes\n\n\ndef parse_range(range: 'Dict[str, int]') -> 'Tuple[int, int]':\n return range['line'], range['character']\n\n\ndef parse_text_edit(text_edit: 'Dict[str, Any]') -> 'TextEdit':\n return (\n parse_range(text_edit['range']['start']),\n parse_range(text_edit['range']['end']),\n text_edit.get('newText', '')\n 
)\n\n\ndef sort_by_application_order(changes: 'Iterable[TextEdit]') -> 'List[TextEdit]':\n\n def get_start_position(pair: 'Tuple[int, TextEdit]'):\n index, change = pair\n return change[0][0], change[0][1], index\n\n # The spec reads:\n # > However, it is possible that multiple edits have the same start position: multiple\n # > inserts, or any number of inserts followed by a single remove or replace edit. If\n # > multiple inserts have the same position, the order in the array defines the order in\n # > which the inserted strings appear in the resulting text.\n # So we sort by start position. But if multiple text edits start at the same position,\n # we use the index in the array as the key.\n\n return list(map(lambda pair: pair[1], sorted(enumerate(changes), key=get_start_position, reverse=True)))\n", "path": "plugin/core/edit.py"}], "after_files": [{"content": "import sublime\nimport sublime_plugin\nfrom .core.edit import sort_by_application_order\ntry:\n from typing import List, Dict, Optional, Any, Iterable, Tuple\n from .core.edit import TextEdit\n assert List and Dict and Optional and Any and Iterable and Tuple and TextEdit\nexcept ImportError:\n pass\nfrom .core.logging import debug\n\n\nclass LspApplyWorkspaceEditCommand(sublime_plugin.WindowCommand):\n def run(self, changes: 'Optional[Dict[str, List[TextEdit]]]' = None):\n documents_changed = 0\n if changes:\n for path, document_changes in changes.items():\n self.open_and_apply_edits(path, document_changes)\n documents_changed += 1\n\n if documents_changed > 0:\n message = 'Applied changes to {} documents'.format(documents_changed)\n self.window.status_message(message)\n else:\n self.window.status_message('No changes to apply to workspace')\n\n def open_and_apply_edits(self, path, file_changes):\n view = self.window.open_file(path)\n if view:\n if view.is_loading():\n # TODO: wait for event instead.\n sublime.set_timeout_async(\n lambda: view.run_command('lsp_apply_document_edit', {'changes': file_changes}),\n 500\n )\n else:\n view.run_command('lsp_apply_document_edit',\n {'changes': file_changes})\n else:\n debug('view not found to apply', path, file_changes)\n\n\nclass LspApplyDocumentEditCommand(sublime_plugin.TextCommand):\n\n def run(self, edit, changes: 'Optional[List[TextEdit]]' = None):\n # Apply the changes in reverse, so that we don't invalidate the range\n # of any change that we haven't applied yet.\n if changes:\n last_row, last_col = self.view.rowcol(self.view.size())\n for change in reversed(sort_by_application_order(changes)):\n start, end, newText = change\n region = sublime.Region(self.view.text_point(*start), self.view.text_point(*end))\n\n if start[0] > last_row and newText[0] != '\\n':\n # Handle when a language server (eg gopls) inserts at a row beyond the document\n # some editors create the line automatically, sublime needs to have the newline prepended.\n debug('adding new line for edit at line {}, document ended at line {}'.format(start[0], last_row))\n self.apply_change(region, '\\n' + newText, edit)\n last_row, last_col = self.view.rowcol(self.view.size())\n else:\n self.apply_change(region, newText, edit)\n\n def apply_change(self, region: 'sublime.Region', newText: str, edit):\n if region.empty():\n self.view.insert(edit, region.a, newText)\n else:\n if len(newText) > 0:\n self.view.replace(edit, region, newText)\n else:\n self.view.erase(edit, region)\n", "path": "plugin/edit.py"}, {"content": "from .url import uri_to_filename\nimport operator\ntry:\n from typing import List, Dict, Optional, Any, 
Iterable, Tuple\n TextEdit = Tuple[Tuple[int, int], Tuple[int, int], str]\n assert List and Dict and Optional and Any and Iterable and Tuple\nexcept ImportError:\n pass\n\n\ndef parse_workspace_edit(workspace_edit: 'Dict[str, Any]') -> 'Dict[str, List[TextEdit]]':\n changes = {} # type: Dict[str, List[TextEdit]]\n if 'changes' in workspace_edit:\n for uri, file_changes in workspace_edit.get('changes', {}).items():\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in file_changes)\n if 'documentChanges' in workspace_edit:\n for document_change in workspace_edit.get('documentChanges', []):\n uri = document_change.get('textDocument').get('uri')\n changes[uri_to_filename(uri)] = list(parse_text_edit(change) for change in document_change.get('edits'))\n return changes\n\n\ndef parse_range(range: 'Dict[str, int]') -> 'Tuple[int, int]':\n return range['line'], range['character']\n\n\ndef parse_text_edit(text_edit: 'Dict[str, Any]') -> 'TextEdit':\n return (\n parse_range(text_edit['range']['start']),\n parse_range(text_edit['range']['end']),\n text_edit.get('newText', '')\n )\n\n\ndef sort_by_application_order(changes: 'Iterable[TextEdit]') -> 'List[TextEdit]':\n # The spec reads:\n # > However, it is possible that multiple edits have the same start position: multiple\n # > inserts, or any number of inserts followed by a single remove or replace edit. If\n # > multiple inserts have the same position, the order in the array defines the order in\n # > which the inserted strings appear in the resulting text.\n # So we sort by start position. But if multiple text edits start at the same position,\n # we use the index in the array as the key.\n\n return list(sorted(changes, key=operator.itemgetter(0)))\n", "path": "plugin/core/edit.py"}]}
| 1,642 | 635 |
gh_patches_debug_1674
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-4324
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tools.environment_append raises if tries to unset variable which was never set
after #4224, I may use the following code, for instance, to ensure variable is not set:
```
with environment_append({'CONAN_BASH_PATH': None}):
pass
```
however, it raises if `CONAN_BASH_PATH` is not set (prior to the environment_append invocation):
```
Traceback (most recent call last):
File "C:\bincrafters\conan\conans\test\unittests\client\tools\os_info\osinfo_test.py", line 39, in test_windows
with environment_append(new_env):
File "c:\users\sse4\appdata\local\programs\python\python36\lib\contextlib.py", line 81, in __enter__
return next(self.gen)
File "C:\bincrafters\conan\conans\client\tools\env.py", line 57, in environment_append
os.environ.pop(var)
File "c:\users\sse4\appdata\local\programs\python\python36\lib\_collections_abc.py", line 795, in pop
value = self[key]
File "c:\users\sse4\appdata\local\programs\python\python36\lib\os.py", line 669, in __getitem__
raise KeyError(key) from None
KeyError: 'CONAN_BASH_PATH'
```
I would expect `tools.environment_append` to be no op in such case, otherwise, it requires additional logic to workaround this behavior.
To help us debug your issue please explain:
- [ ] I've read the [CONTRIBUTING guide](https://github.com/conan-io/conan/blob/develop/.github/CONTRIBUTING.md).
- [ ] I've specified the Conan version, operating system version and any tool that can be relevant.
- [ ] I've explained the steps to reproduce the error or the motivation/use case of the question/suggestion.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conans/client/tools/env.py`
Content:
```
1 import os
2 import sys
3 from contextlib import contextmanager
4
5 from conans.client.run_environment import RunEnvironment
6 from conans.client.tools.files import _path_equals, which
7 from conans.errors import ConanException
8
9
10 @contextmanager
11 def pythonpath(conanfile):
12 python_path = conanfile.env.get("PYTHONPATH", None)
13 if python_path:
14 old_path = sys.path[:]
15 if isinstance(python_path, list):
16 sys.path.extend(python_path)
17 else:
18 sys.path.append(python_path)
19
20 yield
21 sys.path = old_path
22 else:
23 yield
24
25
26 @contextmanager
27 def run_environment(conanfile):
28 with environment_append(RunEnvironment(conanfile).vars):
29 yield
30
31
32 @contextmanager
33 def environment_append(env_vars):
34 """
35 :param env_vars: List (dict) of simple environment vars. {name: value, name2: value2} => e.g.: MYVAR=1
36 The values can also be lists of appendable environment vars. {name: [value, value2]}
37 => e.g. PATH=/path/1:/path/2
38 If the value is set to None, then that environment variable is unset.
39 :return: None
40 """
41 unset_vars = []
42 for key in env_vars.keys():
43 if env_vars[key] is None:
44 unset_vars.append(key)
45 for var in unset_vars:
46 env_vars.pop(var, None)
47 for name, value in env_vars.items():
48 if isinstance(value, list):
49 env_vars[name] = os.pathsep.join(value)
50 old = os.environ.get(name)
51 if old:
52 env_vars[name] += os.pathsep + old
53 if env_vars or unset_vars:
54 old_env = dict(os.environ)
55 os.environ.update(env_vars)
56 for var in unset_vars:
57 os.environ.pop(var)
58 try:
59 yield
60 finally:
61 os.environ.clear()
62 os.environ.update(old_env)
63 else:
64 yield
65
66
67 @contextmanager
68 def no_op():
69 yield
70
71
72 @contextmanager
73 def remove_from_path(command):
74 curpath = os.getenv("PATH")
75 first_it = True
76 for _ in range(30):
77 if not first_it:
78 with environment_append({"PATH": curpath}):
79 the_command = which(command)
80 else:
81 the_command = which(command)
82 first_it = False
83
84 if not the_command:
85 break
86 new_path = []
87 for entry in curpath.split(os.pathsep):
88 if not _path_equals(entry, os.path.dirname(the_command)):
89 new_path.append(entry)
90
91 curpath = os.pathsep.join(new_path)
92 else:
93 raise ConanException("Error in tools.remove_from_path!! couldn't remove the tool '%s' "
94 "from the path after 30 attempts, still found in '%s' this is a "
95 "Conan client bug, please open an issue at: "
96 "https://github.com/conan-io/conan\n\nPATH=%s"
97 % (command, the_command, os.getenv("PATH")))
98
99 with environment_append({"PATH": curpath}):
100 yield
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/conans/client/tools/env.py b/conans/client/tools/env.py
--- a/conans/client/tools/env.py
+++ b/conans/client/tools/env.py
@@ -54,7 +54,7 @@
old_env = dict(os.environ)
os.environ.update(env_vars)
for var in unset_vars:
- os.environ.pop(var)
+ os.environ.pop(var, None)
try:
yield
finally:
|
{"golden_diff": "diff --git a/conans/client/tools/env.py b/conans/client/tools/env.py\n--- a/conans/client/tools/env.py\n+++ b/conans/client/tools/env.py\n@@ -54,7 +54,7 @@\n old_env = dict(os.environ)\n os.environ.update(env_vars)\n for var in unset_vars:\n- os.environ.pop(var)\n+ os.environ.pop(var, None)\n try:\n yield\n finally:\n", "issue": "tools.environment_append raises if tries to unset variable which was never set\nafter #4224, I may use the following code, for instance, to ensure variable is not set:\r\n```\r\nwith environment_append({'CONAN_BASH_PATH': None}):\r\n pass\r\n```\r\nhowever, it raises if `CONAN_BASH_PATH` is not set (prior to the environment_append invocation):\r\n```\r\nTraceback (most recent call last):\r\n File \"C:\\bincrafters\\conan\\conans\\test\\unittests\\client\\tools\\os_info\\osinfo_test.py\", line 39, in test_windows\r\n with environment_append(new_env):\r\n File \"c:\\users\\sse4\\appdata\\local\\programs\\python\\python36\\lib\\contextlib.py\", line 81, in __enter__\r\n return next(self.gen)\r\n File \"C:\\bincrafters\\conan\\conans\\client\\tools\\env.py\", line 57, in environment_append\r\n os.environ.pop(var)\r\n File \"c:\\users\\sse4\\appdata\\local\\programs\\python\\python36\\lib\\_collections_abc.py\", line 795, in pop\r\n value = self[key]\r\n File \"c:\\users\\sse4\\appdata\\local\\programs\\python\\python36\\lib\\os.py\", line 669, in __getitem__\r\n raise KeyError(key) from None\r\nKeyError: 'CONAN_BASH_PATH'\r\n```\r\nI would expect `tools.environment_append` to be no op in such case, otherwise, it requires additional logic to workaround this behavior.\r\n\r\nTo help us debug your issue please explain:\r\n\r\n- [ ] I've read the [CONTRIBUTING guide](https://github.com/conan-io/conan/blob/develop/.github/CONTRIBUTING.md).\r\n- [ ] I've specified the Conan version, operating system version and any tool that can be relevant.\r\n- [ ] I've explained the steps to reproduce the error or the motivation/use case of the question/suggestion.\r\n\r\n\n", "before_files": [{"content": "import os\nimport sys\nfrom contextlib import contextmanager\n\nfrom conans.client.run_environment import RunEnvironment\nfrom conans.client.tools.files import _path_equals, which\nfrom conans.errors import ConanException\n\n\n@contextmanager\ndef pythonpath(conanfile):\n python_path = conanfile.env.get(\"PYTHONPATH\", None)\n if python_path:\n old_path = sys.path[:]\n if isinstance(python_path, list):\n sys.path.extend(python_path)\n else:\n sys.path.append(python_path)\n\n yield\n sys.path = old_path\n else:\n yield\n\n\n@contextmanager\ndef run_environment(conanfile):\n with environment_append(RunEnvironment(conanfile).vars):\n yield\n\n\n@contextmanager\ndef environment_append(env_vars):\n \"\"\"\n :param env_vars: List (dict) of simple environment vars. {name: value, name2: value2} => e.g.: MYVAR=1\n The values can also be lists of appendable environment vars. {name: [value, value2]}\n => e.g. 
PATH=/path/1:/path/2\n If the value is set to None, then that environment variable is unset.\n :return: None\n \"\"\"\n unset_vars = []\n for key in env_vars.keys():\n if env_vars[key] is None:\n unset_vars.append(key)\n for var in unset_vars:\n env_vars.pop(var, None)\n for name, value in env_vars.items():\n if isinstance(value, list):\n env_vars[name] = os.pathsep.join(value)\n old = os.environ.get(name)\n if old:\n env_vars[name] += os.pathsep + old\n if env_vars or unset_vars:\n old_env = dict(os.environ)\n os.environ.update(env_vars)\n for var in unset_vars:\n os.environ.pop(var)\n try:\n yield\n finally:\n os.environ.clear()\n os.environ.update(old_env)\n else:\n yield\n\n\n@contextmanager\ndef no_op():\n yield\n\n\n@contextmanager\ndef remove_from_path(command):\n curpath = os.getenv(\"PATH\")\n first_it = True\n for _ in range(30):\n if not first_it:\n with environment_append({\"PATH\": curpath}):\n the_command = which(command)\n else:\n the_command = which(command)\n first_it = False\n\n if not the_command:\n break\n new_path = []\n for entry in curpath.split(os.pathsep):\n if not _path_equals(entry, os.path.dirname(the_command)):\n new_path.append(entry)\n\n curpath = os.pathsep.join(new_path)\n else:\n raise ConanException(\"Error in tools.remove_from_path!! couldn't remove the tool '%s' \"\n \"from the path after 30 attempts, still found in '%s' this is a \"\n \"Conan client bug, please open an issue at: \"\n \"https://github.com/conan-io/conan\\n\\nPATH=%s\"\n % (command, the_command, os.getenv(\"PATH\")))\n\n with environment_append({\"PATH\": curpath}):\n yield\n", "path": "conans/client/tools/env.py"}], "after_files": [{"content": "import os\nimport sys\nfrom contextlib import contextmanager\n\nfrom conans.client.run_environment import RunEnvironment\nfrom conans.client.tools.files import _path_equals, which\nfrom conans.errors import ConanException\n\n\n@contextmanager\ndef pythonpath(conanfile):\n python_path = conanfile.env.get(\"PYTHONPATH\", None)\n if python_path:\n old_path = sys.path[:]\n if isinstance(python_path, list):\n sys.path.extend(python_path)\n else:\n sys.path.append(python_path)\n\n yield\n sys.path = old_path\n else:\n yield\n\n\n@contextmanager\ndef run_environment(conanfile):\n with environment_append(RunEnvironment(conanfile).vars):\n yield\n\n\n@contextmanager\ndef environment_append(env_vars):\n \"\"\"\n :param env_vars: List (dict) of simple environment vars. {name: value, name2: value2} => e.g.: MYVAR=1\n The values can also be lists of appendable environment vars. {name: [value, value2]}\n => e.g. 
PATH=/path/1:/path/2\n If the value is set to None, then that environment variable is unset.\n :return: None\n \"\"\"\n unset_vars = []\n for key in env_vars.keys():\n if env_vars[key] is None:\n unset_vars.append(key)\n for var in unset_vars:\n env_vars.pop(var, None)\n for name, value in env_vars.items():\n if isinstance(value, list):\n env_vars[name] = os.pathsep.join(value)\n old = os.environ.get(name)\n if old:\n env_vars[name] += os.pathsep + old\n if env_vars or unset_vars:\n old_env = dict(os.environ)\n os.environ.update(env_vars)\n for var in unset_vars:\n os.environ.pop(var, None)\n try:\n yield\n finally:\n os.environ.clear()\n os.environ.update(old_env)\n else:\n yield\n\n\n@contextmanager\ndef no_op():\n yield\n\n\n@contextmanager\ndef remove_from_path(command):\n curpath = os.getenv(\"PATH\")\n first_it = True\n for _ in range(30):\n if not first_it:\n with environment_append({\"PATH\": curpath}):\n the_command = which(command)\n else:\n the_command = which(command)\n first_it = False\n\n if not the_command:\n break\n new_path = []\n for entry in curpath.split(os.pathsep):\n if not _path_equals(entry, os.path.dirname(the_command)):\n new_path.append(entry)\n\n curpath = os.pathsep.join(new_path)\n else:\n raise ConanException(\"Error in tools.remove_from_path!! couldn't remove the tool '%s' \"\n \"from the path after 30 attempts, still found in '%s' this is a \"\n \"Conan client bug, please open an issue at: \"\n \"https://github.com/conan-io/conan\\n\\nPATH=%s\"\n % (command, the_command, os.getenv(\"PATH\")))\n\n with environment_append({\"PATH\": curpath}):\n yield\n", "path": "conans/client/tools/env.py"}]}
| 1,576 | 95 |
gh_patches_debug_3685
|
rasdani/github-patches
|
git_diff
|
praw-dev__praw-888
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ListingGenerator gets stuck in loop when 'before' parameter is supplied.
## Issue Description
When retrieving submissions and supplying the 'before' parameter, the ListingGenerator gets stuck in a loop where the same 'after' parameter is submitted to reddit infinitely. I'm submitting a pull request with a fix. Additional details can be found in the pull request.
Example:
``reddit.subreddit(subreddit).new(limit=1024,params={'before': 't3_7xxxxx', 'count': 1024 })``
## System Information
PRAW Version: 5.3.0
Python Version: 3.5.3
Operating System: Debian 9.3
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `praw/models/listing/generator.py`
Content:
```
1 """Provide the ListingGenerator class."""
2 from copy import deepcopy
3
4 from .listing import FlairListing
5 from ..base import PRAWBase
6
7
8 class ListingGenerator(PRAWBase):
9 """Instances of this class generate :class:`.RedditBase` instances.
10
11 .. warning:: This class should not be directly utilized. Instead you will
12 find a number of methods that return instances of the class:
13
14 http://praw.readthedocs.io/en/latest/search.html?q=ListingGenerator
15
16 """
17
18 def __init__(self, reddit, url, limit=100, params=None):
19 """Initialize a ListingGenerator instance.
20
21 :param reddit: An instance of :class:`.Reddit`.
22 :param url: A URL returning a reddit listing.
23 :param limit: The number of content entries to fetch. If ``limit`` is
24 None, then fetch as many entries as possible. Most of reddit's
25 listings contain a maximum of 1000 items, and are returned 100 at a
26 time. This class will automatically issue all necessary
27 requests (default: 100).
28 :param params: A dictionary containing additional query string
29 parameters to send with the request.
30
31 """
32 super(ListingGenerator, self).__init__(reddit, None)
33 self._exhausted = False
34 self._listing = None
35 self._list_index = None
36 self.limit = limit
37 self.params = deepcopy(params) if params else {}
38 self.params['limit'] = limit or 1024
39 self.url = url
40 self.yielded = 0
41
42 def __iter__(self):
43 """Permit ListingGenerator to operate as an iterator."""
44 return self
45
46 def __next__(self):
47 """Permit ListingGenerator to operate as a generator in py3."""
48 if self.limit is not None and self.yielded >= self.limit:
49 raise StopIteration()
50
51 if self._listing is None or self._list_index >= len(self._listing):
52 self._next_batch()
53
54 self._list_index += 1
55 self.yielded += 1
56 return self._listing[self._list_index - 1]
57
58 def _next_batch(self):
59 if self._exhausted:
60 raise StopIteration()
61
62 self._listing = self._reddit.get(self.url, params=self.params)
63 if isinstance(self._listing, list):
64 self._listing = self._listing[1] # for submission duplicates
65 elif isinstance(self._listing, dict):
66 self._listing = FlairListing(self._reddit, self._listing)
67 self._list_index = 0
68
69 if not self._listing:
70 raise StopIteration()
71
72 if self._listing.after:
73 self.params['after'] = self._listing.after
74 else:
75 self._exhausted = True
76
77 def next(self):
78 """Permit ListingGenerator to operate as a generator in py2."""
79 return self.__next__()
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/praw/models/listing/generator.py b/praw/models/listing/generator.py
--- a/praw/models/listing/generator.py
+++ b/praw/models/listing/generator.py
@@ -69,7 +69,8 @@
if not self._listing:
raise StopIteration()
- if self._listing.after:
+ if (self._listing.after and
+ self._listing.after != self.params.get('after')):
self.params['after'] = self._listing.after
else:
self._exhausted = True
|
{"golden_diff": "diff --git a/praw/models/listing/generator.py b/praw/models/listing/generator.py\n--- a/praw/models/listing/generator.py\n+++ b/praw/models/listing/generator.py\n@@ -69,7 +69,8 @@\n if not self._listing:\n raise StopIteration()\n \n- if self._listing.after:\n+ if (self._listing.after and\n+ self._listing.after != self.params.get('after')):\n self.params['after'] = self._listing.after\n else:\n self._exhausted = True\n", "issue": "ListingGenerator gets stuck in loop when 'before' parameter is supplied.\n## Issue Description\r\n\r\nWhen retrieving submissions and supplying the 'before' parameter, the ListingGenerator gets stuck in a loop where the same 'after' parameter is submitted to reddit infinitely. I'm submitting a pull request with a fix. Additional details can be found in the pull request.\r\n\r\nExample:\r\n``reddit.subreddit(subreddit).new(limit=1024,params={'before': 't3_7xxxxx', 'count': 1024 })``\r\n\r\n## System Information\r\n\r\n PRAW Version: 5.3.0\r\n Python Version: 3.5.3\r\nOperating System: Debian 9.3\r\n\n", "before_files": [{"content": "\"\"\"Provide the ListingGenerator class.\"\"\"\nfrom copy import deepcopy\n\nfrom .listing import FlairListing\nfrom ..base import PRAWBase\n\n\nclass ListingGenerator(PRAWBase):\n \"\"\"Instances of this class generate :class:`.RedditBase` instances.\n\n .. warning:: This class should not be directly utilized. Instead you will\n find a number of methods that return instances of the class:\n\n http://praw.readthedocs.io/en/latest/search.html?q=ListingGenerator\n\n \"\"\"\n\n def __init__(self, reddit, url, limit=100, params=None):\n \"\"\"Initialize a ListingGenerator instance.\n\n :param reddit: An instance of :class:`.Reddit`.\n :param url: A URL returning a reddit listing.\n :param limit: The number of content entries to fetch. If ``limit`` is\n None, then fetch as many entries as possible. Most of reddit's\n listings contain a maximum of 1000 items, and are returned 100 at a\n time. 
This class will automatically issue all necessary\n requests (default: 100).\n :param params: A dictionary containing additional query string\n parameters to send with the request.\n\n \"\"\"\n super(ListingGenerator, self).__init__(reddit, None)\n self._exhausted = False\n self._listing = None\n self._list_index = None\n self.limit = limit\n self.params = deepcopy(params) if params else {}\n self.params['limit'] = limit or 1024\n self.url = url\n self.yielded = 0\n\n def __iter__(self):\n \"\"\"Permit ListingGenerator to operate as an iterator.\"\"\"\n return self\n\n def __next__(self):\n \"\"\"Permit ListingGenerator to operate as a generator in py3.\"\"\"\n if self.limit is not None and self.yielded >= self.limit:\n raise StopIteration()\n\n if self._listing is None or self._list_index >= len(self._listing):\n self._next_batch()\n\n self._list_index += 1\n self.yielded += 1\n return self._listing[self._list_index - 1]\n\n def _next_batch(self):\n if self._exhausted:\n raise StopIteration()\n\n self._listing = self._reddit.get(self.url, params=self.params)\n if isinstance(self._listing, list):\n self._listing = self._listing[1] # for submission duplicates\n elif isinstance(self._listing, dict):\n self._listing = FlairListing(self._reddit, self._listing)\n self._list_index = 0\n\n if not self._listing:\n raise StopIteration()\n\n if self._listing.after:\n self.params['after'] = self._listing.after\n else:\n self._exhausted = True\n\n def next(self):\n \"\"\"Permit ListingGenerator to operate as a generator in py2.\"\"\"\n return self.__next__()\n", "path": "praw/models/listing/generator.py"}], "after_files": [{"content": "\"\"\"Provide the ListingGenerator class.\"\"\"\nfrom copy import deepcopy\n\nfrom .listing import FlairListing\nfrom ..base import PRAWBase\n\n\nclass ListingGenerator(PRAWBase):\n \"\"\"Instances of this class generate :class:`.RedditBase` instances.\n\n .. warning:: This class should not be directly utilized. Instead you will\n find a number of methods that return instances of the class:\n\n http://praw.readthedocs.io/en/latest/search.html?q=ListingGenerator\n\n \"\"\"\n\n def __init__(self, reddit, url, limit=100, params=None):\n \"\"\"Initialize a ListingGenerator instance.\n\n :param reddit: An instance of :class:`.Reddit`.\n :param url: A URL returning a reddit listing.\n :param limit: The number of content entries to fetch. If ``limit`` is\n None, then fetch as many entries as possible. Most of reddit's\n listings contain a maximum of 1000 items, and are returned 100 at a\n time. 
This class will automatically issue all necessary\n requests (default: 100).\n :param params: A dictionary containing additional query string\n parameters to send with the request.\n\n \"\"\"\n super(ListingGenerator, self).__init__(reddit, None)\n self._exhausted = False\n self._listing = None\n self._list_index = None\n self.limit = limit\n self.params = deepcopy(params) if params else {}\n self.params['limit'] = limit or 1024\n self.url = url\n self.yielded = 0\n\n def __iter__(self):\n \"\"\"Permit ListingGenerator to operate as an iterator.\"\"\"\n return self\n\n def __next__(self):\n \"\"\"Permit ListingGenerator to operate as a generator in py3.\"\"\"\n if self.limit is not None and self.yielded >= self.limit:\n raise StopIteration()\n\n if self._listing is None or self._list_index >= len(self._listing):\n self._next_batch()\n\n self._list_index += 1\n self.yielded += 1\n return self._listing[self._list_index - 1]\n\n def _next_batch(self):\n if self._exhausted:\n raise StopIteration()\n\n self._listing = self._reddit.get(self.url, params=self.params)\n if isinstance(self._listing, list):\n self._listing = self._listing[1] # for submission duplicates\n elif isinstance(self._listing, dict):\n self._listing = FlairListing(self._reddit, self._listing)\n self._list_index = 0\n\n if not self._listing:\n raise StopIteration()\n\n if (self._listing.after and\n self._listing.after != self.params.get('after')):\n self.params['after'] = self._listing.after\n else:\n self._exhausted = True\n\n def next(self):\n \"\"\"Permit ListingGenerator to operate as a generator in py2.\"\"\"\n return self.__next__()\n", "path": "praw/models/listing/generator.py"}]}
| 1,205 | 124 |
gh_patches_debug_17194
|
rasdani/github-patches
|
git_diff
|
vacanza__python-holidays-1592
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Regression in holidays in the Netherlands
I noticed the change Update Netherlands holidays: add holiday categories #1552 broke my unit tests as it no longer considers Liberation day as a holiday on the 5yr interval even though it is a holiday for most people in the Netherlands every 5 years.
On Liberation day (Bevrijdingsdag) the majority of employees have a day off every 5 years (2005, 2010, 2015, etc.). This was the previous behaviour which worked as expected.
Now the 5-year rule of Liberation day is combined with Good Friday in the OPTIONAL category. This equates the status of Liberation day with Good Friday, but this is not logical. Good Fridays is more similar to Liberation Day on the non 5yr-years as those are still a bank holiday but fewer people have a guaranteed holiday than in the 5-yr years. There is no option to add the non-5yr holidays it seems.
The behaviour I would expect is:
- PUBLIC includes 5yr Liberation days
- OPTIONAL includes Good Friday and the non-5yr liberation days
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `holidays/countries/netherlands.py`
Content:
```
1 # python-holidays
2 # ---------------
3 # A fast, efficient Python library for generating country, province and state
4 # specific sets of holidays on the fly. It aims to make determining whether a
5 # specific date is a holiday as fast and flexible as possible.
6 #
7 # Authors: dr-prodigy <[email protected]> (c) 2017-2023
8 # ryanss <[email protected]> (c) 2014-2017
9 # Website: https://github.com/dr-prodigy/python-holidays
10 # License: MIT (see LICENSE file)
11
12 from datetime import date
13 from datetime import timedelta as td
14 from gettext import gettext as tr
15
16 from holidays.calendars.gregorian import APR, AUG
17 from holidays.constants import OPTIONAL, PUBLIC
18 from holidays.groups import ChristianHolidays, InternationalHolidays
19 from holidays.holiday_base import HolidayBase
20
21
22 class Netherlands(HolidayBase, ChristianHolidays, InternationalHolidays):
23 """
24 References:
25
26 - https://en.wikipedia.org/wiki/Public_holidays_in_the_Netherlands
27 - https://nl.wikipedia.org/wiki/Feestdagen_in_Nederland
28 - http://www.iamsterdam.com/en/plan-your-trip/practical-info/public-holidays
29 """
30
31 country = "NL"
32 default_language = "nl"
33 supported_categories = (OPTIONAL, PUBLIC)
34 supported_languages = ("en_US", "nl", "uk")
35
36 def __init__(self, *args, **kwargs):
37 ChristianHolidays.__init__(self)
38 InternationalHolidays.__init__(self)
39 super().__init__(*args, **kwargs)
40
41 def _populate_public_holidays(self):
42 # New Year's Day.
43 self._add_new_years_day(tr("Nieuwjaarsdag"))
44
45 # Easter Sunday.
46 self._add_easter_sunday(tr("Eerste paasdag"))
47
48 # Easter Monday.
49 self._add_easter_monday(tr("Tweede paasdag"))
50
51 # King's / Queen's day
52 if self._year >= 1891:
53 name = (
54 # King's Day.
55 tr("Koningsdag")
56 if self._year >= 2014
57 # Queen's Day.
58 else tr("Koninginnedag")
59 )
60 if self._year >= 2014:
61 dt = date(self._year, APR, 27)
62 elif self._year >= 1949:
63 dt = date(self._year, APR, 30)
64 else:
65 dt = date(self._year, AUG, 31)
66 if self._is_sunday(dt):
67 dt += td(days=-1) if self._year >= 1980 else td(days=+1)
68 self._add_holiday(name, dt)
69
70 # Ascension Day.
71 self._add_ascension_thursday(tr("Hemelvaartsdag"))
72
73 # Whit Sunday.
74 self._add_whit_sunday(tr("Eerste Pinksterdag"))
75
76 # Whit Monday.
77 self._add_whit_monday(tr("Tweede Pinksterdag"))
78
79 # Christmas Day.
80 self._add_christmas_day(tr("Eerste Kerstdag"))
81
82 # Second Day of Christmas.
83 self._add_christmas_day_two(tr("Tweede Kerstdag"))
84
85 def _populate_optional_holidays(self):
86 # Good Friday.
87 self._add_good_friday(tr("Goede Vrijdag"))
88
89 if (self._year >= 1945 and self._year % 5 == 0) or self._year >= 1990:
90 # Liberation Day.
91 self._add_holiday_may_5(tr("Bevrijdingsdag"))
92
93
94 class NL(Netherlands):
95 pass
96
97
98 class NLD(Netherlands):
99 pass
100
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/holidays/countries/netherlands.py b/holidays/countries/netherlands.py
--- a/holidays/countries/netherlands.py
+++ b/holidays/countries/netherlands.py
@@ -67,6 +67,10 @@
dt += td(days=-1) if self._year >= 1980 else td(days=+1)
self._add_holiday(name, dt)
+ if self._year >= 1950 and self._year % 5 == 0:
+ # Liberation Day.
+ self._add_holiday_may_5(tr("Bevrijdingsdag"))
+
# Ascension Day.
self._add_ascension_thursday(tr("Hemelvaartsdag"))
@@ -86,7 +90,7 @@
# Good Friday.
self._add_good_friday(tr("Goede Vrijdag"))
- if (self._year >= 1945 and self._year % 5 == 0) or self._year >= 1990:
+ if self._year >= 1990:
# Liberation Day.
self._add_holiday_may_5(tr("Bevrijdingsdag"))
|
{"golden_diff": "diff --git a/holidays/countries/netherlands.py b/holidays/countries/netherlands.py\n--- a/holidays/countries/netherlands.py\n+++ b/holidays/countries/netherlands.py\n@@ -67,6 +67,10 @@\n dt += td(days=-1) if self._year >= 1980 else td(days=+1)\n self._add_holiday(name, dt)\n \n+ if self._year >= 1950 and self._year % 5 == 0:\n+ # Liberation Day.\n+ self._add_holiday_may_5(tr(\"Bevrijdingsdag\"))\n+\n # Ascension Day.\n self._add_ascension_thursday(tr(\"Hemelvaartsdag\"))\n \n@@ -86,7 +90,7 @@\n # Good Friday.\n self._add_good_friday(tr(\"Goede Vrijdag\"))\n \n- if (self._year >= 1945 and self._year % 5 == 0) or self._year >= 1990:\n+ if self._year >= 1990:\n # Liberation Day.\n self._add_holiday_may_5(tr(\"Bevrijdingsdag\"))\n", "issue": "Regression in holidays in the Netherlands\nI noticed the change Update Netherlands holidays: add holiday categories #1552 broke my unit tests as it no longer considers Liberation day as a holiday on the 5yr interval even though it is a holiday for most people in the Netherlands every 5 years.\r\n\r\nOn Liberation day (Bevrijdingsdag) the majority of employees have a day off every 5 years (2005, 2010, 2015, etc.). This was the previous behaviour which worked as expected.\r\n\r\nNow the 5-year rule of Liberation day is combined with Good Friday in the OPTIONAL category. This equates the status of Liberation day with Good Friday, but this is not logical. Good Fridays is more similar to Liberation Day on the non 5yr-years as those are still a bank holiday but fewer people have a guaranteed holiday than in the 5-yr years. There is no option to add the non-5yr holidays it seems.\r\n\r\nThe behaviour I would expect is:\r\n- PUBLIC includes 5yr Liberation days\r\n- OPTIONAL includes Good Friday and the non-5yr liberation days\n", "before_files": [{"content": "# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. 
It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nfrom datetime import date\nfrom datetime import timedelta as td\nfrom gettext import gettext as tr\n\nfrom holidays.calendars.gregorian import APR, AUG\nfrom holidays.constants import OPTIONAL, PUBLIC\nfrom holidays.groups import ChristianHolidays, InternationalHolidays\nfrom holidays.holiday_base import HolidayBase\n\n\nclass Netherlands(HolidayBase, ChristianHolidays, InternationalHolidays):\n \"\"\"\n References:\n\n - https://en.wikipedia.org/wiki/Public_holidays_in_the_Netherlands\n - https://nl.wikipedia.org/wiki/Feestdagen_in_Nederland\n - http://www.iamsterdam.com/en/plan-your-trip/practical-info/public-holidays\n \"\"\"\n\n country = \"NL\"\n default_language = \"nl\"\n supported_categories = (OPTIONAL, PUBLIC)\n supported_languages = (\"en_US\", \"nl\", \"uk\")\n\n def __init__(self, *args, **kwargs):\n ChristianHolidays.__init__(self)\n InternationalHolidays.__init__(self)\n super().__init__(*args, **kwargs)\n\n def _populate_public_holidays(self):\n # New Year's Day.\n self._add_new_years_day(tr(\"Nieuwjaarsdag\"))\n\n # Easter Sunday.\n self._add_easter_sunday(tr(\"Eerste paasdag\"))\n\n # Easter Monday.\n self._add_easter_monday(tr(\"Tweede paasdag\"))\n\n # King's / Queen's day\n if self._year >= 1891:\n name = (\n # King's Day.\n tr(\"Koningsdag\")\n if self._year >= 2014\n # Queen's Day.\n else tr(\"Koninginnedag\")\n )\n if self._year >= 2014:\n dt = date(self._year, APR, 27)\n elif self._year >= 1949:\n dt = date(self._year, APR, 30)\n else:\n dt = date(self._year, AUG, 31)\n if self._is_sunday(dt):\n dt += td(days=-1) if self._year >= 1980 else td(days=+1)\n self._add_holiday(name, dt)\n\n # Ascension Day.\n self._add_ascension_thursday(tr(\"Hemelvaartsdag\"))\n\n # Whit Sunday.\n self._add_whit_sunday(tr(\"Eerste Pinksterdag\"))\n\n # Whit Monday.\n self._add_whit_monday(tr(\"Tweede Pinksterdag\"))\n\n # Christmas Day.\n self._add_christmas_day(tr(\"Eerste Kerstdag\"))\n\n # Second Day of Christmas.\n self._add_christmas_day_two(tr(\"Tweede Kerstdag\"))\n\n def _populate_optional_holidays(self):\n # Good Friday.\n self._add_good_friday(tr(\"Goede Vrijdag\"))\n\n if (self._year >= 1945 and self._year % 5 == 0) or self._year >= 1990:\n # Liberation Day.\n self._add_holiday_may_5(tr(\"Bevrijdingsdag\"))\n\n\nclass NL(Netherlands):\n pass\n\n\nclass NLD(Netherlands):\n pass\n", "path": "holidays/countries/netherlands.py"}], "after_files": [{"content": "# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. 
It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nfrom datetime import date\nfrom datetime import timedelta as td\nfrom gettext import gettext as tr\n\nfrom holidays.calendars.gregorian import APR, AUG\nfrom holidays.constants import OPTIONAL, PUBLIC\nfrom holidays.groups import ChristianHolidays, InternationalHolidays\nfrom holidays.holiday_base import HolidayBase\n\n\nclass Netherlands(HolidayBase, ChristianHolidays, InternationalHolidays):\n \"\"\"\n References:\n\n - https://en.wikipedia.org/wiki/Public_holidays_in_the_Netherlands\n - https://nl.wikipedia.org/wiki/Feestdagen_in_Nederland\n - http://www.iamsterdam.com/en/plan-your-trip/practical-info/public-holidays\n \"\"\"\n\n country = \"NL\"\n default_language = \"nl\"\n supported_categories = (OPTIONAL, PUBLIC)\n supported_languages = (\"en_US\", \"nl\", \"uk\")\n\n def __init__(self, *args, **kwargs):\n ChristianHolidays.__init__(self)\n InternationalHolidays.__init__(self)\n super().__init__(*args, **kwargs)\n\n def _populate_public_holidays(self):\n # New Year's Day.\n self._add_new_years_day(tr(\"Nieuwjaarsdag\"))\n\n # Easter Sunday.\n self._add_easter_sunday(tr(\"Eerste paasdag\"))\n\n # Easter Monday.\n self._add_easter_monday(tr(\"Tweede paasdag\"))\n\n # King's / Queen's day\n if self._year >= 1891:\n name = (\n # King's Day.\n tr(\"Koningsdag\")\n if self._year >= 2014\n # Queen's Day.\n else tr(\"Koninginnedag\")\n )\n if self._year >= 2014:\n dt = date(self._year, APR, 27)\n elif self._year >= 1949:\n dt = date(self._year, APR, 30)\n else:\n dt = date(self._year, AUG, 31)\n if self._is_sunday(dt):\n dt += td(days=-1) if self._year >= 1980 else td(days=+1)\n self._add_holiday(name, dt)\n\n if self._year >= 1950 and self._year % 5 == 0:\n # Liberation Day.\n self._add_holiday_may_5(tr(\"Bevrijdingsdag\"))\n\n # Ascension Day.\n self._add_ascension_thursday(tr(\"Hemelvaartsdag\"))\n\n # Whit Sunday.\n self._add_whit_sunday(tr(\"Eerste Pinksterdag\"))\n\n # Whit Monday.\n self._add_whit_monday(tr(\"Tweede Pinksterdag\"))\n\n # Christmas Day.\n self._add_christmas_day(tr(\"Eerste Kerstdag\"))\n\n # Second Day of Christmas.\n self._add_christmas_day_two(tr(\"Tweede Kerstdag\"))\n\n def _populate_optional_holidays(self):\n # Good Friday.\n self._add_good_friday(tr(\"Goede Vrijdag\"))\n\n if self._year >= 1990:\n # Liberation Day.\n self._add_holiday_may_5(tr(\"Bevrijdingsdag\"))\n\n\nclass NL(Netherlands):\n pass\n\n\nclass NLD(Netherlands):\n pass\n", "path": "holidays/countries/netherlands.py"}]}
| 1,539 | 266 |
gh_patches_debug_6755
|
rasdani/github-patches
|
git_diff
|
wagtail__wagtail-3277
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Elasticsearch backend indexes draft updates of existing pages
### Issue Summary
When saving a draft version of an existing live page the Elasticsearch backend reindexes the page with the draft content. The reindexed content will potentially cause frontend search results to include the page if the search query matches the draft content.
I'm using the following search query in my view:
search_results = Page.objects.live().search(search_query)
New content that is saved as draft is not an issue since the live() filter excludes it.
### Steps to Reproduce
1. Edit an indexed field of an existing published page
2. Insert a unique term in the indexed field
3. Click 'Save Draft'
4. On the fontend search for the unique term.
5. The editted page will be returned in the results
I can see the value of having the draft content indexed on the Wagtail backend but since the frontend shares the same index, that can be a problem.
### Technical details
* Python version: 3.5.2.
* Django version: 1.10.4.
* Wagtail version: 1.8.
* Elasticsearch: 5
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/wagtailsearch/signal_handlers.py`
Content:
```
1 from __future__ import absolute_import, unicode_literals
2
3 from django.db.models.signals import post_delete, post_save
4
5 from wagtail.wagtailsearch import index
6
7
8 def post_save_signal_handler(instance, **kwargs):
9 index.insert_or_update_object(instance)
10
11
12 def post_delete_signal_handler(instance, **kwargs):
13 index.remove_object(instance)
14
15
16 def register_signal_handlers():
17 # Loop through list and register signal handlers for each one
18 for model in index.get_indexed_models():
19 post_save.connect(post_save_signal_handler, sender=model)
20 post_delete.connect(post_delete_signal_handler, sender=model)
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/wagtail/wagtailsearch/signal_handlers.py b/wagtail/wagtailsearch/signal_handlers.py
--- a/wagtail/wagtailsearch/signal_handlers.py
+++ b/wagtail/wagtailsearch/signal_handlers.py
@@ -5,7 +5,13 @@
from wagtail.wagtailsearch import index
-def post_save_signal_handler(instance, **kwargs):
+def post_save_signal_handler(instance, update_fields=None, **kwargs):
+ if update_fields is not None:
+ # fetch a fresh copy of instance from the database to ensure
+ # that we're not indexing any of the unsaved data contained in
+ # the fields that were not passed in update_fields
+ instance = type(instance).objects.get(pk=instance.pk)
+
index.insert_or_update_object(instance)
|
{"golden_diff": "diff --git a/wagtail/wagtailsearch/signal_handlers.py b/wagtail/wagtailsearch/signal_handlers.py\n--- a/wagtail/wagtailsearch/signal_handlers.py\n+++ b/wagtail/wagtailsearch/signal_handlers.py\n@@ -5,7 +5,13 @@\n from wagtail.wagtailsearch import index\n \n \n-def post_save_signal_handler(instance, **kwargs):\n+def post_save_signal_handler(instance, update_fields=None, **kwargs):\n+ if update_fields is not None:\n+ # fetch a fresh copy of instance from the database to ensure\n+ # that we're not indexing any of the unsaved data contained in\n+ # the fields that were not passed in update_fields\n+ instance = type(instance).objects.get(pk=instance.pk)\n+\n index.insert_or_update_object(instance)\n", "issue": "Elasticsearch backend indexes draft updates of existing pages\n### Issue Summary\r\n\r\nWhen saving a draft version of an existing live page the Elasticsearch backend reindexes the page with the draft content. The reindexed content will potentially cause frontend search results to include the page if the search query matches the draft content.\r\n\r\nI'm using the following search query in my view:\r\n\r\n search_results = Page.objects.live().search(search_query)\r\n\r\nNew content that is saved as draft is not an issue since the live() filter excludes it.\r\n\r\n\r\n### Steps to Reproduce\r\n\r\n1. Edit an indexed field of an existing published page\r\n2. Insert a unique term in the indexed field\r\n3. Click 'Save Draft'\r\n4. On the fontend search for the unique term.\r\n5. The editted page will be returned in the results\r\n\r\nI can see the value of having the draft content indexed on the Wagtail backend but since the frontend shares the same index, that can be a problem.\r\n\r\n### Technical details\r\n\r\n* Python version: 3.5.2.\r\n* Django version: 1.10.4.\r\n* Wagtail version: 1.8.\r\n* Elasticsearch: 5\r\n\n", "before_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom django.db.models.signals import post_delete, post_save\n\nfrom wagtail.wagtailsearch import index\n\n\ndef post_save_signal_handler(instance, **kwargs):\n index.insert_or_update_object(instance)\n\n\ndef post_delete_signal_handler(instance, **kwargs):\n index.remove_object(instance)\n\n\ndef register_signal_handlers():\n # Loop through list and register signal handlers for each one\n for model in index.get_indexed_models():\n post_save.connect(post_save_signal_handler, sender=model)\n post_delete.connect(post_delete_signal_handler, sender=model)\n", "path": "wagtail/wagtailsearch/signal_handlers.py"}], "after_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom django.db.models.signals import post_delete, post_save\n\nfrom wagtail.wagtailsearch import index\n\n\ndef post_save_signal_handler(instance, update_fields=None, **kwargs):\n if update_fields is not None:\n # fetch a fresh copy of instance from the database to ensure\n # that we're not indexing any of the unsaved data contained in\n # the fields that were not passed in update_fields\n instance = type(instance).objects.get(pk=instance.pk)\n\n index.insert_or_update_object(instance)\n\n\ndef post_delete_signal_handler(instance, **kwargs):\n index.remove_object(instance)\n\n\ndef register_signal_handlers():\n # Loop through list and register signal handlers for each one\n for model in index.get_indexed_models():\n post_save.connect(post_save_signal_handler, sender=model)\n post_delete.connect(post_delete_signal_handler, sender=model)\n", "path": 
"wagtail/wagtailsearch/signal_handlers.py"}]}
| 664 | 180 |
gh_patches_debug_23357
|
rasdani/github-patches
|
git_diff
|
iterative__dvc-1076
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: dvc remote remove leads to Initialization error / config file error.
**Setup**
- dvc version 0.18.9, installed with pip, python Anaconda 3.6.4, Ubuntu 16.04
**Repro**
```bash
set -e
set -x
rm -rf foo
mkdir -p foo && cd foo
git init && dvc init
echo bar > bar.txt
dvc remote add -d dummy s3://dummy
dvc remote remove dummy
dvc add bar.txt
```
```bash
+ rm -rf foo
+ mkdir -p foo
+ cd foo
+ git init
Initialized empty Git repository in /home/tmain/foo/.git/
+ dvc init
Adding '.dvc/state' to '.dvc/.gitignore'.
Adding '.dvc/state.lock' to '.dvc/.gitignore'.
Adding '.dvc/link.state' to '.dvc/.gitignore'.
Adding '.dvc/link.state.lock' to '.dvc/.gitignore'.
Adding '.dvc/lock' to '.dvc/.gitignore'.
Adding '.dvc/config.local' to '.dvc/.gitignore'.
Adding '.dvc/updater' to '.dvc/.gitignore'.
Adding '.dvc/cache' to '.dvc/.gitignore'.
Checking for updates...
+ echo bar
+ dvc remote add -d dummy s3://dummy
Setting 'dummy' as a default remote.
+ dvc remote remove dummy
+ dvc add bar.txt
Error: Initialization error: Config file error: Can't find remote section 'remote "dummy"' in config
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dvc/command/remote.py`
Content:
```
1 import re
2
3 from dvc.config import Config
4 from dvc.command.config import CmdConfig
5 from dvc.logger import Logger
6
7
8 class CmdRemoteAdd(CmdConfig):
9 def run(self):
10 section = Config.SECTION_REMOTE_FMT.format(self.args.name)
11 ret = self.set(section, Config.SECTION_REMOTE_URL, self.args.url)
12 if ret != 0:
13 return ret
14
15 if self.args.default:
16 msg = 'Setting \'{}\' as a default remote.'.format(self.args.name)
17 Logger.info(msg)
18 ret = self.set(Config.SECTION_CORE,
19 Config.SECTION_CORE_REMOTE,
20 self.args.name)
21
22 return ret
23
24
25 class CmdRemoteRemove(CmdConfig):
26 def run(self):
27 section = Config.SECTION_REMOTE_FMT.format(self.args.name)
28 return self.unset(section, Config.SECTION_REMOTE_URL)
29
30
31 class CmdRemoteModify(CmdConfig):
32 def run(self):
33 section = Config.SECTION_REMOTE_FMT.format(self.args.name)
34 self.args.name = '{}.{}'.format(section, self.args.option)
35 return super(CmdRemoteModify, self).run()
36
37
38 class CmdRemoteList(CmdConfig):
39 def run(self):
40 for section in self.configobj.keys():
41 r = re.match(Config.SECTION_REMOTE_REGEX, section)
42 if r:
43 name = r.group('name')
44 url = self.configobj[section].get(Config.SECTION_REMOTE_URL,
45 '')
46 Logger.info('{}\t{}'.format(name, url))
47 return 0
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/dvc/command/remote.py b/dvc/command/remote.py
--- a/dvc/command/remote.py
+++ b/dvc/command/remote.py
@@ -1,4 +1,6 @@
+import os
import re
+import configobj
from dvc.config import Config
from dvc.command.config import CmdConfig
@@ -23,9 +25,35 @@
class CmdRemoteRemove(CmdConfig):
+ def _remove_default(self, config_file, remote):
+ path = os.path.join(os.path.dirname(self.config_file),
+ config_file)
+ config = configobj.ConfigObj(path)
+
+ core = config.get(Config.SECTION_CORE, None)
+ if core is None:
+ return
+
+ default = core.get(Config.SECTION_CORE_REMOTE, None)
+ if default is None:
+ return
+
+ if default == remote:
+ del config[Config.SECTION_CORE][Config.SECTION_CORE_REMOTE]
+ if len(config[Config.SECTION_CORE]) == 0:
+ del config[Config.SECTION_CORE]
+
+ config.write()
+
def run(self):
section = Config.SECTION_REMOTE_FMT.format(self.args.name)
- return self.unset(section, Config.SECTION_REMOTE_URL)
+ ret = self.unset(section)
+ if ret != 0:
+ return ret
+
+ self._remove_default(Config.CONFIG, self.args.name)
+ self._remove_default(Config.CONFIG_LOCAL, self.args.name)
+ return 0
class CmdRemoteModify(CmdConfig):
|
{"golden_diff": "diff --git a/dvc/command/remote.py b/dvc/command/remote.py\n--- a/dvc/command/remote.py\n+++ b/dvc/command/remote.py\n@@ -1,4 +1,6 @@\n+import os\n import re\n+import configobj\n \n from dvc.config import Config\n from dvc.command.config import CmdConfig\n@@ -23,9 +25,35 @@\n \n \n class CmdRemoteRemove(CmdConfig):\n+ def _remove_default(self, config_file, remote):\n+ path = os.path.join(os.path.dirname(self.config_file),\n+ config_file)\n+ config = configobj.ConfigObj(path)\n+\n+ core = config.get(Config.SECTION_CORE, None)\n+ if core is None:\n+ return\n+\n+ default = core.get(Config.SECTION_CORE_REMOTE, None)\n+ if default is None:\n+ return\n+\n+ if default == remote:\n+ del config[Config.SECTION_CORE][Config.SECTION_CORE_REMOTE]\n+ if len(config[Config.SECTION_CORE]) == 0:\n+ del config[Config.SECTION_CORE]\n+\n+ config.write()\n+\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n- return self.unset(section, Config.SECTION_REMOTE_URL)\n+ ret = self.unset(section)\n+ if ret != 0:\n+ return ret\n+\n+ self._remove_default(Config.CONFIG, self.args.name)\n+ self._remove_default(Config.CONFIG_LOCAL, self.args.name)\n+ return 0\n \n \n class CmdRemoteModify(CmdConfig):\n", "issue": "Bug: dvc remote remove leads to Initialization error / config file error.\n**Setup**\r\n\r\n- dvc version 0.18.9, installed with pip, python Anaconda 3.6.4, Ubuntu 16.04\r\n\r\n**Repro**\r\n\r\n```bash\r\nset -e\r\nset -x\r\nrm -rf foo\r\nmkdir -p foo && cd foo\r\ngit init && dvc init\r\necho bar > bar.txt\r\ndvc remote add -d dummy s3://dummy\r\ndvc remote remove dummy\r\ndvc add bar.txt\r\n```\r\n\r\n```bash\r\n+ rm -rf foo\r\n+ mkdir -p foo\r\n+ cd foo\r\n+ git init\r\nInitialized empty Git repository in /home/tmain/foo/.git/\r\n+ dvc init\r\nAdding '.dvc/state' to '.dvc/.gitignore'.\r\nAdding '.dvc/state.lock' to '.dvc/.gitignore'.\r\nAdding '.dvc/link.state' to '.dvc/.gitignore'.\r\nAdding '.dvc/link.state.lock' to '.dvc/.gitignore'.\r\nAdding '.dvc/lock' to '.dvc/.gitignore'.\r\nAdding '.dvc/config.local' to '.dvc/.gitignore'.\r\nAdding '.dvc/updater' to '.dvc/.gitignore'.\r\nAdding '.dvc/cache' to '.dvc/.gitignore'.\r\nChecking for updates...\r\n+ echo bar\r\n+ dvc remote add -d dummy s3://dummy\r\nSetting 'dummy' as a default remote.\r\n+ dvc remote remove dummy\r\n+ dvc add bar.txt\r\nError: Initialization error: Config file error: Can't find remote section 'remote \"dummy\"' in config\r\n\r\n```\n", "before_files": [{"content": "import re\n\nfrom dvc.config import Config\nfrom dvc.command.config import CmdConfig\nfrom dvc.logger import Logger\n\n\nclass CmdRemoteAdd(CmdConfig):\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n ret = self.set(section, Config.SECTION_REMOTE_URL, self.args.url)\n if ret != 0:\n return ret\n\n if self.args.default:\n msg = 'Setting \\'{}\\' as a default remote.'.format(self.args.name)\n Logger.info(msg)\n ret = self.set(Config.SECTION_CORE,\n Config.SECTION_CORE_REMOTE,\n self.args.name)\n\n return ret\n\n\nclass CmdRemoteRemove(CmdConfig):\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n return self.unset(section, Config.SECTION_REMOTE_URL)\n\n\nclass CmdRemoteModify(CmdConfig):\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n self.args.name = '{}.{}'.format(section, self.args.option)\n return super(CmdRemoteModify, self).run()\n\n\nclass CmdRemoteList(CmdConfig):\n def run(self):\n for section in self.configobj.keys():\n r = 
re.match(Config.SECTION_REMOTE_REGEX, section)\n if r:\n name = r.group('name')\n url = self.configobj[section].get(Config.SECTION_REMOTE_URL,\n '')\n Logger.info('{}\\t{}'.format(name, url))\n return 0\n", "path": "dvc/command/remote.py"}], "after_files": [{"content": "import os\nimport re\nimport configobj\n\nfrom dvc.config import Config\nfrom dvc.command.config import CmdConfig\nfrom dvc.logger import Logger\n\n\nclass CmdRemoteAdd(CmdConfig):\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n ret = self.set(section, Config.SECTION_REMOTE_URL, self.args.url)\n if ret != 0:\n return ret\n\n if self.args.default:\n msg = 'Setting \\'{}\\' as a default remote.'.format(self.args.name)\n Logger.info(msg)\n ret = self.set(Config.SECTION_CORE,\n Config.SECTION_CORE_REMOTE,\n self.args.name)\n\n return ret\n\n\nclass CmdRemoteRemove(CmdConfig):\n def _remove_default(self, config_file, remote):\n path = os.path.join(os.path.dirname(self.config_file),\n config_file)\n config = configobj.ConfigObj(path)\n\n core = config.get(Config.SECTION_CORE, None)\n if core is None:\n return\n\n default = core.get(Config.SECTION_CORE_REMOTE, None)\n if default is None:\n return\n\n if default == remote:\n del config[Config.SECTION_CORE][Config.SECTION_CORE_REMOTE]\n if len(config[Config.SECTION_CORE]) == 0:\n del config[Config.SECTION_CORE]\n\n config.write()\n\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n ret = self.unset(section)\n if ret != 0:\n return ret\n\n self._remove_default(Config.CONFIG, self.args.name)\n self._remove_default(Config.CONFIG_LOCAL, self.args.name)\n return 0\n\n\nclass CmdRemoteModify(CmdConfig):\n def run(self):\n section = Config.SECTION_REMOTE_FMT.format(self.args.name)\n self.args.name = '{}.{}'.format(section, self.args.option)\n return super(CmdRemoteModify, self).run()\n\n\nclass CmdRemoteList(CmdConfig):\n def run(self):\n for section in self.configobj.keys():\n r = re.match(Config.SECTION_REMOTE_REGEX, section)\n if r:\n name = r.group('name')\n url = self.configobj[section].get(Config.SECTION_REMOTE_URL,\n '')\n Logger.info('{}\\t{}'.format(name, url))\n return 0\n", "path": "dvc/command/remote.py"}]}
| 1,023 | 342 |
gh_patches_debug_12525
|
rasdani/github-patches
|
git_diff
|
ethereum__web3.py-306
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Transform Markdown on PyPi release
### What was wrong?
README is not readable on PyPI: https://pypi.python.org/pypi/web3/1.4.0
`setuptools-markdown` allows to publish README.md on PyPi
- https://pypi.python.org/pypi/setuptools-markdown
#### Cute Animal Picture

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 import os
4 import sys
5
6 from setuptools import (
7 setup,
8 find_packages,
9 )
10
11
12 DIR = os.path.dirname(os.path.abspath(__file__))
13
14
15 readme = open(os.path.join(DIR, 'README.md')).read()
16
17 install_requires = [
18 "cytoolz>=0.8.2",
19 "ethereum-abi-utils>=0.4.0",
20 "ethereum-utils>=0.4.0",
21 "pylru>=1.0.9",
22 "pysha3>=0.3",
23 "requests>=2.12.4",
24 "rlp>=0.4.7",
25 "toolz>=0.8.2",
26 ]
27
28 if sys.platform == 'win32':
29 install_requires.append('pypiwin32')
30
31 setup(
32 name='web3',
33 version='3.13.5',
34 description="""Web3.py""",
35 long_description=readme,
36 author='Piper Merriam',
37 author_email='[email protected]',
38 url='https://github.com/pipermerriam/web3.py',
39 include_package_data=True,
40 install_requires=install_requires,
41 extras_require={
42 'tester': ["eth-testrpc>=1.2.0"],
43 'gevent': [
44 "gevent>=1.1.1,<1.2.0",
45 "geventhttpclient>=1.3.1",
46 ],
47 },
48 py_modules=['web3'],
49 license="MIT",
50 zip_safe=False,
51 keywords='ethereum',
52 packages=find_packages(exclude=["tests", "tests.*"]),
53 classifiers=[
54 'Development Status :: 2 - Pre-Alpha',
55 'Intended Audience :: Developers',
56 'License :: OSI Approved :: MIT License',
57 'Natural Language :: English',
58 'Programming Language :: Python :: 2',
59 'Programming Language :: Python :: 2.7',
60 'Programming Language :: Python :: 3',
61 'Programming Language :: Python :: 3.4',
62 'Programming Language :: Python :: 3.5',
63 ],
64 )
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -32,12 +32,13 @@
name='web3',
version='3.13.5',
description="""Web3.py""",
- long_description=readme,
+ long_description_markdown_filename='README.md',
author='Piper Merriam',
author_email='[email protected]',
url='https://github.com/pipermerriam/web3.py',
include_package_data=True,
install_requires=install_requires,
+ setup_requires=['setuptools-markdown'],
extras_require={
'tester': ["eth-testrpc>=1.2.0"],
'gevent': [
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -32,12 +32,13 @@\n name='web3',\n version='3.13.5',\n description=\"\"\"Web3.py\"\"\",\n- long_description=readme,\n+ long_description_markdown_filename='README.md',\n author='Piper Merriam',\n author_email='[email protected]',\n url='https://github.com/pipermerriam/web3.py',\n include_package_data=True,\n install_requires=install_requires,\n+ setup_requires=['setuptools-markdown'],\n extras_require={\n 'tester': [\"eth-testrpc>=1.2.0\"],\n 'gevent': [\n", "issue": "Transform Markdown on PyPi release\n### What was wrong?\n\nREADME is not readable on PyPI: https://pypi.python.org/pypi/web3/1.4.0\n\n`setuptools-markdown` allows to publish README.md on PyPi\n- https://pypi.python.org/pypi/setuptools-markdown\n#### Cute Animal Picture\n\n\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport os\nimport sys\n\nfrom setuptools import (\n setup,\n find_packages,\n)\n\n\nDIR = os.path.dirname(os.path.abspath(__file__))\n\n\nreadme = open(os.path.join(DIR, 'README.md')).read()\n\ninstall_requires = [\n \"cytoolz>=0.8.2\",\n \"ethereum-abi-utils>=0.4.0\",\n \"ethereum-utils>=0.4.0\",\n \"pylru>=1.0.9\",\n \"pysha3>=0.3\",\n \"requests>=2.12.4\",\n \"rlp>=0.4.7\",\n \"toolz>=0.8.2\",\n]\n\nif sys.platform == 'win32':\n install_requires.append('pypiwin32')\n\nsetup(\n name='web3',\n version='3.13.5',\n description=\"\"\"Web3.py\"\"\",\n long_description=readme,\n author='Piper Merriam',\n author_email='[email protected]',\n url='https://github.com/pipermerriam/web3.py',\n include_package_data=True,\n install_requires=install_requires,\n extras_require={\n 'tester': [\"eth-testrpc>=1.2.0\"],\n 'gevent': [\n \"gevent>=1.1.1,<1.2.0\",\n \"geventhttpclient>=1.3.1\",\n ],\n },\n py_modules=['web3'],\n license=\"MIT\",\n zip_safe=False,\n keywords='ethereum',\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n classifiers=[\n 'Development Status :: 2 - Pre-Alpha',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: MIT License',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport os\nimport sys\n\nfrom setuptools import (\n setup,\n find_packages,\n)\n\n\nDIR = os.path.dirname(os.path.abspath(__file__))\n\n\nreadme = open(os.path.join(DIR, 'README.md')).read()\n\ninstall_requires = [\n \"cytoolz>=0.8.2\",\n \"ethereum-abi-utils>=0.4.0\",\n \"ethereum-utils>=0.4.0\",\n \"pylru>=1.0.9\",\n \"pysha3>=0.3\",\n \"requests>=2.12.4\",\n \"rlp>=0.4.7\",\n \"toolz>=0.8.2\",\n]\n\nif sys.platform == 'win32':\n install_requires.append('pypiwin32')\n\nsetup(\n name='web3',\n version='3.13.5',\n description=\"\"\"Web3.py\"\"\",\n long_description_markdown_filename='README.md',\n author='Piper Merriam',\n author_email='[email protected]',\n url='https://github.com/pipermerriam/web3.py',\n include_package_data=True,\n install_requires=install_requires,\n setup_requires=['setuptools-markdown'],\n extras_require={\n 'tester': [\"eth-testrpc>=1.2.0\"],\n 'gevent': [\n \"gevent>=1.1.1,<1.2.0\",\n \"geventhttpclient>=1.3.1\",\n ],\n },\n py_modules=['web3'],\n license=\"MIT\",\n zip_safe=False,\n keywords='ethereum',\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n 
classifiers=[\n 'Development Status :: 2 - Pre-Alpha',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: MIT License',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n ],\n)\n", "path": "setup.py"}]}
| 920 | 160 |
gh_patches_debug_6493
|
rasdani/github-patches
|
git_diff
|
pwr-Solaar__Solaar-2305
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Extend Makefile with installation and test targets
**Information**
- Solaar version: 1.1.11rc4
**Is your feature request related to a problem? Please describe.**
The Solaar installation with all its dependencies (pip, apt, udev etc.) is cumbersome. Although some setup steps for GItHub CI exists, they are not usable for local setup of Solaar.
**Describe the solution you'd like**
Move the setup commands into a Makefile and use the targets in GitHub workflow files. Thus, the commands are in a single place and also usable for local setups.
**Additional context**
This extends #2263
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python3
2 import subprocess
3
4 from glob import glob as _glob
5
6 try:
7 from setuptools import setup
8 except ImportError:
9 from distutils.core import setup
10
11 NAME = 'Solaar'
12
13 with open('lib/solaar/version', 'r') as vfile:
14 version = vfile.read().strip()
15
16 try: # get commit from git describe
17 commit = subprocess.check_output(['git', 'describe', '--always'], stderr=subprocess.DEVNULL).strip().decode()
18 with open('lib/solaar/commit', 'w') as vfile:
19 vfile.write(f'{commit}\n')
20 except Exception: # get commit from Ubuntu dpkg-parsechangelog
21 try:
22 commit = subprocess.check_output(['dpkg-parsechangelog', '--show-field', 'Version'],
23 stderr=subprocess.DEVNULL).strip().decode()
24 commit = commit.split('~')
25 with open('lib/solaar/commit', 'w') as vfile:
26 vfile.write(f'{commit[0]}\n')
27 except Exception as e:
28 print('Exception using dpkg-parsechangelog', e)
29
30
31 def _data_files():
32 from os.path import dirname as _dirname
33
34 yield 'share/icons/hicolor/scalable/apps', _glob('share/solaar/icons/solaar*.svg')
35 yield 'share/icons/hicolor/32x32/apps', _glob('share/solaar/icons/solaar-light_*.png')
36
37 for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):
38 yield _dirname(mo), [mo]
39
40 yield 'share/applications', ['share/applications/solaar.desktop']
41 yield 'lib/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']
42 yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']
43
44 del _dirname
45
46
47 setup(
48 name=NAME.lower(),
49 version=version,
50 description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',
51 long_description='''
52 Solaar is a Linux device manager for many Logitech peripherals that connect through
53 Unifying and other receivers or via USB or Bluetooth.
54 Solaar is able to pair/unpair devices with receivers and show and modify some of the
55 modifiable features of devices.
56 For instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),
57 author='Daniel Pavel',
58 license='GPLv2',
59 url='http://pwr-solaar.github.io/Solaar/',
60 classifiers=[
61 'Development Status :: 4 - Beta',
62 'Environment :: X11 Applications :: GTK',
63 'Environment :: Console',
64 'Intended Audience :: End Users/Desktop',
65 'License :: DFSG approved',
66 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',
67 'Natural Language :: English',
68 'Programming Language :: Python :: 3 :: Only',
69 'Operating System :: POSIX :: Linux',
70 'Topic :: Utilities',
71 ],
72 platforms=['linux'],
73
74 # sudo apt install python-gi python3-gi \
75 # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1
76 # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],
77 python_requires='>=3.7',
78 install_requires=[
79 'evdev (>= 1.1.2) ; platform_system=="Linux"',
80 'pyudev (>= 0.13)',
81 'PyYAML (>= 3.12)',
82 'python-xlib (>= 0.27)',
83 'psutil (>= 5.4.3)',
84 'dbus-python ; platform_system=="Linux"',
85 ],
86 extras_require={
87 'report-descriptor': ['hid-parser'],
88 'desktop-notifications': ['Notify (>= 0.7)'],
89 'git-commit': ['python-git-info'],
90 'test': ['pytest'],
91 },
92 package_dir={'': 'lib'},
93 packages=['keysyms', 'hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],
94 data_files=list(_data_files()),
95 include_package_data=True,
96 scripts=_glob('bin/*'),
97 )
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -87,7 +87,7 @@
'report-descriptor': ['hid-parser'],
'desktop-notifications': ['Notify (>= 0.7)'],
'git-commit': ['python-git-info'],
- 'test': ['pytest'],
+ 'test': ['pytest', 'pytest-cov'],
},
package_dir={'': 'lib'},
packages=['keysyms', 'hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -87,7 +87,7 @@\n 'report-descriptor': ['hid-parser'],\n 'desktop-notifications': ['Notify (>= 0.7)'],\n 'git-commit': ['python-git-info'],\n- 'test': ['pytest'],\n+ 'test': ['pytest', 'pytest-cov'],\n },\n package_dir={'': 'lib'},\n packages=['keysyms', 'hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n", "issue": "Extend Makefile with installation and test targets\n**Information**\r\n- Solaar version: 1.1.11rc4\r\n\r\n**Is your feature request related to a problem? Please describe.**\r\nThe Solaar installation with all its dependencies (pip, apt, udev etc.) is cumbersome. Although some setup steps for GItHub CI exists, they are not usable for local setup of Solaar. \r\n\r\n**Describe the solution you'd like**\r\nMove the setup commands into a Makefile and use the targets in GitHub workflow files. Thus, the commands are in a single place and also usable for local setups.\r\n\r\n**Additional context**\r\nThis extends #2263 \r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\nimport subprocess\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nNAME = 'Solaar'\n\nwith open('lib/solaar/version', 'r') as vfile:\n version = vfile.read().strip()\n\ntry: # get commit from git describe\n commit = subprocess.check_output(['git', 'describe', '--always'], stderr=subprocess.DEVNULL).strip().decode()\n with open('lib/solaar/commit', 'w') as vfile:\n vfile.write(f'{commit}\\n')\nexcept Exception: # get commit from Ubuntu dpkg-parsechangelog\n try:\n commit = subprocess.check_output(['dpkg-parsechangelog', '--show-field', 'Version'],\n stderr=subprocess.DEVNULL).strip().decode()\n commit = commit.split('~')\n with open('lib/solaar/commit', 'w') as vfile:\n vfile.write(f'{commit[0]}\\n')\n except Exception as e:\n print('Exception using dpkg-parsechangelog', e)\n\n\ndef _data_files():\n from os.path import dirname as _dirname\n\n yield 'share/icons/hicolor/scalable/apps', _glob('share/solaar/icons/solaar*.svg')\n yield 'share/icons/hicolor/32x32/apps', _glob('share/solaar/icons/solaar-light_*.png')\n\n for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n yield _dirname(mo), [mo]\n\n yield 'share/applications', ['share/applications/solaar.desktop']\n yield 'lib/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']\n\n del _dirname\n\n\nsetup(\n name=NAME.lower(),\n version=version,\n description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',\n long_description='''\nSolaar is a Linux device manager for many Logitech peripherals that connect through\nUnifying and other receivers or via USB or Bluetooth.\nSolaar is able to pair/unpair devices with receivers and show and modify some of the\nmodifiable features of devices.\nFor instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),\n author='Daniel Pavel',\n license='GPLv2',\n url='http://pwr-solaar.github.io/Solaar/',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Environment :: X11 Applications :: GTK',\n 'Environment :: Console',\n 'Intended Audience :: End Users/Desktop',\n 'License :: DFSG approved',\n 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 3 :: Only',\n 'Operating System :: POSIX :: 
Linux',\n 'Topic :: Utilities',\n ],\n platforms=['linux'],\n\n # sudo apt install python-gi python3-gi \\\n # gir1.2-gtk-3.0 gir1.2-notify-0.7 gir1.2-ayatanaappindicator3-0.1\n # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n python_requires='>=3.7',\n install_requires=[\n 'evdev (>= 1.1.2) ; platform_system==\"Linux\"',\n 'pyudev (>= 0.13)',\n 'PyYAML (>= 3.12)',\n 'python-xlib (>= 0.27)',\n 'psutil (>= 5.4.3)',\n 'dbus-python ; platform_system==\"Linux\"',\n ],\n extras_require={\n 'report-descriptor': ['hid-parser'],\n 'desktop-notifications': ['Notify (>= 0.7)'],\n 'git-commit': ['python-git-info'],\n 'test': ['pytest'],\n },\n package_dir={'': 'lib'},\n packages=['keysyms', 'hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n data_files=list(_data_files()),\n include_package_data=True,\n scripts=_glob('bin/*'),\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python3\nimport subprocess\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nNAME = 'Solaar'\n\nwith open('lib/solaar/version', 'r') as vfile:\n version = vfile.read().strip()\n\ntry: # get commit from git describe\n commit = subprocess.check_output(['git', 'describe', '--always'], stderr=subprocess.DEVNULL).strip().decode()\n with open('lib/solaar/commit', 'w') as vfile:\n vfile.write(f'{commit}\\n')\nexcept Exception: # get commit from Ubuntu dpkg-parsechangelog\n try:\n commit = subprocess.check_output(['dpkg-parsechangelog', '--show-field', 'Version'],\n stderr=subprocess.DEVNULL).strip().decode()\n commit = commit.split('~')\n with open('lib/solaar/commit', 'w') as vfile:\n vfile.write(f'{commit[0]}\\n')\n except Exception as e:\n print('Exception using dpkg-parsechangelog', e)\n\n\ndef _data_files():\n from os.path import dirname as _dirname\n\n yield 'share/icons/hicolor/scalable/apps', _glob('share/solaar/icons/solaar*.svg')\n yield 'share/icons/hicolor/32x32/apps', _glob('share/solaar/icons/solaar-light_*.png')\n\n for mo in _glob('share/locale/*/LC_MESSAGES/solaar.mo'):\n yield _dirname(mo), [mo]\n\n yield 'share/applications', ['share/applications/solaar.desktop']\n yield 'lib/udev/rules.d', ['rules.d/42-logitech-unify-permissions.rules']\n yield 'share/metainfo', ['share/solaar/io.github.pwr_solaar.solaar.metainfo.xml']\n\n del _dirname\n\n\nsetup(\n name=NAME.lower(),\n version=version,\n description='Linux device manager for Logitech receivers, keyboards, mice, and tablets.',\n long_description='''\nSolaar is a Linux device manager for many Logitech peripherals that connect through\nUnifying and other receivers or via USB or Bluetooth.\nSolaar is able to pair/unpair devices with receivers and show and modify some of the\nmodifiable features of devices.\nFor instructions on installing Solaar see https://pwr-solaar.github.io/Solaar/installation'''.strip(),\n author='Daniel Pavel',\n license='GPLv2',\n url='http://pwr-solaar.github.io/Solaar/',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Environment :: X11 Applications :: GTK',\n 'Environment :: Console',\n 'Intended Audience :: End Users/Desktop',\n 'License :: DFSG approved',\n 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',\n 'Natural Language :: English',\n 'Programming Language :: Python :: 3 :: Only',\n 'Operating System :: POSIX :: Linux',\n 'Topic :: Utilities',\n ],\n platforms=['linux'],\n\n # sudo apt install python-gi python3-gi \\\n # gir1.2-gtk-3.0 gir1.2-notify-0.7 
gir1.2-ayatanaappindicator3-0.1\n # os_requires=['gi.repository.GObject (>= 2.0)', 'gi.repository.Gtk (>= 3.0)'],\n python_requires='>=3.7',\n install_requires=[\n 'evdev (>= 1.1.2) ; platform_system==\"Linux\"',\n 'pyudev (>= 0.13)',\n 'PyYAML (>= 3.12)',\n 'python-xlib (>= 0.27)',\n 'psutil (>= 5.4.3)',\n 'dbus-python ; platform_system==\"Linux\"',\n ],\n extras_require={\n 'report-descriptor': ['hid-parser'],\n 'desktop-notifications': ['Notify (>= 0.7)'],\n 'git-commit': ['python-git-info'],\n 'test': ['pytest', 'pytest-cov'],\n },\n package_dir={'': 'lib'},\n packages=['keysyms', 'hidapi', 'logitech_receiver', 'solaar', 'solaar.ui', 'solaar.cli'],\n data_files=list(_data_files()),\n include_package_data=True,\n scripts=_glob('bin/*'),\n)\n", "path": "setup.py"}]}
| 1,547 | 130 |
gh_patches_debug_21755
|
rasdani/github-patches
|
git_diff
|
deepchecks__deepchecks-613
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[DOCS] Simple Model Comparison example improvement
Currently, the Simple Model Comparison notebook lacks some explanations.
Please follow the guidelines from this issue: #543 to improve it
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `deepchecks/utils/validation.py`
Content:
```
1 # ----------------------------------------------------------------------------
2 # Copyright (C) 2021 Deepchecks (https://www.deepchecks.com)
3 #
4 # This file is part of Deepchecks.
5 # Deepchecks is distributed under the terms of the GNU Affero General
6 # Public License (version 3 or later).
7 # You should have received a copy of the GNU Affero General Public License
8 # along with Deepchecks. If not, see <http://www.gnu.org/licenses/>.
9 # ----------------------------------------------------------------------------
10 #
11 """objects validation utilities."""
12 import typing as t
13
14 import pandas as pd
15
16 from deepchecks import base # pylint: disable=unused-import, is used in type annotations
17 from deepchecks import errors
18 from deepchecks.utils.typing import Hashable, BasicModel
19
20 __all__ = ['model_type_validation', 'ensure_hashable_or_mutable_sequence', 'validate_model', 'ensure_dataframe_type']
21
22
23 def model_type_validation(model: t.Any):
24 """Receive any object and check if it's an instance of a model we support.
25
26 Raises:
27 DeepchecksValueError: If the object is not of a supported type
28 """
29 if not isinstance(model, BasicModel):
30 raise errors.DeepchecksValueError(
31 'Model must inherit from one of supported '
32 'models: sklearn.base.BaseEstimator or CatBoost, '
33 f'Received: {model.__class__.__name__}'
34 )
35
36
37 def validate_model(
38 data: t.Union['base.Dataset', pd.DataFrame],
39 model: t.Any
40 ):
41 """Check model is able to predict on the dataset.
42
43 Args:
44 data (Dataset, pandas.DataFrame):
45 model (BaseEstimator):
46
47 Raise:
48 DeepchecksValueError: if dataset does not match model
49 """
50 model_type_validation(model)
51
52 error_message = (
53 'In order to evaluate model correctness we need not empty dataset '
54 'with the same set of features that was used to fit the model. {0}'
55 )
56
57 if isinstance(data, base.Dataset):
58 features = data.features_columns
59 features_names = set(data.features)
60 else:
61 features = data
62 features_names = set(data.columns)
63
64 model_features = getattr(model, 'feature_names_in_', None)
65
66 if features is None:
67 raise errors.DeepchecksValueError(error_message.format(
68 'But function received dataset without feature columns.'
69 ))
70
71 if len(features) == 0:
72 raise errors.DeepchecksValueError(error_message.format(
73 'But function received empty dataset.'
74 ))
75
76 try:
77 model_features = set(model_features) # type: ignore
78 if model_features != features_names:
79 raise errors.DeepchecksValueError(error_message.format(
80 'But function received dataset with a different set of features.'
81 ))
82 except (TypeError, ValueError):
83 # in case if 'model.feature_names_in_' was None or not iterable
84 pass
85
86 try:
87 model.predict(features.head(1))
88 except Exception as exc:
89 raise errors.DeepchecksValueError(
90 f'Got error when trying to predict with model on dataset: {str(exc)}'
91 )
92
93
94 T = t.TypeVar('T', bound=Hashable)
95
96
97 def ensure_hashable_or_mutable_sequence(
98 value: t.Union[T, t.MutableSequence[T]],
99 message: str = (
100 'Provided value is neither hashable nor mutable '
101 'sequence of hashable items. Got {type}')
102 ) -> t.List[T]:
103 """Validate that provided value is either hashable or mutable sequence of hashable values."""
104 if isinstance(value, Hashable):
105 return [value]
106
107 if isinstance(value, t.MutableSequence):
108 if len(value) > 0 and not isinstance(value[0], Hashable):
109 raise errors.DeepchecksValueError(message.format(
110 type=f'MutableSequence[{type(value).__name__}]'
111 ))
112 return list(value)
113
114 raise errors.DeepchecksValueError(message.format(
115 type=type(value).__name__
116 ))
117
118
119 def ensure_dataframe_type(obj: t.Any) -> pd.DataFrame:
120 """Ensure that given object is of type DataFrame or Dataset and return it as DataFrame. else raise error.
121
122 Args:
123 obj: Object to ensure it is DataFrame or Dataset
124
125 Returns:
126 (pd.DataFrame)
127 """
128 if isinstance(obj, pd.DataFrame):
129 return obj
130 elif isinstance(obj, base.Dataset):
131 return obj.data
132 else:
133 raise errors.DeepchecksValueError(
134 f'dataset must be of type DataFrame or Dataset, but got: {type(obj).__name__}'
135 )
136
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/deepchecks/utils/validation.py b/deepchecks/utils/validation.py
--- a/deepchecks/utils/validation.py
+++ b/deepchecks/utils/validation.py
@@ -56,12 +56,8 @@
if isinstance(data, base.Dataset):
features = data.features_columns
- features_names = set(data.features)
else:
features = data
- features_names = set(data.columns)
-
- model_features = getattr(model, 'feature_names_in_', None)
if features is None:
raise errors.DeepchecksValueError(error_message.format(
@@ -73,16 +69,6 @@
'But function received empty dataset.'
))
- try:
- model_features = set(model_features) # type: ignore
- if model_features != features_names:
- raise errors.DeepchecksValueError(error_message.format(
- 'But function received dataset with a different set of features.'
- ))
- except (TypeError, ValueError):
- # in case if 'model.feature_names_in_' was None or not iterable
- pass
-
try:
model.predict(features.head(1))
except Exception as exc:
|
{"golden_diff": "diff --git a/deepchecks/utils/validation.py b/deepchecks/utils/validation.py\n--- a/deepchecks/utils/validation.py\n+++ b/deepchecks/utils/validation.py\n@@ -56,12 +56,8 @@\n \n if isinstance(data, base.Dataset):\n features = data.features_columns\n- features_names = set(data.features)\n else:\n features = data\n- features_names = set(data.columns)\n-\n- model_features = getattr(model, 'feature_names_in_', None)\n \n if features is None:\n raise errors.DeepchecksValueError(error_message.format(\n@@ -73,16 +69,6 @@\n 'But function received empty dataset.'\n ))\n \n- try:\n- model_features = set(model_features) # type: ignore\n- if model_features != features_names:\n- raise errors.DeepchecksValueError(error_message.format(\n- 'But function received dataset with a different set of features.'\n- ))\n- except (TypeError, ValueError):\n- # in case if 'model.feature_names_in_' was None or not iterable\n- pass\n-\n try:\n model.predict(features.head(1))\n except Exception as exc:\n", "issue": "[DOCS] Simple Model Comparison example improvement \nCurrently, the Simple Model Comparison notebook lacks some explanations.\r\nPlease follow the guidelines from this issue: #543 to improve it\n", "before_files": [{"content": "# ----------------------------------------------------------------------------\n# Copyright (C) 2021 Deepchecks (https://www.deepchecks.com)\n#\n# This file is part of Deepchecks.\n# Deepchecks is distributed under the terms of the GNU Affero General\n# Public License (version 3 or later).\n# You should have received a copy of the GNU Affero General Public License\n# along with Deepchecks. If not, see <http://www.gnu.org/licenses/>.\n# ----------------------------------------------------------------------------\n#\n\"\"\"objects validation utilities.\"\"\"\nimport typing as t\n\nimport pandas as pd\n\nfrom deepchecks import base # pylint: disable=unused-import, is used in type annotations\nfrom deepchecks import errors\nfrom deepchecks.utils.typing import Hashable, BasicModel\n\n__all__ = ['model_type_validation', 'ensure_hashable_or_mutable_sequence', 'validate_model', 'ensure_dataframe_type']\n\n\ndef model_type_validation(model: t.Any):\n \"\"\"Receive any object and check if it's an instance of a model we support.\n\n Raises:\n DeepchecksValueError: If the object is not of a supported type\n \"\"\"\n if not isinstance(model, BasicModel):\n raise errors.DeepchecksValueError(\n 'Model must inherit from one of supported '\n 'models: sklearn.base.BaseEstimator or CatBoost, '\n f'Received: {model.__class__.__name__}'\n )\n\n\ndef validate_model(\n data: t.Union['base.Dataset', pd.DataFrame],\n model: t.Any\n):\n \"\"\"Check model is able to predict on the dataset.\n\n Args:\n data (Dataset, pandas.DataFrame):\n model (BaseEstimator):\n\n Raise:\n DeepchecksValueError: if dataset does not match model\n \"\"\"\n model_type_validation(model)\n\n error_message = (\n 'In order to evaluate model correctness we need not empty dataset '\n 'with the same set of features that was used to fit the model. 
{0}'\n )\n\n if isinstance(data, base.Dataset):\n features = data.features_columns\n features_names = set(data.features)\n else:\n features = data\n features_names = set(data.columns)\n\n model_features = getattr(model, 'feature_names_in_', None)\n\n if features is None:\n raise errors.DeepchecksValueError(error_message.format(\n 'But function received dataset without feature columns.'\n ))\n\n if len(features) == 0:\n raise errors.DeepchecksValueError(error_message.format(\n 'But function received empty dataset.'\n ))\n\n try:\n model_features = set(model_features) # type: ignore\n if model_features != features_names:\n raise errors.DeepchecksValueError(error_message.format(\n 'But function received dataset with a different set of features.'\n ))\n except (TypeError, ValueError):\n # in case if 'model.feature_names_in_' was None or not iterable\n pass\n\n try:\n model.predict(features.head(1))\n except Exception as exc:\n raise errors.DeepchecksValueError(\n f'Got error when trying to predict with model on dataset: {str(exc)}'\n )\n\n\nT = t.TypeVar('T', bound=Hashable)\n\n\ndef ensure_hashable_or_mutable_sequence(\n value: t.Union[T, t.MutableSequence[T]],\n message: str = (\n 'Provided value is neither hashable nor mutable '\n 'sequence of hashable items. Got {type}')\n) -> t.List[T]:\n \"\"\"Validate that provided value is either hashable or mutable sequence of hashable values.\"\"\"\n if isinstance(value, Hashable):\n return [value]\n\n if isinstance(value, t.MutableSequence):\n if len(value) > 0 and not isinstance(value[0], Hashable):\n raise errors.DeepchecksValueError(message.format(\n type=f'MutableSequence[{type(value).__name__}]'\n ))\n return list(value)\n\n raise errors.DeepchecksValueError(message.format(\n type=type(value).__name__\n ))\n\n\ndef ensure_dataframe_type(obj: t.Any) -> pd.DataFrame:\n \"\"\"Ensure that given object is of type DataFrame or Dataset and return it as DataFrame. else raise error.\n\n Args:\n obj: Object to ensure it is DataFrame or Dataset\n\n Returns:\n (pd.DataFrame)\n \"\"\"\n if isinstance(obj, pd.DataFrame):\n return obj\n elif isinstance(obj, base.Dataset):\n return obj.data\n else:\n raise errors.DeepchecksValueError(\n f'dataset must be of type DataFrame or Dataset, but got: {type(obj).__name__}'\n )\n", "path": "deepchecks/utils/validation.py"}], "after_files": [{"content": "# ----------------------------------------------------------------------------\n# Copyright (C) 2021 Deepchecks (https://www.deepchecks.com)\n#\n# This file is part of Deepchecks.\n# Deepchecks is distributed under the terms of the GNU Affero General\n# Public License (version 3 or later).\n# You should have received a copy of the GNU Affero General Public License\n# along with Deepchecks. 
If not, see <http://www.gnu.org/licenses/>.\n# ----------------------------------------------------------------------------\n#\n\"\"\"objects validation utilities.\"\"\"\nimport typing as t\n\nimport pandas as pd\n\nfrom deepchecks import base # pylint: disable=unused-import, is used in type annotations\nfrom deepchecks import errors\nfrom deepchecks.utils.typing import Hashable, BasicModel\n\n__all__ = ['model_type_validation', 'ensure_hashable_or_mutable_sequence', 'validate_model', 'ensure_dataframe_type']\n\n\ndef model_type_validation(model: t.Any):\n \"\"\"Receive any object and check if it's an instance of a model we support.\n\n Raises:\n DeepchecksValueError: If the object is not of a supported type\n \"\"\"\n if not isinstance(model, BasicModel):\n raise errors.DeepchecksValueError(\n 'Model must inherit from one of supported '\n 'models: sklearn.base.BaseEstimator or CatBoost, '\n f'Received: {model.__class__.__name__}'\n )\n\n\ndef validate_model(\n data: t.Union['base.Dataset', pd.DataFrame],\n model: t.Any\n):\n \"\"\"Check model is able to predict on the dataset.\n\n Args:\n data (Dataset, pandas.DataFrame):\n model (BaseEstimator):\n\n Raise:\n DeepchecksValueError: if dataset does not match model\n \"\"\"\n model_type_validation(model)\n\n error_message = (\n 'In order to evaluate model correctness we need not empty dataset '\n 'with the same set of features that was used to fit the model. {0}'\n )\n\n if isinstance(data, base.Dataset):\n features = data.features_columns\n else:\n features = data\n\n if features is None:\n raise errors.DeepchecksValueError(error_message.format(\n 'But function received dataset without feature columns.'\n ))\n\n if len(features) == 0:\n raise errors.DeepchecksValueError(error_message.format(\n 'But function received empty dataset.'\n ))\n\n try:\n model.predict(features.head(1))\n except Exception as exc:\n raise errors.DeepchecksValueError(\n f'Got error when trying to predict with model on dataset: {str(exc)}'\n )\n\n\nT = t.TypeVar('T', bound=Hashable)\n\n\ndef ensure_hashable_or_mutable_sequence(\n value: t.Union[T, t.MutableSequence[T]],\n message: str = (\n 'Provided value is neither hashable nor mutable '\n 'sequence of hashable items. Got {type}')\n) -> t.List[T]:\n \"\"\"Validate that provided value is either hashable or mutable sequence of hashable values.\"\"\"\n if isinstance(value, Hashable):\n return [value]\n\n if isinstance(value, t.MutableSequence):\n if len(value) > 0 and not isinstance(value[0], Hashable):\n raise errors.DeepchecksValueError(message.format(\n type=f'MutableSequence[{type(value).__name__}]'\n ))\n return list(value)\n\n raise errors.DeepchecksValueError(message.format(\n type=type(value).__name__\n ))\n\n\ndef ensure_dataframe_type(obj: t.Any) -> pd.DataFrame:\n \"\"\"Ensure that given object is of type DataFrame or Dataset and return it as DataFrame. else raise error.\n\n Args:\n obj: Object to ensure it is DataFrame or Dataset\n\n Returns:\n (pd.DataFrame)\n \"\"\"\n if isinstance(obj, pd.DataFrame):\n return obj\n elif isinstance(obj, base.Dataset):\n return obj.data\n else:\n raise errors.DeepchecksValueError(\n f'dataset must be of type DataFrame or Dataset, but got: {type(obj).__name__}'\n )\n", "path": "deepchecks/utils/validation.py"}]}
| 1,555 | 254 |
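Aside on the truncated row above, which carries deepchecks' model/dataset validation helpers: the core pattern is to compare a fitted estimator's `feature_names_in_` to the DataFrame columns and then smoke-test `predict` on one row. A rough standalone sketch follows; the `FeatureMismatchError` class and the function name are invented for illustration and are not the deepchecks API.

```python
# Sketch of the feature-name validation pattern from the row above.
# FeatureMismatchError and validate_model_on_features are made-up names.
import pandas as pd
from sklearn.linear_model import LinearRegression


class FeatureMismatchError(Exception):
    """Hypothetical stand-in for deepchecks' DeepchecksValueError."""


def validate_model_on_features(model, features: pd.DataFrame) -> None:
    # scikit-learn >= 1.0 records the training column names on fit
    model_features = getattr(model, "feature_names_in_", None)
    try:
        if set(model_features) != set(features.columns):
            raise FeatureMismatchError("model was fitted on a different feature set")
    except (TypeError, ValueError):
        pass  # attribute missing or not iterable: skip the comparison
    model.predict(features.head(1))  # cheap one-row smoke test


X = pd.DataFrame({"a": [1.0, 2.0, 3.0], "b": [0.5, 0.25, 0.125]})
model = LinearRegression().fit(X, [1.0, 2.0, 3.0])
validate_model_on_features(model, X)  # passes silently
```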
gh_patches_debug_15150
|
rasdani/github-patches
|
git_diff
|
huggingface__dataset-viewer-410
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error with RGBA images
https://huggingface.co/datasets/huggan/few-shot-skulls
```
Status code: 500
Exception: Status500Error
Message: cannot write mode RGBA as JPEG
```
reported by @NielsRogge
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `services/worker/src/worker/models/column/image.py`
Content:
```
1 from typing import Any, List
2
3 from datasets import Image
4 from PIL import Image as PILImage # type: ignore
5
6 from worker.models.asset import create_image_file
7 from worker.models.column.default import (
8 Cell,
9 CellTypeError,
10 ColumnInferenceError,
11 ColumnTypeError,
12 CommonColumn,
13 )
14
15
16 def check_value(value: Any) -> None:
17 if value is None:
18 return
19 if not isinstance(value, PILImage.Image):
20 raise CellTypeError("image cell must be a PIL image")
21
22
23 def infer_from_values(values: List[Any]) -> None:
24 for value in values:
25 check_value(value)
26 if values and all(value is None for value in values):
27 raise ColumnInferenceError("all the values are None, cannot infer column type")
28
29
30 class ImageColumn(CommonColumn):
31 def __init__(self, name: str, feature: Any, values: List[Any]):
32 if feature:
33 if not isinstance(feature, Image):
34 raise ColumnTypeError("feature type mismatch")
35 else:
36 infer_from_values(values)
37 self.name = name
38 self.type = "RELATIVE_IMAGE_URL"
39
40 def get_cell_value(self, dataset_name: str, config_name: str, split_name: str, row_idx: int, value: Any) -> Cell:
41 if value is None:
42 return None
43 check_value(value)
44 # this function can raise, we don't catch it
45 return create_image_file(dataset_name, config_name, split_name, row_idx, self.name, "image.jpg", value)
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/services/worker/src/worker/models/column/image.py b/services/worker/src/worker/models/column/image.py
--- a/services/worker/src/worker/models/column/image.py
+++ b/services/worker/src/worker/models/column/image.py
@@ -41,5 +41,15 @@
if value is None:
return None
check_value(value)
- # this function can raise, we don't catch it
- return create_image_file(dataset_name, config_name, split_name, row_idx, self.name, "image.jpg", value)
+ # attempt to generate one of the supported formats; if unsuccessful, throw an error
+ for ext in [".jpg", ".png"]:
+ try:
+ return create_image_file(
+ dataset_name, config_name, split_name, row_idx, self.name, f"image{ext}", value
+ )
+ except OSError:
+ # if wrong format, try the next one, see https://github.com/huggingface/datasets-server/issues/191
+ # OSError: cannot write mode P as JPEG
+ # OSError: cannot write mode RGBA as JPEG
+ continue
+ raise ValueError("Image cannot be written as JPEG or PNG")
|
{"golden_diff": "diff --git a/services/worker/src/worker/models/column/image.py b/services/worker/src/worker/models/column/image.py\n--- a/services/worker/src/worker/models/column/image.py\n+++ b/services/worker/src/worker/models/column/image.py\n@@ -41,5 +41,15 @@\n if value is None:\n return None\n check_value(value)\n- # this function can raise, we don't catch it\n- return create_image_file(dataset_name, config_name, split_name, row_idx, self.name, \"image.jpg\", value)\n+ # attempt to generate one of the supported formats; if unsuccessful, throw an error\n+ for ext in [\".jpg\", \".png\"]:\n+ try:\n+ return create_image_file(\n+ dataset_name, config_name, split_name, row_idx, self.name, f\"image{ext}\", value\n+ )\n+ except OSError:\n+ # if wrong format, try the next one, see https://github.com/huggingface/datasets-server/issues/191\n+ # OSError: cannot write mode P as JPEG\n+ # OSError: cannot write mode RGBA as JPEG\n+ continue\n+ raise ValueError(\"Image cannot be written as JPEG or PNG\")\n", "issue": "Error with RGBA images\nhttps://huggingface.co/datasets/huggan/few-shot-skulls\r\n\r\n```\r\nStatus code: 500\r\nException: Status500Error\r\nMessage: cannot write mode RGBA as JPEG\r\n```\r\n\r\nreported by @NielsRogge \r\n\r\n\n", "before_files": [{"content": "from typing import Any, List\n\nfrom datasets import Image\nfrom PIL import Image as PILImage # type: ignore\n\nfrom worker.models.asset import create_image_file\nfrom worker.models.column.default import (\n Cell,\n CellTypeError,\n ColumnInferenceError,\n ColumnTypeError,\n CommonColumn,\n)\n\n\ndef check_value(value: Any) -> None:\n if value is None:\n return\n if not isinstance(value, PILImage.Image):\n raise CellTypeError(\"image cell must be a PIL image\")\n\n\ndef infer_from_values(values: List[Any]) -> None:\n for value in values:\n check_value(value)\n if values and all(value is None for value in values):\n raise ColumnInferenceError(\"all the values are None, cannot infer column type\")\n\n\nclass ImageColumn(CommonColumn):\n def __init__(self, name: str, feature: Any, values: List[Any]):\n if feature:\n if not isinstance(feature, Image):\n raise ColumnTypeError(\"feature type mismatch\")\n else:\n infer_from_values(values)\n self.name = name\n self.type = \"RELATIVE_IMAGE_URL\"\n\n def get_cell_value(self, dataset_name: str, config_name: str, split_name: str, row_idx: int, value: Any) -> Cell:\n if value is None:\n return None\n check_value(value)\n # this function can raise, we don't catch it\n return create_image_file(dataset_name, config_name, split_name, row_idx, self.name, \"image.jpg\", value)\n", "path": "services/worker/src/worker/models/column/image.py"}], "after_files": [{"content": "from typing import Any, List\n\nfrom datasets import Image\nfrom PIL import Image as PILImage # type: ignore\n\nfrom worker.models.asset import create_image_file\nfrom worker.models.column.default import (\n Cell,\n CellTypeError,\n ColumnInferenceError,\n ColumnTypeError,\n CommonColumn,\n)\n\n\ndef check_value(value: Any) -> None:\n if value is None:\n return\n if not isinstance(value, PILImage.Image):\n raise CellTypeError(\"image cell must be a PIL image\")\n\n\ndef infer_from_values(values: List[Any]) -> None:\n for value in values:\n check_value(value)\n if values and all(value is None for value in values):\n raise ColumnInferenceError(\"all the values are None, cannot infer column type\")\n\n\nclass ImageColumn(CommonColumn):\n def __init__(self, name: str, feature: Any, values: List[Any]):\n if feature:\n if not 
isinstance(feature, Image):\n raise ColumnTypeError(\"feature type mismatch\")\n else:\n infer_from_values(values)\n self.name = name\n self.type = \"RELATIVE_IMAGE_URL\"\n\n def get_cell_value(self, dataset_name: str, config_name: str, split_name: str, row_idx: int, value: Any) -> Cell:\n if value is None:\n return None\n check_value(value)\n # attempt to generate one of the supported formats; if unsuccessful, throw an error\n for ext in [\".jpg\", \".png\"]:\n try:\n return create_image_file(\n dataset_name, config_name, split_name, row_idx, self.name, f\"image{ext}\", value\n )\n except OSError:\n # if wrong format, try the next one, see https://github.com/huggingface/datasets-server/issues/191\n # OSError: cannot write mode P as JPEG\n # OSError: cannot write mode RGBA as JPEG\n continue\n raise ValueError(\"Image cannot be written as JPEG or PNG\")\n", "path": "services/worker/src/worker/models/column/image.py"}]}
| 740 | 269 |
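Aside on the row above: the fallback works because Pillow refuses to encode an alpha channel as JPEG. Below is a self-contained sketch of the same try-JPEG-then-PNG idea, writing to in-memory buffers rather than asset files; the buffers and names are illustrative.

```python
# Sketch of the JPEG->PNG fallback from the patch above, using plain Pillow.
import io

from PIL import Image


def save_with_fallback(img: Image.Image) -> str:
    """Return the first format that can encode this image."""
    for fmt in ("JPEG", "PNG"):
        try:
            img.save(io.BytesIO(), format=fmt)
            return fmt
        except OSError:  # e.g. "cannot write mode RGBA as JPEG"
            continue
    raise ValueError("Image cannot be written as JPEG or PNG")


rgb = Image.new("RGB", (4, 4), (255, 0, 0))
rgba = Image.new("RGBA", (4, 4), (255, 0, 0, 128))  # has an alpha channel
print(save_with_fallback(rgb))   # -> JPEG
print(save_with_fallback(rgba))  # -> PNG
```

An alternative the patch did not choose would be `rgba.convert("RGB")` before saving, which keeps JPEG output at the cost of discarding transparency.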
gh_patches_debug_32029
|
rasdani/github-patches
|
git_diff
|
PrefectHQ__prefect-3465
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Set flow labels through the CLI register command
## Current behavior
Right now there is no way to add a label to a **flow** when registering it using the CLI. You can only set the label from inside the flow's code.
## Proposed behavior
It would be really nice to have a `--label` parameter (just like the one we have for agents) that would allow us to add labels on the fly when registering a flow from the CLI.
## Example
Adding the parameter would look something like this:
`prefect register flow --file my_flow.py --name My-Flow --label my-label`
or this:
`prefect register flow --file my_flow.py --name My-Flow -l my-label`
This could be very useful when registering flows from continuous integration, where the labels depend on the environment you are building the flow for (for example, `dev` or `prod`).
Thanks a lot and keep it up!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/prefect/cli/register.py`
Content:
```
1 import os
2
3 import click
4
5 import prefect
6 from prefect.utilities.storage import extract_flow_from_file
7
8
9 @click.group(hidden=True)
10 def register():
11 """
12 Register flows
13
14 \b
15 Usage:
16 $ prefect register [OBJECT]
17
18 \b
19 Arguments:
20 flow Register flows with a backend API
21
22 \b
23 Examples:
24 $ prefect register flow --file my_flow.py --name My-Flow
25 """
26
27
28 @register.command(
29 hidden=True,
30 context_settings=dict(ignore_unknown_options=True, allow_extra_args=True),
31 )
32 @click.option(
33 "--file",
34 "-f",
35 required=True,
36 help="A file that contains a flow",
37 hidden=True,
38 default=None,
39 type=click.Path(exists=True),
40 )
41 @click.option(
42 "--name",
43 "-n",
44 required=False,
45 help="The `flow.name` to pull out of the file provided.",
46 hidden=True,
47 default=None,
48 )
49 @click.option(
50 "--project",
51 "-p",
52 required=False,
53 help="The name of a Prefect project to register this flow.",
54 hidden=True,
55 default=None,
56 )
57 def flow(file, name, project):
58 """
59 Register a flow from a file. This call will pull a Flow object out of a `.py` file
60 and call `flow.register` on it.
61
62 \b
63 Options:
64 --file, -f TEXT The path to a local file which contains a flow [required]
65 --name, -n TEXT The `flow.name` to pull out of the file provided. If a name
66 is not provided then the first flow object found will be registered.
67 --project TEXT The name of a Prefect project to register this flow
68
69 \b
70 Examples:
71 $ prefect register flow --file my_flow.py --name My-Flow
72 """
73
74 # Don't run extra `run` and `register` functions inside file
75 with prefect.context({"loading_flow": True}):
76 file_path = os.path.abspath(file)
77 flow_obj = extract_flow_from_file(file_path=file_path, flow_name=name)
78
79 flow_obj.register(project_name=project)
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/prefect/cli/register.py b/src/prefect/cli/register.py
--- a/src/prefect/cli/register.py
+++ b/src/prefect/cli/register.py
@@ -54,7 +54,14 @@
hidden=True,
default=None,
)
-def flow(file, name, project):
[email protected](
+ "--label",
+ "-l",
+ required=False,
+ hidden=True,
+ multiple=True,
+)
+def flow(file, name, project, label):
"""
Register a flow from a file. This call will pull a Flow object out of a `.py` file
and call `flow.register` on it.
@@ -64,16 +71,23 @@
--file, -f TEXT The path to a local file which contains a flow [required]
--name, -n TEXT The `flow.name` to pull out of the file provided. If a name
is not provided then the first flow object found will be registered.
- --project TEXT The name of a Prefect project to register this flow
+ --project, -p TEXT The name of a Prefect project to register this flow
+ --label, -l TEXT A label to set on the flow, extending any existing labels.
+ Multiple labels are supported, eg. `-l label1 -l label2`.
\b
Examples:
- $ prefect register flow --file my_flow.py --name My-Flow
+ $ prefect register flow --file my_flow.py --name My-Flow -l label1 -l label2
"""
# Don't run extra `run` and `register` functions inside file
with prefect.context({"loading_flow": True}):
file_path = os.path.abspath(file)
- flow_obj = extract_flow_from_file(file_path=file_path, flow_name=name)
+ flow = extract_flow_from_file(file_path=file_path, flow_name=name)
+
+ if getattr(flow, "run_config", None) is not None:
+ flow.run_config.labels.update(label)
+ else:
+ flow.environment.labels.update(label)
- flow_obj.register(project_name=project)
+ flow.register(project_name=project)
|
{"golden_diff": "diff --git a/src/prefect/cli/register.py b/src/prefect/cli/register.py\n--- a/src/prefect/cli/register.py\n+++ b/src/prefect/cli/register.py\n@@ -54,7 +54,14 @@\n hidden=True,\n default=None,\n )\n-def flow(file, name, project):\[email protected](\n+ \"--label\",\n+ \"-l\",\n+ required=False,\n+ hidden=True,\n+ multiple=True,\n+)\n+def flow(file, name, project, label):\n \"\"\"\n Register a flow from a file. This call will pull a Flow object out of a `.py` file\n and call `flow.register` on it.\n@@ -64,16 +71,23 @@\n --file, -f TEXT The path to a local file which contains a flow [required]\n --name, -n TEXT The `flow.name` to pull out of the file provided. If a name\n is not provided then the first flow object found will be registered.\n- --project TEXT The name of a Prefect project to register this flow\n+ --project, -p TEXT The name of a Prefect project to register this flow\n+ --label, -l TEXT A label to set on the flow, extending any existing labels.\n+ Multiple labels are supported, eg. `-l label1 -l label2`.\n \n \\b\n Examples:\n- $ prefect register flow --file my_flow.py --name My-Flow\n+ $ prefect register flow --file my_flow.py --name My-Flow -l label1 -l label2\n \"\"\"\n \n # Don't run extra `run` and `register` functions inside file\n with prefect.context({\"loading_flow\": True}):\n file_path = os.path.abspath(file)\n- flow_obj = extract_flow_from_file(file_path=file_path, flow_name=name)\n+ flow = extract_flow_from_file(file_path=file_path, flow_name=name)\n+\n+ if getattr(flow, \"run_config\", None) is not None:\n+ flow.run_config.labels.update(label)\n+ else:\n+ flow.environment.labels.update(label)\n \n- flow_obj.register(project_name=project)\n+ flow.register(project_name=project)\n", "issue": "Set flow labels through the CLI register command\n## Current behavior\r\nRight now there is no way to add a label to a **flow** when registering it using the CLI. 
You only can set the label from inside the flow's code.\r\n\r\n## Proposed behavior\r\nIt would be really nice to have a `--label` parameter (just like the one we have for agents) that would allow us to add labels on the fly when registering a flow from the CLI.\r\n\r\n## Example\r\nAdding the parameter would look something like this:\r\n`prefect register flow --file my_flow.py --name My-Flow --label my-label`\r\nor this:\r\n`prefect register flow --file my_flow.py --name My-Flow -l my-label`\r\n\r\nThis could be very useful for registering flows with continuous integration depending on the environment you want to build the flow for (for example, `dev` or `prod`).\r\n\r\nThanks a lot and keep it up!\r\n\n", "before_files": [{"content": "import os\n\nimport click\n\nimport prefect\nfrom prefect.utilities.storage import extract_flow_from_file\n\n\[email protected](hidden=True)\ndef register():\n \"\"\"\n Register flows\n\n \\b\n Usage:\n $ prefect register [OBJECT]\n\n \\b\n Arguments:\n flow Register flows with a backend API\n\n \\b\n Examples:\n $ prefect register flow --file my_flow.py --name My-Flow\n \"\"\"\n\n\[email protected](\n hidden=True,\n context_settings=dict(ignore_unknown_options=True, allow_extra_args=True),\n)\[email protected](\n \"--file\",\n \"-f\",\n required=True,\n help=\"A file that contains a flow\",\n hidden=True,\n default=None,\n type=click.Path(exists=True),\n)\[email protected](\n \"--name\",\n \"-n\",\n required=False,\n help=\"The `flow.name` to pull out of the file provided.\",\n hidden=True,\n default=None,\n)\[email protected](\n \"--project\",\n \"-p\",\n required=False,\n help=\"The name of a Prefect project to register this flow.\",\n hidden=True,\n default=None,\n)\ndef flow(file, name, project):\n \"\"\"\n Register a flow from a file. This call will pull a Flow object out of a `.py` file\n and call `flow.register` on it.\n\n \\b\n Options:\n --file, -f TEXT The path to a local file which contains a flow [required]\n --name, -n TEXT The `flow.name` to pull out of the file provided. 
If a name\n is not provided then the first flow object found will be registered.\n --project TEXT The name of a Prefect project to register this flow\n\n \\b\n Examples:\n $ prefect register flow --file my_flow.py --name My-Flow\n \"\"\"\n\n # Don't run extra `run` and `register` functions inside file\n with prefect.context({\"loading_flow\": True}):\n file_path = os.path.abspath(file)\n flow_obj = extract_flow_from_file(file_path=file_path, flow_name=name)\n\n flow_obj.register(project_name=project)\n", "path": "src/prefect/cli/register.py"}], "after_files": [{"content": "import os\n\nimport click\n\nimport prefect\nfrom prefect.utilities.storage import extract_flow_from_file\n\n\[email protected](hidden=True)\ndef register():\n \"\"\"\n Register flows\n\n \\b\n Usage:\n $ prefect register [OBJECT]\n\n \\b\n Arguments:\n flow Register flows with a backend API\n\n \\b\n Examples:\n $ prefect register flow --file my_flow.py --name My-Flow\n \"\"\"\n\n\[email protected](\n hidden=True,\n context_settings=dict(ignore_unknown_options=True, allow_extra_args=True),\n)\[email protected](\n \"--file\",\n \"-f\",\n required=True,\n help=\"A file that contains a flow\",\n hidden=True,\n default=None,\n type=click.Path(exists=True),\n)\[email protected](\n \"--name\",\n \"-n\",\n required=False,\n help=\"The `flow.name` to pull out of the file provided.\",\n hidden=True,\n default=None,\n)\[email protected](\n \"--project\",\n \"-p\",\n required=False,\n help=\"The name of a Prefect project to register this flow.\",\n hidden=True,\n default=None,\n)\[email protected](\n \"--label\",\n \"-l\",\n required=False,\n hidden=True,\n multiple=True,\n)\ndef flow(file, name, project, label):\n \"\"\"\n Register a flow from a file. This call will pull a Flow object out of a `.py` file\n and call `flow.register` on it.\n\n \\b\n Options:\n --file, -f TEXT The path to a local file which contains a flow [required]\n --name, -n TEXT The `flow.name` to pull out of the file provided. If a name\n is not provided then the first flow object found will be registered.\n --project, -p TEXT The name of a Prefect project to register this flow\n --label, -l TEXT A label to set on the flow, extending any existing labels.\n Multiple labels are supported, eg. `-l label1 -l label2`.\n\n \\b\n Examples:\n $ prefect register flow --file my_flow.py --name My-Flow -l label1 -l label2\n \"\"\"\n\n # Don't run extra `run` and `register` functions inside file\n with prefect.context({\"loading_flow\": True}):\n file_path = os.path.abspath(file)\n flow = extract_flow_from_file(file_path=file_path, flow_name=name)\n\n if getattr(flow, \"run_config\", None) is not None:\n flow.run_config.labels.update(label)\n else:\n flow.environment.labels.update(label)\n\n flow.register(project_name=project)\n", "path": "src/prefect/cli/register.py"}]}
| 1,075 | 490 |
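For reference, the repeatable flag in the patch above leans on Click's `multiple=True`, which hands the callback a tuple containing every occurrence of the option. A minimal standalone sketch; the command body is illustrative, not Prefect's code.

```python
# Sketch of a repeatable Click option like the --label/-l flag above.
import click


@click.command()
@click.option("--label", "-l", multiple=True)
def register(label):
    # label arrives as a tuple, e.g. ("dev", "gpu") for: -l dev -l gpu
    click.echo(f"labels: {sorted(label)}")


if __name__ == "__main__":
    register()
```

Run as `python register.py -l dev -l gpu` this prints `labels: ['dev', 'gpu']`; with no `-l` flags the tuple is simply empty, which is why the patched command can call `labels.update(label)` unconditionally.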
gh_patches_debug_17801
|
rasdani/github-patches
|
git_diff
|
apache__airflow-9759
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Replace flask_oauthlib with Authlib
**Description**
flask_oauthlib has been deprecated in favour of Authlib. It would be good if Airflow started using Authlib.
**Use case / motivation**
FlaskAppBuilder is now using Authlib.
Since FlaskAppBuilder is deeply integrated into Airflow, it would be good for Airflow to move to Authlib as well. The Flask-OAuthlib documentation itself recommends Authlib.
**Related Issues**
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `airflow/config_templates/default_webserver_config.py`
Content:
```
1 #
2 # Licensed to the Apache Software Foundation (ASF) under one
3 # or more contributor license agreements. See the NOTICE file
4 # distributed with this work for additional information
5 # regarding copyright ownership. The ASF licenses this file
6 # to you under the Apache License, Version 2.0 (the
7 # "License"); you may not use this file except in compliance
8 # with the License. You may obtain a copy of the License at
9 #
10 # http://www.apache.org/licenses/LICENSE-2.0
11 #
12 # Unless required by applicable law or agreed to in writing,
13 # software distributed under the License is distributed on an
14 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
15 # KIND, either express or implied. See the License for the
16 # specific language governing permissions and limitations
17 # under the License.
18 """Default configuration for the Airflow webserver"""
19 import os
20
21 from flask_appbuilder.security.manager import AUTH_DB
22
23 from airflow.configuration import conf
24
25 # from flask_appbuilder.security.manager import AUTH_LDAP
26 # from flask_appbuilder.security.manager import AUTH_OAUTH
27 # from flask_appbuilder.security.manager import AUTH_OID
28 # from flask_appbuilder.security.manager import AUTH_REMOTE_USER
29
30
31 basedir = os.path.abspath(os.path.dirname(__file__))
32
33 # The SQLAlchemy connection string.
34 SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')
35
36 # Flask-WTF flag for CSRF
37 WTF_CSRF_ENABLED = True
38
39 # ----------------------------------------------------
40 # AUTHENTICATION CONFIG
41 # ----------------------------------------------------
42 # For details on how to set up each of the following authentication, see
43 # http://flask-appbuilder.readthedocs.io/en/latest/security.html# authentication-methods
44 # for details.
45
46 # The authentication type
47 # AUTH_OID : Is for OpenID
48 # AUTH_DB : Is for database
49 # AUTH_LDAP : Is for LDAP
50 # AUTH_REMOTE_USER : Is for using REMOTE_USER from web server
51 # AUTH_OAUTH : Is for OAuth
52 AUTH_TYPE = AUTH_DB
53
54 # Uncomment to setup Full admin role name
55 # AUTH_ROLE_ADMIN = 'Admin'
56
57 # Uncomment to setup Public role name, no authentication needed
58 # AUTH_ROLE_PUBLIC = 'Public'
59
60 # Will allow user self registration
61 # AUTH_USER_REGISTRATION = True
62
63 # The default user self registration role
64 # AUTH_USER_REGISTRATION_ROLE = "Public"
65
66 # When using OAuth Auth, uncomment to setup provider(s) info
67 # Google OAuth example:
68 # OAUTH_PROVIDERS = [{
69 # 'name':'google',
70 # 'token_key':'access_token',
71 # 'icon':'fa-google',
72 # 'remote_app': {
73 # 'base_url':'https://www.googleapis.com/oauth2/v2/',
74 # 'request_token_params':{
75 # 'scope': 'email profile'
76 # },
77 # 'access_token_url':'https://accounts.google.com/o/oauth2/token',
78 # 'authorize_url':'https://accounts.google.com/o/oauth2/auth',
79 # 'request_token_url': None,
80 # 'consumer_key': CONSUMER_KEY,
81 # 'consumer_secret': SECRET_KEY,
82 # }
83 # }]
84
85 # When using LDAP Auth, setup the ldap server
86 # AUTH_LDAP_SERVER = "ldap://ldapserver.new"
87
88 # When using OpenID Auth, uncomment to setup OpenID providers.
89 # example for OpenID authentication
90 # OPENID_PROVIDERS = [
91 # { 'name': 'Yahoo', 'url': 'https://me.yahoo.com' },
92 # { 'name': 'AOL', 'url': 'http://openid.aol.com/<username>' },
93 # { 'name': 'Flickr', 'url': 'http://www.flickr.com/<username>' },
94 # { 'name': 'MyOpenID', 'url': 'https://www.myopenid.com' }]
95
96 # ----------------------------------------------------
97 # Theme CONFIG
98 # ----------------------------------------------------
99 # Flask App Builder comes up with a number of predefined themes
100 # that you can use for Apache Airflow.
101 # http://flask-appbuilder.readthedocs.io/en/latest/customizing.html#changing-themes
102 # Please make sure to remove "navbar_color" configuration from airflow.cfg
103 # in order to fully utilize the theme. (or use that property in conjunction with theme)
104 # APP_THEME = "bootstrap-theme.css" # default bootstrap
105 # APP_THEME = "amelia.css"
106 # APP_THEME = "cerulean.css"
107 # APP_THEME = "cosmo.css"
108 # APP_THEME = "cyborg.css"
109 # APP_THEME = "darkly.css"
110 # APP_THEME = "flatly.css"
111 # APP_THEME = "journal.css"
112 # APP_THEME = "lumen.css"
113 # APP_THEME = "paper.css"
114 # APP_THEME = "readable.css"
115 # APP_THEME = "sandstone.css"
116 # APP_THEME = "simplex.css"
117 # APP_THEME = "slate.css"
118 # APP_THEME = "solar.css"
119 # APP_THEME = "spacelab.css"
120 # APP_THEME = "superhero.css"
121 # APP_THEME = "united.css"
122 # APP_THEME = "yeti.css"
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/airflow/config_templates/default_webserver_config.py b/airflow/config_templates/default_webserver_config.py
--- a/airflow/config_templates/default_webserver_config.py
+++ b/airflow/config_templates/default_webserver_config.py
@@ -70,15 +70,15 @@
# 'token_key':'access_token',
# 'icon':'fa-google',
# 'remote_app': {
-# 'base_url':'https://www.googleapis.com/oauth2/v2/',
-# 'request_token_params':{
+# 'api_base_url':'https://www.googleapis.com/oauth2/v2/',
+# 'client_kwargs':{
# 'scope': 'email profile'
# },
# 'access_token_url':'https://accounts.google.com/o/oauth2/token',
# 'authorize_url':'https://accounts.google.com/o/oauth2/auth',
# 'request_token_url': None,
-# 'consumer_key': CONSUMER_KEY,
-# 'consumer_secret': SECRET_KEY,
+# 'client_id': GOOGLE_KEY,
+# 'client_secret': GOOGLE_SECRET_KEY,
# }
# }]
|
{"golden_diff": "diff --git a/airflow/config_templates/default_webserver_config.py b/airflow/config_templates/default_webserver_config.py\n--- a/airflow/config_templates/default_webserver_config.py\n+++ b/airflow/config_templates/default_webserver_config.py\n@@ -70,15 +70,15 @@\n # 'token_key':'access_token',\n # 'icon':'fa-google',\n # 'remote_app': {\n-# 'base_url':'https://www.googleapis.com/oauth2/v2/',\n-# 'request_token_params':{\n+# 'api_base_url':'https://www.googleapis.com/oauth2/v2/',\n+# 'client_kwargs':{\n # 'scope': 'email profile'\n # },\n # 'access_token_url':'https://accounts.google.com/o/oauth2/token',\n # 'authorize_url':'https://accounts.google.com/o/oauth2/auth',\n # 'request_token_url': None,\n-# 'consumer_key': CONSUMER_KEY,\n-# 'consumer_secret': SECRET_KEY,\n+# 'client_id': GOOGLE_KEY,\n+# 'client_secret': GOOGLE_SECRET_KEY,\n # }\n # }]\n", "issue": "Replace flask_oauthlib with Authlib\n\r\n**Description**\r\n\r\nflask_oauthlib has been deprecated in favour of Authlib. It would be good if airflow starts using Authlib\r\n\r\n**Use case / motivation**\r\n\r\nFlaskAppBuilder is now using Authlib. \r\nSince FlaskAppBuilder is deeply integrated into Airflow, it will be good to also have this Authlib. Flask-oauthlib documentation recommends Authlib\r\n\r\n**Related Issues**\r\n\n", "before_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. 
See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\"\"\"Default configuration for the Airflow webserver\"\"\"\nimport os\n\nfrom flask_appbuilder.security.manager import AUTH_DB\n\nfrom airflow.configuration import conf\n\n# from flask_appbuilder.security.manager import AUTH_LDAP\n# from flask_appbuilder.security.manager import AUTH_OAUTH\n# from flask_appbuilder.security.manager import AUTH_OID\n# from flask_appbuilder.security.manager import AUTH_REMOTE_USER\n\n\nbasedir = os.path.abspath(os.path.dirname(__file__))\n\n# The SQLAlchemy connection string.\nSQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')\n\n# Flask-WTF flag for CSRF\nWTF_CSRF_ENABLED = True\n\n# ----------------------------------------------------\n# AUTHENTICATION CONFIG\n# ----------------------------------------------------\n# For details on how to set up each of the following authentication, see\n# http://flask-appbuilder.readthedocs.io/en/latest/security.html# authentication-methods\n# for details.\n\n# The authentication type\n# AUTH_OID : Is for OpenID\n# AUTH_DB : Is for database\n# AUTH_LDAP : Is for LDAP\n# AUTH_REMOTE_USER : Is for using REMOTE_USER from web server\n# AUTH_OAUTH : Is for OAuth\nAUTH_TYPE = AUTH_DB\n\n# Uncomment to setup Full admin role name\n# AUTH_ROLE_ADMIN = 'Admin'\n\n# Uncomment to setup Public role name, no authentication needed\n# AUTH_ROLE_PUBLIC = 'Public'\n\n# Will allow user self registration\n# AUTH_USER_REGISTRATION = True\n\n# The default user self registration role\n# AUTH_USER_REGISTRATION_ROLE = \"Public\"\n\n# When using OAuth Auth, uncomment to setup provider(s) info\n# Google OAuth example:\n# OAUTH_PROVIDERS = [{\n# 'name':'google',\n# 'token_key':'access_token',\n# 'icon':'fa-google',\n# 'remote_app': {\n# 'base_url':'https://www.googleapis.com/oauth2/v2/',\n# 'request_token_params':{\n# 'scope': 'email profile'\n# },\n# 'access_token_url':'https://accounts.google.com/o/oauth2/token',\n# 'authorize_url':'https://accounts.google.com/o/oauth2/auth',\n# 'request_token_url': None,\n# 'consumer_key': CONSUMER_KEY,\n# 'consumer_secret': SECRET_KEY,\n# }\n# }]\n\n# When using LDAP Auth, setup the ldap server\n# AUTH_LDAP_SERVER = \"ldap://ldapserver.new\"\n\n# When using OpenID Auth, uncomment to setup OpenID providers.\n# example for OpenID authentication\n# OPENID_PROVIDERS = [\n# { 'name': 'Yahoo', 'url': 'https://me.yahoo.com' },\n# { 'name': 'AOL', 'url': 'http://openid.aol.com/<username>' },\n# { 'name': 'Flickr', 'url': 'http://www.flickr.com/<username>' },\n# { 'name': 'MyOpenID', 'url': 'https://www.myopenid.com' }]\n\n# ----------------------------------------------------\n# Theme CONFIG\n# ----------------------------------------------------\n# Flask App Builder comes up with a number of predefined themes\n# that you can use for Apache Airflow.\n# http://flask-appbuilder.readthedocs.io/en/latest/customizing.html#changing-themes\n# Please make sure to remove \"navbar_color\" configuration from airflow.cfg\n# in order to fully utilize the theme. 
(or use that property in conjunction with theme)\n# APP_THEME = \"bootstrap-theme.css\" # default bootstrap\n# APP_THEME = \"amelia.css\"\n# APP_THEME = \"cerulean.css\"\n# APP_THEME = \"cosmo.css\"\n# APP_THEME = \"cyborg.css\"\n# APP_THEME = \"darkly.css\"\n# APP_THEME = \"flatly.css\"\n# APP_THEME = \"journal.css\"\n# APP_THEME = \"lumen.css\"\n# APP_THEME = \"paper.css\"\n# APP_THEME = \"readable.css\"\n# APP_THEME = \"sandstone.css\"\n# APP_THEME = \"simplex.css\"\n# APP_THEME = \"slate.css\"\n# APP_THEME = \"solar.css\"\n# APP_THEME = \"spacelab.css\"\n# APP_THEME = \"superhero.css\"\n# APP_THEME = \"united.css\"\n# APP_THEME = \"yeti.css\"\n", "path": "airflow/config_templates/default_webserver_config.py"}], "after_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\"\"\"Default configuration for the Airflow webserver\"\"\"\nimport os\n\nfrom flask_appbuilder.security.manager import AUTH_DB\n\nfrom airflow.configuration import conf\n\n# from flask_appbuilder.security.manager import AUTH_LDAP\n# from flask_appbuilder.security.manager import AUTH_OAUTH\n# from flask_appbuilder.security.manager import AUTH_OID\n# from flask_appbuilder.security.manager import AUTH_REMOTE_USER\n\n\nbasedir = os.path.abspath(os.path.dirname(__file__))\n\n# The SQLAlchemy connection string.\nSQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')\n\n# Flask-WTF flag for CSRF\nWTF_CSRF_ENABLED = True\n\n# ----------------------------------------------------\n# AUTHENTICATION CONFIG\n# ----------------------------------------------------\n# For details on how to set up each of the following authentication, see\n# http://flask-appbuilder.readthedocs.io/en/latest/security.html# authentication-methods\n# for details.\n\n# The authentication type\n# AUTH_OID : Is for OpenID\n# AUTH_DB : Is for database\n# AUTH_LDAP : Is for LDAP\n# AUTH_REMOTE_USER : Is for using REMOTE_USER from web server\n# AUTH_OAUTH : Is for OAuth\nAUTH_TYPE = AUTH_DB\n\n# Uncomment to setup Full admin role name\n# AUTH_ROLE_ADMIN = 'Admin'\n\n# Uncomment to setup Public role name, no authentication needed\n# AUTH_ROLE_PUBLIC = 'Public'\n\n# Will allow user self registration\n# AUTH_USER_REGISTRATION = True\n\n# The default user self registration role\n# AUTH_USER_REGISTRATION_ROLE = \"Public\"\n\n# When using OAuth Auth, uncomment to setup provider(s) info\n# Google OAuth example:\n# OAUTH_PROVIDERS = [{\n# 'name':'google',\n# 'token_key':'access_token',\n# 'icon':'fa-google',\n# 'remote_app': {\n# 'api_base_url':'https://www.googleapis.com/oauth2/v2/',\n# 'client_kwargs':{\n# 'scope': 'email profile'\n# },\n# 'access_token_url':'https://accounts.google.com/o/oauth2/token',\n# 'authorize_url':'https://accounts.google.com/o/oauth2/auth',\n# 'request_token_url': None,\n# 
'client_id': GOOGLE_KEY,\n# 'client_secret': GOOGLE_SECRET_KEY,\n# }\n# }]\n\n# When using LDAP Auth, setup the ldap server\n# AUTH_LDAP_SERVER = \"ldap://ldapserver.new\"\n\n# When using OpenID Auth, uncomment to setup OpenID providers.\n# example for OpenID authentication\n# OPENID_PROVIDERS = [\n# { 'name': 'Yahoo', 'url': 'https://me.yahoo.com' },\n# { 'name': 'AOL', 'url': 'http://openid.aol.com/<username>' },\n# { 'name': 'Flickr', 'url': 'http://www.flickr.com/<username>' },\n# { 'name': 'MyOpenID', 'url': 'https://www.myopenid.com' }]\n\n# ----------------------------------------------------\n# Theme CONFIG\n# ----------------------------------------------------\n# Flask App Builder comes up with a number of predefined themes\n# that you can use for Apache Airflow.\n# http://flask-appbuilder.readthedocs.io/en/latest/customizing.html#changing-themes\n# Please make sure to remove \"navbar_color\" configuration from airflow.cfg\n# in order to fully utilize the theme. (or use that property in conjunction with theme)\n# APP_THEME = \"bootstrap-theme.css\" # default bootstrap\n# APP_THEME = \"amelia.css\"\n# APP_THEME = \"cerulean.css\"\n# APP_THEME = \"cosmo.css\"\n# APP_THEME = \"cyborg.css\"\n# APP_THEME = \"darkly.css\"\n# APP_THEME = \"flatly.css\"\n# APP_THEME = \"journal.css\"\n# APP_THEME = \"lumen.css\"\n# APP_THEME = \"paper.css\"\n# APP_THEME = \"readable.css\"\n# APP_THEME = \"sandstone.css\"\n# APP_THEME = \"simplex.css\"\n# APP_THEME = \"slate.css\"\n# APP_THEME = \"solar.css\"\n# APP_THEME = \"spacelab.css\"\n# APP_THEME = \"superhero.css\"\n# APP_THEME = \"united.css\"\n# APP_THEME = \"yeti.css\"\n", "path": "airflow/config_templates/default_webserver_config.py"}]}
| 1,682 | 239 |
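The substance of the patch above is a vocabulary change: the same OAuth provider settings, spelled with Authlib's key names instead of flask-oauthlib's. Expressed as a plain-dict migration sketch (the values are placeholders, not working credentials):

```python
# Sketch of the flask-oauthlib -> Authlib key renames from the patch above.
AUTHLIB_KEY_RENAMES = {
    "base_url": "api_base_url",
    "request_token_params": "client_kwargs",
    "consumer_key": "client_id",
    "consumer_secret": "client_secret",
}


def migrate_remote_app(remote_app: dict) -> dict:
    """Rename legacy keys; anything already Authlib-style passes through."""
    return {AUTHLIB_KEY_RENAMES.get(key, key): value
            for key, value in remote_app.items()}


legacy = {
    "base_url": "https://www.googleapis.com/oauth2/v2/",
    "request_token_params": {"scope": "email profile"},
    "access_token_url": "https://accounts.google.com/o/oauth2/token",
    "consumer_key": "PLACEHOLDER_KEY",
    "consumer_secret": "PLACEHOLDER_SECRET",
}
print(migrate_remote_app(legacy))
```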
gh_patches_debug_4985
|
rasdani/github-patches
|
git_diff
|
spack__spack-2022
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`spack checksum` finds wrong URLs
I was having some problems getting spack to find the correct URL for files.
With these settings
``` python
homepage = "http://fishshell.com/"
url = "http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz"
list_url = homepage
```
I get the following result (with wrong URLs):
``` sh
$ spack checksum fish
==> Found 5 versions of fish.
2.2.0 http://fishshell.com/fish-2.2.0.tar.gz
2.1.2 http://fishshell.com/fish-2.1.2.tar.gz
2.1.1 http://fishshell.com/fish-2.1.1.tar.gz
2.1.0 http://fishshell.com/fish-2.1.0.tar.gz
2.0.0 http://fishshell.com/fish-2.0.0.tar.gz
How many would you like to checksum? (default is 5, q to abort)
==> Downloading...
==> Trying to fetch from http://fishshell.com/fish-2.2.0.tar.gz
curl: (22) The requested URL returned error: 404 Not Found
==> Fetching from http://fishshell.com/fish-2.2.0.tar.gz failed.
==> Error: All fetchers failed for spack-stage-Slflbn
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/fish/package.py`
Content:
```
1 ##############################################################################
2 # Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
3 # Produced at the Lawrence Livermore National Laboratory.
4 #
5 # This file is part of Spack.
6 # Created by Todd Gamblin, [email protected], All rights reserved.
7 # LLNL-CODE-647188
8 #
9 # For details, see https://github.com/llnl/spack
10 # Please also see the LICENSE file for our notice and the LGPL.
11 #
12 # This program is free software; you can redistribute it and/or modify
13 # it under the terms of the GNU Lesser General Public License (as
14 # published by the Free Software Foundation) version 2.1, February 1999.
15 #
16 # This program is distributed in the hope that it will be useful, but
17 # WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
18 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
19 # conditions of the GNU Lesser General Public License for more details.
20 #
21 # You should have received a copy of the GNU Lesser General Public
22 # License along with this program; if not, write to the Free Software
23 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
24 ##############################################################################
25 from spack import *
26
27
28 class Fish(Package):
29 """fish is a smart and user-friendly command line shell for OS X, Linux, and
30 the rest of the family.
31 """
32
33 homepage = "http://fishshell.com/"
34 url = "http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz"
35 list_url = "http://fishshell.com/files/"
36 list_depth = 2
37
38 version('2.2.0', 'a76339fd14ce2ec229283c53e805faac48c3e99d9e3ede9d82c0554acfc7b77a')
39
40 def install(self, spec, prefix):
41 configure('--prefix=%s' % prefix)
42
43 make()
44 make("install")
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/var/spack/repos/builtin/packages/fish/package.py b/var/spack/repos/builtin/packages/fish/package.py
--- a/var/spack/repos/builtin/packages/fish/package.py
+++ b/var/spack/repos/builtin/packages/fish/package.py
@@ -32,8 +32,7 @@
homepage = "http://fishshell.com/"
url = "http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz"
- list_url = "http://fishshell.com/files/"
- list_depth = 2
+ list_url = "http://fishshell.com/"
version('2.2.0', 'a76339fd14ce2ec229283c53e805faac48c3e99d9e3ede9d82c0554acfc7b77a')
|
{"golden_diff": "diff --git a/var/spack/repos/builtin/packages/fish/package.py b/var/spack/repos/builtin/packages/fish/package.py\n--- a/var/spack/repos/builtin/packages/fish/package.py\n+++ b/var/spack/repos/builtin/packages/fish/package.py\n@@ -32,8 +32,7 @@\n \n homepage = \"http://fishshell.com/\"\n url = \"http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz\"\n- list_url = \"http://fishshell.com/files/\"\n- list_depth = 2\n+ list_url = \"http://fishshell.com/\"\n \n version('2.2.0', 'a76339fd14ce2ec229283c53e805faac48c3e99d9e3ede9d82c0554acfc7b77a')\n", "issue": "`spack checksum` finds wrong URLs\nI was having some problems getting spack to find the correct URL for files.\n\nWith these settings\n\n``` python\n homepage = \"http://fishshell.com/\"\n url = \"http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz\"\n list_url = homepage\n```\n\nI get the following result (with wrong URLs):\n\n``` sh\n$ spack checksum fish\n==> Found 5 versions of fish.\n 2.2.0 http://fishshell.com/fish-2.2.0.tar.gz\n 2.1.2 http://fishshell.com/fish-2.1.2.tar.gz\n 2.1.1 http://fishshell.com/fish-2.1.1.tar.gz\n 2.1.0 http://fishshell.com/fish-2.1.0.tar.gz\n 2.0.0 http://fishshell.com/fish-2.0.0.tar.gz\n\nHow many would you like to checksum? (default is 5, q to abort)\n==> Downloading...\n==> Trying to fetch from http://fishshell.com/fish-2.2.0.tar.gz\n\ncurl: (22) The requested URL returned error: 404 Not Found\n==> Fetching from http://fishshell.com/fish-2.2.0.tar.gz failed.\n==> Error: All fetchers failed for spack-stage-Slflbn\n```\n\n", "before_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the LICENSE file for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Fish(Package):\n \"\"\"fish is a smart and user-friendly command line shell for OS X, Linux, and\n the rest of the family.\n \"\"\"\n\n homepage = \"http://fishshell.com/\"\n url = \"http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz\"\n list_url = \"http://fishshell.com/files/\"\n list_depth = 2\n\n version('2.2.0', 'a76339fd14ce2ec229283c53e805faac48c3e99d9e3ede9d82c0554acfc7b77a')\n\n def install(self, spec, prefix):\n configure('--prefix=%s' % prefix)\n\n make()\n make(\"install\")\n", "path": "var/spack/repos/builtin/packages/fish/package.py"}], "after_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the LICENSE file for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Fish(Package):\n \"\"\"fish is a smart and user-friendly command line shell for OS X, Linux, and\n the rest of the family.\n \"\"\"\n\n homepage = \"http://fishshell.com/\"\n url = \"http://fishshell.com/files/2.2.0/fish-2.2.0.tar.gz\"\n list_url = \"http://fishshell.com/\"\n\n version('2.2.0', 'a76339fd14ce2ec229283c53e805faac48c3e99d9e3ede9d82c0554acfc7b77a')\n\n def install(self, spec, prefix):\n configure('--prefix=%s' % prefix)\n\n make()\n make(\"install\")\n", "path": "var/spack/repos/builtin/packages/fish/package.py"}]}
| 1,151 | 207 |
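Background for the fix above: Spack discovers candidate versions by spidering `list_url` (down to the configured `list_depth`) and pattern-matching tarball links, so a `list_url` that does not actually link the versioned files yields the wrong URLs seen in the issue. A toy sketch of that extraction step; the HTML and regex are illustrative, not Spack's real spider.

```python
# Toy sketch of scraping versions from a listing page, as in the row above.
import re

html = """
<a href="files/2.2.0/fish-2.2.0.tar.gz">fish-2.2.0.tar.gz</a>
<a href="files/2.1.2/fish-2.1.2.tar.gz">fish-2.1.2.tar.gz</a>
"""

versions = sorted(set(re.findall(r"fish-(\d+\.\d+\.\d+)\.tar\.gz", html)))
print(versions)  # ['2.1.2', '2.2.0']
```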
gh_patches_debug_29840
|
rasdani/github-patches
|
git_diff
|
onnx__sklearn-onnx-459
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix discrepancies introduced by scikit-learn 0.23
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skl2onnx/operator_converters/gaussian_process.py`
Content:
```
1 # -------------------------------------------------------------------------
2 # Copyright (c) Microsoft Corporation. All rights reserved.
3 # Licensed under the MIT License. See License.txt in the project root for
4 # license information.
5 # --------------------------------------------------------------------------
6 import numpy as np
7 from sklearn.gaussian_process.kernels import ConstantKernel as C, RBF
8 from ..common._registration import register_converter
9 from ..algebra.onnx_ops import (
10 OnnxAdd, OnnxSqrt, OnnxMatMul, OnnxSub, OnnxReduceSum,
11 OnnxMul, OnnxMax
12 )
13 try:
14 from ..algebra.onnx_ops import OnnxConstantOfShape
15 except ImportError:
16 OnnxConstantOfShape = None
17
18 from ._gp_kernels import (
19 convert_kernel_diag,
20 convert_kernel,
21 _zero_vector_of_size
22 )
23
24
25 def convert_gaussian_process_regressor(scope, operator, container):
26 """
27 The method *predict* from class *GaussianProcessRegressor*
28 may cache some results if it is called with parameter
29 ``return_std=True`` or ``return_cov=True``. This converter
30 needs to be called with theses options to enable
31 the second results.
32 See example :ref:`l-gpr-example` to see how to
33 use this converter which does not behave exactly
34 as the others.
35 """
36 dtype = container.dtype
37 if dtype is None:
38 raise RuntimeError("dtype cannot be None")
39 X = operator.inputs[0]
40 out = operator.outputs
41 op = operator.raw_operator
42 opv = container.target_opset
43 if opv is None:
44 raise RuntimeError("container.target_opset must not be None")
45
46 options = container.get_options(op, dict(return_cov=False,
47 return_std=False,
48 optim=None))
49 if hasattr(op, 'kernel_') and op.kernel_ is not None:
50 kernel = op.kernel_
51 elif op.kernel is None:
52 kernel = (C(1.0, constant_value_bounds="fixed") *
53 RBF(1.0, length_scale_bounds="fixed"))
54 else:
55 kernel = op.kernel
56
57 if not hasattr(op, "X_train_") or op.X_train_ is None:
58 out0 = _zero_vector_of_size(X, keepdims=1, output_names=out[:1],
59 dtype=dtype)
60
61 outputs = [out0]
62 if options['return_cov']:
63 outputs.append(convert_kernel(kernel, X,
64 output_names=out[1:],
65 dtype=dtype,
66 op_version=opv))
67 if options['return_std']:
68 outputs.append(
69 OnnxSqrt(
70 convert_kernel_diag(
71 kernel, X, dtype=dtype, op_version=opv),
72 output_names=out[1:], op_version=opv))
73 else:
74 out0 = _zero_vector_of_size(
75 X, keepdims=1, dtype=dtype, op_version=opv)
76
77 # Code scikit-learn
78 # K_trans = self.kernel_(X, self.X_train_)
79 # y_mean = K_trans.dot(self.alpha_) # Line 4 (y_mean = f_star)
80 # y_mean = self._y_train_mean + y_mean # undo normal.
81
82 k_trans = convert_kernel(kernel, X,
83 x_train=op.X_train_.astype(dtype),
84 dtype=dtype,
85 optim=options.get('optim', None),
86 op_version=opv)
87 k_trans.set_onnx_name_prefix('kgpd')
88 y_mean_b = OnnxMatMul(k_trans, op.alpha_.astype(dtype), op_version=opv)
89
90 mean_y = op._y_train_mean.astype(dtype)
91 if len(mean_y.shape) == 1:
92 mean_y = mean_y.reshape(mean_y.shape + (1,))
93 y_mean = OnnxAdd(y_mean_b, mean_y,
94 output_names=out[:1],
95 op_version=opv)
96 y_mean.set_onnx_name_prefix('gpr')
97 outputs = [y_mean]
98
99 if options['return_cov']:
100 raise NotImplementedError()
101 if options['return_std']:
102 if op._K_inv is None:
103 raise RuntimeError(
104 "The method *predict* must be called once with parameter "
105 "return_std=True to compute internal variables. "
106 "They cannot be computed here as the same operation "
107 "(matrix inversion) produces too many discrepencies "
108 "if done with single floats than double floats.")
109 _K_inv = op._K_inv
110
111 # y_var = self.kernel_.diag(X)
112 y_var = convert_kernel_diag(kernel, X, dtype=dtype,
113 optim=options.get('optim', None),
114 op_version=opv)
115
116 # y_var -= np.einsum("ij,ij->i",
117 # np.dot(K_trans, self._K_inv), K_trans)
118 k_dot = OnnxMatMul(k_trans, _K_inv.astype(dtype), op_version=opv)
119 ys_var = OnnxSub(
120 y_var, OnnxReduceSum(
121 OnnxMul(k_dot, k_trans, op_version=opv),
122 axes=[1], keepdims=0, op_version=opv),
123 op_version=opv)
124
125 # y_var_negative = y_var < 0
126 # if np.any(y_var_negative):
127 # y_var[y_var_negative] = 0.0
128 ys0_var = OnnxMax(ys_var, np.array([0], dtype=dtype),
129 op_version=opv)
130
131 # var = np.sqrt(ys0_var)
132 var = OnnxSqrt(ys0_var, output_names=out[1:], op_version=opv)
133 var.set_onnx_name_prefix('gprv')
134 outputs.append(var)
135
136 for o in outputs:
137 o.add_to(scope, container)
138
139
140 if OnnxConstantOfShape is not None:
141 register_converter('SklearnGaussianProcessRegressor',
142 convert_gaussian_process_regressor,
143 options={'return_cov': [False, True],
144 'return_std': [False, True],
145 'optim': [None, 'cdist']})
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/skl2onnx/operator_converters/gaussian_process.py b/skl2onnx/operator_converters/gaussian_process.py
--- a/skl2onnx/operator_converters/gaussian_process.py
+++ b/skl2onnx/operator_converters/gaussian_process.py
@@ -90,9 +90,22 @@
mean_y = op._y_train_mean.astype(dtype)
if len(mean_y.shape) == 1:
mean_y = mean_y.reshape(mean_y.shape + (1,))
- y_mean = OnnxAdd(y_mean_b, mean_y,
- output_names=out[:1],
- op_version=opv)
+
+ if not hasattr(op, '_y_train_std') or op._y_train_std == 1:
+ y_mean = OnnxAdd(y_mean_b, mean_y, output_names=out[:1],
+ op_version=opv)
+ else:
+ # A bug was fixed in 0.23 and it changed
+ # the predictions when return_std is True.
+ # See https://github.com/scikit-learn/scikit-learn/pull/15782.
+ # y_mean = self._y_train_std * y_mean + self._y_train_mean
+ var_y = op._y_train_std.astype(dtype)
+ if len(var_y.shape) == 1:
+ var_y = var_y.reshape(var_y.shape + (1,))
+ y_mean = OnnxAdd(
+ OnnxMul(y_mean_b, var_y, op_version=opv),
+ mean_y, output_names=out[:1], op_version=opv)
+
y_mean.set_onnx_name_prefix('gpr')
outputs = [y_mean]
@@ -128,6 +141,10 @@
ys0_var = OnnxMax(ys_var, np.array([0], dtype=dtype),
op_version=opv)
+ if hasattr(op, '_y_train_std') and op._y_train_std != 1:
+ # y_var = y_var * self._y_train_std**2
+ ys0_var = OnnxMul(ys0_var, var_y ** 2, op_version=opv)
+
# var = np.sqrt(ys0_var)
var = OnnxSqrt(ys0_var, output_names=out[1:], op_version=opv)
var.set_onnx_name_prefix('gprv')
|
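The comments in the diff above point at scikit-learn#15782: from 0.23 on, `GaussianProcessRegressor` with `normalize_y=True` rescales the predicted mean by the training standard deviation and the variance by its square. A toy numpy sketch of that un-normalisation step; the numbers are illustrative.

```python
# Toy numpy sketch of the 0.23 un-normalisation mirrored by the patch above.
import numpy as np

y_train = np.array([10.0, 12.0, 14.0])
y_train_mean = y_train.mean()   # used by all versions
y_train_std = y_train.std()     # used from scikit-learn 0.23 on

y_mean_norm = np.array([-1.0, 0.0, 1.0])  # model output in normalised space
y_var_norm = np.array([0.04, 0.05, 0.04])

y_mean = y_train_std * y_mean_norm + y_train_mean  # undo normalisation
y_var = y_var_norm * y_train_std ** 2              # variance scales by std**2
print(y_mean, y_var)
```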
{"golden_diff": "diff --git a/skl2onnx/operator_converters/gaussian_process.py b/skl2onnx/operator_converters/gaussian_process.py\n--- a/skl2onnx/operator_converters/gaussian_process.py\n+++ b/skl2onnx/operator_converters/gaussian_process.py\n@@ -90,9 +90,22 @@\n mean_y = op._y_train_mean.astype(dtype)\n if len(mean_y.shape) == 1:\n mean_y = mean_y.reshape(mean_y.shape + (1,))\n- y_mean = OnnxAdd(y_mean_b, mean_y,\n- output_names=out[:1],\n- op_version=opv)\n+\n+ if not hasattr(op, '_y_train_std') or op._y_train_std == 1:\n+ y_mean = OnnxAdd(y_mean_b, mean_y, output_names=out[:1],\n+ op_version=opv)\n+ else:\n+ # A bug was fixed in 0.23 and it changed\n+ # the predictions when return_std is True.\n+ # See https://github.com/scikit-learn/scikit-learn/pull/15782.\n+ # y_mean = self._y_train_std * y_mean + self._y_train_mean\n+ var_y = op._y_train_std.astype(dtype)\n+ if len(var_y.shape) == 1:\n+ var_y = var_y.reshape(var_y.shape + (1,))\n+ y_mean = OnnxAdd(\n+ OnnxMul(y_mean_b, var_y, op_version=opv),\n+ mean_y, output_names=out[:1], op_version=opv)\n+\n y_mean.set_onnx_name_prefix('gpr')\n outputs = [y_mean]\n \n@@ -128,6 +141,10 @@\n ys0_var = OnnxMax(ys_var, np.array([0], dtype=dtype),\n op_version=opv)\n \n+ if hasattr(op, '_y_train_std') and op._y_train_std != 1:\n+ # y_var = y_var * self._y_train_std**2\n+ ys0_var = OnnxMul(ys0_var, var_y ** 2, op_version=opv)\n+\n # var = np.sqrt(ys0_var)\n var = OnnxSqrt(ys0_var, output_names=out[1:], op_version=opv)\n var.set_onnx_name_prefix('gprv')\n", "issue": "Fix discrepencies introduced by sciki-learn 0.23\n\n", "before_files": [{"content": "# -------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for\n# license information.\n# --------------------------------------------------------------------------\nimport numpy as np\nfrom sklearn.gaussian_process.kernels import ConstantKernel as C, RBF\nfrom ..common._registration import register_converter\nfrom ..algebra.onnx_ops import (\n OnnxAdd, OnnxSqrt, OnnxMatMul, OnnxSub, OnnxReduceSum,\n OnnxMul, OnnxMax\n)\ntry:\n from ..algebra.onnx_ops import OnnxConstantOfShape\nexcept ImportError:\n OnnxConstantOfShape = None\n\nfrom ._gp_kernels import (\n convert_kernel_diag,\n convert_kernel,\n _zero_vector_of_size\n)\n\n\ndef convert_gaussian_process_regressor(scope, operator, container):\n \"\"\"\n The method *predict* from class *GaussianProcessRegressor*\n may cache some results if it is called with parameter\n ``return_std=True`` or ``return_cov=True``. 
This converter\n needs to be called with theses options to enable\n the second results.\n See example :ref:`l-gpr-example` to see how to\n use this converter which does not behave exactly\n as the others.\n \"\"\"\n dtype = container.dtype\n if dtype is None:\n raise RuntimeError(\"dtype cannot be None\")\n X = operator.inputs[0]\n out = operator.outputs\n op = operator.raw_operator\n opv = container.target_opset\n if opv is None:\n raise RuntimeError(\"container.target_opset must not be None\")\n\n options = container.get_options(op, dict(return_cov=False,\n return_std=False,\n optim=None))\n if hasattr(op, 'kernel_') and op.kernel_ is not None:\n kernel = op.kernel_\n elif op.kernel is None:\n kernel = (C(1.0, constant_value_bounds=\"fixed\") *\n RBF(1.0, length_scale_bounds=\"fixed\"))\n else:\n kernel = op.kernel\n\n if not hasattr(op, \"X_train_\") or op.X_train_ is None:\n out0 = _zero_vector_of_size(X, keepdims=1, output_names=out[:1],\n dtype=dtype)\n\n outputs = [out0]\n if options['return_cov']:\n outputs.append(convert_kernel(kernel, X,\n output_names=out[1:],\n dtype=dtype,\n op_version=opv))\n if options['return_std']:\n outputs.append(\n OnnxSqrt(\n convert_kernel_diag(\n kernel, X, dtype=dtype, op_version=opv),\n output_names=out[1:], op_version=opv))\n else:\n out0 = _zero_vector_of_size(\n X, keepdims=1, dtype=dtype, op_version=opv)\n\n # Code scikit-learn\n # K_trans = self.kernel_(X, self.X_train_)\n # y_mean = K_trans.dot(self.alpha_) # Line 4 (y_mean = f_star)\n # y_mean = self._y_train_mean + y_mean # undo normal.\n\n k_trans = convert_kernel(kernel, X,\n x_train=op.X_train_.astype(dtype),\n dtype=dtype,\n optim=options.get('optim', None),\n op_version=opv)\n k_trans.set_onnx_name_prefix('kgpd')\n y_mean_b = OnnxMatMul(k_trans, op.alpha_.astype(dtype), op_version=opv)\n\n mean_y = op._y_train_mean.astype(dtype)\n if len(mean_y.shape) == 1:\n mean_y = mean_y.reshape(mean_y.shape + (1,))\n y_mean = OnnxAdd(y_mean_b, mean_y,\n output_names=out[:1],\n op_version=opv)\n y_mean.set_onnx_name_prefix('gpr')\n outputs = [y_mean]\n\n if options['return_cov']:\n raise NotImplementedError()\n if options['return_std']:\n if op._K_inv is None:\n raise RuntimeError(\n \"The method *predict* must be called once with parameter \"\n \"return_std=True to compute internal variables. 
\"\n \"They cannot be computed here as the same operation \"\n \"(matrix inversion) produces too many discrepencies \"\n \"if done with single floats than double floats.\")\n _K_inv = op._K_inv\n\n # y_var = self.kernel_.diag(X)\n y_var = convert_kernel_diag(kernel, X, dtype=dtype,\n optim=options.get('optim', None),\n op_version=opv)\n\n # y_var -= np.einsum(\"ij,ij->i\",\n # np.dot(K_trans, self._K_inv), K_trans)\n k_dot = OnnxMatMul(k_trans, _K_inv.astype(dtype), op_version=opv)\n ys_var = OnnxSub(\n y_var, OnnxReduceSum(\n OnnxMul(k_dot, k_trans, op_version=opv),\n axes=[1], keepdims=0, op_version=opv),\n op_version=opv)\n\n # y_var_negative = y_var < 0\n # if np.any(y_var_negative):\n # y_var[y_var_negative] = 0.0\n ys0_var = OnnxMax(ys_var, np.array([0], dtype=dtype),\n op_version=opv)\n\n # var = np.sqrt(ys0_var)\n var = OnnxSqrt(ys0_var, output_names=out[1:], op_version=opv)\n var.set_onnx_name_prefix('gprv')\n outputs.append(var)\n\n for o in outputs:\n o.add_to(scope, container)\n\n\nif OnnxConstantOfShape is not None:\n register_converter('SklearnGaussianProcessRegressor',\n convert_gaussian_process_regressor,\n options={'return_cov': [False, True],\n 'return_std': [False, True],\n 'optim': [None, 'cdist']})\n", "path": "skl2onnx/operator_converters/gaussian_process.py"}], "after_files": [{"content": "# -------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for\n# license information.\n# --------------------------------------------------------------------------\nimport numpy as np\nfrom sklearn.gaussian_process.kernels import ConstantKernel as C, RBF\nfrom ..common._registration import register_converter\nfrom ..algebra.onnx_ops import (\n OnnxAdd, OnnxSqrt, OnnxMatMul, OnnxSub, OnnxReduceSum,\n OnnxMul, OnnxMax\n)\ntry:\n from ..algebra.onnx_ops import OnnxConstantOfShape\nexcept ImportError:\n OnnxConstantOfShape = None\n\nfrom ._gp_kernels import (\n convert_kernel_diag,\n convert_kernel,\n _zero_vector_of_size\n)\n\n\ndef convert_gaussian_process_regressor(scope, operator, container):\n \"\"\"\n The method *predict* from class *GaussianProcessRegressor*\n may cache some results if it is called with parameter\n ``return_std=True`` or ``return_cov=True``. 
This converter\n needs to be called with theses options to enable\n the second results.\n See example :ref:`l-gpr-example` to see how to\n use this converter which does not behave exactly\n as the others.\n \"\"\"\n dtype = container.dtype\n if dtype is None:\n raise RuntimeError(\"dtype cannot be None\")\n X = operator.inputs[0]\n out = operator.outputs\n op = operator.raw_operator\n opv = container.target_opset\n if opv is None:\n raise RuntimeError(\"container.target_opset must not be None\")\n\n options = container.get_options(op, dict(return_cov=False,\n return_std=False,\n optim=None))\n if hasattr(op, 'kernel_') and op.kernel_ is not None:\n kernel = op.kernel_\n elif op.kernel is None:\n kernel = (C(1.0, constant_value_bounds=\"fixed\") *\n RBF(1.0, length_scale_bounds=\"fixed\"))\n else:\n kernel = op.kernel\n\n if not hasattr(op, \"X_train_\") or op.X_train_ is None:\n out0 = _zero_vector_of_size(X, keepdims=1, output_names=out[:1],\n dtype=dtype)\n\n outputs = [out0]\n if options['return_cov']:\n outputs.append(convert_kernel(kernel, X,\n output_names=out[1:],\n dtype=dtype,\n op_version=opv))\n if options['return_std']:\n outputs.append(\n OnnxSqrt(\n convert_kernel_diag(\n kernel, X, dtype=dtype, op_version=opv),\n output_names=out[1:], op_version=opv))\n else:\n out0 = _zero_vector_of_size(\n X, keepdims=1, dtype=dtype, op_version=opv)\n\n # Code scikit-learn\n # K_trans = self.kernel_(X, self.X_train_)\n # y_mean = K_trans.dot(self.alpha_) # Line 4 (y_mean = f_star)\n # y_mean = self._y_train_mean + y_mean # undo normal.\n\n k_trans = convert_kernel(kernel, X,\n x_train=op.X_train_.astype(dtype),\n dtype=dtype,\n optim=options.get('optim', None),\n op_version=opv)\n k_trans.set_onnx_name_prefix('kgpd')\n y_mean_b = OnnxMatMul(k_trans, op.alpha_.astype(dtype), op_version=opv)\n\n mean_y = op._y_train_mean.astype(dtype)\n if len(mean_y.shape) == 1:\n mean_y = mean_y.reshape(mean_y.shape + (1,))\n\n if not hasattr(op, '_y_train_std') or op._y_train_std == 1:\n y_mean = OnnxAdd(y_mean_b, mean_y, output_names=out[:1],\n op_version=opv)\n else:\n # A bug was fixed in 0.23 and it changed\n # the predictions when return_std is True.\n # See https://github.com/scikit-learn/scikit-learn/pull/15782.\n # y_mean = self._y_train_std * y_mean + self._y_train_mean\n var_y = op._y_train_std.astype(dtype)\n if len(var_y.shape) == 1:\n var_y = var_y.reshape(var_y.shape + (1,))\n y_mean = OnnxAdd(\n OnnxMul(y_mean_b, var_y, op_version=opv),\n mean_y, output_names=out[:1], op_version=opv)\n\n y_mean.set_onnx_name_prefix('gpr')\n outputs = [y_mean]\n\n if options['return_cov']:\n raise NotImplementedError()\n if options['return_std']:\n if op._K_inv is None:\n raise RuntimeError(\n \"The method *predict* must be called once with parameter \"\n \"return_std=True to compute internal variables. 
\"\n \"They cannot be computed here as the same operation \"\n \"(matrix inversion) produces too many discrepencies \"\n \"if done with single floats than double floats.\")\n _K_inv = op._K_inv\n\n # y_var = self.kernel_.diag(X)\n y_var = convert_kernel_diag(kernel, X, dtype=dtype,\n optim=options.get('optim', None),\n op_version=opv)\n\n # y_var -= np.einsum(\"ij,ij->i\",\n # np.dot(K_trans, self._K_inv), K_trans)\n k_dot = OnnxMatMul(k_trans, _K_inv.astype(dtype), op_version=opv)\n ys_var = OnnxSub(\n y_var, OnnxReduceSum(\n OnnxMul(k_dot, k_trans, op_version=opv),\n axes=[1], keepdims=0, op_version=opv),\n op_version=opv)\n\n # y_var_negative = y_var < 0\n # if np.any(y_var_negative):\n # y_var[y_var_negative] = 0.0\n ys0_var = OnnxMax(ys_var, np.array([0], dtype=dtype),\n op_version=opv)\n\n if hasattr(op, '_y_train_std') and op._y_train_std != 1:\n # y_var = y_var * self._y_train_std**2\n ys0_var = OnnxMul(ys0_var, var_y ** 2, op_version=opv)\n\n # var = np.sqrt(ys0_var)\n var = OnnxSqrt(ys0_var, output_names=out[1:], op_version=opv)\n var.set_onnx_name_prefix('gprv')\n outputs.append(var)\n\n for o in outputs:\n o.add_to(scope, container)\n\n\nif OnnxConstantOfShape is not None:\n register_converter('SklearnGaussianProcessRegressor',\n convert_gaussian_process_regressor,\n options={'return_cov': [False, True],\n 'return_std': [False, True],\n 'optim': [None, 'cdist']})\n", "path": "skl2onnx/operator_converters/gaussian_process.py"}]}
| 1,902 | 532 |
gh_patches_debug_19106
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-2319
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CKV_AWS_40: failure even when not setting users
**Describe the issue**
The check fails when executing checkov on Terraform plan's JSON output.
**Examples**
```
resource "aws_iam_policy_attachment" "attachment" {
...
roles = [...]
# no users
}
```
JSON:
```
{
"address": "aws_iam_policy_attachment.attachment",
...
"values": {
...
"roles": [
"data-analytics@eng-0"
],
"users": []
},
...
```
The `users` field is set to `[]` in JSON, and the [check implementation](https://github.com/bridgecrewio/checkov/blob/e2538c48df14363d6ed46e5b838e19cc71ba6cbf/checkov/terraform/checks/resource/base_resource_negative_value_check.py#L39-L53) doesn't handle this scenario correctly:
https://github.com/bridgecrewio/checkov/blob/e2538c48df14363d6ed46e5b838e19cc71ba6cbf/checkov/terraform/checks/resource/base_resource_negative_value_check.py#L39-L53
**Version (please complete the following information):**
```
> checkov --version
2.0.780
```
I saw there are tests for the check implementation, but only targeting Terraform source files. Are there tests targeting JSON output from Terraform plan?
--- END ISSUE ---
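The failure mode distills to a few lines: a plan-JSON resource delivers `users` as `[]`, which is neither `None` nor a single-element list, so it falls through to the `ANY_VALUE` comparison and fails. A stripped-down sketch of that control flow (the names are illustrative, not Checkov's real API):

```python
# Hypothetical reduction of the check's value handling; names are illustrative.
ANY_VALUE = "*"

def scan(value, bad_values, missing_result="PASSED"):
    if isinstance(value, list) and len(value) == 1:
        value = value[0]
    if value is None:
        return missing_result          # absent attribute: handled correctly
    if value in bad_values or ANY_VALUE in bad_values:
        return "FAILED"
    return "PASSED"

print(scan(None, [ANY_VALUE]))  # PASSED -- attribute missing from HCL source
print(scan([], [ANY_VALUE]))    # FAILED -- empty list from plan JSON, should also pass
```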
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/base_resource_negative_value_check.py`
Content:
```
1 from abc import abstractmethod
2 from collections.abc import Iterable
3 from typing import List, Dict, Any, Optional
4
5 import dpath
6
7 from checkov.common.models.consts import ANY_VALUE
8 from checkov.common.models.enums import CheckResult, CheckCategories
9 from checkov.common.util.type_forcers import force_list
10 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
11 from checkov.terraform.graph_builder.utils import get_referenced_vertices_in_value
12 from checkov.terraform.parser_functions import handle_dynamic_values
13
14
15 class BaseResourceNegativeValueCheck(BaseResourceCheck):
16 def __init__(
17 self,
18 name: str,
19 id: str,
20 categories: "Iterable[CheckCategories]",
21 supported_resources: "Iterable[str]",
22 missing_attribute_result: CheckResult = CheckResult.PASSED,
23 ) -> None:
24 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
25 self.missing_attribute_result = missing_attribute_result
26
27 def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:
28 handle_dynamic_values(conf)
29
30 excluded_key = self.get_excluded_key()
31 if excluded_key is not None:
32 if dpath.search(conf, excluded_key) != {}:
33 value = dpath.get(conf, excluded_key)
34 if isinstance(value, list) and len(value) == 1:
35 value = value[0]
36 if self.check_excluded_condition(value):
37 return CheckResult.PASSED
38
39 inspected_key = self.get_inspected_key()
40 bad_values = self.get_forbidden_values()
41 if dpath.search(conf, inspected_key) != {}:
42 value = dpath.get(conf, inspected_key)
43 if isinstance(value, list) and len(value) == 1:
44 value = value[0]
45 if get_referenced_vertices_in_value(value=value, aliases={}, resources_types=[]):
46 # we don't provide resources_types as we want to stay provider agnostic
47 return CheckResult.UNKNOWN
48 if value is None:
49 return self.missing_attribute_result
50 if value in bad_values or ANY_VALUE in bad_values:
51 return CheckResult.FAILED
52 else:
53 return CheckResult.PASSED
54
55 return self.missing_attribute_result
56
57 @abstractmethod
58 def get_inspected_key(self) -> str:
59 """
60 :return: JSONPath syntax path of the checked attribute
61 """
62 raise NotImplementedError()
63
64 @abstractmethod
65 def get_forbidden_values(self) -> List[Any]:
66 """
67 Returns a list of vulnerable values for the inspected key, governed by provider best practices
68 """
69 raise NotImplementedError()
70
71 def get_excluded_key(self) -> Optional[str]:
72 """
73 :return: JSONPath syntax path of the an attribute that provides exclusion condition for the inspected key
74 """
75 return None
76
77 def check_excluded_condition(self, value: str) -> bool:
78 """
79 :param: value: value for excluded_key
80 :return: True if the value should exclude the check from failing if the inspected key has a bad value
81 """
82 return False
83
84 def get_evaluated_keys(self) -> List[str]:
85 return force_list(self.get_inspected_key())
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/checks/resource/base_resource_negative_value_check.py b/checkov/terraform/checks/resource/base_resource_negative_value_check.py
--- a/checkov/terraform/checks/resource/base_resource_negative_value_check.py
+++ b/checkov/terraform/checks/resource/base_resource_negative_value_check.py
@@ -42,11 +42,11 @@
value = dpath.get(conf, inspected_key)
if isinstance(value, list) and len(value) == 1:
value = value[0]
+ if value is None or (isinstance(value, list) and not value):
+ return self.missing_attribute_result
if get_referenced_vertices_in_value(value=value, aliases={}, resources_types=[]):
# we don't provide resources_types as we want to stay provider agnostic
return CheckResult.UNKNOWN
- if value is None:
- return self.missing_attribute_result
if value in bad_values or ANY_VALUE in bad_values:
return CheckResult.FAILED
else:
|
{"golden_diff": "diff --git a/checkov/terraform/checks/resource/base_resource_negative_value_check.py b/checkov/terraform/checks/resource/base_resource_negative_value_check.py\n--- a/checkov/terraform/checks/resource/base_resource_negative_value_check.py\n+++ b/checkov/terraform/checks/resource/base_resource_negative_value_check.py\n@@ -42,11 +42,11 @@\n value = dpath.get(conf, inspected_key)\n if isinstance(value, list) and len(value) == 1:\n value = value[0]\n+ if value is None or (isinstance(value, list) and not value):\n+ return self.missing_attribute_result\n if get_referenced_vertices_in_value(value=value, aliases={}, resources_types=[]):\n # we don't provide resources_types as we want to stay provider agnostic\n return CheckResult.UNKNOWN\n- if value is None:\n- return self.missing_attribute_result\n if value in bad_values or ANY_VALUE in bad_values:\n return CheckResult.FAILED\n else:\n", "issue": "CKV_AWS_40: failure even when not setting users\n**Describe the issue**\r\nThe check fails when executing checkov on Terraform plan's JSON output.\r\n\r\n**Examples**\r\n```\r\nresource \"aws_iam_policy_attachment\" \"attachment\" {\r\n ...\r\n roles = [...]\r\n # no users\r\n}\r\n```\r\nJSON:\r\n```\r\n{\r\n \"address\": \"aws_iam_policy_attachment.attachment\",\r\n ...\r\n \"values\": {\r\n ...\r\n \"roles\": [\r\n \"data-analytics@eng-0\"\r\n ],\r\n \"users\": []\r\n },\r\n...\r\n```\r\nThe `users` field is set to `[]` in JSON, and the [check implementation](https://github.com/bridgecrewio/checkov/blob/e2538c48df14363d6ed46e5b838e19cc71ba6cbf/checkov/terraform/checks/resource/base_resource_negative_value_check.py#L39-L53) doesn't handle this scenario correctly:\r\n\r\nhttps://github.com/bridgecrewio/checkov/blob/e2538c48df14363d6ed46e5b838e19cc71ba6cbf/checkov/terraform/checks/resource/base_resource_negative_value_check.py#L39-L53\r\n\r\n**Version (please complete the following information):**\r\n```\r\n> checkov --version\r\n2.0.780\r\n```\r\n\r\nI saw there are tests for the check implementation, but only targeting Terraform source files. 
Are there tests targeting JSON output from Terraform plan?\n", "before_files": [{"content": "from abc import abstractmethod\nfrom collections.abc import Iterable\nfrom typing import List, Dict, Any, Optional\n\nimport dpath\n\nfrom checkov.common.models.consts import ANY_VALUE\nfrom checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.common.util.type_forcers import force_list\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom checkov.terraform.graph_builder.utils import get_referenced_vertices_in_value\nfrom checkov.terraform.parser_functions import handle_dynamic_values\n\n\nclass BaseResourceNegativeValueCheck(BaseResourceCheck):\n def __init__(\n self,\n name: str,\n id: str,\n categories: \"Iterable[CheckCategories]\",\n supported_resources: \"Iterable[str]\",\n missing_attribute_result: CheckResult = CheckResult.PASSED,\n ) -> None:\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n self.missing_attribute_result = missing_attribute_result\n\n def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:\n handle_dynamic_values(conf)\n\n excluded_key = self.get_excluded_key()\n if excluded_key is not None:\n if dpath.search(conf, excluded_key) != {}:\n value = dpath.get(conf, excluded_key)\n if isinstance(value, list) and len(value) == 1:\n value = value[0]\n if self.check_excluded_condition(value):\n return CheckResult.PASSED\n\n inspected_key = self.get_inspected_key()\n bad_values = self.get_forbidden_values()\n if dpath.search(conf, inspected_key) != {}:\n value = dpath.get(conf, inspected_key)\n if isinstance(value, list) and len(value) == 1:\n value = value[0]\n if get_referenced_vertices_in_value(value=value, aliases={}, resources_types=[]):\n # we don't provide resources_types as we want to stay provider agnostic\n return CheckResult.UNKNOWN\n if value is None:\n return self.missing_attribute_result\n if value in bad_values or ANY_VALUE in bad_values:\n return CheckResult.FAILED\n else:\n return CheckResult.PASSED\n\n return self.missing_attribute_result\n\n @abstractmethod\n def get_inspected_key(self) -> str:\n \"\"\"\n :return: JSONPath syntax path of the checked attribute\n \"\"\"\n raise NotImplementedError()\n\n @abstractmethod\n def get_forbidden_values(self) -> List[Any]:\n \"\"\"\n Returns a list of vulnerable values for the inspected key, governed by provider best practices\n \"\"\"\n raise NotImplementedError()\n\n def get_excluded_key(self) -> Optional[str]:\n \"\"\"\n :return: JSONPath syntax path of the an attribute that provides exclusion condition for the inspected key\n \"\"\"\n return None\n\n def check_excluded_condition(self, value: str) -> bool:\n \"\"\"\n :param: value: value for excluded_key\n :return: True if the value should exclude the check from failing if the inspected key has a bad value\n \"\"\"\n return False\n\n def get_evaluated_keys(self) -> List[str]:\n return force_list(self.get_inspected_key())\n", "path": "checkov/terraform/checks/resource/base_resource_negative_value_check.py"}], "after_files": [{"content": "from abc import abstractmethod\nfrom collections.abc import Iterable\nfrom typing import List, Dict, Any, Optional\n\nimport dpath\n\nfrom checkov.common.models.consts import ANY_VALUE\nfrom checkov.common.models.enums import CheckResult, CheckCategories\nfrom checkov.common.util.type_forcers import force_list\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom 
checkov.terraform.graph_builder.utils import get_referenced_vertices_in_value\nfrom checkov.terraform.parser_functions import handle_dynamic_values\n\n\nclass BaseResourceNegativeValueCheck(BaseResourceCheck):\n def __init__(\n self,\n name: str,\n id: str,\n categories: \"Iterable[CheckCategories]\",\n supported_resources: \"Iterable[str]\",\n missing_attribute_result: CheckResult = CheckResult.PASSED,\n ) -> None:\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n self.missing_attribute_result = missing_attribute_result\n\n def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:\n handle_dynamic_values(conf)\n\n excluded_key = self.get_excluded_key()\n if excluded_key is not None:\n if dpath.search(conf, excluded_key) != {}:\n value = dpath.get(conf, excluded_key)\n if isinstance(value, list) and len(value) == 1:\n value = value[0]\n if self.check_excluded_condition(value):\n return CheckResult.PASSED\n\n inspected_key = self.get_inspected_key()\n bad_values = self.get_forbidden_values()\n if dpath.search(conf, inspected_key) != {}:\n value = dpath.get(conf, inspected_key)\n if isinstance(value, list) and len(value) == 1:\n value = value[0]\n if value is None or (isinstance(value, list) and not value):\n return self.missing_attribute_result\n if get_referenced_vertices_in_value(value=value, aliases={}, resources_types=[]):\n # we don't provide resources_types as we want to stay provider agnostic\n return CheckResult.UNKNOWN\n if value in bad_values or ANY_VALUE in bad_values:\n return CheckResult.FAILED\n else:\n return CheckResult.PASSED\n\n return self.missing_attribute_result\n\n @abstractmethod\n def get_inspected_key(self) -> str:\n \"\"\"\n :return: JSONPath syntax path of the checked attribute\n \"\"\"\n raise NotImplementedError()\n\n @abstractmethod\n def get_forbidden_values(self) -> List[Any]:\n \"\"\"\n Returns a list of vulnerable values for the inspected key, governed by provider best practices\n \"\"\"\n raise NotImplementedError()\n\n def get_excluded_key(self) -> Optional[str]:\n \"\"\"\n :return: JSONPath syntax path of the an attribute that provides exclusion condition for the inspected key\n \"\"\"\n return None\n\n def check_excluded_condition(self, value: str) -> bool:\n \"\"\"\n :param: value: value for excluded_key\n :return: True if the value should exclude the check from failing if the inspected key has a bad value\n \"\"\"\n return False\n\n def get_evaluated_keys(self) -> List[str]:\n return force_list(self.get_inspected_key())\n", "path": "checkov/terraform/checks/resource/base_resource_negative_value_check.py"}]}
| 1,450 | 218 |
gh_patches_debug_30698
|
rasdani/github-patches
|
git_diff
|
digitalfabrik__integreat-cms-602
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Show latest feedback in dashboard
Show admins and content creators the latest feedback from app users in the dashboard. This should be a list of the last ~5 messages. In some cases the feedback only contains a thumbs up or down; in other cases it can contain a message. The title of the page or event concerned should be displayed as well and linked to the editing page.
Additional option: Add a link that sends the message string to translate.google.com or deepl.com for translation. This can be useful if the feedback is given in a language the back-end user does not understand. It is perfectly fine if the link opens the translation website in a new tab. No need to fetch a translation via the API.
--- END ISSUE ---
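A minimal sketch of how such a list could be assembled for the dashboard context; the model, field names, and import path below are assumptions, not the project's actual schema:

```python
# Sketch only; the model, field names, and import path are assumptions.
from urllib.parse import quote

from cms.models import Feedback  # hypothetical import path


def latest_feedback_context(limit=5):
    entries = Feedback.objects.order_by("-created_date")[:limit]  # assumed timestamp field
    return {
        "all_feedback": entries,
        # Optional helper: open the raw comment on Google Translate in a new tab.
        "translate_links": [
            "https://translate.google.com/?sl=auto&tl=en&text=" + quote(e.comment or "")
            for e in entries  # 'comment' is an assumed field name
        ],
    }
```

The translate link simply hands the raw comment text to the translation website in a new tab, so no translation API call is needed.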
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cms/views/dashboard/admin_dashboard_view.py`
Content:
```
1 from django.contrib.auth.decorators import login_required
2 from django.shortcuts import render
3 from django.utils.decorators import method_decorator
4 from django.views.generic import TemplateView
5
6 from ...decorators import staff_required
7
8
9 @method_decorator(login_required, name="dispatch")
10 @method_decorator(staff_required, name="dispatch")
11 class AdminDashboardView(TemplateView):
12 """
13 View for the admin dashboard
14 """
15
16 #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)
17 template_name = "dashboard/admin_dashboard.html"
18 #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)
19 base_context = {"current_menu_item": "admin_dashboard"}
20
21 def get(self, request, *args, **kwargs):
22 """
23 Render admin dashboard
24
25 :param request: Object representing the user call
26 :type request: ~django.http.HttpRequest
27
28 :param args: The supplied arguments
29 :type args: list
30
31 :param kwargs: The supplied keyword arguments
32 :type kwargs: dict
33
34 :return: The rendered template response
35 :rtype: ~django.template.response.TemplateResponse
36 """
37
38 val = "To be defined"
39 return render(request, self.template_name, {"key": val, **self.base_context})
40
```
Path: `src/cms/views/dashboard/dashboard_view.py`
Content:
```
1 import html
2 from urllib.parse import urlparse
3 import feedparser
4
5 from django.contrib.auth.decorators import login_required
6 from django.shortcuts import render
7 from django.utils import translation
8 from django.utils.decorators import method_decorator
9 from django.views.generic import TemplateView
10
11 from backend.settings import RSS_FEED_URLS
12 from ...decorators import region_permission_required
13
14
15 @method_decorator(login_required, name="dispatch")
16 @method_decorator(region_permission_required, name="dispatch")
17 class DashboardView(TemplateView):
18 """
19 View for the region dashboard
20 """
21
22 #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)
23 template_name = "dashboard/dashboard.html"
24 #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)
25 base_context = {"current_menu_item": "region_dashboard"}
26
27 def get(self, request, *args, **kwargs):
28 """
29 Render the region dashboard
30
31 :param request: Object representing the user call
32 :type request: ~django.http.HttpRequest
33
34 :param args: The supplied arguments
35 :type args: list
36
37 :param kwargs: The supplied keyword arguments
38 :type kwargs: dict
39
40 :return: The rendered template response
41 :rtype: ~django.template.response.TemplateResponse
42 """
43
44 val = "To be defined"
45 language_code = translation.get_language()
46 feed = feedparser.parse(RSS_FEED_URLS[language_code])
47 # select five most recent feeds
48 feed["entries"] = feed["entries"][:5]
49 # decode html entities like dash and split after line break
50 for entry in feed["entries"]:
51 entry["summary"] = html.unescape(entry["summary"]).split("\n")[0]
52 domain = urlparse(RSS_FEED_URLS["home-page"]).netloc
53 return render(
54 request,
55 self.template_name,
56 {
57 "key": val,
58 **self.base_context,
59 "feed": feed,
60 "home_page": RSS_FEED_URLS["home-page"],
61 "domain": domain,
62 },
63 )
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cms/views/dashboard/admin_dashboard_view.py b/src/cms/views/dashboard/admin_dashboard_view.py
--- a/src/cms/views/dashboard/admin_dashboard_view.py
+++ b/src/cms/views/dashboard/admin_dashboard_view.py
@@ -4,6 +4,7 @@
from django.views.generic import TemplateView
from ...decorators import staff_required
+from ...models import Feedback
@method_decorator(login_required, name="dispatch")
@@ -34,6 +35,14 @@
:return: The rendered template response
:rtype: ~django.template.response.TemplateResponse
"""
-
- val = "To be defined"
- return render(request, self.template_name, {"key": val, **self.base_context})
+ all_feedback = Feedback.objects.filter(is_technical=True)[:5]
+
+ return render(
+ request,
+ self.template_name,
+ {
+ "current_menu_item": "admin_feedback",
+ "all_feedback": all_feedback,
+ **self.base_context,
+ },
+ )
diff --git a/src/cms/views/dashboard/dashboard_view.py b/src/cms/views/dashboard/dashboard_view.py
--- a/src/cms/views/dashboard/dashboard_view.py
+++ b/src/cms/views/dashboard/dashboard_view.py
@@ -41,7 +41,6 @@
:rtype: ~django.template.response.TemplateResponse
"""
- val = "To be defined"
language_code = translation.get_language()
feed = feedparser.parse(RSS_FEED_URLS[language_code])
# select five most recent feeds
@@ -54,7 +53,6 @@
request,
self.template_name,
{
- "key": val,
**self.base_context,
"feed": feed,
"home_page": RSS_FEED_URLS["home-page"],
|
{"golden_diff": "diff --git a/src/cms/views/dashboard/admin_dashboard_view.py b/src/cms/views/dashboard/admin_dashboard_view.py\n--- a/src/cms/views/dashboard/admin_dashboard_view.py\n+++ b/src/cms/views/dashboard/admin_dashboard_view.py\n@@ -4,6 +4,7 @@\n from django.views.generic import TemplateView\n \n from ...decorators import staff_required\n+from ...models import Feedback\n \n \n @method_decorator(login_required, name=\"dispatch\")\n@@ -34,6 +35,14 @@\n :return: The rendered template response\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n-\n- val = \"To be defined\"\n- return render(request, self.template_name, {\"key\": val, **self.base_context})\n+ all_feedback = Feedback.objects.filter(is_technical=True)[:5]\n+\n+ return render(\n+ request,\n+ self.template_name,\n+ {\n+ \"current_menu_item\": \"admin_feedback\",\n+ \"all_feedback\": all_feedback,\n+ **self.base_context,\n+ },\n+ )\ndiff --git a/src/cms/views/dashboard/dashboard_view.py b/src/cms/views/dashboard/dashboard_view.py\n--- a/src/cms/views/dashboard/dashboard_view.py\n+++ b/src/cms/views/dashboard/dashboard_view.py\n@@ -41,7 +41,6 @@\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n \n- val = \"To be defined\"\n language_code = translation.get_language()\n feed = feedparser.parse(RSS_FEED_URLS[language_code])\n # select five most recent feeds\n@@ -54,7 +53,6 @@\n request,\n self.template_name,\n {\n- \"key\": val,\n **self.base_context,\n \"feed\": feed,\n \"home_page\": RSS_FEED_URLS[\"home-page\"],\n", "issue": "Show latest feedback in dashboard\nShow admins and content creators the latest feedback from app users in dashboard. This should be a list of the last ~5 messages. In some cases, the feedback only contains a thumbs up or down, in other cases it can contain a message. The title of the page or event concerned should be displayed as well and linked to the editing page.\r\n\r\nAdditional option: Add link that sends the message string to translate.google.com or deepl.com for translation. This can be useful if the feedback is given in a language the back end user does not understand. It is perfectly fine, if the link opens the translation website in a new tab. 
No need to fetch a translation via the API.\n", "before_files": [{"content": "from django.contrib.auth.decorators import login_required\nfrom django.shortcuts import render\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import TemplateView\n\nfrom ...decorators import staff_required\n\n\n@method_decorator(login_required, name=\"dispatch\")\n@method_decorator(staff_required, name=\"dispatch\")\nclass AdminDashboardView(TemplateView):\n \"\"\"\n View for the admin dashboard\n \"\"\"\n\n #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)\n template_name = \"dashboard/admin_dashboard.html\"\n #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)\n base_context = {\"current_menu_item\": \"admin_dashboard\"}\n\n def get(self, request, *args, **kwargs):\n \"\"\"\n Render admin dashboard\n\n :param request: Object representing the user call\n :type request: ~django.http.HttpRequest\n\n :param args: The supplied arguments\n :type args: list\n\n :param kwargs: The supplied keyword arguments\n :type kwargs: dict\n\n :return: The rendered template response\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n\n val = \"To be defined\"\n return render(request, self.template_name, {\"key\": val, **self.base_context})\n", "path": "src/cms/views/dashboard/admin_dashboard_view.py"}, {"content": "import html\nfrom urllib.parse import urlparse\nimport feedparser\n\nfrom django.contrib.auth.decorators import login_required\nfrom django.shortcuts import render\nfrom django.utils import translation\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import TemplateView\n\nfrom backend.settings import RSS_FEED_URLS\nfrom ...decorators import region_permission_required\n\n\n@method_decorator(login_required, name=\"dispatch\")\n@method_decorator(region_permission_required, name=\"dispatch\")\nclass DashboardView(TemplateView):\n \"\"\"\n View for the region dashboard\n \"\"\"\n\n #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)\n template_name = \"dashboard/dashboard.html\"\n #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)\n base_context = {\"current_menu_item\": \"region_dashboard\"}\n\n def get(self, request, *args, **kwargs):\n \"\"\"\n Render the region dashboard\n\n :param request: Object representing the user call\n :type request: ~django.http.HttpRequest\n\n :param args: The supplied arguments\n :type args: list\n\n :param kwargs: The supplied keyword arguments\n :type kwargs: dict\n\n :return: The rendered template response\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n\n val = \"To be defined\"\n language_code = translation.get_language()\n feed = feedparser.parse(RSS_FEED_URLS[language_code])\n # select five most recent feeds\n feed[\"entries\"] = feed[\"entries\"][:5]\n # decode html entities like dash and split after line break\n for entry in feed[\"entries\"]:\n entry[\"summary\"] = html.unescape(entry[\"summary\"]).split(\"\\n\")[0]\n domain = urlparse(RSS_FEED_URLS[\"home-page\"]).netloc\n return render(\n request,\n self.template_name,\n {\n \"key\": val,\n **self.base_context,\n \"feed\": feed,\n \"home_page\": RSS_FEED_URLS[\"home-page\"],\n \"domain\": domain,\n },\n )\n", "path": "src/cms/views/dashboard/dashboard_view.py"}], "after_files": [{"content": "from django.contrib.auth.decorators import login_required\nfrom django.shortcuts 
import render\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import TemplateView\n\nfrom ...decorators import staff_required\nfrom ...models import Feedback\n\n\n@method_decorator(login_required, name=\"dispatch\")\n@method_decorator(staff_required, name=\"dispatch\")\nclass AdminDashboardView(TemplateView):\n \"\"\"\n View for the admin dashboard\n \"\"\"\n\n #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)\n template_name = \"dashboard/admin_dashboard.html\"\n #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)\n base_context = {\"current_menu_item\": \"admin_dashboard\"}\n\n def get(self, request, *args, **kwargs):\n \"\"\"\n Render admin dashboard\n\n :param request: Object representing the user call\n :type request: ~django.http.HttpRequest\n\n :param args: The supplied arguments\n :type args: list\n\n :param kwargs: The supplied keyword arguments\n :type kwargs: dict\n\n :return: The rendered template response\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n all_feedback = Feedback.objects.filter(is_technical=True)[:5]\n\n return render(\n request,\n self.template_name,\n {\n \"current_menu_item\": \"admin_feedback\",\n \"all_feedback\": all_feedback,\n **self.base_context,\n },\n )\n", "path": "src/cms/views/dashboard/admin_dashboard_view.py"}, {"content": "import html\nfrom urllib.parse import urlparse\nimport feedparser\n\nfrom django.contrib.auth.decorators import login_required\nfrom django.shortcuts import render\nfrom django.utils import translation\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import TemplateView\n\nfrom backend.settings import RSS_FEED_URLS\nfrom ...decorators import region_permission_required\n\n\n@method_decorator(login_required, name=\"dispatch\")\n@method_decorator(region_permission_required, name=\"dispatch\")\nclass DashboardView(TemplateView):\n \"\"\"\n View for the region dashboard\n \"\"\"\n\n #: The template to render (see :class:`~django.views.generic.base.TemplateResponseMixin`)\n template_name = \"dashboard/dashboard.html\"\n #: The context dict passed to the template (see :class:`~django.views.generic.base.ContextMixin`)\n base_context = {\"current_menu_item\": \"region_dashboard\"}\n\n def get(self, request, *args, **kwargs):\n \"\"\"\n Render the region dashboard\n\n :param request: Object representing the user call\n :type request: ~django.http.HttpRequest\n\n :param args: The supplied arguments\n :type args: list\n\n :param kwargs: The supplied keyword arguments\n :type kwargs: dict\n\n :return: The rendered template response\n :rtype: ~django.template.response.TemplateResponse\n \"\"\"\n\n language_code = translation.get_language()\n feed = feedparser.parse(RSS_FEED_URLS[language_code])\n # select five most recent feeds\n feed[\"entries\"] = feed[\"entries\"][:5]\n # decode html entities like dash and split after line break\n for entry in feed[\"entries\"]:\n entry[\"summary\"] = html.unescape(entry[\"summary\"]).split(\"\\n\")[0]\n domain = urlparse(RSS_FEED_URLS[\"home-page\"]).netloc\n return render(\n request,\n self.template_name,\n {\n **self.base_context,\n \"feed\": feed,\n \"home_page\": RSS_FEED_URLS[\"home-page\"],\n \"domain\": domain,\n },\n )\n", "path": "src/cms/views/dashboard/dashboard_view.py"}]}
| 1,328 | 380 |
gh_patches_debug_25733
|
rasdani/github-patches
|
git_diff
|
Mailu__Mailu-731
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Increase attachment size limit
Hello,
How can we change the maximum attachment file size limit in Roundcube?
Version 1.5
--- END ISSUE ---
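Roundcube's attachment ceiling is governed by PHP's `upload_max_filesize` and `post_max_size` rather than by Roundcube itself, so a fix usually means generating a `php.ini` override from the mail system's message size limit. A hedged sketch; the 0.66 factor (headroom for base64/MIME overhead) and the output path are assumptions:

```python
# Sketch: derive PHP upload limits (MiB) from a message size limit (bytes).
import os

message_size_limit = int(os.environ.get("MESSAGE_SIZE_LIMIT", "50000000"))
max_filesize_mb = int(message_size_limit * 0.66 / 1048576)  # ~33% base64 overhead headroom

with open("/tmp/roundcube-upload.ini", "w") as f:  # illustrative output path
    f.write(f"upload_max_filesize = {max_filesize_mb}M\n")
    f.write(f"post_max_size = {max_filesize_mb}M\n")
```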
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `webmails/rainloop/start.py`
Content:
```
1 #!/usr/bin/python3
2
3 import jinja2
4 import os
5 import shutil
6
7 convert = lambda src, dst: open(dst, "w").write(jinja2.Template(open(src).read()).render(**os.environ))
8
9 # Actual startup script
10 os.environ["FRONT_ADDRESS"] = os.environ.get("FRONT_ADDRESS", "front")
11 os.environ["IMAP_ADDRESS"] = os.environ.get("IMAP_ADDRESS", "imap")
12
13 base = "/data/_data_/_default_/"
14 shutil.rmtree(base + "domains/", ignore_errors=True)
15 os.makedirs(base + "domains", exist_ok=True)
16 os.makedirs(base + "configs", exist_ok=True)
17
18 convert("/default.ini", "/data/_data_/_default_/domains/default.ini")
19 convert("/config.ini", "/data/_data_/_default_/configs/config.ini")
20
21 os.system("chown -R www-data:www-data /data")
22
23 os.execv("/usr/local/bin/apache2-foreground", ["apache2-foreground"])
24
25
```
Path: `webmails/roundcube/start.py`
Content:
```
1 #!/usr/bin/python3
2
3 import os
4
5 # Fix some permissions
6 os.system("mkdir -p /data/gpg")
7 os.system("chown -R www-data:www-data /data")
8
9 # Run apache
10 os.execv("/usr/local/bin/apache2-foreground", ["apache2-foreground"])
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/webmails/rainloop/start.py b/webmails/rainloop/start.py
--- a/webmails/rainloop/start.py
+++ b/webmails/rainloop/start.py
@@ -10,6 +10,8 @@
os.environ["FRONT_ADDRESS"] = os.environ.get("FRONT_ADDRESS", "front")
os.environ["IMAP_ADDRESS"] = os.environ.get("IMAP_ADDRESS", "imap")
+os.environ["MAX_FILESIZE"] = str(int(int(os.environ.get("MESSAGE_SIZE_LIMIT"))*0.66/1048576))
+
base = "/data/_data_/_default_/"
shutil.rmtree(base + "domains/", ignore_errors=True)
os.makedirs(base + "domains", exist_ok=True)
@@ -17,6 +19,7 @@
convert("/default.ini", "/data/_data_/_default_/domains/default.ini")
convert("/config.ini", "/data/_data_/_default_/configs/config.ini")
+convert("/php.ini", "/usr/local/etc/php/conf.d/rainloop.ini")
os.system("chown -R www-data:www-data /data")
diff --git a/webmails/roundcube/start.py b/webmails/roundcube/start.py
--- a/webmails/roundcube/start.py
+++ b/webmails/roundcube/start.py
@@ -1,6 +1,13 @@
#!/usr/bin/python3
import os
+import jinja2
+
+convert = lambda src, dst: open(dst, "w").write(jinja2.Template(open(src).read()).render(**os.environ))
+
+os.environ["MAX_FILESIZE"] = str(int(int(os.environ.get("MESSAGE_SIZE_LIMIT"))*0.66/1048576))
+
+convert("/php.ini", "/usr/local/etc/php/conf.d/roundcube.ini")
# Fix some permissions
os.system("mkdir -p /data/gpg")
|
{"golden_diff": "diff --git a/webmails/rainloop/start.py b/webmails/rainloop/start.py\n--- a/webmails/rainloop/start.py\n+++ b/webmails/rainloop/start.py\n@@ -10,6 +10,8 @@\n os.environ[\"FRONT_ADDRESS\"] = os.environ.get(\"FRONT_ADDRESS\", \"front\")\n os.environ[\"IMAP_ADDRESS\"] = os.environ.get(\"IMAP_ADDRESS\", \"imap\")\n \n+os.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n+\n base = \"/data/_data_/_default_/\"\n shutil.rmtree(base + \"domains/\", ignore_errors=True)\n os.makedirs(base + \"domains\", exist_ok=True)\n@@ -17,6 +19,7 @@\n \n convert(\"/default.ini\", \"/data/_data_/_default_/domains/default.ini\")\n convert(\"/config.ini\", \"/data/_data_/_default_/configs/config.ini\")\n+convert(\"/php.ini\", \"/usr/local/etc/php/conf.d/rainloop.ini\")\n \n os.system(\"chown -R www-data:www-data /data\")\n \ndiff --git a/webmails/roundcube/start.py b/webmails/roundcube/start.py\n--- a/webmails/roundcube/start.py\n+++ b/webmails/roundcube/start.py\n@@ -1,6 +1,13 @@\n #!/usr/bin/python3\n \n import os\n+import jinja2\n+\n+convert = lambda src, dst: open(dst, \"w\").write(jinja2.Template(open(src).read()).render(**os.environ))\n+\n+os.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n+\n+convert(\"/php.ini\", \"/usr/local/etc/php/conf.d/roundcube.ini\")\n \n # Fix some permissions\n os.system(\"mkdir -p /data/gpg\")\n", "issue": "Increase attachment size limit\nHello, \r\n\r\nHow can web change the max file limit of attachment in Roundcube ?\r\n\r\nVersion 1.5\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport jinja2\nimport os\nimport shutil\n\nconvert = lambda src, dst: open(dst, \"w\").write(jinja2.Template(open(src).read()).render(**os.environ))\n\n# Actual startup script\nos.environ[\"FRONT_ADDRESS\"] = os.environ.get(\"FRONT_ADDRESS\", \"front\")\nos.environ[\"IMAP_ADDRESS\"] = os.environ.get(\"IMAP_ADDRESS\", \"imap\")\n\nbase = \"/data/_data_/_default_/\"\nshutil.rmtree(base + \"domains/\", ignore_errors=True)\nos.makedirs(base + \"domains\", exist_ok=True)\nos.makedirs(base + \"configs\", exist_ok=True)\n\nconvert(\"/default.ini\", \"/data/_data_/_default_/domains/default.ini\")\nconvert(\"/config.ini\", \"/data/_data_/_default_/configs/config.ini\")\n\nos.system(\"chown -R www-data:www-data /data\")\n\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])\n\n", "path": "webmails/rainloop/start.py"}, {"content": "#!/usr/bin/python3\n\nimport os\n\n# Fix some permissions\nos.system(\"mkdir -p /data/gpg\")\nos.system(\"chown -R www-data:www-data /data\")\n\n# Run apache\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])", "path": "webmails/roundcube/start.py"}], "after_files": [{"content": "#!/usr/bin/python3\n\nimport jinja2\nimport os\nimport shutil\n\nconvert = lambda src, dst: open(dst, \"w\").write(jinja2.Template(open(src).read()).render(**os.environ))\n\n# Actual startup script\nos.environ[\"FRONT_ADDRESS\"] = os.environ.get(\"FRONT_ADDRESS\", \"front\")\nos.environ[\"IMAP_ADDRESS\"] = os.environ.get(\"IMAP_ADDRESS\", \"imap\")\n\nos.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n\nbase = \"/data/_data_/_default_/\"\nshutil.rmtree(base + \"domains/\", ignore_errors=True)\nos.makedirs(base + \"domains\", exist_ok=True)\nos.makedirs(base + \"configs\", exist_ok=True)\n\nconvert(\"/default.ini\", \"/data/_data_/_default_/domains/default.ini\")\nconvert(\"/config.ini\", 
\"/data/_data_/_default_/configs/config.ini\")\nconvert(\"/php.ini\", \"/usr/local/etc/php/conf.d/rainloop.ini\")\n\nos.system(\"chown -R www-data:www-data /data\")\n\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])\n\n", "path": "webmails/rainloop/start.py"}, {"content": "#!/usr/bin/python3\n\nimport os\nimport jinja2\n\nconvert = lambda src, dst: open(dst, \"w\").write(jinja2.Template(open(src).read()).render(**os.environ))\n\nos.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n\nconvert(\"/php.ini\", \"/usr/local/etc/php/conf.d/roundcube.ini\")\n\n# Fix some permissions\nos.system(\"mkdir -p /data/gpg\")\nos.system(\"chown -R www-data:www-data /data\")\n\n# Run apache\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])", "path": "webmails/roundcube/start.py"}]}
| 635 | 417 |
gh_patches_debug_9841
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-993
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Checkov logs to stdout making the json un-parseable
**Describe the bug**
When you run checkov with `-o json` and `LOG_LEVEL=WARNING`, the logs are printed to `stdout` together with the JSON object. This makes the JSON output hard and error-prone to parse.
**Expected behavior**
At least in `-o json` (or any parseable output) mode, it should write logs to stderr and use stdout only for the parseable output.
--- END ISSUE ---
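The standard remedy is to bind the log handler to `sys.stderr`, leaving stdout to carry nothing but the machine-readable payload. A minimal sketch:

```python
# Minimal sketch: keep stdout clean for machine-readable output, log to stderr.
import logging
import sys

handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(logging.Formatter("%(asctime)s [%(levelname)-5.5s] %(message)s"))
logging.getLogger().addHandler(handler)

logging.warning("diagnostic noise goes to stderr")
print('{"results": []}')  # only the parseable payload reaches stdout
```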
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/logging_init.py`
Content:
```
1 import sys
2
3 import logging
4 import os
5
6
7 def init():
8 LOG_LEVEL = os.environ.get('LOG_LEVEL', 'WARNING').upper()
9 logging.basicConfig(level=LOG_LEVEL)
10 logFormatter = logging.Formatter("%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s] %(message)s")
11 rootLogger = logging.getLogger()
12 consoleHandler = logging.StreamHandler(sys.stdout)
13 consoleHandler.setFormatter(logFormatter)
14 consoleHandler.setLevel(LOG_LEVEL)
15 rootLogger.addHandler(consoleHandler)
16 logging.getLogger("urllib3").setLevel(logging.ERROR)
17 logging.getLogger("urllib3.connectionpool").setLevel(logging.ERROR)
18 logging.getLogger("urllib3.connectionpool").propagate = False
19 logging.getLogger("urllib3").propagate = False
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/logging_init.py b/checkov/logging_init.py
--- a/checkov/logging_init.py
+++ b/checkov/logging_init.py
@@ -9,7 +9,7 @@
logging.basicConfig(level=LOG_LEVEL)
logFormatter = logging.Formatter("%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s] %(message)s")
rootLogger = logging.getLogger()
- consoleHandler = logging.StreamHandler(sys.stdout)
+ consoleHandler = logging.StreamHandler(sys.stderr)
consoleHandler.setFormatter(logFormatter)
consoleHandler.setLevel(LOG_LEVEL)
rootLogger.addHandler(consoleHandler)
|
{"golden_diff": "diff --git a/checkov/logging_init.py b/checkov/logging_init.py\n--- a/checkov/logging_init.py\n+++ b/checkov/logging_init.py\n@@ -9,7 +9,7 @@\n logging.basicConfig(level=LOG_LEVEL)\n logFormatter = logging.Formatter(\"%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s] %(message)s\")\n rootLogger = logging.getLogger()\n- consoleHandler = logging.StreamHandler(sys.stdout)\n+ consoleHandler = logging.StreamHandler(sys.stderr)\n consoleHandler.setFormatter(logFormatter)\n consoleHandler.setLevel(LOG_LEVEL)\n rootLogger.addHandler(consoleHandler)\n", "issue": "Checkov logs to stdout making the json un-parseable\n**Describe the bug**\r\nWhen you run checkov with the `-o json` and `LOG_LEVEL=WARNING` the logs are printed to `stdout` together with the json object. This way it's hard and error prone to parse the json output.\r\n\r\n**Expected behavior**\r\nAt least in `-o json` (or any parseable output) mode It should use stderr to log to and use stdout only for the parseable output and log everything else to stderr.\r\n\n", "before_files": [{"content": "import sys\n\nimport logging\nimport os\n\n\ndef init():\n LOG_LEVEL = os.environ.get('LOG_LEVEL', 'WARNING').upper()\n logging.basicConfig(level=LOG_LEVEL)\n logFormatter = logging.Formatter(\"%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s] %(message)s\")\n rootLogger = logging.getLogger()\n consoleHandler = logging.StreamHandler(sys.stdout)\n consoleHandler.setFormatter(logFormatter)\n consoleHandler.setLevel(LOG_LEVEL)\n rootLogger.addHandler(consoleHandler)\n logging.getLogger(\"urllib3\").setLevel(logging.ERROR)\n logging.getLogger(\"urllib3.connectionpool\").setLevel(logging.ERROR)\n logging.getLogger(\"urllib3.connectionpool\").propagate = False\n logging.getLogger(\"urllib3\").propagate = False\n", "path": "checkov/logging_init.py"}], "after_files": [{"content": "import sys\n\nimport logging\nimport os\n\n\ndef init():\n LOG_LEVEL = os.environ.get('LOG_LEVEL', 'WARNING').upper()\n logging.basicConfig(level=LOG_LEVEL)\n logFormatter = logging.Formatter(\"%(asctime)s [%(threadName)-12.12s] [%(levelname)-5.5s] %(message)s\")\n rootLogger = logging.getLogger()\n consoleHandler = logging.StreamHandler(sys.stderr)\n consoleHandler.setFormatter(logFormatter)\n consoleHandler.setLevel(LOG_LEVEL)\n rootLogger.addHandler(consoleHandler)\n logging.getLogger(\"urllib3\").setLevel(logging.ERROR)\n logging.getLogger(\"urllib3.connectionpool\").setLevel(logging.ERROR)\n logging.getLogger(\"urllib3.connectionpool\").propagate = False\n logging.getLogger(\"urllib3\").propagate = False\n", "path": "checkov/logging_init.py"}]}
| 568 | 137 |
gh_patches_debug_3547
|
rasdani/github-patches
|
git_diff
|
cisagov__manage.get.gov-199
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Reconfigure OIDC logout to send client_id
Login.gov recently changed their logout method to take `client_id` instead of the previous parameter `id_token_hint`. We need to change our code to match.

@SSPJ knows this code the best of any of us.
--- END ISSUE ---
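For reference, a sketch of the changed logout request; every value below is a placeholder, and only the parameter names reflect the described switch from `id_token_hint` to `client_id`:

```python
from urllib.parse import urlencode

request_args = {
    "client_id": "urn:gov:gsa:openidconnect:sp:example",  # placeholder issuer ID
    "state": "abc123",                                    # placeholder session state
    "post_logout_redirect_uri": "https://app.example.gov/logout/callback",  # placeholder
}
logout_url = "https://idp.example.gov/openid_connect/logout?" + urlencode(request_args)
print(logout_url)
```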
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/djangooidc/views.py`
Content:
```
1 # coding: utf-8
2
3 import logging
4
5 from django.conf import settings
6 from django.contrib.auth import logout as auth_logout
7 from django.contrib.auth import authenticate, login
8 from django.http import HttpResponseRedirect
9 from django.shortcuts import redirect, render
10 from urllib.parse import parse_qs, urlencode
11
12 from djangooidc.oidc import Client
13 from djangooidc import exceptions as o_e
14
15
16 logger = logging.getLogger(__name__)
17
18 try:
19 # Initialize provider using pyOICD
20 OP = getattr(settings, "OIDC_ACTIVE_PROVIDER")
21 CLIENT = Client(OP)
22 logger.debug("client initialized %s" % CLIENT)
23 except Exception as err:
24 CLIENT = None # type: ignore
25 logger.warning(err)
26 logger.warning("Unable to configure OpenID Connect provider. Users cannot log in.")
27
28
29 def error_page(request, error):
30 """Display a sensible message and log the error."""
31 logger.error(error)
32 if isinstance(error, o_e.AuthenticationFailed):
33 return render(
34 request,
35 "401.html",
36 context={
37 "friendly_message": error.friendly_message,
38 "log_identifier": error.locator,
39 },
40 status=401,
41 )
42 if isinstance(error, o_e.InternalError):
43 return render(
44 request,
45 "500.html",
46 context={
47 "friendly_message": error.friendly_message,
48 "log_identifier": error.locator,
49 },
50 status=500,
51 )
52 if isinstance(error, Exception):
53 return render(request, "500.html", status=500)
54
55
56 def openid(request):
57 """Redirect the user to an authentication provider (OP)."""
58 request.session["next"] = request.GET.get("next", "/")
59
60 try:
61 return CLIENT.create_authn_request(request.session)
62 except Exception as err:
63 return error_page(request, err)
64
65
66 def login_callback(request):
67 """Analyze the token returned by the authentication provider (OP)."""
68 try:
69 query = parse_qs(request.GET.urlencode())
70 userinfo = CLIENT.callback(query, request.session)
71 user = authenticate(request=request, **userinfo)
72 if user:
73 login(request, user)
74 logger.info("Successfully logged in user %s" % user)
75 return redirect(request.session.get("next", "/"))
76 else:
77 raise o_e.BannedUser()
78 except Exception as err:
79 return error_page(request, err)
80
81
82 def logout(request, next_page=None):
83 """Redirect the user to the authentication provider (OP) logout page."""
84 try:
85 username = request.user.username
86 request_args = {
87 # it is perfectly fine to send the token, even if it is expired
88 "id_token_hint": request.session["id_token_raw"],
89 "state": request.session["state"],
90 }
91 if (
92 "post_logout_redirect_uris" in CLIENT.registration_response.keys()
93 and len(CLIENT.registration_response["post_logout_redirect_uris"]) > 0
94 ):
95 request_args.update(
96 {
97 "post_logout_redirect_uri": CLIENT.registration_response[
98 "post_logout_redirect_uris"
99 ][0]
100 }
101 )
102
103 url = CLIENT.provider_info["end_session_endpoint"]
104 url += "?" + urlencode(request_args)
105 return HttpResponseRedirect(url)
106 except Exception as err:
107 return error_page(request, err)
108 finally:
109 # Always remove Django session stuff - even if not logged out from OP.
110 # Don't wait for the callback as it may never come.
111 auth_logout(request)
112 logger.info("Successfully logged out user %s" % username)
113 next_page = getattr(settings, "LOGOUT_REDIRECT_URL", None)
114 if next_page:
115 request.session["next"] = next_page
116
117
118 def logout_callback(request):
119 """Simple redirection view: after logout, redirect to `next`."""
120 next = request.session.get("next", "/")
121 return redirect(next)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/djangooidc/views.py b/src/djangooidc/views.py
--- a/src/djangooidc/views.py
+++ b/src/djangooidc/views.py
@@ -84,8 +84,7 @@
try:
username = request.user.username
request_args = {
- # it is perfectly fine to send the token, even if it is expired
- "id_token_hint": request.session["id_token_raw"],
+ "client_id": CLIENT.client_id,
"state": request.session["state"],
}
if (
|
{"golden_diff": "diff --git a/src/djangooidc/views.py b/src/djangooidc/views.py\n--- a/src/djangooidc/views.py\n+++ b/src/djangooidc/views.py\n@@ -84,8 +84,7 @@\n try:\n username = request.user.username\n request_args = {\n- # it is perfectly fine to send the token, even if it is expired\n- \"id_token_hint\": request.session[\"id_token_raw\"],\n+ \"client_id\": CLIENT.client_id,\n \"state\": request.session[\"state\"],\n }\n if (\n", "issue": "Reconfigure OIDC logout to send client_id\nLogin.gov recently changed their logout method to take `client_id` instead of the previous parameter `id_token_hint`. We need to change our code to match.\r\n\r\n\r\n\r\n@SSPJ knows this code the best of any of us.\n", "before_files": [{"content": "# coding: utf-8\n\nimport logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import logout as auth_logout\nfrom django.contrib.auth import authenticate, login\nfrom django.http import HttpResponseRedirect\nfrom django.shortcuts import redirect, render\nfrom urllib.parse import parse_qs, urlencode\n\nfrom djangooidc.oidc import Client\nfrom djangooidc import exceptions as o_e\n\n\nlogger = logging.getLogger(__name__)\n\ntry:\n # Initialize provider using pyOICD\n OP = getattr(settings, \"OIDC_ACTIVE_PROVIDER\")\n CLIENT = Client(OP)\n logger.debug(\"client initialized %s\" % CLIENT)\nexcept Exception as err:\n CLIENT = None # type: ignore\n logger.warning(err)\n logger.warning(\"Unable to configure OpenID Connect provider. Users cannot log in.\")\n\n\ndef error_page(request, error):\n \"\"\"Display a sensible message and log the error.\"\"\"\n logger.error(error)\n if isinstance(error, o_e.AuthenticationFailed):\n return render(\n request,\n \"401.html\",\n context={\n \"friendly_message\": error.friendly_message,\n \"log_identifier\": error.locator,\n },\n status=401,\n )\n if isinstance(error, o_e.InternalError):\n return render(\n request,\n \"500.html\",\n context={\n \"friendly_message\": error.friendly_message,\n \"log_identifier\": error.locator,\n },\n status=500,\n )\n if isinstance(error, Exception):\n return render(request, \"500.html\", status=500)\n\n\ndef openid(request):\n \"\"\"Redirect the user to an authentication provider (OP).\"\"\"\n request.session[\"next\"] = request.GET.get(\"next\", \"/\")\n\n try:\n return CLIENT.create_authn_request(request.session)\n except Exception as err:\n return error_page(request, err)\n\n\ndef login_callback(request):\n \"\"\"Analyze the token returned by the authentication provider (OP).\"\"\"\n try:\n query = parse_qs(request.GET.urlencode())\n userinfo = CLIENT.callback(query, request.session)\n user = authenticate(request=request, **userinfo)\n if user:\n login(request, user)\n logger.info(\"Successfully logged in user %s\" % user)\n return redirect(request.session.get(\"next\", \"/\"))\n else:\n raise o_e.BannedUser()\n except Exception as err:\n return error_page(request, err)\n\n\ndef logout(request, next_page=None):\n \"\"\"Redirect the user to the authentication provider (OP) logout page.\"\"\"\n try:\n username = request.user.username\n request_args = {\n # it is perfectly fine to send the token, even if it is expired\n \"id_token_hint\": request.session[\"id_token_raw\"],\n \"state\": request.session[\"state\"],\n }\n if (\n \"post_logout_redirect_uris\" in CLIENT.registration_response.keys()\n and len(CLIENT.registration_response[\"post_logout_redirect_uris\"]) > 0\n ):\n request_args.update(\n {\n \"post_logout_redirect_uri\": CLIENT.registration_response[\n 
\"post_logout_redirect_uris\"\n ][0]\n }\n )\n\n url = CLIENT.provider_info[\"end_session_endpoint\"]\n url += \"?\" + urlencode(request_args)\n return HttpResponseRedirect(url)\n except Exception as err:\n return error_page(request, err)\n finally:\n # Always remove Django session stuff - even if not logged out from OP.\n # Don't wait for the callback as it may never come.\n auth_logout(request)\n logger.info(\"Successfully logged out user %s\" % username)\n next_page = getattr(settings, \"LOGOUT_REDIRECT_URL\", None)\n if next_page:\n request.session[\"next\"] = next_page\n\n\ndef logout_callback(request):\n \"\"\"Simple redirection view: after logout, redirect to `next`.\"\"\"\n next = request.session.get(\"next\", \"/\")\n return redirect(next)\n", "path": "src/djangooidc/views.py"}], "after_files": [{"content": "# coding: utf-8\n\nimport logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import logout as auth_logout\nfrom django.contrib.auth import authenticate, login\nfrom django.http import HttpResponseRedirect\nfrom django.shortcuts import redirect, render\nfrom urllib.parse import parse_qs, urlencode\n\nfrom djangooidc.oidc import Client\nfrom djangooidc import exceptions as o_e\n\n\nlogger = logging.getLogger(__name__)\n\ntry:\n # Initialize provider using pyOICD\n OP = getattr(settings, \"OIDC_ACTIVE_PROVIDER\")\n CLIENT = Client(OP)\n logger.debug(\"client initialized %s\" % CLIENT)\nexcept Exception as err:\n CLIENT = None # type: ignore\n logger.warning(err)\n logger.warning(\"Unable to configure OpenID Connect provider. Users cannot log in.\")\n\n\ndef error_page(request, error):\n \"\"\"Display a sensible message and log the error.\"\"\"\n logger.error(error)\n if isinstance(error, o_e.AuthenticationFailed):\n return render(\n request,\n \"401.html\",\n context={\n \"friendly_message\": error.friendly_message,\n \"log_identifier\": error.locator,\n },\n status=401,\n )\n if isinstance(error, o_e.InternalError):\n return render(\n request,\n \"500.html\",\n context={\n \"friendly_message\": error.friendly_message,\n \"log_identifier\": error.locator,\n },\n status=500,\n )\n if isinstance(error, Exception):\n return render(request, \"500.html\", status=500)\n\n\ndef openid(request):\n \"\"\"Redirect the user to an authentication provider (OP).\"\"\"\n request.session[\"next\"] = request.GET.get(\"next\", \"/\")\n\n try:\n return CLIENT.create_authn_request(request.session)\n except Exception as err:\n return error_page(request, err)\n\n\ndef login_callback(request):\n \"\"\"Analyze the token returned by the authentication provider (OP).\"\"\"\n try:\n query = parse_qs(request.GET.urlencode())\n userinfo = CLIENT.callback(query, request.session)\n user = authenticate(request=request, **userinfo)\n if user:\n login(request, user)\n logger.info(\"Successfully logged in user %s\" % user)\n return redirect(request.session.get(\"next\", \"/\"))\n else:\n raise o_e.BannedUser()\n except Exception as err:\n return error_page(request, err)\n\n\ndef logout(request, next_page=None):\n \"\"\"Redirect the user to the authentication provider (OP) logout page.\"\"\"\n try:\n username = request.user.username\n request_args = {\n \"client_id\": CLIENT.client_id,\n \"state\": request.session[\"state\"],\n }\n if (\n \"post_logout_redirect_uris\" in CLIENT.registration_response.keys()\n and len(CLIENT.registration_response[\"post_logout_redirect_uris\"]) > 0\n ):\n request_args.update(\n {\n \"post_logout_redirect_uri\": CLIENT.registration_response[\n 
\"post_logout_redirect_uris\"\n ][0]\n }\n )\n\n url = CLIENT.provider_info[\"end_session_endpoint\"]\n url += \"?\" + urlencode(request_args)\n return HttpResponseRedirect(url)\n except Exception as err:\n return error_page(request, err)\n finally:\n # Always remove Django session stuff - even if not logged out from OP.\n # Don't wait for the callback as it may never come.\n auth_logout(request)\n logger.info(\"Successfully logged out user %s\" % username)\n next_page = getattr(settings, \"LOGOUT_REDIRECT_URL\", None)\n if next_page:\n request.session[\"next\"] = next_page\n\n\ndef logout_callback(request):\n \"\"\"Simple redirection view: after logout, redirect to `next`.\"\"\"\n next = request.session.get(\"next\", \"/\")\n return redirect(next)\n", "path": "src/djangooidc/views.py"}]}
| 1,480 | 123 |
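For context on the fix above: swapping the RP-initiated-logout parameter means the OP's end-session URL ends up carrying `client_id` instead of `id_token_hint`. A minimal sketch of the resulting redirect URL; the endpoint and values below are placeholders, not taken from the repository:

```python
# Hypothetical endpoint and credentials; the parameter names and urlencode()
# call are the only parts that mirror the patched view.
from urllib.parse import urlencode

end_session_endpoint = "https://idp.example.gov/openid_connect/logout"
request_args = {"client_id": "my-client-id", "state": "abc123"}
print(end_session_endpoint + "?" + urlencode(request_args))
# https://idp.example.gov/openid_connect/logout?client_id=my-client-id&state=abc123
```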
gh_patches_debug_709
|
rasdani/github-patches
|
git_diff
|
GeotrekCE__Geotrek-admin-4021
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Thumbnail problem with SVGs
Bug detected starting from version 2.101.4 of Geotrek Admin.
It is triggered by adding an SVG as the pictogram on an information desk ("lieu de renseignement").
Explanation: the latest version of easy_thumbnail cannot build a thumbnail from an SVG -> the v2 API crashes
--- END ISSUE ---
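Since the fix below hinges on an optional-dependency extra, here is a minimal sketch of how the presence of the SVG backends could be checked at runtime. The module names reflect what the `easy-thumbnails[svg]` extra is assumed to pull in (`svglib`/`reportlab`), and the helper itself is hypothetical:

```python
import importlib.util

def svg_thumbnails_supported() -> bool:
    """Best-effort check that the optional SVG rasterizing backends are importable."""
    return all(
        importlib.util.find_spec(mod) is not None
        for mod in ("svglib", "reportlab")
    )

if __name__ == "__main__":
    print("SVG thumbnail support:", svg_thumbnails_supported())
```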
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/python3
2 import os
3 import distutils.command.build
4 from pathlib import Path
5 from setuptools import setup, find_packages
6 from shutil import copy
7
8 here = os.path.abspath(os.path.dirname(__file__))
9
10
11 class BuildCommand(distutils.command.build.build):
12 def run(self):
13 distutils.command.build.build.run(self)
14 from django.core.management import call_command
15 curdir = os.getcwd()
16 for subdir in ('geotrek', ):
17 os.chdir(subdir)
18 call_command('compilemessages')
19 for path in Path('.').rglob('*.mo'):
20 copy(path, os.path.join(curdir, self.build_lib, subdir, path))
21 os.chdir(curdir)
22
23
24 setup(
25 name='geotrek',
26 version=open(os.path.join(here, 'VERSION')).read().strip(),
27 author='Makina Corpus',
28 author_email='[email protected]',
29 url='https://makina-corpus.com',
30 description="Geotrek",
31 scripts=['manage.py'],
32 install_requires=[
33 'Django==3.2.*',
34 'mapentity',
35 'chardet',
36 'cairosvg',
37 'cairocffi',
38 'env_file',
39 # pinned by requirements.txt
40 'pymemcache',
41 'coreschema',
42 'coreapi',
43 'psycopg2',
44 'pdfimpose',
45 'docutils',
46 'Pillow',
47 'simplekml',
48 'pygal',
49 'paperclip',
50 'django-extended-choices',
51 'django-modelcluster',
52 'django-mptt',
53 'geojson',
54 'tif2geojson',
55 'drf-dynamic-fields',
56 'drf-yasg',
57 'xlrd',
58 'landez',
59 'large-image-source-vips',
60 'django-large-image',
61 'celery',
62 'redis',
63 'django-celery-results',
64 'drf-extensions',
65 'django-colorfield',
66 'Fiona',
67 'markdown',
68 "weasyprint==52.5", # newer version required libpango (not available in bionic)
69 'django-weasyprint<2.0.0', # 2.10 require weasyprint > 53
70 "django-clearcache",
71 "pyopenair",
72 # prod,
73 'gunicorn',
74 'sentry-sdk',
75 ],
76 cmdclass={"build": BuildCommand},
77 include_package_data=True,
78 license='BSD, see LICENSE file.',
79 packages=find_packages(),
80 classifiers=['Natural Language :: English',
81 'Environment :: Web Environment',
82 'Framework :: Django',
83 'Development Status :: 5 - Production/Stable',
84 'Programming Language :: Python :: 3'],
85 )
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -72,6 +72,7 @@
# prod,
'gunicorn',
'sentry-sdk',
+ 'easy-thumbnails[svg]',
],
cmdclass={"build": BuildCommand},
include_package_data=True,
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -72,6 +72,7 @@\n # prod,\n 'gunicorn',\n 'sentry-sdk',\n+ 'easy-thumbnails[svg]',\n ],\n cmdclass={\"build\": BuildCommand},\n include_package_data=True,\n", "issue": "Probl\u00e8me de thumbnail avec les SVG\nBug d\u00e9tect\u00e9 \u00e0 partir de la version 2.101.4 de Geotrek Admin. \r\n\r\nCelui est d\u00e9clench\u00e9 par l'ajout d'un SVG comme pictogramme sur un lieu de renseignement. \r\n\r\nExplication : la derni\u00e8re version de easy_thumbnail n'accepte pas de faire le thumbnail d'un SVG. -> l'api V2 plante\r\n\n", "before_files": [{"content": "#!/usr/bin/python3\nimport os\nimport distutils.command.build\nfrom pathlib import Path\nfrom setuptools import setup, find_packages\nfrom shutil import copy\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\nclass BuildCommand(distutils.command.build.build):\n def run(self):\n distutils.command.build.build.run(self)\n from django.core.management import call_command\n curdir = os.getcwd()\n for subdir in ('geotrek', ):\n os.chdir(subdir)\n call_command('compilemessages')\n for path in Path('.').rglob('*.mo'):\n copy(path, os.path.join(curdir, self.build_lib, subdir, path))\n os.chdir(curdir)\n\n\nsetup(\n name='geotrek',\n version=open(os.path.join(here, 'VERSION')).read().strip(),\n author='Makina Corpus',\n author_email='[email protected]',\n url='https://makina-corpus.com',\n description=\"Geotrek\",\n scripts=['manage.py'],\n install_requires=[\n 'Django==3.2.*',\n 'mapentity',\n 'chardet',\n 'cairosvg',\n 'cairocffi',\n 'env_file',\n # pinned by requirements.txt\n 'pymemcache',\n 'coreschema',\n 'coreapi',\n 'psycopg2',\n 'pdfimpose',\n 'docutils',\n 'Pillow',\n 'simplekml',\n 'pygal',\n 'paperclip',\n 'django-extended-choices',\n 'django-modelcluster',\n 'django-mptt',\n 'geojson',\n 'tif2geojson',\n 'drf-dynamic-fields',\n 'drf-yasg',\n 'xlrd',\n 'landez',\n 'large-image-source-vips',\n 'django-large-image',\n 'celery',\n 'redis',\n 'django-celery-results',\n 'drf-extensions',\n 'django-colorfield',\n 'Fiona',\n 'markdown',\n \"weasyprint==52.5\", # newer version required libpango (not available in bionic)\n 'django-weasyprint<2.0.0', # 2.10 require weasyprint > 53\n \"django-clearcache\",\n \"pyopenair\",\n # prod,\n 'gunicorn',\n 'sentry-sdk',\n ],\n cmdclass={\"build\": BuildCommand},\n include_package_data=True,\n license='BSD, see LICENSE file.',\n packages=find_packages(),\n classifiers=['Natural Language :: English',\n 'Environment :: Web Environment',\n 'Framework :: Django',\n 'Development Status :: 5 - Production/Stable',\n 'Programming Language :: Python :: 3'],\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/python3\nimport os\nimport distutils.command.build\nfrom pathlib import Path\nfrom setuptools import setup, find_packages\nfrom shutil import copy\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\nclass BuildCommand(distutils.command.build.build):\n def run(self):\n distutils.command.build.build.run(self)\n from django.core.management import call_command\n curdir = os.getcwd()\n for subdir in ('geotrek', ):\n os.chdir(subdir)\n call_command('compilemessages')\n for path in Path('.').rglob('*.mo'):\n copy(path, os.path.join(curdir, self.build_lib, subdir, path))\n os.chdir(curdir)\n\n\nsetup(\n name='geotrek',\n version=open(os.path.join(here, 'VERSION')).read().strip(),\n author='Makina Corpus',\n author_email='[email protected]',\n url='https://makina-corpus.com',\n description=\"Geotrek\",\n 
scripts=['manage.py'],\n install_requires=[\n 'Django==3.2.*',\n 'mapentity',\n 'chardet',\n 'cairosvg',\n 'cairocffi',\n 'env_file',\n # pinned by requirements.txt\n 'pymemcache',\n 'coreschema',\n 'coreapi',\n 'psycopg2',\n 'pdfimpose',\n 'docutils',\n 'Pillow',\n 'simplekml',\n 'pygal',\n 'paperclip',\n 'django-extended-choices',\n 'django-modelcluster',\n 'django-mptt',\n 'geojson',\n 'tif2geojson',\n 'drf-dynamic-fields',\n 'drf-yasg',\n 'xlrd',\n 'landez',\n 'large-image-source-vips',\n 'django-large-image',\n 'celery',\n 'redis',\n 'django-celery-results',\n 'drf-extensions',\n 'django-colorfield',\n 'Fiona',\n 'markdown',\n \"weasyprint==52.5\", # newer version required libpango (not available in bionic)\n 'django-weasyprint<2.0.0', # 2.10 require weasyprint > 53\n \"django-clearcache\",\n \"pyopenair\",\n # prod,\n 'gunicorn',\n 'sentry-sdk',\n 'easy-thumbnails[svg]',\n ],\n cmdclass={\"build\": BuildCommand},\n include_package_data=True,\n license='BSD, see LICENSE file.',\n packages=find_packages(),\n classifiers=['Natural Language :: English',\n 'Environment :: Web Environment',\n 'Framework :: Django',\n 'Development Status :: 5 - Production/Stable',\n 'Programming Language :: Python :: 3'],\n)\n", "path": "setup.py"}]}
| 1,104 | 72 |
gh_patches_debug_9365
|
rasdani/github-patches
|
git_diff
|
systemd__mkosi-1906
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Is it possible to create system extensions based on Arch Linux
I have followed the documentation to the best of my abilities; however, it seems not to be possible. When using `BaseTrees=…` and `Overlay=true`, pacman fails because it cannot lock its database. This is the reported error message:
```
‣ Building extension image
Create subvolume '/home/x/mkosi-test/.mkosi-tmp_eq6cbps/root'
‣ Mounting base trees…
‣ Installing extra packages for Arch
:: Synchronizing package databases...
error: failed to synchronize all databases (unable to lock database)
‣ "bwrap --dev-bind / / --chdir /home/x/mkosi-test --tmpfs /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run --tmpfs /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/tmp --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/var/tmp --proc /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/proc --dev /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/dev --ro-bind /sys /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/sys --unsetenv TMPDIR --ro-bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/machine-id /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/machine-id --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/passwd /etc/passwd --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/group /etc/group --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/shadow /etc/shadow --bind /x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/gshadow /etc/gshadow sh -c 'chmod 1777 /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/tmp /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/var/tmp /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/dev/shm && mkdir /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run/host && echo mkosi >/home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run/host/container-manager && exec $0 "$@"' pacman --config /home/x/mkosi-test/.mkosi-tmp_eq6cbps/pkgmngr/etc/pacman.conf --root /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root --logfile=/dev/null --cachedir '/home/x/mkosi-test/mkosi.cache/arch~rolling' --gpgdir /etc/pacman.d/gnupg --hookdir /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/pacman.d/hooks --arch x86_64 --color auto --noconfirm --needed -Sy base-devel helix" returned non-zero exit code 1.
```
Is there any known solution or workaround?
### Basic Info
Host OS: Arch Linux (up to date)
mkosi version: 16
### Small Example
Create a `mkosi` directory with two presets, `base` and `extension`. Create a `mkosi.conf` file in both, with the following contents.
`mkosi.presets/base/mkosi.conf`:
```
[Distribution]
Distribution=arch
[Output]
Format=directory
ImageId=base
[Content]
Packages=
base
linux
Bootable=false
```
`mkosi.presets/extension/mkosi.conf`:
```
[Preset]
Dependencies=base
[Distribution]
Distribution=arch
[Output]
Format=directory
ImageId=extension
Overlay=true
[Content]
BaseTrees=../../mkosi.output/base/
Packages=
base-devel
helix
Bootable=false
```
--- END ISSUE ---
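Although the visible symptom is pacman's database lock, the fix below targets the filesystem layer: a btrfs subvolume cannot serve as an overlayfs upperdir (the kernel returns EXDEV), so the image root must be a plain directory when `Overlay=true`. A stripped-down sketch of that branch; the `make_subvolume` callable stands in for mkosi's tree helper and is hypothetical:

```python
from pathlib import Path

def create_root(root: Path, overlay: bool, make_subvolume) -> None:
    """Create the image root, avoiding btrfs subvolumes for overlay builds."""
    if overlay:
        root.mkdir()          # plain directory: a valid overlayfs upperdir
    else:
        make_subvolume(root)  # btrfs subvolume (or plain dir on other filesystems)
```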
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mkosi/state.py`
Content:
```
1 # SPDX-License-Identifier: LGPL-2.1+
2
3 from pathlib import Path
4
5 from mkosi.config import MkosiArgs, MkosiConfig
6 from mkosi.tree import make_tree
7 from mkosi.util import umask
8
9
10 class MkosiState:
11 """State related properties."""
12
13 def __init__(self, args: MkosiArgs, config: MkosiConfig, workspace: Path) -> None:
14 self.args = args
15 self.config = config
16 self.workspace = workspace
17
18 with umask(~0o755):
19 make_tree(self.config, self.root)
20 self.staging.mkdir()
21 self.pkgmngr.mkdir()
22 self.install_dir.mkdir(exist_ok=True)
23 self.cache_dir.mkdir(parents=True, exist_ok=True)
24
25 @property
26 def root(self) -> Path:
27 return self.workspace / "root"
28
29 @property
30 def staging(self) -> Path:
31 return self.workspace / "staging"
32
33 @property
34 def pkgmngr(self) -> Path:
35 return self.workspace / "pkgmngr"
36
37 @property
38 def cache_dir(self) -> Path:
39 return self.config.cache_dir or self.workspace / f"cache/{self.config.distribution}~{self.config.release}"
40
41 @property
42 def install_dir(self) -> Path:
43 return self.workspace / "dest"
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mkosi/state.py b/mkosi/state.py
--- a/mkosi/state.py
+++ b/mkosi/state.py
@@ -16,7 +16,13 @@
self.workspace = workspace
with umask(~0o755):
- make_tree(self.config, self.root)
+ # Using a btrfs subvolume as the upperdir in an overlayfs results in EXDEV so make sure we create
+ # the root directory as a regular directory if the Overlay= option is enabled.
+ if config.overlay:
+ self.root.mkdir()
+ else:
+ make_tree(self.config, self.root)
+
self.staging.mkdir()
self.pkgmngr.mkdir()
self.install_dir.mkdir(exist_ok=True)
|
{"golden_diff": "diff --git a/mkosi/state.py b/mkosi/state.py\n--- a/mkosi/state.py\n+++ b/mkosi/state.py\n@@ -16,7 +16,13 @@\n self.workspace = workspace\n \n with umask(~0o755):\n- make_tree(self.config, self.root)\n+ # Using a btrfs subvolume as the upperdir in an overlayfs results in EXDEV so make sure we create\n+ # the root directory as a regular directory if the Overlay= option is enabled.\n+ if config.overlay:\n+ self.root.mkdir()\n+ else:\n+ make_tree(self.config, self.root)\n+\n self.staging.mkdir()\n self.pkgmngr.mkdir()\n self.install_dir.mkdir(exist_ok=True)\n", "issue": "Is it possible to create system extensions based on Arch Linux\nI have followed the documentation to the best of my abilities, however it seems not to be possible. When using `BaseTrees=\u2026` and `Overlay=true`, pacman fails because it cannot lock its database. This is the reported error message:\r\n\r\n```\r\n\u2023 Building extension image\r\nCreate subvolume '/home/x/mkosi-test/.mkosi-tmp_eq6cbps/root'\r\n\u2023 Mounting base trees\u2026\r\n\u2023 Installing extra packages for Arch\r\n:: Synchronizing package databases...\r\nerror: failed to synchronize all databases (unable to lock database)\r\n\u2023 \"bwrap --dev-bind / / --chdir /home/x/mkosi-test --tmpfs /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run --tmpfs /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/tmp --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/var/tmp --proc /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/proc --dev /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/dev --ro-bind /sys /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/sys --unsetenv TMPDIR --ro-bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/machine-id /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/machine-id --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/passwd /etc/passwd --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/group /etc/group --bind /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/shadow /etc/shadow --bind /x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/gshadow /etc/gshadow sh -c 'chmod 1777 /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/tmp /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/var/tmp /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/dev/shm && mkdir /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run/host && echo mkosi >/home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/run/host/container-manager && exec $0 \"$@\"' pacman --config /home/x/mkosi-test/.mkosi-tmp_eq6cbps/pkgmngr/etc/pacman.conf --root /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root --logfile=/dev/null --cachedir '/home/x/mkosi-test/mkosi.cache/arch~rolling' --gpgdir /etc/pacman.d/gnupg --hookdir /home/x/mkosi-test/.mkosi-tmp_eq6cbps/root/etc/pacman.d/hooks --arch x86_64 --color auto --noconfirm --needed -Sy base-devel helix\" returned non-zero exit code 1.\r\n```\r\n\r\nIs there any known solution or workaround?\r\n\r\n### Basic Info\r\nHost OS: Arch Linux (up to date)\r\nmkosi version: 16\r\n\r\n### Small Example\r\n\r\nCreate a `mkosi` directory with two presets, `base` and `extension`. 
Create a `mkosi.conf` file in both, with the following contents.\r\n\r\n`mkosi.presets/base/mkosi.conf`:\r\n```\r\n[Distribution]\r\nDistribution=arch\r\n\r\n[Output]\r\nFormat=directory\r\nImageId=base\r\n\r\n[Content]\r\nPackages=\r\n base\r\n linux\r\nBootable=false\r\n```\r\n\r\n`mkosi.presets/extension/mkosi.conf`:\r\n```\r\n[Preset]\r\nDependencies=base\r\n\r\n[Distribution]\r\nDistribution=arch\r\n\r\n[Output]\r\nFormat=directory\r\nImageId=extension\r\nOverlay=true\r\n\r\n[Content]\r\nBaseTrees=../../mkosi.output/base/\r\nPackages=\r\n base-devel\r\n helix\r\nBootable=false\r\n```\n", "before_files": [{"content": "# SPDX-License-Identifier: LGPL-2.1+\n\nfrom pathlib import Path\n\nfrom mkosi.config import MkosiArgs, MkosiConfig\nfrom mkosi.tree import make_tree\nfrom mkosi.util import umask\n\n\nclass MkosiState:\n \"\"\"State related properties.\"\"\"\n\n def __init__(self, args: MkosiArgs, config: MkosiConfig, workspace: Path) -> None:\n self.args = args\n self.config = config\n self.workspace = workspace\n\n with umask(~0o755):\n make_tree(self.config, self.root)\n self.staging.mkdir()\n self.pkgmngr.mkdir()\n self.install_dir.mkdir(exist_ok=True)\n self.cache_dir.mkdir(parents=True, exist_ok=True)\n\n @property\n def root(self) -> Path:\n return self.workspace / \"root\"\n\n @property\n def staging(self) -> Path:\n return self.workspace / \"staging\"\n\n @property\n def pkgmngr(self) -> Path:\n return self.workspace / \"pkgmngr\"\n\n @property\n def cache_dir(self) -> Path:\n return self.config.cache_dir or self.workspace / f\"cache/{self.config.distribution}~{self.config.release}\"\n\n @property\n def install_dir(self) -> Path:\n return self.workspace / \"dest\"\n", "path": "mkosi/state.py"}], "after_files": [{"content": "# SPDX-License-Identifier: LGPL-2.1+\n\nfrom pathlib import Path\n\nfrom mkosi.config import MkosiArgs, MkosiConfig\nfrom mkosi.tree import make_tree\nfrom mkosi.util import umask\n\n\nclass MkosiState:\n \"\"\"State related properties.\"\"\"\n\n def __init__(self, args: MkosiArgs, config: MkosiConfig, workspace: Path) -> None:\n self.args = args\n self.config = config\n self.workspace = workspace\n\n with umask(~0o755):\n # Using a btrfs subvolume as the upperdir in an overlayfs results in EXDEV so make sure we create\n # the root directory as a regular directory if the Overlay= option is enabled.\n if config.overlay:\n self.root.mkdir()\n else:\n make_tree(self.config, self.root)\n\n self.staging.mkdir()\n self.pkgmngr.mkdir()\n self.install_dir.mkdir(exist_ok=True)\n self.cache_dir.mkdir(parents=True, exist_ok=True)\n\n @property\n def root(self) -> Path:\n return self.workspace / \"root\"\n\n @property\n def staging(self) -> Path:\n return self.workspace / \"staging\"\n\n @property\n def pkgmngr(self) -> Path:\n return self.workspace / \"pkgmngr\"\n\n @property\n def cache_dir(self) -> Path:\n return self.config.cache_dir or self.workspace / f\"cache/{self.config.distribution}~{self.config.release}\"\n\n @property\n def install_dir(self) -> Path:\n return self.workspace / \"dest\"\n", "path": "mkosi/state.py"}]}
| 1,518 | 167 |
gh_patches_debug_27335
|
rasdani/github-patches
|
git_diff
|
geopandas__geopandas-1105
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Lat Long order is wrong in WKT example
Regarding the documentation on converting WKT format to a geopandas GeoDataFrame, the order of LAT LONG is wrong.
See #690
In the WKT POINT, you put the Lat first, and then Long. It should be the opposite.
When plotting the WKT example, and comparing it to the simple Long Lat example, you can clearly see that the WKT points are not placed where they should be.
From what I understand, WKT should follow the WGS84 standard.
See for example: https://www.drupal.org/project/geo/issues/511370
--- END ISSUE ---
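A two-line shapely check makes the ordering concrete: WKT coordinates are `x y`, i.e. longitude first for WGS84 data. The point below is Buenos Aires, taken from the corrected example:

```python
from shapely import wkt

p = wkt.loads("POINT(-58.66 -34.58)")  # lon lat, per the WKT convention
print(p.x, p.y)  # -58.66 -34.58 -> x is the longitude, y the latitude
```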
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/create_geopandas_from_pandas.py`
Content:
```
1 """
2 Creating a GeoDataFrame from a DataFrame with coordinates
3 ---------------------------------------------------------
4
5 This example shows how to create a ``GeoDataFrame`` when starting from
6 a *regular* ``DataFrame`` that has coordinates either WKT
7 (`well-known text <https://en.wikipedia.org/wiki/Well-known_text>`_)
8 format, or in
9 two columns.
10
11 """
12 import pandas as pd
13 import geopandas
14 import matplotlib.pyplot as plt
15
16 ###############################################################################
17 # From longitudes and latitudes
18 # =============================
19 #
20 # First, let's consider a ``DataFrame`` containing cities and their respective
21 # longitudes and latitudes.
22
23 df = pd.DataFrame(
24 {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],
25 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],
26 'Latitude': [-34.58, -15.78, -33.45, 4.60, 10.48],
27 'Longitude': [-58.66, -47.91, -70.66, -74.08, -66.86]})
28
29 ###############################################################################
30 # A ``GeoDataFrame`` needs a ``shapely`` object. We use geopandas
31 # ``points_from_xy()`` to transform **Longitude** and **Latitude** into a list
32 # of ``shapely.Point`` objects and set it as a ``geometry`` while creating the
33 # ``GeoDataFrame``. (note that ``points_from_xy()`` is an enhanced wrapper for
34 # ``[Point(x, y) for x, y in zip(df.Longitude, df.Latitude)]``)
35
36 gdf = geopandas.GeoDataFrame(
37 df, geometry=geopandas.points_from_xy(df.Longitude, df.Latitude))
38
39
40 ###############################################################################
41 # ``gdf`` looks like this :
42
43 print(gdf.head())
44
45 ###############################################################################
46 # Finally, we plot the coordinates over a country-level map.
47
48 world = geopandas.read_file(geopandas.datasets.get_path('naturalearth_lowres'))
49
50 # We restrict to South America.
51 ax = world[world.continent == 'South America'].plot(
52 color='white', edgecolor='black')
53
54 # We can now plot our GeoDataFrame.
55 gdf.plot(ax=ax, color='red')
56
57 plt.show()
58
59 ###############################################################################
60 # From WKT format
61 # ===============
62 # Here, we consider a ``DataFrame`` having coordinates in WKT format.
63
64 df = pd.DataFrame(
65 {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],
66 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],
67 'Coordinates': ['POINT(-34.58 -58.66)', 'POINT(-15.78 -47.91)',
68 'POINT(-33.45 -70.66)', 'POINT(4.60 -74.08)',
69 'POINT(10.48 -66.86)']})
70
71 ###############################################################################
72 # We use ``shapely.wkt`` sub-module to parse wkt format:
73 from shapely import wkt
74
75 df['Coordinates'] = df['Coordinates'].apply(wkt.loads)
76
77 ###############################################################################
78 # The ``GeoDataFrame`` is constructed as follows :
79
80 gdf = geopandas.GeoDataFrame(df, geometry='Coordinates')
81
82 print(gdf.head())
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/create_geopandas_from_pandas.py b/examples/create_geopandas_from_pandas.py
--- a/examples/create_geopandas_from_pandas.py
+++ b/examples/create_geopandas_from_pandas.py
@@ -51,7 +51,7 @@
ax = world[world.continent == 'South America'].plot(
color='white', edgecolor='black')
-# We can now plot our GeoDataFrame.
+# We can now plot our ``GeoDataFrame``.
gdf.plot(ax=ax, color='red')
plt.show()
@@ -64,9 +64,9 @@
df = pd.DataFrame(
{'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],
'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],
- 'Coordinates': ['POINT(-34.58 -58.66)', 'POINT(-15.78 -47.91)',
- 'POINT(-33.45 -70.66)', 'POINT(4.60 -74.08)',
- 'POINT(10.48 -66.86)']})
+ 'Coordinates': ['POINT(-58.66 -34.58)', 'POINT(-47.91 -15.78)',
+ 'POINT(-70.66 -33.45)', 'POINT(-74.08 4.60)',
+ 'POINT(-66.86 10.48)']})
###############################################################################
# We use ``shapely.wkt`` sub-module to parse wkt format:
@@ -80,3 +80,9 @@
gdf = geopandas.GeoDataFrame(df, geometry='Coordinates')
print(gdf.head())
+
+#################################################################################
+# Again, we can plot our ``GeoDataFrame``.
+gdf.plot(ax=ax, color='red')
+
+plt.show()
|
{"golden_diff": "diff --git a/examples/create_geopandas_from_pandas.py b/examples/create_geopandas_from_pandas.py\n--- a/examples/create_geopandas_from_pandas.py\n+++ b/examples/create_geopandas_from_pandas.py\n@@ -51,7 +51,7 @@\n ax = world[world.continent == 'South America'].plot(\n color='white', edgecolor='black')\n \n-# We can now plot our GeoDataFrame.\n+# We can now plot our ``GeoDataFrame``.\n gdf.plot(ax=ax, color='red')\n \n plt.show()\n@@ -64,9 +64,9 @@\n df = pd.DataFrame(\n {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],\n 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],\n- 'Coordinates': ['POINT(-34.58 -58.66)', 'POINT(-15.78 -47.91)',\n- 'POINT(-33.45 -70.66)', 'POINT(4.60 -74.08)',\n- 'POINT(10.48 -66.86)']})\n+ 'Coordinates': ['POINT(-58.66 -34.58)', 'POINT(-47.91 -15.78)',\n+ 'POINT(-70.66 -33.45)', 'POINT(-74.08 4.60)',\n+ 'POINT(-66.86 10.48)']})\n \n ###############################################################################\n # We use ``shapely.wkt`` sub-module to parse wkt format:\n@@ -80,3 +80,9 @@\n gdf = geopandas.GeoDataFrame(df, geometry='Coordinates')\n \n print(gdf.head())\n+\n+#################################################################################\n+# Again, we can plot our ``GeoDataFrame``.\n+gdf.plot(ax=ax, color='red')\n+\n+plt.show()\n", "issue": "Lat Long order is wrong in WKT example\nIn regards to documentation of converting WKT format to geopandas GeoDataFrame, the order of LAT LONG is wrong.\r\nSee #690 \r\nIn the WKT POINT, you put the Lat first, amd then Long. It should be the opposite. \r\nWhen plotting the WKT example, and comparing it to the simple Long Lat example, you can clearly see that the WKT points are not placed where they should be.\r\n\r\nFrom what I understand, WKT should follow the WGS84 standard.\r\nSee for example: https://www.drupal.org/project/geo/issues/511370\n", "before_files": [{"content": "\"\"\"\nCreating a GeoDataFrame from a DataFrame with coordinates\n---------------------------------------------------------\n\nThis example shows how to create a ``GeoDataFrame`` when starting from\na *regular* ``DataFrame`` that has coordinates either WKT\n(`well-known text <https://en.wikipedia.org/wiki/Well-known_text>`_)\nformat, or in\ntwo columns.\n\n\"\"\"\nimport pandas as pd\nimport geopandas\nimport matplotlib.pyplot as plt\n\n###############################################################################\n# From longitudes and latitudes\n# =============================\n#\n# First, let's consider a ``DataFrame`` containing cities and their respective\n# longitudes and latitudes.\n\ndf = pd.DataFrame(\n {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],\n 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],\n 'Latitude': [-34.58, -15.78, -33.45, 4.60, 10.48],\n 'Longitude': [-58.66, -47.91, -70.66, -74.08, -66.86]})\n\n###############################################################################\n# A ``GeoDataFrame`` needs a ``shapely`` object. We use geopandas\n# ``points_from_xy()`` to transform **Longitude** and **Latitude** into a list\n# of ``shapely.Point`` objects and set it as a ``geometry`` while creating the\n# ``GeoDataFrame``. 
(note that ``points_from_xy()`` is an enhanced wrapper for\n# ``[Point(x, y) for x, y in zip(df.Longitude, df.Latitude)]``)\n\ngdf = geopandas.GeoDataFrame(\n df, geometry=geopandas.points_from_xy(df.Longitude, df.Latitude))\n\n\n###############################################################################\n# ``gdf`` looks like this :\n\nprint(gdf.head())\n\n###############################################################################\n# Finally, we plot the coordinates over a country-level map.\n\nworld = geopandas.read_file(geopandas.datasets.get_path('naturalearth_lowres'))\n\n# We restrict to South America.\nax = world[world.continent == 'South America'].plot(\n color='white', edgecolor='black')\n\n# We can now plot our GeoDataFrame.\ngdf.plot(ax=ax, color='red')\n\nplt.show()\n\n###############################################################################\n# From WKT format\n# ===============\n# Here, we consider a ``DataFrame`` having coordinates in WKT format.\n\ndf = pd.DataFrame(\n {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],\n 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],\n 'Coordinates': ['POINT(-34.58 -58.66)', 'POINT(-15.78 -47.91)',\n 'POINT(-33.45 -70.66)', 'POINT(4.60 -74.08)',\n 'POINT(10.48 -66.86)']})\n\n###############################################################################\n# We use ``shapely.wkt`` sub-module to parse wkt format:\nfrom shapely import wkt\n\ndf['Coordinates'] = df['Coordinates'].apply(wkt.loads)\n\n###############################################################################\n# The ``GeoDataFrame`` is constructed as follows :\n\ngdf = geopandas.GeoDataFrame(df, geometry='Coordinates')\n\nprint(gdf.head())\n", "path": "examples/create_geopandas_from_pandas.py"}], "after_files": [{"content": "\"\"\"\nCreating a GeoDataFrame from a DataFrame with coordinates\n---------------------------------------------------------\n\nThis example shows how to create a ``GeoDataFrame`` when starting from\na *regular* ``DataFrame`` that has coordinates either WKT\n(`well-known text <https://en.wikipedia.org/wiki/Well-known_text>`_)\nformat, or in\ntwo columns.\n\n\"\"\"\nimport pandas as pd\nimport geopandas\nimport matplotlib.pyplot as plt\n\n###############################################################################\n# From longitudes and latitudes\n# =============================\n#\n# First, let's consider a ``DataFrame`` containing cities and their respective\n# longitudes and latitudes.\n\ndf = pd.DataFrame(\n {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],\n 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],\n 'Latitude': [-34.58, -15.78, -33.45, 4.60, 10.48],\n 'Longitude': [-58.66, -47.91, -70.66, -74.08, -66.86]})\n\n###############################################################################\n# A ``GeoDataFrame`` needs a ``shapely`` object. We use geopandas\n# ``points_from_xy()`` to transform **Longitude** and **Latitude** into a list\n# of ``shapely.Point`` objects and set it as a ``geometry`` while creating the\n# ``GeoDataFrame``. 
(note that ``points_from_xy()`` is an enhanced wrapper for\n# ``[Point(x, y) for x, y in zip(df.Longitude, df.Latitude)]``)\n\ngdf = geopandas.GeoDataFrame(\n df, geometry=geopandas.points_from_xy(df.Longitude, df.Latitude))\n\n\n###############################################################################\n# ``gdf`` looks like this :\n\nprint(gdf.head())\n\n###############################################################################\n# Finally, we plot the coordinates over a country-level map.\n\nworld = geopandas.read_file(geopandas.datasets.get_path('naturalearth_lowres'))\n\n# We restrict to South America.\nax = world[world.continent == 'South America'].plot(\n color='white', edgecolor='black')\n\n# We can now plot our ``GeoDataFrame``.\ngdf.plot(ax=ax, color='red')\n\nplt.show()\n\n###############################################################################\n# From WKT format\n# ===============\n# Here, we consider a ``DataFrame`` having coordinates in WKT format.\n\ndf = pd.DataFrame(\n {'City': ['Buenos Aires', 'Brasilia', 'Santiago', 'Bogota', 'Caracas'],\n 'Country': ['Argentina', 'Brazil', 'Chile', 'Colombia', 'Venezuela'],\n 'Coordinates': ['POINT(-58.66 -34.58)', 'POINT(-47.91 -15.78)',\n 'POINT(-70.66 -33.45)', 'POINT(-74.08 4.60)',\n 'POINT(-66.86 10.48)']})\n\n###############################################################################\n# We use ``shapely.wkt`` sub-module to parse wkt format:\nfrom shapely import wkt\n\ndf['Coordinates'] = df['Coordinates'].apply(wkt.loads)\n\n###############################################################################\n# The ``GeoDataFrame`` is constructed as follows :\n\ngdf = geopandas.GeoDataFrame(df, geometry='Coordinates')\n\nprint(gdf.head())\n\n#################################################################################\n# Again, we can plot our ``GeoDataFrame``.\ngdf.plot(ax=ax, color='red')\n\nplt.show()\n", "path": "examples/create_geopandas_from_pandas.py"}]}
| 1,317 | 445 |
gh_patches_debug_39786
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-3457
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider statefarm is broken
During the global build at 2021-09-22-14-42-27, spider **statefarm** failed with **0 features** and **6575 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/logs/statefarm.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/output/statefarm.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/output/statefarm.geojson))
--- END ISSUE ---
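The fix below (see the golden diff) moves the spider from brittle element IDs to schema.org `itemprop` attributes. A self-contained sketch of that XPath style, using a synthetic HTML fragment rather than a live agent page:

```python
from lxml import html

# Synthetic markup mimicking a schema.org PostalAddress block.
doc = html.fromstring("""
<div itemtype="http://schema.org/PostalAddress">
  <span itemprop="streetAddress">1 Main St</span>
  <span itemprop="addressLocality">Bloomington</span>
</div>
""")
print(doc.xpath('//*[@itemprop="streetAddress"]/text()'))    # ['1 Main St']
print(doc.xpath('//*[@itemprop="addressLocality"]/text()'))  # ['Bloomington']
```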
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/statefarm.py`
Content:
```
1 import json
2 import re
3 import scrapy
4
5 from locations.items import GeojsonPointItem
6
7
8 class StateFarmSpider(scrapy.Spider):
9 name = "statefarm"
10 item_attributes = { 'brand': "State Farm" }
11 allowed_domains = ["statefarm.com"]
12 download_delay = 0.2
13
14 start_urls = [
15 'https://www.statefarm.com/agent/us',
16 ]
17
18 def parse_location(self, response):
19
20 name = response.xpath('//*[@id="AgentNameLabelId"]//span[@itemprop="name"]/text()').extract_first()
21 if name:
22 name += ' - State Farm Insurance Agent'
23
24 lat = response.xpath('//*[@id="agentOfficePrimaryLocLat"]/@value').extract_first()
25 lon = response.xpath('//*[@id="agentOfficePrimaryLocLong"]/@value').extract_first()
26
27 properties = {
28 'ref': "_".join(response.url.split('/')[-3:]),
29 'name': name,
30 'addr_full': response.xpath('normalize-space(//div[@itemtype="http://schema.org/PostalAddress"]//span[@id="locStreetContent_mainLocContent"]/text())').extract_first(),
31 'city': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[1]/text()').extract_first().strip(', '),
32 'state': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[2]/text()').extract_first(),
33 'postcode': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[3]/text()').extract_first(),
34 'phone': response.xpath('normalize-space(//span[@id="offNumber_mainLocContent"]/span/text())').extract_first(),
35 'lat': float(lat) if lat else None,
36 'lon': float(lon) if lon else None,
37 'website': response.url,
38 }
39
40 yield GeojsonPointItem(**properties)
41
42 def parse(self, response):
43 agents = response.xpath('//div[contains(@id, "agent-details")]')
44 # agent_sites = response.xpath('//a[contains(text(), "Visit agent site")]/@href').extract()
45
46 if agents:
47 for agent in agents:
48 agent_site = agent.xpath('.//a[contains(text(), "Visit agent site")]/@href').extract_first()
49 if not agent_site:
50 raise Exception('no agent site found')
51 yield scrapy.Request(response.urljoin(agent_site), callback=self.parse_location)
52
53 else:
54 urls = response.xpath('//li/div/a/@href').extract()
55
56 for url in urls:
57 yield scrapy.Request(response.urljoin(url))
58
59
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/statefarm.py b/locations/spiders/statefarm.py
--- a/locations/spiders/statefarm.py
+++ b/locations/spiders/statefarm.py
@@ -9,7 +9,7 @@
name = "statefarm"
item_attributes = { 'brand': "State Farm" }
allowed_domains = ["statefarm.com"]
- download_delay = 0.2
+ download_delay = 0.1
start_urls = [
'https://www.statefarm.com/agent/us',
@@ -17,23 +17,20 @@
def parse_location(self, response):
- name = response.xpath('//*[@id="AgentNameLabelId"]//span[@itemprop="name"]/text()').extract_first()
+ name = response.xpath('//span[@itemprop="name"]/text()').extract_first()
if name:
name += ' - State Farm Insurance Agent'
- lat = response.xpath('//*[@id="agentOfficePrimaryLocLat"]/@value').extract_first()
- lon = response.xpath('//*[@id="agentOfficePrimaryLocLong"]/@value').extract_first()
-
properties = {
'ref': "_".join(response.url.split('/')[-3:]),
'name': name,
- 'addr_full': response.xpath('normalize-space(//div[@itemtype="http://schema.org/PostalAddress"]//span[@id="locStreetContent_mainLocContent"]/text())').extract_first(),
- 'city': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[1]/text()').extract_first().strip(', '),
- 'state': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[2]/text()').extract_first(),
- 'postcode': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]/div[2]/span/span[3]/text()').extract_first(),
- 'phone': response.xpath('normalize-space(//span[@id="offNumber_mainLocContent"]/span/text())').extract_first(),
- 'lat': float(lat) if lat else None,
- 'lon': float(lon) if lon else None,
+ 'addr_full': response.xpath('normalize-space(//div[@itemtype="http://schema.org/PostalAddress"]//*[@itemprop="streetAddress"]/text())').extract_first(),
+ 'city': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]//*[@itemprop="addressLocality"]/text()').extract_first(),
+ 'state': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]//*[@itemprop="addressRegion"]/text()').extract_first(),
+ 'postcode': response.xpath('//div[@itemtype="http://schema.org/PostalAddress"]//*[@itemprop="postalCode"]/text()').extract_first(),
+ 'phone': response.xpath('//*[@itemprop="telephone"]/a/text()').extract_first(),
+ 'lat': float(response.xpath('//@data-latitude').extract_first()),
+ 'lon': float(response.xpath('//@data-longitude').extract_first()),
'website': response.url,
}
@@ -45,7 +42,7 @@
if agents:
for agent in agents:
- agent_site = agent.xpath('.//a[contains(text(), "Visit agent site")]/@href').extract_first()
+ agent_site = agent.xpath('.//a[contains(text(), "Agent Website")]/@href').extract_first()
if not agent_site:
raise Exception('no agent site found')
yield scrapy.Request(response.urljoin(agent_site), callback=self.parse_location)
|
{"golden_diff": "diff --git a/locations/spiders/statefarm.py b/locations/spiders/statefarm.py\n--- a/locations/spiders/statefarm.py\n+++ b/locations/spiders/statefarm.py\n@@ -9,7 +9,7 @@\n name = \"statefarm\"\n item_attributes = { 'brand': \"State Farm\" }\n allowed_domains = [\"statefarm.com\"]\n- download_delay = 0.2\n+ download_delay = 0.1\n \n start_urls = [\n 'https://www.statefarm.com/agent/us',\n@@ -17,23 +17,20 @@\n \n def parse_location(self, response):\n \n- name = response.xpath('//*[@id=\"AgentNameLabelId\"]//span[@itemprop=\"name\"]/text()').extract_first()\n+ name = response.xpath('//span[@itemprop=\"name\"]/text()').extract_first()\n if name:\n name += ' - State Farm Insurance Agent'\n \n- lat = response.xpath('//*[@id=\"agentOfficePrimaryLocLat\"]/@value').extract_first()\n- lon = response.xpath('//*[@id=\"agentOfficePrimaryLocLong\"]/@value').extract_first()\n-\n properties = {\n 'ref': \"_\".join(response.url.split('/')[-3:]),\n 'name': name,\n- 'addr_full': response.xpath('normalize-space(//div[@itemtype=\"http://schema.org/PostalAddress\"]//span[@id=\"locStreetContent_mainLocContent\"]/text())').extract_first(),\n- 'city': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[1]/text()').extract_first().strip(', '),\n- 'state': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[2]/text()').extract_first(),\n- 'postcode': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[3]/text()').extract_first(),\n- 'phone': response.xpath('normalize-space(//span[@id=\"offNumber_mainLocContent\"]/span/text())').extract_first(),\n- 'lat': float(lat) if lat else None,\n- 'lon': float(lon) if lon else None,\n+ 'addr_full': response.xpath('normalize-space(//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"streetAddress\"]/text())').extract_first(),\n+ 'city': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"addressLocality\"]/text()').extract_first(),\n+ 'state': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"addressRegion\"]/text()').extract_first(),\n+ 'postcode': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"postalCode\"]/text()').extract_first(),\n+ 'phone': response.xpath('//*[@itemprop=\"telephone\"]/a/text()').extract_first(),\n+ 'lat': float(response.xpath('//@data-latitude').extract_first()),\n+ 'lon': float(response.xpath('//@data-longitude').extract_first()),\n 'website': response.url,\n }\n \n@@ -45,7 +42,7 @@\n \n if agents:\n for agent in agents:\n- agent_site = agent.xpath('.//a[contains(text(), \"Visit agent site\")]/@href').extract_first()\n+ agent_site = agent.xpath('.//a[contains(text(), \"Agent Website\")]/@href').extract_first()\n if not agent_site:\n raise Exception('no agent site found')\n yield scrapy.Request(response.urljoin(agent_site), callback=self.parse_location)\n", "issue": "Spider statefarm is broken\nDuring the global build at 2021-09-22-14-42-27, spider **statefarm** failed with **0 features** and **6575 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/logs/statefarm.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/output/statefarm.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-09-22-14-42-27/output/statefarm.geojson))\n", "before_files": [{"content": "import json\nimport re\nimport scrapy\n\nfrom 
locations.items import GeojsonPointItem\n\n\nclass StateFarmSpider(scrapy.Spider):\n name = \"statefarm\"\n item_attributes = { 'brand': \"State Farm\" }\n allowed_domains = [\"statefarm.com\"]\n download_delay = 0.2\n\n start_urls = [\n 'https://www.statefarm.com/agent/us',\n ]\n\n def parse_location(self, response):\n\n name = response.xpath('//*[@id=\"AgentNameLabelId\"]//span[@itemprop=\"name\"]/text()').extract_first()\n if name:\n name += ' - State Farm Insurance Agent'\n\n lat = response.xpath('//*[@id=\"agentOfficePrimaryLocLat\"]/@value').extract_first()\n lon = response.xpath('//*[@id=\"agentOfficePrimaryLocLong\"]/@value').extract_first()\n\n properties = {\n 'ref': \"_\".join(response.url.split('/')[-3:]),\n 'name': name,\n 'addr_full': response.xpath('normalize-space(//div[@itemtype=\"http://schema.org/PostalAddress\"]//span[@id=\"locStreetContent_mainLocContent\"]/text())').extract_first(),\n 'city': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[1]/text()').extract_first().strip(', '),\n 'state': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[2]/text()').extract_first(),\n 'postcode': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]/div[2]/span/span[3]/text()').extract_first(),\n 'phone': response.xpath('normalize-space(//span[@id=\"offNumber_mainLocContent\"]/span/text())').extract_first(),\n 'lat': float(lat) if lat else None,\n 'lon': float(lon) if lon else None,\n 'website': response.url,\n }\n\n yield GeojsonPointItem(**properties)\n\n def parse(self, response):\n agents = response.xpath('//div[contains(@id, \"agent-details\")]')\n # agent_sites = response.xpath('//a[contains(text(), \"Visit agent site\")]/@href').extract()\n\n if agents:\n for agent in agents:\n agent_site = agent.xpath('.//a[contains(text(), \"Visit agent site\")]/@href').extract_first()\n if not agent_site:\n raise Exception('no agent site found')\n yield scrapy.Request(response.urljoin(agent_site), callback=self.parse_location)\n\n else:\n urls = response.xpath('//li/div/a/@href').extract()\n\n for url in urls:\n yield scrapy.Request(response.urljoin(url))\n\n\n", "path": "locations/spiders/statefarm.py"}], "after_files": [{"content": "import json\nimport re\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\n\n\nclass StateFarmSpider(scrapy.Spider):\n name = \"statefarm\"\n item_attributes = { 'brand': \"State Farm\" }\n allowed_domains = [\"statefarm.com\"]\n download_delay = 0.1\n\n start_urls = [\n 'https://www.statefarm.com/agent/us',\n ]\n\n def parse_location(self, response):\n\n name = response.xpath('//span[@itemprop=\"name\"]/text()').extract_first()\n if name:\n name += ' - State Farm Insurance Agent'\n\n properties = {\n 'ref': \"_\".join(response.url.split('/')[-3:]),\n 'name': name,\n 'addr_full': response.xpath('normalize-space(//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"streetAddress\"]/text())').extract_first(),\n 'city': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"addressLocality\"]/text()').extract_first(),\n 'state': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"addressRegion\"]/text()').extract_first(),\n 'postcode': response.xpath('//div[@itemtype=\"http://schema.org/PostalAddress\"]//*[@itemprop=\"postalCode\"]/text()').extract_first(),\n 'phone': response.xpath('//*[@itemprop=\"telephone\"]/a/text()').extract_first(),\n 'lat': 
float(response.xpath('//@data-latitude').extract_first()),\n 'lon': float(response.xpath('//@data-longitude').extract_first()),\n 'website': response.url,\n }\n\n yield GeojsonPointItem(**properties)\n\n def parse(self, response):\n agents = response.xpath('//div[contains(@id, \"agent-details\")]')\n # agent_sites = response.xpath('//a[contains(text(), \"Visit agent site\")]/@href').extract()\n\n if agents:\n for agent in agents:\n agent_site = agent.xpath('.//a[contains(text(), \"Agent Website\")]/@href').extract_first()\n if not agent_site:\n raise Exception('no agent site found')\n yield scrapy.Request(response.urljoin(agent_site), callback=self.parse_location)\n\n else:\n urls = response.xpath('//li/div/a/@href').extract()\n\n for url in urls:\n yield scrapy.Request(response.urljoin(url))\n\n\n", "path": "locations/spiders/statefarm.py"}]}
| 1,134 | 793 |
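
Editorial note on the record above: the golden diff swaps brittle id-based XPaths for selectors keyed on schema.org `itemprop` attributes, which tend to survive site redesigns. Below is a minimal sketch of that extraction style using `parsel` (the selector library Scrapy builds on); the HTML snippet and field names are hypothetical stand-ins, not the live statefarm.com markup.

```python
# Minimal sketch (not the spider itself) of extracting schema.org
# microdata with parsel, the selector library Scrapy builds on. The
# HTML snippet and field names are hypothetical stand-ins.
from parsel import Selector

html = """
<div itemtype="http://schema.org/PostalAddress">
  <span itemprop="streetAddress">1 Main St</span>
  <span itemprop="addressLocality">Springfield</span>
  <span itemprop="addressRegion">IL</span>
  <span itemprop="postalCode">62701</span>
</div>
"""

sel = Selector(text=html)
# Keying on itemprop rather than generated element ids keeps the XPath
# stable when a site's id/class attributes change between redesigns.
address = {
    "addr_full": sel.xpath('//*[@itemprop="streetAddress"]/text()').get(),
    "city": sel.xpath('//*[@itemprop="addressLocality"]/text()').get(),
    "state": sel.xpath('//*[@itemprop="addressRegion"]/text()').get(),
    "postcode": sel.xpath('//*[@itemprop="postalCode"]/text()').get(),
}
print(address)
```
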
gh_patches_debug_17114
|
rasdani/github-patches
|
git_diff
|
safe-global__safe-config-service-15
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve admin page
- [x] Add custom Admin models with `list_display`, `display_filter` (for example, by network), `ordering` and `search_fields`: https://docs.djangoproject.com/en/3.2/ref/contrib/admin/
- [x] Use admin decorator https://docs.djangoproject.com/en/3.2/ref/contrib/admin/#the-register-decorator
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/safe_apps/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from .models import SafeApp, Provider
4
5 models = [SafeApp, Provider]
6 admin.site.register(models)
7
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/safe_apps/admin.py b/src/safe_apps/admin.py
--- a/src/safe_apps/admin.py
+++ b/src/safe_apps/admin.py
@@ -2,5 +2,34 @@
from .models import SafeApp, Provider
-models = [SafeApp, Provider]
-admin.site.register(models)
+
+class NetworksFilter(admin.SimpleListFilter):
+ title = 'Networks'
+ parameter_name = 'networks'
+
+ def lookups(self, request, model_admin):
+ values = SafeApp.objects.values_list('networks', flat=True)
+ # lookups requires a tuple to be returned – (value, verbose value)
+ networks = [(network, network) for networks in values for network in networks]
+ networks = sorted(set(networks))
+ return networks
+
+ def queryset(self, request, queryset):
+ if value := self.value():
+ queryset = queryset.filter(networks__contains=[value])
+ return queryset
+
+
[email protected](SafeApp)
+class SafeAppAdmin(admin.ModelAdmin):
+ list_display = ('name', 'url', 'networks')
+ list_filter = (NetworksFilter,)
+ search_fields = ('name', 'url')
+ ordering = ('name',)
+
+
[email protected](Provider)
+class ProviderAdmin(admin.ModelAdmin):
+ list_display = ('name', 'url')
+ search_fields = ('name',)
+ ordering = ('name',)
|
{"golden_diff": "diff --git a/src/safe_apps/admin.py b/src/safe_apps/admin.py\n--- a/src/safe_apps/admin.py\n+++ b/src/safe_apps/admin.py\n@@ -2,5 +2,34 @@\n \n from .models import SafeApp, Provider\n \n-models = [SafeApp, Provider]\n-admin.site.register(models)\n+\n+class NetworksFilter(admin.SimpleListFilter):\n+ title = 'Networks'\n+ parameter_name = 'networks'\n+\n+ def lookups(self, request, model_admin):\n+ values = SafeApp.objects.values_list('networks', flat=True)\n+ # lookups requires a tuple to be returned \u2013 (value, verbose value)\n+ networks = [(network, network) for networks in values for network in networks]\n+ networks = sorted(set(networks))\n+ return networks\n+\n+ def queryset(self, request, queryset):\n+ if value := self.value():\n+ queryset = queryset.filter(networks__contains=[value])\n+ return queryset\n+\n+\[email protected](SafeApp)\n+class SafeAppAdmin(admin.ModelAdmin):\n+ list_display = ('name', 'url', 'networks')\n+ list_filter = (NetworksFilter,)\n+ search_fields = ('name', 'url')\n+ ordering = ('name',)\n+\n+\[email protected](Provider)\n+class ProviderAdmin(admin.ModelAdmin):\n+ list_display = ('name', 'url')\n+ search_fields = ('name',)\n+ ordering = ('name',)\n", "issue": "Improve admin page\n- [x] Add custom Admin models with `list_display`, `display_filter` (for example, by network), `ordering` and `search_fields`: https://docs.djangoproject.com/en/3.2/ref/contrib/admin/\r\n- [x] Use admin decorator https://docs.djangoproject.com/en/3.2/ref/contrib/admin/#the-register-decorator\n", "before_files": [{"content": "from django.contrib import admin\n\nfrom .models import SafeApp, Provider\n\nmodels = [SafeApp, Provider]\nadmin.site.register(models)\n", "path": "src/safe_apps/admin.py"}], "after_files": [{"content": "from django.contrib import admin\n\nfrom .models import SafeApp, Provider\n\n\nclass NetworksFilter(admin.SimpleListFilter):\n title = 'Networks'\n parameter_name = 'networks'\n\n def lookups(self, request, model_admin):\n values = SafeApp.objects.values_list('networks', flat=True)\n # lookups requires a tuple to be returned \u2013 (value, verbose value)\n networks = [(network, network) for networks in values for network in networks]\n networks = sorted(set(networks))\n return networks\n\n def queryset(self, request, queryset):\n if value := self.value():\n queryset = queryset.filter(networks__contains=[value])\n return queryset\n\n\[email protected](SafeApp)\nclass SafeAppAdmin(admin.ModelAdmin):\n list_display = ('name', 'url', 'networks')\n list_filter = (NetworksFilter,)\n search_fields = ('name', 'url')\n ordering = ('name',)\n\n\[email protected](Provider)\nclass ProviderAdmin(admin.ModelAdmin):\n list_display = ('name', 'url')\n search_fields = ('name',)\n ordering = ('name',)\n", "path": "src/safe_apps/admin.py"}]}
| 377 | 317 |
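
Editorial note on the record above: the fix hinges on Django's `admin.SimpleListFilter`, which needs exactly two hooks, `lookups` (the choices shown in the sidebar) and `queryset` (how a chosen value narrows results). The sketch below restates that contract in isolation; it mirrors the golden diff, but the `model_admin.model` access and the array-field `__contains` lookup are assumptions about the schema rather than verified against the repository.

```python
# Sketch of the SimpleListFilter contract used by the golden diff above.
# Assumes a Postgres ArrayField named "networks"; outside of the diff
# itself, the model access pattern here is illustrative, not verified.
from django.contrib import admin


class NetworksFilter(admin.SimpleListFilter):
    title = "Networks"           # heading shown in the admin sidebar
    parameter_name = "networks"  # query-string key, e.g. ?networks=mainnet

    def lookups(self, request, model_admin):
        # Must return (value, human-readable label) pairs; each row of
        # values is itself a list, so flatten and de-duplicate.
        values = model_admin.model.objects.values_list("networks", flat=True)
        return sorted({(n, n) for row in values for n in row})

    def queryset(self, request, queryset):
        # self.value() is the selected lookup value, or None when the
        # filter is inactive; __contains tests array membership.
        if value := self.value():
            queryset = queryset.filter(networks__contains=[value])
        return queryset
```

Attaching the filter is one line on the ModelAdmin, `list_filter = (NetworksFilter,)`, exactly as the diff does.
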
gh_patches_debug_32669
|
rasdani/github-patches
|
git_diff
|
OpenEnergyPlatform__oeplatform-1191
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve reading of version file in Welcome View
* use caching because file is read every time user requests main page
* wrap in try-except (missing release notice should not break the main page)
FYI: @jh-RLI
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `base/views.py`
Content:
```
1 import json
2 import os
3 import re
4
5 import markdown2
6 from django.core.mail import send_mail
7 from django.shortcuts import render
8 from django.views.generic import View
9
10 try:
11 import oeplatform.securitysettings as sec
12 except Exception:
13 import logging
14
15 logging.error("No securitysettings found. Triggerd in base/views.py")
16
17 from base.forms import ContactForm
18
19 # Create your views here.
20
21 SITE_ROOT = os.path.dirname(os.path.realpath(__file__))
22
23
24 class Welcome(View):
25 def get(self, request):
26 os.path.dirname(os.path.realpath(__file__))
27 version_expr = r"^(?P<major>\d+)\.(?P<minor>\d+)+\.(?P<patch>\d+)$"
28 markdowner = markdown2.Markdown()
29 with open(os.path.join(SITE_ROOT, "..", "VERSION")) as version_file:
30 match = re.match(version_expr, version_file.read())
31 major, minor, patch = match.groups()
32 with open(
33 os.path.join(
34 SITE_ROOT,
35 "..",
36 "versions/changelogs/%s_%s_%s.md" % (major, minor, patch),
37 )
38 ) as change_file:
39 changes = markdowner.convert(
40 "\n".join(line for line in change_file.readlines())
41 )
42 return render(
43 request,
44 "base/index.html",
45 {"version": "%s.%s.%s" % (major, minor, patch), "changes": changes},
46 )
47
48
49 def get_logs(request):
50 version_expr = r"^(?P<major>\d+)_(?P<major>\d+)+_(?P<major>\d+)\.md$"
51 logs = {}
52 for file in os.listdir("../versions/changelogs"):
53 match = re.match(version_expr, file)
54 markdowner = markdown2.Markdown()
55 if match:
56 major, minor, patch = match.groups()
57 with open("versions/changelogs" + file) as f:
58 logs[(major, minor, patch)] = markdowner.convert(
59 "\n".join(line for line in f.readlines())
60 )
61 return logs
62
63
64 def redir(request, target):
65 return render(request, "base/{target}.html".format(target=target), {})
66
67
68 class ContactView(View):
69 error_css_class = "error"
70 required_css_class = "required"
71
72 def post(self, request):
73 form = ContactForm(data=request.POST)
74 if form.is_valid():
75 receps = sec.CONTACT_ADDRESSES.get(
76 request.POST["contact_category"], "technical"
77 )
78 send_mail(
79 request.POST.get("contact_topic"),
80 f"{request.POST.get('contact_name')} "
81 + f"({request.POST.get('contact_email')}) wrote: \n"
82 + request.POST.get("content"),
83 sec.DEFAULT_FROM_EMAIL,
84 receps,
85 fail_silently=False,
86 )
87 return render(
88 request, "base/contact.html", {"form": ContactForm(), "success": True}
89 )
90 else:
91 return render(
92 request, "base/contact.html", {"form": form, "success": False}
93 )
94
95 def get(self, request):
96 return render(
97 request, "base/contact.html", {"form": ContactForm(), "success": False}
98 )
99
100
101 def robot(request):
102 return render(request, "base/robots.txt", {}, content_type="text/plain")
103
104
105 def handler500(request):
106 response = render(request, "base/500.html", {})
107 response.status_code = 500
108 return response
109
110
111 def handler404(request, exception):
112 response = render(request, "base/404.html", {})
113 response.status_code = 404
114 return response
115
116
117 def get_json_content(path, json_id=None):
118 """Parse all jsons from given path and return as
119 list or return a single parsed json by id ->
120 The json must have a field called id.
121
122 Args:
123 path (string): path to directory like 'static/project_pages_content/'
124 json_id (string, optional): ID value that must match the value of json[id].
125 Defaults to None.
126
127 Returns:
128 list[object]: List of all deserialized json files in path
129 or
130 object: single json python object
131 """
132
133 if path is not None:
134 all_jsons = []
135 for _json in os.listdir(path=path):
136 with open(os.path.join(path, _json), "r", encoding="utf-8") as json_content:
137 content = json.load(json_content)
138 all_jsons.append(content)
139
140 if json_id is None:
141 return all_jsons
142 else:
143 content_by_id = [
144 i for i in all_jsons if json_id == i["id"] and "template" != i["id"]
145 ]
146 return content_by_id[0]
147 # TODO: catch the exception if path is none
148 else:
149 return {
150 "error": "Path cant be None. Please provide the path to '/static/project_detail_pages_content/' . You can create a new Project by adding an JSON file like the '/static/project_detail_pages_content/PROJECT_TEMPLATE.json'." # noqa
151 }
152
153
154 class AboutPage(View):
155 # docstring
156 projects_content_static = "project_detail_pages_content"
157 projects_content_path = os.path.join(sec.STATIC_ROOT, projects_content_static)
158
159 def get(self, request, projects_content_path=projects_content_path):
160 projects = get_json_content(path=projects_content_path)
161
162 return render(request, "base/about.html", {"projects": projects})
163
164
165 class AboutProjectDetail(AboutPage):
166 # docstring
167
168 def get(self, request, project_id):
169 project = get_json_content(path=self.projects_content_path, json_id=project_id)
170
171 return render(request, "base/project-detail.html", {"project": project})
172
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/base/views.py b/base/views.py
--- a/base/views.py
+++ b/base/views.py
@@ -1,6 +1,7 @@
import json
import os
import re
+from functools import lru_cache
import markdown2
from django.core.mail import send_mail
@@ -21,11 +22,24 @@
SITE_ROOT = os.path.dirname(os.path.realpath(__file__))
-class Welcome(View):
- def get(self, request):
- os.path.dirname(os.path.realpath(__file__))
- version_expr = r"^(?P<major>\d+)\.(?P<minor>\d+)+\.(?P<patch>\d+)$"
- markdowner = markdown2.Markdown()
+@lru_cache(maxsize=None)
+def read_version_changes():
+ """read version and changes from changelog markdown
+
+ We use cache so it can stay in process memory and we dont have to read files
+ in every request. this only changes on a new release anyway, in which case
+ the process is restarted.
+
+ Returns:
+ dict: {"version": (major, minor, patch), "changes": changes}
+ """
+ os.path.dirname(os.path.realpath(__file__))
+ version_expr = r"^(?P<major>\d+)\.(?P<minor>\d+)+\.(?P<patch>\d+)$"
+ markdowner = markdown2.Markdown()
+ import logging
+
+ logging.error("READING")
+ try:
with open(os.path.join(SITE_ROOT, "..", "VERSION")) as version_file:
match = re.match(version_expr, version_file.read())
major, minor, patch = match.groups()
@@ -39,11 +53,16 @@
changes = markdowner.convert(
"\n".join(line for line in change_file.readlines())
)
- return render(
- request,
- "base/index.html",
- {"version": "%s.%s.%s" % (major, minor, patch), "changes": changes},
- )
+ except Exception:
+ # probably because change_file is missing
+ major, minor, patch, changes = "", "", "", ""
+ return {"version": (major, minor, patch), "changes": changes}
+
+
+class Welcome(View):
+ def get(self, request):
+ context = read_version_changes()
+ return render(request, "base/index.html", context)
def get_logs(request):
|
{"golden_diff": "diff --git a/base/views.py b/base/views.py\n--- a/base/views.py\n+++ b/base/views.py\n@@ -1,6 +1,7 @@\n import json\n import os\n import re\n+from functools import lru_cache\n \n import markdown2\n from django.core.mail import send_mail\n@@ -21,11 +22,24 @@\n SITE_ROOT = os.path.dirname(os.path.realpath(__file__))\n \n \n-class Welcome(View):\n- def get(self, request):\n- os.path.dirname(os.path.realpath(__file__))\n- version_expr = r\"^(?P<major>\\d+)\\.(?P<minor>\\d+)+\\.(?P<patch>\\d+)$\"\n- markdowner = markdown2.Markdown()\n+@lru_cache(maxsize=None)\n+def read_version_changes():\n+ \"\"\"read version and changes from changelog markdown\n+\n+ We use cache so it can stay in process memory and we dont have to read files\n+ in every request. this only changes on a new release anyway, in which case\n+ the process is restarted.\n+\n+ Returns:\n+ dict: {\"version\": (major, minor, patch), \"changes\": changes}\n+ \"\"\"\n+ os.path.dirname(os.path.realpath(__file__))\n+ version_expr = r\"^(?P<major>\\d+)\\.(?P<minor>\\d+)+\\.(?P<patch>\\d+)$\"\n+ markdowner = markdown2.Markdown()\n+ import logging\n+\n+ logging.error(\"READING\")\n+ try:\n with open(os.path.join(SITE_ROOT, \"..\", \"VERSION\")) as version_file:\n match = re.match(version_expr, version_file.read())\n major, minor, patch = match.groups()\n@@ -39,11 +53,16 @@\n changes = markdowner.convert(\n \"\\n\".join(line for line in change_file.readlines())\n )\n- return render(\n- request,\n- \"base/index.html\",\n- {\"version\": \"%s.%s.%s\" % (major, minor, patch), \"changes\": changes},\n- )\n+ except Exception:\n+ # probably because change_file is missing\n+ major, minor, patch, changes = \"\", \"\", \"\", \"\"\n+ return {\"version\": (major, minor, patch), \"changes\": changes}\n+\n+\n+class Welcome(View):\n+ def get(self, request):\n+ context = read_version_changes()\n+ return render(request, \"base/index.html\", context)\n \n \n def get_logs(request):\n", "issue": "Improve reading of version file in Welcome View\n* use caching because file is read every time user requests main page\r\n* wrap in try-except (missing release notice should not break the main page)\r\n\r\n\r\nFYI: @jh-RLI \n", "before_files": [{"content": "import json\nimport os\nimport re\n\nimport markdown2\nfrom django.core.mail import send_mail\nfrom django.shortcuts import render\nfrom django.views.generic import View\n\ntry:\n import oeplatform.securitysettings as sec\nexcept Exception:\n import logging\n\n logging.error(\"No securitysettings found. 
Triggerd in base/views.py\")\n\nfrom base.forms import ContactForm\n\n# Create your views here.\n\nSITE_ROOT = os.path.dirname(os.path.realpath(__file__))\n\n\nclass Welcome(View):\n def get(self, request):\n os.path.dirname(os.path.realpath(__file__))\n version_expr = r\"^(?P<major>\\d+)\\.(?P<minor>\\d+)+\\.(?P<patch>\\d+)$\"\n markdowner = markdown2.Markdown()\n with open(os.path.join(SITE_ROOT, \"..\", \"VERSION\")) as version_file:\n match = re.match(version_expr, version_file.read())\n major, minor, patch = match.groups()\n with open(\n os.path.join(\n SITE_ROOT,\n \"..\",\n \"versions/changelogs/%s_%s_%s.md\" % (major, minor, patch),\n )\n ) as change_file:\n changes = markdowner.convert(\n \"\\n\".join(line for line in change_file.readlines())\n )\n return render(\n request,\n \"base/index.html\",\n {\"version\": \"%s.%s.%s\" % (major, minor, patch), \"changes\": changes},\n )\n\n\ndef get_logs(request):\n version_expr = r\"^(?P<major>\\d+)_(?P<major>\\d+)+_(?P<major>\\d+)\\.md$\"\n logs = {}\n for file in os.listdir(\"../versions/changelogs\"):\n match = re.match(version_expr, file)\n markdowner = markdown2.Markdown()\n if match:\n major, minor, patch = match.groups()\n with open(\"versions/changelogs\" + file) as f:\n logs[(major, minor, patch)] = markdowner.convert(\n \"\\n\".join(line for line in f.readlines())\n )\n return logs\n\n\ndef redir(request, target):\n return render(request, \"base/{target}.html\".format(target=target), {})\n\n\nclass ContactView(View):\n error_css_class = \"error\"\n required_css_class = \"required\"\n\n def post(self, request):\n form = ContactForm(data=request.POST)\n if form.is_valid():\n receps = sec.CONTACT_ADDRESSES.get(\n request.POST[\"contact_category\"], \"technical\"\n )\n send_mail(\n request.POST.get(\"contact_topic\"),\n f\"{request.POST.get('contact_name')} \"\n + f\"({request.POST.get('contact_email')}) wrote: \\n\"\n + request.POST.get(\"content\"),\n sec.DEFAULT_FROM_EMAIL,\n receps,\n fail_silently=False,\n )\n return render(\n request, \"base/contact.html\", {\"form\": ContactForm(), \"success\": True}\n )\n else:\n return render(\n request, \"base/contact.html\", {\"form\": form, \"success\": False}\n )\n\n def get(self, request):\n return render(\n request, \"base/contact.html\", {\"form\": ContactForm(), \"success\": False}\n )\n\n\ndef robot(request):\n return render(request, \"base/robots.txt\", {}, content_type=\"text/plain\")\n\n\ndef handler500(request):\n response = render(request, \"base/500.html\", {})\n response.status_code = 500\n return response\n\n\ndef handler404(request, exception):\n response = render(request, \"base/404.html\", {})\n response.status_code = 404\n return response\n\n\ndef get_json_content(path, json_id=None):\n \"\"\"Parse all jsons from given path and return as\n list or return a single parsed json by id ->\n The json must have a field called id.\n\n Args:\n path (string): path to directory like 'static/project_pages_content/'\n json_id (string, optional): ID value that must match the value of json[id].\n Defaults to None.\n\n Returns:\n list[object]: List of all deserialized json files in path\n or\n object: single json python object\n \"\"\"\n\n if path is not None:\n all_jsons = []\n for _json in os.listdir(path=path):\n with open(os.path.join(path, _json), \"r\", encoding=\"utf-8\") as json_content:\n content = json.load(json_content)\n all_jsons.append(content)\n\n if json_id is None:\n return all_jsons\n else:\n content_by_id = [\n i for i in all_jsons if json_id == i[\"id\"] and 
\"template\" != i[\"id\"]\n ]\n return content_by_id[0]\n # TODO: catch the exception if path is none\n else:\n return {\n \"error\": \"Path cant be None. Please provide the path to '/static/project_detail_pages_content/' . You can create a new Project by adding an JSON file like the '/static/project_detail_pages_content/PROJECT_TEMPLATE.json'.\" # noqa\n }\n\n\nclass AboutPage(View):\n # docstring\n projects_content_static = \"project_detail_pages_content\"\n projects_content_path = os.path.join(sec.STATIC_ROOT, projects_content_static)\n\n def get(self, request, projects_content_path=projects_content_path):\n projects = get_json_content(path=projects_content_path)\n\n return render(request, \"base/about.html\", {\"projects\": projects})\n\n\nclass AboutProjectDetail(AboutPage):\n # docstring\n\n def get(self, request, project_id):\n project = get_json_content(path=self.projects_content_path, json_id=project_id)\n\n return render(request, \"base/project-detail.html\", {\"project\": project})\n", "path": "base/views.py"}], "after_files": [{"content": "import json\nimport os\nimport re\nfrom functools import lru_cache\n\nimport markdown2\nfrom django.core.mail import send_mail\nfrom django.shortcuts import render\nfrom django.views.generic import View\n\ntry:\n import oeplatform.securitysettings as sec\nexcept Exception:\n import logging\n\n logging.error(\"No securitysettings found. Triggerd in base/views.py\")\n\nfrom base.forms import ContactForm\n\n# Create your views here.\n\nSITE_ROOT = os.path.dirname(os.path.realpath(__file__))\n\n\n@lru_cache(maxsize=None)\ndef read_version_changes():\n \"\"\"read version and changes from changelog markdown\n\n We use cache so it can stay in process memory and we dont have to read files\n in every request. this only changes on a new release anyway, in which case\n the process is restarted.\n\n Returns:\n dict: {\"version\": (major, minor, patch), \"changes\": changes}\n \"\"\"\n os.path.dirname(os.path.realpath(__file__))\n version_expr = r\"^(?P<major>\\d+)\\.(?P<minor>\\d+)+\\.(?P<patch>\\d+)$\"\n markdowner = markdown2.Markdown()\n import logging\n\n logging.error(\"READING\")\n try:\n with open(os.path.join(SITE_ROOT, \"..\", \"VERSION\")) as version_file:\n match = re.match(version_expr, version_file.read())\n major, minor, patch = match.groups()\n with open(\n os.path.join(\n SITE_ROOT,\n \"..\",\n \"versions/changelogs/%s_%s_%s.md\" % (major, minor, patch),\n )\n ) as change_file:\n changes = markdowner.convert(\n \"\\n\".join(line for line in change_file.readlines())\n )\n except Exception:\n # probably because change_file is missing\n major, minor, patch, changes = \"\", \"\", \"\", \"\"\n return {\"version\": (major, minor, patch), \"changes\": changes}\n\n\nclass Welcome(View):\n def get(self, request):\n context = read_version_changes()\n return render(request, \"base/index.html\", context)\n\n\ndef get_logs(request):\n version_expr = r\"^(?P<major>\\d+)_(?P<major>\\d+)+_(?P<major>\\d+)\\.md$\"\n logs = {}\n for file in os.listdir(\"../versions/changelogs\"):\n match = re.match(version_expr, file)\n markdowner = markdown2.Markdown()\n if match:\n major, minor, patch = match.groups()\n with open(\"versions/changelogs\" + file) as f:\n logs[(major, minor, patch)] = markdowner.convert(\n \"\\n\".join(line for line in f.readlines())\n )\n return logs\n\n\ndef redir(request, target):\n return render(request, \"base/{target}.html\".format(target=target), {})\n\n\nclass ContactView(View):\n error_css_class = \"error\"\n required_css_class = 
\"required\"\n\n def post(self, request):\n form = ContactForm(data=request.POST)\n if form.is_valid():\n receps = sec.CONTACT_ADDRESSES.get(\n request.POST[\"contact_category\"], \"technical\"\n )\n send_mail(\n request.POST.get(\"contact_topic\"),\n f\"{request.POST.get('contact_name')} \"\n + f\"({request.POST.get('contact_email')}) wrote: \\n\"\n + request.POST.get(\"content\"),\n sec.DEFAULT_FROM_EMAIL,\n receps,\n fail_silently=False,\n )\n return render(\n request, \"base/contact.html\", {\"form\": ContactForm(), \"success\": True}\n )\n else:\n return render(\n request, \"base/contact.html\", {\"form\": form, \"success\": False}\n )\n\n def get(self, request):\n return render(\n request, \"base/contact.html\", {\"form\": ContactForm(), \"success\": False}\n )\n\n\ndef robot(request):\n return render(request, \"base/robots.txt\", {}, content_type=\"text/plain\")\n\n\ndef handler500(request):\n response = render(request, \"base/500.html\", {})\n response.status_code = 500\n return response\n\n\ndef handler404(request, exception):\n response = render(request, \"base/404.html\", {})\n response.status_code = 404\n return response\n\n\ndef get_json_content(path, json_id=None):\n \"\"\"Parse all jsons from given path and return as\n list or return a single parsed json by id ->\n The json must have a field called id.\n\n Args:\n path (string): path to directory like 'static/project_pages_content/'\n json_id (string, optional): ID value that must match the value of json[id].\n Defaults to None.\n\n Returns:\n list[object]: List of all deserialized json files in path\n or\n object: single json python object\n \"\"\"\n\n if path is not None:\n all_jsons = []\n for _json in os.listdir(path=path):\n with open(os.path.join(path, _json), \"r\", encoding=\"utf-8\") as json_content:\n content = json.load(json_content)\n all_jsons.append(content)\n\n if json_id is None:\n return all_jsons\n else:\n content_by_id = [\n i for i in all_jsons if json_id == i[\"id\"] and \"template\" != i[\"id\"]\n ]\n return content_by_id[0]\n # TODO: catch the exception if path is none\n else:\n return {\n \"error\": \"Path cant be None. Please provide the path to '/static/project_detail_pages_content/' . You can create a new Project by adding an JSON file like the '/static/project_detail_pages_content/PROJECT_TEMPLATE.json'.\" # noqa\n }\n\n\nclass AboutPage(View):\n # docstring\n projects_content_static = \"project_detail_pages_content\"\n projects_content_path = os.path.join(sec.STATIC_ROOT, projects_content_static)\n\n def get(self, request, projects_content_path=projects_content_path):\n projects = get_json_content(path=projects_content_path)\n\n return render(request, \"base/about.html\", {\"projects\": projects})\n\n\nclass AboutProjectDetail(AboutPage):\n # docstring\n\n def get(self, request, project_id):\n project = get_json_content(path=self.projects_content_path, json_id=project_id)\n\n return render(request, \"base/project-detail.html\", {\"project\": project})\n", "path": "base/views.py"}]}
| 1,976 | 542 |
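
Editorial note on the record above: the patch combines two techniques, `functools.lru_cache` so the VERSION/changelog files are read once per process, and a broad `try/except` so a missing changelog degrades to empty strings instead of breaking the main page. A stripped-down sketch follows, with a hypothetical file name and a simplified regex:

```python
# Standalone sketch of the caching pattern in the diff above: parse a
# version file once per process and degrade gracefully when it (or the
# matching changelog) is missing. The file name is a hypothetical stand-in.
import re
from functools import lru_cache


@lru_cache(maxsize=None)
def read_version(path="VERSION"):
    try:
        with open(path) as f:
            major, minor, patch = re.match(
                r"^(\d+)\.(\d+)\.(\d+)$", f.read().strip()
            ).groups()
    except Exception:
        # A missing or malformed file yields empty fields instead of a 500.
        major = minor = patch = ""
    return {"version": (major, minor, patch)}


print(read_version())             # first call reads the file (or falls back)
print(read_version())             # second call is served from memory
print(read_version.cache_info())  # CacheInfo(hits=1, misses=1, ...)
```

The cache stays valid for the life of the process, which matches the diff's rationale: the file only changes on a new release, and a release restarts the process anyway.
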
gh_patches_debug_8670
|
rasdani/github-patches
|
git_diff
|
xonsh__xonsh-1566
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AttributeError: 'BaseShell' object has no attribute 'cmdloop'
My installed xonsh (from [AUR](https://aur.archlinux.org/packages/xonsh/)) suddenly broke?
``` console
$ xonsh
The most exciting phrase to hear in science, the one that heralds new
discoveries, is not "Eureka!" (I found it!) but "That's funny ..."
-- Isaac Asimov
Traceback (most recent call last):
File "/bin/xonsh", line 3, in <module>
main()
File "/usr/lib/python3.5/site-packages/xonsh/__amalgam__.py", line 16531, in main
shell.shell.cmdloop()
AttributeError: 'BaseShell' object has no attribute 'cmdloop'
$ xonsh --version
('xonsh/0.4.4',)
$ XONSH_DEBUG=1 xonsh
Today is the first day of the rest of your lossage.
Traceback (most recent call last):
File "/bin/xonsh", line 3, in <module>
main()
File "/usr/lib/python3.5/site-packages/xonsh/main.py", line 222, in main
shell.shell.cmdloop()
AttributeError: 'BaseShell' object has no attribute 'cmdloop'
```
(Ignore the output of `fortune`.)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `xonsh/shell.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """The xonsh shell"""
3 import os
4 import random
5 import builtins
6 import warnings
7
8 from xonsh.xontribs import update_context
9 from xonsh.environ import xonshrc_context
10 from xonsh.execer import Execer
11 from xonsh.platform import (best_shell_type, has_prompt_toolkit,
12 ptk_version_is_supported)
13 from xonsh.tools import XonshError, to_bool_or_int
14
15
16 class Shell(object):
17 """Main xonsh shell.
18
19 Initializes execution environment and decides if prompt_toolkit or
20 readline version of shell should be used.
21 """
22
23 def __init__(self, ctx=None, shell_type=None, config=None, rc=None,
24 **kwargs):
25 """
26 Parameters
27 ----------
28 ctx : Mapping, optional
29 The execution context for the shell (e.g. the globals namespace).
30 If none, this is computed by loading the rc files. If not None,
31 this no additional context is computed and this is used
32 directly.
33 shell_type : str, optional
34 The shell type to start, such as 'readline', 'prompt_toolkit',
35 or 'random'.
36 config : str, optional
37 Path to configuration file.
38 rc : list of str, optional
39 Sequence of paths to run control files.
40 """
41 self.login = kwargs.get('login', True)
42 self.stype = shell_type
43 self._init_environ(ctx, config, rc,
44 kwargs.get('scriptcache', True),
45 kwargs.get('cacheall', False))
46 env = builtins.__xonsh_env__
47 # pick a valid shell -- if no shell is specified by the user,
48 # shell type is pulled from env
49 if shell_type is None:
50 shell_type = env.get('SHELL_TYPE')
51 if shell_type == 'best' or shell_type is None:
52 shell_type = best_shell_type()
53 elif shell_type == 'random':
54 shell_type = random.choice(('readline', 'prompt_toolkit'))
55 if shell_type == 'prompt_toolkit':
56 if not has_prompt_toolkit():
57 warnings.warn('prompt_toolkit is not available, using '
58 'readline instead.')
59 shell_type = 'readline'
60 elif not ptk_version_is_supported():
61 warnings.warn('prompt-toolkit version < v1.0.0 is not '
62 'supported. Please update prompt-toolkit. Using '
63 'readline instead.')
64 shell_type = 'readline'
65 env['SHELL_TYPE'] = shell_type
66 # actually make the shell
67 if shell_type == 'none':
68 from xonsh.base_shell import BaseShell as shell_class
69 elif shell_type == 'prompt_toolkit':
70 from xonsh.ptk.shell import PromptToolkitShell as shell_class
71 elif shell_type == 'readline':
72 from xonsh.readline_shell import ReadlineShell as shell_class
73 else:
74 raise XonshError('{} is not recognized as a shell type'.format(
75 shell_type))
76 self.shell = shell_class(execer=self.execer,
77 ctx=self.ctx, **kwargs)
78 # allows history garbace colector to start running
79 builtins.__xonsh_history__.gc.wait_for_shell = False
80
81 def __getattr__(self, attr):
82 """Delegates calls to appropriate shell instance."""
83 return getattr(self.shell, attr)
84
85 def _init_environ(self, ctx, config, rc, scriptcache, cacheall):
86 self.ctx = {} if ctx is None else ctx
87 debug = to_bool_or_int(os.getenv('XONSH_DEBUG', '0'))
88 self.execer = Execer(config=config, login=self.login, xonsh_ctx=self.ctx,
89 debug_level=debug)
90 self.execer.scriptcache = scriptcache
91 self.execer.cacheall = cacheall
92 if self.stype != 'none' or self.login:
93 # load xontribs from config file
94 names = builtins.__xonsh_config__.get('xontribs', ())
95 for name in names:
96 update_context(name, ctx=self.ctx)
97 # load run control files
98 env = builtins.__xonsh_env__
99 rc = env.get('XONSHRC') if rc is None else rc
100 self.ctx.update(xonshrc_context(rcfiles=rc, execer=self.execer, initial=self.ctx))
101 self.ctx['__name__'] = '__main__'
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/xonsh/shell.py b/xonsh/shell.py
--- a/xonsh/shell.py
+++ b/xonsh/shell.py
@@ -48,6 +48,10 @@
# shell type is pulled from env
if shell_type is None:
shell_type = env.get('SHELL_TYPE')
+ if shell_type == 'none':
+ # This bricks interactive xonsh
+ # Can happen from the use of .xinitrc, .xsession, etc
+ shell_type = 'best'
if shell_type == 'best' or shell_type is None:
shell_type = best_shell_type()
elif shell_type == 'random':
|
{"golden_diff": "diff --git a/xonsh/shell.py b/xonsh/shell.py\n--- a/xonsh/shell.py\n+++ b/xonsh/shell.py\n@@ -48,6 +48,10 @@\n # shell type is pulled from env\n if shell_type is None:\n shell_type = env.get('SHELL_TYPE')\n+ if shell_type == 'none':\n+ # This bricks interactive xonsh\n+ # Can happen from the use of .xinitrc, .xsession, etc\n+ shell_type = 'best'\n if shell_type == 'best' or shell_type is None:\n shell_type = best_shell_type()\n elif shell_type == 'random':\n", "issue": "AttributeError: 'BaseShell' object has no attribute 'cmdloop'\nMy installed xonsh (from [AUR](https://aur.archlinux.org/packages/xonsh/)) suddenly broke?\n\n``` console\n$ xonsh\nThe most exciting phrase to hear in science, the one that heralds new\ndiscoveries, is not \"Eureka!\" (I found it!) but \"That's funny ...\"\n -- Isaac Asimov\nTraceback (most recent call last):\n File \"/bin/xonsh\", line 3, in <module>\n main()\n File \"/usr/lib/python3.5/site-packages/xonsh/__amalgam__.py\", line 16531, in main\n shell.shell.cmdloop()\nAttributeError: 'BaseShell' object has no attribute 'cmdloop'\n$ xonsh --version\n('xonsh/0.4.4',)\n$ XONSH_DEBUG=1 xonsh\nToday is the first day of the rest of your lossage.\nTraceback (most recent call last):\n File \"/bin/xonsh\", line 3, in <module>\n main()\n File \"/usr/lib/python3.5/site-packages/xonsh/main.py\", line 222, in main\n shell.shell.cmdloop()\nAttributeError: 'BaseShell' object has no attribute 'cmdloop'\n```\n\n(Ignore the output of `fortune`.)\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"The xonsh shell\"\"\"\nimport os\nimport random\nimport builtins\nimport warnings\n\nfrom xonsh.xontribs import update_context\nfrom xonsh.environ import xonshrc_context\nfrom xonsh.execer import Execer\nfrom xonsh.platform import (best_shell_type, has_prompt_toolkit,\n ptk_version_is_supported)\nfrom xonsh.tools import XonshError, to_bool_or_int\n\n\nclass Shell(object):\n \"\"\"Main xonsh shell.\n\n Initializes execution environment and decides if prompt_toolkit or\n readline version of shell should be used.\n \"\"\"\n\n def __init__(self, ctx=None, shell_type=None, config=None, rc=None,\n **kwargs):\n \"\"\"\n Parameters\n ----------\n ctx : Mapping, optional\n The execution context for the shell (e.g. the globals namespace).\n If none, this is computed by loading the rc files. If not None,\n this no additional context is computed and this is used\n directly.\n shell_type : str, optional\n The shell type to start, such as 'readline', 'prompt_toolkit',\n or 'random'.\n config : str, optional\n Path to configuration file.\n rc : list of str, optional\n Sequence of paths to run control files.\n \"\"\"\n self.login = kwargs.get('login', True)\n self.stype = shell_type\n self._init_environ(ctx, config, rc,\n kwargs.get('scriptcache', True),\n kwargs.get('cacheall', False))\n env = builtins.__xonsh_env__\n # pick a valid shell -- if no shell is specified by the user,\n # shell type is pulled from env\n if shell_type is None:\n shell_type = env.get('SHELL_TYPE')\n if shell_type == 'best' or shell_type is None:\n shell_type = best_shell_type()\n elif shell_type == 'random':\n shell_type = random.choice(('readline', 'prompt_toolkit'))\n if shell_type == 'prompt_toolkit':\n if not has_prompt_toolkit():\n warnings.warn('prompt_toolkit is not available, using '\n 'readline instead.')\n shell_type = 'readline'\n elif not ptk_version_is_supported():\n warnings.warn('prompt-toolkit version < v1.0.0 is not '\n 'supported. Please update prompt-toolkit. 
Using '\n 'readline instead.')\n shell_type = 'readline'\n env['SHELL_TYPE'] = shell_type\n # actually make the shell\n if shell_type == 'none':\n from xonsh.base_shell import BaseShell as shell_class\n elif shell_type == 'prompt_toolkit':\n from xonsh.ptk.shell import PromptToolkitShell as shell_class\n elif shell_type == 'readline':\n from xonsh.readline_shell import ReadlineShell as shell_class\n else:\n raise XonshError('{} is not recognized as a shell type'.format(\n shell_type))\n self.shell = shell_class(execer=self.execer,\n ctx=self.ctx, **kwargs)\n # allows history garbace colector to start running\n builtins.__xonsh_history__.gc.wait_for_shell = False\n\n def __getattr__(self, attr):\n \"\"\"Delegates calls to appropriate shell instance.\"\"\"\n return getattr(self.shell, attr)\n\n def _init_environ(self, ctx, config, rc, scriptcache, cacheall):\n self.ctx = {} if ctx is None else ctx\n debug = to_bool_or_int(os.getenv('XONSH_DEBUG', '0'))\n self.execer = Execer(config=config, login=self.login, xonsh_ctx=self.ctx,\n debug_level=debug)\n self.execer.scriptcache = scriptcache\n self.execer.cacheall = cacheall\n if self.stype != 'none' or self.login:\n # load xontribs from config file\n names = builtins.__xonsh_config__.get('xontribs', ())\n for name in names:\n update_context(name, ctx=self.ctx)\n # load run control files\n env = builtins.__xonsh_env__\n rc = env.get('XONSHRC') if rc is None else rc\n self.ctx.update(xonshrc_context(rcfiles=rc, execer=self.execer, initial=self.ctx))\n self.ctx['__name__'] = '__main__'\n", "path": "xonsh/shell.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"The xonsh shell\"\"\"\nimport os\nimport random\nimport builtins\nimport warnings\n\nfrom xonsh.xontribs import update_context\nfrom xonsh.environ import xonshrc_context\nfrom xonsh.execer import Execer\nfrom xonsh.platform import (best_shell_type, has_prompt_toolkit,\n ptk_version_is_supported)\nfrom xonsh.tools import XonshError, to_bool_or_int\n\n\nclass Shell(object):\n \"\"\"Main xonsh shell.\n\n Initializes execution environment and decides if prompt_toolkit or\n readline version of shell should be used.\n \"\"\"\n\n def __init__(self, ctx=None, shell_type=None, config=None, rc=None,\n **kwargs):\n \"\"\"\n Parameters\n ----------\n ctx : Mapping, optional\n The execution context for the shell (e.g. the globals namespace).\n If none, this is computed by loading the rc files. 
If not None,\n this no additional context is computed and this is used\n directly.\n shell_type : str, optional\n The shell type to start, such as 'readline', 'prompt_toolkit',\n or 'random'.\n config : str, optional\n Path to configuration file.\n rc : list of str, optional\n Sequence of paths to run control files.\n \"\"\"\n self.login = kwargs.get('login', True)\n self.stype = shell_type\n self._init_environ(ctx, config, rc,\n kwargs.get('scriptcache', True),\n kwargs.get('cacheall', False))\n env = builtins.__xonsh_env__\n # pick a valid shell -- if no shell is specified by the user,\n # shell type is pulled from env\n if shell_type is None:\n shell_type = env.get('SHELL_TYPE')\n if shell_type == 'none':\n # This bricks interactive xonsh\n # Can happen from the use of .xinitrc, .xsession, etc\n shell_type = 'best'\n if shell_type == 'best' or shell_type is None:\n shell_type = best_shell_type()\n elif shell_type == 'random':\n shell_type = random.choice(('readline', 'prompt_toolkit'))\n if shell_type == 'prompt_toolkit':\n if not has_prompt_toolkit():\n warnings.warn('prompt_toolkit is not available, using '\n 'readline instead.')\n shell_type = 'readline'\n elif not ptk_version_is_supported():\n warnings.warn('prompt-toolkit version < v1.0.0 is not '\n 'supported. Please update prompt-toolkit. Using '\n 'readline instead.')\n shell_type = 'readline'\n env['SHELL_TYPE'] = shell_type\n # actually make the shell\n if shell_type == 'none':\n from xonsh.base_shell import BaseShell as shell_class\n elif shell_type == 'prompt_toolkit':\n from xonsh.ptk.shell import PromptToolkitShell as shell_class\n elif shell_type == 'readline':\n from xonsh.readline_shell import ReadlineShell as shell_class\n else:\n raise XonshError('{} is not recognized as a shell type'.format(\n shell_type))\n self.shell = shell_class(execer=self.execer,\n ctx=self.ctx, **kwargs)\n # allows history garbace colector to start running\n builtins.__xonsh_history__.gc.wait_for_shell = False\n\n def __getattr__(self, attr):\n \"\"\"Delegates calls to appropriate shell instance.\"\"\"\n return getattr(self.shell, attr)\n\n def _init_environ(self, ctx, config, rc, scriptcache, cacheall):\n self.ctx = {} if ctx is None else ctx\n debug = to_bool_or_int(os.getenv('XONSH_DEBUG', '0'))\n self.execer = Execer(config=config, login=self.login, xonsh_ctx=self.ctx,\n debug_level=debug)\n self.execer.scriptcache = scriptcache\n self.execer.cacheall = cacheall\n if self.stype != 'none' or self.login:\n # load xontribs from config file\n names = builtins.__xonsh_config__.get('xontribs', ())\n for name in names:\n update_context(name, ctx=self.ctx)\n # load run control files\n env = builtins.__xonsh_env__\n rc = env.get('XONSHRC') if rc is None else rc\n self.ctx.update(xonshrc_context(rcfiles=rc, execer=self.execer, initial=self.ctx))\n self.ctx['__name__'] = '__main__'\n", "path": "xonsh/shell.py"}]}
| 1,705 | 153 |
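
Editorial note on the record above: the root cause is a persisted `SHELL_TYPE` of `'none'` (for example written by a non-interactive `.xinitrc`/`.xsession` run) selecting `BaseShell`, which has no `cmdloop()`. The one-line fix coerces `'none'` back to `'best'` before dispatch. Below is a simplified, self-contained resolver sketch; `best_shell_type()` is stubbed as `'readline'` here:

```python
# Simplified, self-contained version of the guard the diff adds: a
# SHELL_TYPE persisted as "none" would pick BaseShell, which has no
# cmdloop(), so it is coerced back to "best" before the shell class
# is chosen.
import random


def resolve_shell_type(requested, env_value="best"):
    shell_type = requested if requested is not None else env_value
    if shell_type == "none":
        # "none" only makes sense for scripted use; interactively it
        # bricks the REPL, so fall through to autodetection instead.
        shell_type = "best"
    if shell_type == "best":
        shell_type = "readline"  # stand-in for best_shell_type()
    elif shell_type == "random":
        shell_type = random.choice(("readline", "prompt_toolkit"))
    return shell_type


# The failing case from the traceback: env carries "none", the caller
# requests nothing in particular, and we still get a usable shell.
assert resolve_shell_type(None, env_value="none") == "readline"
```
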
gh_patches_debug_31353
|
rasdani/github-patches
|
git_diff
|
CTFd__CTFd-1832
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Don't require teams for viewing challenges if challenges are public
Looks like if challenges are set to be public but teams are required then challenges can't be seen. This requirement needs to be relaxed a bit.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `CTFd/challenges.py`
Content:
```
1 from flask import Blueprint, render_template
2
3 from CTFd.utils import config
4 from CTFd.utils.dates import ctf_ended, ctf_paused, ctf_started
5 from CTFd.utils.decorators import (
6 during_ctf_time_only,
7 require_team,
8 require_verified_emails,
9 )
10 from CTFd.utils.decorators.visibility import check_challenge_visibility
11 from CTFd.utils.helpers import get_errors, get_infos
12
13 challenges = Blueprint("challenges", __name__)
14
15
16 @challenges.route("/challenges", methods=["GET"])
17 @during_ctf_time_only
18 @require_verified_emails
19 @check_challenge_visibility
20 @require_team
21 def listing():
22 infos = get_infos()
23 errors = get_errors()
24
25 if ctf_started() is False:
26 errors.append(f"{config.ctf_name()} has not started yet")
27
28 if ctf_paused() is True:
29 infos.append(f"{config.ctf_name()} is paused")
30
31 if ctf_ended() is True:
32 infos.append(f"{config.ctf_name()} has ended")
33
34 return render_template("challenges.html", infos=infos, errors=errors)
35
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/CTFd/challenges.py b/CTFd/challenges.py
--- a/CTFd/challenges.py
+++ b/CTFd/challenges.py
@@ -1,14 +1,12 @@
-from flask import Blueprint, render_template
+from flask import Blueprint, redirect, render_template, request, url_for
-from CTFd.utils import config
+from CTFd.constants.config import ChallengeVisibilityTypes, Configs
+from CTFd.utils.config import is_teams_mode
from CTFd.utils.dates import ctf_ended, ctf_paused, ctf_started
-from CTFd.utils.decorators import (
- during_ctf_time_only,
- require_team,
- require_verified_emails,
-)
+from CTFd.utils.decorators import during_ctf_time_only, require_verified_emails
from CTFd.utils.decorators.visibility import check_challenge_visibility
from CTFd.utils.helpers import get_errors, get_infos
+from CTFd.utils.user import authed, get_current_team
challenges = Blueprint("challenges", __name__)
@@ -17,18 +15,26 @@
@during_ctf_time_only
@require_verified_emails
@check_challenge_visibility
-@require_team
def listing():
+ if (
+ Configs.challenge_visibility == ChallengeVisibilityTypes.PUBLIC
+ and authed() is False
+ ):
+ pass
+ else:
+ if is_teams_mode() and get_current_team() is None:
+ return redirect(url_for("teams.private", next=request.full_path))
+
infos = get_infos()
errors = get_errors()
if ctf_started() is False:
- errors.append(f"{config.ctf_name()} has not started yet")
+ errors.append(f"{Configs.ctf_name} has not started yet")
if ctf_paused() is True:
- infos.append(f"{config.ctf_name()} is paused")
+ infos.append(f"{Configs.ctf_name} is paused")
if ctf_ended() is True:
- infos.append(f"{config.ctf_name()} has ended")
+ infos.append(f"{Configs.ctf_name} has ended")
return render_template("challenges.html", infos=infos, errors=errors)
|
{"golden_diff": "diff --git a/CTFd/challenges.py b/CTFd/challenges.py\n--- a/CTFd/challenges.py\n+++ b/CTFd/challenges.py\n@@ -1,14 +1,12 @@\n-from flask import Blueprint, render_template\n+from flask import Blueprint, redirect, render_template, request, url_for\n \n-from CTFd.utils import config\n+from CTFd.constants.config import ChallengeVisibilityTypes, Configs\n+from CTFd.utils.config import is_teams_mode\n from CTFd.utils.dates import ctf_ended, ctf_paused, ctf_started\n-from CTFd.utils.decorators import (\n- during_ctf_time_only,\n- require_team,\n- require_verified_emails,\n-)\n+from CTFd.utils.decorators import during_ctf_time_only, require_verified_emails\n from CTFd.utils.decorators.visibility import check_challenge_visibility\n from CTFd.utils.helpers import get_errors, get_infos\n+from CTFd.utils.user import authed, get_current_team\n \n challenges = Blueprint(\"challenges\", __name__)\n \n@@ -17,18 +15,26 @@\n @during_ctf_time_only\n @require_verified_emails\n @check_challenge_visibility\n-@require_team\n def listing():\n+ if (\n+ Configs.challenge_visibility == ChallengeVisibilityTypes.PUBLIC\n+ and authed() is False\n+ ):\n+ pass\n+ else:\n+ if is_teams_mode() and get_current_team() is None:\n+ return redirect(url_for(\"teams.private\", next=request.full_path))\n+\n infos = get_infos()\n errors = get_errors()\n \n if ctf_started() is False:\n- errors.append(f\"{config.ctf_name()} has not started yet\")\n+ errors.append(f\"{Configs.ctf_name} has not started yet\")\n \n if ctf_paused() is True:\n- infos.append(f\"{config.ctf_name()} is paused\")\n+ infos.append(f\"{Configs.ctf_name} is paused\")\n \n if ctf_ended() is True:\n- infos.append(f\"{config.ctf_name()} has ended\")\n+ infos.append(f\"{Configs.ctf_name} has ended\")\n \n return render_template(\"challenges.html\", infos=infos, errors=errors)\n", "issue": "Don't require teams for viewing challenges if challenges are public\nLooks like if challenges are set to be public but teams are required then challenges can't be seen. This requirement needs to be relaxed a bit. 
\n", "before_files": [{"content": "from flask import Blueprint, render_template\n\nfrom CTFd.utils import config\nfrom CTFd.utils.dates import ctf_ended, ctf_paused, ctf_started\nfrom CTFd.utils.decorators import (\n during_ctf_time_only,\n require_team,\n require_verified_emails,\n)\nfrom CTFd.utils.decorators.visibility import check_challenge_visibility\nfrom CTFd.utils.helpers import get_errors, get_infos\n\nchallenges = Blueprint(\"challenges\", __name__)\n\n\[email protected](\"/challenges\", methods=[\"GET\"])\n@during_ctf_time_only\n@require_verified_emails\n@check_challenge_visibility\n@require_team\ndef listing():\n infos = get_infos()\n errors = get_errors()\n\n if ctf_started() is False:\n errors.append(f\"{config.ctf_name()} has not started yet\")\n\n if ctf_paused() is True:\n infos.append(f\"{config.ctf_name()} is paused\")\n\n if ctf_ended() is True:\n infos.append(f\"{config.ctf_name()} has ended\")\n\n return render_template(\"challenges.html\", infos=infos, errors=errors)\n", "path": "CTFd/challenges.py"}], "after_files": [{"content": "from flask import Blueprint, redirect, render_template, request, url_for\n\nfrom CTFd.constants.config import ChallengeVisibilityTypes, Configs\nfrom CTFd.utils.config import is_teams_mode\nfrom CTFd.utils.dates import ctf_ended, ctf_paused, ctf_started\nfrom CTFd.utils.decorators import during_ctf_time_only, require_verified_emails\nfrom CTFd.utils.decorators.visibility import check_challenge_visibility\nfrom CTFd.utils.helpers import get_errors, get_infos\nfrom CTFd.utils.user import authed, get_current_team\n\nchallenges = Blueprint(\"challenges\", __name__)\n\n\[email protected](\"/challenges\", methods=[\"GET\"])\n@during_ctf_time_only\n@require_verified_emails\n@check_challenge_visibility\ndef listing():\n if (\n Configs.challenge_visibility == ChallengeVisibilityTypes.PUBLIC\n and authed() is False\n ):\n pass\n else:\n if is_teams_mode() and get_current_team() is None:\n return redirect(url_for(\"teams.private\", next=request.full_path))\n\n infos = get_infos()\n errors = get_errors()\n\n if ctf_started() is False:\n errors.append(f\"{Configs.ctf_name} has not started yet\")\n\n if ctf_paused() is True:\n infos.append(f\"{Configs.ctf_name} is paused\")\n\n if ctf_ended() is True:\n infos.append(f\"{Configs.ctf_name} has ended\")\n\n return render_template(\"challenges.html\", infos=infos, errors=errors)\n", "path": "CTFd/challenges.py"}]}
| 603 | 485 |
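
Editorial note on the record above: the fix moves access control from the blanket `@require_team` decorator into the view body, so anonymous users pass when challenge visibility is public, while teams-mode users without a team are redirected to team setup with the original path preserved in `next`. A minimal Flask sketch of that shape; `authed`, `is_teams_mode`, and `get_current_team` are stubs standing in for CTFd's real helpers:

```python
# Minimal Flask sketch of the in-view guard that replaces @require_team.
# authed(), is_teams_mode(), and get_current_team() are stubs standing
# in for CTFd's helpers; visibility is a plain string here.
from flask import Flask, redirect, request, url_for

app = Flask(__name__)

CHALLENGE_VISIBILITY = "public"


def authed(): return False            # hypothetical stubs for the sketch
def is_teams_mode(): return True
def get_current_team(): return None


@app.route("/challenges")
def listing():
    if CHALLENGE_VISIBILITY == "public" and authed() is False:
        pass  # anonymous browsing allowed, skip the team check
    elif is_teams_mode() and get_current_team() is None:
        # Send the user to team setup, preserving where they came from.
        return redirect(url_for("team_setup", next=request.full_path))
    return "challenges page"


@app.route("/team")
def team_setup():
    return f"join a team, then return to {request.args.get('next')}"
```
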
gh_patches_debug_369
|
rasdani/github-patches
|
git_diff
|
kserve__kserve-2478
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update ModelMesh version to `v0.10.0`
#### Proposed Changes
Update to the latest ModelMesh version (`v0.10.0`)
> If your changes should also be in the most recent release, add the corresponding "cherrypick-0.X"
label to the original PR; for example, "cherrypick-0.10".
#### TODOs:
> - [ ] Best practice is to open a PR for the cherry-pick yourself after your original PR has been merged
into the main branch.
> - [ ] After the cherry-pick PR has merged, remove the cherry-pick label from the original PR.
**Type of changes**
- [x] This change requires a documentation update --> https://github.com/kserve/website/pull/214
**Special notes for your reviewer**:
1. Please confirm that if this PR changes any image versions, then that's the sole change this PR makes.
YES
**Checklist**:
- [x] Have you made corresponding changes to the documentation?
**Release note**:
<!-- Write your release note:
1. Enter your extended release note in the below block. If the PR requires additional action from users switching to the new release, include the string "action required".
3. If no release note is required, just write "NONE".
-->
```release-note
???
```
Not sure, maybe,
- "Updgrade to ModelMesh v0.10.0"
or
- point to ModelMesh release notes here
https://github.com/kserve/modelmesh-serving/releases/tag/v0.10.0
---
/cc @yuzisun @rachitchauhan43 @njhill
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/kserve/setup.py`
Content:
```
1 # Copyright 2021 The KServe Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import pathlib
15
16 import setuptools
17
18 TESTS_REQUIRES = [
19 'pytest',
20 'pytest-xdist',
21 'pytest-cov',
22 'pytest-asyncio',
23 'pytest-tornasync',
24 'mypy'
25 ]
26
27 with open('requirements.txt') as f:
28 REQUIRES = f.readlines()
29
30 with open(pathlib.Path(__file__).parent.parent / 'VERSION') as version_file:
31 version = version_file.read().strip()
32
33 setuptools.setup(
34 name='kserve',
35 version=version,
36 author="The KServe Authors",
37 author_email='[email protected], [email protected], [email protected]',
38 license="Apache License Version 2.0",
39 url="https://github.com/kserve/kserve/tree/master/python/kserve",
40 description="KServe Python SDK",
41 long_description="Python SDK for KServe Server and Client.",
42 python_requires='>=3.7',
43 packages=[
44 'kserve',
45 'kserve.api',
46 'kserve.constants',
47 'kserve.models',
48 'kserve.handlers',
49 'kserve.utils',
50 ],
51 package_data={'': ['requirements.txt']},
52 include_package_data=True,
53 zip_safe=False,
54 classifiers=[
55 'Intended Audience :: Developers',
56 'Intended Audience :: Education',
57 'Intended Audience :: Science/Research',
58 'Programming Language :: Python :: 3',
59 'Programming Language :: Python :: 3.7',
60 'Programming Language :: Python :: 3.8',
61 'Programming Language :: Python :: 3.9',
62 "License :: OSI Approved :: Apache Software License",
63 "Operating System :: OS Independent",
64 'Topic :: Scientific/Engineering',
65 'Topic :: Scientific/Engineering :: Artificial Intelligence',
66 'Topic :: Software Development',
67 'Topic :: Software Development :: Libraries',
68 'Topic :: Software Development :: Libraries :: Python Modules',
69 ],
70 install_requires=REQUIRES,
71 tests_require=TESTS_REQUIRES,
72 extras_require={'test': TESTS_REQUIRES}
73 )
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/python/kserve/setup.py b/python/kserve/setup.py
--- a/python/kserve/setup.py
+++ b/python/kserve/setup.py
@@ -21,7 +21,8 @@
'pytest-cov',
'pytest-asyncio',
'pytest-tornasync',
- 'mypy'
+ 'mypy',
+ 'portforward',
]
with open('requirements.txt') as f:
|
{"golden_diff": "diff --git a/python/kserve/setup.py b/python/kserve/setup.py\n--- a/python/kserve/setup.py\n+++ b/python/kserve/setup.py\n@@ -21,7 +21,8 @@\n 'pytest-cov',\n 'pytest-asyncio',\n 'pytest-tornasync',\n- 'mypy'\n+ 'mypy',\n+ 'portforward',\n ]\n \n with open('requirements.txt') as f:\n", "issue": "Update ModelMesh version to `v0.10.0`\n#### Proposed Changes\r\n\r\nUpdate to the latest ModelMesh version (`v0.10.0`)\r\n\r\n> If your changes should also be in the most recent release, add the corresponding \"cherrypick-0.X\"\r\nlabel to the original PR; for example, \"cherrypick-0.10\".\r\n\r\n#### TODOs:\r\n\r\n> - [ ] Best practice is to open a PR for the cherry-pick yourself after your original PR has been merged\r\ninto the main branch.\r\n> - [ ] After the cherry-pick PR has merged, remove the cherry-pick label from the original PR.\r\n\r\n\r\n**Type of changes**\r\n\r\n- [x] This change requires a documentation update --> https://github.com/kserve/website/pull/214\r\n\r\n**Special notes for your reviewer**:\r\n\r\n1. Please confirm that if this PR changes any image versions, then that's the sole change this PR makes.\r\n\r\nYES\r\n\r\n**Checklist**:\r\n\r\n- [x] Have you made corresponding changes to the documentation?\r\n\r\n**Release note**:\r\n<!-- Write your release note:\r\n1. Enter your extended release note in the below block. If the PR requires additional action from users switching to the new release, include the string \"action required\".\r\n3. If no release note is required, just write \"NONE\".\r\n-->\r\n```release-note\r\n???\r\n```\r\nNot sure, maybe, \r\n- \"Updgrade to ModelMesh v0.10.0\" \r\nor\r\n- point to ModelMesh release notes here\r\nhttps://github.com/kserve/modelmesh-serving/releases/tag/v0.10.0\r\n\r\n\r\n---\r\n\r\n/cc @yuzisun @rachitchauhan43 @njhill \r\n\n", "before_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport pathlib\n\nimport setuptools\n\nTESTS_REQUIRES = [\n 'pytest',\n 'pytest-xdist',\n 'pytest-cov',\n 'pytest-asyncio',\n 'pytest-tornasync',\n 'mypy'\n]\n\nwith open('requirements.txt') as f:\n REQUIRES = f.readlines()\n\nwith open(pathlib.Path(__file__).parent.parent / 'VERSION') as version_file:\n version = version_file.read().strip()\n\nsetuptools.setup(\n name='kserve',\n version=version,\n author=\"The KServe Authors\",\n author_email='[email protected], [email protected], [email protected]',\n license=\"Apache License Version 2.0\",\n url=\"https://github.com/kserve/kserve/tree/master/python/kserve\",\n description=\"KServe Python SDK\",\n long_description=\"Python SDK for KServe Server and Client.\",\n python_requires='>=3.7',\n packages=[\n 'kserve',\n 'kserve.api',\n 'kserve.constants',\n 'kserve.models',\n 'kserve.handlers',\n 'kserve.utils',\n ],\n package_data={'': ['requirements.txt']},\n include_package_data=True,\n zip_safe=False,\n classifiers=[\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: 
Science/Research',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3.9',\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: OS Independent\",\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Software Development',\n 'Topic :: Software Development :: Libraries',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n install_requires=REQUIRES,\n tests_require=TESTS_REQUIRES,\n extras_require={'test': TESTS_REQUIRES}\n)\n", "path": "python/kserve/setup.py"}], "after_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport pathlib\n\nimport setuptools\n\nTESTS_REQUIRES = [\n 'pytest',\n 'pytest-xdist',\n 'pytest-cov',\n 'pytest-asyncio',\n 'pytest-tornasync',\n 'mypy',\n 'portforward',\n]\n\nwith open('requirements.txt') as f:\n REQUIRES = f.readlines()\n\nwith open(pathlib.Path(__file__).parent.parent / 'VERSION') as version_file:\n version = version_file.read().strip()\n\nsetuptools.setup(\n name='kserve',\n version=version,\n author=\"The KServe Authors\",\n author_email='[email protected], [email protected], [email protected]',\n license=\"Apache License Version 2.0\",\n url=\"https://github.com/kserve/kserve/tree/master/python/kserve\",\n description=\"KServe Python SDK\",\n long_description=\"Python SDK for KServe Server and Client.\",\n python_requires='>=3.7',\n packages=[\n 'kserve',\n 'kserve.api',\n 'kserve.constants',\n 'kserve.models',\n 'kserve.handlers',\n 'kserve.utils',\n ],\n package_data={'': ['requirements.txt']},\n include_package_data=True,\n zip_safe=False,\n classifiers=[\n 'Intended Audience :: Developers',\n 'Intended Audience :: Education',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3.9',\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: OS Independent\",\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: Software Development',\n 'Topic :: Software Development :: Libraries',\n 'Topic :: Software Development :: Libraries :: Python Modules',\n ],\n install_requires=REQUIRES,\n tests_require=TESTS_REQUIRES,\n extras_require={'test': TESTS_REQUIRES}\n)\n", "path": "python/kserve/setup.py"}]}
| 1,335 | 93 |
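Editorial note on the record above: the patch itself only appends `portforward` to the test-dependency list, but the mechanism that delivers that list to users is the `extras_require` mapping at the bottom of `setup.py`. The sketch below is illustrative only, with a deliberately trimmed dependency list, and is not code from the kserve tree.

```python
# Minimal sketch of the extras wiring from the setup.py above; the
# shortened TESTS_REQUIRES list here is an assumption for brevity.
TESTS_REQUIRES = [
    'pytest',
    'mypy',
    'portforward',  # the dependency the golden diff adds
]

SETUP_KWARGS = dict(
    name='kserve',
    tests_require=TESTS_REQUIRES,
    # `pip install kserve[test]` resolves this extra, installing the test
    # toolchain (including portforward) alongside the package itself.
    extras_require={'test': TESTS_REQUIRES},
)
```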
gh_patches_debug_21948
|
rasdani/github-patches
|
git_diff
|
urllib3__urllib3-817
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Honor PYTHONWARNINGS for SNIMissingWarning
.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
I'm not here to discuss the validity of those warnings, but when I tried to get rid of them with the PYTHONWARNINGS environment variable, it only works for the InsecurePlatformWarning one. The other is always displayed whatever I put in PYTHONWARNINGS.
Please advise what PYTHONWARNINGS value to use to get rid of this one, this is a documented python feature:
https://docs.python.org/2/library/warnings.html
https://docs.python.org/2/using/cmdline.html?highlight=pythonwarnings#envvar-PYTHONWARNINGS
Also seen the same problem here: in https://github.com/shazow/urllib3/issues/497
Thanks
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `urllib3/__init__.py`
Content:
```
1 """
2 urllib3 - Thread-safe connection pooling and re-using.
3 """
4 from __future__ import absolute_import
5 import warnings
6
7 from .connectionpool import (
8 HTTPConnectionPool,
9 HTTPSConnectionPool,
10 connection_from_url
11 )
12
13 from . import exceptions
14 from .filepost import encode_multipart_formdata
15 from .poolmanager import PoolManager, ProxyManager, proxy_from_url
16 from .response import HTTPResponse
17 from .util.request import make_headers
18 from .util.url import get_host
19 from .util.timeout import Timeout
20 from .util.retry import Retry
21
22
23 # Set default logging handler to avoid "No handler found" warnings.
24 import logging
25 try: # Python 2.7+
26 from logging import NullHandler
27 except ImportError:
28 class NullHandler(logging.Handler):
29 def emit(self, record):
30 pass
31
32 __author__ = 'Andrey Petrov ([email protected])'
33 __license__ = 'MIT'
34 __version__ = 'dev'
35
36 __all__ = (
37 'HTTPConnectionPool',
38 'HTTPSConnectionPool',
39 'PoolManager',
40 'ProxyManager',
41 'HTTPResponse',
42 'Retry',
43 'Timeout',
44 'add_stderr_logger',
45 'connection_from_url',
46 'disable_warnings',
47 'encode_multipart_formdata',
48 'get_host',
49 'make_headers',
50 'proxy_from_url',
51 )
52
53 logging.getLogger(__name__).addHandler(NullHandler())
54
55
56 def add_stderr_logger(level=logging.DEBUG):
57 """
58 Helper for quickly adding a StreamHandler to the logger. Useful for
59 debugging.
60
61 Returns the handler after adding it.
62 """
63 # This method needs to be in this __init__.py to get the __name__ correct
64 # even if urllib3 is vendored within another package.
65 logger = logging.getLogger(__name__)
66 handler = logging.StreamHandler()
67 handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
68 logger.addHandler(handler)
69 logger.setLevel(level)
70 logger.debug('Added a stderr logging handler to logger: %s', __name__)
71 return handler
72
73 # ... Clean up.
74 del NullHandler
75
76
77 # SecurityWarning's always go off by default.
78 warnings.simplefilter('always', exceptions.SecurityWarning, append=True)
79 # SubjectAltNameWarning's should go off once per host
80 warnings.simplefilter('default', exceptions.SubjectAltNameWarning)
81 # InsecurePlatformWarning's don't vary between requests, so we keep it default.
82 warnings.simplefilter('default', exceptions.InsecurePlatformWarning,
83 append=True)
84 # SNIMissingWarnings should go off only once.
85 warnings.simplefilter('default', exceptions.SNIMissingWarning)
86
87
88 def disable_warnings(category=exceptions.HTTPWarning):
89 """
90 Helper for quickly disabling all urllib3 warnings.
91 """
92 warnings.simplefilter('ignore', category)
93
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/urllib3/__init__.py b/urllib3/__init__.py
--- a/urllib3/__init__.py
+++ b/urllib3/__init__.py
@@ -74,15 +74,18 @@
del NullHandler
+# All warning filters *must* be appended unless you're really certain that they
+# shouldn't be: otherwise, it's very hard for users to use most Python
+# mechanisms to silence them.
# SecurityWarning's always go off by default.
warnings.simplefilter('always', exceptions.SecurityWarning, append=True)
# SubjectAltNameWarning's should go off once per host
-warnings.simplefilter('default', exceptions.SubjectAltNameWarning)
+warnings.simplefilter('default', exceptions.SubjectAltNameWarning, append=True)
# InsecurePlatformWarning's don't vary between requests, so we keep it default.
warnings.simplefilter('default', exceptions.InsecurePlatformWarning,
append=True)
# SNIMissingWarnings should go off only once.
-warnings.simplefilter('default', exceptions.SNIMissingWarning)
+warnings.simplefilter('default', exceptions.SNIMissingWarning, append=True)
def disable_warnings(category=exceptions.HTTPWarning):
|
{"golden_diff": "diff --git a/urllib3/__init__.py b/urllib3/__init__.py\n--- a/urllib3/__init__.py\n+++ b/urllib3/__init__.py\n@@ -74,15 +74,18 @@\n del NullHandler\n \n \n+# All warning filters *must* be appended unless you're really certain that they\n+# shouldn't be: otherwise, it's very hard for users to use most Python\n+# mechanisms to silence them.\n # SecurityWarning's always go off by default.\n warnings.simplefilter('always', exceptions.SecurityWarning, append=True)\n # SubjectAltNameWarning's should go off once per host\n-warnings.simplefilter('default', exceptions.SubjectAltNameWarning)\n+warnings.simplefilter('default', exceptions.SubjectAltNameWarning, append=True)\n # InsecurePlatformWarning's don't vary between requests, so we keep it default.\n warnings.simplefilter('default', exceptions.InsecurePlatformWarning,\n append=True)\n # SNIMissingWarnings should go off only once.\n-warnings.simplefilter('default', exceptions.SNIMissingWarning)\n+warnings.simplefilter('default', exceptions.SNIMissingWarning, append=True)\n \n \n def disable_warnings(category=exceptions.HTTPWarning):\n", "issue": "Honor PYTHONWARNINGS for SNIMissingWarning\n.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.\n SNIMissingWarning\n.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.\n InsecurePlatformWarning\n\nI'm not here to discuss the validity of those warnings, but when I tried to get rid of them with the PYTHONWARNINGS environment variable, it only works for the InsecurePlatformWarning one. The other is always displayed whatever I put in PYTHONWARNINGS.\n\nPlease advise what PYTHONWARNINGS value to use to get rid of this one, this is a documented python feature: \nhttps://docs.python.org/2/library/warnings.html\nhttps://docs.python.org/2/using/cmdline.html?highlight=pythonwarnings#envvar-PYTHONWARNINGS\n\nAlso seen the same problem here: in https://github.com/shazow/urllib3/issues/497\n\nThanks\n\n", "before_files": [{"content": "\"\"\"\nurllib3 - Thread-safe connection pooling and re-using.\n\"\"\"\nfrom __future__ import absolute_import\nimport warnings\n\nfrom .connectionpool import (\n HTTPConnectionPool,\n HTTPSConnectionPool,\n connection_from_url\n)\n\nfrom . 
import exceptions\nfrom .filepost import encode_multipart_formdata\nfrom .poolmanager import PoolManager, ProxyManager, proxy_from_url\nfrom .response import HTTPResponse\nfrom .util.request import make_headers\nfrom .util.url import get_host\nfrom .util.timeout import Timeout\nfrom .util.retry import Retry\n\n\n# Set default logging handler to avoid \"No handler found\" warnings.\nimport logging\ntry: # Python 2.7+\n from logging import NullHandler\nexcept ImportError:\n class NullHandler(logging.Handler):\n def emit(self, record):\n pass\n\n__author__ = 'Andrey Petrov ([email protected])'\n__license__ = 'MIT'\n__version__ = 'dev'\n\n__all__ = (\n 'HTTPConnectionPool',\n 'HTTPSConnectionPool',\n 'PoolManager',\n 'ProxyManager',\n 'HTTPResponse',\n 'Retry',\n 'Timeout',\n 'add_stderr_logger',\n 'connection_from_url',\n 'disable_warnings',\n 'encode_multipart_formdata',\n 'get_host',\n 'make_headers',\n 'proxy_from_url',\n)\n\nlogging.getLogger(__name__).addHandler(NullHandler())\n\n\ndef add_stderr_logger(level=logging.DEBUG):\n \"\"\"\n Helper for quickly adding a StreamHandler to the logger. Useful for\n debugging.\n\n Returns the handler after adding it.\n \"\"\"\n # This method needs to be in this __init__.py to get the __name__ correct\n # even if urllib3 is vendored within another package.\n logger = logging.getLogger(__name__)\n handler = logging.StreamHandler()\n handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))\n logger.addHandler(handler)\n logger.setLevel(level)\n logger.debug('Added a stderr logging handler to logger: %s', __name__)\n return handler\n\n# ... Clean up.\ndel NullHandler\n\n\n# SecurityWarning's always go off by default.\nwarnings.simplefilter('always', exceptions.SecurityWarning, append=True)\n# SubjectAltNameWarning's should go off once per host\nwarnings.simplefilter('default', exceptions.SubjectAltNameWarning)\n# InsecurePlatformWarning's don't vary between requests, so we keep it default.\nwarnings.simplefilter('default', exceptions.InsecurePlatformWarning,\n append=True)\n# SNIMissingWarnings should go off only once.\nwarnings.simplefilter('default', exceptions.SNIMissingWarning)\n\n\ndef disable_warnings(category=exceptions.HTTPWarning):\n \"\"\"\n Helper for quickly disabling all urllib3 warnings.\n \"\"\"\n warnings.simplefilter('ignore', category)\n", "path": "urllib3/__init__.py"}], "after_files": [{"content": "\"\"\"\nurllib3 - Thread-safe connection pooling and re-using.\n\"\"\"\nfrom __future__ import absolute_import\nimport warnings\n\nfrom .connectionpool import (\n HTTPConnectionPool,\n HTTPSConnectionPool,\n connection_from_url\n)\n\nfrom . 
import exceptions\nfrom .filepost import encode_multipart_formdata\nfrom .poolmanager import PoolManager, ProxyManager, proxy_from_url\nfrom .response import HTTPResponse\nfrom .util.request import make_headers\nfrom .util.url import get_host\nfrom .util.timeout import Timeout\nfrom .util.retry import Retry\n\n\n# Set default logging handler to avoid \"No handler found\" warnings.\nimport logging\ntry: # Python 2.7+\n from logging import NullHandler\nexcept ImportError:\n class NullHandler(logging.Handler):\n def emit(self, record):\n pass\n\n__author__ = 'Andrey Petrov ([email protected])'\n__license__ = 'MIT'\n__version__ = 'dev'\n\n__all__ = (\n 'HTTPConnectionPool',\n 'HTTPSConnectionPool',\n 'PoolManager',\n 'ProxyManager',\n 'HTTPResponse',\n 'Retry',\n 'Timeout',\n 'add_stderr_logger',\n 'connection_from_url',\n 'disable_warnings',\n 'encode_multipart_formdata',\n 'get_host',\n 'make_headers',\n 'proxy_from_url',\n)\n\nlogging.getLogger(__name__).addHandler(NullHandler())\n\n\ndef add_stderr_logger(level=logging.DEBUG):\n \"\"\"\n Helper for quickly adding a StreamHandler to the logger. Useful for\n debugging.\n\n Returns the handler after adding it.\n \"\"\"\n # This method needs to be in this __init__.py to get the __name__ correct\n # even if urllib3 is vendored within another package.\n logger = logging.getLogger(__name__)\n handler = logging.StreamHandler()\n handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))\n logger.addHandler(handler)\n logger.setLevel(level)\n logger.debug('Added a stderr logging handler to logger: %s', __name__)\n return handler\n\n# ... Clean up.\ndel NullHandler\n\n\n# All warning filters *must* be appended unless you're really certain that they\n# shouldn't be: otherwise, it's very hard for users to use most Python\n# mechanisms to silence them.\n# SecurityWarning's always go off by default.\nwarnings.simplefilter('always', exceptions.SecurityWarning, append=True)\n# SubjectAltNameWarning's should go off once per host\nwarnings.simplefilter('default', exceptions.SubjectAltNameWarning, append=True)\n# InsecurePlatformWarning's don't vary between requests, so we keep it default.\nwarnings.simplefilter('default', exceptions.InsecurePlatformWarning,\n append=True)\n# SNIMissingWarnings should go off only once.\nwarnings.simplefilter('default', exceptions.SNIMissingWarning, append=True)\n\n\ndef disable_warnings(category=exceptions.HTTPWarning):\n \"\"\"\n Helper for quickly disabling all urllib3 warnings.\n \"\"\"\n warnings.simplefilter('ignore', category)\n", "path": "urllib3/__init__.py"}]}
| 1,376 | 255 |
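The fix in the record above hinges on how Python's warning machinery orders its filters: `warnings.filters` is consulted front to back, the first matching entry wins, and `PYTHONWARNINGS` entries are installed before any library code runs. A library filter added with `append=False` therefore jumps ahead of the user's `ignore` entry and shadows it. The standalone sketch below demonstrates the ordering with a stand-in warning class; `DemoWarning` is hypothetical, used here in place of `SNIMissingWarning`.

```python
import warnings

class DemoWarning(UserWarning):
    """Stand-in for urllib3's SNIMissingWarning."""

# Roughly what PYTHONWARNINGS=ignore::DemoWarning installs at startup:
warnings.filterwarnings('ignore', category=DemoWarning)

# The library filter: with append=True it lands *behind* the user's
# entry, and since warnings.filters is matched front to back, the
# user's 'ignore' wins.
warnings.simplefilter('default', DemoWarning, append=True)

warnings.warn('silenced, as the user asked', DemoWarning)  # prints nothing
```

Flipping `append=True` to `append=False` in the sketch makes the warning print again despite the `ignore` entry, which is exactly the pre-patch behaviour the reporter hit.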
gh_patches_debug_277
|
rasdani/github-patches
|
git_diff
|
sanic-org__sanic-1045
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
0.6.1 release to PyPi
Hey folks,
There's been a bunch of substantive changes in the past few months; I think it warrants a release of 0.6.1 (or 0.7, considering there may be large changes in PRs like #939). Any chance we could get a new candidate uploaded to PyPi?
If there's a better place to ask this, I'm happy to head there.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sanic/__init__.py`
Content:
```
1 from sanic.app import Sanic
2 from sanic.blueprints import Blueprint
3
4 __version__ = '0.6.0'
5
6 __all__ = ['Sanic', 'Blueprint']
7
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sanic/__init__.py b/sanic/__init__.py
--- a/sanic/__init__.py
+++ b/sanic/__init__.py
@@ -1,6 +1,6 @@
from sanic.app import Sanic
from sanic.blueprints import Blueprint
-__version__ = '0.6.0'
+__version__ = '0.7.0'
__all__ = ['Sanic', 'Blueprint']
|
{"golden_diff": "diff --git a/sanic/__init__.py b/sanic/__init__.py\n--- a/sanic/__init__.py\n+++ b/sanic/__init__.py\n@@ -1,6 +1,6 @@\n from sanic.app import Sanic\n from sanic.blueprints import Blueprint\n \n-__version__ = '0.6.0'\n+__version__ = '0.7.0'\n \n __all__ = ['Sanic', 'Blueprint']\n", "issue": "0.6.1 release to PyPi\nHey folks,\r\n\r\nThere's been a bunch of substantive changes in the past few months; I think it warrants a release of 0.6.1 (or 0.7, considering there may be large changes in PRs like #939). Any chance we could get a new candidate uploaded to PyPi? \r\n\r\nIf there's a better place to ask this, I'm happy to head there.\n", "before_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n__version__ = '0.6.0'\n\n__all__ = ['Sanic', 'Blueprint']\n", "path": "sanic/__init__.py"}], "after_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n__version__ = '0.7.0'\n\n__all__ = ['Sanic', 'Blueprint']\n", "path": "sanic/__init__.py"}]}
| 400 | 98 |
gh_patches_debug_18265
|
rasdani/github-patches
|
git_diff
|
python-poetry__poetry-4110
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Exported requirements.txt contains spurious error message, making it unparseable
- [X] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.
- [X] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.
- [X] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).
- **OS version and name**: macOS 10.15.7
- **Poetry version**: 1.1.6
- **Link of a [Gist](https://gist.github.com/) with the contents of your pyproject.toml file**: Too much internal stuff to share, it's not really related, though
## Issue
When I run `poetry export > requirements.txt && pip install -r requirements.txt`, I see this:
```
Invalid requirement: 'Warning: The lock file is not up to date with the latest changes in pyproject.toml. You may be getting outdated dependencies. Run update to update them.' (from line 1 of requirements.txt)
```
It appears that my Poetry lockfile is out of sync. No biggie, that's correctable and logged in #3092 when it's not necessary to resync.
However, this error message should not make it into the output of the `poetry export` command. That error log line should go to stderr, not stdout.
## Proposed Fix
I think that the `self.line` here https://github.com/python-poetry/poetry/blob/b753aaf4c3c08ef0e54941a6616fe318fdf4f6e4/poetry/console/commands/export.py#L56-L63
should be `self.line_error` which, according to the [docstring for that method](https://github.com/sdispater/cleo/blob/9d289bf709ab2119b58e95da58a655ab70661798/cleo/commands/command.py#L262-L276) outputs to stderr instead of stdout.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `poetry/console/commands/export.py`
Content:
```
1 from cleo.helpers import option
2
3 from poetry.utils.exporter import Exporter
4
5 from .command import Command
6
7
8 class ExportCommand(Command):
9
10 name = "export"
11 description = "Exports the lock file to alternative formats."
12
13 options = [
14 option(
15 "format",
16 "f",
17 "Format to export to. Currently, only requirements.txt is supported.",
18 flag=False,
19 default=Exporter.FORMAT_REQUIREMENTS_TXT,
20 ),
21 option("output", "o", "The name of the output file.", flag=False),
22 option("without-hashes", None, "Exclude hashes from the exported file."),
23 option("dev", None, "Include development dependencies."),
24 option(
25 "extras",
26 "E",
27 "Extra sets of dependencies to include.",
28 flag=False,
29 multiple=True,
30 ),
31 option("with-credentials", None, "Include credentials for extra indices."),
32 ]
33
34 def handle(self) -> None:
35 fmt = self.option("format")
36
37 if fmt not in Exporter.ACCEPTED_FORMATS:
38 raise ValueError("Invalid export format: {}".format(fmt))
39
40 output = self.option("output")
41
42 locker = self.poetry.locker
43 if not locker.is_locked():
44 self.line("<comment>The lock file does not exist. Locking.</comment>")
45 options = []
46 if self.io.is_debug():
47 options.append(("-vvv", None))
48 elif self.io.is_very_verbose():
49 options.append(("-vv", None))
50 elif self.io.is_verbose():
51 options.append(("-v", None))
52
53 self.call("lock", " ".join(options))
54
55 if not locker.is_fresh():
56 self.line(
57 "<warning>"
58 "Warning: The lock file is not up to date with "
59 "the latest changes in pyproject.toml. "
60 "You may be getting outdated dependencies. "
61 "Run update to update them."
62 "</warning>"
63 )
64
65 exporter = Exporter(self.poetry)
66 exporter.export(
67 fmt,
68 self.poetry.file.parent,
69 output or self.io,
70 with_hashes=not self.option("without-hashes"),
71 dev=self.option("dev"),
72 extras=self.option("extras"),
73 with_credentials=self.option("with-credentials"),
74 )
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/poetry/console/commands/export.py b/poetry/console/commands/export.py
--- a/poetry/console/commands/export.py
+++ b/poetry/console/commands/export.py
@@ -41,7 +41,7 @@
locker = self.poetry.locker
if not locker.is_locked():
- self.line("<comment>The lock file does not exist. Locking.</comment>")
+ self.line_error("<comment>The lock file does not exist. Locking.</comment>")
options = []
if self.io.is_debug():
options.append(("-vvv", None))
@@ -53,7 +53,7 @@
self.call("lock", " ".join(options))
if not locker.is_fresh():
- self.line(
+ self.line_error(
"<warning>"
"Warning: The lock file is not up to date with "
"the latest changes in pyproject.toml. "
|
{"golden_diff": "diff --git a/poetry/console/commands/export.py b/poetry/console/commands/export.py\n--- a/poetry/console/commands/export.py\n+++ b/poetry/console/commands/export.py\n@@ -41,7 +41,7 @@\n \n locker = self.poetry.locker\n if not locker.is_locked():\n- self.line(\"<comment>The lock file does not exist. Locking.</comment>\")\n+ self.line_error(\"<comment>The lock file does not exist. Locking.</comment>\")\n options = []\n if self.io.is_debug():\n options.append((\"-vvv\", None))\n@@ -53,7 +53,7 @@\n self.call(\"lock\", \" \".join(options))\n \n if not locker.is_fresh():\n- self.line(\n+ self.line_error(\n \"<warning>\"\n \"Warning: The lock file is not up to date with \"\n \"the latest changes in pyproject.toml. \"\n", "issue": "Exported requirements.txt contains spurious error message, making it unparseable\n- [X] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.\r\n- [X] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.\r\n- [X] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).\r\n\r\n- **OS version and name**: macOS 10.15.7\r\n- **Poetry version**: 1.1.6\r\n- **Link of a [Gist](https://gist.github.com/) with the contents of your pyproject.toml file**: Too much internal stuff to share, it's not really related, though\r\n\r\n## Issue\r\n\r\nWhen I run `poetry export > requirements.txt && pip install -r requirements.txt`, I see this:\r\n\r\n```\r\n Invalid requirement: 'Warning: The lock file is not up to date with the latest changes in pyproject.toml. You may be getting outdated dependencies. Run update to update them.' (from line 1 of requirements.txt)\r\n```\r\n\r\nIt appears that my Poetry lockfile is out of sync. No biggie, that's correctable and logged in #3092 when it's not necessary to resync.\r\n\r\nHowever, this error message should not make it into the output of the `poetry export` command. That error log line should go to stderr, not stdout.\r\n\r\n## Proposed Fix\r\n\r\nI think that the `self.line` here https://github.com/python-poetry/poetry/blob/b753aaf4c3c08ef0e54941a6616fe318fdf4f6e4/poetry/console/commands/export.py#L56-L63\r\nshould be `self.line_error` which, according to the [docstring for that method](https://github.com/sdispater/cleo/blob/9d289bf709ab2119b58e95da58a655ab70661798/cleo/commands/command.py#L262-L276) outputs to stderr instead of stdout.\n", "before_files": [{"content": "from cleo.helpers import option\n\nfrom poetry.utils.exporter import Exporter\n\nfrom .command import Command\n\n\nclass ExportCommand(Command):\n\n name = \"export\"\n description = \"Exports the lock file to alternative formats.\"\n\n options = [\n option(\n \"format\",\n \"f\",\n \"Format to export to. 
Currently, only requirements.txt is supported.\",\n flag=False,\n default=Exporter.FORMAT_REQUIREMENTS_TXT,\n ),\n option(\"output\", \"o\", \"The name of the output file.\", flag=False),\n option(\"without-hashes\", None, \"Exclude hashes from the exported file.\"),\n option(\"dev\", None, \"Include development dependencies.\"),\n option(\n \"extras\",\n \"E\",\n \"Extra sets of dependencies to include.\",\n flag=False,\n multiple=True,\n ),\n option(\"with-credentials\", None, \"Include credentials for extra indices.\"),\n ]\n\n def handle(self) -> None:\n fmt = self.option(\"format\")\n\n if fmt not in Exporter.ACCEPTED_FORMATS:\n raise ValueError(\"Invalid export format: {}\".format(fmt))\n\n output = self.option(\"output\")\n\n locker = self.poetry.locker\n if not locker.is_locked():\n self.line(\"<comment>The lock file does not exist. Locking.</comment>\")\n options = []\n if self.io.is_debug():\n options.append((\"-vvv\", None))\n elif self.io.is_very_verbose():\n options.append((\"-vv\", None))\n elif self.io.is_verbose():\n options.append((\"-v\", None))\n\n self.call(\"lock\", \" \".join(options))\n\n if not locker.is_fresh():\n self.line(\n \"<warning>\"\n \"Warning: The lock file is not up to date with \"\n \"the latest changes in pyproject.toml. \"\n \"You may be getting outdated dependencies. \"\n \"Run update to update them.\"\n \"</warning>\"\n )\n\n exporter = Exporter(self.poetry)\n exporter.export(\n fmt,\n self.poetry.file.parent,\n output or self.io,\n with_hashes=not self.option(\"without-hashes\"),\n dev=self.option(\"dev\"),\n extras=self.option(\"extras\"),\n with_credentials=self.option(\"with-credentials\"),\n )\n", "path": "poetry/console/commands/export.py"}], "after_files": [{"content": "from cleo.helpers import option\n\nfrom poetry.utils.exporter import Exporter\n\nfrom .command import Command\n\n\nclass ExportCommand(Command):\n\n name = \"export\"\n description = \"Exports the lock file to alternative formats.\"\n\n options = [\n option(\n \"format\",\n \"f\",\n \"Format to export to. Currently, only requirements.txt is supported.\",\n flag=False,\n default=Exporter.FORMAT_REQUIREMENTS_TXT,\n ),\n option(\"output\", \"o\", \"The name of the output file.\", flag=False),\n option(\"without-hashes\", None, \"Exclude hashes from the exported file.\"),\n option(\"dev\", None, \"Include development dependencies.\"),\n option(\n \"extras\",\n \"E\",\n \"Extra sets of dependencies to include.\",\n flag=False,\n multiple=True,\n ),\n option(\"with-credentials\", None, \"Include credentials for extra indices.\"),\n ]\n\n def handle(self) -> None:\n fmt = self.option(\"format\")\n\n if fmt not in Exporter.ACCEPTED_FORMATS:\n raise ValueError(\"Invalid export format: {}\".format(fmt))\n\n output = self.option(\"output\")\n\n locker = self.poetry.locker\n if not locker.is_locked():\n self.line_error(\"<comment>The lock file does not exist. Locking.</comment>\")\n options = []\n if self.io.is_debug():\n options.append((\"-vvv\", None))\n elif self.io.is_very_verbose():\n options.append((\"-vv\", None))\n elif self.io.is_verbose():\n options.append((\"-v\", None))\n\n self.call(\"lock\", \" \".join(options))\n\n if not locker.is_fresh():\n self.line_error(\n \"<warning>\"\n \"Warning: The lock file is not up to date with \"\n \"the latest changes in pyproject.toml. \"\n \"You may be getting outdated dependencies. 
\"\n \"Run update to update them.\"\n \"</warning>\"\n )\n\n exporter = Exporter(self.poetry)\n exporter.export(\n fmt,\n self.poetry.file.parent,\n output or self.io,\n with_hashes=not self.option(\"without-hashes\"),\n dev=self.option(\"dev\"),\n extras=self.option(\"extras\"),\n with_credentials=self.option(\"with-credentials\"),\n )\n", "path": "poetry/console/commands/export.py"}]}
| 1,369 | 204 |
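The one-word change from `self.line` to `self.line_error` in this record matters because of shell redirection: `poetry export > requirements.txt` captures stdout only, so anything a command prints there becomes part of the exported file. The toy function below, a sketch and not Poetry's actual API, shows the stream split that keeps the payload parseable.

```python
import sys

def export(requirements, warning=None):
    """Toy exporter illustrating the stream split, not Poetry's API."""
    if warning:
        # Diagnostics go to stderr, so `poetry export > requirements.txt`
        # captures only the machine-readable payload on stdout.
        print(warning, file=sys.stderr)
    for line in requirements:
        print(line)

export(['requests==2.25.1'], warning='Warning: the lock file is not fresh')
```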
gh_patches_debug_24595
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-552
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Scope variables do not accept a value containing a ':'
Hi,
I modified the Boost package recipe so that I can specify an alternate url for downloading the source zip file. I used a scope variable for this purpose. However it seems that conan does not allow scope variables to contain a ':'.
When issuing this command:
conan install Boost/1.61.0@epr-eliaskousk/stable --build -s os=Windows -s arch=x86_64 -s compiler="Visual Studio" -s compiler.version=14 -s compiler.runtime=MDd -s build_type=Debug -sc ALL:source_url=http://sourceforge.net/projects/boost/files/boost
Here is the error message I get:
ERROR: Bad scope ALL:source_url=http://sourceforge.net/projects/boost/files/boost
I suppose there is an issue because the parsing treats the ':' as a special separator?...
Would it be possible to fix this or is it supposed to be like that?
(I can workaround the issue by replacing the character ':' in my string value by yet-another special character)
thanks
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conans/model/scope.py`
Content:
```
1 from collections import defaultdict
2 from conans.errors import ConanException
3
4
5 class Scope(dict):
6 """ the set of possible scopes than a package can have, by name(string):
7 "dev", "test", "myscope"...
8 it is just a set, but with syntax to be queried as:
9 if self.scope.dev:
10 """
11
12 def __getattr__(self, field):
13 return self.get(field)
14
15 def __setattr__(self, field, value):
16 self[field] = value
17
18 def __repr__(self):
19 return ", ".join("%s=%s" % (k, v) for k, v in sorted(self.items()))
20
21
22 # This is necessary, as None cannot be ordered in Py3
23 _root = "0CONAN_ROOT*"
24 _all = "ALL"
25
26
27 class Scopes(defaultdict):
28 """ all the scopes of a dependency graph, as a dict{package name(str): Scope
29 the root package of the graph might not have name, then its key is None.
30 It is loaded and saved to text as:
31 Package1:dev
32 Package1:test
33 Package2:dev
34 dev # for the root package, without name
35 other # any name allowed
36 This will be stored in memory as {Package1: Scopes(set[dev, test]),
37 Package2: Scopes(...),
38 None: Scopes(set[dev, other])
39 """
40 def __init__(self):
41 super(Scopes, self).__init__(Scope)
42 self[_root].dev = True
43
44 def package_scope(self, name=None):
45 """ return the scopes for the given package which are the scopes set
46 for ALL, updated (high priority) with the specific package scopes
47 if the package name is None, then it is the ROOT package/consumer
48 """
49 scope = Scope(self.get(_all, {}))
50 scope.update(self[name or _root])
51 return scope
52
53 @staticmethod
54 def from_list(items):
55 result = Scopes()
56 for item in items:
57 chunks = item.split(":")
58 if len(chunks) == 2:
59 root = chunks[0]
60 scope = chunks[1]
61 elif len(chunks) == 1:
62 root = _root
63 scope = chunks[0]
64 else:
65 raise ConanException("Bad scope %s" % item)
66 try:
67 key, value = scope.split("=")
68 except:
69 raise ConanException("Bad scope %s" % item)
70 v = value.upper()
71 if v == "TRUE":
72 value = True
73 elif v == "FALSE":
74 value = False
75 elif v == "NONE":
76 value = None
77 result[root][key] = value
78 return result
79
80 def update_scope(self, other):
81 for name, scopes in other.items():
82 self[name].update(scopes)
83
84 @staticmethod
85 def loads(text):
86 return Scopes.from_list([s.strip() for s in text.splitlines()])
87
88 def dumps(self):
89 result = []
90 for name, scopes in sorted(self.items()):
91 if name != _root:
92 result.extend("%s:%s=%s" % (name, k, v) for (k, v) in sorted(scopes.items()))
93 else:
94 result.extend("%s=%s" % (k, v) for (k, v) in sorted(scopes.items()))
95 return "\n".join(result)
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/conans/model/scope.py b/conans/model/scope.py
--- a/conans/model/scope.py
+++ b/conans/model/scope.py
@@ -54,17 +54,8 @@
def from_list(items):
result = Scopes()
for item in items:
- chunks = item.split(":")
- if len(chunks) == 2:
- root = chunks[0]
- scope = chunks[1]
- elif len(chunks) == 1:
- root = _root
- scope = chunks[0]
- else:
- raise ConanException("Bad scope %s" % item)
try:
- key, value = scope.split("=")
+ key, value = item.split("=")
except:
raise ConanException("Bad scope %s" % item)
v = value.upper()
@@ -74,7 +65,18 @@
value = False
elif v == "NONE":
value = None
- result[root][key] = value
+
+ chunks = key.split(":")
+ if len(chunks) == 2:
+ root = chunks[0]
+ scope = chunks[1]
+ elif len(chunks) == 1:
+ root = _root
+ scope = chunks[0]
+ else:
+ raise ConanException("Bad scope %s" % item)
+
+ result[root][scope] = value
return result
def update_scope(self, other):
|
{"golden_diff": "diff --git a/conans/model/scope.py b/conans/model/scope.py\n--- a/conans/model/scope.py\n+++ b/conans/model/scope.py\n@@ -54,17 +54,8 @@\n def from_list(items):\n result = Scopes()\n for item in items:\n- chunks = item.split(\":\")\n- if len(chunks) == 2:\n- root = chunks[0]\n- scope = chunks[1]\n- elif len(chunks) == 1:\n- root = _root\n- scope = chunks[0]\n- else:\n- raise ConanException(\"Bad scope %s\" % item)\n try:\n- key, value = scope.split(\"=\")\n+ key, value = item.split(\"=\")\n except:\n raise ConanException(\"Bad scope %s\" % item)\n v = value.upper()\n@@ -74,7 +65,18 @@\n value = False\n elif v == \"NONE\":\n value = None\n- result[root][key] = value\n+\n+ chunks = key.split(\":\")\n+ if len(chunks) == 2:\n+ root = chunks[0]\n+ scope = chunks[1]\n+ elif len(chunks) == 1:\n+ root = _root\n+ scope = chunks[0]\n+ else:\n+ raise ConanException(\"Bad scope %s\" % item)\n+\n+ result[root][scope] = value\n return result\n \n def update_scope(self, other):\n", "issue": "Scope variables do not accept a value containing a ':'\nHi,\nI modified the Boost package recipe so that I can specify an alternate url for downloading the source zip file. I used a scope variable for this purpose. However it seems that conan does not allow scope variables to contain a ':'.\n\nWhen issuing this command:\nconan install Boost/1.61.0@epr-eliaskousk/stable --build -s os=Windows -s arch=x86_64 -s compiler=\"Visual Studio\" -s compiler.version=14 -s compiler.runtime=MDd -s build_type=Debug -sc ALL:source_url=http://sourceforge.net/projects/boost/files/boost\n\nHere is the error message I get:\nERROR: Bad scope ALL:source_url=http://sourceforge.net/projects/boost/files/boost\n\nI suppose there is an issue because the parsing treats the ':' as a special separator?...\n\nWould it be possible to fix this or is it supposed to be like that?\n(I can workaround the issue by replacing the character ':' in my string value by yet-another special character)\n\nthanks\n\n", "before_files": [{"content": "from collections import defaultdict\nfrom conans.errors import ConanException\n\n\nclass Scope(dict):\n \"\"\" the set of possible scopes than a package can have, by name(string):\n \"dev\", \"test\", \"myscope\"...\n it is just a set, but with syntax to be queried as:\n if self.scope.dev:\n \"\"\"\n\n def __getattr__(self, field):\n return self.get(field)\n\n def __setattr__(self, field, value):\n self[field] = value\n\n def __repr__(self):\n return \", \".join(\"%s=%s\" % (k, v) for k, v in sorted(self.items()))\n\n\n# This is necessary, as None cannot be ordered in Py3\n_root = \"0CONAN_ROOT*\"\n_all = \"ALL\"\n\n\nclass Scopes(defaultdict):\n \"\"\" all the scopes of a dependency graph, as a dict{package name(str): Scope\n the root package of the graph might not have name, then its key is None.\n It is loaded and saved to text as:\n Package1:dev\n Package1:test\n Package2:dev\n dev # for the root package, without name\n other # any name allowed\n This will be stored in memory as {Package1: Scopes(set[dev, test]),\n Package2: Scopes(...),\n None: Scopes(set[dev, other])\n \"\"\"\n def __init__(self):\n super(Scopes, self).__init__(Scope)\n self[_root].dev = True\n\n def package_scope(self, name=None):\n \"\"\" return the scopes for the given package which are the scopes set\n for ALL, updated (high priority) with the specific package scopes\n if the package name is None, then it is the ROOT package/consumer\n \"\"\"\n scope = Scope(self.get(_all, {}))\n scope.update(self[name or _root])\n return 
scope\n\n @staticmethod\n def from_list(items):\n result = Scopes()\n for item in items:\n chunks = item.split(\":\")\n if len(chunks) == 2:\n root = chunks[0]\n scope = chunks[1]\n elif len(chunks) == 1:\n root = _root\n scope = chunks[0]\n else:\n raise ConanException(\"Bad scope %s\" % item)\n try:\n key, value = scope.split(\"=\")\n except:\n raise ConanException(\"Bad scope %s\" % item)\n v = value.upper()\n if v == \"TRUE\":\n value = True\n elif v == \"FALSE\":\n value = False\n elif v == \"NONE\":\n value = None\n result[root][key] = value\n return result\n\n def update_scope(self, other):\n for name, scopes in other.items():\n self[name].update(scopes)\n\n @staticmethod\n def loads(text):\n return Scopes.from_list([s.strip() for s in text.splitlines()])\n\n def dumps(self):\n result = []\n for name, scopes in sorted(self.items()):\n if name != _root:\n result.extend(\"%s:%s=%s\" % (name, k, v) for (k, v) in sorted(scopes.items()))\n else:\n result.extend(\"%s=%s\" % (k, v) for (k, v) in sorted(scopes.items()))\n return \"\\n\".join(result)\n", "path": "conans/model/scope.py"}], "after_files": [{"content": "from collections import defaultdict\nfrom conans.errors import ConanException\n\n\nclass Scope(dict):\n \"\"\" the set of possible scopes than a package can have, by name(string):\n \"dev\", \"test\", \"myscope\"...\n it is just a set, but with syntax to be queried as:\n if self.scope.dev:\n \"\"\"\n\n def __getattr__(self, field):\n return self.get(field)\n\n def __setattr__(self, field, value):\n self[field] = value\n\n def __repr__(self):\n return \", \".join(\"%s=%s\" % (k, v) for k, v in sorted(self.items()))\n\n\n# This is necessary, as None cannot be ordered in Py3\n_root = \"0CONAN_ROOT*\"\n_all = \"ALL\"\n\n\nclass Scopes(defaultdict):\n \"\"\" all the scopes of a dependency graph, as a dict{package name(str): Scope\n the root package of the graph might not have name, then its key is None.\n It is loaded and saved to text as:\n Package1:dev\n Package1:test\n Package2:dev\n dev # for the root package, without name\n other # any name allowed\n This will be stored in memory as {Package1: Scopes(set[dev, test]),\n Package2: Scopes(...),\n None: Scopes(set[dev, other])\n \"\"\"\n def __init__(self):\n super(Scopes, self).__init__(Scope)\n self[_root].dev = True\n\n def package_scope(self, name=None):\n \"\"\" return the scopes for the given package which are the scopes set\n for ALL, updated (high priority) with the specific package scopes\n if the package name is None, then it is the ROOT package/consumer\n \"\"\"\n scope = Scope(self.get(_all, {}))\n scope.update(self[name or _root])\n return scope\n\n @staticmethod\n def from_list(items):\n result = Scopes()\n for item in items:\n try:\n key, value = item.split(\"=\")\n except:\n raise ConanException(\"Bad scope %s\" % item)\n v = value.upper()\n if v == \"TRUE\":\n value = True\n elif v == \"FALSE\":\n value = False\n elif v == \"NONE\":\n value = None\n\n chunks = key.split(\":\")\n if len(chunks) == 2:\n root = chunks[0]\n scope = chunks[1]\n elif len(chunks) == 1:\n root = _root\n scope = chunks[0]\n else:\n raise ConanException(\"Bad scope %s\" % item)\n\n result[root][scope] = value\n return result\n\n def update_scope(self, other):\n for name, scopes in other.items():\n self[name].update(scopes)\n\n @staticmethod\n def loads(text):\n return Scopes.from_list([s.strip() for s in text.splitlines()])\n\n def dumps(self):\n result = []\n for name, scopes in sorted(self.items()):\n if name != _root:\n 
result.extend(\"%s:%s=%s\" % (name, k, v) for (k, v) in sorted(scopes.items()))\n else:\n result.extend(\"%s=%s\" % (k, v) for (k, v) in sorted(scopes.items()))\n return \"\\n\".join(result)\n", "path": "conans/model/scope.py"}]}
| 1,412 | 334 |
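The root cause in this record is parse order: splitting on `:` before `=` means any value containing a colon (such as a URL scheme) is misread. The golden diff reverses the order, splitting on `=` first and only then unpacking the `package:scope` key. A self-contained sketch of that logic follows; `parse_scope` is a hypothetical helper name, not a function from the conan codebase.

```python
_ROOT = '0CONAN_ROOT*'

def parse_scope(item):
    # Split on '=' first, so the value may itself contain ':' (e.g. a URL);
    # the pre-patch code split on ':' first and choked on URLs.
    key, sep, value = item.partition('=')
    if not sep:
        raise ValueError('Bad scope %s' % item)
    chunks = key.split(':')
    if len(chunks) == 2:
        package, scope = chunks
    elif len(chunks) == 1:
        package, scope = _ROOT, chunks[0]
    else:
        raise ValueError('Bad scope %s' % item)
    return package, scope, value

print(parse_scope('ALL:source_url=http://sourceforge.net/projects/boost'))
# -> ('ALL', 'source_url', 'http://sourceforge.net/projects/boost')
```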
gh_patches_debug_1371
|
rasdani/github-patches
|
git_diff
|
dotkom__onlineweb4-1902
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot view inventory
## What kind of an issue is this?
- [x] Bug report
## What is the expected behaviour?
To be able to view the inventory
## What is the current behaviour?
A 500 error, with the message `TypeError: '>=' not supported between instances of 'datetime.date' and 'NoneType'`.
## How do you reproduce this problem?
Make sure the inventory is not empty, and try to visit it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/inventory/models.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 from django.conf import settings
4 from django.core.mail import EmailMessage
5 from django.db import models
6 from django.utils import timezone
7 from django.utils.translation import ugettext as _
8
9 from apps.gallery.models import ResponsiveImage
10
11
12 class ItemCategory(models.Model):
13 name = models.CharField(_("Kategori"), max_length=50)
14
15 def __str__(self):
16 return self.name
17
18
19 class Item(models.Model):
20
21 name = models.CharField(_("Varetype"), max_length=50)
22 description = models.CharField(_("Beskrivelse"), max_length=50, null=True, blank=True)
23 price = models.IntegerField(_("Pris"), null=True, blank=True)
24 available = models.BooleanField(_("Til salgs"), default=False)
25 category = models.ForeignKey(ItemCategory, verbose_name=_("Kategori"),
26 related_name="category", null=True, blank=True)
27 image = models.ForeignKey(ResponsiveImage, null=True, blank=True, default=None)
28
29 @property
30 def oldest_expiration_date(self):
31 batches = self.batches.all().order_by("expiration_date")
32 if batches:
33 return batches[0].expiration_date
34 else:
35 return None
36
37 @property
38 def last_added(self):
39 batches = self.batches.all().order_by("-date_added")
40 if batches:
41 return batches[0].date_added
42 else:
43 return None
44
45 def oldest_batch(self):
46 batches = self.batches.filter(amount__gt=0).order_by("date_added")
47 if batches:
48 return batches[0]
49 else:
50 return None
51
52 @property
53 def total_amount(self):
54 return sum([batch.amount for batch in self.batches.all()])
55
56 @property
57 def has_expired_batch(self):
58 if timezone.now().date() >= self.oldest_expiration_date:
59 return True
60 return False
61
62 def reduce_stock(self, amount):
63 """
64 Makes an assumption that the oldest batches are sold first and reduce them first.
65 """
66
67 oldest_batch = self.oldest_batch()
68
69 if oldest_batch:
70 if oldest_batch.amount > amount:
71 oldest_batch.amount = oldest_batch.amount - amount
72 oldest_batch.save()
73 else:
74 diff = amount - oldest_batch.amount
75 oldest_batch.amount = 0
76 oldest_batch.save()
77 self.reduce_stock(diff)
78
79 self.handle_notifications(amount)
80
81 def handle_notifications(self, amount):
82
83 # Send one notification when the stock goes to or below 10
84 if self.total_amount <= 10 and self.total_amount + amount > 10:
85 message = "Det er kun " + str(self.total_amount) + " igjen av " + str(self.name) + \
86 " på kontoret.\n\n" \
87 "Dette er en automatisk generert melding og antallet kan være noe feil."
88
89 EmailMessage(
90 "[Nibble] Lav stock på " + self.name,
91 str(message),
92 "[email protected]",
93 [],
94 [settings.EMAIL_TRIKOM]
95 ).send()
96
97 def __str__(self):
98 return self.name
99
100 class Meta(object):
101 verbose_name = _("Vare")
102 verbose_name_plural = _("Varer")
103 permissions = (
104 ("view_item", "View Inventory Item"),
105 )
106
107
108 class Batch(models.Model):
109
110 item = models.ForeignKey(Item, verbose_name=_("Vare"), related_name="batches")
111 amount = models.IntegerField(_("Antall"), default=0)
112 date_added = models.DateField(_("Dato lagt til"), editable=False, auto_now_add=True)
113 expiration_date = models.DateField(_("Utløpsdato"), null=True, blank=True, editable=True)
114
115 class Meta(object):
116 verbose_name = _("Batch")
117 verbose_name_plural = _("Batches")
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/apps/inventory/models.py b/apps/inventory/models.py
--- a/apps/inventory/models.py
+++ b/apps/inventory/models.py
@@ -55,7 +55,7 @@
@property
def has_expired_batch(self):
- if timezone.now().date() >= self.oldest_expiration_date:
+ if self.oldest_expiration_date and timezone.now().date() >= self.oldest_expiration_date:
return True
return False
|
{"golden_diff": "diff --git a/apps/inventory/models.py b/apps/inventory/models.py\n--- a/apps/inventory/models.py\n+++ b/apps/inventory/models.py\n@@ -55,7 +55,7 @@\n \n @property\n def has_expired_batch(self):\n- if timezone.now().date() >= self.oldest_expiration_date:\n+ if self.oldest_expiration_date and timezone.now().date() >= self.oldest_expiration_date:\n return True\n return False\n", "issue": "Cannot view inventory\n## What kind of an issue is this?\r\n\r\n- [x] Bug report\r\n\r\n\r\n## What is the expected behaviour?\r\n\r\nTo be able to view the inventory\r\n\r\n## What is the current behaviour?\r\n\r\nA 500 error, with the message `TypeError: '>=' not supported between instances of 'datetime.date' and 'NoneType'`. \r\n\r\n\r\n## How do you reproduce this problem? \r\n\r\nMake sure the inventory is not empty, and try to visit it.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\nfrom django.db import models\nfrom django.utils import timezone\nfrom django.utils.translation import ugettext as _\n\nfrom apps.gallery.models import ResponsiveImage\n\n\nclass ItemCategory(models.Model):\n name = models.CharField(_(\"Kategori\"), max_length=50)\n\n def __str__(self):\n return self.name\n\n\nclass Item(models.Model):\n\n name = models.CharField(_(\"Varetype\"), max_length=50)\n description = models.CharField(_(\"Beskrivelse\"), max_length=50, null=True, blank=True)\n price = models.IntegerField(_(\"Pris\"), null=True, blank=True)\n available = models.BooleanField(_(\"Til salgs\"), default=False)\n category = models.ForeignKey(ItemCategory, verbose_name=_(\"Kategori\"),\n related_name=\"category\", null=True, blank=True)\n image = models.ForeignKey(ResponsiveImage, null=True, blank=True, default=None)\n\n @property\n def oldest_expiration_date(self):\n batches = self.batches.all().order_by(\"expiration_date\")\n if batches:\n return batches[0].expiration_date\n else:\n return None\n\n @property\n def last_added(self):\n batches = self.batches.all().order_by(\"-date_added\")\n if batches:\n return batches[0].date_added\n else:\n return None\n\n def oldest_batch(self):\n batches = self.batches.filter(amount__gt=0).order_by(\"date_added\")\n if batches:\n return batches[0]\n else:\n return None\n\n @property\n def total_amount(self):\n return sum([batch.amount for batch in self.batches.all()])\n\n @property\n def has_expired_batch(self):\n if timezone.now().date() >= self.oldest_expiration_date:\n return True\n return False\n\n def reduce_stock(self, amount):\n \"\"\"\n Makes an assumption that the oldest batches are sold first and reduce them first.\n \"\"\"\n\n oldest_batch = self.oldest_batch()\n\n if oldest_batch:\n if oldest_batch.amount > amount:\n oldest_batch.amount = oldest_batch.amount - amount\n oldest_batch.save()\n else:\n diff = amount - oldest_batch.amount\n oldest_batch.amount = 0\n oldest_batch.save()\n self.reduce_stock(diff)\n\n self.handle_notifications(amount)\n\n def handle_notifications(self, amount):\n\n # Send one notification when the stock goes to or below 10\n if self.total_amount <= 10 and self.total_amount + amount > 10:\n message = \"Det er kun \" + str(self.total_amount) + \" igjen av \" + str(self.name) + \\\n \" p\u00e5 kontoret.\\n\\n\" \\\n \"Dette er en automatisk generert melding og antallet kan v\u00e6re noe feil.\"\n\n EmailMessage(\n \"[Nibble] Lav stock p\u00e5 \" + self.name,\n str(message),\n \"[email protected]\",\n [],\n [settings.EMAIL_TRIKOM]\n ).send()\n\n def __str__(self):\n return self.name\n\n class Meta(object):\n verbose_name = _(\"Vare\")\n verbose_name_plural = _(\"Varer\")\n permissions = (\n (\"view_item\", \"View Inventory Item\"),\n )\n\n\nclass Batch(models.Model):\n\n item = models.ForeignKey(Item, verbose_name=_(\"Vare\"), related_name=\"batches\")\n amount = models.IntegerField(_(\"Antall\"), default=0)\n date_added = models.DateField(_(\"Dato lagt til\"), editable=False, auto_now_add=True)\n expiration_date = models.DateField(_(\"Utl\u00f8psdato\"), null=True, blank=True, editable=True)\n\n class Meta(object):\n verbose_name = _(\"Batch\")\n verbose_name_plural = _(\"Batches\")\n", "path": "apps/inventory/models.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\nfrom django.db import models\nfrom django.utils import timezone\nfrom django.utils.translation import ugettext as _\n\nfrom apps.gallery.models import ResponsiveImage\n\n\nclass ItemCategory(models.Model):\n name = models.CharField(_(\"Kategori\"), max_length=50)\n\n def __str__(self):\n return self.name\n\n\nclass Item(models.Model):\n\n name = models.CharField(_(\"Varetype\"), max_length=50)\n description = models.CharField(_(\"Beskrivelse\"), max_length=50, null=True, blank=True)\n price = models.IntegerField(_(\"Pris\"), null=True, blank=True)\n available = models.BooleanField(_(\"Til salgs\"), default=False)\n category = models.ForeignKey(ItemCategory, verbose_name=_(\"Kategori\"),\n related_name=\"category\", null=True, blank=True)\n image = models.ForeignKey(ResponsiveImage, null=True, blank=True, default=None)\n\n @property\n def oldest_expiration_date(self):\n batches = self.batches.all().order_by(\"expiration_date\")\n if batches:\n return batches[0].expiration_date\n else:\n return None\n\n @property\n def last_added(self):\n batches = self.batches.all().order_by(\"-date_added\")\n if batches:\n return batches[0].date_added\n else:\n return None\n\n def oldest_batch(self):\n batches = self.batches.filter(amount__gt=0).order_by(\"date_added\")\n if batches:\n return batches[0]\n else:\n return None\n\n @property\n def total_amount(self):\n return sum([batch.amount for batch in self.batches.all()])\n\n @property\n def has_expired_batch(self):\n if self.oldest_expiration_date and timezone.now().date() >= self.oldest_expiration_date:\n return True\n return False\n\n def reduce_stock(self, amount):\n \"\"\"\n Makes an assumption that the oldest batches are sold first and reduce them first.\n \"\"\"\n\n oldest_batch = self.oldest_batch()\n\n if oldest_batch:\n if oldest_batch.amount > amount:\n oldest_batch.amount = oldest_batch.amount - amount\n oldest_batch.save()\n else:\n diff = amount - oldest_batch.amount\n oldest_batch.amount = 0\n oldest_batch.save()\n self.reduce_stock(diff)\n\n self.handle_notifications(amount)\n\n def handle_notifications(self, amount):\n\n # Send one notification when the stock goes to or below 10\n if self.total_amount <= 10 and self.total_amount + amount > 10:\n message = \"Det er kun \" + str(self.total_amount) + \" igjen av \" + str(self.name) + \\\n \" p\u00e5 kontoret.\\n\\n\" \\\n \"Dette er en automatisk generert melding og antallet kan v\u00e6re noe feil.\"\n\n EmailMessage(\n \"[Nibble] Lav stock p\u00e5 \" + self.name,\n str(message),\n \"[email protected]\",\n [],\n [settings.EMAIL_TRIKOM]\n ).send()\n\n def __str__(self):\n return self.name\n\n class Meta(object):\n verbose_name = _(\"Vare\")\n verbose_name_plural = _(\"Varer\")\n permissions = (\n (\"view_item\", \"View Inventory Item\"),\n )\n\n\nclass Batch(models.Model):\n\n item = models.ForeignKey(Item, verbose_name=_(\"Vare\"), related_name=\"batches\")\n amount = models.IntegerField(_(\"Antall\"), default=0)\n date_added = models.DateField(_(\"Dato lagt til\"), editable=False, auto_now_add=True)\n expiration_date = models.DateField(_(\"Utl\u00f8psdato\"), null=True, blank=True, editable=True)\n\n class Meta(object):\n verbose_name = _(\"Batch\")\n verbose_name_plural = _(\"Batches\")\n", "path": "apps/inventory/models.py"}]}
| 1,422 | 103 |
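A minimal, framework-free sketch of the guard in the row above, using a hypothetical `Item` stand-in (plain `date` objects instead of a Django `Batch` queryset; the names here are illustrative assumptions, not the project's real model):

```python
from datetime import date


class Item:
    """Toy stand-in for the Django model above; batches are plain dates here."""

    def __init__(self, expiration_dates):
        self._dates = sorted(expiration_dates)

    @property
    def oldest_expiration_date(self):
        # Mirrors the model property: returns None when there are no batches.
        return self._dates[0] if self._dates else None

    @property
    def has_expired_batch(self):
        # The guard from the golden diff; without it, `date >= None` raises
        # exactly the TypeError quoted in the issue.
        return bool(
            self.oldest_expiration_date
            and date.today() >= self.oldest_expiration_date
        )


assert Item([]).has_expired_batch is False  # empty inventory no longer crashes
assert Item([date(2000, 1, 1)]).has_expired_batch is True
```

Short-circuiting with `and` is the cheapest fix; an explicit `if self.oldest_expiration_date is None: return False` would behave the same way.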
gh_patches_debug_4626
|
rasdani/github-patches
|
git_diff
|
sopel-irc__sopel-1677
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
etymology: Empty argument not handled
Doing `.ety` with no or an empty argument doesn't behave properly.
On Python 2 (my old personal instance), it causes `TypeError: None object cannot be quoted (file "/usr/lib/python2.7/urllib.py", line 1255, in quote)`
On Python 3 ("official" Sopel instance on freenode), the command looks up the etymology of "None" (and gets no results, because of the capital letter).
This is a good, easy issue for first-time contributors to fix.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/modules/etymology.py`
Content:
```
1 # coding=utf-8
2 """
3 etymology.py - Sopel Etymology Module
4 Copyright 2007-9, Sean B. Palmer, inamidst.com
5 Copyright 2018-9, Sopel contributors
6 Licensed under the Eiffel Forum License 2.
7
8 https://sopel.chat
9 """
10 from __future__ import unicode_literals, absolute_import, print_function, division
11
12 from re import sub
13
14 from requests import get
15
16 from sopel.module import commands, example, NOLIMIT
17 from sopel.tools import web
18
19 try:
20 # Python 2.7
21 from HTMLParser import HTMLParser
22 h = HTMLParser()
23 unescape = h.unescape
24 except ImportError:
25 try:
26 # Python 3.4+
27 from html import unescape # https://stackoverflow.com/a/2087433
28 except ImportError:
29 # Python 3.3... sigh
30 from html.parser import HTMLParser
31 h = HTMLParser()
32 unescape = h.unescape
33
34
35 ETYURI = 'https://www.etymonline.com/word/%s'
36 ETYSEARCH = 'https://www.etymonline.com/search?q=%s'
37
38
39 def etymology(word):
40 # @@ <nsh> sbp, would it be possible to have a flag for .ety to get 2nd/etc
41 # entries? - http://swhack.com/logs/2006-07-19#T15-05-29
42
43 if len(word) > 25:
44 raise ValueError("Word too long: %s[…]" % word[:10])
45
46 ety = get(ETYURI % web.quote(word))
47 if ety.status_code != 200:
48 return None
49
50 # Let's find it
51 start = ety.text.find("word__defination")
52 start = ety.text.find("<p>", start)
53 stop = ety.text.find("</p>", start)
54 sentence = ety.text[start + 3:stop]
55 # Clean up
56 sentence = unescape(sentence)
57 sentence = sub('<[^<]+?>', '', sentence)
58
59 maxlength = 275
60 if len(sentence) > maxlength:
61 sentence = sentence[:maxlength]
62 words = sentence[:-5].split(' ')
63 words.pop()
64 sentence = ' '.join(words) + ' […]'
65
66 sentence = '"' + sentence.replace('"', "'") + '"'
67 return sentence + ' - ' + (ETYURI % web.quote(word))
68
69
70 @commands('ety')
71 @example('.ety word')
72 def f_etymology(bot, trigger):
73 """Look up the etymology of a word"""
74 word = trigger.group(2)
75
76 try:
77 result = etymology(word)
78 except IOError:
79 msg = "Can't connect to etymonline.com (%s)" % (ETYURI % web.quote(word))
80 bot.say(msg, trigger.sender)
81 return NOLIMIT
82 except (AttributeError, TypeError):
83 result = None
84 except ValueError as ve:
85 result = str(ve)
86
87 if result is not None:
88 bot.say(result, trigger.sender)
89 else:
90 uri = ETYSEARCH % web.quote(word)
91 msg = 'Can\'t find the etymology for "%s". Try %s' % (word, uri)
92 bot.say(msg, trigger.sender)
93 return NOLIMIT
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sopel/modules/etymology.py b/sopel/modules/etymology.py
--- a/sopel/modules/etymology.py
+++ b/sopel/modules/etymology.py
@@ -40,6 +40,9 @@
# @@ <nsh> sbp, would it be possible to have a flag for .ety to get 2nd/etc
# entries? - http://swhack.com/logs/2006-07-19#T15-05-29
+ if not word:
+ raise ValueError("No word to look for.")
+
if len(word) > 25:
raise ValueError("Word too long: %s[…]" % word[:10])
|
{"golden_diff": "diff --git a/sopel/modules/etymology.py b/sopel/modules/etymology.py\n--- a/sopel/modules/etymology.py\n+++ b/sopel/modules/etymology.py\n@@ -40,6 +40,9 @@\n # @@ <nsh> sbp, would it be possible to have a flag for .ety to get 2nd/etc\n # entries? - http://swhack.com/logs/2006-07-19#T15-05-29\n \n+ if not word:\n+ raise ValueError(\"No word to look for.\")\n+\n if len(word) > 25:\n raise ValueError(\"Word too long: %s[\u2026]\" % word[:10])\n", "issue": "etymology: Empty argument not handled\nDoing `.ety` with no or an empty argument doesn't behave properly.\r\n\r\nOn Python 2 (my old personal instance), it causes `TypeError: None object cannot be quoted (file \"/usr/lib/python2.7/urllib.py\", line 1255, in quote)`\r\n\r\nOn Python 3 (\"official\" Sopel instance on freenode), the command looks up the etymology of \"None\" (and gets no results, because of the capital letter).\r\n\r\nThis is a good, easy issue for first-time contributors to fix.\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"\netymology.py - Sopel Etymology Module\nCopyright 2007-9, Sean B. Palmer, inamidst.com\nCopyright 2018-9, Sopel contributors\nLicensed under the Eiffel Forum License 2.\n\nhttps://sopel.chat\n\"\"\"\nfrom __future__ import unicode_literals, absolute_import, print_function, division\n\nfrom re import sub\n\nfrom requests import get\n\nfrom sopel.module import commands, example, NOLIMIT\nfrom sopel.tools import web\n\ntry:\n # Python 2.7\n from HTMLParser import HTMLParser\n h = HTMLParser()\n unescape = h.unescape\nexcept ImportError:\n try:\n # Python 3.4+\n from html import unescape # https://stackoverflow.com/a/2087433\n except ImportError:\n # Python 3.3... sigh\n from html.parser import HTMLParser\n h = HTMLParser()\n unescape = h.unescape\n\n\nETYURI = 'https://www.etymonline.com/word/%s'\nETYSEARCH = 'https://www.etymonline.com/search?q=%s'\n\n\ndef etymology(word):\n # @@ <nsh> sbp, would it be possible to have a flag for .ety to get 2nd/etc\n # entries? - http://swhack.com/logs/2006-07-19#T15-05-29\n\n if len(word) > 25:\n raise ValueError(\"Word too long: %s[\u2026]\" % word[:10])\n\n ety = get(ETYURI % web.quote(word))\n if ety.status_code != 200:\n return None\n\n # Let's find it\n start = ety.text.find(\"word__defination\")\n start = ety.text.find(\"<p>\", start)\n stop = ety.text.find(\"</p>\", start)\n sentence = ety.text[start + 3:stop]\n # Clean up\n sentence = unescape(sentence)\n sentence = sub('<[^<]+?>', '', sentence)\n\n maxlength = 275\n if len(sentence) > maxlength:\n sentence = sentence[:maxlength]\n words = sentence[:-5].split(' ')\n words.pop()\n sentence = ' '.join(words) + ' [\u2026]'\n\n sentence = '\"' + sentence.replace('\"', \"'\") + '\"'\n return sentence + ' - ' + (ETYURI % web.quote(word))\n\n\n@commands('ety')\n@example('.ety word')\ndef f_etymology(bot, trigger):\n \"\"\"Look up the etymology of a word\"\"\"\n word = trigger.group(2)\n\n try:\n result = etymology(word)\n except IOError:\n msg = \"Can't connect to etymonline.com (%s)\" % (ETYURI % web.quote(word))\n bot.say(msg, trigger.sender)\n return NOLIMIT\n except (AttributeError, TypeError):\n result = None\n except ValueError as ve:\n result = str(ve)\n\n if result is not None:\n bot.say(result, trigger.sender)\n else:\n uri = ETYSEARCH % web.quote(word)\n msg = 'Can\\'t find the etymology for \"%s\". Try %s' % (word, uri)\n bot.say(msg, trigger.sender)\n return NOLIMIT\n", "path": "sopel/modules/etymology.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"\netymology.py - Sopel Etymology Module\nCopyright 2007-9, Sean B. Palmer, inamidst.com\nCopyright 2018-9, Sopel contributors\nLicensed under the Eiffel Forum License 2.\n\nhttps://sopel.chat\n\"\"\"\nfrom __future__ import unicode_literals, absolute_import, print_function, division\n\nfrom re import sub\n\nfrom requests import get\n\nfrom sopel.module import commands, example, NOLIMIT\nfrom sopel.tools import web\n\ntry:\n # Python 2.7\n from HTMLParser import HTMLParser\n h = HTMLParser()\n unescape = h.unescape\nexcept ImportError:\n try:\n # Python 3.4+\n from html import unescape # https://stackoverflow.com/a/2087433\n except ImportError:\n # Python 3.3... sigh\n from html.parser import HTMLParser\n h = HTMLParser()\n unescape = h.unescape\n\n\nETYURI = 'https://www.etymonline.com/word/%s'\nETYSEARCH = 'https://www.etymonline.com/search?q=%s'\n\n\ndef etymology(word):\n # @@ <nsh> sbp, would it be possible to have a flag for .ety to get 2nd/etc\n # entries? - http://swhack.com/logs/2006-07-19#T15-05-29\n\n if not word:\n raise ValueError(\"No word to look for.\")\n\n if len(word) > 25:\n raise ValueError(\"Word too long: %s[\u2026]\" % word[:10])\n\n ety = get(ETYURI % web.quote(word))\n if ety.status_code != 200:\n return None\n\n # Let's find it\n start = ety.text.find(\"word__defination\")\n start = ety.text.find(\"<p>\", start)\n stop = ety.text.find(\"</p>\", start)\n sentence = ety.text[start + 3:stop]\n # Clean up\n sentence = unescape(sentence)\n sentence = sub('<[^<]+?>', '', sentence)\n\n maxlength = 275\n if len(sentence) > maxlength:\n sentence = sentence[:maxlength]\n words = sentence[:-5].split(' ')\n words.pop()\n sentence = ' '.join(words) + ' [\u2026]'\n\n sentence = '\"' + sentence.replace('\"', \"'\") + '\"'\n return sentence + ' - ' + (ETYURI % web.quote(word))\n\n\n@commands('ety')\n@example('.ety word')\ndef f_etymology(bot, trigger):\n \"\"\"Look up the etymology of a word\"\"\"\n word = trigger.group(2)\n\n try:\n result = etymology(word)\n except IOError:\n msg = \"Can't connect to etymonline.com (%s)\" % (ETYURI % web.quote(word))\n bot.say(msg, trigger.sender)\n return NOLIMIT\n except (AttributeError, TypeError):\n result = None\n except ValueError as ve:\n result = str(ve)\n\n if result is not None:\n bot.say(result, trigger.sender)\n else:\n uri = ETYSEARCH % web.quote(word)\n msg = 'Can\\'t find the etymology for \"%s\". Try %s' % (word, uri)\n bot.say(msg, trigger.sender)\n return NOLIMIT\n", "path": "sopel/modules/etymology.py"}]}
| 1,305 | 164 |
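A network-free sketch of the validation pattern this row's diff introduces; the HTTP lookup is stubbed out (an assumption for brevity), and only the argument guards match the patched `etymology()`:

```python
def etymology(word):
    """Stubbed sketch of the patched function; no requests/HTTP here."""
    if not word:  # catches None (missing argument) and "" (empty argument) alike
        raise ValueError("No word to look for.")
    if len(word) > 25:
        raise ValueError("Word too long: %s[…]" % word[:10])
    return "etymology of %r would be fetched here" % word


for bad in (None, ""):
    try:
        etymology(bad)
    except ValueError as exc:
        print(exc)  # -> No word to look for.

print(etymology("word"))
```

Because the caller already turns `ValueError` into a chat reply (`result = str(ve)`), raising is enough to give users a readable message on both Python 2 and Python 3.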
gh_patches_debug_16778
|
rasdani/github-patches
|
git_diff
|
sanic-org__sanic-2373
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Websocket logger uses sanic.log.error_logger
Hey there,
Why do we see:
sanic.error - INFO - connection open
via stderr when getting new websocket connections. Shouldn't this go to stdout?
Also, is it possible to add "middleware" so we can properly log websocket connections and disconnects? Is it possible to get a callback on websocket disconnects?
Thanks!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sanic/server/protocols/websocket_protocol.py`
Content:
```
1 from typing import TYPE_CHECKING, Optional, Sequence, cast
2
3 from websockets.connection import CLOSED, CLOSING, OPEN
4 from websockets.server import ServerConnection
5 from websockets.typing import Subprotocol
6
7 from sanic.exceptions import ServerError
8 from sanic.log import error_logger
9 from sanic.server import HttpProtocol
10
11 from ..websockets.impl import WebsocketImplProtocol
12
13
14 if TYPE_CHECKING: # no cov
15 from websockets import http11
16
17
18 class WebSocketProtocol(HttpProtocol):
19 __slots__ = (
20 "websocket",
21 "websocket_timeout",
22 "websocket_max_size",
23 "websocket_ping_interval",
24 "websocket_ping_timeout",
25 )
26
27 def __init__(
28 self,
29 *args,
30 websocket_timeout: float = 10.0,
31 websocket_max_size: Optional[int] = None,
32 websocket_ping_interval: Optional[float] = 20.0,
33 websocket_ping_timeout: Optional[float] = 20.0,
34 **kwargs,
35 ):
36 super().__init__(*args, **kwargs)
37 self.websocket: Optional[WebsocketImplProtocol] = None
38 self.websocket_timeout = websocket_timeout
39 self.websocket_max_size = websocket_max_size
40 self.websocket_ping_interval = websocket_ping_interval
41 self.websocket_ping_timeout = websocket_ping_timeout
42
43 def connection_lost(self, exc):
44 if self.websocket is not None:
45 self.websocket.connection_lost(exc)
46 super().connection_lost(exc)
47
48 def data_received(self, data):
49 if self.websocket is not None:
50 self.websocket.data_received(data)
51 else:
52 # Pass it to HttpProtocol handler first
53 # That will (hopefully) upgrade it to a websocket.
54 super().data_received(data)
55
56 def eof_received(self) -> Optional[bool]:
57 if self.websocket is not None:
58 return self.websocket.eof_received()
59 else:
60 return False
61
62 def close(self, timeout: Optional[float] = None):
63 # Called by HttpProtocol at the end of connection_task
64 # If we've upgraded to websocket, we do our own closing
65 if self.websocket is not None:
66 # Note, we don't want to use websocket.close()
67 # That is used for user's application code to send a
68 # websocket close packet. This is different.
69 self.websocket.end_connection(1001)
70 else:
71 super().close()
72
73 def close_if_idle(self):
74 # Called by Sanic Server when shutting down
75 # If we've upgraded to websocket, shut it down
76 if self.websocket is not None:
77 if self.websocket.connection.state in (CLOSING, CLOSED):
78 return True
79 elif self.websocket.loop is not None:
80 self.websocket.loop.create_task(self.websocket.close(1001))
81 else:
82 self.websocket.end_connection(1001)
83 else:
84 return super().close_if_idle()
85
86 async def websocket_handshake(
87 self, request, subprotocols: Optional[Sequence[str]] = None
88 ):
89 # let the websockets package do the handshake with the client
90 try:
91 if subprotocols is not None:
92 # subprotocols can be a set or frozenset,
93 # but ServerConnection needs a list
94 subprotocols = cast(
95 Optional[Sequence[Subprotocol]],
96 list(
97 [
98 Subprotocol(subprotocol)
99 for subprotocol in subprotocols
100 ]
101 ),
102 )
103 ws_conn = ServerConnection(
104 max_size=self.websocket_max_size,
105 subprotocols=subprotocols,
106 state=OPEN,
107 logger=error_logger,
108 )
109 resp: "http11.Response" = ws_conn.accept(request)
110 except Exception:
111 msg = (
112 "Failed to open a WebSocket connection.\n"
113 "See server log for more information.\n"
114 )
115 raise ServerError(msg, status_code=500)
116 if 100 <= resp.status_code <= 299:
117 first_line = (
118 f"HTTP/1.1 {resp.status_code} {resp.reason_phrase}\r\n"
119 ).encode()
120 rbody = bytearray(first_line)
121 rbody += (
122 "".join([f"{k}: {v}\r\n" for k, v in resp.headers.items()])
123 ).encode()
124 rbody += b"\r\n"
125 if resp.body is not None:
126 rbody += resp.body
127 rbody += b"\r\n\r\n"
128 await super().send(rbody)
129 else:
130 raise ServerError(resp.body, resp.status_code)
131 self.websocket = WebsocketImplProtocol(
132 ws_conn,
133 ping_interval=self.websocket_ping_interval,
134 ping_timeout=self.websocket_ping_timeout,
135 close_timeout=self.websocket_timeout,
136 )
137 loop = (
138 request.transport.loop
139 if hasattr(request, "transport")
140 and hasattr(request.transport, "loop")
141 else None
142 )
143 await self.websocket.connection_made(self, loop=loop)
144 return self.websocket
145
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sanic/server/protocols/websocket_protocol.py b/sanic/server/protocols/websocket_protocol.py
--- a/sanic/server/protocols/websocket_protocol.py
+++ b/sanic/server/protocols/websocket_protocol.py
@@ -5,7 +5,7 @@
from websockets.typing import Subprotocol
from sanic.exceptions import ServerError
-from sanic.log import error_logger
+from sanic.log import logger
from sanic.server import HttpProtocol
from ..websockets.impl import WebsocketImplProtocol
@@ -104,7 +104,7 @@
max_size=self.websocket_max_size,
subprotocols=subprotocols,
state=OPEN,
- logger=error_logger,
+ logger=logger,
)
resp: "http11.Response" = ws_conn.accept(request)
except Exception:
|
{"golden_diff": "diff --git a/sanic/server/protocols/websocket_protocol.py b/sanic/server/protocols/websocket_protocol.py\n--- a/sanic/server/protocols/websocket_protocol.py\n+++ b/sanic/server/protocols/websocket_protocol.py\n@@ -5,7 +5,7 @@\n from websockets.typing import Subprotocol\n \n from sanic.exceptions import ServerError\n-from sanic.log import error_logger\n+from sanic.log import logger\n from sanic.server import HttpProtocol\n \n from ..websockets.impl import WebsocketImplProtocol\n@@ -104,7 +104,7 @@\n max_size=self.websocket_max_size,\n subprotocols=subprotocols,\n state=OPEN,\n- logger=error_logger,\n+ logger=logger,\n )\n resp: \"http11.Response\" = ws_conn.accept(request)\n except Exception:\n", "issue": "Websocket logger uses sanic.log.error_logger\nHey there,\r\n\r\nWhy do we see:\r\n\r\n    sanic.error - INFO - connection open\r\n\r\nvia stderr when getting new websocket connections. Shouldn't this go to stdout?\r\n\r\nAlso, is it possible to add \"middleware\" so we can properly log websocket connections and disconnects? Is it possible to get a callback on websocket disconnects? \r\n\r\nThanks!\n", "before_files": [{"content": "from typing import TYPE_CHECKING, Optional, Sequence, cast\n\nfrom websockets.connection import CLOSED, CLOSING, OPEN\nfrom websockets.server import ServerConnection\nfrom websockets.typing import Subprotocol\n\nfrom sanic.exceptions import ServerError\nfrom sanic.log import error_logger\nfrom sanic.server import HttpProtocol\n\nfrom ..websockets.impl import WebsocketImplProtocol\n\n\nif TYPE_CHECKING: # no cov\n from websockets import http11\n\n\nclass WebSocketProtocol(HttpProtocol):\n __slots__ = (\n \"websocket\",\n \"websocket_timeout\",\n \"websocket_max_size\",\n \"websocket_ping_interval\",\n \"websocket_ping_timeout\",\n )\n\n def __init__(\n self,\n *args,\n websocket_timeout: float = 10.0,\n websocket_max_size: Optional[int] = None,\n websocket_ping_interval: Optional[float] = 20.0,\n websocket_ping_timeout: Optional[float] = 20.0,\n **kwargs,\n ):\n super().__init__(*args, **kwargs)\n self.websocket: Optional[WebsocketImplProtocol] = None\n self.websocket_timeout = websocket_timeout\n self.websocket_max_size = websocket_max_size\n self.websocket_ping_interval = websocket_ping_interval\n self.websocket_ping_timeout = websocket_ping_timeout\n\n def connection_lost(self, exc):\n if self.websocket is not None:\n self.websocket.connection_lost(exc)\n super().connection_lost(exc)\n\n def data_received(self, data):\n if self.websocket is not None:\n self.websocket.data_received(data)\n else:\n # Pass it to HttpProtocol handler first\n # That will (hopefully) upgrade it to a websocket.\n super().data_received(data)\n\n def eof_received(self) -> Optional[bool]:\n if self.websocket is not None:\n return self.websocket.eof_received()\n else:\n return False\n\n def close(self, timeout: Optional[float] = None):\n # Called by HttpProtocol at the end of connection_task\n # If we've upgraded to websocket, we do our own closing\n if self.websocket is not None:\n # Note, we don't want to use websocket.close()\n # That is used for user's application code to send a\n # websocket close packet. This is different.\n self.websocket.end_connection(1001)\n else:\n super().close()\n\n def close_if_idle(self):\n # Called by Sanic Server when shutting down\n # If we've upgraded to websocket, shut it down\n if self.websocket is not None:\n if self.websocket.connection.state in (CLOSING, CLOSED):\n return True\n elif self.websocket.loop is not None:\n self.websocket.loop.create_task(self.websocket.close(1001))\n else:\n self.websocket.end_connection(1001)\n else:\n return super().close_if_idle()\n\n async def websocket_handshake(\n self, request, subprotocols: Optional[Sequence[str]] = None\n ):\n # let the websockets package do the handshake with the client\n try:\n if subprotocols is not None:\n # subprotocols can be a set or frozenset,\n # but ServerConnection needs a list\n subprotocols = cast(\n Optional[Sequence[Subprotocol]],\n list(\n [\n Subprotocol(subprotocol)\n for subprotocol in subprotocols\n ]\n ),\n )\n ws_conn = ServerConnection(\n max_size=self.websocket_max_size,\n subprotocols=subprotocols,\n state=OPEN,\n logger=error_logger,\n )\n resp: \"http11.Response\" = ws_conn.accept(request)\n except Exception:\n msg = (\n \"Failed to open a WebSocket connection.\\n\"\n \"See server log for more information.\\n\"\n )\n raise ServerError(msg, status_code=500)\n if 100 <= resp.status_code <= 299:\n first_line = (\n f\"HTTP/1.1 {resp.status_code} {resp.reason_phrase}\\r\\n\"\n ).encode()\n rbody = bytearray(first_line)\n rbody += (\n \"\".join([f\"{k}: {v}\\r\\n\" for k, v in resp.headers.items()])\n ).encode()\n rbody += b\"\\r\\n\"\n if resp.body is not None:\n rbody += resp.body\n rbody += b\"\\r\\n\\r\\n\"\n await super().send(rbody)\n else:\n raise ServerError(resp.body, resp.status_code)\n self.websocket = WebsocketImplProtocol(\n ws_conn,\n ping_interval=self.websocket_ping_interval,\n ping_timeout=self.websocket_ping_timeout,\n close_timeout=self.websocket_timeout,\n )\n loop = (\n request.transport.loop\n if hasattr(request, \"transport\")\n and hasattr(request.transport, \"loop\")\n else None\n )\n await self.websocket.connection_made(self, loop=loop)\n return self.websocket\n", "path": "sanic/server/protocols/websocket_protocol.py"}], "after_files": [{"content": "from typing import TYPE_CHECKING, Optional, Sequence, cast\n\nfrom websockets.connection import CLOSED, CLOSING, OPEN\nfrom websockets.server import ServerConnection\nfrom websockets.typing import Subprotocol\n\nfrom sanic.exceptions import ServerError\nfrom sanic.log import logger\nfrom sanic.server import HttpProtocol\n\nfrom ..websockets.impl import WebsocketImplProtocol\n\n\nif TYPE_CHECKING: # no cov\n from websockets import http11\n\n\nclass WebSocketProtocol(HttpProtocol):\n __slots__ = (\n \"websocket\",\n \"websocket_timeout\",\n \"websocket_max_size\",\n \"websocket_ping_interval\",\n \"websocket_ping_timeout\",\n )\n\n def __init__(\n self,\n *args,\n websocket_timeout: float = 10.0,\n websocket_max_size: Optional[int] = None,\n websocket_ping_interval: Optional[float] = 20.0,\n websocket_ping_timeout: Optional[float] = 20.0,\n **kwargs,\n ):\n super().__init__(*args, **kwargs)\n self.websocket: Optional[WebsocketImplProtocol] = None\n self.websocket_timeout = websocket_timeout\n self.websocket_max_size = websocket_max_size\n self.websocket_ping_interval = websocket_ping_interval\n self.websocket_ping_timeout = websocket_ping_timeout\n\n def connection_lost(self, exc):\n if self.websocket is not None:\n self.websocket.connection_lost(exc)\n super().connection_lost(exc)\n\n def data_received(self, data):\n if self.websocket is not None:\n self.websocket.data_received(data)\n else:\n # Pass it to HttpProtocol handler first\n # That will (hopefully) upgrade it to a websocket.\n super().data_received(data)\n\n def eof_received(self) -> Optional[bool]:\n if self.websocket is not None:\n return self.websocket.eof_received()\n else:\n return False\n\n def close(self, timeout: Optional[float] = None):\n # Called by HttpProtocol at the end of connection_task\n # If we've upgraded to websocket, we do our own closing\n if self.websocket is not None:\n # Note, we don't want to use websocket.close()\n # That is used for user's application code to send a\n # websocket close packet. This is different.\n self.websocket.end_connection(1001)\n else:\n super().close()\n\n def close_if_idle(self):\n # Called by Sanic Server when shutting down\n # If we've upgraded to websocket, shut it down\n if self.websocket is not None:\n if self.websocket.connection.state in (CLOSING, CLOSED):\n return True\n elif self.websocket.loop is not None:\n self.websocket.loop.create_task(self.websocket.close(1001))\n else:\n self.websocket.end_connection(1001)\n else:\n return super().close_if_idle()\n\n async def websocket_handshake(\n self, request, subprotocols: Optional[Sequence[str]] = None\n ):\n # let the websockets package do the handshake with the client\n try:\n if subprotocols is not None:\n # subprotocols can be a set or frozenset,\n # but ServerConnection needs a list\n subprotocols = cast(\n Optional[Sequence[Subprotocol]],\n list(\n [\n Subprotocol(subprotocol)\n for subprotocol in subprotocols\n ]\n ),\n )\n ws_conn = ServerConnection(\n max_size=self.websocket_max_size,\n subprotocols=subprotocols,\n state=OPEN,\n logger=logger,\n )\n resp: \"http11.Response\" = ws_conn.accept(request)\n except Exception:\n msg = (\n \"Failed to open a WebSocket connection.\\n\"\n \"See server log for more information.\\n\"\n )\n raise ServerError(msg, status_code=500)\n if 100 <= resp.status_code <= 299:\n first_line = (\n f\"HTTP/1.1 {resp.status_code} {resp.reason_phrase}\\r\\n\"\n ).encode()\n rbody = bytearray(first_line)\n rbody += (\n \"\".join([f\"{k}: {v}\\r\\n\" for k, v in resp.headers.items()])\n ).encode()\n rbody += b\"\\r\\n\"\n if resp.body is not None:\n rbody += resp.body\n rbody += b\"\\r\\n\\r\\n\"\n await super().send(rbody)\n else:\n raise ServerError(resp.body, resp.status_code)\n self.websocket = WebsocketImplProtocol(\n ws_conn,\n ping_interval=self.websocket_ping_interval,\n ping_timeout=self.websocket_ping_timeout,\n close_timeout=self.websocket_timeout,\n )\n loop = (\n request.transport.loop\n if hasattr(request, \"transport\")\n and hasattr(request.transport, \"loop\")\n else None\n )\n await self.websocket.connection_made(self, loop=loop)\n return self.websocket\n", "path": "sanic/server/protocols/websocket_protocol.py"}]}
| 1,723 | 177 |
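The fix above is purely a choice between two preconfigured loggers. The snippet below is an illustrative handler setup (not Sanic's actual logging config) showing why passing `logger` instead of `error_logger` to `ServerConnection` moves the routine "connection open" message from stderr to stdout:

```python
import logging
import sys

logger = logging.getLogger("demo.root")         # stand-in for sanic.log.logger
error_logger = logging.getLogger("demo.error")  # stand-in for sanic.log.error_logger
logger.addHandler(logging.StreamHandler(sys.stdout))
error_logger.addHandler(logging.StreamHandler(sys.stderr))
for lg in (logger, error_logger):
    lg.setLevel(logging.INFO)

logger.info("connection open")          # after the patch: routine INFO -> stdout
error_logger.error("handshake failed")  # error_logger stays reserved for stderr
```

The `websockets` package logs its lifecycle events on whatever logger object it is handed, so this one-argument change is all the rerouting the patch needs.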
gh_patches_debug_3912
|
rasdani/github-patches
|
git_diff
|
saulpw__visidata-1338
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix documentation on website
Not sure if this belongs here, but I believe to have found a small issue on the website.
Paragraph # 7 in [10 Ways to Use VisiData](https://www.visidata.org/blog/2020/ten/) says you should use `visidata.view_pandas(df)` to view a dataframe in pandas.
I believe that should be `visidata.vd.view_pandas(df)`

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `visidata/vdobj.py`
Content:
```
1 from functools import wraps
2 from unittest import mock
3 import curses
4
5 import visidata
6
7 __all__ = ['ENTER', 'ALT', 'ESC', 'asyncthread', 'VisiData']
8
9
10 ENTER='Enter'
11 ALT=ESC='^['
12
13
14 # define @asyncthread for potentially long-running functions
15 # when function is called, instead launches a thread
16 def asyncthread(func):
17 'Function decorator, to make calls to `func()` spawn a separate thread if available.'
18 @wraps(func)
19 def _execAsync(*args, **kwargs):
20 if args and isinstance(args[0], visidata.BaseSheet): #1136: allow cancel of async methods on Sheet
21 if 'sheet' not in kwargs:
22 kwargs['sheet'] = args[0]
23 return visidata.vd.execAsync(func, *args, **kwargs)
24 return _execAsync
25
26
27 class VisiData(visidata.Extensible):
28 allPrefixes = ['g', 'z', ESC] # embig'g'en, 'z'mallify, ESC=Alt/Meta
29
30 @classmethod
31 def global_api(cls, func):
32 'Make global func() and identical vd.func()'
33 def _vdfunc(*args, **kwargs):
34 return func(visidata.vd, *args, **kwargs)
35 setattr(cls, func.__name__, func)
36 return wraps(func)(_vdfunc)
37
38 def __init__(self):
39 self.sheets = [] # list of BaseSheet; all sheets on the sheet stack
40 self.allSheets = [] # list of all non-precious sheets ever pushed
41 self.lastErrors = []
42 self.keystrokes = ''
43 self.scrFull = mock.MagicMock(__bool__=mock.Mock(return_value=False)) # disable curses in batch mode
44 self._cmdlog = None
45 self.contexts = [self] # objects whose attributes are in the fallback context for eval/exec.
46
47 def sheetstack(self, pane=0):
48 'Return list of sheets in given *pane*. pane=0 is the active pane. pane=-1 is the inactive pane.'
49 if pane == -1:
50 return list(vs for vs in self.sheets if vs.pane and (vs.pane != self.activePane))
51 else:
52 return list(vs for vs in self.sheets if vs.pane == (pane or self.activePane))
53
54 @property
55 def stackedSheets(self):
56 return list(vs for vs in self.sheets if vs.pane)
57
58 @property
59 def activeSheet(self):
60 'Return top sheet on sheets stack, or cmdlog if no sheets.'
61 for vs in self.sheets:
62 if vs.pane and vs.pane == self.activePane:
63 return vs
64
65 for vs in self.sheets:
66 if vs.pane and vs.pane != self.activePane:
67 return vs
68
69 return self._cmdlog
70
71 @property
72 def activeStack(self):
73 return self.sheetstack() or self.sheetstack(-1)
74
75 @visidata.drawcache_property
76 def mousereg(self):
77 return []
78
79 def __copy__(self):
80 'Dummy method for Extensible.init()'
81 pass
82
83 def finalInit(self):
84 'Initialize members specified in other modules with init()'
85 pass
86
87 @classmethod
88 def init(cls, membername, initfunc, **kwargs):
89 'Overload Extensible.init() to call finalInit instead of __init__'
90 oldinit = cls.finalInit
91 def newinit(self, *args, **kwargs):
92 oldinit(self, *args, **kwargs)
93 setattr(self, membername, initfunc())
94 cls.finalInit = newinit
95 super().init(membername, lambda: None, **kwargs)
96
97 def clearCaches(self):
98 'Invalidate internal caches between command inputs.'
99 visidata.Extensible.clear_all_caches()
100
101 def getkeystroke(self, scr, vs=None):
102 'Get keystroke and display it on status bar.'
103 k = None
104 try:
105 scr.refresh()
106 k = scr.get_wch()
107 vs = vs or self.activeSheet
108 if vs:
109 self.drawRightStatus(vs._scr, vs) # continue to display progress %
110 except curses.error:
111 return '' # curses timeout
112
113 if isinstance(k, str):
114 if ord(k) >= 32 and ord(k) != 127: # 127 == DEL or ^?
115 return k
116 k = ord(k)
117 return curses.keyname(k).decode('utf-8')
118
119 def onMouse(self, scr, y, x, h, w, **kwargs):
120 self.mousereg.append((scr, y, x, h, w, kwargs))
121
122 def getMouse(self, _scr, _x, _y, button):
123 for scr, y, x, h, w, kwargs in self.mousereg[::-1]:
124 if scr is _scr and x <= _x < x+w and y <= _y < y+h and button in kwargs:
125 return kwargs[button]
126
127 @property
128 def screenHeight(self):
129 return self.scrFull.getmaxyx()[0] if self.scrFull else 25
130
131 @property
132 def screenWidth(self):
133 return self.scrFull.getmaxyx()[1] if self.scrFull else 80
134
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/visidata/vdobj.py b/visidata/vdobj.py
--- a/visidata/vdobj.py
+++ b/visidata/vdobj.py
@@ -32,6 +32,7 @@
'Make global func() and identical vd.func()'
def _vdfunc(*args, **kwargs):
return func(visidata.vd, *args, **kwargs)
+ visidata.vd.addGlobals({func.__name__: func})
setattr(cls, func.__name__, func)
return wraps(func)(_vdfunc)
|
{"golden_diff": "diff --git a/visidata/vdobj.py b/visidata/vdobj.py\n--- a/visidata/vdobj.py\n+++ b/visidata/vdobj.py\n@@ -32,6 +32,7 @@\n 'Make global func() and identical vd.func()'\n def _vdfunc(*args, **kwargs):\n return func(visidata.vd, *args, **kwargs)\n+ visidata.vd.addGlobals({func.__name__: func})\n setattr(cls, func.__name__, func)\n return wraps(func)(_vdfunc)\n", "issue": "Fix documentation on website\nNot sure if this belongs here, but I believe to have found a small issue on the website.\r\n\r\nParagraph # 7 in [10 Ways to Use VisiData](https://www.visidata.org/blog/2020/ten/) says you should use `visidata.view_pandas(df)` to view a dataframe in pandas.\r\n\r\nI believe that should be `visidata.vd.view_pandas(df)`\r\n\r\n\r\n\n", "before_files": [{"content": "from functools import wraps\nfrom unittest import mock\nimport curses\n\nimport visidata\n\n__all__ = ['ENTER', 'ALT', 'ESC', 'asyncthread', 'VisiData']\n\n\nENTER='Enter'\nALT=ESC='^['\n\n\n# define @asyncthread for potentially long-running functions\n# when function is called, instead launches a thread\ndef asyncthread(func):\n 'Function decorator, to make calls to `func()` spawn a separate thread if available.'\n @wraps(func)\n def _execAsync(*args, **kwargs):\n if args and isinstance(args[0], visidata.BaseSheet): #1136: allow cancel of async methods on Sheet\n if 'sheet' not in kwargs:\n kwargs['sheet'] = args[0]\n return visidata.vd.execAsync(func, *args, **kwargs)\n return _execAsync\n\n\nclass VisiData(visidata.Extensible):\n allPrefixes = ['g', 'z', ESC] # embig'g'en, 'z'mallify, ESC=Alt/Meta\n\n @classmethod\n def global_api(cls, func):\n 'Make global func() and identical vd.func()'\n def _vdfunc(*args, **kwargs):\n return func(visidata.vd, *args, **kwargs)\n setattr(cls, func.__name__, func)\n return wraps(func)(_vdfunc)\n\n def __init__(self):\n self.sheets = [] # list of BaseSheet; all sheets on the sheet stack\n self.allSheets = [] # list of all non-precious sheets ever pushed\n self.lastErrors = []\n self.keystrokes = ''\n self.scrFull = mock.MagicMock(__bool__=mock.Mock(return_value=False)) # disable curses in batch mode\n self._cmdlog = None\n self.contexts = [self] # objects whose attributes are in the fallback context for eval/exec.\n\n def sheetstack(self, pane=0):\n 'Return list of sheets in given *pane*. pane=0 is the active pane. pane=-1 is the inactive pane.'\n if pane == -1:\n return list(vs for vs in self.sheets if vs.pane and (vs.pane != self.activePane))\n else:\n return list(vs for vs in self.sheets if vs.pane == (pane or self.activePane))\n\n @property\n def stackedSheets(self):\n return list(vs for vs in self.sheets if vs.pane)\n\n @property\n def activeSheet(self):\n 'Return top sheet on sheets stack, or cmdlog if no sheets.'\n for vs in self.sheets:\n if vs.pane and vs.pane == self.activePane:\n return vs\n\n for vs in self.sheets:\n if vs.pane and vs.pane != self.activePane:\n return vs\n\n return self._cmdlog\n\n @property\n def activeStack(self):\n return self.sheetstack() or self.sheetstack(-1)\n\n @visidata.drawcache_property\n def mousereg(self):\n return []\n\n def __copy__(self):\n 'Dummy method for Extensible.init()'\n pass\n\n def finalInit(self):\n 'Initialize members specified in other modules with init()'\n pass\n\n @classmethod\n def init(cls, membername, initfunc, **kwargs):\n 'Overload Extensible.init() to call finalInit instead of __init__'\n oldinit = cls.finalInit\n def newinit(self, *args, **kwargs):\n oldinit(self, *args, **kwargs)\n setattr(self, membername, initfunc())\n cls.finalInit = newinit\n super().init(membername, lambda: None, **kwargs)\n\n def clearCaches(self):\n 'Invalidate internal caches between command inputs.'\n visidata.Extensible.clear_all_caches()\n\n def getkeystroke(self, scr, vs=None):\n 'Get keystroke and display it on status bar.'\n k = None\n try:\n scr.refresh()\n k = scr.get_wch()\n vs = vs or self.activeSheet\n if vs:\n self.drawRightStatus(vs._scr, vs) # continue to display progress %\n except curses.error:\n return '' # curses timeout\n\n if isinstance(k, str):\n if ord(k) >= 32 and ord(k) != 127: # 127 == DEL or ^?\n return k\n k = ord(k)\n return curses.keyname(k).decode('utf-8')\n\n def onMouse(self, scr, y, x, h, w, **kwargs):\n self.mousereg.append((scr, y, x, h, w, kwargs))\n\n def getMouse(self, _scr, _x, _y, button):\n for scr, y, x, h, w, kwargs in self.mousereg[::-1]:\n if scr is _scr and x <= _x < x+w and y <= _y < y+h and button in kwargs:\n return kwargs[button]\n\n @property\n def screenHeight(self):\n return self.scrFull.getmaxyx()[0] if self.scrFull else 25\n\n @property\n def screenWidth(self):\n return self.scrFull.getmaxyx()[1] if self.scrFull else 80\n", "path": "visidata/vdobj.py"}], "after_files": [{"content": "from functools import wraps\nfrom unittest import mock\nimport curses\n\nimport visidata\n\n__all__ = ['ENTER', 'ALT', 'ESC', 'asyncthread', 'VisiData']\n\n\nENTER='Enter'\nALT=ESC='^['\n\n\n# define @asyncthread for potentially long-running functions\n# when function is called, instead launches a thread\ndef asyncthread(func):\n 'Function decorator, to make calls to `func()` spawn a separate thread if available.'\n @wraps(func)\n def _execAsync(*args, **kwargs):\n if args and isinstance(args[0], visidata.BaseSheet): #1136: allow cancel of async methods on Sheet\n if 'sheet' not in kwargs:\n kwargs['sheet'] = args[0]\n return visidata.vd.execAsync(func, *args, **kwargs)\n return _execAsync\n\n\nclass VisiData(visidata.Extensible):\n allPrefixes = ['g', 'z', ESC] # embig'g'en, 'z'mallify, ESC=Alt/Meta\n\n @classmethod\n def global_api(cls, func):\n 'Make global func() and identical vd.func()'\n def _vdfunc(*args, **kwargs):\n return func(visidata.vd, *args, **kwargs)\n visidata.vd.addGlobals({func.__name__: func})\n setattr(cls, func.__name__, func)\n return wraps(func)(_vdfunc)\n\n def __init__(self):\n self.sheets = [] # list of BaseSheet; all sheets on the sheet stack\n self.allSheets = [] # list of all non-precious sheets ever pushed\n self.lastErrors = []\n self.keystrokes = ''\n self.scrFull = mock.MagicMock(__bool__=mock.Mock(return_value=False)) # disable curses in batch mode\n self._cmdlog = None\n self.contexts = [self] # objects whose attributes are in the fallback context for eval/exec.\n\n def sheetstack(self, pane=0):\n 'Return list of sheets in given *pane*. pane=0 is the active pane. pane=-1 is the inactive pane.'\n if pane == -1:\n return list(vs for vs in self.sheets if vs.pane and (vs.pane != self.activePane))\n else:\n return list(vs for vs in self.sheets if vs.pane == (pane or self.activePane))\n\n @property\n def stackedSheets(self):\n return list(vs for vs in self.sheets if vs.pane)\n\n @property\n def activeSheet(self):\n 'Return top sheet on sheets stack, or cmdlog if no sheets.'\n for vs in self.sheets:\n if vs.pane and vs.pane == self.activePane:\n return vs\n\n for vs in self.sheets:\n if vs.pane and vs.pane != self.activePane:\n return vs\n\n return self._cmdlog\n\n @property\n def activeStack(self):\n return self.sheetstack() or self.sheetstack(-1)\n\n @visidata.drawcache_property\n def mousereg(self):\n return []\n\n def __copy__(self):\n 'Dummy method for Extensible.init()'\n pass\n\n def finalInit(self):\n 'Initialize members specified in other modules with init()'\n pass\n\n @classmethod\n def init(cls, membername, initfunc, **kwargs):\n 'Overload Extensible.init() to call finalInit instead of __init__'\n oldinit = cls.finalInit\n def newinit(self, *args, **kwargs):\n oldinit(self, *args, **kwargs)\n setattr(self, membername, initfunc())\n cls.finalInit = newinit\n super().init(membername, lambda: None, **kwargs)\n\n def clearCaches(self):\n 'Invalidate internal caches between command inputs.'\n visidata.Extensible.clear_all_caches()\n\n def getkeystroke(self, scr, vs=None):\n 'Get keystroke and display it on status bar.'\n k = None\n try:\n scr.refresh()\n k = scr.get_wch()\n vs = vs or self.activeSheet\n if vs:\n self.drawRightStatus(vs._scr, vs) # continue to display progress %\n except curses.error:\n return '' # curses timeout\n\n if isinstance(k, str):\n if ord(k) >= 32 and ord(k) != 127: # 127 == DEL or ^?\n return k\n k = ord(k)\n return curses.keyname(k).decode('utf-8')\n\n def onMouse(self, scr, y, x, h, w, **kwargs):\n self.mousereg.append((scr, y, x, h, w, kwargs))\n\n def getMouse(self, _scr, _x, _y, button):\n for scr, y, x, h, w, kwargs in self.mousereg[::-1]:\n if scr is _scr and x <= _x < x+w and y <= _y < y+h and button in kwargs:\n return kwargs[button]\n\n @property\n def screenHeight(self):\n return self.scrFull.getmaxyx()[0] if self.scrFull else 25\n\n @property\n def screenWidth(self):\n return self.scrFull.getmaxyx()[1] if self.scrFull else 80\n", "path": "visidata/vdobj.py"}]}
| 1,882 | 122 |
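A self-contained sketch of the decorator pattern the diff above extends: the one added line registers each decorated function in a globals table so it stays discoverable by name. `GLOBALS` is a plain dict standing in for `visidata.vd.addGlobals()`, and the tiny `VisiData` class here is an illustration, not the real one:

```python
from functools import wraps

GLOBALS = {}  # stand-in for visidata.vd.addGlobals()


class VisiData:
    @classmethod
    def global_api(cls, func):
        """Expose func both as vd.func() and as a registered global."""
        def _vdfunc(*args, **kwargs):
            return func(vd, *args, **kwargs)
        GLOBALS[func.__name__] = func  # the single line the golden diff adds
        setattr(cls, func.__name__, func)
        return wraps(func)(_vdfunc)


vd = VisiData()


@VisiData.global_api
def status(vd_obj, msg):
    return "status: " + msg


print(status("hello"))      # module-level form
print(vd.status("hello"))   # identical method form
print("status" in GLOBALS)  # True: registered for lookup by name
```

The `vd.func()` form created by `setattr` matches the `visidata.vd.view_pandas(df)` spelling the issue recommends for the website's example.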
gh_patches_debug_25651
|
rasdani/github-patches
|
git_diff
|
comic__grand-challenge.org-1136
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve annotation admin pages
The graders have asked me to delete all annotations for one user for a specific image. There is currently no easy way to do this in the admin pages. I will need to find a way to make this possible. Probably by adding certain search_fields or list_filters.
Also, I have noticed that some admin pages take extremely long to load. For example the change page for a polygon annotation set. I think this happens because it has to load a lot of data which it does not do efficiently and because it makes all inline model fields are editable by default. It will most likely never happen that we need to change this data manually so those fields can all be set to read_only. It might also be better to just have some statistics there about the number of inline models and not show the actual inline models at all.
Example loading time for an admin page:

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/grandchallenge/annotations/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from grandchallenge.annotations.models import (
4 BooleanClassificationAnnotation,
5 ETDRSGridAnnotation,
6 LandmarkAnnotationSet,
7 MeasurementAnnotation,
8 PolygonAnnotationSet,
9 SingleLandmarkAnnotation,
10 SinglePolygonAnnotation,
11 )
12
13
14 class BooleanClassificationAnnotationAdmin(admin.ModelAdmin):
15 search_fields = ("grader__username", "name", "created")
16 list_filter = ("created", "value", "name")
17
18
19 class SinglePolygonAnnotationInline(admin.StackedInline):
20 model = SinglePolygonAnnotation
21 extra = 0
22
23
24 class PolygonAnnotationSetAdmin(admin.ModelAdmin):
25 search_fields = ("grader__username", "created", "name")
26 list_filter = ("created", "grader__username", "name")
27 inlines = [SinglePolygonAnnotationInline]
28
29
30 class SingleLandmarkAnnotationInline(admin.StackedInline):
31 model = SingleLandmarkAnnotation
32 extra = 0
33
34
35 class LandmarkAnnotationSetAdmin(admin.ModelAdmin):
36 search_fields = ("grader__username", "created")
37 list_filter = ("created", "grader__username")
38 inlines = [SingleLandmarkAnnotationInline]
39
40
41 admin.site.register(ETDRSGridAnnotation)
42 admin.site.register(MeasurementAnnotation)
43 admin.site.register(
44 BooleanClassificationAnnotation, BooleanClassificationAnnotationAdmin
45 )
46 admin.site.register(PolygonAnnotationSet, PolygonAnnotationSetAdmin)
47 admin.site.register(SinglePolygonAnnotation)
48 admin.site.register(LandmarkAnnotationSet, LandmarkAnnotationSetAdmin)
49 admin.site.register(SingleLandmarkAnnotation)
50
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/app/grandchallenge/annotations/admin.py b/app/grandchallenge/annotations/admin.py
--- a/app/grandchallenge/annotations/admin.py
+++ b/app/grandchallenge/annotations/admin.py
@@ -19,23 +19,33 @@
class SinglePolygonAnnotationInline(admin.StackedInline):
model = SinglePolygonAnnotation
extra = 0
+ readonly_fields = (
+ "annotation_set",
+ "value",
+ "x_axis_orientation",
+ "y_axis_orientation",
+ "z",
+ )
class PolygonAnnotationSetAdmin(admin.ModelAdmin):
- search_fields = ("grader__username", "created", "name")
+ search_fields = ("grader__username", "created", "name", "image__name")
list_filter = ("created", "grader__username", "name")
inlines = [SinglePolygonAnnotationInline]
+ readonly_fields = ("grader", "image", "name", "created")
class SingleLandmarkAnnotationInline(admin.StackedInline):
model = SingleLandmarkAnnotation
extra = 0
+ readonly_fields = ("image", "landmarks")
class LandmarkAnnotationSetAdmin(admin.ModelAdmin):
search_fields = ("grader__username", "created")
list_filter = ("created", "grader__username")
inlines = [SingleLandmarkAnnotationInline]
+ readonly_fields = ("grader", "created")
admin.site.register(ETDRSGridAnnotation)
|
{"golden_diff": "diff --git a/app/grandchallenge/annotations/admin.py b/app/grandchallenge/annotations/admin.py\n--- a/app/grandchallenge/annotations/admin.py\n+++ b/app/grandchallenge/annotations/admin.py\n@@ -19,23 +19,33 @@\n class SinglePolygonAnnotationInline(admin.StackedInline):\n model = SinglePolygonAnnotation\n extra = 0\n+ readonly_fields = (\n+ \"annotation_set\",\n+ \"value\",\n+ \"x_axis_orientation\",\n+ \"y_axis_orientation\",\n+ \"z\",\n+ )\n \n \n class PolygonAnnotationSetAdmin(admin.ModelAdmin):\n- search_fields = (\"grader__username\", \"created\", \"name\")\n+ search_fields = (\"grader__username\", \"created\", \"name\", \"image__name\")\n list_filter = (\"created\", \"grader__username\", \"name\")\n inlines = [SinglePolygonAnnotationInline]\n+ readonly_fields = (\"grader\", \"image\", \"name\", \"created\")\n \n \n class SingleLandmarkAnnotationInline(admin.StackedInline):\n model = SingleLandmarkAnnotation\n extra = 0\n+ readonly_fields = (\"image\", \"landmarks\")\n \n \n class LandmarkAnnotationSetAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"created\")\n list_filter = (\"created\", \"grader__username\")\n inlines = [SingleLandmarkAnnotationInline]\n+ readonly_fields = (\"grader\", \"created\")\n \n \n admin.site.register(ETDRSGridAnnotation)\n", "issue": "Improve annotation admin pages\nThe graders have asked me to delete all annotations for one user for a specific image. There is currently no easy way to do this in the admin pages. I will need to find a way to make this possible. Probably by adding certain search_fields or list_filters.\r\n\r\nAlso, I have noticed that some admin pages take extremely long to load. For example the change page for a polygon annotation set. I think this happens because it has to load a lot of data which it does not do efficiently and because it makes all inline model fields are editable by default. It will most likely never happen that we need to change this data manually so those fields can all be set to read_only. It might also be better to just have some statistics there about the number of inline models and not show the actual inline models at all.\r\nExample loading time for an admin page:\r\n\r\n\n", "before_files": [{"content": "from django.contrib import admin\n\nfrom grandchallenge.annotations.models import (\n BooleanClassificationAnnotation,\n ETDRSGridAnnotation,\n LandmarkAnnotationSet,\n MeasurementAnnotation,\n PolygonAnnotationSet,\n SingleLandmarkAnnotation,\n SinglePolygonAnnotation,\n)\n\n\nclass BooleanClassificationAnnotationAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"name\", \"created\")\n list_filter = (\"created\", \"value\", \"name\")\n\n\nclass SinglePolygonAnnotationInline(admin.StackedInline):\n model = SinglePolygonAnnotation\n extra = 0\n\n\nclass PolygonAnnotationSetAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"created\", \"name\")\n list_filter = (\"created\", \"grader__username\", \"name\")\n inlines = [SinglePolygonAnnotationInline]\n\n\nclass SingleLandmarkAnnotationInline(admin.StackedInline):\n model = SingleLandmarkAnnotation\n extra = 0\n\n\nclass LandmarkAnnotationSetAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"created\")\n list_filter = (\"created\", \"grader__username\")\n inlines = [SingleLandmarkAnnotationInline]\n\n\nadmin.site.register(ETDRSGridAnnotation)\nadmin.site.register(MeasurementAnnotation)\nadmin.site.register(\n BooleanClassificationAnnotation, BooleanClassificationAnnotationAdmin\n)\nadmin.site.register(PolygonAnnotationSet, PolygonAnnotationSetAdmin)\nadmin.site.register(SinglePolygonAnnotation)\nadmin.site.register(LandmarkAnnotationSet, LandmarkAnnotationSetAdmin)\nadmin.site.register(SingleLandmarkAnnotation)\n", "path": "app/grandchallenge/annotations/admin.py"}], "after_files": [{"content": "from django.contrib import admin\n\nfrom grandchallenge.annotations.models import (\n BooleanClassificationAnnotation,\n ETDRSGridAnnotation,\n LandmarkAnnotationSet,\n MeasurementAnnotation,\n PolygonAnnotationSet,\n SingleLandmarkAnnotation,\n SinglePolygonAnnotation,\n)\n\n\nclass BooleanClassificationAnnotationAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"name\", \"created\")\n list_filter = (\"created\", \"value\", \"name\")\n\n\nclass SinglePolygonAnnotationInline(admin.StackedInline):\n model = SinglePolygonAnnotation\n extra = 0\n readonly_fields = (\n \"annotation_set\",\n \"value\",\n \"x_axis_orientation\",\n \"y_axis_orientation\",\n \"z\",\n )\n\n\nclass PolygonAnnotationSetAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"created\", \"name\", \"image__name\")\n list_filter = (\"created\", \"grader__username\", \"name\")\n inlines = [SinglePolygonAnnotationInline]\n readonly_fields = (\"grader\", \"image\", \"name\", \"created\")\n\n\nclass SingleLandmarkAnnotationInline(admin.StackedInline):\n model = SingleLandmarkAnnotation\n extra = 0\n readonly_fields = (\"image\", \"landmarks\")\n\n\nclass LandmarkAnnotationSetAdmin(admin.ModelAdmin):\n search_fields = (\"grader__username\", \"created\")\n list_filter = (\"created\", \"grader__username\")\n inlines = [SingleLandmarkAnnotationInline]\n readonly_fields = (\"grader\", \"created\")\n\n\nadmin.site.register(ETDRSGridAnnotation)\nadmin.site.register(MeasurementAnnotation)\nadmin.site.register(\n BooleanClassificationAnnotation, BooleanClassificationAnnotationAdmin\n)\nadmin.site.register(PolygonAnnotationSet, PolygonAnnotationSetAdmin)\nadmin.site.register(SinglePolygonAnnotation)\nadmin.site.register(LandmarkAnnotationSet, LandmarkAnnotationSetAdmin)\nadmin.site.register(SingleLandmarkAnnotation)\n", "path": "app/grandchallenge/annotations/admin.py"}]}
| 918 | 320 |
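A condensed sketch of the admin pattern the diff above applies, written against a hypothetical `myapp` (the import path and model names are assumptions, and this only runs inside a configured Django project):

```python
from django.contrib import admin

from myapp.models import AnnotationSet, SingleAnnotation  # hypothetical models


class SingleAnnotationInline(admin.StackedInline):
    model = SingleAnnotation
    extra = 0
    readonly_fields = ("annotation_set", "value")  # never hand-edited


@admin.register(AnnotationSet)
class AnnotationSetAdmin(admin.ModelAdmin):
    # image__name makes "all annotations by user X on image Y" findable.
    search_fields = ("grader__username", "image__name")
    list_filter = ("created", "grader__username")
    readonly_fields = ("grader", "image", "created")  # cheap change page
    inlines = [SingleAnnotationInline]
```

Read-only fields render as plain text instead of form widgets for every inline row, which the issue suspects is a large part of the multi-second change-page loads shown in its screenshot.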
gh_patches_debug_14177
|
rasdani/github-patches
|
git_diff
|
spacetelescope__jwql-280
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update install_requires in setup.py
After some research on StackOverflow and discussion with @SaOgaz and @laurenmarietta on the subject of `setup.py` dependencies and `requirements.txt`/`environment.yml` dependencies (#261, #94), we should probably update the list of dependencies in `install_requires` in `setup.py` to better reflect what is actually needed to install and run our application.
In short: `setup.py` should _list the range of dependencies that allow a user to run any code in our repository_, while `environment.yml` should _list a set of specific dependencies that allow the application to operate on a specific architecture_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import numpy as np
2 from setuptools import setup
3 from setuptools import find_packages
4
5 VERSION = '0.17.0'
6
7 AUTHORS = 'Matthew Bourque, Sara Ogaz, Joe Filippazzo, Bryan Hilbert, Misty Cracraft, '
8 AUTHORS += 'Graham Kanarek, Johannes Sahlmann, Lauren Chambers, Catherine Martlin'
9
10 REQUIRES = ['astroquery', 'bokeh==1.0.1', 'django==2.1.2', 'matplotlib', 'numpy',
11 'python-dateutil', 'sphinx', 'sphinx-automodapi', 'sqlalchemy']
12
13 setup(
14 name='jwql',
15 version=VERSION,
16 description='The JWST Quicklook Project',
17 url='https://github.com/spacetelescope/jwql.git',
18 author=AUTHORS,
19 author_email='[email protected]',
20 license='BSD',
21 keywords=['astronomy', 'python'],
22 classifiers=['Programming Language :: Python'],
23 packages=find_packages(),
24 install_requires=REQUIRES,
25 include_package_data=True,
26 include_dirs=[np.get_include()],
27 )
28
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -7,8 +7,22 @@
AUTHORS = 'Matthew Bourque, Sara Ogaz, Joe Filippazzo, Bryan Hilbert, Misty Cracraft, '
AUTHORS += 'Graham Kanarek, Johannes Sahlmann, Lauren Chambers, Catherine Martlin'
-REQUIRES = ['astroquery', 'bokeh==1.0.1', 'django==2.1.2', 'matplotlib', 'numpy',
- 'python-dateutil', 'sphinx', 'sphinx-automodapi', 'sqlalchemy']
+REQUIRES = ['astropy',
+ 'astroquery>=0.3.9',
+ 'authlib',
+ 'bokeh>=1.0',
+ 'django>=2.0',
+ 'jinja2',
+ 'jwst',
+ 'matplotlib',
+ 'numpy',
+ 'numpydoc',
+ 'pandas',
+ 'psycopg2',
+ 'pytest',
+ 'sphinx',
+ 'sqlalchemy',
+ 'stsci_rtd_theme']
setup(
name='jwql',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -7,8 +7,22 @@\n AUTHORS = 'Matthew Bourque, Sara Ogaz, Joe Filippazzo, Bryan Hilbert, Misty Cracraft, '\n AUTHORS += 'Graham Kanarek, Johannes Sahlmann, Lauren Chambers, Catherine Martlin'\n \n-REQUIRES = ['astroquery', 'bokeh==1.0.1', 'django==2.1.2', 'matplotlib', 'numpy',\n- 'python-dateutil', 'sphinx', 'sphinx-automodapi', 'sqlalchemy']\n+REQUIRES = ['astropy',\n+ 'astroquery>=0.3.9',\n+ 'authlib',\n+ 'bokeh>=1.0',\n+ 'django>=2.0',\n+ 'jinja2',\n+ 'jwst',\n+ 'matplotlib',\n+ 'numpy',\n+ 'numpydoc',\n+ 'pandas',\n+ 'psycopg2',\n+ 'pytest',\n+ 'sphinx',\n+ 'sqlalchemy',\n+ 'stsci_rtd_theme']\n \n setup(\n name='jwql',\n", "issue": "Update install_requires in setup.py\nAfter some research on StackOverflow and discussion with @SaOgaz and @laurenmarietta on the subject of `setup.py` dependencies and `requirements.txt`/`environment.yml` dependencies (#261, #94), we should probably update the list of dependencies in `install_requires` in `setup.py` to better reflect what is actually needed to install and run our application.\r\n\r\nIn short: `setup.py` should _list the range of dependencies that allow a user to run any code in our repository_, while `environment.yml` should _list a set of specific dependencies that allow the application to operate on a specific architecture_ \n", "before_files": [{"content": "import numpy as np\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nVERSION = '0.17.0'\n\nAUTHORS = 'Matthew Bourque, Sara Ogaz, Joe Filippazzo, Bryan Hilbert, Misty Cracraft, '\nAUTHORS += 'Graham Kanarek, Johannes Sahlmann, Lauren Chambers, Catherine Martlin'\n\nREQUIRES = ['astroquery', 'bokeh==1.0.1', 'django==2.1.2', 'matplotlib', 'numpy',\n 'python-dateutil', 'sphinx', 'sphinx-automodapi', 'sqlalchemy']\n\nsetup(\n name='jwql',\n version=VERSION,\n description='The JWST Quicklook Project',\n url='https://github.com/spacetelescope/jwql.git',\n author=AUTHORS,\n author_email='[email protected]',\n license='BSD',\n keywords=['astronomy', 'python'],\n classifiers=['Programming Language :: Python'],\n packages=find_packages(),\n install_requires=REQUIRES,\n include_package_data=True,\n include_dirs=[np.get_include()],\n)\n", "path": "setup.py"}], "after_files": [{"content": "import numpy as np\nfrom setuptools import setup\nfrom setuptools import find_packages\n\nVERSION = '0.17.0'\n\nAUTHORS = 'Matthew Bourque, Sara Ogaz, Joe Filippazzo, Bryan Hilbert, Misty Cracraft, '\nAUTHORS += 'Graham Kanarek, Johannes Sahlmann, Lauren Chambers, Catherine Martlin'\n\nREQUIRES = ['astropy',\n 'astroquery>=0.3.9',\n 'authlib',\n 'bokeh>=1.0',\n 'django>=2.0',\n 'jinja2',\n 'jwst',\n 'matplotlib',\n 'numpy',\n 'numpydoc',\n 'pandas',\n 'psycopg2',\n 'pytest',\n 'sphinx',\n 'sqlalchemy',\n 'stsci_rtd_theme']\n\nsetup(\n name='jwql',\n version=VERSION,\n description='The JWST Quicklook Project',\n url='https://github.com/spacetelescope/jwql.git',\n author=AUTHORS,\n author_email='[email protected]',\n license='BSD',\n keywords=['astronomy', 'python'],\n classifiers=['Programming Language :: Python'],\n packages=find_packages(),\n install_requires=REQUIRES,\n include_package_data=True,\n include_dirs=[np.get_include()],\n)\n", "path": "setup.py"}]}
| 682 | 260 |
gh_patches_debug_38504
|
rasdani/github-patches
|
git_diff
|
wright-group__WrightTools-789
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
from_Cary handle duplicate named data
Currently it will fail to read in data files that contain duplicate column headers
--- END ISSUE ---
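For context, the failure mode is a name collision: `from_Cary` derives one data object per scan from the header row, so two scans sharing a header map to the same `create_data` name. A minimal sketch of that collision (the header layout here is hypothetical, not taken from a real Cary export):
```python
# Hypothetical Cary header row: scan names at even positions, blanks between.
header = ["sample", "", "sample", ""]

# from_Cary names one data object per scan from header[i], i = 0, 2, ...
names = [header[i] for i in range(0, len(header) - 1, 2)]
assert len(names) != len(set(names))  # duplicate names -> the reported failure
```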
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `WrightTools/collection/_cary.py`
Content:
```
1 """Cary."""
2
3
4 # --- import --------------------------------------------------------------------------------------
5
6
7 import os
8 import re
9
10 import numpy as np
11
12 from .. import exceptions as wt_exceptions
13 from ._collection import Collection
14
15
16 # --- define --------------------------------------------------------------------------------------
17
18
19 __all__ = ["from_Cary"]
20
21
22 # --- from function -------------------------------------------------------------------------------
23
24
25 def from_Cary(filepath, name=None, parent=None, verbose=True):
26 """Create a collection object from a Cary UV VIS absorbance file.
27
28 We hope to support as many Cary instruments and datasets as possible.
29 This function has been tested with data collected on a Cary50 UV/VIS spectrometer.
30 If any alternate instruments are found not to work as expected, please
31 submit a bug report on our `issue tracker`__.
32
33 __ github.com/wright-group/WrightTools/issues
34
35 .. plot::
36
37 >>> import WrightTools as wt
38 >>> from WrightTools import datasets
39 >>> p = datasets.Cary.CuPCtS_H2O_vis
40 >>> data = wt.collection.from_Cary(p)[0]
41 >>> wt.artists.quick1D(data)
42
43 Parameters
44 ----------
45 filepath : string
46 Path to Cary output file (.csv).
47 parent : WrightTools.Collection
48 A collection object in which to place a collection of Data objects.
49 verbose : boolean (optional)
50 Toggle talkback. Default is True.
51
52 Returns
53 -------
54 data
55 New data object.
56 """
57 # check filepath
58 filesuffix = os.path.basename(filepath).split(".")[-1]
59 if filesuffix != "csv":
60 wt_exceptions.WrongFileTypeWarning.warn(filepath, "csv")
61 if name is None:
62 name = "cary"
63 # import array
64 lines = []
65 with open(filepath, "r", encoding="iso-8859-1") as f:
66 header = f.readline()
67 columns = f.readline()
68 while True:
69 line = f.readline()
70 if line == "\n" or line == "":
71 break
72 else:
73 # Note, it is necessary to call this twice, as a single call will
74 # result in something like ',,,,' > ',nan,,nan,'.
75 line = line.replace(",,", ",nan,")
76 line = line.replace(",,", ",nan,")
77 # Ensure that the first column has nan, if necessary
78 if line[0] == ",":
79 line = "nan" + line
80 clean = line[:-2] # lines end with ',/n'
81 lines.append(np.fromstring(clean, sep=","))
82 lines = [line for line in lines if len(line) > 0]
83 header = header.split(",")
84 columns = columns.split(",")
85 arr = np.array(lines).T
86 # chew through all scans
87 datas = Collection(name=name, parent=parent, edit_local=parent is not None)
88 units_dict = {"°c": "deg_C", "°f": "deg_F"}
89 for i in range(0, len(header) - 1, 2):
90 r = re.compile(r"[ \t\(\)]+")
91 spl = r.split(columns[i])
92 ax = spl[0].lower() if len(spl) > 0 else None
93 units = spl[1].lower() if len(spl) > 1 else None
94 units = units_dict.get(units, units)
95 dat = datas.create_data(header[i], kind="Cary", source=filepath)
96 dat.create_variable(ax, arr[i][~np.isnan(arr[i])], units=units)
97 dat.create_channel(
98 columns[i + 1].lower(), arr[i + 1][~np.isnan(arr[i + 1])], label=columns[i + 1].lower()
99 )
100 dat.transform(ax)
101 # finish
102 if verbose:
103 print("{0} data objects successfully created from Cary file:".format(len(datas)))
104 for i, data in enumerate(datas):
105 print(" {0}: {1}".format(i, data))
106 return datas
107
```
Path: `examples/filters.py`
Content:
```
1 #! /usr/bin/env python3
2 # -*- coding: utf-8 -*-
3 """
4 Optical Filters
5 ===============
6
7 A set of optical filters transmission spectra.
8 """
9
10 import WrightTools as wt
11 from WrightTools import datasets
12 from matplotlib import pyplot as plt
13
14 p = datasets.Cary.filters
15 col = wt.collection.from_Cary(p)
16
17 fig, gs = wt.artists.create_figure(width="double", default_aspect=.5)
18 ax = plt.subplot(gs[0])
19
20 for data in col.values():
21 if data.natural_name in ("600LP", "550LP2"):
22 continue
23 data.convert("wn", verbose=False)
24 ax.plot(data, label=data.natural_name)
25
26
27 ax.set_ylabel("%T")
28 ax.set_xlabel("Frequency (cm$^{-1}$)")
29 ax.legend()
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/WrightTools/collection/_cary.py b/WrightTools/collection/_cary.py
--- a/WrightTools/collection/_cary.py
+++ b/WrightTools/collection/_cary.py
@@ -4,7 +4,7 @@
# --- import --------------------------------------------------------------------------------------
-import os
+import pathlib
import re
import numpy as np
@@ -55,14 +55,14 @@
New data object.
"""
# check filepath
- filesuffix = os.path.basename(filepath).split(".")[-1]
- if filesuffix != "csv":
+ filesuffix = pathlib.Path(filepath).suffix
+ if filesuffix != ".csv":
wt_exceptions.WrongFileTypeWarning.warn(filepath, "csv")
if name is None:
name = "cary"
# import array
lines = []
- with open(filepath, "r", encoding="iso-8859-1") as f:
+ with open(str(filepath), "r", encoding="iso-8859-1") as f:
header = f.readline()
columns = f.readline()
while True:
@@ -83,6 +83,7 @@
header = header.split(",")
columns = columns.split(",")
arr = np.array(lines).T
+ duplicate = len(header) // 2 == len(set(header) - {""})
# chew through all scans
datas = Collection(name=name, parent=parent, edit_local=parent is not None)
units_dict = {"°c": "deg_C", "°f": "deg_F"}
@@ -92,7 +93,11 @@
ax = spl[0].lower() if len(spl) > 0 else None
units = spl[1].lower() if len(spl) > 1 else None
units = units_dict.get(units, units)
- dat = datas.create_data(header[i], kind="Cary", source=filepath)
+ if duplicate:
+ name = "{}_{:03d}".format(header[i], i // 2)
+ else:
+ name = header[i]
+ dat = datas.create_data(name, kind="Cary", source=filepath)
dat.create_variable(ax, arr[i][~np.isnan(arr[i])], units=units)
dat.create_channel(
columns[i + 1].lower(), arr[i + 1][~np.isnan(arr[i + 1])], label=columns[i + 1].lower()
diff --git a/examples/filters.py b/examples/filters.py
--- a/examples/filters.py
+++ b/examples/filters.py
@@ -1,10 +1,14 @@
#! /usr/bin/env python3
# -*- coding: utf-8 -*-
"""
-Optical Filters
-===============
+Plotting Multiple Lines
+=======================
-A set of optical filters transmission spectra.
+A quick demonstration of how to plot multiple lines on the same
+set of axes, using :meth:`create_figure` to have a set of axes
+which can plot data objects directly.
+
+The dataset is a set of optical filters transmission spectra.
"""
import WrightTools as wt
|
{"golden_diff": "diff --git a/WrightTools/collection/_cary.py b/WrightTools/collection/_cary.py\n--- a/WrightTools/collection/_cary.py\n+++ b/WrightTools/collection/_cary.py\n@@ -4,7 +4,7 @@\n # --- import --------------------------------------------------------------------------------------\n \n \n-import os\n+import pathlib\n import re\n \n import numpy as np\n@@ -55,14 +55,14 @@\n New data object.\n \"\"\"\n # check filepath\n- filesuffix = os.path.basename(filepath).split(\".\")[-1]\n- if filesuffix != \"csv\":\n+ filesuffix = pathlib.Path(filepath).suffix\n+ if filesuffix != \".csv\":\n wt_exceptions.WrongFileTypeWarning.warn(filepath, \"csv\")\n if name is None:\n name = \"cary\"\n # import array\n lines = []\n- with open(filepath, \"r\", encoding=\"iso-8859-1\") as f:\n+ with open(str(filepath), \"r\", encoding=\"iso-8859-1\") as f:\n header = f.readline()\n columns = f.readline()\n while True:\n@@ -83,6 +83,7 @@\n header = header.split(\",\")\n columns = columns.split(\",\")\n arr = np.array(lines).T\n+ duplicate = len(header) // 2 == len(set(header) - {\"\"})\n # chew through all scans\n datas = Collection(name=name, parent=parent, edit_local=parent is not None)\n units_dict = {\"\u00b0c\": \"deg_C\", \"\u00b0f\": \"deg_F\"}\n@@ -92,7 +93,11 @@\n ax = spl[0].lower() if len(spl) > 0 else None\n units = spl[1].lower() if len(spl) > 1 else None\n units = units_dict.get(units, units)\n- dat = datas.create_data(header[i], kind=\"Cary\", source=filepath)\n+ if duplicate:\n+ name = \"{}_{:03d}\".format(header[i], i // 2)\n+ else:\n+ name = header[i]\n+ dat = datas.create_data(name, kind=\"Cary\", source=filepath)\n dat.create_variable(ax, arr[i][~np.isnan(arr[i])], units=units)\n dat.create_channel(\n columns[i + 1].lower(), arr[i + 1][~np.isnan(arr[i + 1])], label=columns[i + 1].lower()\ndiff --git a/examples/filters.py b/examples/filters.py\n--- a/examples/filters.py\n+++ b/examples/filters.py\n@@ -1,10 +1,14 @@\n #! /usr/bin/env python3\n # -*- coding: utf-8 -*-\n \"\"\"\n-Optical Filters\n-===============\n+Plotting Multiple Lines\n+=======================\n \n-A set of optical filters transmission spectra.\n+A quick demonstration of how to plot multiple lines on the same\n+set of axes, using :meth:`create_figure` to have a set of axes\n+which can plot data objects directly.\n+\n+The dataset is a set of optical filters transmission spectra.\n \"\"\"\n \n import WrightTools as wt\n", "issue": "from_Cary handle duplicate named data\nCurrently it will fail to read in data that has the same column header\n", "before_files": [{"content": "\"\"\"Cary.\"\"\"\n\n\n# --- import --------------------------------------------------------------------------------------\n\n\nimport os\nimport re\n\nimport numpy as np\n\nfrom .. 
import exceptions as wt_exceptions\nfrom ._collection import Collection\n\n\n# --- define --------------------------------------------------------------------------------------\n\n\n__all__ = [\"from_Cary\"]\n\n\n# --- from function -------------------------------------------------------------------------------\n\n\ndef from_Cary(filepath, name=None, parent=None, verbose=True):\n \"\"\"Create a collection object from a Cary UV VIS absorbance file.\n\n We hope to support as many Cary instruments and datasets as possible.\n This function has been tested with data collected on a Cary50 UV/VIS spectrometer.\n If any alternate instruments are found not to work as expected, please\n submit a bug report on our `issue tracker`__.\n\n __ github.com/wright-group/WrightTools/issues\n\n .. plot::\n\n >>> import WrightTools as wt\n >>> from WrightTools import datasets\n >>> p = datasets.Cary.CuPCtS_H2O_vis\n >>> data = wt.collection.from_Cary(p)[0]\n >>> wt.artists.quick1D(data)\n\n Parameters\n ----------\n filepath : string\n Path to Cary output file (.csv).\n parent : WrightTools.Collection\n A collection object in which to place a collection of Data objects.\n verbose : boolean (optional)\n Toggle talkback. Default is True.\n\n Returns\n -------\n data\n New data object.\n \"\"\"\n # check filepath\n filesuffix = os.path.basename(filepath).split(\".\")[-1]\n if filesuffix != \"csv\":\n wt_exceptions.WrongFileTypeWarning.warn(filepath, \"csv\")\n if name is None:\n name = \"cary\"\n # import array\n lines = []\n with open(filepath, \"r\", encoding=\"iso-8859-1\") as f:\n header = f.readline()\n columns = f.readline()\n while True:\n line = f.readline()\n if line == \"\\n\" or line == \"\":\n break\n else:\n # Note, it is necessary to call this twice, as a single call will\n # result in something like ',,,,' > ',nan,,nan,'.\n line = line.replace(\",,\", \",nan,\")\n line = line.replace(\",,\", \",nan,\")\n # Ensure that the first column has nan, if necessary\n if line[0] == \",\":\n line = \"nan\" + line\n clean = line[:-2] # lines end with ',/n'\n lines.append(np.fromstring(clean, sep=\",\"))\n lines = [line for line in lines if len(line) > 0]\n header = header.split(\",\")\n columns = columns.split(\",\")\n arr = np.array(lines).T\n # chew through all scans\n datas = Collection(name=name, parent=parent, edit_local=parent is not None)\n units_dict = {\"\u00b0c\": \"deg_C\", \"\u00b0f\": \"deg_F\"}\n for i in range(0, len(header) - 1, 2):\n r = re.compile(r\"[ \\t\\(\\)]+\")\n spl = r.split(columns[i])\n ax = spl[0].lower() if len(spl) > 0 else None\n units = spl[1].lower() if len(spl) > 1 else None\n units = units_dict.get(units, units)\n dat = datas.create_data(header[i], kind=\"Cary\", source=filepath)\n dat.create_variable(ax, arr[i][~np.isnan(arr[i])], units=units)\n dat.create_channel(\n columns[i + 1].lower(), arr[i + 1][~np.isnan(arr[i + 1])], label=columns[i + 1].lower()\n )\n dat.transform(ax)\n # finish\n if verbose:\n print(\"{0} data objects successfully created from Cary file:\".format(len(datas)))\n for i, data in enumerate(datas):\n print(\" {0}: {1}\".format(i, data))\n return datas\n", "path": "WrightTools/collection/_cary.py"}, {"content": "#! 
/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\"\"\"\nOptical Filters\n===============\n\nA set of optical filters transmission spectra.\n\"\"\"\n\nimport WrightTools as wt\nfrom WrightTools import datasets\nfrom matplotlib import pyplot as plt\n\np = datasets.Cary.filters\ncol = wt.collection.from_Cary(p)\n\nfig, gs = wt.artists.create_figure(width=\"double\", default_aspect=.5)\nax = plt.subplot(gs[0])\n\nfor data in col.values():\n if data.natural_name in (\"600LP\", \"550LP2\"):\n continue\n data.convert(\"wn\", verbose=False)\n ax.plot(data, label=data.natural_name)\n\n\nax.set_ylabel(\"%T\")\nax.set_xlabel(\"Frequency (cm$^{-1}$)\")\nax.legend()\n", "path": "examples/filters.py"}], "after_files": [{"content": "\"\"\"Cary.\"\"\"\n\n\n# --- import --------------------------------------------------------------------------------------\n\n\nimport pathlib\nimport re\n\nimport numpy as np\n\nfrom .. import exceptions as wt_exceptions\nfrom ._collection import Collection\n\n\n# --- define --------------------------------------------------------------------------------------\n\n\n__all__ = [\"from_Cary\"]\n\n\n# --- from function -------------------------------------------------------------------------------\n\n\ndef from_Cary(filepath, name=None, parent=None, verbose=True):\n \"\"\"Create a collection object from a Cary UV VIS absorbance file.\n\n We hope to support as many Cary instruments and datasets as possible.\n This function has been tested with data collected on a Cary50 UV/VIS spectrometer.\n If any alternate instruments are found not to work as expected, please\n submit a bug report on our `issue tracker`__.\n\n __ github.com/wright-group/WrightTools/issues\n\n .. plot::\n\n >>> import WrightTools as wt\n >>> from WrightTools import datasets\n >>> p = datasets.Cary.CuPCtS_H2O_vis\n >>> data = wt.collection.from_Cary(p)[0]\n >>> wt.artists.quick1D(data)\n\n Parameters\n ----------\n filepath : string\n Path to Cary output file (.csv).\n parent : WrightTools.Collection\n A collection object in which to place a collection of Data objects.\n verbose : boolean (optional)\n Toggle talkback. 
Default is True.\n\n Returns\n -------\n data\n New data object.\n \"\"\"\n # check filepath\n filesuffix = pathlib.Path(filepath).suffix\n if filesuffix != \".csv\":\n wt_exceptions.WrongFileTypeWarning.warn(filepath, \"csv\")\n if name is None:\n name = \"cary\"\n # import array\n lines = []\n with open(str(filepath), \"r\", encoding=\"iso-8859-1\") as f:\n header = f.readline()\n columns = f.readline()\n while True:\n line = f.readline()\n if line == \"\\n\" or line == \"\":\n break\n else:\n # Note, it is necessary to call this twice, as a single call will\n # result in something like ',,,,' > ',nan,,nan,'.\n line = line.replace(\",,\", \",nan,\")\n line = line.replace(\",,\", \",nan,\")\n # Ensure that the first column has nan, if necessary\n if line[0] == \",\":\n line = \"nan\" + line\n clean = line[:-2] # lines end with ',/n'\n lines.append(np.fromstring(clean, sep=\",\"))\n lines = [line for line in lines if len(line) > 0]\n header = header.split(\",\")\n columns = columns.split(\",\")\n arr = np.array(lines).T\n duplicate = len(header) // 2 == len(set(header) - {\"\"})\n # chew through all scans\n datas = Collection(name=name, parent=parent, edit_local=parent is not None)\n units_dict = {\"\u00b0c\": \"deg_C\", \"\u00b0f\": \"deg_F\"}\n for i in range(0, len(header) - 1, 2):\n r = re.compile(r\"[ \\t\\(\\)]+\")\n spl = r.split(columns[i])\n ax = spl[0].lower() if len(spl) > 0 else None\n units = spl[1].lower() if len(spl) > 1 else None\n units = units_dict.get(units, units)\n if duplicate:\n name = \"{}_{:03d}\".format(header[i], i // 2)\n else:\n name = header[i]\n dat = datas.create_data(name, kind=\"Cary\", source=filepath)\n dat.create_variable(ax, arr[i][~np.isnan(arr[i])], units=units)\n dat.create_channel(\n columns[i + 1].lower(), arr[i + 1][~np.isnan(arr[i + 1])], label=columns[i + 1].lower()\n )\n dat.transform(ax)\n # finish\n if verbose:\n print(\"{0} data objects successfully created from Cary file:\".format(len(datas)))\n for i, data in enumerate(datas):\n print(\" {0}: {1}\".format(i, data))\n return datas\n", "path": "WrightTools/collection/_cary.py"}, {"content": "#! /usr/bin/env python3\n# -*- coding: utf-8 -*-\n\"\"\"\nPlotting Multiple Lines\n=======================\n\nA quick demonstration of how to plot multiple lines on the same\nset of axes, using :meth:`create_figure` to have a set of axes\nwhich can plot data objects directly.\n\nThe dataset is a set of optical filters transmission spectra.\n\"\"\"\n\nimport WrightTools as wt\nfrom WrightTools import datasets\nfrom matplotlib import pyplot as plt\n\np = datasets.Cary.filters\ncol = wt.collection.from_Cary(p)\n\nfig, gs = wt.artists.create_figure(width=\"double\", default_aspect=.5)\nax = plt.subplot(gs[0])\n\nfor data in col.values():\n if data.natural_name in (\"600LP\", \"550LP2\"):\n continue\n data.convert(\"wn\", verbose=False)\n ax.plot(data, label=data.natural_name)\n\n\nax.set_ylabel(\"%T\")\nax.set_xlabel(\"Frequency (cm$^{-1}$)\")\nax.legend()\n", "path": "examples/filters.py"}]}
| 1,595 | 685 |
gh_patches_debug_35632
|
rasdani/github-patches
|
git_diff
|
chainer__chainer-1343
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support basic indexing in Variable class
Currently `Variable` partially supports `__getitem__`. I need full support of basic indexing.
- [x] int index #1343
- [x] slices #1154
- [x] new axis #1257
- [x] ellipsis #1343 
see also: http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html#basic-slicing-and-indexing
--- END ISSUE ---
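Since the issue defers to NumPy's basic-indexing rules, a quick ndarray refresher covering the four requested index kinds (array values are arbitrary):
```python
import numpy as np

a = np.arange(24).reshape(2, 3, 4)
a[1]             # int index -> shape (3, 4)
a[:, 1:3]        # slice     -> shape (2, 2, 4)
a[None]          # new axis  -> shape (1, 2, 3, 4)
a[..., 0]        # Ellipsis  -> shape (2, 3)
```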
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainer/functions/array/get_item.py`
Content:
```
1 import collections
2
3 from chainer import cuda
4 from chainer import function
5 from chainer.utils import type_check
6 from chainer import variable
7
8
9 class GetItem(function.Function):
10
11 """Function that slices array and extract elements."""
12
13 def __init__(self, slices):
14 if not isinstance(slices, collections.Iterable):
15 slices = tuple([slices])
16 self.slices = slices
17
18 def check_type_forward(self, in_types):
19 type_check.expect(in_types.size() == 1)
20 valid_slice = len(self.slices) - self.slices.count(None)
21 type_check.expect(in_types[0].ndim == valid_slice)
22
23 def forward(self, xs):
24 ary = xs[0]
25 return ary[tuple(self.slices)],
26
27 def backward(self, xs, gys):
28 xp = cuda.get_array_module(*xs)
29 gy = gys[0]
30 gx = xp.zeros_like(xs[0])
31 gx[tuple(self.slices)] = gy
32 return gx,
33
34
35 def get_item(x, slices):
36 """Extract elements from array with specified shape, axes and offsets.
37
38 Args:
39 x (tuple of Variables): Variable to be sliced.
40 slices (slice or tuple of slices): Slice objects to slice variable.
41
42 Returns:
43 Variable: :class:`~chainer.Variable` object
44 which contains sliced array of ``x``.
45
46 """
47 return GetItem(slices)(x)
48
49
50 def install_variable_get_item():
51 variable.Variable.__getitem__ = get_item
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/chainer/functions/array/get_item.py b/chainer/functions/array/get_item.py
--- a/chainer/functions/array/get_item.py
+++ b/chainer/functions/array/get_item.py
@@ -1,7 +1,11 @@
import collections
+import numpy
+
+import chainer
from chainer import cuda
from chainer import function
+from chainer import utils
from chainer.utils import type_check
from chainer import variable
@@ -13,16 +17,29 @@
def __init__(self, slices):
if not isinstance(slices, collections.Iterable):
slices = tuple([slices])
+
+ if chainer.is_debug():
+ n_ellipses = 0
+ for s in slices:
+ if numpy.isscalar(s) or s is None or isinstance(s, slice):
+ pass
+ elif s is Ellipsis:
+ n_ellipses += 1
+ else:
+ raise ValueError('Only basic indexing is supported')
+ if n_ellipses > 1:
+ raise ValueError('Only one Ellipsis is allowed')
+
self.slices = slices
def check_type_forward(self, in_types):
type_check.expect(in_types.size() == 1)
valid_slice = len(self.slices) - self.slices.count(None)
- type_check.expect(in_types[0].ndim == valid_slice)
+ type_check.expect(in_types[0].ndim >= valid_slice)
def forward(self, xs):
ary = xs[0]
- return ary[tuple(self.slices)],
+ return utils.force_array(ary[tuple(self.slices)]),
def backward(self, xs, gys):
xp = cuda.get_array_module(*xs)
@@ -37,12 +54,19 @@
Args:
x (tuple of Variables): Variable to be sliced.
- slices (slice or tuple of slices): Slice objects to slice variable.
+ slices (int, slice, None or Ellipsis or tuple of them): Basic slicing
+ to slice a variable. It supports ``int``, ``slice``, ``newaxis``
+ (equivalent to ``None``) and ``Ellipsis``.
Returns:
Variable: :class:`~chainer.Variable` object
which contains sliced array of ``x``.
+ .. note::
+
+ See NumPy document for details of `indexing
+ <http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html>`_.
+
"""
return GetItem(slices)(x)
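Assuming the patch above, `Variable.__getitem__` should accept the same basic-index forms; a minimal usage sketch (shapes are illustrative, and the import paths match this era of the chainer codebase):
```python
import numpy as np
import chainer

v = chainer.Variable(np.arange(24).reshape(2, 3, 4).astype(np.float32))
y = v[0]             # int index: ndim may now shrink below len(slices)
z = v[..., 1]        # a single Ellipsis is accepted
w = v[None, :, 1:3]  # newaxis and slices combine as in NumPy
```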
|
{"golden_diff": "diff --git a/chainer/functions/array/get_item.py b/chainer/functions/array/get_item.py\n--- a/chainer/functions/array/get_item.py\n+++ b/chainer/functions/array/get_item.py\n@@ -1,7 +1,11 @@\n import collections\n \n+import numpy\n+\n+import chainer\n from chainer import cuda\n from chainer import function\n+from chainer import utils\n from chainer.utils import type_check\n from chainer import variable\n \n@@ -13,16 +17,29 @@\n def __init__(self, slices):\n if not isinstance(slices, collections.Iterable):\n slices = tuple([slices])\n+\n+ if chainer.is_debug():\n+ n_ellipses = 0\n+ for s in slices:\n+ if numpy.isscalar(s) or s is None or isinstance(s, slice):\n+ pass\n+ elif s is Ellipsis:\n+ n_ellipses += 1\n+ else:\n+ raise ValueError('Only basic indexing is supported')\n+ if n_ellipses > 1:\n+ raise ValueError('Only one Ellipsis is allowed')\n+\n self.slices = slices\n \n def check_type_forward(self, in_types):\n type_check.expect(in_types.size() == 1)\n valid_slice = len(self.slices) - self.slices.count(None)\n- type_check.expect(in_types[0].ndim == valid_slice)\n+ type_check.expect(in_types[0].ndim >= valid_slice)\n \n def forward(self, xs):\n ary = xs[0]\n- return ary[tuple(self.slices)],\n+ return utils.force_array(ary[tuple(self.slices)]),\n \n def backward(self, xs, gys):\n xp = cuda.get_array_module(*xs)\n@@ -37,12 +54,19 @@\n \n Args:\n x (tuple of Variables): Variable to be sliced.\n- slices (slice or tuple of slices): Slice objects to slice variable.\n+ slices (int, slice, None or Ellipsis or tuple of them): Basic slicing\n+ to slice a variable. It supports ``int``, ``slice``, ``newaxis``\n+ (equivalent to ``None``) and ``Ellipsis``.\n \n Returns:\n Variable: :class:`~chainer.Variable` object\n which contains sliced array of ``x``.\n \n+ .. note::\n+\n+ See NumPy document for details of `indexing\n+ <http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html>`_.\n+\n \"\"\"\n return GetItem(slices)(x)\n", "issue": "Support basic indexing in Variable class\nCurrently `Variable` partially supports `__getitem__`. 
I need full support of basic indexing.\n- [x] int index #1343 \n- [x] slices #1154\n- [x] new axis #1257 \n- [x] elipsis #1343 \n\nsee also: http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html#basic-slicing-and-indexing\n\n", "before_files": [{"content": "import collections\n\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer.utils import type_check\nfrom chainer import variable\n\n\nclass GetItem(function.Function):\n\n \"\"\"Function that slices array and extract elements.\"\"\"\n\n def __init__(self, slices):\n if not isinstance(slices, collections.Iterable):\n slices = tuple([slices])\n self.slices = slices\n\n def check_type_forward(self, in_types):\n type_check.expect(in_types.size() == 1)\n valid_slice = len(self.slices) - self.slices.count(None)\n type_check.expect(in_types[0].ndim == valid_slice)\n\n def forward(self, xs):\n ary = xs[0]\n return ary[tuple(self.slices)],\n\n def backward(self, xs, gys):\n xp = cuda.get_array_module(*xs)\n gy = gys[0]\n gx = xp.zeros_like(xs[0])\n gx[tuple(self.slices)] = gy\n return gx,\n\n\ndef get_item(x, slices):\n \"\"\"Extract elements from array with specified shape, axes and offsets.\n\n Args:\n x (tuple of Variables): Variable to be sliced.\n slices (slice or tuple of slices): Slice objects to slice variable.\n\n Returns:\n Variable: :class:`~chainer.Variable` object\n which contains sliced array of ``x``.\n\n \"\"\"\n return GetItem(slices)(x)\n\n\ndef install_variable_get_item():\n variable.Variable.__getitem__ = get_item\n", "path": "chainer/functions/array/get_item.py"}], "after_files": [{"content": "import collections\n\nimport numpy\n\nimport chainer\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer import utils\nfrom chainer.utils import type_check\nfrom chainer import variable\n\n\nclass GetItem(function.Function):\n\n \"\"\"Function that slices array and extract elements.\"\"\"\n\n def __init__(self, slices):\n if not isinstance(slices, collections.Iterable):\n slices = tuple([slices])\n\n if chainer.is_debug():\n n_ellipses = 0\n for s in slices:\n if numpy.isscalar(s) or s is None or isinstance(s, slice):\n pass\n elif s is Ellipsis:\n n_ellipses += 1\n else:\n raise ValueError('Only basic indexing is supported')\n if n_ellipses > 1:\n raise ValueError('Only one Ellipsis is allowed')\n\n self.slices = slices\n\n def check_type_forward(self, in_types):\n type_check.expect(in_types.size() == 1)\n valid_slice = len(self.slices) - self.slices.count(None)\n type_check.expect(in_types[0].ndim >= valid_slice)\n\n def forward(self, xs):\n ary = xs[0]\n return utils.force_array(ary[tuple(self.slices)]),\n\n def backward(self, xs, gys):\n xp = cuda.get_array_module(*xs)\n gy = gys[0]\n gx = xp.zeros_like(xs[0])\n gx[tuple(self.slices)] = gy\n return gx,\n\n\ndef get_item(x, slices):\n \"\"\"Extract elements from array with specified shape, axes and offsets.\n\n Args:\n x (tuple of Variables): Variable to be sliced.\n slices (int, slice, None or Ellipsis or tuple of them): Basic slicing\n to slice a variable. It supports ``int``, ``slice``, ``newaxis``\n (equivalent to ``None``) and ``Ellipsis``.\n\n Returns:\n Variable: :class:`~chainer.Variable` object\n which contains sliced array of ``x``.\n\n .. note::\n\n See NumPy document for details of `indexing\n <http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html>`_.\n\n \"\"\"\n return GetItem(slices)(x)\n\n\ndef install_variable_get_item():\n variable.Variable.__getitem__ = get_item\n", "path": "chainer/functions/array/get_item.py"}]}
| 782 | 566 |
gh_patches_debug_33281
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-277
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
GCP: GoogleCloudSqlDatabasePublicallyAccessible raises AttributeError
# Description
A SQL database instance declared like this crashes the check for publically-accessible instances:
```hcl
resource "google_sql_database_instance" "sql_instance" {
name = "${var.gcp-project}-db-dev"
region = "${var.region}"
settings {
tier = "${var.db_machine_type}"
ip_configuration {
ipv4_enabled = true
authorized_networks {
name = "${var.gcp-project}-sql-network"
value = google_compute_address.ip_address-dev.address
}
}
}
}
```
## Crash Log
```
ERROR:checkov.terraform.checks.resource.gcp.GoogleCloudSqlDatabasePublicallyAccessible:Failed to run check: Ensure that Cloud SQL database Instances are not open to the world for configuration: {'name': ['${var.gcp-project}-db-dev'], 'region': ['us-central1'], 'settings': [{'tier': ['${var.db_machine_type}'], 'ip_configuration': [{'ipv4_enabled': [True], 'authorized_networks': [{'name': ['${var.gcp-project}-sql-network'], 'value': ['${google_compute_address.ip_address-dev.address}']}]}]}]} at file: /bcs/modules/aws/prod/cloud-sql.tf
```
<details>
```
Traceback (most recent call last):
File "/usr/local/bin/checkov", line 5, in <module>
run()
File "/usr/local/lib/python3.8/site-packages/checkov/main.py", line 80, in run
scan_reports = runner_registry.run(root_folder=root_folder, external_checks_dir=args.external_checks_dir,
File "/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py", line 23, in run
scan_report = runner.run(root_folder, external_checks_dir=external_checks_dir, files=files,
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py", line 48, in run
self.check_tf_definition(report, root_folder,runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py", line 88, in check_tf_definition
self.run_block(definition[block_type], definitions_context, full_file_path, report, scanned_file,
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py", line 110, in run_block
results = registry.scan(scanned_file, entity, skipped_checks,runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 64, in scan
result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 70, in run_check
result = check.run(scanned_file=scanned_file, entity_configuration=entity_configuration,
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py", line 44, in run
raise e
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py", line 33, in run
check_result['result'] = self.scan_entity_conf(entity_configuration)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 20, in scan_entity_conf
return self.scan_resource_conf(conf)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py", line 24, in scan_resource_conf
if 'value' in authorized_network.keys():
AttributeError: 'str' object has no attribute 'keys'
```
</details>
## System Information
- macOS 10.15.4
- Python 3.8.2
- Checkov 1.0.308
--- END ISSUE ---
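The traceback bottoms out in a plain `str` where the check expects a dict; the failure reproduces in two lines (the network value stands in for whatever the HCL parser leaves unresolved):
```python
authorized_networks = ["${google_compute_address.ip_address-dev.address}"]
for authorized_network in authorized_networks:
    authorized_network.keys()  # AttributeError: 'str' object has no attribute 'keys'
```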
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py`
Content:
```
1 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
2 from checkov.common.models.enums import CheckResult, CheckCategories
3
4 class GoogleCloudSqlDatabasePublicallyAccessible(BaseResourceCheck):
5 def __init__(self):
6 name = "Ensure that Cloud SQL database Instances are not open to the world"
7 id = "CKV_GCP_11"
8 supported_resources = ['google_sql_database_instance']
9 categories = [CheckCategories.NETWORKING]
10 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
11
12 def scan_resource_conf(self, conf):
13 """
14 Looks for google_sql_database_instance which is open to the world:
15 :param conf: google_sql_database_instance configuration
16 :return: <CheckResult>
17 """
18 authorized_networks_count = 0
19 authorized_networks_passed = 0
20 if 'settings' in conf and 'ip_configuration' in conf['settings'][0]:
21 if 'authorized_networks' in conf['settings'][0]['ip_configuration'][0].keys():
22 authorized_networks_count = len(conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0])
23 for authorized_network in conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0]:
24 if 'value' in authorized_network.keys():
25 if "/0" not in authorized_network['value']:
26 authorized_networks_passed += 1
27
28 if authorized_networks_passed == authorized_networks_count:
29 return CheckResult.PASSED
30 else:
31 return CheckResult.FAILED
32
33 check = GoogleCloudSqlDatabasePublicallyAccessible()
34
```
Path: `checkov/kubernetes/parser/parser.py`
Content:
```
1 import logging
2 from yaml import YAMLError
3
4 from checkov.kubernetes.parser import k8_yaml, k8_json
5
6 try:
7 from json.decoder import JSONDecodeError
8 except ImportError:
9 JSONDecodeError = ValueError
10
11 logger = logging.getLogger(__name__)
12
13
14 def parse(filename):
15 template = None
16 template_lines = None
17 try:
18 if filename.endswith(".yaml") or filename.endswith(".yml"):
19 (template, template_lines) = k8_yaml.load(filename)
20 if filename.endswith(".json"):
21 (template, template_lines) = k8_json.load(filename)
22 if template:
23 if isinstance(template,list):
24 for i in range(len(template)):
25 if isinstance(template[i],dict):
26 if (not 'apiVersion' in template[i].keys()) and (not 'kind' in template[i].keys()):
27 return
28 else:
29 return
30 else:
31 return
32 else:
33 return
34 except IOError as e:
35 if e.errno == 2:
36 logger.error('Template file not found: %s', filename)
37 return
38 elif e.errno == 21:
39 logger.error('Template references a directory, not a file: %s',
40 filename)
41 return
42 elif e.errno == 13:
43 logger.error('Permission denied when accessing template file: %s',
44 filename)
45 return
46 except UnicodeDecodeError as err:
47 logger.error('Cannot read file contents: %s', filename)
48 return
49 except YAMLError as err:
50 return
51
52 return template, template_lines
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/kubernetes/parser/parser.py b/checkov/kubernetes/parser/parser.py
--- a/checkov/kubernetes/parser/parser.py
+++ b/checkov/kubernetes/parser/parser.py
@@ -20,10 +20,10 @@
if filename.endswith(".json"):
(template, template_lines) = k8_json.load(filename)
if template:
- if isinstance(template,list):
+ if isinstance(template, list):
for i in range(len(template)):
- if isinstance(template[i],dict):
- if (not 'apiVersion' in template[i].keys()) and (not 'kind' in template[i].keys()):
+ if isinstance(template[i], dict):
+ if not ('apiVersion' in template[i].keys() and 'kind' in template[i].keys()):
return
else:
return
diff --git a/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py b/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py
--- a/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py
+++ b/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py
@@ -21,6 +21,8 @@
if 'authorized_networks' in conf['settings'][0]['ip_configuration'][0].keys():
authorized_networks_count = len(conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0])
for authorized_network in conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0]:
+ if isinstance(authorized_network, str):
+ return CheckResult.UNKNOWN
if 'value' in authorized_network.keys():
if "/0" not in authorized_network['value']:
authorized_networks_passed += 1
@@ -30,4 +32,5 @@
else:
return CheckResult.FAILED
+
check = GoogleCloudSqlDatabasePublicallyAccessible()
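Note that the kubernetes parser hunk is more than a style cleanup: by De Morgan's laws the two conditions differ, which a quick truth check makes visible:
```python
docs = [
    {"apiVersion": "v1", "kind": "Pod"},  # both keys present
    {"apiVersion": "v1"},                 # kind missing
    {},                                   # both missing
]
for d in docs:
    old = (not 'apiVersion' in d) and (not 'kind' in d)  # rejects only the last
    new = not ('apiVersion' in d and 'kind' in d)        # rejects the last two
    print(old, new)  # (False, False), (False, True), (True, True)
```
The old form skipped a document only when both keys were absent; the new form requires both to be present, which is presumably the intended filter for Kubernetes manifests.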
|
{"golden_diff": "diff --git a/checkov/kubernetes/parser/parser.py b/checkov/kubernetes/parser/parser.py\n--- a/checkov/kubernetes/parser/parser.py\n+++ b/checkov/kubernetes/parser/parser.py\n@@ -20,10 +20,10 @@\n if filename.endswith(\".json\"):\n (template, template_lines) = k8_json.load(filename)\n if template:\n- if isinstance(template,list):\n+ if isinstance(template, list):\n for i in range(len(template)):\n- if isinstance(template[i],dict):\n- if (not 'apiVersion' in template[i].keys()) and (not 'kind' in template[i].keys()):\n+ if isinstance(template[i], dict):\n+ if not ('apiVersion' in template[i].keys() and 'kind' in template[i].keys()):\n return\n else:\n return\ndiff --git a/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py b/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py\n--- a/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py\n+++ b/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py\n@@ -21,6 +21,8 @@\n if 'authorized_networks' in conf['settings'][0]['ip_configuration'][0].keys():\n authorized_networks_count = len(conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0])\n for authorized_network in conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0]:\n+ if isinstance(authorized_network, str):\n+ return CheckResult.UNKNOWN\n if 'value' in authorized_network.keys():\n if \"/0\" not in authorized_network['value']:\n authorized_networks_passed += 1\n@@ -30,4 +32,5 @@\n else: \n return CheckResult.FAILED\n \n+\n check = GoogleCloudSqlDatabasePublicallyAccessible()\n", "issue": "GCP: GoogleCloudSqlDatabasePublicallyAccessible raises AttributeError\n# Description\r\n\r\nA SQL database instance declared like this crashes the check for publically-accessible instances:\r\n\r\n```hcl\r\nresource \"google_sql_database_instance\" \"sql_instance\" {\r\n name = \"${var.gcp-project}-db-dev\"\r\n region = \"${var.region}\"\r\n settings {\r\n tier = \"${var.db_machine_type}\"\r\n ip_configuration {\r\n ipv4_enabled = true\r\n authorized_networks {\r\n name = \"${var.gcp-project}-sql-network\"\r\n value = google_compute_address.ip_address-dev.address\r\n }\r\n }\r\n }\r\n}\r\n```\r\n\r\n## Crash Log\r\n\r\n```\r\nERROR:checkov.terraform.checks.resource.gcp.GoogleCloudSqlDatabasePublicallyAccessible:Failed to run check: Ensure that Cloud SQL database Instances are not open to the world for configuration: {'name': ['${var.gcp-project}-db-dev'], 'region': ['us-central1'], 'settings': [{'tier': ['${var.db_machine_type}'], 'ip_configuration': [{'ipv4_enabled': [True], 'authorized_networks': [{'name': ['${var.gcp-project}-sql-network'], 'value': ['${google_compute_address.ip_address-dev.address}']}]}]}]} at file: /bcs/modules/aws/prod/cloud-sql.tf\r\n```\r\n\r\n<details>\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/local/bin/checkov\", line 5, in <module>\r\n run()\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/main.py\", line 80, in run\r\n scan_reports = runner_registry.run(root_folder=root_folder, external_checks_dir=args.external_checks_dir,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py\", line 23, in run\r\n scan_report = runner.run(root_folder, external_checks_dir=external_checks_dir, files=files,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py\", line 48, in run\r\n self.check_tf_definition(report, 
root_folder,runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py\", line 88, in check_tf_definition\r\n self.run_block(definition[block_type], definitions_context, full_file_path, report, scanned_file,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/runner.py\", line 110, in run_block\r\n results = registry.scan(scanned_file, entity, skipped_checks,runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 64, in scan\r\n result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 70, in run_check\r\n result = check.run(scanned_file=scanned_file, entity_configuration=entity_configuration,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py\", line 44, in run\r\n raise e\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py\", line 33, in run\r\n check_result['result'] = self.scan_entity_conf(entity_configuration)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 20, in scan_entity_conf\r\n return self.scan_resource_conf(conf)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py\", line 24, in scan_resource_conf\r\n if 'value' in authorized_network.keys():\r\nAttributeError: 'str' object has no attribute 'keys'\r\n```\r\n\r\n</details>\r\n\r\n## System Information\r\n\r\n - macOS 10.15.4\r\n- Python 3.8.2\r\n - 1.0.308\r\n\n", "before_files": [{"content": "from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom checkov.common.models.enums import CheckResult, CheckCategories\n\nclass GoogleCloudSqlDatabasePublicallyAccessible(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure that Cloud SQL database Instances are not open to the world\"\n id = \"CKV_GCP_11\"\n supported_resources = ['google_sql_database_instance']\n categories = [CheckCategories.NETWORKING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n \"\"\"\n Looks for google_sql_database_instance which is open to the world:\n :param conf: google_sql_database_instance configuration\n :return: <CheckResult>\n \"\"\"\n authorized_networks_count = 0\n authorized_networks_passed = 0\n if 'settings' in conf and 'ip_configuration' in conf['settings'][0]:\n if 'authorized_networks' in conf['settings'][0]['ip_configuration'][0].keys():\n authorized_networks_count = len(conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0])\n for authorized_network in conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0]:\n if 'value' in authorized_network.keys():\n if \"/0\" not in authorized_network['value']:\n authorized_networks_passed += 1\n\n if authorized_networks_passed == authorized_networks_count:\n return CheckResult.PASSED\n else: \n return CheckResult.FAILED\n\ncheck = GoogleCloudSqlDatabasePublicallyAccessible()\n", "path": "checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py"}, {"content": "import logging\nfrom yaml import YAMLError\n\nfrom checkov.kubernetes.parser import k8_yaml, k8_json\n\ntry:\n from json.decoder import JSONDecodeError\nexcept ImportError:\n JSONDecodeError = 
ValueError\n\nlogger = logging.getLogger(__name__)\n\n\ndef parse(filename):\n template = None\n template_lines = None\n try:\n if filename.endswith(\".yaml\") or filename.endswith(\".yml\"):\n (template, template_lines) = k8_yaml.load(filename)\n if filename.endswith(\".json\"):\n (template, template_lines) = k8_json.load(filename)\n if template:\n if isinstance(template,list):\n for i in range(len(template)):\n if isinstance(template[i],dict):\n if (not 'apiVersion' in template[i].keys()) and (not 'kind' in template[i].keys()):\n return\n else:\n return\n else:\n return\n else:\n return\n except IOError as e:\n if e.errno == 2:\n logger.error('Template file not found: %s', filename)\n return\n elif e.errno == 21:\n logger.error('Template references a directory, not a file: %s',\n filename)\n return\n elif e.errno == 13:\n logger.error('Permission denied when accessing template file: %s',\n filename)\n return\n except UnicodeDecodeError as err:\n logger.error('Cannot read file contents: %s', filename)\n return\n except YAMLError as err:\n return\n\n return template, template_lines\n", "path": "checkov/kubernetes/parser/parser.py"}], "after_files": [{"content": "from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom checkov.common.models.enums import CheckResult, CheckCategories\n\nclass GoogleCloudSqlDatabasePublicallyAccessible(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure that Cloud SQL database Instances are not open to the world\"\n id = \"CKV_GCP_11\"\n supported_resources = ['google_sql_database_instance']\n categories = [CheckCategories.NETWORKING]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n \"\"\"\n Looks for google_sql_database_instance which is open to the world:\n :param conf: google_sql_database_instance configuration\n :return: <CheckResult>\n \"\"\"\n authorized_networks_count = 0\n authorized_networks_passed = 0\n if 'settings' in conf and 'ip_configuration' in conf['settings'][0]:\n if 'authorized_networks' in conf['settings'][0]['ip_configuration'][0].keys():\n authorized_networks_count = len(conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0])\n for authorized_network in conf['settings'][0]['ip_configuration'][0]['authorized_networks'][0]:\n if isinstance(authorized_network, str):\n return CheckResult.UNKNOWN\n if 'value' in authorized_network.keys():\n if \"/0\" not in authorized_network['value']:\n authorized_networks_passed += 1\n\n if authorized_networks_passed == authorized_networks_count:\n return CheckResult.PASSED\n else: \n return CheckResult.FAILED\n\n\ncheck = GoogleCloudSqlDatabasePublicallyAccessible()\n", "path": "checkov/terraform/checks/resource/gcp/GoogleCloudSqlDatabasePublicallyAccessible.py"}, {"content": "import logging\nfrom yaml import YAMLError\n\nfrom checkov.kubernetes.parser import k8_yaml, k8_json\n\ntry:\n from json.decoder import JSONDecodeError\nexcept ImportError:\n JSONDecodeError = ValueError\n\nlogger = logging.getLogger(__name__)\n\n\ndef parse(filename):\n template = None\n template_lines = None\n try:\n if filename.endswith(\".yaml\") or filename.endswith(\".yml\"):\n (template, template_lines) = k8_yaml.load(filename)\n if filename.endswith(\".json\"):\n (template, template_lines) = k8_json.load(filename)\n if template:\n if isinstance(template, list):\n for i in range(len(template)):\n if isinstance(template[i], dict):\n if not ('apiVersion' in template[i].keys() and 
'kind' in template[i].keys()):\n return\n else:\n return\n else:\n return\n else:\n return\n except IOError as e:\n if e.errno == 2:\n logger.error('Template file not found: %s', filename)\n return\n elif e.errno == 21:\n logger.error('Template references a directory, not a file: %s',\n filename)\n return\n elif e.errno == 13:\n logger.error('Permission denied when accessing template file: %s',\n filename)\n return\n except UnicodeDecodeError as err:\n logger.error('Cannot read file contents: %s', filename)\n return\n except YAMLError as err:\n return\n\n return template, template_lines\n", "path": "checkov/kubernetes/parser/parser.py"}]}
| 1,992 | 422 |
gh_patches_debug_30158
|
rasdani/github-patches
|
git_diff
|
saleor__saleor-1480
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Custom 404 page
Right now the demo store displays a generic 404 page:
https://demo.getsaleor.com/i-do-not-exist/
We could implement something more interesting, with a nice design.
What should be included:
- [ ] Drawing
- [ ] Link to the homepage
- [ ] Search bar (?)
- [ ] ...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/urls.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from django.conf import settings
4 from django.conf.urls import url, include
5 from django.conf.urls.static import static
6 from django.contrib.sitemaps.views import sitemap
7 from django.contrib.staticfiles.views import serve
8 from django.views.i18n import JavaScriptCatalog
9 from graphene_django.views import GraphQLView
10
11 from .cart.urls import urlpatterns as cart_urls
12 from .checkout.urls import urlpatterns as checkout_urls
13 from .core.sitemaps import sitemaps
14 from .core.urls import urlpatterns as core_urls
15 from .dashboard.urls import urlpatterns as dashboard_urls
16 from .data_feeds.urls import urlpatterns as feed_urls
17 from .order.urls import urlpatterns as order_urls
18 from .product.urls import urlpatterns as product_urls
19 from .registration.urls import urlpatterns as registration_urls
20 from .search.urls import urlpatterns as search_urls
21 from .userprofile.urls import urlpatterns as userprofile_urls
22
23 urlpatterns = [
24 url(r'^', include(core_urls)),
25 url(r'^account/', include(registration_urls)),
26 url(r'^cart/', include((cart_urls, 'cart'), namespace='cart')),
27 url(r'^checkout/',
28 include((checkout_urls, 'checkout'), namespace='checkout')),
29 url(r'^dashboard/',
30 include((dashboard_urls, 'dashboard'), namespace='dashboard')),
31 url(r'^graphql', GraphQLView.as_view(graphiql=settings.DEBUG)),
32 url(r'^impersonate/', include('impersonate.urls')),
33 url(r'^jsi18n/$', JavaScriptCatalog.as_view(), name='javascript-catalog'),
34 url(r'^order/', include((order_urls, 'order'), namespace='order')),
35 url(r'^products/',
36 include((product_urls, 'product'), namespace='product')),
37 url(r'^profile/',
38 include((userprofile_urls, 'profile'), namespace='profile')),
39 url(r'^feeds/',
40 include((feed_urls, 'data_feeds'), namespace='data_feeds')),
41 url(r'^search/', include((search_urls, 'search'), namespace='search')),
42 url(r'^sitemap\.xml$', sitemap, {'sitemaps': sitemaps},
43 name='django.contrib.sitemaps.views.sitemap'),
44 url(r'', include('payments.urls')),
45 url('', include('social_django.urls', namespace='social')),
46 ]
47
48 if settings.DEBUG:
49 # static files (images, css, javascript, etc.)
50 urlpatterns += [
51 url(r'^static/(?P<path>.*)$', serve)
52 ] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
53
```
Path: `saleor/core/urls.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from django.conf.urls import url
4
5 from . import views
6
7
8 urlpatterns = [
9 url(r'^$', views.home, name='home'),
10 url(r'^style-guide/', views.styleguide, name='styleguide'),
11 url(r'^impersonate/(?P<uid>\d+)/', views.impersonate,
12 name='impersonate-start'),
13 ]
14
```
Path: `saleor/core/views.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from django.template.response import TemplateResponse
4 from django.contrib import messages
5 from django.conf import settings
6 from django.utils.translation import pgettext_lazy
7 from impersonate.views import impersonate as orig_impersonate
8
9 from ..dashboard.views import staff_member_required
10 from ..product.utils import products_with_availability, products_for_homepage
11 from ..userprofile.models import User
12
13
14 def home(request):
15 products = products_for_homepage()[:8]
16 products = products_with_availability(
17 products, discounts=request.discounts, local_currency=request.currency)
18 return TemplateResponse(
19 request, 'home.html',
20 {'products': products, 'parent': None})
21
22
23 @staff_member_required
24 def styleguide(request):
25 return TemplateResponse(request, 'styleguide.html')
26
27
28 def impersonate(request, uid):
29 response = orig_impersonate(request, uid)
30 if request.session.modified:
31 msg = pgettext_lazy(
32 'Impersonation message',
33 'You are now logged as {}'.format(User.objects.get(pk=uid)))
34 messages.success(request, msg)
35 return response
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/saleor/core/urls.py b/saleor/core/urls.py
--- a/saleor/core/urls.py
+++ b/saleor/core/urls.py
@@ -8,6 +8,7 @@
urlpatterns = [
url(r'^$', views.home, name='home'),
url(r'^style-guide/', views.styleguide, name='styleguide'),
+ url(r'^404', views.handle_404, name='404'),
url(r'^impersonate/(?P<uid>\d+)/', views.impersonate,
name='impersonate-start'),
]
diff --git a/saleor/core/views.py b/saleor/core/views.py
--- a/saleor/core/views.py
+++ b/saleor/core/views.py
@@ -5,6 +5,7 @@
from django.conf import settings
from django.utils.translation import pgettext_lazy
from impersonate.views import impersonate as orig_impersonate
+from random import randint
from ..dashboard.views import staff_member_required
from ..product.utils import products_with_availability, products_for_homepage
@@ -33,3 +34,8 @@
'You are now logged as {}'.format(User.objects.get(pk=uid)))
messages.success(request, msg)
return response
+
+
+def handle_404(request):
+ ctx = {'variant': randint(0, 2)}
+ return TemplateResponse(request, '404.html', ctx, status=404)
diff --git a/saleor/urls.py b/saleor/urls.py
--- a/saleor/urls.py
+++ b/saleor/urls.py
@@ -20,6 +20,9 @@
from .search.urls import urlpatterns as search_urls
from .userprofile.urls import urlpatterns as userprofile_urls
+
+handler404 = 'saleor.core.views.handle_404'
+
urlpatterns = [
url(r'^', include(core_urls)),
url(r'^account/', include(registration_urls)),
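Assuming the patch above is applied, the behavior can be sanity-checked with Django's test client (a sketch; settings bootstrapping is omitted, and `DEBUG` must be off for `handler404` to fire):
```python
from django.test import Client

client = Client()
response = client.get('/i-do-not-exist/')
assert response.status_code == 404
assert any(t.name == '404.html' for t in response.templates)
assert response.context['variant'] in (0, 1, 2)  # one of three drawings
```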
|
{"golden_diff": "diff --git a/saleor/core/urls.py b/saleor/core/urls.py\n--- a/saleor/core/urls.py\n+++ b/saleor/core/urls.py\n@@ -8,6 +8,7 @@\n urlpatterns = [\n url(r'^$', views.home, name='home'),\n url(r'^style-guide/', views.styleguide, name='styleguide'),\n+ url(r'^404', views.handle_404, name='404'),\n url(r'^impersonate/(?P<uid>\\d+)/', views.impersonate,\n name='impersonate-start'),\n ]\ndiff --git a/saleor/core/views.py b/saleor/core/views.py\n--- a/saleor/core/views.py\n+++ b/saleor/core/views.py\n@@ -5,6 +5,7 @@\n from django.conf import settings\n from django.utils.translation import pgettext_lazy\n from impersonate.views import impersonate as orig_impersonate\n+from random import randint\n \n from ..dashboard.views import staff_member_required\n from ..product.utils import products_with_availability, products_for_homepage\n@@ -33,3 +34,8 @@\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n+\n+\n+def handle_404(request):\n+ ctx = {'variant': randint(0, 2)}\n+ return TemplateResponse(request, '404.html', ctx, status=404)\ndiff --git a/saleor/urls.py b/saleor/urls.py\n--- a/saleor/urls.py\n+++ b/saleor/urls.py\n@@ -20,6 +20,9 @@\n from .search.urls import urlpatterns as search_urls\n from .userprofile.urls import urlpatterns as userprofile_urls\n \n+\n+handler404 = 'saleor.core.views.handle_404'\n+\n urlpatterns = [\n url(r'^', include(core_urls)),\n url(r'^account/', include(registration_urls)),\n", "issue": "Custom 404 page\nRight now in demo store we display generic 404 page\r\nhttps://demo.getsaleor.com/i-do-not-exist/\r\n\r\nCould implement something more interesting, with a nice design.\r\n\r\nWhat should be included:\r\n- [ ] Drawing\r\n- [ ] Link to the homepage\r\n- [ ] Search bar (?)\r\n- [ ] ...\n", "before_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.conf import settings\nfrom django.conf.urls import url, include\nfrom django.conf.urls.static import static\nfrom django.contrib.sitemaps.views import sitemap\nfrom django.contrib.staticfiles.views import serve\nfrom django.views.i18n import JavaScriptCatalog\nfrom graphene_django.views import GraphQLView\n\nfrom .cart.urls import urlpatterns as cart_urls\nfrom .checkout.urls import urlpatterns as checkout_urls\nfrom .core.sitemaps import sitemaps\nfrom .core.urls import urlpatterns as core_urls\nfrom .dashboard.urls import urlpatterns as dashboard_urls\nfrom .data_feeds.urls import urlpatterns as feed_urls\nfrom .order.urls import urlpatterns as order_urls\nfrom .product.urls import urlpatterns as product_urls\nfrom .registration.urls import urlpatterns as registration_urls\nfrom .search.urls import urlpatterns as search_urls\nfrom .userprofile.urls import urlpatterns as userprofile_urls\n\nurlpatterns = [\n url(r'^', include(core_urls)),\n url(r'^account/', include(registration_urls)),\n url(r'^cart/', include((cart_urls, 'cart'), namespace='cart')),\n url(r'^checkout/',\n include((checkout_urls, 'checkout'), namespace='checkout')),\n url(r'^dashboard/',\n include((dashboard_urls, 'dashboard'), namespace='dashboard')),\n url(r'^graphql', GraphQLView.as_view(graphiql=settings.DEBUG)),\n url(r'^impersonate/', include('impersonate.urls')),\n url(r'^jsi18n/$', JavaScriptCatalog.as_view(), name='javascript-catalog'),\n url(r'^order/', include((order_urls, 'order'), namespace='order')),\n url(r'^products/',\n include((product_urls, 'product'), namespace='product')),\n url(r'^profile/',\n include((userprofile_urls, 'profile'), 
namespace='profile')),\n url(r'^feeds/',\n include((feed_urls, 'data_feeds'), namespace='data_feeds')),\n url(r'^search/', include((search_urls, 'search'), namespace='search')),\n url(r'^sitemap\\.xml$', sitemap, {'sitemaps': sitemaps},\n name='django.contrib.sitemaps.views.sitemap'),\n url(r'', include('payments.urls')),\n url('', include('social_django.urls', namespace='social')),\n]\n\nif settings.DEBUG:\n # static files (images, css, javascript, etc.)\n urlpatterns += [\n url(r'^static/(?P<path>.*)$', serve)\n ] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)\n", "path": "saleor/urls.py"}, {"content": "from __future__ import unicode_literals\n\nfrom django.conf.urls import url\n\nfrom . import views\n\n\nurlpatterns = [\n url(r'^$', views.home, name='home'),\n url(r'^style-guide/', views.styleguide, name='styleguide'),\n url(r'^impersonate/(?P<uid>\\d+)/', views.impersonate,\n name='impersonate-start'),\n]\n", "path": "saleor/core/urls.py"}, {"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.conf import settings\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n", "path": "saleor/core/views.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.conf import settings\nfrom django.conf.urls import url, include\nfrom django.conf.urls.static import static\nfrom django.contrib.sitemaps.views import sitemap\nfrom django.contrib.staticfiles.views import serve\nfrom django.views.i18n import JavaScriptCatalog\nfrom graphene_django.views import GraphQLView\n\nfrom .cart.urls import urlpatterns as cart_urls\nfrom .checkout.urls import urlpatterns as checkout_urls\nfrom .core.sitemaps import sitemaps\nfrom .core.urls import urlpatterns as core_urls\nfrom .dashboard.urls import urlpatterns as dashboard_urls\nfrom .data_feeds.urls import urlpatterns as feed_urls\nfrom .order.urls import urlpatterns as order_urls\nfrom .product.urls import urlpatterns as product_urls\nfrom .registration.urls import urlpatterns as registration_urls\nfrom .search.urls import urlpatterns as search_urls\nfrom .userprofile.urls import urlpatterns as userprofile_urls\n\n\nhandler404 = 'saleor.core.views.handle_404'\n\nurlpatterns = [\n url(r'^', include(core_urls)),\n url(r'^account/', include(registration_urls)),\n url(r'^cart/', include((cart_urls, 'cart'), namespace='cart')),\n url(r'^checkout/',\n include((checkout_urls, 'checkout'), namespace='checkout')),\n url(r'^dashboard/',\n include((dashboard_urls, 'dashboard'), namespace='dashboard')),\n url(r'^graphql', 
GraphQLView.as_view(graphiql=settings.DEBUG)),\n url(r'^impersonate/', include('impersonate.urls')),\n url(r'^jsi18n/$', JavaScriptCatalog.as_view(), name='javascript-catalog'),\n url(r'^order/', include((order_urls, 'order'), namespace='order')),\n url(r'^products/',\n include((product_urls, 'product'), namespace='product')),\n url(r'^profile/',\n include((userprofile_urls, 'profile'), namespace='profile')),\n url(r'^feeds/',\n include((feed_urls, 'data_feeds'), namespace='data_feeds')),\n url(r'^search/', include((search_urls, 'search'), namespace='search')),\n url(r'^sitemap\\.xml$', sitemap, {'sitemaps': sitemaps},\n name='django.contrib.sitemaps.views.sitemap'),\n url(r'', include('payments.urls')),\n url('', include('social_django.urls', namespace='social')),\n]\n\nif settings.DEBUG:\n # static files (images, css, javascript, etc.)\n urlpatterns += [\n url(r'^static/(?P<path>.*)$', serve)\n ] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)\n", "path": "saleor/urls.py"}, {"content": "from __future__ import unicode_literals\n\nfrom django.conf.urls import url\n\nfrom . import views\n\n\nurlpatterns = [\n url(r'^$', views.home, name='home'),\n url(r'^style-guide/', views.styleguide, name='styleguide'),\n url(r'^404', views.handle_404, name='404'),\n url(r'^impersonate/(?P<uid>\\d+)/', views.impersonate,\n name='impersonate-start'),\n]\n", "path": "saleor/core/urls.py"}, {"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.conf import settings\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\nfrom random import randint\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n\n\ndef handle_404(request):\n ctx = {'variant': randint(0, 2)}\n return TemplateResponse(request, '404.html', ctx, status=404)\n", "path": "saleor/core/views.py"}]}
| 1,386 | 432 |
gh_patches_debug_26413
|
rasdani/github-patches
|
git_diff
|
svthalia__concrexit-2105
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add support for multiple local partners
### Is your feature request related to a problem? Please describe.
We really want to be able to have multiple local partners at Thalia, and the website currently does not support this.
### Describe the solution you'd like
We would like the ability to add multiple local partners.
### Motivation
We think it's useful to have multiple local partners so we can show our students what is available in Nijmegen.
### Describe alternatives you've considered
Only having one local partner, which does not make sense in the current partner climate.
### Additional context
The local partner is also shown in the newsletter.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/newsletters/views.py`
Content:
```
1 """Views provided by the newsletters package."""
2 import os
3
4 from django.conf import settings
5 from django.contrib.admin.views.decorators import staff_member_required
6 from django.contrib.auth.decorators import permission_required
7 from django.shortcuts import get_object_or_404, redirect, render
8 from django.utils.translation import activate, get_language_info
9 from django_sendfile import sendfile
10
11 from newsletters import services
12 from newsletters.models import Newsletter
13 from partners.models import Partner
14
15
16 def preview(request, pk, lang=None):
17 """View that renders the newsletter as HTML.
18
19 :param request: the request object
20 :param pk: the newsletter's primary key
21 :param lang: the language of the render
22 :return: HttpResponse 200 containing the newsletter HTML
23 """
24 lang_code = request.LANGUAGE_CODE
25
26 if lang is not None:
27 try:
28 get_language_info(lang)
29 activate(lang)
30 lang_code = lang
31 except KeyError:
32 # Language code not recognised by get_language_info
33 pass
34
35 # Send cached file, if it exists
36 file_path = os.path.join(
37 settings.MEDIA_ROOT, "newsletters", f"{pk}_{lang_code}.html"
38 )
39 if os.path.isfile(file_path):
40 return sendfile(request, file_path)
41
42 newsletter = get_object_or_404(Newsletter, pk=pk)
43 events = services.get_agenda(newsletter.date) if newsletter.date else None
44
45 return render(
46 request,
47 "newsletters/email.html",
48 {
49 "newsletter": newsletter,
50 "agenda_events": events,
51 "main_partner": Partner.objects.filter(is_main_partner=True).first(),
52 "local_partner": Partner.objects.filter(is_local_partner=True).first(),
53 "lang_code": lang_code,
54 },
55 )
56
57
58 @staff_member_required
59 @permission_required("newsletters.send_newsletter")
60 def admin_send(request, pk):
61 """If this is a GET request this view will render a confirmation page for the administrator.
62
63 If it is a POST request the newsletter will be sent to all recipients.
64
65 :param request: the request object
66 :param pk: the newsletter's primary key
67 :return: 302 RedirectResponse if POST else 200 with the
68 confirmation page HTML
69 """
70 newsletter = get_object_or_404(Newsletter, pk=pk)
71
72 if newsletter.sent:
73 return redirect(newsletter)
74
75 if request.POST:
76 services.send_newsletter(newsletter)
77
78 return redirect("admin:newsletters_newsletter_changelist")
79
80 return render(
81 request, "newsletters/admin/send_confirm.html", {"newsletter": newsletter}
82 )
83
```
Path: `website/newsletters/emails.py`
Content:
```
1 """The emails defined by the newsletters package."""
2 import logging
3 from smtplib import SMTPException
4
5 from django.conf import settings
6 from django.core import mail
7 from django.core.mail import EmailMultiAlternatives
8 from django.template.loader import get_template
9 from django.utils import translation, timezone
10 from django.utils.timezone import make_aware
11
12 from newsletters import services
13 from partners.models import Partner
14
15 logger = logging.getLogger(__name__)
16
17
18 def send_newsletter(newsletter):
19 """Send the newsletter as HTML and plaintext email.
20
21 :param newsletter: the newsletter to be send
22 """
23 events = None
24 if newsletter.date:
25 datetime = (
26 make_aware(
27 timezone.datetime(
28 year=newsletter.date.year,
29 month=newsletter.date.month,
30 day=newsletter.date.day,
31 )
32 )
33 if newsletter.date
34 else None
35 )
36 events = services.get_agenda(datetime)
37
38 from_email = settings.NEWSLETTER_FROM_ADDRESS
39 html_template = get_template("newsletters/email.html")
40 text_template = get_template("newsletters/email.txt")
41
42 main_partner = Partner.objects.filter(is_main_partner=True).first()
43 local_partner = Partner.objects.filter(is_local_partner=True).first()
44
45 with mail.get_connection() as connection:
46 language = ("en", "English")
47 translation.activate(language[0])
48
49 subject = "[THALIA] " + newsletter.title
50
51 context = {
52 "newsletter": newsletter,
53 "agenda_events": events,
54 "main_partner": main_partner,
55 "local_partner": local_partner,
56 "lang_code": language[0],
57 }
58
59 html_message = html_template.render(context)
60 text_message = text_template.render(context)
61
62 msg = EmailMultiAlternatives(
63 subject=subject,
64 body=text_message,
65 to=[f"newsletter@{settings.GSUITE_DOMAIN}"],
66 from_email=from_email,
67 connection=connection,
68 )
69 msg.attach_alternative(html_message, "text/html")
70
71 try:
72 msg.send()
73 logger.info("Sent %s newsletter", language[1])
74 except SMTPException:
75 logger.exception("Failed to send the %s newsletter", language[1])
76
77 translation.deactivate()
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/website/newsletters/emails.py b/website/newsletters/emails.py
--- a/website/newsletters/emails.py
+++ b/website/newsletters/emails.py
@@ -40,7 +40,7 @@
text_template = get_template("newsletters/email.txt")
main_partner = Partner.objects.filter(is_main_partner=True).first()
- local_partner = Partner.objects.filter(is_local_partner=True).first()
+ local_partners = Partner.objects.filter(is_local_partner=True)
with mail.get_connection() as connection:
language = ("en", "English")
@@ -52,7 +52,7 @@
"newsletter": newsletter,
"agenda_events": events,
"main_partner": main_partner,
- "local_partner": local_partner,
+ "local_partners": local_partners,
"lang_code": language[0],
}
diff --git a/website/newsletters/views.py b/website/newsletters/views.py
--- a/website/newsletters/views.py
+++ b/website/newsletters/views.py
@@ -49,7 +49,7 @@
"newsletter": newsletter,
"agenda_events": events,
"main_partner": Partner.objects.filter(is_main_partner=True).first(),
- "local_partner": Partner.objects.filter(is_local_partner=True).first(),
+ "local_partners": Partner.objects.filter(is_local_partner=True),
"lang_code": lang_code,
},
)
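
Note that the patch above only changes the context key from a single `local_partner` object to a `local_partners` queryset; any template that rendered the old key must be updated to iterate as well. The sketch below is a hypothetical illustration of that companion change (the real `email.html` template is not part of this entry), using Django's standalone template engine:

```python
# Hypothetical sketch: templates that consumed "local_partner" must now
# iterate over the "local_partners" sequence supplied by the patched views.
# Plain dicts stand in for Partner model instances here.
from django.conf import settings

if not settings.configured:
    settings.configure()  # minimal setup so the engine runs standalone

from django.template import Context, Engine

engine = Engine()
template = engine.from_string(
    "{% for partner in local_partners %}{{ partner.name }}; {% endfor %}"
)
rendered = template.render(
    Context({"local_partners": [{"name": "Partner A"}, {"name": "Partner B"}]})
)
assert rendered == "Partner A; Partner B; "
```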
|
{"golden_diff": "diff --git a/website/newsletters/emails.py b/website/newsletters/emails.py\n--- a/website/newsletters/emails.py\n+++ b/website/newsletters/emails.py\n@@ -40,7 +40,7 @@\n text_template = get_template(\"newsletters/email.txt\")\n \n main_partner = Partner.objects.filter(is_main_partner=True).first()\n- local_partner = Partner.objects.filter(is_local_partner=True).first()\n+ local_partners = Partner.objects.filter(is_local_partner=True)\n \n with mail.get_connection() as connection:\n language = (\"en\", \"English\")\n@@ -52,7 +52,7 @@\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": main_partner,\n- \"local_partner\": local_partner,\n+ \"local_partners\": local_partners,\n \"lang_code\": language[0],\n }\n \ndiff --git a/website/newsletters/views.py b/website/newsletters/views.py\n--- a/website/newsletters/views.py\n+++ b/website/newsletters/views.py\n@@ -49,7 +49,7 @@\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": Partner.objects.filter(is_main_partner=True).first(),\n- \"local_partner\": Partner.objects.filter(is_local_partner=True).first(),\n+ \"local_partners\": Partner.objects.filter(is_local_partner=True),\n \"lang_code\": lang_code,\n },\n )\n", "issue": "Add support for multiple local partners\n### Is your feature request related to a problem? Please describe.\r\nWe really want to be able to have multiple local partners at Thalia. And the website current does not support this.\r\n\r\n### Describe the solution you'd like\r\nWe would like the ability to add multiple local partners.\r\n\r\n### Motivation\r\nWe think it's useful to have multiple local partners so we can show our students what is available in Nijmegen.\r\n\r\n### Describe alternatives you've considered\r\nOnly having one local partner, which does not make sense in the current partner climate.\r\n\r\n### Additional context\r\nThe local partner is also shown in the newsletter.\n", "before_files": [{"content": "\"\"\"Views provided by the newsletters package.\"\"\"\nimport os\n\nfrom django.conf import settings\nfrom django.contrib.admin.views.decorators import staff_member_required\nfrom django.contrib.auth.decorators import permission_required\nfrom django.shortcuts import get_object_or_404, redirect, render\nfrom django.utils.translation import activate, get_language_info\nfrom django_sendfile import sendfile\n\nfrom newsletters import services\nfrom newsletters.models import Newsletter\nfrom partners.models import Partner\n\n\ndef preview(request, pk, lang=None):\n \"\"\"View that renders the newsletter as HTML.\n\n :param request: the request object\n :param pk: the newsletter's primary key\n :param lang: the language of the render\n :return: HttpResponse 200 containing the newsletter HTML\n \"\"\"\n lang_code = request.LANGUAGE_CODE\n\n if lang is not None:\n try:\n get_language_info(lang)\n activate(lang)\n lang_code = lang\n except KeyError:\n # Language code not recognised by get_language_info\n pass\n\n # Send cached file, if it exists\n file_path = os.path.join(\n settings.MEDIA_ROOT, \"newsletters\", f\"{pk}_{lang_code}.html\"\n )\n if os.path.isfile(file_path):\n return sendfile(request, file_path)\n\n newsletter = get_object_or_404(Newsletter, pk=pk)\n events = services.get_agenda(newsletter.date) if newsletter.date else None\n\n return render(\n request,\n \"newsletters/email.html\",\n {\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": Partner.objects.filter(is_main_partner=True).first(),\n 
\"local_partner\": Partner.objects.filter(is_local_partner=True).first(),\n \"lang_code\": lang_code,\n },\n )\n\n\n@staff_member_required\n@permission_required(\"newsletters.send_newsletter\")\ndef admin_send(request, pk):\n \"\"\"If this is a GET request this view will render a confirmation page for the administrator.\n\n If it is a POST request the newsletter will be sent to all recipients.\n\n :param request: the request object\n :param pk: the newsletter's primary key\n :return: 302 RedirectResponse if POST else 200 with the\n confirmation page HTML\n \"\"\"\n newsletter = get_object_or_404(Newsletter, pk=pk)\n\n if newsletter.sent:\n return redirect(newsletter)\n\n if request.POST:\n services.send_newsletter(newsletter)\n\n return redirect(\"admin:newsletters_newsletter_changelist\")\n\n return render(\n request, \"newsletters/admin/send_confirm.html\", {\"newsletter\": newsletter}\n )\n", "path": "website/newsletters/views.py"}, {"content": "\"\"\"The emails defined by the newsletters package.\"\"\"\nimport logging\nfrom smtplib import SMTPException\n\nfrom django.conf import settings\nfrom django.core import mail\nfrom django.core.mail import EmailMultiAlternatives\nfrom django.template.loader import get_template\nfrom django.utils import translation, timezone\nfrom django.utils.timezone import make_aware\n\nfrom newsletters import services\nfrom partners.models import Partner\n\nlogger = logging.getLogger(__name__)\n\n\ndef send_newsletter(newsletter):\n \"\"\"Send the newsletter as HTML and plaintext email.\n\n :param newsletter: the newsletter to be send\n \"\"\"\n events = None\n if newsletter.date:\n datetime = (\n make_aware(\n timezone.datetime(\n year=newsletter.date.year,\n month=newsletter.date.month,\n day=newsletter.date.day,\n )\n )\n if newsletter.date\n else None\n )\n events = services.get_agenda(datetime)\n\n from_email = settings.NEWSLETTER_FROM_ADDRESS\n html_template = get_template(\"newsletters/email.html\")\n text_template = get_template(\"newsletters/email.txt\")\n\n main_partner = Partner.objects.filter(is_main_partner=True).first()\n local_partner = Partner.objects.filter(is_local_partner=True).first()\n\n with mail.get_connection() as connection:\n language = (\"en\", \"English\")\n translation.activate(language[0])\n\n subject = \"[THALIA] \" + newsletter.title\n\n context = {\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": main_partner,\n \"local_partner\": local_partner,\n \"lang_code\": language[0],\n }\n\n html_message = html_template.render(context)\n text_message = text_template.render(context)\n\n msg = EmailMultiAlternatives(\n subject=subject,\n body=text_message,\n to=[f\"newsletter@{settings.GSUITE_DOMAIN}\"],\n from_email=from_email,\n connection=connection,\n )\n msg.attach_alternative(html_message, \"text/html\")\n\n try:\n msg.send()\n logger.info(\"Sent %s newsletter\", language[1])\n except SMTPException:\n logger.exception(\"Failed to send the %s newsletter\", language[1])\n\n translation.deactivate()\n", "path": "website/newsletters/emails.py"}], "after_files": [{"content": "\"\"\"Views provided by the newsletters package.\"\"\"\nimport os\n\nfrom django.conf import settings\nfrom django.contrib.admin.views.decorators import staff_member_required\nfrom django.contrib.auth.decorators import permission_required\nfrom django.shortcuts import get_object_or_404, redirect, render\nfrom django.utils.translation import activate, get_language_info\nfrom django_sendfile import sendfile\n\nfrom newsletters import 
services\nfrom newsletters.models import Newsletter\nfrom partners.models import Partner\n\n\ndef preview(request, pk, lang=None):\n \"\"\"View that renders the newsletter as HTML.\n\n :param request: the request object\n :param pk: the newsletter's primary key\n :param lang: the language of the render\n :return: HttpResponse 200 containing the newsletter HTML\n \"\"\"\n lang_code = request.LANGUAGE_CODE\n\n if lang is not None:\n try:\n get_language_info(lang)\n activate(lang)\n lang_code = lang\n except KeyError:\n # Language code not recognised by get_language_info\n pass\n\n # Send cached file, if it exists\n file_path = os.path.join(\n settings.MEDIA_ROOT, \"newsletters\", f\"{pk}_{lang_code}.html\"\n )\n if os.path.isfile(file_path):\n return sendfile(request, file_path)\n\n newsletter = get_object_or_404(Newsletter, pk=pk)\n events = services.get_agenda(newsletter.date) if newsletter.date else None\n\n return render(\n request,\n \"newsletters/email.html\",\n {\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": Partner.objects.filter(is_main_partner=True).first(),\n \"local_partners\": Partner.objects.filter(is_local_partner=True),\n \"lang_code\": lang_code,\n },\n )\n\n\n@staff_member_required\n@permission_required(\"newsletters.send_newsletter\")\ndef admin_send(request, pk):\n \"\"\"If this is a GET request this view will render a confirmation page for the administrator.\n\n If it is a POST request the newsletter will be sent to all recipients.\n\n :param request: the request object\n :param pk: the newsletter's primary key\n :return: 302 RedirectResponse if POST else 200 with the\n confirmation page HTML\n \"\"\"\n newsletter = get_object_or_404(Newsletter, pk=pk)\n\n if newsletter.sent:\n return redirect(newsletter)\n\n if request.POST:\n services.send_newsletter(newsletter)\n\n return redirect(\"admin:newsletters_newsletter_changelist\")\n\n return render(\n request, \"newsletters/admin/send_confirm.html\", {\"newsletter\": newsletter}\n )\n", "path": "website/newsletters/views.py"}, {"content": "\"\"\"The emails defined by the newsletters package.\"\"\"\nimport logging\nfrom smtplib import SMTPException\n\nfrom django.conf import settings\nfrom django.core import mail\nfrom django.core.mail import EmailMultiAlternatives\nfrom django.template.loader import get_template\nfrom django.utils import translation, timezone\nfrom django.utils.timezone import make_aware\n\nfrom newsletters import services\nfrom partners.models import Partner\n\nlogger = logging.getLogger(__name__)\n\n\ndef send_newsletter(newsletter):\n \"\"\"Send the newsletter as HTML and plaintext email.\n\n :param newsletter: the newsletter to be send\n \"\"\"\n events = None\n if newsletter.date:\n datetime = (\n make_aware(\n timezone.datetime(\n year=newsletter.date.year,\n month=newsletter.date.month,\n day=newsletter.date.day,\n )\n )\n if newsletter.date\n else None\n )\n events = services.get_agenda(datetime)\n\n from_email = settings.NEWSLETTER_FROM_ADDRESS\n html_template = get_template(\"newsletters/email.html\")\n text_template = get_template(\"newsletters/email.txt\")\n\n main_partner = Partner.objects.filter(is_main_partner=True).first()\n local_partners = Partner.objects.filter(is_local_partner=True)\n\n with mail.get_connection() as connection:\n language = (\"en\", \"English\")\n translation.activate(language[0])\n\n subject = \"[THALIA] \" + newsletter.title\n\n context = {\n \"newsletter\": newsletter,\n \"agenda_events\": events,\n \"main_partner\": main_partner,\n 
\"local_partners\": local_partners,\n \"lang_code\": language[0],\n }\n\n html_message = html_template.render(context)\n text_message = text_template.render(context)\n\n msg = EmailMultiAlternatives(\n subject=subject,\n body=text_message,\n to=[f\"newsletter@{settings.GSUITE_DOMAIN}\"],\n from_email=from_email,\n connection=connection,\n )\n msg.attach_alternative(html_message, \"text/html\")\n\n try:\n msg.send()\n logger.info(\"Sent %s newsletter\", language[1])\n except SMTPException:\n logger.exception(\"Failed to send the %s newsletter\", language[1])\n\n translation.deactivate()\n", "path": "website/newsletters/emails.py"}]}
| 1,745 | 311 |
gh_patches_debug_32206
|
rasdani/github-patches
|
git_diff
|
learningequality__kolibri-5236
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
reminder: remove debug logging code before final release
### Observed behavior
#5223 introduced some verbose logging code
### Expected behavior
remove before release
### User-facing consequences
logs will get swamped
### Errors and logs
<!--
Relevant logs from:
* the command line
* ~/.kolibri/kolibri.log
* the browser console
Please wrap errors in triple backticks for clean formatting like this:
```
01:10 info: something happened
01:12 error: something bad happened
```
-->
…
### Steps to reproduce
<!--
Precise steps that someone else can follow in order to see this behavior
-->
…
### Context
<!--
Tell us about your environment, including:
* Kolibri version
* Operating system
* Browser
-->
…
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kolibri/core/notifications/tasks.py`
Content:
```
1 import logging as logger
2 import threading
3 import time
4
5 from django.db import connection
6 from django.db import transaction
7
8 logging = logger.getLogger(__name__)
9
10
11 class AsyncNotificationQueue():
12
13 def __init__(self):
14
15 # Value in seconds to determine the sleep time between log saving batches
16 self.log_saving_interval = 5
17
18 # Where new log saving functions are appended
19 self.queue = []
20
21 # Where the to be executed log saving functions are stored
22 # once a batch save has been invoked
23 self.running = []
24
25 def append(self, fn):
26 """
27 Convenience method to append log saving function to the current queue
28 """
29 self.queue.append(fn)
30
31 def toggle_queue(self):
32 """
33 Method to swap the queue and running, to allow new log saving functions
34 to be added to the queue while previously added functions are being executed
35 and cleared without fear of race conditions dropping saves.
36 """
37 old_queue = self.queue
38 new_queue = self.running
39 self.queue = new_queue
40 self.running = old_queue
41
42 def clear_running(self):
43 """
44 Reset the running list to drop references to already executed log saving functions
45 """
46 self.running = []
47
48 def run(self):
49 """
50 Execute any log saving functions in the self.running list
51 """
52 if self.running:
53 # Do this conditionally to avoid opening an unnecessary transaction
54 with transaction.atomic():
55 for fn in self.running:
56 try:
57 logging.warn('>>>>>> AsyncNotificationQueue.run try')
58 fn()
59 except Exception as e:
60 # Catch all exceptions and log, otherwise the background process will end
61 # and no more logs will be saved!
62 logging.warn('>>>>>> AsyncNotificationQueue.run except {}'.format(e))
63 logging.debug("Exception raised during background notification calculation: ", e)
64 connection.close()
65
66 def start(self):
67 while True:
68 logging.warn('>>>>>> AsyncNotificationQueue.start: {}'.format(threading.currentThread().ident))
69 logging.warn('\t\t len(self.running): {}'.format(self.running))
70 logging.warn('\t\t len(self.queue): {}'.format(self.queue))
71 self.toggle_queue()
72 self.run()
73 self.clear_running()
74 time.sleep(self.log_saving_interval)
75
76
77 log_queue = AsyncNotificationQueue()
78
79
80 def add_to_save_queue(fn):
81 log_queue.append(fn)
82
83
84 def wrap_to_save_queue(fn, *args):
85 def wrapper():
86 fn(*args)
87 log_queue.append(wrapper)
88
89
90 class AsyncNotificationsThread(threading.Thread):
91
92 @classmethod
93 def start_command(cls):
94 logging.warn('>>>>>> AsyncNotificationsThread.start_command: {}'.format(threading.currentThread().ident))
95 thread = cls()
96 thread.daemon = True
97 thread.start()
98
99 def run(self):
100 logging.info("Initializing background log saving process")
101 logging.warn('>>>>>> AsyncNotificationsThread.run: {}'.format(threading.currentThread().ident))
102 log_queue.start()
103
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kolibri/core/notifications/tasks.py b/kolibri/core/notifications/tasks.py
--- a/kolibri/core/notifications/tasks.py
+++ b/kolibri/core/notifications/tasks.py
@@ -54,20 +54,15 @@
with transaction.atomic():
for fn in self.running:
try:
- logging.warn('>>>>>> AsyncNotificationQueue.run try')
fn()
except Exception as e:
# Catch all exceptions and log, otherwise the background process will end
# and no more logs will be saved!
- logging.warn('>>>>>> AsyncNotificationQueue.run except {}'.format(e))
- logging.debug("Exception raised during background notification calculation: ", e)
+ logging.warn("Exception raised during background notification calculation: ", e)
connection.close()
def start(self):
while True:
- logging.warn('>>>>>> AsyncNotificationQueue.start: {}'.format(threading.currentThread().ident))
- logging.warn('\t\t len(self.running): {}'.format(self.running))
- logging.warn('\t\t len(self.queue): {}'.format(self.queue))
self.toggle_queue()
self.run()
self.clear_running()
@@ -91,12 +86,10 @@
@classmethod
def start_command(cls):
- logging.warn('>>>>>> AsyncNotificationsThread.start_command: {}'.format(threading.currentThread().ident))
thread = cls()
thread.daemon = True
thread.start()
def run(self):
logging.info("Initializing background log saving process")
- logging.warn('>>>>>> AsyncNotificationsThread.run: {}'.format(threading.currentThread().ident))
log_queue.start()
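
A side note on the handler kept by this patch: `logging.warn` is a deprecated alias for `logging.warning`, and since the message contains no `%s` placeholder, the trailing `e` argument is never interpolated and will trip logging's lazy %-formatting when the record is emitted. A minimal sketch of a more idiomatic version is below; `run_safely` is a hypothetical helper name, not part of the patch:

```python
import logging

logger = logging.getLogger(__name__)


def run_safely(fn):
    # logging.exception logs at ERROR level and appends the traceback,
    # so the background loop keeps running and failures stay debuggable.
    try:
        fn()
    except Exception:
        logger.exception(
            "Exception raised during background notification calculation"
        )
```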
|
{"golden_diff": "diff --git a/kolibri/core/notifications/tasks.py b/kolibri/core/notifications/tasks.py\n--- a/kolibri/core/notifications/tasks.py\n+++ b/kolibri/core/notifications/tasks.py\n@@ -54,20 +54,15 @@\n with transaction.atomic():\n for fn in self.running:\n try:\n- logging.warn('>>>>>> AsyncNotificationQueue.run try')\n fn()\n except Exception as e:\n # Catch all exceptions and log, otherwise the background process will end\n # and no more logs will be saved!\n- logging.warn('>>>>>> AsyncNotificationQueue.run except {}'.format(e))\n- logging.debug(\"Exception raised during background notification calculation: \", e)\n+ logging.warn(\"Exception raised during background notification calculation: \", e)\n connection.close()\n \n def start(self):\n while True:\n- logging.warn('>>>>>> AsyncNotificationQueue.start: {}'.format(threading.currentThread().ident))\n- logging.warn('\\t\\t len(self.running): {}'.format(self.running))\n- logging.warn('\\t\\t len(self.queue): {}'.format(self.queue))\n self.toggle_queue()\n self.run()\n self.clear_running()\n@@ -91,12 +86,10 @@\n \n @classmethod\n def start_command(cls):\n- logging.warn('>>>>>> AsyncNotificationsThread.start_command: {}'.format(threading.currentThread().ident))\n thread = cls()\n thread.daemon = True\n thread.start()\n \n def run(self):\n logging.info(\"Initializing background log saving process\")\n- logging.warn('>>>>>> AsyncNotificationsThread.run: {}'.format(threading.currentThread().ident))\n log_queue.start()\n", "issue": "reminder: remove debug logging code before final release\n\r\n### Observed behavior\r\n\r\n#5223 introduced some verbose logging code\r\n\r\n### Expected behavior\r\n\r\nremove before release\r\n\r\n### User-facing consequences\r\n\r\nlogs will get swamped\r\n\r\n### Errors and logs\r\n<!--\r\nRelevant logs from:\r\n * the command line\r\n * ~/.kolibri/kolibri.log\r\n * the browser console\r\n\r\nPlease wrap errors in triple backticks for clean formatting like this:\r\n```\r\n01:10 info: something happened\r\n01:12 error: something bad happened\r\n```\r\n-->\r\n\r\n\u2026\r\n\r\n### Steps to reproduce\r\n<!--\r\nPrecise steps that someone else can follow in order to see this behavior\r\n-->\r\n\r\n\u2026\r\n\r\n### Context\r\n<!--\r\nTell us about your environment, including:\r\n * Kolibri version\r\n * Operating system\r\n * Browser\r\n-->\r\n\r\n\u2026\r\n\n", "before_files": [{"content": "import logging as logger\nimport threading\nimport time\n\nfrom django.db import connection\nfrom django.db import transaction\n\nlogging = logger.getLogger(__name__)\n\n\nclass AsyncNotificationQueue():\n\n def __init__(self):\n\n # Value in seconds to determine the sleep time between log saving batches\n self.log_saving_interval = 5\n\n # Where new log saving functions are appended\n self.queue = []\n\n # Where the to be executed log saving functions are stored\n # once a batch save has been invoked\n self.running = []\n\n def append(self, fn):\n \"\"\"\n Convenience method to append log saving function to the current queue\n \"\"\"\n self.queue.append(fn)\n\n def toggle_queue(self):\n \"\"\"\n Method to swap the queue and running, to allow new log saving functions\n to be added to the queue while previously added functions are being executed\n and cleared without fear of race conditions dropping saves.\n \"\"\"\n old_queue = self.queue\n new_queue = self.running\n self.queue = new_queue\n self.running = old_queue\n\n def clear_running(self):\n \"\"\"\n Reset the running list to drop references to already executed 
log saving functions\n \"\"\"\n self.running = []\n\n def run(self):\n \"\"\"\n Execute any log saving functions in the self.running list\n \"\"\"\n if self.running:\n # Do this conditionally to avoid opening an unnecessary transaction\n with transaction.atomic():\n for fn in self.running:\n try:\n logging.warn('>>>>>> AsyncNotificationQueue.run try')\n fn()\n except Exception as e:\n # Catch all exceptions and log, otherwise the background process will end\n # and no more logs will be saved!\n logging.warn('>>>>>> AsyncNotificationQueue.run except {}'.format(e))\n logging.debug(\"Exception raised during background notification calculation: \", e)\n connection.close()\n\n def start(self):\n while True:\n logging.warn('>>>>>> AsyncNotificationQueue.start: {}'.format(threading.currentThread().ident))\n logging.warn('\\t\\t len(self.running): {}'.format(self.running))\n logging.warn('\\t\\t len(self.queue): {}'.format(self.queue))\n self.toggle_queue()\n self.run()\n self.clear_running()\n time.sleep(self.log_saving_interval)\n\n\nlog_queue = AsyncNotificationQueue()\n\n\ndef add_to_save_queue(fn):\n log_queue.append(fn)\n\n\ndef wrap_to_save_queue(fn, *args):\n def wrapper():\n fn(*args)\n log_queue.append(wrapper)\n\n\nclass AsyncNotificationsThread(threading.Thread):\n\n @classmethod\n def start_command(cls):\n logging.warn('>>>>>> AsyncNotificationsThread.start_command: {}'.format(threading.currentThread().ident))\n thread = cls()\n thread.daemon = True\n thread.start()\n\n def run(self):\n logging.info(\"Initializing background log saving process\")\n logging.warn('>>>>>> AsyncNotificationsThread.run: {}'.format(threading.currentThread().ident))\n log_queue.start()\n", "path": "kolibri/core/notifications/tasks.py"}], "after_files": [{"content": "import logging as logger\nimport threading\nimport time\n\nfrom django.db import connection\nfrom django.db import transaction\n\nlogging = logger.getLogger(__name__)\n\n\nclass AsyncNotificationQueue():\n\n def __init__(self):\n\n # Value in seconds to determine the sleep time between log saving batches\n self.log_saving_interval = 5\n\n # Where new log saving functions are appended\n self.queue = []\n\n # Where the to be executed log saving functions are stored\n # once a batch save has been invoked\n self.running = []\n\n def append(self, fn):\n \"\"\"\n Convenience method to append log saving function to the current queue\n \"\"\"\n self.queue.append(fn)\n\n def toggle_queue(self):\n \"\"\"\n Method to swap the queue and running, to allow new log saving functions\n to be added to the queue while previously added functions are being executed\n and cleared without fear of race conditions dropping saves.\n \"\"\"\n old_queue = self.queue\n new_queue = self.running\n self.queue = new_queue\n self.running = old_queue\n\n def clear_running(self):\n \"\"\"\n Reset the running list to drop references to already executed log saving functions\n \"\"\"\n self.running = []\n\n def run(self):\n \"\"\"\n Execute any log saving functions in the self.running list\n \"\"\"\n if self.running:\n # Do this conditionally to avoid opening an unnecessary transaction\n with transaction.atomic():\n for fn in self.running:\n try:\n fn()\n except Exception as e:\n # Catch all exceptions and log, otherwise the background process will end\n # and no more logs will be saved!\n logging.warn(\"Exception raised during background notification calculation: \", e)\n connection.close()\n\n def start(self):\n while True:\n self.toggle_queue()\n self.run()\n self.clear_running()\n 
time.sleep(self.log_saving_interval)\n\n\nlog_queue = AsyncNotificationQueue()\n\n\ndef add_to_save_queue(fn):\n log_queue.append(fn)\n\n\ndef wrap_to_save_queue(fn, *args):\n def wrapper():\n fn(*args)\n log_queue.append(wrapper)\n\n\nclass AsyncNotificationsThread(threading.Thread):\n\n @classmethod\n def start_command(cls):\n thread = cls()\n thread.daemon = True\n thread.start()\n\n def run(self):\n logging.info(\"Initializing background log saving process\")\n log_queue.start()\n", "path": "kolibri/core/notifications/tasks.py"}]}
| 1,262 | 349 |